Posted by DaveSottimano
Let’s face it: SEO isn’t as black & white as most marketing channels. In my opinion, to become a true professional requires a broad skill set. It’s not that a professional SEO needs to know the answer for everything; rather, it’s more important to have the skills to be able to find the answer.
I’m really pleased with the results of various bits of training I’ve put together for successful juniors over the years, so I think it’s time to share.
This is a Junior SEO task list designed to help new starters in the field get the right skills by doing hands-on jobs, and possibly to help find a specialism in SEO or digital marketing.
How long should this take? Let’s ballpark at 60–90 days.
Before anything, here’s some prerequisite reading:
The trainee should master the lingo and have a decent idea of how the Internet works before they start having conversations with developers or contributing online. Have the trainee answer the following questions, and to demonstrate that they really understand, have them answer using analogies. Take inspiration from this post.
Must be able to answer the following in detail:
If the trainee doesn’t understand what something is, make sure that they try to figure it out themselves before coming for help. Building a website by hand is absolutely painful, and they might want to throw their computer out the window or just install WordPress — no, no, no. There are so many things to learn by doing it the hard way, which is the only way.
The site must contain at least one instance of each of the following, and every page that contains a directive (along with any pages affected by those directives) must be tracked through a rank tracker:
The trainee can use whatever tracking tool they like; http://ift.tt/1nfnpgT is free for 100 keywords. The purpose of the rank tracking is to measure the effects of the implemented directives and redirects, as well as general fluctuation.
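To connect the dots between directives and rankings, it also helps to log each tracked page’s directives over time so changes can be lined up against the rank tracker. Here’s a minimal audit sketch in Python using requests and BeautifulSoup; the URL list and output filename are placeholders:

```python
# Fetch each tracked page and record its robots meta tag, canonical link,
# and X-Robots-Tag header alongside the date, so directive changes can be
# compared against rank-tracker data later.
import csv
from datetime import date

import requests
from bs4 import BeautifulSoup

PAGES = ["https://example.com/", "https://example.com/noindex-test/"]  # placeholders

with open("directive_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    for url in PAGES:
        resp = requests.get(url, timeout=10)
        soup = BeautifulSoup(resp.text, "html.parser")
        robots_meta = soup.find("meta", attrs={"name": "robots"})
        canonical = soup.find("link", attrs={"rel": "canonical"})
        writer.writerow([
            date.today().isoformat(),
            url,
            resp.status_code,
            robots_meta.get("content", "") if robots_meta else "",
            canonical.get("href", "") if canonical else "",
            resp.headers.get("X-Robots-Tag", ""),
        ])
```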
Create the following XML sitemaps:
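Whichever variants the trainee builds, the underlying XML structure is the same. Here’s a bare-bones generator sketch in Python; the URLs and dates are placeholders, and image, video, or index sitemaps follow the same pattern with extra namespaces:

```python
# Build a minimal sitemap.xml with Python's standard library.
from xml.etree.ElementTree import Element, SubElement, ElementTree

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = Element("urlset", xmlns=NS)

# Placeholder URLs and lastmod dates.
for loc, lastmod in [("https://example.com/", "2016-08-01"),
                     ("https://example.com/about/", "2016-08-01")]:
    url = SubElement(urlset, "url")
    SubElement(url, "loc").text = loc
    SubElement(url, "lastmod").text = lastmod

ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```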
These tasks can be done on an independent website or directly for a client; it depends on your organizational requirements. This is the part of the training where the trainee learns how to negotiate, sell, listen, promote, and create exposure for themselves.
Spreadsheets are to SEOs as fire trucks are to firefighters. Trainees need to be proficient in Excel or Google Sheets right from the start. These tasks are useful for grasping data manipulation techniques in spreadsheets, Google Analytics, and some more advanced subjects, like scraping and machine learning classification.
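As a taste of the kind of data manipulation involved, here’s a small pandas sketch of a VLOOKUP-style join between a crawl export and rank-tracker data; the file and column names are hypothetical stand-ins:

```python
# VLOOKUP-style join in pandas: attach ranking data to a crawl export.
import pandas as pd

crawl = pd.read_csv("crawl_export.csv")   # hypothetical columns: url, title, status
ranks = pd.read_csv("rank_tracker.csv")   # hypothetical columns: url, keyword, position

# Left join on URL, then keep pages ranking in the top 10 for any keyword.
merged = crawl.merge(ranks, on="url", how="left")
top_pages = merged[merged["position"] <= 10]
print(top_pages.groupby("url")["keyword"].count().sort_values(ascending=False))
```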
Must be able to fill in required arguments for the following formulas in under 6 seconds:
Also:
The candidate must be able to do the following:
These projects are designed to broaden their skills, as well as prepare the trainee for the future and introduce them to important concepts.
Write 2 functions in 2 different programming languages — these need to be functions that do something useful (i.e. “hello world” is not useful).
If I had to pick the technologies, they would be JavaScript and Python. JavaScript (Node, Express, React, Angular, Ember, etc.) because I believe things are moving this way, i.e. one language for both the front end and the back end. Python because of its rich data science & machine learning libraries, which may become a core part of SEO tasks in the future.
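As one illustration of what “useful” might look like (the trainee should of course pick their own problems), here’s a small Python function of about the right scope:

```python
# Small-but-useful: normalize a list of URLs down to their hostnames,
# skipping anything malformed. Handy for log or backlink analysis.
from urllib.parse import urlparse

def extract_hosts(urls):
    """Return the hostname for each URL, skipping malformed entries."""
    hosts = []
    for u in urls:
        host = urlparse(u).hostname  # None when no hostname can be parsed
        if host:
            hosts.append(host)
    return hosts

print(extract_hosts(["https://Example.com/page", "not a url", "http://moz.com"]))
# ['example.com', 'moz.com']
```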
I strongly recommend that anyone in SEO build their own search engine — and no, I’m not crazy, this isn’t crazy, it’s just hard. There are two ways to do this, and I’d recommend trying both.
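To make the idea concrete: the core data structure of any search engine is an inverted index. Here’s a toy sketch in Python; real engines layer crawling, tokenization, and ranking (e.g. TF-IDF) on top:

```python
# Toy search engine core: an inverted index answering AND queries.
from collections import defaultdict

docs = {
    1: "seo is not black and white",
    2: "build a search engine to learn seo",
    3: "rank tracking measures directive effects",
}

index = defaultdict(set)  # term -> set of document ids
for doc_id, text in docs.items():
    for term in text.split():
        index[term].add(doc_id)

def search(query):
    """Return ids of documents containing every term in the query."""
    term_sets = [index[t] for t in query.lower().split()]
    return set.intersection(*term_sets) if term_sets else set()

print(search("seo"))            # {1, 2}
print(search("search engine"))  # {2}
```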
Get them to pass http://oap.ninja/, built by the infamous Dean Cruddace. Warning: this is evil — I’ve seen seasoned SEOs give up just hours into it.
Employers are asking for a wider array of skills that range from development to design as standard, not "preferred."
Have a look around at current SEO job listings. You might be surprised just how much we’re expected to know these days:
The list goes on and on, but you get the point. We’re expected to be developers, designers, PR specialists, salespeople, CRO experts, and social media managers. This is why I believe we need to expose juniors to a wide set of tasks and help them develop a broad skill set.
You might hate me now, but when you’re making a lot more money you might change your mind (you might even want to cuddle).
Plus, I’m putting you through hell so that….
Feel free to ping me on Twitter (@dsottimano) or you can catch me hanging out with the DMG crew.
Posted by George-Freitag
Duplicate listings have been a plague to local search marketers since local search was a thing. When Moz Local first introduced duplicate closure in the fall of 2014, the goal was to address the horribly time-consuming task of finding and closing all those duplicate listings causing problems in Google, Bing, and various mapping platforms. Though we’ve consistently been making improvements to the tool’s performance (we’ll get into this later), the dashboard itself has remained largely unchanged.
Not anymore. Today, we’re proud to announce our brand new duplicate management dashboard for Moz Local:
Here’s a rundown of the features you can look for in the Moz Local upgrade:
This duplicate management update represents a new standard in the industry and will help our users be more productive and efficient than ever.
Eliminating duplicates and near-duplicates on major data sources and directories has always been one of the most effective ways to increase your presence in the local pack. It’s a key part of citation consistency, which was rated as the second most important tactic for getting into local pack results according to the 2015 local ranking factors survey. On top of that, in last May’s Mozinar on local search, Andrew Shotland of Local SEO Guide mentioned that he saw a 23% increase in presence in the local pack just by addressing duplicates.
So we know that seeking and destroying duplicates works. The problem is that doing it manually just takes for-e-ver. Anyone who works in local search knows the pain and monotony of combing through Google for variations of a business, then spending more time finding the contact form needed to actually request a closure.
Our duplicate listing feature has always focused on easily identifying potential duplicates and presenting them to marketers in a way that allows them to quickly take action. In the case of the aggregators (like Infogroup and Localeze) and direct partner sites (like Foursquare and Insider Pages), this takes the form of single-click closure requests that are quickly reviewed and sent directly to the source.
For sites that aren’t part of our direct network or don’t accept closure requests from anyone, like Facebook, we still do our best to point our users in the right direction so they can close the listing manually. Originally, the dashboard took the form of a long list where marketers could scroll down and take action, as needed.
Though this worked great for many of our users, it quickly became problematic for large brands and agencies. Based on data collected from the thousands of brands and locations we track, we know that the average enterprise client can have around 3,500 duplicate listings and, in some cases, that number can be as high as 100,000 duplicates. Even though we estimate our tool can reduce the time spent managing duplicates by around 75%, when you have literally thousands of duplicates to parse through, a single to-do list quickly becomes impractical.
The first opportunity we saw was to provide you with a bit more transparency into our closure process. Though we always provided some insight into where we were in the closure process, there was no way to view this at an aggregated level and no way to see how many duplicates had been closed so you could track your progress.
So we fixed that.
Now all Moz Local customers can easily see how many duplicates are still marked as "open," how many are being reviewed, and how many listings have been successfully closed. If you’re an agency or consultant, this can be especially useful to demonstrate progress made in identifying and closing duplicates for your clients. If you’re a brand, this can be a great way to build a business case for additional resources or show the value of your local strategy.
We also saw another opportunity to improve transparency by further breaking down the reporting by the type of data partner. Moz Local has always been very deliberate in surfacing the relationship we have with our partners. Because of this, we wanted to add another layer of insight based on the nature of the partnership.
Verification Partners include Google and Facebook, since they're sources we use to verify our own data. Though we can’t close duplicates directly at this point, they're so influential we felt it was imperative to include the ability to identify duplicates on these platforms and guide you as far as possible through the closure process.
Direct Partners are data sources that we have a direct relationship with and submit business listings to instantly through our distribution service. For all major aggregators and most of our direct partner directories, you can use our single-click duplicate closure, meaning that all you have to do is click “Close” and we’ll make sure it’s removed completely from their database, forever.
Lastly, we have our Indirect Partners. These are sources that receive all of our listing data via our direct partners, but we do not submit to directly. Though we can’t close listings on these sources automatically, we can still detect duplicates and send you directly to their closure form to help you request the closure.
The second opportunity was to address the long list view that our users used to identify, evaluate, and take action on the duplicates we discovered. With so many of our clients having hundreds or thousands of listings to manage, it quickly became apparent that we needed some advanced sorting to help them out with their workflow.
So we added that, as well.
Now, if you only want to view the listings that need action, you can just click “Open,” then scroll down and choose to close or ignore any of the duplicates in that view. If you then want to see how many duplicates have already been closed and removed from the data partner, you can just click that checkbox. If you want to only see the open duplicate listings for a certain partner, like Foursquare, that’s an option as well.
Further, just like everything else in the Moz Local dashboard and Search Insights, reporting strictly follows any filters and labels from the search bar. This can be especially useful if you’re an agency that wants to narrow your view to a specific client, or a brand that wants to only view reporting for a single marketing region.
For example, if you only want to see closed duplicates from Infogroup located in Texas that are part of the campaign “hanna-barbera,” well, there you go.
All data in any filtered view is easily exportable via CSV so you can repurpose it for your own reporting or research.
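For instance, a few lines of Python can turn that export into a quick summary. The column names here (“status,” “partner”) are hypothetical, so match them to whatever the actual export contains:

```python
# Summarize a duplicates CSV export: counts by status, and closures by partner.
import csv
from collections import Counter

with open("duplicates_export.csv", newline="") as f:  # placeholder filename
    rows = list(csv.DictReader(f))

by_status = Counter(row["status"] for row in rows)
closed_by_partner = Counter(
    row["partner"] for row in rows if row["status"] == "closed"
)

print("Duplicates by status:", dict(by_status))
print("Closed, by partner:", dict(closed_by_partner))
```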
Lastly, all of these reports are retroactive, meaning any duplicates you’ve requested closure or closed in the past will show up in the new duplicates dashboard and be available for advanced sorting and reporting.
The new interface and reporting features aren’t the only things we’ve improved. Over the last year, our developers have been spending countless hours fine-tuning the duplicate closure process and improving relationships with our data partners.
Early on, the Moz Local team decided that the product should focus on the data sources that have the greatest impact for local businesses, regardless of whether they have a direct relationship with us. As a result, we built the widest and most complex set of partnerships with aggregators, direct and indirect partners, and business directories in the industry. This update not only launches a new dashboard but also marks the kickoff for some huge improvements to our back end.
The challenge that comes with working with a network as diverse as ours is that each of our partners handles duplicate listings in a completely different way. The Moz Local team has always had resources devoted specifically to working with our partners to improve our data submission and listing management processes. For duplicates, however, this meant we needed to help some of our partners enhance their own APIs to accept closure requests or, in some cases, create the API altogether!
As part of this update, our development team has implemented new instrumentation and alerts to better identify submission errors sent to our partners, speed up the closure process, and quickly re-submit any closure requests that were not processed correctly.
Additionally, we’ve shortened our internal review cycle for closure requests. In order to ensure the quality of duplicate closures and to be sure our “alternates” feature isn’t being used maliciously, we manually review a percentage of closure requests. Through a variety of process improvements, we are now able to programmatically approve more closures, allowing for faster manual reviews of all other closure requests. As a result, we are now able to automatically approve around 44% of all closure requests instantly.
The most exciting thing about this update is that it’s only the beginning. Over the next few months, expect to see further integration with our data partners, discovery and progress notifications, increased closure efficiency, and more.
We hope you find our new duplicates dashboard useful and, most importantly, we hope it makes your lives a little bit easier.