Tuesday, December 10, 2013

The Future of Content: Upcoming Trends in 2014

Posted by StephanieChang


This is a fortunate time to be involved in the digital marketing space. Almost half of the global population now has access to the internet, the way consumers engage with content is rapidly evolving, and with that comes an exciting array of challenges and opportunities. This post focuses on the trends that lie ahead for content marketers and the role they play within an organization. A concrete understanding of upcoming trends lays the foundation for defining an organization's content goals and deciding where resources should be allocated.



Trend 1: Competition to gain consumers' attention will increase


Posting new, unique content regularly on your site is NOT enough. Each day, around 92,000 new articles are posted on the internet. Digital media publishers have created systems to produce the greatest amount of content at the lowest price. For example, The Huffington Post produces at least 1,200 pieces of content a day, and Forbes produces 400 (with 1,000 contributors). It's not just publishers, either: WordPress users alone publish about 35.8 million new posts each month.



Image Credit


Smaller businesses won't be able to compete on sheer volume. So how can a site differentiate itself in this market? This is where the development of a content strategy comes into play. It's extremely helpful to understand a company's unique value proposition, and if the company doesn't have one, to understand where the opportunities are in the space to create one. For B2C companies, that can mean identifying the company's existing target audience and promoting the brand as an advocate for a particular lifestyle. For B2B companies, it is often about positioning the brand as the ultimate authority or source of knowledge in a specific industry or niche.


When developing a content strategy, it's important to evaluate the product that the business sells. Evaluating a product doesn't mean identifying its features or solely understanding its benefits. It means understanding the marketability of the product. For instance, is the product a "think" product or a "feel" product? Does the product require high involvement or low involvement from the consumer? The FCB grid developed by Richard Vaughn is a useful framework here.


A "think" product is one where a consumer heavily considers before purchasing. These type of products usually involve a high amount of research and personal effort by the consumer before purchasing.


A "feel" product is one where emotion plays a pivotal role in the buying process.


A "high involvement" product is one where the consumer is heavily involved in the buying decision. These products are generally more expensive, but not from just a fiscal perspective. It can also be something that once purchased, will require a lot more time to change, or it has significantly more impact from a long-term perspective. For instance, opening a retirement account is a "high involvement" purchase. A wallpaper purchase is also a "high involvement" purchase.


"Low involvement" products tend to err on a more impulsive or spur-of-the moment purchase. Once a consumer decides they need this product, not much time will be spent researching because it involves a low margin of error if a decision was incorrectly made. The price of the product is usually low.



Image Credit


If the product the company sells is a "high involvement"/"think" product, the consumer is going to spend significantly more time researching the product, including reading/watching product reviews, identifying product features, assessing if this purchase is worth the cost, etc. As a result, the content strategy for such a product should involve plenty of information on the product features, the benefits of the product, as well as growing the product and brand awareness, so that consumers will both discover and search for the product.


If the product the company sells is a "low involvement"/"feel" product, more time should be invested to connecting with consumers and appealing to their emotions. These products should also focus their efforts on building brand loyalty and retention of customers because these products tend to be repeat purchases.


Julian Cole, the Head of Comms Planning at BBH, breaks down this process in great detail in his "Working Out the Business Problems" slide deck.


Trend 2: Determining the key metrics to measure content's success will be more important


Traditionally, traffic and page views have been the longstanding metrics used to gauge a piece of content's success. Although there are clear value propositions in increased traffic (such as greater brand awareness and increased or potential revenue for publishers and bloggers), these metrics on their own can be misleading. More importantly, focusing solely on traffic and page views as measures of success can lead to unintended behaviors and misguided motivations. These can include an overemphasis on click-worthy headlines, overuse of keywords in titles, and shifting the focus from creating content for users (building for the long term) to creating content for page views (short-term wins).


Ultimately, determining the right metrics for an organization's content depends on the goals for the content. Is it to maintain an engaged community/develop brand advocates, build brand awareness, and/or to convert users into paying customers? Perhaps it is a combination of all 3? These are all difficult questions to answer.


At Distilled, we're currently working with clients to help them define these metrics for their content. Sometimes, the best option is to use a combination of metrics to analyze and target. For some clients, a key metric could combine organic traffic, the percentage of returning visitors, and changes in bounce rate and time on site. For instance, if a user finds exactly what they're looking for and bounces, that's not necessarily bad. Perhaps they landed on an ideal landing page and found the exact information they needed; that's a fantastic user experience, especially if they spend a long time on the site or become a returning visitor. Looking at any metric in isolation can lead to plenty of wrong assumptions, and while there is no perfect solution, combining metrics is the next best alternative. A blended score along the lines of the sketch below is one way to operationalize this.
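Here is a minimal sketch of what such a blended score might look like in Python; the metric names mirror the ones above, but the weights, caps, and scaling are entirely hypothetical rather than anything Distilled actually uses:

# Hypothetical composite "content health" score combining the metrics
# discussed above. All weights and caps are illustrative only.
def content_score(organic_visits, pct_returning, bounce_rate, avg_time_on_site_sec):
    """Blend several engagement metrics into a single 0-100 score."""
    traffic_component = min(organic_visits / 10_000, 1.0)        # cap credit at 10k visits
    returning_component = pct_returning                          # 0.0 - 1.0
    engagement_component = min(avg_time_on_site_sec / 180, 1.0)  # cap credit at 3 minutes
    bounce_penalty = bounce_rate * 0.5                           # soften: a bounce isn't always bad
    raw = (0.4 * traffic_component
           + 0.3 * returning_component
           + 0.3 * engagement_component
           - 0.2 * bounce_penalty)
    return round(max(raw, 0.0) * 100, 1)

print(content_score(organic_visits=6500, pct_returning=0.35,
                    bounce_rate=0.55, avg_time_on_site_sec=140))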


For other businesses, social metrics can serve as useful conversion metrics for content pieces. A Facebook like or a Twitter retweet signals some engagement, whereas a share, a comment, or becoming a "fan" of a Facebook page signals a potential brand advocate. Although a share or a new "fan" on a Facebook page may be worth more, all of these activities demonstrate a piece's ability to gain a user's attention, and that awareness is worth something.


Content Marketing Institute has a great list of key metrics that B2B and B2C companies use to measure the effectiveness of their content.




Trend 3: Increased interest in content integration (content will be produced for multiple channels)


Some of the biggest challenges involved in content often have nothing to do with content. For many of my clients, the biggest struggles involve decisions about resource allocation: a lack of time to pursue all of the goals, a lack of budget to implement strategies in an ideal way, and the constant battle of readjusting priorities. These hard constraints make marketing especially challenging as more channels develop and digital innovation advances so quickly. While there is no perfect solution to this problem, the next best alternative for balancing hard resource constraints against the constant need for innovation is to develop better integration methodologies. A poll of CMOs has put integrated marketing communications ahead of effective advertising as the most important thing they want from an agency.


Why is this so important? Because the way consumers shop is changing. Accenture conducted global market research on the behaviors of 6,000 consumers in eight countries. One of the top recommendations was the importance of providing consumers with a "seamless retail experience": an on-brand, personalized, and consistent experience regardless of channel. That seamless experience will require content to be heavily involved in a multitude of channels, from online to in person, in order to provide potential and current customers with one consistent conversation.


The chart below shows statistics about the way Millennials shop. Although Millennials tend to be exceptionally digitally savvy (especially when it comes to social media), studies show they still like to shop in retail/brick-and-mortar stores. Millennials use the internet to research and review price, products, value, and service, and have been shown to influence how their parents shop.



The integration of content applies to more than just consumer retail stores. For instance, British Airways has a billboard in London that is programmed to show a kid pointing at a British Airways plane every time one flies overhead. Here is the video that shows how the billboard works.



Last year, AT&T launched a 10,000 foot digitally enhanced store to showcase an apps wall, as well content dedicated to lifestyle areas, like fitness, family, and art. Start-up food blog, Food52 (who is starting to go into ecommerce) is launching a holiday market pop-up store in NYC.


Content Marketing Institute's 2014 Report for B2B content marketers indicates that B2B content marketers still view in-person events as their most effective tactic. The seamless transition of content from online marketing channels (via social media conversations, PPC and display ads, and content on the site via case studies and videos) to in-person conversations and consumer experience will only grow in importance.



Trend 4: Experimentation with content in new mediums


Technology and digital innovation are advancing rapidly. PCs now make up a small percentage of connected devices, and wearables and smart TVs are about to go mainstream. As competition for attention increases, companies will be increasingly willing to experiment with content in new mediums to reach their intended audiences.



This graph is just one depiction of how quickly technology evolves. As marketers, we need the ability to quickly adapt and scale to new trends and opportunities. This past year, the marketing agency SapientNitro released a free 156-page guide entitled Insights 2013 that discusses some of these trends in detail, such as in-store digital retail experiences, the future of television, sensors and experience design, and customer experience on the move, to name a few.


One of their case studies looks at Sephora, which has developed great content in its retail stores, such as interactive kiosks that let shoppers explore different fragrances or learn about skincare. iPads throughout the store provide how-to makeup tips, and items can be scanned to reveal product information. Sephora's mobile app carries content that speaks to its core customer base and is in line with its other online and social media content. All of it can be easily shared via email or through social networks.


Other brands, such as Nivea, have mixed print advertising with mobile innovation: Nivea ran a print ad that doubled as a solar phone charger.



Finally, PopTopia is a mobile game with a phone attachment, called the Pop Dangle, that emits the smell of popcorn as you play. The attachment plugs into the audio jack and releases the scent when it receives an audio signal at a certain frequency. These examples all show brands that have embraced new mediums for content.




2014 will be an exciting time for the future of content. As technology evolves and competition for user attention increases, marketers need to be agile and adapt to the growing needs and expectations of their customers. The future of a business will depend on it having a very clear unique value proposition. Why is this so crucial? It is the foundation from which marketing strategies and their execution grow. Our job as marketers is to use that information to pinpoint the metrics we need to measure and to prioritize all future marketing strategies. The task is difficult, but our role is to embrace these challenges and find solutions. Now is the ideal time to begin.




Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!






from Moz Blog http://moz.com/blog/future-of-content-upcoming-trends-in-2014

Investing in Non-Measurable Serendipitous Marketing - Whiteboard Friday

Posted by randfish


Sticking to what can be easily measured often seems like the safest route, but avoiding the unknown also prevents some of the happier accidents from taking place. In today's Whiteboard Friday, Rand explains why it's important to invest some of your time and resources in non-measurable, serendipitous marketing.



For reference, here's a still of this week's whiteboard!



Video Transcription



Howdy Moz fans, and welcome to another edition of Whiteboard Friday. This week I want to talk about something that we don't usually talk about in the inbound marketing world because inbound, of course, is such a hyper-measurable channel, at least most of the investments that we make are very measurable, but I love serendipitous marketing too. That's investing in serendipity to earn out-sized returns that you might not otherwise be able to achieve. That's a tough sell for a lot of management, for a lot of executives, for a lot of marketers because we're so accustomed to this new world of hyper-measurability. But with a couple examples, I'll illustrate what I mean.


So let's say we start by maybe you go and you attend an off-topic conference, a conference that isn't normally in your field, but it was recommended to you by a friend. So you go to that event, and while you are there, you meet a speaker. You happen to run into them, you're having a great chat together, and that speaker later mentions your product, your company, your business on stage at the event. It turns out that that mention yields two audience members who become clients of yours later and, in fact, not just clients, but big advocates for your business that drive even more future customers.


This is pretty frustrating. From a measurability standpoint, first off, it's an off-topic event. How do you even know that this interaction is the one that led to them being mentioned? Maybe that speaker would have mentioned your business anyway. Probably not, but maybe. What about these folks? Would those two customers have come to your business regardless? Were they searching for exactly what you offered anyway? Or were they influenced by this? They probably were. Very, very hard to measure. Definitely not the kind of investment that you would normally make in the course of your marketing campaigns, but potentially huge.


I'll show you another one. Let's say one day you're creating a blog post, and you say, "Boy, you know, this topic is a really tough one to tackle with words alone. I'm going to invest in creating some visual assets." You get to work on them, and you start scrapping them and rebuilding them and rebuilding them. Soon you've spent off hours for the better part of a week building just a couple of visual assets that illustrate a tough concept in your field. You go, "Man, that was a huge expenditure of energy. That was a big investment. I'm not sure that's even going to have any payoff."


Then a few weeks later those visuals get picked up by some major news outlets. It turns out, and you may not even be able to discover this, but it turns out that the reporters for those websites did a Google image search, and you happened to pop up and you clearly had the best image among the 30 or 40 that they scrolled to before they found it. So, not only are they including those images, they're also linking back over to your website. Those links don't just help your site directly, but the news stories themselves, because they're on high-quality domains and because they're so relevant, end up ranking for an important search keyword phrase that continues to drive traffic for years to come back to your site.


How would you even know, right? You couldn't even see that this image had been called by those reporters because it's in the Google image search cache. You may not even connect that up with the rankings and the traffic that's sent over. Hopefully, you'll be able to do that. It's very hard to say, "Boy, if I were to over-invest and spend a ton more time on visual assets, would I ever get this again? Or is this a one-time type of event?"


The key to all of this serendipitous marketing is that these investments that you're making up front are hard or impossible to predict or to attribute to the return on investment that you actually earn. A lot of the time it's actually going to seem unwise. It's going to seem foolish, even, to make these kinds of investments based on sort of a cost and time investment perspective. Compared to the potential ROI, you just go, "Man, I can't see it." Yet, sometimes we do it anyway, and sometimes it has a huge impact. It has those out-sized serendipitous returns.


Now, the way that I like to do this is I'll give you some tactical stuff. I like to find what's right here, the intersection of this Venn diagram. Things that I'm passionate about, that includes a topic as well as potentially the medium or the type of investment. So if I absolutely hate going to conferences and events, I wouldn't do it, even if I think it might be right from other perspectives.


I do particularly love creating visual assets. So I like tinkering around, taking a long time to sort of get my pixels looking the way I want them to look, and even though I don't create great graphics, as evidenced here, sometimes these can have a return. I like looking at things where I have some skill, at least enough skill to produce something of value. That could mean a presentation at a conference. It could mean a visual asset. It could mean using a social media channel. It could mean a particular type of advertisement. It could mean a crazy idea in the real world. Any of these things.


Then I really like applying empathy as the third point on top of this, looking for things that are something that my audience has the potential to like or enjoy or be interested in. So this conference may be off-topic, but knowing that it was recommended by my friend and that there might be some high-quality people there, I can connect up the empathy and say, "Well, if I'm putting myself in the shoes of these people, I might imagine that some of them will be interested in or need or use my product."


Likewise, if I'm making this visual asset, I can say, "Well, I know that since this is a tough subject to understand, just explaining it with words alone might not be enough for a lot of people. I bet if I make something visual, that will help it be much better understood. It may not spread far and wide, but at least it'll help the small audience who does read it."


That intersection is where I like to make serendipitous investments and where I would recommend that you do too.


There are a few things that we do here at Moz around this model and that I've seen other companies who invest wisely in serendipity make, and that is we basically say 1 out of 5, 20% of our time and our budget goes to serendipitous marketing. It's not a hard and fast rule, like, "Oh boy, I spent $80 on this. I'd better go find $20 to go spend on something serendipitous that'll be hard to measure." But it's a general rule, and it gives people the leeway to say, "Gosh, I'm thinking about this project. I'm thinking about this investment. I don't know how I'd measure it, but I'm going to do it anyway because I haven't invested my 20% yet."


I really like to brainstorm together, so bring people together from the marketing team or from engineering and product and other sections of the company, operations, but I really like having a single owner. The reason for that single owner doing the execution is because I find that with a lot of these kind of more serendipitous, more artistic style investments, and I don't mean artistic just in terms of visuals, but I find that having that single architect, that one person kind of driving it makes it a much more cohesive and cogent vision and a much better execution at the end of the day, rather than kind of the design by committee. So I like the brainstorm, but I like the single owner model.


I think it's critically important, if you're going to do some serendipitous investments, that you have no penalty whatsoever for failure. Essentially, you're saying, "Hey, we know we're going to make this investment. We know that it's the one out of five kind of thing, but if it doesn't work out, that's okay. We're going to keep trying again and again."


The only really critical thing that we do is that we gain intuition and experiential knowledge from every investment that we make. That intuition means that next time you do this, you're going to be even smarter about it. Then the next time you do it, you're going to gain more empathy and more understanding of what your audience really needs and wants and how that can spread. You're going to gain more passion, a little more skill around it. Those kinds of things really predict success.


Then I think the last recommendation that I have is when you make serendipitous investments, don't make them randomly. Have a true business or marketing problem that you're trying to solve. So if that's PR, we don't get enough press, or gosh, sales leads, we're not getting sales leads in this particular field, or boy, traffic overall, like we'd like to broaden our traffic sources, or gosh, we really need links because our kind of domain authority is holding us back from an SEO perspective, great. Make those serendipitous investments in the areas where you hope or think that the ROI might push on one of those particularly big business model, marketing model problems.


All right, everyone. Hope you've enjoyed this edition of Whiteboard Friday. We'll see you again next week. Take care.



Video transcription by Speechpad.com










from Moz Blog http://moz.com/blog/investing-in-serendipitous-marketing-whiteboard-friday

New Moz-Builtwith Study Examines Big Website Tech and Google Rankings

Posted by Cyrus-Shepard


BuiltWith knows about your website.


Go ahead. Try it out.



BuiltWith also knows about your competitors' websites. They've cataloged over 5,000 different website technologies on over 190 million sites. Want to know how many sites use your competitor's analytics software? Or who accepts Bitcoin? Or how many sites run WordPress?



Like BuiltWith, Moz also has a lot of data. Every two years, we run a Search Engine Ranking Factors study where we examine over 180,000 websites in order to better understand how they rank in Google's search results.


We thought, "Wouldn't it be fun to combine the two data sets?"


That's exactly what our data science team, led by Dr. Matt Peters, did. We wanted to find out what technologies websites were using, and also see if those technologies correlated with Google rankings.



How we conducted the study


BuiltWith supplied Moz with tech info on 180,000 domains that were previously analyzed for the Search Engine Ranking Factors study. Dr. Peters then calculated the correlations for over 50 website technologies.


The ranking data for the domains was gathered last summer—you can read more about it here—and the BuiltWith data is updated once per quarter. We made the assumption that basic web technology, like hosting platforms and web servers, doesn't change often.


It's very important to note that the website technologies we studied are not believed to be actual ranking factors in Google's algorithm. There are huge causation/correlation issues at hand. Google likely doesn't care too much what framework or content management system you use, but because SEOs often believe one technology to be superior to another, we thought it best to take a look.
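For context, here's a minimal sketch of how a correlation like the ones below could be computed. It assumes the study follows Moz's usual ranking-factors approach of calculating a Spearman correlation per keyword SERP and averaging across keywords; that's my reading of the methodology, not a description of Dr. Peters' actual code, and the column names are placeholders:

# Sketch: mean Spearman correlation between "result uses technology X"
# (1/0) and ranking position, computed per SERP and then averaged.
# Note that lower position = better ranking, so interpret the sign accordingly.
import pandas as pd
from scipy.stats import spearmanr

# One row per (keyword, position, uses_tech) observation -- toy data.
serps = pd.DataFrame({
    "keyword":   ["a", "a", "a", "b", "b", "b"],
    "position":  [1, 2, 3, 1, 2, 3],
    "uses_tech": [1, 0, 0, 0, 1, 0],
})

def mean_spearman(df):
    correlations = []
    for _, grp in df.groupby("keyword"):
        if grp["uses_tech"].nunique() > 1:  # correlation is undefined if the factor is constant
            rho, _ = spearmanr(grp["position"], grp["uses_tech"])
            correlations.append(rho)
    return sum(correlations) / len(correlations) if correlations else float("nan")

print(mean_spearman(serps))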


Web hosting platform performance


One of the cool things about BuiltWith is that not only can you see what technology a website uses, but you can also view trends across the entire Internet.


One of the most important questions a webmaster has to answer is who to use as a hosting provider. Here's BuiltWith's breakdown of the hosting providers for the top 1,000,000 websites:



Holy GoDaddy! That's a testament to the power of marketing.


Webmasters often credit good hosting as a key to their success. We wanted to find out if certain web hosts were correlated with higher Google rankings.


Interestingly, the data showed very little correlation between web hosting providers and higher rankings. The results, in fact, were close enough to zero to be considered null.

Web Hosting           Correlation
Rackspace              0.024958629
Amazon                 0.043836395
Softlayer             -0.02036524
GoDaddy               -0.045295217
Liquid Web            -0.000872457
CloudFlare Hosting    -0.036254475


Statistically, Dr. Peters assures me, these correlations are so small they don't carry much weight.


The lesson here is that web hosting, at least from the major providers, does not appear to be correlated with higher or lower rankings one way or the other. To put this another way, simply hosting your site on GoDaddy should neither help nor hurt you in the larger SEO scheme of things.


That said, there are a lot of bad hosts out there as well. Uptime, cost, customer service and other factors are all important considerations.


CMS battle – WordPress vs. Joomla vs. Drupal


Looking at the most popular content management systems for the top million websites, it's easy to spot the absolute dominance of WordPress.


Nearly a quarter of the top million sites run WordPress.



You may be surprised to see that only 6,400 sites in the top million run Tumblr. If you expand the data to all known sites in BuiltWith's index, the number grows to over 900,000. That's still a fraction of the 158 million blogs Tumblr claims, compared to the 73 million claimed by WordPress.


This seems to be a matter of quality over quantity. Tumblr has many more blogs, but it appears fewer of them gain significant traffic or visibility.


Does any of this correlate to Google rankings? We sampled five of the most popular CMS's and again found very little correlation.

CMS                 Correlation
WordPress           -0.009457206
Drupal               0.019447922
Joomla!              0.032998891
vBulletin           -0.024481161
ExpressionEngine     0.027008018


Again, these numbers are statistically insignificant. It would appear that the content management system you use is not nearly as important as how you use it.


While configuring these systems for SEO varies in difficulty, plugins and best practices can be applied to all.


Popular social widgets – Twitter vs. Facebook


To be honest, the following chart surprised me. I'm a huge advocate of Google+, but I never thought more websites would display the Google +1 button than Twitter's Tweet button.



That's not to say people actually hit the Google+ button as much. With folks sending over 58 million tweets per day, it's fair to guess that far more people are clicking the relatively fewer Twitter buttons, although Google+ may be catching up.


Sadly, our correlation data on social widgets is highly suspect. That's because the BuiltWith data is aggregated at the domain level, and social widgets are a page-level feature.


Even though we found a very slight positive correlation between social share widgets and higher rankings, we can't conclusively say there is a relationship.


More important are the significant correlations that exist between Google rankings and actual social shares. While we don't know how, or even if, Google uses social metrics in its algorithm (Matt Cutts specifically says they don't use +1s), we do know that social shares are significantly associated with higher rankings.



Again, correlation is not causation, but it makes sense that adding social share widgets to your best content can encourage sharing, which in turn helps with increased visibility, mentions, and links, all of which can lead to higher search engine rankings.


Ecommerce technology – show us the platform


Mirror, mirror on the wall, who is the biggest ecommerce platform of them all?



Magento wins this one, but the distribution is more even than other technologies we've looked at.


When we looked at the correlation data, again we found very little relationship between the ecommerce platform a website used and how it performed in Google search results.


Here's how each ecommerce platform performed in our study.

Ecommerce             Correlation
Magento               -0.005569493
Yahoo Store           -0.008279856
Volusion              -0.016793737
Miva Merchant         -0.027214854
osCommerce            -0.012115017
WooCommerce           -0.033716129
BigCommerce SSL       -0.044259375
Magento Enterprise     0.001235127
VirtueMart            -0.049429445
Demandware             0.021544097


Although huge differences exist between ecommerce platforms, and some are easier to configure for SEO than others, it would appear that the platform you choose is not a huge factor in your eventual search performance.


Content delivery networks – fast, fast, faster


One of the major pushes marketers have made in the past 12 months has been to improve page speed and loading times. The touted advantages include improved customer satisfaction, higher conversions, and possible SEO benefits.


The race to improve page speed has led to huge adoption of content delivery networks.



In our Ranking Factors Survey, the response time of a web page showed a -0.10 correlation with rankings. While this can't be considered a significant correlation, it offered a hint that faster pages may perform better in search results—a result we've heard anecdotally, at least on the outliers of webpage speed performance.


We might expect websites using CDNs to gain the upper hand in ranking, but the evidence doesn't yet support this theory. Again, these values are basically null.

CDN                              Correlation
AJAX Libraries API                0.031412968
Akamai                            0.046785574
GStatic Google Static Content     0.017903898
Facebook CDN                      0.0005199
CloudFront                        0.046000385
CloudFlare                       -0.036867599


While using a CDN is an important step in speeding up your site, it is only one of many optimizations you should make when improving webpage performance.


SSL certificates, web servers, and framework: Do they stack up?


We ran ranking correlations on several more data points that BuiltWith supplied us. We wanted to find out whether things like your website framework (PHP, ASP.NET), your web server (Apache, IIS), or whether or not your website uses an SSL certificate were correlated with higher or lower rankings.


While we found a few outliers around Varnish software and Symantec VeriSign SSL certificates, overall the data suggests no strong relationships between these technologies and Google rankings.

Framework                 Correlation
PHP                        0.032731241
ASP.NET                    0.042271235
Shockwave Flash Embed      0.046545556
Adobe Dreamweaver          0.007224319
Frontpage Extensions      -0.02056009

SSL Certificates          Correlation
GoDaddy SSL                0.006470096
GeoTrust SSL              -0.007319401
Comodo SSL                -0.003843119
RapidSSL                  -0.00941283
Symantec VeriSign          0.089825587

Web Servers               Correlation
Apache                     0.029671122
IIS                        0.040990108
nginx                      0.069745949
Varnish                    0.085090249


What we can learn


We had high hopes for finding "silver bullets" among website technologies that could launch us all to higher rankings.


The reality turns out to be much more complex.


While technologies like great hosting, CDNs, and social widgets can help set up an environment for improving SEO, they don't do the work for us. Even our own Moz Analytics, with all its SEO-specific software, can't help improve your website visibility unless you actually put the work in.


Are there any website technologies you'd like us to study next time around? Let us know in the comments below!











from Moz Blog http://moz.com/blog/web-tech-builtwith-study

Hummingbird's Unsung Impact on Local Search

Posted by David-Mihm


Though I no longer actively consult for clients, there seems to have been a significant qualitative shift in local results since Google's release of Hummingbird that I haven't seen reported on search engine blogs and media outlets. The columns I have seen have generally offered advice on taking advantage of what Hummingbird was designed to do rather than looking at the outcome of the update.


From where I sit, the outcome has been a slightly lower overall quality in Google's local results, possibly due in part to a "purer" ranking algorithm in local packs. While these kinds of egregious results reported soon after Hummingbird's release have mostly disappeared, it's the secondary Hummingbird flutter, which may have coincided with the November 14th "update," that seems to have caused the most noticeable changes.


I'll be working with Dr. Pete to put together more quantitative local components of Mozcast in the coming months, but for the time being, I'll just have to describe what I'm seeing today with a fairly simplistic analysis.


To do the analysis, I performed manual searches for five keywords, both geo-modified and generic, in five diverse markets around the country. I selected these keywords based on terms that I knew Google considered to have "local intent" across as broad a range of industries as I could think of. After performing the searches, I took note of the top position and number of occurrences of four types of sites, as well as position and number of results in each "pack."

Keywords                    Markets       Result Type Taxonomy
personal injury lawyer      Chicago       national directory (e.g., Yelp)
assisted living facility    Portland      regional directory (e.g., ArizonaGolf.com)
wedding photographer        Tampa         local business website (e.g., AcmeElectric.com)
electrician                 Burlington    barnacle webpage (e.g., facebook.com/acmeelectric)
pet store                   Flagstaff     national brand (e.g., Petsmart.com)
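For anyone who wants to replicate this kind of spot check, the observations can be logged in something as simple as a flat CSV; the sketch below shows one possible structure, with field names and the sample row being my own invention rather than David's actual worksheet:

# Hypothetical structure for logging one observation per keyword/market
# combination; field names and the sample row are illustrative only.
import csv

FIELDS = ["keyword", "market", "geo_modified", "result_type",
          "top_position", "occurrences", "pack_position", "pack_size"]
rows = [
    {"keyword": "pet store", "market": "Flagstaff", "geo_modified": False,
     "result_type": "national directory", "top_position": 2, "occurrences": 3,
     "pack_position": 4, "pack_size": 7},
]

with open("local_serp_observations.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)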


I also performed an even smaller analysis using three keywords that returned carousel results (thanks to SIM Partners for this sample list of keywords): "golf course," "restaurant," and "dance club."


Again, a very simple analysis that is by no means intended to be a statistically significant study. I fully realize that these results may be skewed by my Portland IP address (even though I geo-located each time I searched for each market), data center, time of day, etc.


I'll share with you some interim takeaways that I found interesting, though, as I work on a more complete version with Dr. Pete over the winter.


1. Search results in search results have made a comeback in a big way


If anything, Hummingbird or the November 14th update seem to have accelerated the trend that started with the Venice update: more and more localized organic results for generic (un-geo-modified) keywords.


But the winners of this update haven't necessarily been small businesses. Google is now returning specific metro-level pages from national directories like Yelp, TripAdvisor, Findlaw, and others for these generic keywords.


This trend is even more pronounced for keywords that do include geo-modifiers, as the example below for "pet store portland" demonstrates.



Results like the one above call into question Google's longstanding practice of minimizing the frequency with which these pages occur in its search results. While the Yelp example above is one of the more blatant instances that I came across, plenty of directories (including WeddingWire, below) are benefitting from similar algorithmic behavior. In many cases the pages that are ranking are content-thin directory pages—exactly the kind of content whose visibility Panda, and to some extent Penguin, were supposed to minimize.



Overall, national directories were the most frequently-occurring type of organic result for the phrases I looked at—a performance amplified when considering geo-modified keywords alone.



National brands are underrepresented as a result type due to the 'personal injury lawyer,' 'electrician,' and 'wedding photographer' keyword choices. For the keywords where relevant national brands exist ('assisted living facility' and 'pet store'), they performed quite well.


2. Well-optimized regional-vertical directories accompanied by content still perform well


While a number of thriving directories were wiped out by the initial Panda update, here's an area where the Penguin and Hummingbird updates have been effective. There are plenty of examples of high-quality regionally focused content rewarded with a first-page position—in some cases above the fold. I don't remember seeing as many of these kinds of sites over the last 18 months as I do now.


Especially if keywords these sites are targeting return carousels instead of packs, there's still plenty of opportunity to rank: in my limited sample, an average of 2.3 first-page results below carousels were for regional directory-style sites.




3. There's little-to-no blending going on in local search anymore


While Mike Blumenthal and Darren Shaw have theorized that the organic algorithm still carries weight in terms of ranking Place results, visually, authorship has been separated from place in post-Hummingbird SERPs.


Numerous "lucky" small businesses (read: well-optimized small businesses) earned both organic and map results across all industries and geographies I looked at.



4. When it comes to packs, position 4 is the new 1


The overwhelming majority of packs seem to be displaying in position 4 these days, especially for "generic" local intent searches. Geo-modified searches seem slightly more likely to show packs in position #1, which makes sense since the local intent is explicitly stronger for those searches.



Together with point #3 in this post, this is yet another factor that is helping national and regional directories compete in local results where they couldn't before—additional spots appear to have opened up above the fold, with authorship-enabled small business sites typically shown below rather than above or inside the pack. 82% of the searches in my little mini-experiment returned a national directory in the top three organic results.



5. The number of pack results seems now more dependent on industry than geography


This is REALLY hypothetical, but prior to this summer, the number of Place-related results on a page (whether blended or in packs) seemed to depend largely on the quality of Google's structured local business data in a given geographic area. The more Place-related signals Google had about businesses in a given region, and the more confidence Google had in those signals, the more local results they'd show on a page. In smaller metro areas for example, it was commonplace to find 2- and 3-packs across a wide range of industries.


At least from this admittedly small sample size, Google increasingly seems to show a consistent number of pack results by industry, regardless of the size of the market.

Keyword                     # in Pack    Reason for Variance
assisted living facility    6.9          6-pack in Burlington
electrician                 6.9          6-pack in Portland
personal injury lawyer      6.4          Authoritative OneBox / Bug in Chicago
pet store                   3.0
wedding photographer        7.0

This change may have more to do with the advent of the carousel than with Hummingbird, however. Since the ranking of carousel results doesn't reliably differ from that of (former) packs, it stands to reason that visual display of all local results might now be controlled by a single back-end mechanism.


6. Small businesses are still missing a big opportunity with basic geographic keyword optimization


This is more of an observational bullet point than the others. While there were plenty of localized organic results featuring small business websites, these tended to rank lower than well-optimized national directories (like Yelp, Angie's List, Yellowpages.com, and others) for small-market geo-modified phrases (such as "electrician burlington").



For non-competitive phrases like this, even a simple website with no incoming links of note can rank on the first page (#7) just by including "Burlington, VT" in its homepage Title Tag. With just a little TLC—maybe a link to a contact page that says "contact our Burlington electricians"—sites like this one might be able to displace those national directories in positions 1-2-3.


7. The Barnacle SEO strategy is underutilized in a lot of industries


Look at the number of times Facebook and Yelp show up in last year's citation study I co-authored with Whitespark's Darren Shaw. Clearly these are major "fixed objects" to which small businesses should be attaching their exoskeletons.


Yet 74% of searches I conducted as part of this experiment returned no Barnacle results.



This result for "pet store chicago" is one of the few barnacles that I came across—and it's a darn good result! Not only is Liz (unintenionally?) leveraging the power of the Yelp domain, but she gets five schema'd stars right on the main Google SERP—which has to increase her clickthrough rate relative to her neighbors.



Interestingly, the club industry is one outlier where small businesses are making the most of power profiles. This might have been my favorite result—the surprisingly competitive "dance club flagstaff" where Jax is absolutely crushing it on Facebook despite no presence in the carousel.



What does all this mean?


I have to admit, I don't really know the answer to this question yet. Why would Google downgrade the visibility of its Place-related results just as the quality of its Places backend has finally come up to par in the last year? Why favor search-results-in-local-search-results, something Google has actively and successfully fought to keep out of other types of searches for ages? Why minimize the impact of authorship profiles just as they are starting to gain widespread adoption by small business owners and webmasters?


One possible reason might be in preparation for more card-style layouts on mobile phones and wearable technology. But why force these (I believe slightly inferior) results on users of desktop computers, and so far in advance of when cards will be the norm?


At any rate, here are five takeaways from my qualitative review of local results in the last couple of months.



  1. Reports of directories' demise have been greatly exaggerated. For whatever reason (?), Google seems to be giving directories a renewed lease on life. With packs overwhelmingly in the fourth position, they can now compete for above-the-fold visibility in positions 1-2-3, especially in smaller and mid-size metro areas.

  2. Less-successful horizontal directories (non-Yelps and TripAdvisors, e.g.) should consider the economics of their situation. Their ship has largely sailed in larger metro areas like Chicago and Portland. But they still have the opportunity to dominate smaller markets. I realize you probably can't charge a personal injury lawyer in Burlington what you charge his colleague in downtown Chicago. But, in terms of the lifetime value of who will actually get business from your advertising packages, the happy Burlington attorney probably exceeds the furious one from Chicago (if she is even able to stay in business through the end of her contract with you).

  3. The Barnacle opportunity is huge, for independent and national businesses alike. With Google's new weighting towards directories in organic results and the unblending of packs, barnacle listings present an opportunity for savvy businesses to earn three first-page positions for the same keyword—one pack listing, one web listing, and one (or more) barnacle listing.

  4. National brands who haven't taken my advice to put in a decent store locator yet should surely do so now. Well-structured regional pages, and easily-crawled store-level pages, can get great visibility pretty easily. (If you're a MozCon attendee or have purchased access, you can learn more about this advice in my MozCon 2013 presentation.)


  5. Andrew Shotland already said it in the last section of his Search Engine Land column, but regionally-focused sites—whether directories or businesses—should absolutely invest in great content. With Penguin and Hummingbird combined, thin-content websites of all sizes are having a harder time ranking relative to slightly thicker content directories.




Well, that's my take on what's happening in local search these days...is the Moz community seeing the same things? Do you think the quality of local results has improved or declined since Hummingbird? Have you perceived a shift since November 14th? I'd be particularly interested to hear comments from SEOs in non-U.S. markets, as I don't get the chance to dive into those results nearly as often as I'd like.







from Moz Blog http://moz.com/blog/local-hummingbird-results

How to Build Your Own Mass Keyword Difficulty Tool

Posted by MartinMacDonald


Despite keywords being slightly out of fashion, thanks to the whole (not provided) debacle, it remains the case that a large part of an SEO's work revolves around discovering opportunity and filling that same opportunity with content to rank.


When you are focusing on smaller groups of terms, there are plenty of tools to help; the Moz Keyword Difficulty Tool being a great example.



These tools function by checking the top results for a given keyword, and looking at various strength metrics to give you a snapshot as to how tough they are to rank for.


The problem is, though, that these tools operate on the fly, and generally only allow you to search for a small amount of keywords at any one time. The Moz tool, for instance, limits you to 20 keywords.


But I need to check 100,000 keywords!


By the end of this tutorial you will be able to visualize keyword difficulty data in a couple of ways, either by keyword:



Or by keyword type:



Or by category of keyword, spliced by specific position in the results:



So what do we need to do?


All keyword difficulty tools work in the same way when you break them down.


They look at ranking factors for each result in a keyword set, and sort them. It's that simple.
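Stripped to its essence, that logic can be sketched in a few lines of Python; the helper functions here are placeholders for whatever result source and strength metric you end up using:

# The generic shape of any keyword difficulty tool: fetch the current top
# results for each keyword, score each result's strength, and aggregate.
# get_top_results() and strength_metric() are placeholders you supply.
def keyword_difficulty(keyword, get_top_results, strength_metric, depth=10):
    """Average strength of the pages currently ranking for a keyword."""
    results = get_top_results(keyword, depth)           # e.g., from a rank tracker export
    scores = [strength_metric(url) for url in results]  # e.g., Citation Flow or Domain Authority
    return sum(scores) / len(scores) if scores else 0.0

def rank_opportunities(keywords, get_top_results, strength_metric):
    """Sort keywords from easiest-looking to hardest-looking."""
    scored = {kw: keyword_difficulty(kw, get_top_results, strength_metric) for kw in keywords}
    return sorted(scored.items(), key=lambda item: item[1])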


The only thing we need to do is work out how to perform each step at scale:



Step 1: Get URLs


My preference for scraping Google is using Advanced Web Ranking to get the ranking results for large sets of keywords.


Quite a few companies offer software for this service (including Moz), but the problem with this approach is that costs spiral out of control when you are looking at hundreds of thousands of keywords.


Once you have added your keyword set, run a ranking report of the top 10 results for the search engine of your choice. Once it's complete you should see a screen something like this:



The next step is to get this data out of Advanced Web Ranking and into Excel, using a "Top Sites" report, in CSV format (The format is important! If you choose any other format it makes manipulating the data much tougher):



This presents us with a list of keywords, positions, and result URLs:



So now we can start harvesting some SEO data on each one of those results!


My preference is to use the fantastic Niels Bosma Excel plugin and the MajesticSEO API to access their Citation Flow metric.


Equally, though, you could use the SEOgadget Excel tool alongside the Moz API. I haven't tested that thoroughly enough, but it should give you pretty similar results if you are more used to using them.


Step 2: Analyze results


Now that we have a nice result set of the top 10 results for your keyword list, it's time to start pulling in SEO metrics for each of those to build some actionable data!


My preference is to use the Niels Bosma Excel plugin, as it's super easy and quick to pull the data you need directly into Excel, where you can start analyzing the information and building charts.


If you haven't already done so, you should start by downloading and installing the plugin available here (note: It's for Windows only, so if you are a Mac user like me, you'll need to use Parallels or another virtual machine).



In the column adjacent to your list of URLs you simply need to use the formula:


=MajesticSEOIndexItemInfo(C2,"CitationFlow","fresh",TRUE)

This formula gives you the CitationFlow number for the URL in cell C2. Obviously, if your sheet is formatted differently, then you'll need to update the cell reference number.


Once you see the CitationFlow value appear in that cell, just copy it down to fill the entire list. If you have lots of keywords, now would be a great time to go grab a coffee, as it can take some time depending on your connection and the number of results you want.


Now you should be looking at a list something like this:



This allows us to start doing some pretty incredible keyword research!
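If Excel starts to creak at six-figure keyword counts, the same lookups can be scripted directly against Majestic's GetIndexItemInfo command. Treat the sketch below as an untested outline: the endpoint, parameter names, and response structure are my assumptions and should be checked against Majestic's current API documentation:

# Sketch only: batch Citation Flow lookups via Majestic's API.
# The endpoint, parameters (cmd, items, itemN, datasource, app_api_key)
# and the response layout are assumptions -- verify against the API docs.
import requests

API_KEY = "YOUR_MAJESTIC_API_KEY"
ENDPOINT = "https://api.majestic.com/api/json"

def citation_flow(urls):
    params = {"app_api_key": API_KEY, "cmd": "GetIndexItemInfo",
              "datasource": "fresh", "items": len(urls)}
    for i, url in enumerate(urls):
        params[f"item{i}"] = url
    data = requests.get(ENDPOINT, params=params, timeout=30).json()
    rows = data["DataTables"]["Results"]["Data"]  # assumed response shape
    return {row["Item"]: row["CitationFlow"] for row in rows}

print(citation_flow(["http://www.example.com/", "http://www.example.org/"]))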


Step 3: Find opportunity


The first thing that you probably want to do is look at individual keywords and find the ranking opportunity in those. This is trivially easy to do as long as you are familiar with Excel pivot tables.


For a simple look, just create a pivot of the average citation score for each keyword; the resulting table creator wizard will look something like this:



Of course, you can now visualize the data just by creating a simple chart. If we apply the above data to a standard bar chart, you will begin to see the kind of actionable data we can build:



This is just the beginning, though! If you create a pivot chart across a large dataset and look at the average citation score for each position, you can see interesting patterns develop.


This example is looking at a dataset of 52,000 keywords, and taking the average score of each site appearing in each position in the top 10 results:



As you can see, across a large dataset there is a really nice degradation of strength in the top 10 results, a real vindication that the data we are looking at is rational and is a good indicator of how strong you need to be to rank a given page (providing the content is sufficient and focused enough).
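If you would rather run these pivots outside Excel, the equivalents are a couple of groupby calls in pandas. This assumes you have exported your sheet to CSV with keyword, position, URL, and CitationFlow columns; the file and column names are placeholders to match to your own export:

# Equivalent of the Excel pivots in pandas. File and column names are
# assumptions -- adjust them to match your exported sheet.
import pandas as pd

df = pd.read_csv("keyword_rankings_with_citationflow.csv")

# Average strength of the current top 10 for each keyword (difficulty per keyword)
difficulty_by_keyword = df.groupby("keyword")["CitationFlow"].mean().sort_values()

# Average strength at each ranking position (the degradation curve above)
strength_by_position = df.groupby("position")["CitationFlow"].mean()

print(difficulty_by_keyword.head(10))  # easiest-looking keywords first
print(strength_by_position)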


You really want to splice the data into categories at this stage, to identify the areas of quickest opportunity and focus on building content and links towards the areas where you are likely to earn traffic.


The chart below compares three categories of keywords, sorted by the average Citation Flow of the results in each category:



From this we can see that of the three keyword categories, we are likely to rank higher up for keywords in the "brown widgets" category. Having said that, though, we are also able to rank lower down the page in the "blue widgets" category, so if that has significantly more traffic it might prove a better investment of your time and energy.


There you go!


We have created a homebrew keyword difficulty tool, capable of analyzing hundreds of thousands of URLs to mine for opportunity and guide your content and linkbuilding strategies!


There is so much you can do with this data if you put your mind to it.


True, scraping Google's results is, strictly speaking, against their Terms of Service, but they have a habit of using our data, so let's turn the tables on them for a change!










from Moz Blog http://moz.com/blog/build-your-own-mass-keyword-difficulty-tool

Panic Stations! A Case Study on How to Handle an Important Page Disappearing from Google

Posted by steviephil




Picture the scene...



You wake up, grab a shower, throw on your Moz t-shirt (and other clothes, presumably...), boil the kettle, pour yourself a cup of coffee, switch on the ol' laptop, let your daily rank checks complete and then slowly run through them one by one...


...Yep...


...Yep...


...Ooo, that's nice...


...Uh-huh...


...Yes! Great jump there!...


...Yep...


...Ye- Wait, hold on... What? Lots of red, all across the board? Rankings have either dropped multiple pages or dropped out of the top 100 results entirely?!


Uh-oh. It's gonna be a looong day....


This happened to me recently with one of my clients. Their homepage - their main page as far as rankings were concerned - had mysteriously vanished from Google's index overnight, taking with it a lot of page one rankings, as you can see from the world's saddest and perhaps most unnecessary GIF image below:



This was also the first time that it'd happened to me. Granted, I've consulted on this type of thing before, but usually after it had happened to someone else and they approached me asking what had gone wrong. This was the first time I was discovering it for myself, while it was happening under my watch and affecting one of my clients.


This post runs through the steps that I took to resolve the issue. I acted methodically yet swiftly, and in doing so managed to get the homepage back in Google's index (and - with it - its former rankings) in less than 12 hours.


I accept that this is one of those articles you probably won't need until it happens to you. To be honest, I was in that exact situation - I pretty much knew what to do, but I was still a bit like "OMG OMG OMG, whowhatwherewhenwhy?!" and wanted an article to double-check that I was doing everything I could be doing and wasn't overlooking anything obvious.


So... Are you ready? Here we go!


Check if it's just that page or all pages on the site


I primarily use Link Assistant's Rank Tracker (with Trusted Proxies) for my rank checking needs, with Moz PRO's rank checking as a backup and second opinion. Rank Tracker offers a 'URL Found' column, which revealed something to me instantly: other pages were still ranking, just not the homepage. Additionally, where a ranking had dropped a few pages (but was still within the top 10 pages/100 results), a different page was ranking instead - in my client's case, pages like Services, Testimonials and Contact.


This suggested to me that it was just the homepage that was affected - but there was still a way that I could find out to be sure...


Use the 'site:' operator to check if it's still in Google's index


My next step was to use Google's 'site:' operator (see #1 here) on the domain, to see whether the homepage was still in Google's index. It wasn't - but all of the site's other pages were. Phew... Well at least it wasn't site-wide!


Even though I had a feeling that this would be the case based on what Rank Tracker was saying, it was still important to check, just in case the homepage was still ranking but had been devalued for whatever reason.


Now that I knew for sure that the homepage was gone from Google, it was time to start investigating what the actual cause might be...


Check 1) Accidental noindexing via the meta noindex tag


In my experience, this is usually what's responsible when something like this happens... Given that the option to noindex a page is often a tick-box in most CMS systems these days, it's easy enough to do by accident. In fact, one of the times I looked into this issue for someone, this was the cause - I just told them to untick the box in WordPress.


In order to check, bring up the page's source code and look for this line (or something similar):



<meta name="robots" content="noindex">

(Hit Ctrl + F and search for "noindex" if it's easier/quicker.)


If you find this code in the source, then chances are that this is responsible. If it's not there, onto the next step...
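(If you'd rather script this check than view the source by hand - say, across a batch of pages - here's a minimal sketch in Python, standard library only. The URL is a placeholder for your affected page, and the regex is deliberately rough, so treat it as a quick sanity check rather than a proper HTML parser.)

# Fetch the page and look for a robots meta tag containing "noindex".
import re
import urllib.request

url = "https://www.example.com/"  # placeholder - use the affected page
html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")

# Grab any <meta ... name="robots" ...> tags (rough regex, fine for a quick check)
meta_robots = re.findall(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, flags=re.IGNORECASE)

if any("noindex" in tag.lower() for tag in meta_robots):
    print("Found a noindex directive in a robots meta tag - likely culprit!")
else:
    print("No meta noindex found - on to the next check.")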


Check 2) Accidental inclusion in the site's robots.txt file


It seems to be a somewhat common myth that robots.txt can noindex a page - it actually tells search engines not to crawl a page, so blocking a page there only keeps it out of Google's index if it was never indexed in the first place (e.g. on a brand new site). Here's more info if you're interested.


To be honest though, given what had happened, I didn't want to assume that this wasn't the cause and therefore I thought it would be best just to check anyway.
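Eyeballing the file is usually enough, but if you want to be certain what it means for Googlebot, Python's standard library can answer the question directly - a quick sketch, with example.com standing in for the real domain:

import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")  # placeholder domain
rp.read()

# Would Googlebot be allowed to crawl the homepage under the current rules?
if rp.can_fetch("Googlebot", "https://www.example.com/"):
    print("robots.txt is not blocking Googlebot from the homepage.")
else:
    print("robots.txt is blocking Googlebot - worth a much closer look.")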


But alas... The site's robots.txt file hadn't changed one iota. Onto step 3...


Check 3) Penalty checks



Given that this was my client, I was already familiar with the site's history, and I was fairly confident that a penalty wasn't behind it. But again, I wanted to do my due diligence - and you know what they say when you assume...!


I jumped into Google Webmaster Tools and looked at the recently added Manual Actions tab. Unsurprisingly: "No manual webspam actions found." Good good.


However, let's not rule out algorithmic penalties, which Google doesn't tell you about (and oh lordy, that's caused some confusion). As far as Pandas were concerned, there was no evidence of accidental or deliberate duplicate content, either on the site or elsewhere on the web. As for those dastardly Penguins, given that I'm the first SEO ever to work on the site and I don't build keyword-rich anchor text links for my clients, the site has never had any keyword anchor text pointing at it, let alone enough to set off alarm bells.


Following these checks, I was confident that a penalty wasn't responsible.


Check 4) Remove URLs feature in Google Webmaster Tools


Another check while you're in your Webmaster Tools account: go to Google Index > Remove URLs and check that the page hasn't been added as a removal request (whether by accident or on purpose). You never know... It's always best to check.


Nope... "No URL removal requests" in this case.


It was at this point that I started to think: "What the hell else could it be?!"


Check 5) Accidental 404 code



On the day that this happened, I met up with my good friends and fellow SEOs Andrew Isidoro (@Andrew_Isidoro) and Ceri Harris of Willows Finance for a drink and a bite to eat down the pub. I ran the whole story by them, along with what I'd done so far, and Andrew suggested something that I hadn't considered: although extremely unlikely, what if the homepage was now returning a 404 (Not Found) code instead of a 200 (OK)? Even if the page is live and performing normally (to the visitor), a 404 code would tell Google that the page "don't live here no more" (to quote the mighty Hendrix), and Google would remove it accordingly.


Again, it was worth checking, so I ran it past SEO Book's HTTP header checker tool. The verdict: 200 code. It was a-OK (pun fully intended - it's a good thing that I'm an SEO and not a comedian...)
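If you don't have a header checker to hand, the status code is also only a few lines of Python away - a small sketch (example.com is a placeholder; swap the HEAD request for a plain GET if the server handles HEAD oddly):

import urllib.error
import urllib.request

# Ask for the headers only and report the HTTP status code
req = urllib.request.Request("https://www.example.com/", method="HEAD")
try:
    with urllib.request.urlopen(req) as resp:
        print("Status:", resp.status)  # 200 = a-OK
except urllib.error.HTTPError as e:
    print("Status:", e.code)  # e.g. a 404 or a 5xx would be a red flag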


Ok, so now what?


Testing the page in Google Webmaster Tools


Now it was time to ask the big boss Googly McSearchengineface directly: what do you make of the page, oh mighty one?


In order to do this, go to Google Webmaster Tools, click on the site in question and select Crawl > Fetch as Google from the side-menu. You should see a screen like this:


Fetch as Google screenshot 1


Simply put the affected page(s) into it (or leave it blank if it's the homepage) and see what Google makes of them. If the fetch comes back as "Failed," the reason given should help you work out what might be wrong...


Asking Google to (re)index the page


Once you have done the above in GWT, you're given this option if Google can successfully fetch the page:


Fetch as Google screenshot 2


I decided to do just that: ask Google to (re)submit the page to its index.


At this point I was confident that I had done pretty much everything in my power to investigate and subsequently rectify the situation. It was now time to break the news, by which I mean: tell the client...


Inform the client



I thought it best to tell the client after doing all of the above (except for the 404 check, which I actually did later on), even though it was possible that the page might recover almost immediately (which, in the end, it pretty much did). Plus, I wanted to be seen as proactive rather than reactive - I wanted to be the one to tell him, rather than have him find out for himself and ask me about it...


Here's the email that I sent:


Hi [name removed],

I just wanted to bring your attention to something.

I conduct daily rank checks just to see how your site is performing on Google on a day-to-day basis, and I've noticed that your homepage has disappeared from Google.

Usually this is the result of a) accidental de-indexation or b) a penalty, but I have checked the usual suspects/causes and I see no sign of either of those occurring.

I have checked in your Webmaster Tools account and Google can successfully read/crawl the page, so no problems there. I have taken appropriate steps to ask Google to re-index the page.

I've done all that I can for now, but if we do not see everything back to normal in the next couple of days, I will continue to research the issue further. It's likely the case that it will recover of its own accord very soon. Like I say, I've checked the usual signs/causes of such an issue and it doesn't appear to be the result of any of those.

Just to check, have you or your web designer made any changes to the website in the last couple of days/weeks? If so, could you please let me know what you have done?

I know it's not an ideal situation, but I hope you can appreciate that I spotted the issue almost immediately and have taken steps to sort it out.

If you have any questions about it then please do let me know. In the meantime I will keep a close eye on it and keep you posted with any developments.


(Note: In this instance, my client prefers email contact. You may find that a phone call may be better suited, especially given the severity of the situation - I guess it will be a judgement call depending on the relationship that you have with your client and what they'd prefer, etc.)


He took it well. He hadn't noticed the drop himself, but he appreciated me notifying him, filling him in on the situation and explaining what action I had taken to resolve the issue.


* Recovery! *


Later that evening, I did another quick check. To my surprise, the homepage was not only back in Google, but the rankings were pretty much back to where they had been. PHEW!


I say "surprised" not because of my ability to pull it off, but with how quickly it'd happened - I expected that it might've taken a few days maybe, but not a mere few hours. Oh well, mustn't complain...!



The real (possible) cause...


So what did cause the deindexation? Well, it was another suggestion from Andrew while we were down the pub, and something I'd stupidly overlooked: downtime!


It could've been an unfortunate and unlucky coincidence that Google happened to re-crawl the page exactly when the site had gone down.


I hadn't added the site to my Pingdom account before all of this happened (something I have since rectified), so I couldn't know for sure. However, the site went down again a day or so later, which made me wonder if downtime was responsible after all... Even so, I advised the client that if this was a common occurrence, he should consider switching to a more reliable hosting provider, to reduce the chance of this happening all over again...
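If you can't get proper monitoring set up straight away, even a crude script that pings the homepage every few minutes will tell you whether downtime is a recurring theme - a rough sketch (example.com is a placeholder; adjust the interval to taste):

import time
import urllib.request

URL = "https://www.example.com/"  # placeholder
CHECK_EVERY_SECONDS = 300  # five minutes

while True:
    timestamp = time.strftime("%Y-%m-%d %H:%M:%S")
    try:
        with urllib.request.urlopen(URL, timeout=10) as resp:
            print(timestamp, "UP", resp.status)
    except OSError as e:  # covers URLError, HTTPError and timeouts
        print(timestamp, "DOWN", e)
    time.sleep(CHECK_EVERY_SECONDS)

A proper service like Pingdom will do this far better, of course, but even a simple log like this is enough to show a pattern to a client or a hosting provider.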


Preparing yourself for when it happens to you or your clients


To make sure that you're fully on top of a situation like this, carry out daily rank checks and review them promptly, even if it's just a quick once-over to confirm that nothing drastic has happened in the last 24 hours. It's safe to say that if I hadn't done so, I might not have realised what had happened for days, and therefore might not have rectified the situation for days, either.


Also, having a 'URL Found' column in addition to 'Ranking Position' in your rank checking tool of choice is an absolute must - that way, if different pages suddenly become the highest-ranking ones, you can see at a glance exactly which page has been affected.


Anyway, I hope that this case study/guide has been useful, whether you're reading it to brush up ready for when the worst happens, or whether the worst is happening to you right now (in which case I feel for you, my friend - be strong)...!


Also, if you'd do anything differently to what I did or you think that I've missed a pivotal step or check, please let me know in the comments below!


Did you like the comic drawings? If so, check out Age of Revolution, a new comic launched by Huw (@big_huw) & Hannah (@SpannerX23). Check them out on Facebook, Twitter and Ukondisplay.com (where you can pick up a copy of their first issue). Their main site - Cosmic Anvil - is coming soon... I'd like to say a massive thanks to them for providing the drawings for this post, which are simply and absolutely awesome, I'm sure you'll agree!



