Wednesday, February 5, 2020
Want to understand SEO impact of rel="noreferrer noopener" tag
from Google SEO News and Discussion WebmasterWorld https://ift.tt/398TbnJ
Position Zero Is Dead; Long Live Position Zero
Posted by Dr-Pete
In 2014, Google introduced the featured snippet, a promoted organic ranking that we affectionately (some days were more affectionate than others) referred to as "position zero" or "ranking #0." One of the benefits to being in position zero was that you got to double-dip, with your organic listing appearing in both the featured snippet and page-1 results (usually in the top 3–4). On January 23, Google announced a significant change (which rolled out globally on January 22) ...
"Declutters" sounds innocuous, but the impact to how we think about featured snippets and organic rankings is significant. So, let's dig deep into some examples and the implications for SEO.
What does this mean for Moz?
First, a product announcement. In the past, we treated Featured Snippets as stand-alone SERP features — they were identified in our "SERP Features" report but were not treated as organic due to the second listing. As of Saturday, January 25 (shout-out to many of our team for putting in a long weekend), we began rolling out data that treats the featured snippet as position #1. SERPs with featured snippets will continue to be tagged in SERP Features reporting, and we're working on ways to surface more data.
Here's a partial screenshot of our "SERP Features" report from one of my own experiments ...
At a glance, you can see which keywords displayed a featured snippet (the scissor icon), owned that featured snippet (highlighted in blue), as well as your organic ranking for those keywords. We're working on bringing more of this data into the Rankings report in the near future.
If you're a Moz Pro customer and would like to see this in action, you can jump directly to your SERP Features report using the button below (please let us know what you think about the update):
This change brings our data in line with Google's view that a featured snippet is a promoted organic result and also better aligns us with Google Search Console data. Hopefully, it also helps provide customers with more context about their featured snippets as organic entities.
How does Google count to 10?
Let's take a deeper look at the before and after of this change. Here are the desktop organic results (left-column only) from a search for "LCD vs LED" on January 21st ...
Pardon some big images, but I promise there's method to my madness. In the "before" screenshot above, we can clearly see that the featured snippet URL is duplicated as the #1 organic result (note: I've added the green box and removed a People Also Ask box). Ranking #1 wasn't always the case prior to January 22nd, but most featured snippet URLs appeared in the #1–#3 organic positions, and all of them came from page-one results.
Here's the same SERP from January 23rd ...
You can see that not only is the featured snippet URL missing from the #1 position, but it doesn't appear on page one at all. There's more to this puzzle, though. Look at the January 21st SERP again, but numbered ...
Notice that, even with the featured snippet, page one displays 10 full organic results. This was part of our rationale for treating the featured snippet as the #0 position and a special case, even though it came from organic results. We also debated whether duplicating data in rankings reports added value for customers or just created confusion.
Now, look at the numbered SERP from January 23rd ...
The duplicate URL hasn't been replaced — it's been removed entirely. So, we're only left with 10 total results, including the featured snippet itself. If we started with #0, we'd be left with a page-one SERP that goes from #0–#9.
What about double snippets?
In rare cases, Google may show two featured snippets in a row. If you haven't seen one of these in action, here's an example for the search "Irish names" from January 21st ...
I've highlighted the organic URLs to show that, prior to the update, both featured snippet URLs appeared on page one. A quick count will also show you that there are 10 traditional organic listings and 12 total listings (counting the two featured snippets).
Here's that same SERP from January 23rd, which I've numbered ...
In this case, both featured snippet URLs have been removed from the traditional organic listings, and we're left once again with 10 total page-one results. We see the same pattern with SERP features (such as Top Stories or Video carousels) that occupy an organic position. Whatever the combination in play, the featured snippet appears to count as one of the 10 results on page one after January 22nd.
What about right-hand side panels?
More recently, Google introduced a hybrid desktop result that looks like a Knowledge Panel but pulls information from organic results, like a Featured Snippet. Here's an example from January 21st (just the panel) ...
In the left-hand column, the same Wordstream URL ranked #3 in organic results (I've truncated the image below to save your scrolling finger) ...
After January 22nd, this URL was also treated as a duplicate, which was met with considerable public outcry. Unlike the prominent Featured Snippet placement, many people (myself included) felt that the panel-style UI was confusing and very likely to reduce click-through rate (CTR). In a fairly rare occurrence, Google backtracked on this decision ...
Our data set showed the reversal kicking in on January 29th (a week after the initial change). Currently, while some featured snippets are still displayed in right-hand panels (about 30% of all featured snippets across MozCast's 10,000 keywords), those URLs once again appear in the organic listings.
Note that Google has said this is a multi-part project, and they're likely going to be moving these featured snippets back to the left-hand column in the near future. We don't currently know if that means they'll become traditional featured snippets or if they'll evolve into a new entity.
How do I block featured snippets?
Cool your jets, Starscream. Almost the moment Google announced this change, SEOs started talking about how to block featured snippets, including some folks asking publicly about de-optimizing content. "De-optimizing" sounds harmless, but it's really a euphemism for making your own content worse so that it ranks lower. In other words, you're going to take a CTR hit (the organic CTR curve drops off quickly as a power function) to avoid possibly taking a CTR hit. As Ford Prefect wisely said: "There's no point in driving yourself mad trying to stop yourself going mad. You might just as well give in and save your sanity for later."
More importantly, there are better options. The oldest currently available option is the meta-nosnippet directive. I'd generally consider this a last resort — as a recent experiment by Claire Carlile re-affirms, meta-nosnippet blocks all snippets/descriptions, including your organic snippet.
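For reference, the nosnippet directive is simply a robots meta tag in the page's head; a generic example looks like this ...
<meta name="robots" content="nosnippet">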
As of 2019, we have two more options to work with. The meta-max-snippet directive limits the character length of search snippets (both featured snippets and organic snippets). It looks something like this ...
<meta name="robots" content="max-snippet:50">
Setting the max-snippet value to zero should function essentially the same as a nosnippet directive. However, by playing with intermediate values, you might be able to maintain your organic snippet while controlling or removing the featured snippet.
Another relatively new option is the data-nosnippet HTML attribute. This is a tag attribute that you can wrap around content you wish to block from snippets. It looks something like this ...
<span data-nosnippet>I will take this content to the grave!</span>
Ok, that was probably melodramatic, but the data-nosnippet attribute can be wrapped around specific content that you'd like to keep out of snippets (again, this impacts all snippets). This could be very useful if you've got information appearing from the wrong part of a page or even a snippet that just doesn't answer the question very well. Of course, keep in mind that Google could simply select another part of your page for the featured snippet.
One thing to keep in mind: in some cases, Featured snippet content drives voice answers. Danny Sullivan at Google confirmed that, if you block your snippets using one of the methods above, you also block your eligibility for voice answers ...
A featured snippet isn't guaranteed to drive voice answers (there are a few more layers to the Google Assistant algorithms), but if you're interested in ranking for voice, then you may want to proceed with caution. Also keep in mind that there's no position #2 in voice search.
How much should I freak out?
We expect these changes are here to stay, at least for a while, but we know very little about the impact of featured snippets on CTR after January 22nd. In early 2018, Moz did a major, internal CTR study and found the impact of featured snippets almost impossible to interpret, because the available data (whether click-stream or Google Search Console) provided no way to tell if clicks were going to the featured snippet or the duplicated organic URL.
My hunch, informed by that project, is that there are two realities. In one case, featured snippets definitively answer a question and negatively impact CTR. If a concise, self-contained answer is possible, expect some people not to click on the URL. You've given them what they need.
In the other case, though, a featured snippet acts as an incomplete teaser, naturally encouraging clicks (if the information is worthwhile). Consider this featured snippet for "science fair ideas" ...
The "More items..." indicator clearly suggests that this is just part of a much longer list, and I can tell you from my as a parent that I wouldn't stop at the featured snippet. Lists and instructional content are especially well-suited to this kind of teaser experience, as are questions that can't be answered easily in a paragraph.
All of this is to say that I wouldn't take a hatchet to your featured snippets. Answering the questions your visitors ask is a good thing, generally, and drives search visibility. As we learn more about the impact on CTR, it makes sense to be more strategic, but featured snippets are organic opportunities that are here to stay.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from Moz Blog https://ift.tt/2GYgwfX
Some questions about ranking erosion and crawl budget
from Google SEO News and Discussion WebmasterWorld https://ift.tt/31sJjmf
Anyone using 303 redirections to retain inbound link juice
from Google SEO News and Discussion WebmasterWorld https://ift.tt/2ttphve
Very Strange Issue after site 301 redirection
from Google SEO News and Discussion WebmasterWorld https://ift.tt/2v8qwQR
Pay Attention to These SEO Trends in 2020 and Beyond
Posted by Suganthan-Mohanadasan
Without a doubt, it is our job as SEOs to keep an eye on the future and anticipate what Google is planning, testing, or looking to drop on our doorsteps. Over the past 12 months alone, we have seen several changes in Google Search — each impacting how we plan, implement, and report on campaigns.
In this article, I will take a look at what is in store for SEO in 2020 and how these factors will change the way we formulate strategies throughout the next year and beyond.
Artificial intelligence will continue to evolve
Over the past half-decade, artificial intelligence has become a pioneering force in the evolution of SEO.
In 2015, for example, we were introduced to RankBrain, the machine learning-based search algorithm that helps Google push more relevant results to users. Although RankBrain is coming up on its fifth birthday, we are only now catching early glimpses into how artificial intelligence will dominate SEO in the coming years.
The most recent step in this progression of artificial learning is, of course, the introduction of Bidirectional Encoder Representations from Transformers (BERT), which Google announced at the end of October. For those who missed it, BERT is Google’s neural network-based technique for natural language processing, and it’s important because it deals with the very fundamentals of how people search. Google itself says that the algorithm represents “the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search.”
Affecting one in ten searches, BERT gives Google a better understanding of how language is used and helps it comprehend the context of individual words within searches. The important thing to know about BERT (and also RankBrain) is that you cannot optimize for it.
There's nothing to optimize for with BERT, nor anything for anyone to be rethinking. The fundamentals of us seeking to reward great content remain unchanged.
— Danny Sullivan (@dannysullivan) October 28, 2019
But what does this mean for SEOs?
BERT is just one signal of how Google understands language, but it is one of the most important in the search engine’s arsenal. This means that now more than ever, webmasters and SEOs alike must focus their efforts on creating the most useful, natural, and highest-quality content. Quite simply, as Danny Sullivan says, “write content for users.”
It’s also worth understanding how BERT interprets questions, which you can find out more about in the Whiteboard Friday episode below.
Voice search is here to stay
It’s hard to imagine at the dawn of 2020, but when voice search was released in 2012, many assumed it would be just another project consigned to the ever-growing Google graveyard.
Today, however, we know so much more about the technology and, thanks to schema.org, where it is likely to go in the future. The adoption rate is slower than predicted, but it has nevertheless leaked into our lives, so we must not completely ignore voice search.
Schema markup
A new form of markup is released nearly every month, with one of the latest developments being markup for movies. Although this might seem insignificant, the fact that we are now seeing markup for films shows just how granular and far-reaching structured data has become.
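To give a rough sense of what that involves, here is a minimal, hypothetical sketch of movie markup as JSON-LD using schema.org's Movie type (the title, URL, dates, and names are placeholders, not a complete or canonical implementation) ...
<!-- Illustrative example only; all values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Movie",
  "name": "Example Film",
  "image": "https://example.com/example-film-poster.jpg",
  "dateCreated": "2019-11-01",
  "director": {
    "@type": "Person",
    "name": "Jane Example"
  }
}
</script>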
With smart speakers now numbering 120 million in the US alone, webmasters should be taking the time to investigate where schema can be placed on their websites so they can take advantage of the 35.6 million voice search commands taking place every month. What’s more, website markup has a monumental influence on featured snippets, which can be highly lucrative for any website. Take a look at this Moz guide for more information on how voice search influences featured snippets.
Speakable
If you’re in the US, it’s also worth noting that Speakable (BETA) is used by Google Assistant to answer people’s questions about specific topics. The assistant can return up to three relevant articles and provide audio playback using text-to-speech markup. Implementing this markup can be highly lucrative for news sites, because when the assistant provides an answer, it also attributes the source and sends the full article URL to the user's mobile device. If you’re a news site that publishes in English but doesn’t yet have Speakable markup implemented, you can read up on both the technical considerations and content requirements necessary for eligibility.
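As a rough, hypothetical sketch (the headline, URL, and CSS selectors below are placeholders), the speakable property uses a SpeakableSpecification to point the assistant at the parts of an article that are suitable for audio playback ...
<!-- Illustrative example only; selectors and URL are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "NewsArticle",
  "headline": "Example headline",
  "url": "https://example.com/news/example-article",
  "speakable": {
    "@type": "SpeakableSpecification",
    "cssSelector": [".article-headline", ".article-summary"]
  }
}
</script>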
Google Actions
Actions on Google, a development platform for Google Assistant, is also worth your consideration. It allows the third-party development of "actions" — applets for Google Assistant that provide extended functionality. Actions can be used to get things done by integrating your content and services with the Google Assistant.
Actions allow you to do a number of things:
- Build Actions to ensure Google Assistant uses your apps
- Allow users to search for and engage with your app
- Provide your app as a recommendation for user queries
Check out this fantastic article by Andrea Vopini about how to optimize your content using Google Assistant.
Google is heavily invested in using entities
Entities aren’t something that you hear SEOs talking about every day, but they are something Google is putting a lot of resources into. Put simply, Google itself states that entities are “a thing or concept that is singular, unique, well-defined, and distinguishable.”
Entities don’t have to be something physical, but can be something as vague as an idea or a color. As long as it is singular, unique, well-defined, and distinguishable, it is an entity.
As you can see, Moz shows up in the knowledge panel because the company is an entity. If you search the Google Knowledge Graph API for the company name, you can see how Google understands them:
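If you'd like to try that lookup yourself, the Knowledge Graph Search API accepts a simple GET request; the key value below is a placeholder for your own API key ...
https://kgsearch.googleapis.com/v1/entities:search?query=Moz&key=YOUR_API_KEY&limit=1&indent=true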
But what does this mean for SEOs?
In 2015, Google submitted a patent named “Ranking Search Results Based On Entity Metrics,” which is where the above entity description is sourced from. Although few patents are worth getting excited about, this one caused a stir in the technical SEO scene because it takes machine learning to an entirely new level and allows Google to accurately calculate the probability of user intent, thus giving it an understanding of both user language and tone. What’s more, entities reduce Google’s reliance on links as a ranking factor, and depending on what your SEO strategy is, that could result in the need for big campaign changes.
The most important aspect you will need to consider is how Google understands the entities on your website.
For example, if your site sells shoes, you need to think about how many different types, colors, sizes, brands, and concepts exist for your shoes. Each shoe will represent a different entity, which means you must consider how to frame each product so that it meets the expectations of users as well as the learning capabilities of Google — which is where we meet markup once again.
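To make that concrete, here is a minimal, hypothetical sketch of product markup for a single shoe using schema.org's Product type (all values are invented for illustration; in practice you would generate this per product, ideally from your product data) ...
<!-- Illustrative example only; values are placeholders -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Trail Running Shoe",
  "brand": {
    "@type": "Brand",
    "name": "ExampleBrand"
  },
  "color": "Blue",
  "description": "Lightweight trail running shoe with a lugged outsole.",
  "offers": {
    "@type": "Offer",
    "price": "89.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>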
Sites themselves can also become entities, and that provides huge rewards as they appear in the Knowledge Panel, which I will discuss next.
The knowledge panel will be important for personalities and brands
Although Google’s Knowledge Graph was launched way back in 2012, its expansion since then means it is still a core part of the search matrix and one that will reach far into the next decade.
Closely tied with featured snippets and rich results, earlier last year Google began allowing entities to claim their own knowledge panel, giving them access to edit and control the information presented to users in search results. They can make specific requests, such as changing the featured image, panel title, and social profiles provided within the panel.
The benefits of claiming your knowledge panel are numerous. It helps users gain quick access to your site and, thanks to the Knowledge Graph, displays trust and authority signals. Knowledge panels also provide brands and personalities with the ability to control what objective information is shown to users. However, there are still many brands that have yet to claim their own panels.
You can claim your business’s knowledge panel in a few easy steps:
- Ensure that your website is verified with Search Console.
- Update your panel by suggesting a change to Google.
But what does this mean for SEOs?
As you can see from the above examples, being in the Knowledge Graph can improve trust and add authenticity to your business or personal brand, as well as providing additional visibility. But it's easier said than done.
Unless you're a recognized, famous person or brand, claiming space in the Knowledge Graph is going to be difficult. Having a Wikipedia page can be enough, but I don't recommend creating pages just to get there — it will get deleted and waste your effort. Instead, build brand mentions and authority around your name gradually. While having a Wikidata page can be helpful, it’s not guaranteed. The goal is to get Google to recognize you as a notable person or brand.
Queryless proactive predictive search is getting better
Google Discover was released in June of 2017, prompting a new kind of search altogether — one that is queryless. Discover is an AI-driven content recommendation tool and claims 800 million active users.
Using the aforementioned Knowledge Graph, Google added an extra layer called the Topic Layer, which is engineered to understand how a user’s interest develops over time (this article by the University of Maryland offers an in-depth explanation of topic layers and models).
By understanding the many topics a user is interested in, Discover identifies the most accurate content to deliver from an array of websites.
But what does this mean for SEOs?
Google states that pages appear in Discover “if they are indexed by Google and meet Google News content policies. No special tags or structured data are required.” Discover ranks content based on an algorithm that inspects the quality of content alongside the interests of the user and the topic of the page in question. The exact formula is unknown; however, based on several studies and experiments, we now have a pretty good idea of how it works.
This screenshot from a presentation by Kenichi Suzuki highlights some of the factors that help pages appear in Discover.
According to Google, there are two ways to boost the performance of your content within Discover:
- Post interesting content
- Use high-quality images
As ever, ensure that you generate high-quality content that is unique and creates a great experience for users. If your site tends to publish clickbait articles, the chance of those articles appearing in Discover is low.
Other tips for appearing in Discover would be to arrange your content semantically so that Google finds it easier to understand your work, and ensure that your website is technically proficient.
Like any form of search, you can use Google Search Console to see how well your articles are performing in Discover. You can find Discover stats under the performance section.
Google Discover analytics data is fairly new, and therefore limited. There isn't currently a native way to segment this traffic inside Google Analytics, but this article provides a technique for tracking user behavior from Discover there.
We have yet to see the biggest changes in visual image search
It could be argued that the biggest change to image search happened in September 2018 when Google Lens rolled out. Not only did featured videos begin appearing in image search, but AMP stories and new ranking algorithms and tags were also released.
But while speaking at a Webmaster Meetup in New York last year, John Mueller shared that there will be major changes in image search in the coming year. Rather than merely viewing images, very soon people will use image search to accomplish goals, purchase products, and learn new information.
Google has always said that images should be properly optimized and marked, so if you have not started to add such data or information to your images, now is definitely the time to start.
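At a minimum, that means descriptive filenames and alt text; a simple, purely illustrative example might look like this ...
<img src="/images/blue-trail-running-shoe.jpg" alt="Blue trail running shoe with a lugged outsole on a rocky path">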
In the past six months alone we have seen Google introduce small changes, such as removing the “view image” function, as well as colossal changes, such as totally revamping image search for desktop.
Furthermore, people don’t even have to search within Image Search to see images anymore. It's common for the SERP to present a universal search result, which encompasses images, videos, maps, news, and shopping listings. The opportunity to appear in a universal (or blended) result is just another reason why properly tagged and marked images are so important.
Finally, Google has added visual image search attributes to search results. The interesting thing with this update is that these attributes are now available as image carousels within the main search results.
But what does this mean for SEOs?
With so much to play with, webmasters and SEOs should consider how they can take advantage of such changes, which could prove very lucrative for the right sites — especially when you consider that 35% of Google product searches turn into transactions in as little as five days.
E-A-T doesn’t apply to every site — but it still matters
E-A-T (Expertise, Authoritativeness, Trustworthiness) is something every SEO should know back to front, but remember:
- E-A-T isn’t a ranking factor
- E-A-T is critical for Your Money or Your Life (YMYL) topics and pages
Although these two statements might seem contradictory, they make more sense when you consider what Google defines as YMYL.
According to Google’s Rater Guidelines, YMYL is a page or topic that “could potentially impact a person’s future happiness, health, financial stability, or safety.” This means that if your page has information that could potentially change a person’s life, it is considered YMYL and offering E-A-T is important. If your site is merely your personal collection of cat pictures, then showcasing authority or expertise is less critical.
But what does this mean for SEOs?
The issue, however, is that the majority of websites (and certainly the ones invested in SEO) are generally going to have YMYL pages or topics, but Google is taking big steps to ensure that low quality or questionable YMYL content is weeded out. As you might know, you can’t optimize for E-A-T because it isn’t an algorithm, but you can implement changes to make sure your site sends the right kind of quality signals to Google. This Moz article by Ian Booth and this guide by Lily Ray offer great tips for how to do that.
Topics and semantics over keywords
Google is putting less priority on both links and keywords, which is where topic modeling and semantics come into the conversation.
Google has become very clever at understanding what a user is searching for based on just a few basic words. This is thanks, in part, to topic modeling (as Google itself admitted in September 2018 when it introduced its “topic layer”). Indeed, this algorithm has a deep understanding of semantics and yearns to provide users with deep troves of information.
But what does this mean for SEOs?
This means that it has never been more important to create high quality, in-depth, and meaningful content for users — but you also need to think about information structure.
For example, if your site sells running shoes, you could create long-form educational pieces about how to choose shoes for specific running environments, athletic diets for runners, or tech accessory reviews. These articles could then be clustered into various topics. By clustering your topics into compartments through your website architecture, both users and crawlers can easily navigate and understand the content provided.
Studies have also shown that Google’s crawlers prefer pages with semantic groupings and sites that are designed around topic modeling. This 2018 presentation by Dawn Anderson gives a brilliant insight into this. If you want to know more about topic modeling and semantic connectivity, check out this Whiteboard Friday by Rand Fishkin.
SERPs will continue to evolve
Over the past couple of years, we’ve seen search results evolve and transform like never before. In fact, they have changed so much that, in some cases, being placed first within the organic search results may not be the most lucrative position.
This is something that would have been unthinkable just a few short years ago (check out this Moz article from 2018, which works to quell the panic around zero-position SERPs).
With the introduction of Voice Search, rich results, rich snippets, knowledge panels, Google My Business, and updated Image Search results, SEOs now need to consider a whole new range of technical marketing strategies to appear in a multitude of organic search results.
It’s hard to know where Google is taking SERPs in the next year, but it is fair to say the strategies we use today for the search environment will likely be outdated in as little as six months.
Take, for example, the recent addition and subsequent removal of favicons in the SERPs; after backlash, Google reversed the change, proving we can never predict which changes will stick and which are blips on the radar.
But what does this mean for SEOs?
Ensure that your strategies are flexible, and prepare for changes in both your business sector (if you do not work within SEO) and the constantly evolving search environment. Pay attention to the seasonality of searches and use tools such as Google Trends to cover any out-of-season deficit that you may encounter.
You can use tools like Moz Keyword Explorer to help plan ahead and to create campaigns and strategies that provide useful traffic and lucrative conversions.
Conclusion
SEOs need to move away from the ideology that links and traditional search results should be priorities for an organic campaign. Although both still carry weight, without investment in technical strategy or willingness to learn about entities or semantic connectivity, no SEO campaign can reach its full potential.
The world of SEO in 2020 is bright and exciting, but it will require more investment and intelligent strategy than ever before.
Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!
from Moz Blog https://ift.tt/2OtLbFZ
5 takeaways for marketers from Google’s Q4 2019 earnings
Please visit Search Engine Land for the full article.
from Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing https://ift.tt/36XTiRF