Posted by randfish
What are the factors Google considers when weighing whether a page is high or low quality, and how can you identify those pages yourself? There's a laundry list of things to examine to determine which pages make the grade and which don't, from searcher behavior to page load times to spelling mistakes. Rand covers it all in this episode of Whiteboard Friday.
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we're going to chat about how to figure out if Google thinks a page on a website is potentially low quality and if that could lead us to some optimization options.
We've talked about this previously here on Whiteboard Friday, and I'm sure many of you have been following along with the experiments Britney Muller from Moz has been conducting on removing low-quality pages. You may also have seen Roy Hinkis from SimilarWeb talk about how they removed low-quality pages from their site and saw rankings increase on a bunch of stuff. So many people have been trying this tactic. The challenge is figuring out which pages are actually low quality. What does that constitute?
Searcher behavior is an intriguing one. So if someone performs a search, let's say I type a search into Google for "pressure washing." I'll just write "pressure wash." This page comes up. Someone clicks on that page, and they stay there, and maybe they do go back to Google, but then they perform a completely different search, or they move on to a different task, visit a different website, go back to their email, whatever it is. That tells Google: great, this page solved the query.
If instead someone performs that search, clicks on a link, gets a low-quality mumbo-jumbo page, then clicks back and chooses a different result instead, that tells Google the page did not successfully answer that searcher's query. If this happens a lot, Google calls the activity pogo-sticking: you visit one result, it doesn't answer your query, so you go visit another one that does. It's very likely that the first result will be moved down and perceived as low quality by Google.
This is not an exhaustive list, but these are some of the signals that can tell Google whether a page is high or low quality and get that filtering started.
As a marketer, as an SEO, there's a process we can use. We don't have access to every single component Google can measure, but we can look at some things that will help us determine whether a page is high or low quality, and whether we should try deleting it, removing it from the site, or recreating it.
In general, I'm going to urge you NOT to use things like:
A. Raw time on site
B. Raw bounce rate
C. Organic visits
D. Assisted conversions
Why not? Because by themselves, all of these can be misleading signals.
So a long time on your website could be because someone's very engaged with your content. It could also be because someone is immensely frustrated and they cannot find what they need. So they're going to return to the search result and click something else that quickly answers their query in an accessible fashion. Maybe you have lots of pop-ups and they have to click close on them and it's hard to find the x-button and they have to scroll down far in your content. So they're very unhappy with your result.
Bounce rate works similarly. A high bounce rate can be a fine thing if you're answering a very simple query, if the next step is to go somewhere else, or if there is no next step. If I'm just trying to find out, "Hey, I need some pressure washing tips for this kind of treated wood, and I need to know whether I'll remove the treatment if I pressure wash the wood at this level of pressure," and it turns out no, I'm good. Great. Thank you. I'm all done. I don't need to visit your website anymore. My bounce rate was very, very high. Maybe your bounce rate is in the 80 or 90 percent range, but you've answered the searcher's query. You've done what Google wants. So bounce rate by itself is a bad metric.
Same with organic visits. You could have a page that is relatively low quality that receives a good amount of organic traffic for one reason or another, and that could be because it's still ranking for something or because it ranks for a bunch of long tail stuff, but it is disappointing searchers. This one is a little bit better in the longer term. If you look at this over the course of weeks or months as opposed to just days, you can generally get a better sense, but still, by itself, I don't love it.
Assisted conversions are similar. A page might not convert anyone directly. It may be an opportunity to drop cookies, to remarket or retarget someone, or to get them to sign up for an email list, but it may not convert directly into whatever goal conversions you've got. That doesn't mean it's low-quality content.
So what I'm going to urge you to do is think of these as a combination of metrics. Any time you're analyzing for low versus high quality, apply a combination-of-metrics approach; a rough sketch of how you might blend metrics like these into a single score follows the list below.
1. That could be a combination of engagement metrics. I'm going to look at...
2. You can combine some offsite metrics. So things like...
3. Search engine metrics. You can look at...
4. You are almost definitely going to want to do an actual hand review of a handful of pages.
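For those who like to work this up in a spreadsheet or a script, here's a minimal sketch of one way to blend metrics like these into a single score. This is not Moz's or Google's formula; the column names, weights, and normalization are all hypothetical placeholders to swap for whatever your own exports actually contain.

```python
# A minimal sketch, assuming you've exported one row per URL with engagement,
# offsite, and search metrics. Every column name and weight below is a
# hypothetical placeholder, not a Moz or Google formula.
import pandas as pd

def quality_score(df: pd.DataFrame) -> pd.Series:
    """Blend a few per-URL metrics into a single 0-1 quality score."""
    def norm(col: pd.Series) -> pd.Series:
        # Scale each metric to 0-1 so no single metric dominates the blend.
        col = col.fillna(0)
        spread = col.max() - col.min()
        return (col - col.min()) / spread if spread else col * 0.0

    return (
        0.4 * norm(df["organic_visits"])          # engagement (hypothetical column)
        + 0.3 * norm(df["linking_root_domains"])  # offsite signals (hypothetical column)
        + 0.3 * norm(df["ranking_keywords"])      # search engine metrics (hypothetical column)
    )

# Example usage with a crawl/analytics export:
# pages = pd.read_csv("crawl_export.csv")
# pages["score"] = quality_score(pages)
```

However you weight things, a score like this doesn't replace the hand review in step 4; it just helps you decide which pages to look at first.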
Make 3 buckets:
Using these combinations of metrics, you can build some buckets. You can do this in a pretty easy way by exporting all your URLs. You could use something like Screaming Frog or Moz's crawler or DeepCrawl, and you can export all your pages into a spreadsheet with metrics like these, and then you can start to sort and filter. You can create some sort of algorithm, some combination of the metrics that you determine is pretty good at ID'ing things, and you double-check that with your hand review. I'm going to urge you to put them into three kinds of buckets.
I. High importance. So high importance, high-quality content, you're going to keep that stuff.
II. Needs work. The second bucket is stuff that needs work but is still good enough to stay in the search engines. It's not awful. It's not harming your brand, and it's certainly not what search engines would call low quality and penalize you for. It's just not living up to your expectations or your hopes. That means you can republish it, or work on it and improve it.
III. Low quality. The third bucket really doesn't meet the standards you've set, but don't just delete those pages outright. Do some testing first. Take a sample set of the worst junk in the low bucket, remove it from your site (keeping a copy), and see whether removing a few hundred or a few thousand of those pages produces an increase in crawl budget, indexation, rankings, and search traffic. If so, you can be more liberal with what you cut out of that low-quality bucket, and a lot of times you'll see some great results from Google.
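If you want to automate the bucketing and that removal test, here's a rough continuation of the earlier sketch. The cutoff values and sample size are illustrative only; set your own based on the hand review.

```python
# Continuing the earlier sketch: sort scored pages into the three buckets and
# pull the worst of the low-quality bucket as a trial removal set. The 0.3/0.7
# cutoffs and the sample size of 500 are illustrative, not a standard.
import pandas as pd

def bucket_pages(pages: pd.DataFrame, score_col: str = "score") -> pd.DataFrame:
    pages = pages.copy()
    pages["bucket"] = pd.cut(
        pages[score_col],
        bins=[-float("inf"), 0.3, 0.7, float("inf")],
        labels=["low_quality", "needs_work", "high_importance"],
    )
    return pages

def removal_test_set(pages: pd.DataFrame, n: int = 500) -> pd.DataFrame:
    """Pick the worst-scoring low-quality URLs to test removing first."""
    worst = pages[pages["bucket"] == "low_quality"].nsmallest(n, "score")
    # Keep a copy before removing anything, then watch crawl budget, indexation,
    # rankings, and search traffic to decide whether to cut deeper.
    worst.to_csv("removal_test_candidates.csv", index=False)
    return worst
```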
All right, everyone. Hope you've enjoyed this edition of Whiteboard Friday, and we'll see you again next week. Take care.
Video transcription by Speechpad.com