Thursday, June 8, 2017

Voice search becomes voice action: A key talking point at SMX London

Columnist Andreas Reiffen recaps a session from SMX London that focused on the future of voice search -- and what search marketers must do to prepare.



from Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://ift.tt/2s7S9XJ

LiveRamp adds people-based search targeting to IdentityLink

Its new capability allows brands to add missing email addresses to their CRMs for search targeting with Google AdWords.



from Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://ift.tt/2sXyNRr

What you need to know about Referrer Policy

Seeing some confusing referrer data in your analytics program? It might be the result of an issue with 'noreferrer.' Columnist Patrick Stox explains Referrer Policy, which lets webmasters define the value of the referrer header for outbound links.
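To see what's involved, here's a minimal Python sketch (mine, not from the article) that checks the two signals at play: the page-wide Referrer-Policy response header and per-link rel="noreferrer" attributes. The URL is a placeholder.

```python
import requests

def inspect_referrer_signals(url):
    """Report the signals that shape referrer data in analytics (sketch)."""
    resp = requests.get(url, timeout=10)
    # A Referrer-Policy response header sets the policy for the whole page.
    policy = resp.headers.get("Referrer-Policy", "(not set)")
    # rel="noreferrer" on a link strips the referrer entirely, so those
    # visits show up in analytics as direct traffic rather than referrals.
    noreferrer_links = resp.text.lower().count('rel="noreferrer"')
    print(url)
    print(f"  Referrer-Policy header: {policy}")
    print(f"  links marked noreferrer: {noreferrer_links}")

inspect_referrer_signals("https://example.com/")  # placeholder URL
```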



from Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://ift.tt/2s7uiHx

Site Crawl, Day 1: Where Do You Start?

Posted by Dr-Pete

When you're faced with the many thousands of potential issues a large site can have, where do you start? This is the question we tried to tackle when we rebuilt Site Crawl. The answer depends almost entirely on your site and can require deep knowledge of its history and goals, but I'd like to outline a process that can help you cut through the noise and get started.

Simplistic can be dangerous

Previously, we at Moz tried to label every issue as either high, medium, or low priority. This simplistic approach can be appealing, even comforting, and you may be wondering why we moved away from it. This was a very conscious decision, and it boils down to a couple of problems.

First, prioritization depends a lot on your intent. Misinterpreting your intent can lead to bad advice that ranges from confusing to outright catastrophic. Let's say, for example, that we hired a brand-new SEO at Moz and they saw the following issue count pop up:

Almost 35,000 NOINDEX tags?! WHAT ABOUT THE CHILDREN?!!

If that new SEO then rushed to remove those tags, they'd be doing a lot of damage, not realizing that the vast majority of those directives are intentional. We can make our systems smarter, but they can't read your mind, so we want to be cautious about false alarms.
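One way to avoid that mistake is to diff the flagged URLs against paths you already know are deliberately noindexed before touching anything. A minimal sketch, assuming a crawl export CSV with url and noindex columns; the file layout and prefix list here are hypothetical:

```python
import csv
from urllib.parse import urlparse

# Paths we know are deliberately noindexed (hypothetical examples).
INTENTIONAL_PREFIXES = ("/login", "/cart", "/search")

def audit_noindex(export_path):
    """Flag noindexed URLs that aren't on the known-intentional list."""
    suspicious = []
    with open(export_path, newline="") as f:
        for row in csv.DictReader(f):
            if row.get("noindex", "").lower() != "true":
                continue
            if not urlparse(row["url"]).path.startswith(INTENTIONAL_PREFIXES):
                suspicious.append(row["url"])
    print(f"{len(suspicious)} noindexed pages need human review")
    return suspicious

# audit_noindex("crawl_export.csv")  # hypothetical export file
```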

Second, bucketing issues by priority doesn't do much to help you understand the nature of those problems or how to go about fixing them. We now categorize Site Crawl issues into one of five descriptive types:

  • Critical Crawler Issues
  • Crawler Warnings
  • Redirect Issues
  • Metadata Issues
  • Content Issues

Categorizing by type allows you to be more tactical. The issues in our new "Redirect" category, for example, are going to have much more in common, which means they potentially have common fixes. Ultimately, helping you find problems is just step one. We want to do a better job at helping you fix them.

1. Start with Critical Crawler Issues

That's not to say everything is subjective. Some problems block crawlers (not just ours, but search engines) from getting to your pages at all. We've grouped these "Critical Crawler Issues" into our first category, and they currently include 5XX errors, 4XX errors, and redirects to 4XX. If you have a sudden uptick in 5XX errors, you need to know, and almost no one intentionally redirects to a 404.

You'll see Critical Crawler Issues highlighted throughout the Site Crawl interface:

Look for the red alert icon to spot critical issues quickly. Address these problems first. If a page can't be crawled, then every other crawler issue is moot.
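If you want to spot-check a handful of URLs outside the app, all three critical conditions are easy to test directly. A rough sketch using Python's requests library (this is not Moz's crawler code, and the URLs are placeholders):

```python
import requests

def classify_critical(url):
    """Classify a URL against the three critical crawler issues (sketch)."""
    resp = requests.get(url, allow_redirects=True, timeout=10)
    if resp.status_code >= 500:
        return "5XX error"
    if 400 <= resp.status_code < 500:
        # resp.history is non-empty if we were redirected before hitting
        # the 4XX -- the "redirect to 4XX" case almost no one intends.
        return "redirect to 4XX" if resp.history else "4XX error"
    return "ok"

for url in ("https://example.com/", "https://example.com/old-page"):
    print(url, "->", classify_critical(url))
```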

2. Balance issues with prevalence

When it comes to solving your technical SEO issues, we also have to balance severity with quantity. Knowing nothing else about your site, I would say that a 404 error is probably worth addressing before duplicate content — but what if you have eleven 404s and 17,843 duplicate pages? Your priorities suddenly look very different.

At the bottom of the Site Crawl home, check out "Moz Recommends Fixing":

We've already done some of the math for you, weighting urgency by how prevalent the issue is. This does require some assumptions about prioritization, but if your time is limited, we hope it at least gives you a quick starting point to solve a couple of critical issues.
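The weighting itself is internal to Moz, but the idea boils down to severity times prevalence. Here's a toy version with made-up severity weights, applied to the 404s-versus-duplicates example above:

```python
# Hypothetical severity weights per issue type (Moz's real weights are internal).
SEVERITY = {"5xx_error": 10, "redirect_to_4xx": 8, "4xx_error": 6,
            "duplicate_content": 3, "missing_description": 1}

def top_fixes(issue_counts, n=3):
    """Rank issues by severity weight x prevalence; return the top n."""
    scored = {issue: SEVERITY.get(issue, 1) * count
              for issue, count in issue_counts.items()}
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)[:n]

# Eleven 404s vs. 17,843 duplicate pages, as above:
print(top_fixes({"4xx_error": 11, "duplicate_content": 17843}))
```

With these weights, the duplicates dwarf the 404s, matching the intuition that quantity can outrank per-issue severity.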

3. Solve multi-page issues

There's another advantage to tackling issues with high counts. In many cases, you might be able to solve issues on hundreds (or even thousands) of pages with a single fix. This is where a more tactical approach can save you a lot of time and money.

Let's say, for example, that I want to dig into my 916 pages on Moz.com missing meta descriptions. I immediately notice that some of these pages are blog post categories. So, I filter by URL:

I can quickly see that these pages account for 392 of my missing descriptions — a whopping 43% of them. If I'm concerned about this problem, then it's likely that I could solve it with a fairly simple CMS template change, wiping out hundreds of issues with a few lines of code.

In the near future, we hope to do some of this analysis for you, but if filtering isn't doing the job, you can also export any list of issues to CSV. Then, pivot and filter to your heart's content.
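If you take the CSV route, a few lines of pandas reproduce the filter above. The column names here are assumptions about the export layout, not a documented schema:

```python
import pandas as pd

# Load the Site Crawl CSV export (file name and columns assumed).
df = pd.read_csv("site_crawl_export.csv")

# Pages missing a meta description, grouped by top-level URL section.
missing = df[df["issue"] == "missing_description"].copy()
missing["section"] = missing["url"].str.extract(r"https?://[^/]+(/[^/?]*)",
                                                expand=False)
print(missing.groupby("section").size().sort_values(ascending=False).head(10))
```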

4. Dive into pages by PA & crawl depth

If you can't easily spot clear patterns, or if you've solved some of those big issues, what next? Fixing thousands of problems one URL at a time is only worthwhile if you know those URLs are important.

Fortunately, you can now sort by Page Authority (PA) and Crawl Depth in Site Crawl. PA is our own internal metric of ranking ability (primarily powered by link equity), and Crawl Depth is the distance of a page from the homepage:

Here, I can see that there's a redirect chain in one of our MozBar URLs, which is a very high-authority page. That's probably one worth fixing, even if it isn't part of an obvious, larger group.
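The same triage works on an export: sort by PA descending, then crawl depth ascending, so high-authority, shallow pages float to the top of your fix list. A sketch with invented sample rows:

```python
# Each page as (url, page_authority, crawl_depth, issue) -- sample data only.
pages = [
    ("https://moz.com/products/mozbar", 63, 1, "redirect chain"),
    ("https://moz.com/blog/old-post", 41, 3, "missing description"),
    ("https://moz.com/some/deep/page", 12, 5, "duplicate content"),
]

# Highest authority first; among equals, shallower pages first.
for url, pa, depth, issue in sorted(pages, key=lambda p: (-p[1], p[2])):
    print(f"PA {pa:>3}  depth {depth}  {issue:<20} {url}")
```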

5. Watch for spikes in new issues

Finally, as time goes on, you'll also want to be alert to new issues, especially if they appear in large numbers. This could indicate a sudden and potentially damaging change. Site Crawl now makes tracking new issues easy, including alert icons, graphs, and a quick summary of new issues by category:

Any crawl is going to uncover some new pages (the content machine never rests), but if you're suddenly seeing hundreds of new issues of a single type, it's important to dig in quickly and make sure nothing's wrong. In a perfect world, the SEO team would always know what changes other people and teams made to the site, but we all know it's not a perfect world.
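If you want an alarm of your own on top of the UI, comparing issue counts between two crawl snapshots is straightforward. A minimal sketch with invented numbers and a threshold you'd tune per site:

```python
from collections import Counter

def issue_spikes(previous, current, threshold=100):
    """Report issue types whose count jumped by more than `threshold`."""
    prev, curr = Counter(previous), Counter(current)
    return {issue: curr[issue] - prev[issue]
            for issue in curr if curr[issue] - prev[issue] > threshold}

last_week = {"missing_description": 916, "duplicate_content": 300}
this_week = {"missing_description": 920, "duplicate_content": 1450}
print(issue_spikes(last_week, this_week))  # {'duplicate_content': 1150}
```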

I hope this gives you at least a few ideas for how to quickly dive into your site's technical SEO issues. If you're an existing customer, you already have access to Moz's new Site Crawl and all of the features discussed in this post.





from Moz Blog http://ift.tt/2rOcbDL

SearchCap: Google redesigns local panels, AdWords roadmap & artificial intelligence

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.



from Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://ift.tt/2sUilkW

Wednesday, June 7, 2017

New Site Crawl: Rebuilt to Find More Issues on More Pages, Faster Than Ever!

Posted by Dr-Pete

First, the good news — as of today, all Moz Pro customers have access to the new version of Site Crawl, our entirely rebuilt deep site crawler and technical SEO auditing platform. The bad news? There isn't any. It's bigger, better, faster, and you won't pay an extra dime for it.

A moment of humility, though — if you've used our existing site crawl, you know it hasn't always lived up to your expectations. Truth is, it hasn't lived up to ours, either. Over a year ago, we set out to rebuild the back end crawler, but we realized quickly that what we wanted was an entirely re-imagined crawler, front and back, with the best features we could offer. Today, we launch the first version of that new crawler.

Code name: Aardwolf

The back end is entirely new. Our completely rebuilt "Aardwolf" engine crawls twice as fast, while digging much deeper. For larger accounts, it can support up to ten parallel crawlers, for actual speeds of up to 20X the old crawler. Aardwolf also fully supports SNI sites (including Cloudflare), correcting a major shortcoming of our old crawler.
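Aardwolf's internals aren't public, but the parallel-crawling idea is the familiar worker-pool pattern. A generic sketch (placeholder URLs; ten workers to mirror the ten parallel crawlers):

```python
import concurrent.futures
import requests

URLS = ["https://example.com/", "https://example.com/about"]  # placeholder seeds

def fetch(url):
    resp = requests.get(url, timeout=10)
    return url, resp.status_code, resp.elapsed.total_seconds()

# Up to ten threads fetch in parallel, analogous to ten parallel crawlers.
with concurrent.futures.ThreadPoolExecutor(max_workers=10) as pool:
    for url, status, secs in pool.map(fetch, URLS):
        print(f"{status}  {secs:.2f}s  {url}")
```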

View/search *all* URLs

One major limitation of our old crawler was that you could only see pages with known issues. Click on "All Crawled Pages" in the new crawler, and you'll be brought to a list of every URL we crawled on your site during the last crawl cycle:

You can sort this list by status code, total issues, Page Authority (PA), or crawl depth. You can also filter by URL, status codes, or whether or not the page has known issues. For example, let's say I just wanted to see all of the pages crawled for Moz.com in the "/blog" directory...

I just click the [+], select "URL," enter "/blog," and I'm on my way.

Do you prefer to slice and dice the data on your own? You can export your entire crawl to CSV, with additional data including per-page fetch times and redirect targets.

Recrawl your site immediately

Sometimes, you just can't wait a week for a new crawl. Maybe you relaunched your site or made major changes, and you have to know quickly if those changes are working. No problem: just click "Recrawl my site" from the top of any page in the Site Crawl section, and you'll be on your way...

Starting at our Medium tier, you’ll get 10 recrawls per month, in addition to your automatic weekly crawls. When the stakes are high or you're under tight deadlines for client reviews, we understand that waiting just isn't an option. Recrawl allows you to verify that your fixes were successful and refresh your crawl report.

Ignore individual issues

As many customers have reminded us over the years, technical SEO is not a one-size-fits-all task, and what's critical for one site is barely a nuisance for another. For example, let's say I don't care about a handful of overly dynamic URLs (for many sites, it's a minor issue). With the new Site Crawl, I can just select those issues and then "Ignore" them (see the green arrow for location):

If you make a mistake, no worries — you can manage and restore ignored issues. We'll also keep tracking any new issues that pop up over time. Just because you don't care about something today doesn't mean you won't need to know about it a month from now.
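Conceptually, "ignore" works like a persisted set of issue fingerprints rather than a deletion, which is what makes restoring possible and lets genuinely new issues keep surfacing. A hypothetical sketch of that bookkeeping (file name and ID scheme invented, not Moz's implementation):

```python
import json

IGNORE_FILE = "ignored_issues.json"  # hypothetical local store

def load_ignored():
    try:
        with open(IGNORE_FILE) as f:
            return set(json.load(f))
    except FileNotFoundError:
        return set()

def ignore(issue_id):
    """Hide an issue without deleting it, so it can be restored later."""
    with open(IGNORE_FILE, "w") as f:
        json.dump(sorted(load_ignored() | {issue_id}), f)

def visible(issue_ids):
    """Only explicitly ignored IDs are hidden; new issues still surface."""
    ignored = load_ignored()
    return [i for i in issue_ids if i not in ignored]
```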

Fix duplicate content

Under "Content Issues," we've launched an entirely new duplicate content detection engine and a better, cleaner UI for navigating that content. Duplicate content is now automatically clustered, and we do our best to consistently detect the "parent" page. Here's a sample from Moz.com:

You can view duplicates by the total number of affected pages, PA, and crawl depth, and you can filter by URL. Click on the arrow (far-right column) for all of the pages in the cluster (shown in the screenshot). Click anywhere in the current table row to get a full profile, including the source page we found that link on.
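The clustering step is easy to picture in miniature: hash each page's body, group pages by hash, and treat the highest-PA page in each group as the parent candidate. This exact-match sketch is far cruder than Moz's engine, and the data is invented:

```python
import hashlib
from collections import defaultdict

# (url, page_authority, normalized_body_text) -- sample rows.
pages = [
    ("https://moz.com/a", 50, "same body text"),
    ("https://moz.com/a?ref=nav", 12, "same body text"),
    ("https://moz.com/b", 33, "different body"),
]

clusters = defaultdict(list)
for url, pa, body in pages:
    # Exact duplicates share a hash of the page body.
    clusters[hashlib.sha1(body.encode()).hexdigest()].append((pa, url))

for members in clusters.values():
    if len(members) > 1:
        parent = max(members)  # highest PA wins the parent slot
        print("parent:", parent[1], "| duplicates:", len(members) - 1)
```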

Prioritize quickly & tactically

Prioritizing technical SEO problems requires deep knowledge of a site. In the past, in the interest of simplicity, I fear that we've misled some of you. We attempted to give every issue a set priority (high, medium, or low), when the difficult reality is that what's a major problem on one site may be deliberate and useful on another.

With the new Site Crawl, we decided to categorize crawl issues tactically, using five buckets:

  • Critical Crawler Issues
  • Crawler Warnings
  • Redirect Issues
  • Metadata Issues
  • Content Issues

Hopefully, you can already guess what some of these contain. Critical Crawler Issues still reflect issues that matter first to most sites, such as 5XX errors and redirects to 404s. Crawler Warnings represent issues that might be very important for some sites, but require more context, such as meta NOINDEX.

Prioritization often depends on scope, too. All else being equal, one 500 error may be more important than one duplicate page, but 10,000 duplicate pages is a different matter. Go to the bottom of the Site Crawl Overview Page, and we've attempted to balance priority and scope to target your top three issues to fix:

Moving forward, we're going to be launching more intelligent prioritization, including grouping issues by folder and adding data visualization of your known issues. Prioritization is a difficult task and one we haven't helped you do as well as we could. We're going to do our best to change that.

Dive in & tell us what you think!

All existing customers should have access to the new Site Crawl as of earlier this morning. Even better, we've been crawling existing campaigns with the Aardwolf engine for a couple of weeks, so you'll have history available from day one! Stay tuned for a blog post tomorrow on effectively prioritizing Site Crawl issues, and a webinar on Friday at 9am Pacific.





from Moz Blog http://ift.tt/2rBEDtT

Who should optimize content: SEOs or content writers?

Search engine optimization (SEO) and content marketing have a lot of overlap, but they're still separate disciplines. Columnist Stoney deGeyter discusses who should ultimately own content optimization.



from Search Engine Land: News & Info About SEO, PPC, SEM, Search Engines & Search Marketing http://ift.tt/2sDIHs6