Saturday, March 15, 2014

Keyword Targeting, Density, and Cannibalization - Whiteboard Friday

Posted by randfish


Keyword targeting is still an integral part of online marketing, but it isn't the same as it used to be, and we want to make sure you're able to keep up with the changes. In today's Whiteboard Friday, Rand covers today's best practices for keyword targeting, and clears up some common misconceptions about keyword density and cannibalization.

For reference, here's a still of this week's whiteboard!



Video transcription



Howdy Moz fans and welcome to another edition of Whiteboard Friday. This week I'm going to talk a little bit about keyword targeting, keyword density, and cannibalization issues. These are issues that I've seen come up a few times. I've received some email questions about them, and so I thought maybe it's a good time to readdress some of these best practices and to talk about how things like Hummingbird, in particular, have changed some of the ways that we think about keyword targeting, as Google's engine has really evolved to be more sophisticated with how they identify and process keyword use than they have historically.


So, first off, I'm going to start by identifying this page, actually a really wonderful blog and website from a local Seattle blogger, talking about Seattle espresso. So I did a search for Seattle's best espresso, because that's a topic many of us here at the Mozplex, and many of you who come to visit Seattle, are often interested in, and I found this wonderful page.


Now, there are some interesting things about it. It ranks very well. I think it's ranking number four, and another blog post from the same guy from the next year of his reviews is ranking number five. So he's got sort of two positions in there. But what's interesting to me is there's not a lot of keyword targeting. In fact, this particular gentleman even has something on his About page that says, "If you're an SEO or a social media person, don't even contact me." So clearly this is not a guy who's thinking tremendously about SEO, doesn't have a lot of keyword targeting in mind, but is doing a tremendously good job of ranking, and that's because he's, perhaps unintentionally, following a lot of really smart rules.


So first off, as opposed to the early keyword-targeting world of SEO, today I really don't stress repetition. I think repetition is something we can almost stop worrying about. So I don't worry about, "Hey, I only have four instances of the term 'Seattle's Best Espresso' on the page. That's not enough. I really need six or I need seven or I need five or I need three." I don't worry about the number. I do, generally speaking, like to make sure that the phrase appears at least somewhere on the page; once or twice is generally good enough. And if it makes sense to have it in the copy anyway, for user experience reasons, for readability reasons, for content reasons, great, fine. That's okay.


Also, I never, ever use a density metric. It used to be the case that density was somewhat reasonably okay, reasonably correlated with better keyword targeting. But, honestly, that went out the window so long ago. I think when I started in SEO, in 2002, it was already dying. People were already talking about keyword density being a relatively useless metric.


Let me just explain what density is very briefly for anyone who might not know. So there's a lot of content here, 67 words in fact, and what I've done is highlight in purple the keyword instances: Seattle, espresso, best espresso, espresso. Keyword density basically says, "Well, there are four instances of espresso. Out of 67 words, that's a 5.97% density of espresso." Can you see how incredibly useless this is?
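For anyone curious about the arithmetic behind the metric being dismissed here, this is all keyword density amounts to. A minimal sketch of the calculation; the function is just for illustration, and the 4-in-67 numbers are the ones from the whiteboard example:

```python
import re

def keyword_density(text, phrase):
    """Occurrences of `phrase` divided by total word count, as a percentage."""
    total_words = len(re.findall(r"[\w']+", text.lower()))
    hits = len(re.findall(re.escape(phrase.lower()), text.lower()))
    return 100.0 * hits / total_words if total_words else 0.0

# The whiteboard example: 4 instances of "espresso" in 67 words of copy.
print(round(4 / 67 * 100, 2))  # 5.97
```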


So search engines evolved dramatically beyond keyword density, probably as soon as the late '90s. So we're talking a long time ago, and yet there are still a tremendous number of SEOs who look for a keyword density analysis and density tools and think this is a good way to follow best practices. It really is not. I would urge you not to use density as a metric, not to think about it. You won't find it in our keyword tools. You won't find it in most good keyword tools.


Title is very useful. It's a very useful place to employ your keywords, but a click-worthy title is actually worth a lot more than just a perfectly keyword-targeted title. So perfectly keyword-targeted would be keyword phrase right at the beginning, exact match, so something like "Seattle's Best Espressos," and then "I review 113 different coffee places in the city." Okay, that's not a terrible title. You could imagine clicking that.


I actually really like the title that this blogger's put together: "The Best and Worst in Seattle Espresso, 2011 Edition." This isn't perfectly keyword-targeted. I searched for "Seattle's best espresso," which is, by far, the most common phrasing that searchers are going to use. But he's got "best" separated from "Seattle Espresso." It's not right at the front of the title. It's still a great title.


You know what's even smarter, that I really like, is the way that he writes it. "The Best and Worst in Seattle Espresso" is almost more compelling to me than just knowing the best. I'm really curious about the worst. The worst holds a curious fascination for me. If I see some coffee shop that I really love on the worst list, well, I'm going to get all inflamed about that and riled up. But what a great way to write headlines, to write titles. He's employed the keywords intelligently, but he's made me want to click, and that's something that I think we should all take away from.


On page is very useful. So putting the keyword on the page, especially important in the headline. Why is it so important in the headline? It's not because SEO is about perfect keyword placement and getting that H1 tag. It's not actually that important or critical that you get it in the H1 or the H2. It's a best practice, and I would generally recommend it, but it's okay if you don't.


The reason I really recommend this is because when someone clicks on this title in the search results, "The Best and Worst in Seattle Espresso, 2011 Edition," if they land on a page that does not have that headline, that title at the top of the page in some bigger font, instantly searchers will get the impression that they've landed on the wrong page and they'll click the Back button. As we know, pogo-sticking is a real problem. People clicking from the search results over to a page and then jumping right back to the search results, that gives the engine an indication that people were not satisfied and happy with this result. They're going and they're scrolling down and clicking on other people's results instead. You don't want that. You want to own that experience. You want to be the provider of the best possible relevancy and searcher experience that you can.


That's why one of the other recommendations that I have, when it comes to on page, is never sacrificing user experience. If you're thinking to yourself, "Well, Rand said I should really have the keyword on the page in some sort of exact format, like at least twice and in the headline," yes, but if you think that's making a worse user experience, then mixing it up a little bit like this blogger did, mix it up a little bit. Go for the better user experience every time. Particularly because of things like what Google did with Hummingbird, where they've gotten much more sophisticated about text, contextual analysis, relevancy, the way that they interpret things, you can see a lot of search results now where it is not keyword targeting that's winning the day, but really searcher intent. Meaning, if I'm going and searching and this blogger has done a really good job of connecting up the terms and concepts that Google has identified that they associate with best espresso, they're going to rank particularly well.


Let me show you some really smart things that perhaps unintentionally this blogger did. He mentions coffee shop names -- Victrola, Cortona. He's got Vivace down there later. He has Herkimer Coffee. Herkimer is the maker of the espresso that they serve at Cortona Cafe. This is incredibly intelligent because when Google scans the Web and they see lots of people talking about Seattle's best espresso, these coffee shops and roasters are mentioned very frequently. There's a high degree of network connectivity, keyword connectivity between these terms and phrases.


So when I see, as Google, Seattle's best espresso and I don't see any mention of Herkimer or Vivace or Victrola or Ballard Coffee Works, Seattle Coffee Works, I'm going to get a little suspicious. If I see things like Starbucks and Tully's and Seattle's Best Coffee, which is a brand, I'm going to think, "Gosh, I don't know if they've actually localized. I don't know if this is relevant to that searcher's query." In fact, if you look at the front page for Seattle's best espresso, you will not find places that list, well, most of the results do not list places like Starbucks and Seattle's Best Coffee and Tully's, and these bigger national brands or regional brands.


The last thing that I'll mention on targeting is that providing unique value is essential. I did a whole Whiteboard Friday about providing unique value and the uniqueness of content. But those topically relevant terms that I just mentioned can be very helpful here. But really it's about providing something that you'll never find anywhere else. Not just unique content, meaning this text is unique to the Web, but meaning the value provided by it is truly unique. I can't find this value. I can't get what I get from this article anywhere else on the Web. That's critically important.


All right. Next piece is cannibalization, and keyword cannibalization is sort of a tough, meaty topic. It's not quite as important as it used to be, because Google has gotten much more sophisticated, more advanced in being able to tell which page you want to rank. The basic idea behind cannibalization is, "I've got a page targeting Seattle's Best Espresso, and then I have another page targeting Wallingford's Best Espresso, which is a neighborhood here in Seattle. Should I be really careful not to use the word Seattle on my Wallingford Best Espresso page? How do I link between them? How do I make sure that Google knows which one to rank well?" In the past, Google was not smart enough, and a lot of times you would see these not-as-relevant pages outranking the one you really wanted to rank. So people in the SEO world came up with this term keyword cannibalization, and they tried to find ways to make Google rank the page that they wanted. Google's gotten much better about this, but there are still a few best practices that we should keep in mind.


So, first off, on page targeting for a unique keyword phrase is optimal. So if we know that we want a page that's Seattle's Best Espresso, great. Having that term in the title, in the headline of a unique page is a very good idea. If we know that we want another one that's Wallingford, that's great too. But it is okay if you have multiple pages employing part of a keyword term or phrase. So, for example, I've got my Wallingford page. It's okay on the Wallingford page if I also mention Seattle. I could say "Seattle's Wallingford Neighborhood," or "The Best Espresso in Seattle's Wallingford Neighborhood," or "In Wallingford, Seattle." That's okay to do. That's not going to create the cannibalization that it might have in years past.


Linking with appropriate anchor text is very helpful. So let's say here's my coffee addict's guide to Seattle, and I've got links in here: "Best coffee roasters in Seattle," "Best espresso in Seattle," "Best coffee online from Seattle's roasters." Great. So now I have unique keyword phrases that I'm targeting, and I'm going to link out to each of these pages, and then from each of these pages, if I've got my best online coffee from Seattle roasters page, I probably do want to link to my best espresso in Seattle page with that anchor text. Call it what the page is. Don't just say, "For some great espresso places in Seattle, click here." No. "Click here," not great anchor text. "Best espresso in Seattle," that's the anchor text I generally want to have, and that's not just for search engines. It's also for users.
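If you want a quick way to spot generic anchors like "click here" on an existing page, a small audit script can help. This is only an illustrative sketch using Python's standard library; the URL and the set of "weak" phrases are hypothetical, not anything from the video:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

GENERIC = {"click here", "here", "read more", "more"}  # assumed list of weak anchors

class AnchorAudit(HTMLParser):
    """Collects (anchor text, href) pairs so generic anchors can be flagged."""
    def __init__(self):
        super().__init__()
        self.links, self._href, self._text = [], None, []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href, self._text = dict(attrs).get("href", ""), []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append(("".join(self._text).strip(), self._href))
            self._href = None

page = urlopen("https://example.com/coffee-addicts-guide")  # hypothetical URL
audit = AnchorAudit()
audit.feed(page.read().decode("utf-8", errors="ignore"))
for text, href in audit.links:
    if text.lower() in GENERIC:
        print(f"Weak anchor text {text!r} -> {href}")
```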


Number four, the last part about keyword cannibalization, is about older pages. This happens a lot for bloggers and content marketers who are producing lots of unique content over time, but some of it is repetitive. So if you have an older page, it can be very wise to retire that content in favor of something newer and fresher, and there are a number of ways to do this. I could 301 from the old URL to the new one. I could use a rel=canonical to point from my old piece of content to my new one on the same topic. Or I could refresh the existing page, essentially take the same URL, dump the old content, and put the new content on there. I could even archive the old content on a brand new page that's sort of like, "Hey, if you want the old version of this, here it is."
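The mechanics here depend entirely on your platform. As one hedged illustration only, assuming a Flask-based site and made-up URLs, the 301 option (with the rel=canonical alternative noted in a comment) might look like this:

```python
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/best-espresso-2011")          # hypothetical retired URL
def retired_post():
    # A 301 tells Google the old URL has permanently moved to the fresher version.
    return redirect("/best-espresso-2014", code=301)

# The rel=canonical alternative keeps the old page live but points engines at the
# newer piece via a tag in the old page's <head>:
#   <link rel="canonical" href="https://example.com/best-espresso-2014">
```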


You can see we've done that at Moz several times with things like MozCon, with our industry survey, with our old ranking factors. We sort of move that old content off to another URL and put the new stuff up at the URL that's been ranking, been performing so that we don't have the challenge of having one trying to compete against another.


These techniques can be really helpful for those of you who've got sites and you're producing lots of content, you're targeting many keywords, and you're trying to figure out how to organize these things.


I look forward to some great comments. Thanks very much gang. I'll see you again next week for another edition of Whiteboard Friday. Take care.



Video transcription by Speechpad.com




Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!






from Moz Blog http://ift.tt/1ghlaOx

Google's 2014 Redesign: Before and After

Posted by Dr-Pete


Over the past few months, Google has been testing a redesign of both their overall SERP format and their AdWords blocks. In the past day or two, it appears that they've rolled these changes out to a large part of their audience. While we still have a chance to grab before and after versions of the SERPs, I thought it would be worth a quick stroll down memory lane and a look at the future of Google.


I. Basic search result


Let's start with a pretty basic search result, a query for [pygmalion]. Here's the before and after:



The title font in the new version is slightly bigger, and Google has done away with the underlining. Interestingly, the source URL is actually a little smaller. The snippet and mini-links seem to have remained the same.


II. Expanded site-links


Here's a #1 result with expanded site-links. The query is [carolina place mall]:



Like the main result, site-links are also getting the larger title font without underlines. This example also clearly shows that some title tags will get cut off with the new, larger font. This could impact click-through rates, so you may want to consider shorter titles going forward (at least for critical pages).


Notice the faint horizontal divider at the bottom. This sets the expanded #1 result apart from the rest of the SERP. These horizontal dividers are used frequently in the new design, and I strongly believe that they are a move toward a more card-like look (akin to mobile, Google+, and Google Now).


III. Image vertical results


This is what the new image vertical results look like. The query is [roger williams university]:



The new format has the new font, plus a fairly pronounced "More images…" link. Again, the vertical results are separated (above and below) by a horizontal divider. The images themselves appear to be formatted the same.


IV. News vertical results


Here's a query for [wtop traffic], showing the redesigned news vertical results. Note that these were captured on different days, so the actual articles have changed—the count/layout are equivalent, though:



All article links are using the larger font (with the same implications for length/wrapping). Like image vertical results, news results get a top and bottom divider. In general, you can see that almost every type of result is taking up significantly more vertical space.


V. Local pack results


Here's a 3-pack of local results, for the query [lands end] and focused on San Diego, CA:



Larger font, no underlines, horizontal dividers—you know the drill. Note the lighter-gray text on the actual location information (address and phone).


VI. In-depth articles


Here's a look at Google's newest vertical, in-depth articles. The query is [palm oil]:



The redesign pretty much follows the pattern of the other verticals. Note that the actual header font—"In-depth articles"—is a bit smaller and slightly grayed out.


Google has been testing many variations of in-depth articles, and all of them suggest that this expanded format may be replaced with something more Spartan. Here's a recent test (this is not live, and this design will likely change), for the query [foreclosure]:



While this test format follows the rules of the redesign, it is in every other way dramatically different from Google's current treatment of in-depth articles. Note that this test version appeared in the "#2" slot (right after the first organic result), whereas current in-depth article blocks usually appear at or near the end of page 1. Expect in-depth articles to get a major overhaul in the next few months.


VII. Video thumbnails


In 2014, video results are really more of an enhancement than an actual vertical. Here's a quick before and after for the query [wild kratts]:



This is essentially just an organic result, with a bit of information and a thumbnail added—the general layout and thumbnail characteristics have remained the same. This is also true of authorship results and review snippets—the title and URL fonts have changed, but the general layout, thumbnail size, etc. seem to all be the same.


VIII. AdWords (top)


On top of the general design change, Google has been testing a new AdWords format for months—these may be rolling out together, but the tests themselves have been separate. Here's a reasonably complex AdWords block from the top of a query for [keens]:



In addition to the larger, non-underlined titles and horizontal divider, the colored background is gone, and a yellow [Ad] box appears next to each individual ad. The "Ads related to…" text has been removed as well.


IX. AdWords (right)


The AdWords block in the right-hand column has also changed, but the difference is a bit less dramatic. Here's the same query ([keens]):



There's just one yellow [Ads] label for the entire block, and there's no change to the background (because the old version didn't have a colored background). The new fonts do expand the titles significantly and increase the vertical area of the total ad space.


Note that the AdWords block on the bottom of the left-hand column looks very similar to the redesigned top AdWords block. Other SERP elements, including the knowledge panel, answer boxes, paid shopping, and carousels seem to have been unaffected by the redesign (so far).


It's in the cards


Back in November, I predicted that Google would move toward a more card-like format in 2014. While my future SERP concepts were heavily influenced by mobile and Google Now and are more extreme than the current redesign, don't overlook the way Google is using dividers to separate out SERP elements. As mobile and tablet proliferate, and new devices like Glass come into play, Google wants to have SERPs that they can easily mix-and-match, providing whatever combination is most relevant for each device and situation. For now, desktop remains a fixed, two-column format, but Google's design decisions are being driven more and more by mobile devices, and the future is in individual information elements that can be easily rearranged.


To see this idea in action, here's a local (Chicago suburbs) search for [starbucks]. Notice how the dividers separate the expanded top ad, the expanded #1 result, a local 3-pack, a news box, and, finally, the rest of the organic results:



While a horizontal line might not seem like a big change, Google is clearly working to carve up the SERP into units that can potentially be mixed and matched. Also note where "#2" is on this page. As simple as they may seem, these design changes are redefining organic results.


Do you like it?


Trick question—no one cares. Sorry, that was a bit harsh, but here's the reality: Google has been testing this for months across what are probably millions of unique visitors. A few dozen marketers complaining about the new design is not going to sway their decision. At this point, the decision is 98% made, and it's made based on Google's goals and Google's data. The best you can do is try to assess how these changes impact your bottom line and adjust accordingly. Don't waste your time shouting at the wind.


One final note: While this redesign seems to be rolling out, Google has not officially confirmed the change and it may still be in testing (albeit widespread testing). I wanted to put together a post while we could still compare and contrast the before and after versions, but this design could still change over the next few days, weeks, or months.


Update: In the comments, Gaurav pointed out that Google's lead search designer, Jon Wiley, confirmed the roll-out yesterday on Google+. Looks like it is at least mostly official.










from Moz Blog http://ift.tt/O52HyH

Troubleshooting Local Ranking Failures: A Beginner's Guide

Posted by MiriamEllis


The fallout from lost Google local rankings can be drastic, from silent phones in the office to a loss of pride in a company's standing in the community. Don't panic: Be proactive and take the steps outlined in this guide to begin troubleshooting the cause of your ranking failure. This article is intended for both local business owners and new Local SEOs who will benefit from having a set of procedures to follow should local rankings go south. While I can't cover every possible cause of local ranking issues, the steps outlined below will help you surface major, common problems and take steps to correct them. This graphic provides an overview, and the details of each step follow.





Check for mass issues



The first thing to discover is whether the issue being experienced is part of a major change or bug in Google's system. It's extremely common for a single issue to affect enormous numbers of businesses when something goes wrong with Google's local product or when they change a guideline, turn a filter on or off, or alter their algorithm.



Start by going to the Google and Your Business Forum. Search for the problem you're experiencing in a variety of ways in the search box. If you see multiple threads reporting your identical issue stemming from a time near the date you began to notice the problem, there is a good chance that a bug or update may be the cause of the situation. Read all of the existing threads to see what the Top Contributor members of the forum are saying. Frequently, a Google staffer will chime in on these threads with Google's position on the issue. Some problems may require that you take a specific action, such as performing a null edit on your listing, whereas others dictate that you sit tight until the matter is resolved.



Other good places to check for news of mass issues are:


Linda Buquet's Local Search Forum

Mike Blumenthal's Blog

51 Blocks' Google Local Weather Report

Moz Q&A Forum


Rule out obvious violations





One of the quickest ways to incur a penalty or ranking issues is to violate one of Google's clearly stated quality guidelines. Unfortunately, there is some grey area surrounding certain aspects of Google's rules, but on many points, Google is completely straightforward. Read the Google Places Quality Guidelines and be sure that basic (but major) mistakes haven't been made in regard to the naming of the business, address, phone number, number of listings and similar components.



If you are new to Local SEO, there is a learning curve involved in understanding Google's unique take on how local businesses should present themselves. For example, many real world businesses use vanity phone numbers in their advertising. This is what helps me remember to phone 1 (800) COMCAST every time my Internet service hiccups. But if my provider chose to try to enter this as their phone number when creating their Google+ Local listing, they would be falling outside of Google's guidelines, both because they'd be using a toll free number and because it's a vanity number. There are dozens of instances in which common real world practices don't gel with Google's vision of Local. This is why it pays to take the time to memorize Google's guidelines and check back with them for the updates Google releases from time to time.



Breaking even one of Google's rules can cost you your rankings. Be sure that your troubleshooting work includes a double check for obvious violations.


Check for NAP consistency issues





NAP is the acronym for your business name, address and phone number and it represents the very core of your business data. NAP is also sometimes referred to as NAP+W because your website is also a key component of this data. Every place your complete or partial NAP is listed on the web is referred to as a 'citation'. Because Google checks not just your Google+ Local page and the website it links to for information about your business, but checks the whole Internet, it's absolutely vital that they find a consistent presentation of your data in your citations. Small inconsistencies concerning abbreviations like Howard St. vs. Howard Street do not matter, but things like this do matter:



  • Differences in the business name

  • Different numbers or spelling in street name

  • Lack of suite number on some citations but not others

  • Different phone numbers

  • Citations pointing to more than one website for the business


These kinds of discrepancies can 'confuse' Google and cause them to lose trust in the data they've accumulated about your business, leading to ranking problems. The basic rule of thumb here is to ensure that your NAP is identical across the web. Here are three methods for auditing your citations to discover NAP consistency issues:



Use a free tool

GetListed.org allows you to plug in your business name and zip code and returns data about your listings on core directories. At a glance, you will be able to see if there are mismatches in your NAP on these important platforms. It's wonderful that this tool is free. Two limitations of this option are that the tool currently only supports US-based businesses, and that it searches only a limited number of platforms for your listings. This is an excellent place to start your citation audit, but once you have cleaned up errors on these core platforms, you will need to look beyond them for additional NAP consistency issues.



Use a paid tool or service

BrightLocal's Citation Tracker tool is a paid product that will surface your citations from around the web, enabling you to see if there are NAP consistency problems that need to be corrected. The tool also alerts you to new places to get your business listed. One helpful aspect of BrightLocal's products is that they support multiple countries. If you are feeling overwhelmed by the task of auditing and/or cleaning up your citations, you can pay a company like Whitespark to do it for you. While there is no reason why a local business owner cannot audit and repair his own citations, paid tools can represent a major savings of time and effort in return for an investment.



Manual search and cleanup

This method requires an investment of time and should be approached in an organized fashion. Create a simple spreadsheet. Then, go to Google and begin searching for the following:



  • Your complete business name

  • Your partial business name or any variants you have ever used

  • Your street address, including old addresses if you have moved within the past decade

  • If you have a suite number, search for your address both with and without it

  • Your phone number, including old phone numbers you may have used in the past decade

  • The names of partners, if you are in a multi-partner practice


Input the listing details and URL of each citation you find into the spreadsheet. Review the spreadsheet to check for NAP inconsistencies, and then track your progress on the document as you make efforts to clean up any problematic listings. Remember that citations are not limited to local business directories or review sites; they comprise any mention of your complete or partial NAP, including news sites, social media platforms and blogs.
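If your spreadsheet grows large, a small script can take some of the eyeballing out of the comparison. This is only a sketch of the idea; the citation data and the abbreviation table are hypothetical, and in practice you would load the rows from your own spreadsheet export:

```python
import re

ABBREV = {"street": "st", "avenue": "ave", "boulevard": "blvd", "suite": "ste"}

def normalize(name, address, phone):
    """Lowercase, drop punctuation, fold common abbreviations, and keep only the
    digits of the phone number, so cosmetic differences don't count as mismatches."""
    def clean(s):
        words = re.sub(r"[^\w\s]", "", s.lower()).split()
        return " ".join(ABBREV.get(w, w) for w in words)
    return clean(name), clean(address), re.sub(r"\D", "", phone)

# Hypothetical citations gathered into the spreadsheet described above.
citations = {
    "yelp":        ("Acme Espresso", "123 Howard Street, Seattle, WA", "(206) 555-0100"),
    "yellowpages": ("Acme Espresso", "123 Howard St., Seattle, WA", "206-555-0100"),
    "old-blog":    ("Acme Coffee & Espresso", "123 Howard St, Suite 4, Seattle, WA", "206-555-0199"),
}

reference = normalize(*citations["yelp"])
for source, nap in citations.items():
    if normalize(*nap) != reference:
        print(f"Possible NAP inconsistency on {source}: {nap}")
```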



It can take time to see results from citation cleanup jobs. Be patient and know that the more cohesive your overall NAP is across the web, the better your chances of obtaining consistent, high local rankings.


Seek out duplicate Google+ Local listings





With the exception of multi-partner practices (like law firms) and businesses with multiple departments (like hospitals), each business is allowed to have just one Google+ Local Page per location. Having more than one is not only a violation of the Google Places Quality Guidelines but is also likely to cause local search ranking failures. Sometimes, duplicate listings are intentionally created by business owners due to a lack of understanding of the rules and sometimes they are built by spammers for manipulative purposes. In other cases, however, owners may have no idea that they've got duplicates, because they've been automatically generated by Google or have resulted from some scenario such as moving locations or changing a phone number.



Clues that you may have duplicates include a discrepancy in the name, phone number, address, categories or description that you see in your Google Places for Business Dashboard vs. what is appearing live on your Google+ Local listing. Even if you don't see any variance, however, it's a good idea to check for duplicate listings if you're experiencing ranking issues. Start by going to maps.google.com. *Be sure you are viewing the old, classic version of Google Maps, not the new Google Maps (for more on why this is important, read this fascinating thread on Linda Buquet's Local Search Forum.) Once inside the classic Maps, take these actions:



  • Search for your business name

  • Search for variants of your business name, including old names and names of partners in your business

  • Search for your business name + your address

  • Search for your address, including addresses you may have occupied within the past decade and search both with and without a suite number, if you have one.

  • Search for your phone number, including all phone numbers you may have used within the past decade


If any of these searches brings up more than one listing in the left hand column of the page, beside the large map, then you are dealing with a duplicate. Be sure you are clicking the link at the bottom of the results to view all listings, if the link exists.



Getting rid of duplicates is a major cause of confusion in the local business arena. There are so many 'if' clauses involved. I have attempted to set out here a set of steps for businesses in a variety of scenarios which will hopefully lead to the removal of duplicates. I want to personally thank Google and Your Business Forum Top Contributors Mike Blumenthal and Linda Buquet for conferring with me about some of the following points:



If the listing is claimed and appearing in your dashboard:



Delete it via the dashboard. This will hopefully remove it from the index and prevent its reappearance. But, if you continue to see the listing appearing live and you have the New Google+ Local Dashboard, take the following steps:



  • Sign into the dashboard of your Google+ Local Page

  • Click the 'Edit Business Information' link where your NAP information is listed

  • Scroll to the bottom of the page and click the 'Get Help With This Listing' link

  • Click the 'Contact Us' button in the upper right of the page

  • Click the 'Call Us' button on the popup

  • Provide your name and phone number and Google will call you


If you still have the old dashboard:



Apparently this option exists behind a 'help' link or 'gear' icon. Unfortunately, I no longer have access to an old Google Places dashboard, so I need to recommend that you search for the above in order to facilitate a call with Google.



If you run a brick-and-mortar business and the listing either isn't claimed or doesn't exist in your dashboard, you have three options:



1. Do a brand name search in Google's main search engine. Select the 'feedback link' from the knowledge panel to the right of the search results to report the issue.



2. From the live Google+ Local page, select 'edit details' and report the issue.



3. If you have access to the old Google Maps, use the 'report a problem' link.



All of these options will take you to a MapMaker URL that enables you to report the place as a duplicate of another listing. Be sure to provide the URLs of both the authoritative listing and the duplicate listing so that Google can understand which one you want shut down. *Note that this will not work for service area businesses with hidden addresses. If you have such a business, I recommend trying the form located behind the "Contact Us" button on this troubleshooter.



If several weeks go by and none of the above methods have worked, I recommend posting a request for help at the Google And Your Business Forum. Getting duplicates deleted may require a bit of work and patience, but it is worth it when you consider that their presence is both a violation of the guidelines and a drain on your company's ability to rank well.


Consider hidden merged duplicates and multi-practitioner listings





I want to make note here of two subtopics which are related to the theme of duplicate listings: duplicates for multi-practitioner listings and hidden merged duplicates. These involve advanced troubleshooting techniques I don't plan to cover in this article. It's my opinion that Linda Buquet covers these subjects better than anyone else in the industry. If obvious duplicates do not surface from the above steps and you've ruled out more common ranking hazards, I recommend that you read the following:


WARNING: Dual Claimed Ranking Penalty - Drop like a Rock or Get Suspended



Warning Email re Google Places Duplicate Listings



Multiple attorneys sharing the same NAP


Identify website quality issues



The on-page quality of your website and the methods you choose for marketing it play a major role in your local search rankings. When troubleshooting ranking issues, take all of the following into consideration:



  • Is the website properly optimized for local search? Does it have the complete NAP on the Contact Us page and elsewhere on the site, such as in the footer? Have keywords been reflected sensibly in elements like title tags and copy?

  • Is the website over-optimized? Are tags and copy stuffed with keywords to the point that the site reads poorly to human users and might be considered spammy?

  • Does the website feature clear, crawlable navigation? Can humans and bots make their way through the site easily?

  • Does Google Webmaster Tools indicate that the site is being crawled properly? Could there be errors in places like the robots.txt file, preventing indexing? (A quick spot-check is sketched after this list.)

  • Is the website content professional, or does it contain errors of spelling or grammar that could result in it being deemed of low quality?

  • Do pages on the site feature thin content? Does it appear that multiple pages were published for manipulative reasons instead of for the purpose of helping human visitors?

  • Do pages on the site feature duplicate content, stemming either from other pages of the site or scraped from third-party sources?

  • Has the business taken a multi-site approach, which can be a danger zone for thin/duplicate content and NAP+W consistency issues?

  • Does the website feature a natural-looking backlink profile, or could questionable practices have incurred a famous Google penalty?

  • Has the website suffered from known penalties in the past? Is it certain that steps were taken to rectify problems and get back into Google's good graces?

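On the robots.txt point in particular, a quick spot-check is easy to script. Here is a minimal sketch using Python's standard library; the domain and paths are placeholders, and Webmaster Tools' own crawl reports remain the authoritative view:

```python
from urllib.robotparser import RobotFileParser

SITE = "https://example.com"  # hypothetical site; use your own domain

parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()

# Spot-check a few important URLs against the rules Googlebot would follow.
for path in ("/", "/contact-us", "/services"):
    allowed = parser.can_fetch("Googlebot", SITE + path)
    print(f"{path}: {'crawlable' if allowed else 'BLOCKED by robots.txt'}")
```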

The interesting thing about some of the above factors is that they represent a judgment call on the part of the troubleshooter. While marketers can easily audit a website without bias, it can sometimes be challenging for local business owners to critique their own websites objectively because of the amount of time and money they have invested in them. When it comes to somewhat nebulous issues like the quality of content or possible over-optimization, it can help to get second opinions from impartial parties.


Moz members have access to the On-Page Grader Tool, which highlights potential issues surrounding keyword usage, URL formatting, links and other quality factors. If you are not currently a Moz member, you might like to check out Hubspot's Marketing Grader, which is free. There are many other free website analysis tools available, with differing degrees of value. Your website is the Internet's most authoritative document about your business, and how it is developed and marketed directly contributes to high or low local rankings. If you suspect you are operating a low-quality website, you may want to consider hiring a reputable Local SEO firm to provide a custom audit. If a business is failing to rank well locally, the website must be considered as a possible contributing factor.


Discover relationship to the Google Maps Business Centroid



This is one of the more complex local search ranking factors. In the past, it was believed that businesses located nearest to the official geographic center of a city had an edge over competitors located further from the heart of town. Recently, however, it has become apparent that the "centroid" refers not to the city center, but to an area of town which Google has deemed to be the center of a specific industry within that city. In other words, it appears that Google may feel there is one centroid for restaurants and another for car dealerships within the same city. Examples have been surfaced of apparent business centroids being located completely outside of the downtown districts of major cities.


What this means for the troubleshooter is that it's important to do a search on Google Maps for the industry in question. Note if businesses ranking A-J appear to be clustered within a certain area of the target city. Then, assess whether the street address of the business in question falls inside or outside of that cluster. Keeping this in mind can be helpful when troubleshooting a business which falls shy of ranking in the top 7 of the Google Local Pack, particularly if you have been unable to discover any obvious penalties. Though the earlier concept of a city centroid has been abandoned, location still does matter in the form of the business centroid.


I am unaware of any official studies that have been undertaken detailing how to overcome the business center bias if a business is located outside of the cluster. I would propose that the best method of attempting to do so would be to continue to build both local and organic authority in hopes of getting ahead of less active competitors.


Factor in time and authority



Over the years, I've received frantic phone calls from business owners eager to know why they aren't ranking better in the local pack of results. If I asked about complex factors first, I might spend thirty minutes chatting before I realized that time was actually all that was working against the company. This experience helped me learn to ask early on in these conversations the specifics of certain actions, such as:



  • When the website was published

  • When the Google+ Local page was created

  • When the business started building third party citations


If a business is just getting started with the above components, then they shouldn't expect to have earned automatic dominance in the local search results, unless they are literally the only game in town. The fruits of these kinds of labors take time to take effect. You can't move the clock forward, but you can get busy on building buzz about your business via social media while waiting for your more central efforts to begin earning the rankings you hope for. Time is a local search ranking factor.


By the same token, the concept of authority is a local search ranking factor. Have you built the best website for your industry in your city? Are you continuing to build it out with exceptional, fresh content? Have you implemented a review acquisition strategy? Are you earning positive reviews across a variety of platforms and handling negative reviews proactively? Are you earning high quality links and social mentions? Are you participating in offline activities that are earning your company online notice? The benefits of these activities will seldom become apparent overnight, but a consistent effort to reach out and be visible to your local community will help you to build authority that stands the test of time.


Can't find anything wrong? Consider other possibilities



If you're troubleshooting a local ranking failure and, by going through this article, you've determined that the company isn't suffering from a mass bug, an obvious violation, NAP inconsistencies, visible duplicates, hidden merged duplicates, business centroid issues or issues of quality, time or authority, you must be wondering what to do next.


While I can't cover every possible cause of ranking failures, I can make a few more suggestions for things to consider if you've ruled out all of the above.


Shared categories between multi-practitioner listings

If a legal firm, medical practice or similar business model has taken advantage of Google's invitation to create a listing for each partner as well as one for the practice, it's important to know that many Local SEOs believe that duplicating categories between the listings may cause ranking failures. So, for example, if the practice listing is using 'Elder Law Attorney' as one of its ten Google+ Local categories, then that same category should not be used on the listings of any of the partners. You will not find advice regarding this in the Google Places Quality Guidelines, but if you've ruled out other issues and see that there are shared categories at play, you'll want to know that this theory exists. It's believed that it's best to divvy up available categories between the practice and the various partners so that none of the listings feature duplicates.


Geographic terms in the business description

This is an oldie, and while I've not heard recent reports of issues with this, it's worth mentioning. In the early days of Google's local product, it was observed that including city names and other geo terms within the business description field appeared to have a negative effect on rankings. Again, this was never mentioned in the official guidelines. Google seems to have downgraded the importance of the business description over the past couple of years, so how great an effect this practice might be having on listings these days is unknown. Still, if you've got mysterious ranking problems and have geo terms in your description, removing them can't hurt and might just help.


Disconnected place page

This is a strange one, brought to my attention by Linda Buquet. The theory here is that if a local business has a high organic ranking but poor local rankings, the website and Google+ Local page may be disconnected. Linda posits that this can happen due to issues covered above, such as guideline violations, distance from business centroid and NAP inconsistencies. A good way to gauge your pure organic authority is to search in AOL.com. If you see great organic rankings there and a lack of local pack rankings in Google, you may be dealing with a disconnect stemming from the above problems. Time to troubleshoot such issues and clean them up, if possible.


Map zooming conundrum

Here's another weird one I've seen discussed in Linda's forum in which a listing is dropping in and out of the pack due to the Maps radius expanding and contracting from search to search. In the case I've seen highlighted, the business in question was located on the outskirts of town and when the radius tightened, they dropped off the map and out of the pack. If your business is at the edge of the map shown for your important searches and you've got rankings that are playing hide-and-seek, I would suggest taking screenshots of a variety of searches over the course of several weeks to see if the radius of the map is changing. Given that this is based upon Google's display of geography, there is unlikely to be an easy fix for any business in this peculiar scenario.


Standing still

If your business previously enjoyed higher local rankings that have gradually ebbed away, you must take into consideration whether you have been standing still while competitors have been busy. A loss in rankings may have nothing to do with violations or inconsistencies and may simply be the product of a competitive market. Analyze your competitors to discover if they've surpassed you by dint of superior marketing efforts.


Organic penalties

Remember that organic factors play a major role in local rankings. If the website is hit with a major penalty or falls on the wrong side of a Google update, it can absolutely affect your local pack rankings. This is one of the reasons why it is so important to consistently monitor how you're doing, ranking-wise. If you notice a sudden drop off, the date on which it occurred can yield vital clues as to whether the issue coincides with a web-wide change affecting everybody. Keep up with your reading of major local and organic SEO blogs so that you know when trouble hits and how experts are making efforts to recover.


History of spamming

This one is especially for new Local SEOs. You will sometimes be contacted by local business owners who have made a practice of spamming Google, either of their own volition or because they've been poorly advised by past marketers. An example of this might be a lawyer calling you for help and the conversation revealing that he has set up ten virtual offices in order to create multiple Google+ Local pages. In my experience, some businesses dig themselves into too deep of a hole for me to dig them back out. Such cases will involve a judgment call on your part - do you feel reasonably certain that you can undo spam, including penalties the business may have incurred, and put spamming clients back onto the straight-and-narrow? If not, pass on the work. You will be saving yourself a major headache.


Troubleshooting is seldom easy

I am far from being the world's best troubleshooter! As a Local SEO, I much prefer building up a positive business presence to trying to diagnose and fix a troubled one. That being said, we all run into businesses with outstanding issues that need to be corrected before new, quality work can shine through.


I sincerely hope having this list of pitfalls to check for will help you troubleshoot not only sudden ranking drops, but also, problems that could be putting a damper on the effectiveness of your overall local marketing campaign.



Do you have additional techniques you use to surface common local ranking problems? Please contribute to the usefulness of this article by sharing them with the whole community!










from Moz Blog http://ift.tt/1niYhE6

12 Ways to Increase Traffic From Google Without Building Links

Posted by Cyrus-Shepard


Link building is hard, but it's not the only way to make traffic gains in Google's search results.


When I first started SEO, building links wasn't my strong suit. Writing outreach emails terrified me, and I had little experience creating killer content. Instead, I focused on the easy wins.


While off-page factors like links typically weigh more heavily than on-page efforts in Google's search results, SEOs today have a number of levers to pull in order to gain increased search traffic without ever building a link.


For experienced SEOs, many of these are established practices, but even the most optimized sites can improve in at least one or more of these areas.


1. In-depth articles


According to the MozCast Feature Graph, 6% of Google search results contain In-depth articles. While this doesn't seem like a huge number, the articles that qualify can see a significant increase in traffic. Anecdotally, we've heard reports of traffic increasing up to 10% after inclusion.



By adding a few signals to your HTML, your high quality content could qualify to appear. The markup suggested by Google includes:



While Google seems to favor authoritative news sites for In-depth Article inclusion, most sites that may qualify don't have the proper semantic markup implemented.


2. Improving user satisfaction


Can you improve your Google rankings by improving the onsite experience of your visitors?


In many ways the answer is "yes," and the experience of several SEOs hints that the effect may be larger than we realize.


We know that Google's Panda algorithm punishes "low-quality" websites. We also know that Google likely measures satisfaction as users click on search results.


"… Google could see how satisfied users were. … The best sign of their happiness was the "long click" – this occurred when someone went to a search result, ideally the top one, and did not return."



-Steven Levy, from his excellent book In the Plex


The idea is called pogosticking, or return-to-SERP, and if you can reduce it by keeping satisfied visitors on your site (or at least not returning to Google to look for the answer somewhere else) many SEOs believe Google will reward you with higher positions in search results.




Tim Grice of Branded3 reports a saying they have at their SEO agency:


"If you have enough links to be in the top 5, you have enough links to be position 1″

While we have no direct evidence of pogosticking in Google's search results, we've seen enough patents, interviews and analysis to believe it's possibly one of the most underutilized techniques in SEO today.


3. Rich snippets from structured data


Google constantly expands the types of rich snippets it shows in search results, including events, songs, videos and breadcrumbs.


The first time I heard about structured data was from a presentation by Matthew Brown at MozCon in 2011. Matthew now works at Moz, and I'm happy to glean from his expertise. His Schema 101 presentation below is well worth studying.




If you're just getting started, check out this amazingly helpful Guide to Generating Rich Snippets from the folks at SEOgadget.


Two of our favorite types of markup for increasing clicks are videos and authorship, so we'll discuss each below.


4. Video optimization


Pixel for pixel, video snippets capture more search real estate than any other type of rich snippet, even more than authorship photos. Studies show our eyes go straight to them.



Eye-Tracking Google SERPs - 5 Tales of Pizza


Unlike author photos, video snippets are often easier to display and don't require connecting a Google+ account.


Video snippets generally require creating a video XML sitemap and adding schema.org video markup.
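If you'd rather handle it yourself than rely on a service, here is a minimal sketch of what a single-entry video sitemap looks like when generated from Python. The URLs and text are placeholders, and Google's video sitemap documentation lists further optional fields (duration, publication date, and so on):

```python
ENTRY = """  <url>
    <loc>{page}</loc>
    <video:video>
      <video:thumbnail_loc>{thumb}</video:thumbnail_loc>
      <video:title>{title}</video:title>
      <video:description>{description}</video:description>
      <video:content_loc>{video}</video:content_loc>
    </video:video>
  </url>"""

SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
{entries}
</urlset>"""

entry = ENTRY.format(
    page="https://example.com/whiteboard-friday",        # hypothetical URLs throughout
    thumb="https://example.com/thumbs/wbf.jpg",
    title="Whiteboard Friday: Keyword Targeting",
    description="Rand covers keyword targeting best practices.",
    video="https://example.com/videos/wbf.mp4",
)
print(SITEMAP.format(entries=entry))
```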


To simplify things, many third party services will take care of the technical details for you. Here at Moz we use Wistia, which creates a sitemap and adds schema.org markup automatically.


Pro tip: Both schema.org and XML sitemaps allow you to define the video thumbnail that appears in search results. As the thumbnail highly influences clicks, choose wisely.


Recommended reading: Getting Video Results in Google


5. Google authorship


Scoring the coveted author photo in Google search results doesn't guarantee more clicks, but getting the right photo can help your click-through rate in many results.


What makes a good author photo? While there are no rules, I've personally tested and studied hundreds of photos and found certain factors help:



  • Use a real face, not a company logo, cartoon or icon

  • High contrast colors. Because the photo is small, you want it to stand out with good separation between the background and foreground.

  • Audience targeted. For example, young Disney fans are probably less likely to click on an old guy in a suit who looks like a financial adviser.


Google recently got more selective about the author photos it chooses to show, but if you implement authorship correctly you may find yourself in the 20% (according to MozCast) of all search results that include author photos.


6. Improving site speed


Improving site speed not only improves visitor satisfaction (see point #2) but it may also have a direct influence on your search rankings. In fact, site speed is one of the few ranking factors Google has confirmed.


One of the interesting things we learned this year, with help from the folks at Zoompf, is that actual page load speed may be far less important than Time to First Byte (TTFB). TTFB is the amount of time it takes a server to first respond to a request.
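To get a feel for your own TTFB without a third-party tool, a crude measurement script works. This sketch uses only the Python standard library; the host is a placeholder, and note that it lumps connection and TLS setup in with the server's response time, which dedicated tools report separately:

```python
import time
from http.client import HTTPSConnection

def time_to_first_byte(host, path="/"):
    """Rough TTFB estimate: seconds from issuing the request until the first
    byte of the response body is available."""
    conn = HTTPSConnection(host, timeout=10)
    start = time.perf_counter()
    conn.request("GET", path)
    response = conn.getresponse()   # status line and headers have arrived
    response.read(1)                # pull the first byte of the body
    elapsed = time.perf_counter() - start
    conn.close()
    return elapsed

print(f"{time_to_first_byte('www.example.com'):.3f}s")  # hypothetical host
```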



As important as page speed is for desktop search, Google considers it even more important for mobile devices. Think about the last time you waited for a page to load on your cell phone with a weak signal.


"Optimizing a page's loading time on smartphones is particularly important given the characteristics of mobile data networks smartphones are connected to."



- Google Developers

Suggested tool: PageSpeed Insights


7. Smartphone SEO


Aside from speed, if your website isn't configured properly for smartphones, it probably results in lower Google search results for mobile queries. Google confirms that smartphone errors may result in lower mobile rankings.


What is a smartphone error? It could include:



  • Redirecting visitors to the wrong mobile URL

  • Embedding a video that doesn't play on a particular phone (Flash video on an iPhone, for example)

  • Pop-ups that aren't easily closed on mobile

  • Buttons or fonts that are too small on a mobile device


Google recommends making your site responsive, but many of the top brands in the world, including Apple.com, don't have responsive sites. Regardless, a good mobile experience is imperative.


8. Expanding your international audience


Does your website have traffic potential outside your existing country and/or language?


Our international experts like Aleyda Solis know this well, but folks inside the United States have been slow to target specific languages and countries with SEO.


Oftentimes, the opportunities for appearing in international search results are greater than staying within your own borders, and the competition sometimes less. To see if it's worth your while to make an investment, check out this International SEO Checklist by Aleyda (who is also a mobile SEO expert—it's so unfair!)



9. Social annotations with Google+


When you share content on Facebook and Twitter, your network basically sees it only when they are looking at Facebook and Twitter.


On the other hand, when you share content on Google+, your network can see it every time they search Google.


Google's own research shows that users fixate on social annotations, even when presented with videos and other types of rich snippets.


The easiest way to take advantage of this is to expand your Google+ network and share good content regularly and often. Rand Fishkin elegantly explains how to use Google+ to appear in the top of Google results every time.


Additionally, content shared through Google+ often ranks in regular search results, visible to everyone on the web, regardless of their social connections.


10. Snippet optimization


This goes back to basic meta tag and title tag optimization, but it's a good practice to keep in mind.


In the past two years, Google changed the maximum length of title tags so that it no longer depends on the number of characters, but on the pixel width of the displayed title, generally around 500 pixels. This keeps changing as Google tests new layouts.



Because 500 pixels is hard to judge while writing a title, the best advice is still to keep your titles between 60 and 80 characters, or to use an online snippet optimization tool to find your ideal title tag length.
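

If you'd rather approximate the pixel math yourself, here's a rough sketch; the per-character widths are my own coarse assumptions for the SERP title font, not Google's actual metrics, so use it only as a sanity check.

    NARROW = set("ijltf.,;:'!|[]() ")
    WIDE = set("mwMW@")

    def estimate_title_pixels(title, narrow=5, normal=9, wide=15):
        """Very rough pixel-width estimate for a search result title."""
        width = 0
        for ch in title:
            if ch in NARROW:
                width += narrow
            elif ch in WIDE:
                width += wide
            else:
                width += normal
        return width

    title = "Premium Example Widgets | Buy Widgets Online - Example Brand"
    pixels = estimate_title_pixels(title)
    verdict = "may be truncated" if pixels > 500 else "likely fits"
    print("%d characters, roughly %dpx (%s)" % (len(title), pixels, verdict))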


Google also updated its advice on meta descriptions, further clarifying that duplicate meta descriptions are not a good idea. Matt Cutts tells us that if you can't make your descriptions unique for each page, it's better to have none at all.


"You can either have a unique meta tag description, or you can choose to have no meta tag description."



Google's Matt Cutts

Given that duplicate meta descriptions are one of the few HTML issues flagged in Webmaster Tools, does this indicate that Google treats repetitive meta descriptions as a negative ranking factor? Hmmm….
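

If you'd like to catch duplicates yourself before they show up as a flag, here's a small sketch (my own illustration, using only the standard library); the URLs are placeholders, and a real audit would add politeness, error handling, and proper encoding detection.

    from collections import defaultdict
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class MetaDescriptionParser(HTMLParser):
        """Grabs the content of <meta name="description"> from a page."""
        def __init__(self):
            super().__init__()
            self.description = None

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta" and (attrs.get("name") or "").lower() == "description":
                self.description = (attrs.get("content") or "").strip()

    def meta_description(url):
        parser = MetaDescriptionParser()
        parser.feed(urlopen(url).read().decode("utf-8", errors="ignore"))
        return parser.description

    urls = ["https://example.com/", "https://example.com/about", "https://example.com/contact"]
    pages_by_description = defaultdict(list)
    for url in urls:
        pages_by_description[meta_description(url)].append(url)

    for description, pages in pages_by_description.items():
        if len(pages) > 1:
            print("Duplicate description on %d pages: %r" % (len(pages), description))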


11. Updating fresh content


Websites that stop earning new links often lose ground in Google search results. At the same time, sites that never add new content or let their pages go stale can also fall out of favor.


Freshening your content doesn't guarantee a rankings boost, but for certain types of queries it definitely helps. Google scores freshness in different ways, and its signals may include:



  • Inception date

  • The amount (%) your content changes

  • How often you update your content

  • How many new pages you create over time

  • Changes to important content (homepage text) vs. unimportant content (footer links)



Recommended reading: 10 Illustrations on How Fresh Content Can Influence Rankings
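

To put a number on the second signal in that list, a crude way to measure how much a page's text changed between two crawls is a sequence similarity ratio; the sketch below is my own rough proxy, not Google's scoring.

    import difflib

    def percent_changed(old_text, new_text):
        """Approximate percentage of page text that changed between two versions."""
        similarity = difflib.SequenceMatcher(None, old_text, new_text).ratio()
        return (1 - similarity) * 100

    old = "Our guide covers 25 espresso bars across the city, last updated spring 2013."
    new = "Our guide covers 40 espresso bars across the city, last updated spring 2014."
    print("Roughly %.0f%% of the text changed" % percent_changed(old, new))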


12. Ongoing on-page SEO


The factors listed here only scratch the surface of earning more real estate in search results. Issues such as indexing, crawling, canonicalization, duplicate content, site architecture, keyword research, internal linking, image optimization and 1,000 other things can move ranking mountains.


The job of the Technical SEO becomes more complex each year, but we also have more opportunities now than ever.


It's easy to think nothing is new in SEO, or that SEO is easy, or that Google will simply figure out our sites. Nothing is further from reality.


The truth is, we have work to do.










from Moz Blog http://ift.tt/NUWEg5

How to Set Up Meaningful (Non-Arbitrary) Custom Attribution in Google Analytics

Posted by Tom.Capper


Attribution modeling in Google Analytics (GA) is potentially very powerful in the results it can give us, yet few people use it, and those that do often get misleading results. The built-in models are all fairly useless, and creating your own custom model can easily dissolve into random guesswork. If you’re lucky enough to have access to GA Premium, you can use Data-Driven Attribution, and that’s great—but if you haven't got the budget to take that route, this post should show you how to get started with the data you already have.



If you've read up on attribution modelling in the past, you probably already know what’s wrong with the default models. If you haven’t, I recommend you read this post by Avinash, which outlines the basics of how they all work.


In short, they’re all based on arbitrary, oversimplified assumptions about how people use the internet.


The time decay model


The time decay model is probably the most sensible out of the box, and assumes that after I visit your site, the effect of this first visit on the chance of me visiting again halves every X days. The below graph shows this relationship with the default seven-day half-life. It plots "days since visit" against "chance this visit will cause additional visit." If it takes seven days for the repeat visit to come around, the first visit's credit halves to 25%. If it takes 14 days for the repeat visit to come around, the first visit's credit halves again, to 12.5%. Note that the graph is stepped—I'm assuming it uses GA's "days since last visit" dimension, which rounds to a whole number of days. This would mean that, for example, if both visits were on the day of conversion, neither would be discounted and both would get equal credit.
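

As a worked example of the relationship described above, here is a minimal sketch of that stepped halving, under the assumption that a prior visit starts at an undiscounted 50% (an equal split with the converting visit) and loses half its remaining credit per seven-day half-life:

    def time_decay_credit(days_since_visit, half_life=7.0, base_credit=0.5):
        """Raw credit for a prior visit under the stepped time decay model described above."""
        whole_days = int(days_since_visit)  # GA's "days since last visit" rounds to whole days
        return base_credit * 0.5 ** (whole_days / half_life)

    print(time_decay_credit(0))   # 0.5   -> same-day visits share credit equally
    print(time_decay_credit(7))   # 0.25  -> halved after one half-life
    print(time_decay_credit(14))  # 0.125 -> halved again after two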



There might be some site and userbase out there for which this is an accurate model, but as a starting assumption it’s incredibly bold. As an entire model, it’s incredibly simplistic—surely we don’t really believe that there are no factors relevant in assigning credit to previous visits besides how long ago they occurred? We might consider it relevant if the previous visit bounced, for example. This is why custom models are the only sensible approach to attribution modelling in Google Analytics—the simple one-size-fits-all models are never going to be appropriate for your business or client, precisely because they’re simple, one-size-fits-all models.


Note that in describing the time decay model, I’m talking about the chance of one visit generating another—an important and often overlooked aspect of attribution modelling is that it’s about probabilities. When assigning partial credit for a conversion to a previous visit, we are not saying that the conversion happened partly because of the previous visit, and partly because of the converting visit. We simply don’t know whether that was the case. It could be that after their first visit, the user decided that whatever happened they were going to come back at some point and make a purchase. If we knew this, we’d want to assign that first visit 100% credit. Or it might be that after their first visit, the user totally forgot that our website existed, and then by pure coincidence found it in their natural search results a few days later and decided to make a purchase. In this case, if we knew this, we’d want to assign the previous visit 0% credit. But actually, we don’t know what happened. So we make a claim based on probabilities. For example, if we have a conversion that takes place with one previous visit, what we’re saying if we assign 40% credit to that previous visit is that we think that there is a 40% chance that the conversion would not have happened without the first visit.



If we did think that there was a 40% chance of a conversion being caused by an initial visit, we’d want to assign 40% credit to “Position in Path” exactly matching “First interaction” (meaning visits that were the user's first visit). If you want to use “Position in Path” as your sole predictor of the chance that a visit generated the conversion, you can. Provided you don’t pull the percentages off the top of your head, it’s better than nothing. If you want to be more accurate, there’s a veritable smorgasbord of additional custom credit rules to choose from, with any default model as your starting point. All we have to do now is figure out what numbers to put in, and realistically, this is where it gets hard. At all costs, do not be tempted to guess—that renders the entire exercise pointless.


Tested assumptions


One tempting approach is simply to create a model based to a greater or lesser extent on assumptions and guesswork, then test the conclusions of that model against your existing marketing strategy and incrementally improve your strategy in this manner. This approach is probably better than nothing for improving your marketing strategy, and testing improvements to your strategy is always worthwhile, but as a way of creating a realistic attribution model, this starting point is going to set you on a long, expensive journey.


The ideal solution is to do this process in reverse—run controlled experiments to build your model in the first place. If you can split your users into representative segments, then test, for example,



  • the effect of a previous visit on the chance of a second visit

  • the effect of a previous non-bounce visit on the chance of a second visit

  • the effect of a previous organic search visit on the chance of a second visit


and so on, you can start filling in your custom credit rules this way. If your tests are done well, you can get really excellent results. But this is expensive, difficult, and time-consuming.


The next-best alternative is asking users. If users don’t remember having encountered your brand before, that previous visit they had probably didn’t contribute to their conversion. The most sensible way to do this would be an (optional but incentivised) post-conversion questionnaire, where a representative sample of users are asked questions like:



  • How did you find this site today?

  • Have you visited this site before?



    • If yes:



      • How many times?

      • How did you find it?

      • Did this previous visit impact your decision to visit today?

      • How long ago was your most recent visit?






The results from questions like these can start filling in those custom credit rules in a non-arbitrary way. But this is still somewhat expensive, difficult and time-consuming. What if you just want to get going right away?


Deconstructing the Data-Driven Attribution model


In this blog post, Google offers this explanation of the Data-Driven Attribution model in GA Premium:


“The Data-Driven Attribution model is enabled through comparing conversion path structures and the associated likelihood of conversion given a certain order of events. The difference in path structure, and the associated difference in conversion probability, are the foundation for the algorithm which computes the channel weights. The more impact the presence of a certain marketing channel has on the conversion probability, the higher the weight of this channel in the attribution model. The underlying probability model has been shown to predict conversion significantly better than a last-click methodology. Data-Driven Attribution seeks to best represent the actual behaviour of customers in the real world, but is an estimate that should be validated as much as possible using controlled experimentation.” (my emphasis)

Similarly, this paper recommends a combination of a conditional probability approach and a bagged logistic regression model. Don't worry if this doesn't mean much to you—I’m going to recommend here using a variant of the much simpler conditional probability method.


I'd like to look first at the kind of model that seems to be suggested by Google's explanation above of their Data Driven Attribution feature. For example, say we wanted to look at the most basic credit rule: How much credit should be assigned to a single previous visit? The basic logic outlined in the explanation from Google above would suggest an approach something like this:



  • Find conversion rate of new visitors (let’s say this is 4%)

  • Find conversion rate of returning visitors with one previous visit (let’s say this is 7%)

  • Credit for previous visit = ((7-4)/7) = 43%
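

In code, that logic is just a relative lift calculation; the 4% and 7% figures below are the hypothetical conversion rates from the example, not real data:

    def previous_visit_credit(new_visitor_cr, returning_visitor_cr):
        """Share of a returning visitor's conversion credited to their previous visit."""
        return (returning_visitor_cr - new_visitor_cr) / returning_visitor_cr

    print("%.0f%%" % (100 * previous_visit_credit(0.04, 0.07)))  # roughly 43%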


To me, this model is somewhat flawed (though I’m fairly sure that this flaw lies in my application of Google’s explanation of their Data-Driven Attribution rather than in the model itself). For example, say we had a large group of repeat visitors who were only coming to the site because of a previous visit, but that were converting poorly. We’d want to assign credit for these (few) conversions to the previous visits, but the model outlined above might assign them low or negative credit; this is because even though conversions among this group are caused by previous visits, their conversion rate is lower than that of new visitors. This is just one example of why this model can end up being misleading.


My best solution


Figuring out from our data whether a repeat visitor came because of a previous visit or independently of a previous visit is hard. I’ll be honest: I don’t know how Google does it. My best solution is an approximation, but a non-arbitrary one. The idea is to use the percentage of traffic that is either branded or direct as an indicator of brand familiarity. Going back again to how much credit should be assigned to a single previous visit, my solution looks like this:



  • Calculate the percentage of your new-visitor traffic that is direct, branded organic, or branded PPC (let’s say it’s 50%)



    • Note: Obviously most of your organic is (not provided), so I recommend multiplying your total organic traffic by the % of your known keyword traffic that is branded. As (not provided) approaches 100%, you’ll have to use PPC data to approximate your branded organic traffic levels.



  • Calculate the percentage of your 2nd-time-visitor traffic that is direct, branded organic, or branded PPC (let’s say it’s 55%)

  • Based on the knowledge that only 50% (in this case) of people without previous visits use branded/direct, approximate that without their first visit we’d only have seen (100% - 55%) / 50% = 90% of these 2nd-time visitors.

  • Given this, 10% of these 2nd-time visitors came because of a previous visit, so we should assign 10% credit for 2nd-time visits to the first visit.


We can use similar logic applied to users with 3+ visits to calculate the credit deserved by “middle interactions”.
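

Pulling those steps together, here's a minimal sketch of the calculation; the 50% and 55% figures are the hypothetical shares from the example above, and the branded-organic helper simply applies the scaling suggested in the note about (not provided), so both are assumptions rather than anything pulled from the GA API.

    def estimated_branded_organic(total_organic_visits, known_keyword_visits, known_branded_visits):
        """Scale the branded share of known keywords up to cover (not provided) traffic."""
        branded_share_of_known = known_branded_visits / known_keyword_visits
        return total_organic_visits * branded_share_of_known

    def first_visit_credit(new_branded_share, repeat_branded_share):
        """Credit a 2nd-time visit's conversion owes to the first visit, per the logic above."""
        # If the first visit had never happened, repeat visitors should look like new
        # visitors, so scale the non-branded portion of repeat traffic back up by the
        # new-visitor non-branded share to estimate the visits we'd have seen anyway.
        visits_expected_anyway = (1 - repeat_branded_share) / (1 - new_branded_share)
        return 1 - visits_expected_anyway

    print("%.0f%%" % (100 * first_visit_credit(0.50, 0.55)))  # 10% credit to the first visit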


This method is far from perfect—that’s why I recommended two others above it. But if you want to get started with your existing data in a non-arbitrary way, I think this is a non-ridiculous way to get started. If you’ve made it this far and you have any ideas of your own, please post them in the comments below.










from Moz Blog http://ift.tt/1oErKDL

Website Multilingual SEO

Multi Language on Single Domain



from Google SEO News and Discussion WebmasterWorld http://ift.tt/1o2cAK2

Friday, March 14, 2014