SEO

International SEO: Geotarget users vs location specific pages

A common question that comes up with any website that has a global, multilingual audience is what the best structure might be for international SEO. Personally, I find international SEO to be one of the most fascinating areas of the field, since the cultural nuances involved make it a constant learning exercise. Qualitative aspects aside, there are very serious technical decisions to be made, and architecture is one of them.

When you have a website targeted at only one market, or at least only one language, your website architecture and folder decisions are very simple. You will have a set of directories that all live under the root domain and ideally link to each other. Whether a user lives in California or New York, or even Sydney or San Francisco, they will experience the website the exact same way.

From a Google perspective, provided there is no local filter applied, your visibility should be similar in any location. For the queries where you are the most relevant result, there will only be one page to serve to each of these locations, with no intra-site conflict.

Expanding to new markets

The complexity begins when you add other languages; for example, you now want to expand into Spanish for the US market or French for the Canadian market. You will need to determine which pages will be available for these new audiences and where they will live on your website. If you put them in a language folder, for example /fr/ for French, does that mean you also need to put English in /en/, or do you just have a French folder?

Now to make it even more complex, what if you want to serve one set of French content to the Canadian market and another to the French market in France? For those who are unaware, yes, the nuances between these markets can be fairly substantial. (Think about how different UK English is from US English. Yes, you can understand it, but it might not be preferable to use if you are trying to sell something.) Does this mean you now have to create a language folder with a country subfolder, /fr/ca/ and /fr/fr/, or do you do the reverse and have a country folder with a language subfolder, /ca/fr/ and still /fr/fr/?

This question might seem quite binary with regard to Canada and France, but we just created a bigger problem in the US. Do you now need a /us/en/ directory because you are now serving a Spanish audience in the US? Or do you have an English directory /en/ where the default content will live, while Spanish lives at just /es/? This decision will again open up a new can of worms, since you now need to decide what to do if you want to target Mexico.

If you have a default Spanish folder of /es/, you are likely going to have to split that folder up by country too, since Spanish is spoken differently in Spain and Latin America, and varies fairly significantly even between Latin American countries. You will definitely want content for Argentina that is not the same as what you have for Mexico.

Future growth planning

These are very important considerations as you start to build out your international strategy, and you want to make decisions that take into account your future growth plans. It would be quite unfortunate to assume that you will only ever target North America and then decide a couple of years later that South America is now a target market. Not only will you have to build for the multiple languages and dialects in South America, you will also upset the apple cart of your existing North American infrastructure.

SEO considerations

While these are globalization decisions, we haven't even touched on SEO yet. Google is great at parsing intent and strings in English, but in other languages it leaves a bit to be desired. There are a few reasons for this reality, but one of the biggest, at least according to Google's Search Off the Record podcast, is that there is just not enough content in some languages for their algorithms to properly learn from.

Since Google does not have prophetic abilities to know which dialect a user might want, you could leave yourself exposed to users landing in the wrong place if you allow Google to index every country and language variation.

To try to mitigate this issue, Google released a way to inform them of language and country targeting called hreflang (which I will go into in a future post), but from my experience it is hardly foolproof. Hreflangs are easy to mess up, and even if you get them precisely right, they are only a suggestion to Google. Google is free to disregard your suggestion and rank your Mexican page in Argentina if they so choose.

This can also happen in English, where a UK page will be visible in Canada even when the hreflang suggests a Canadian page. I have seen many instances where hreflang either did not solve an international visibility issue or even made it worse. My preference in general, when it comes to things like hreflang and even canonicals, is not to rely on them when the consequences of them being ignored are high. I would much prefer to use absolute solutions, like blocking pages from being indexed or never having the page exist at all.
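For reference, hreflang annotations are just alternate link tags (they can also be delivered via XML sitemaps or HTTP headers) that pair a language code, optionally with a region, to a URL. Below is a minimal sketch, in Python, of generating the tags for the Canadian and French example above; the domain and paths are placeholders, not a recommendation.

```python
# A minimal sketch of generating hreflang link tags for the French example above.
# hreflang values are an ISO 639-1 language code, optionally followed by an
# ISO 3166-1 Alpha 2 country code; the URLs here are placeholders.
ALTERNATES = {
    "fr-ca": "https://www.example.com/fr/ca/",
    "fr-fr": "https://www.example.com/fr/fr/",
    "x-default": "https://www.example.com/",  # fallback for unmatched users
}


def hreflang_tags(alternates: dict[str, str]) -> str:
    """Render one alternate link tag per locale, for the page's head section."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in alternates.items()
    )


print(hreflang_tags(ALTERNATES))
```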

Recommended approach

So, with this in mind, my recommendation is not to index all sorts of directories that are full of near-duplicate content. (Keep in mind that while Canadian French might be different from the French spoken in France, it is mostly similar, which means all of that content will be at risk of being marked as duplicate.) If you do need to create these pages, my recommendation would be to noindex them and make them available only to users who navigate to them on the site.

To summarize, I would recommend indexing a single hybrid page for each major language popular in multiple locations (English, Spanish, Arabic, German, Portuguese, and French) and then, if local pages matter, redirecting users to the correct local page via a geo-detection script. If local pages don't really matter and it's only currency and/or contact info that differs, just those elements should be swapped out with a geo-detection script.
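As a rough illustration of the geo-detection approach, here is a minimal sketch assuming a Flask app and a hypothetical lookup_country() helper backed by whatever GeoIP provider you use; the routes and mapping are placeholders, not a prescription.

```python
# A minimal sketch, assuming Flask and a hypothetical GeoIP helper, of redirecting
# users to noindexed local variants while one hybrid page per language stays indexed.
from typing import Optional

from flask import Flask, redirect, request

app = Flask(__name__)

# Hypothetical mapping from detected country to a local, noindexed page variant.
LOCAL_VARIANTS = {"CA": "/fr/ca/pricing/", "FR": "/fr/fr/pricing/"}


def lookup_country(ip: str) -> Optional[str]:
    """Hypothetical GeoIP lookup; swap in your own provider here."""
    return None  # stubbed out for this sketch


@app.route("/fr/pricing/")
def french_pricing():
    country = lookup_country(request.remote_addr or "")
    target = LOCAL_VARIANTS.get(country or "")
    if target:
        # A 302 keeps the indexed hybrid page as the URL search engines see.
        return redirect(target, code=302)
    return "Hybrid French pricing page"  # placeholder for the real template
```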

This advice would only change when there's a substantial difference in how users in these countries search for the primary keywords. If the primary keywords are the same, and it's only the sentence phrasing and tail keywords that change, minimize the headache and just have one page per language.

A couple end notes:

I didn’t weigh in on how the directory structures should be set up for the user when SEO is excluded because I think this is going to be an individual decision.

Under no circumstances should you consider creating new domains for each country.

This post is a brief summary of the decisions you will need to make; please reach out if you have questions!

SEO

How to Forecast and Quantify SEO opportunities

When a team is debating investing resources into an SEO effort, they will (and should) inevitably ask the right question about expected ROI. The common approach to SEO forecasting is to take search volume from a tool like Ahrefs, Semrush, or Google's Keyword Planner and then layer on an expected conversion rate from search impressions to clicks to the site. Since these tools mostly use exact match (rolling up spelling differences), the forecast will include a grossed-up estimate to account for searches that contain the keyword but are not exact matches.
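To make the arithmetic of that traditional approach concrete, here is a minimal sketch; the gross-up multiplier, click-through rate, and conversion rate are placeholder assumptions, not benchmarks.

```python
# A minimal sketch of the traditional forecast: exact-match volume, grossed up for
# near-match searches, times an assumed CTR and an assumed conversion rate.
def forecast_monthly_conversions(
    exact_match_volume: int,
    gross_up_multiplier: float = 1.5,  # assumed share of near-match searches
    expected_ctr: float = 0.20,        # assumed organic click-through rate
    conversion_rate: float = 0.03,     # assumed site conversion rate
) -> float:
    expected_clicks = exact_match_volume * gross_up_multiplier * expected_ctr
    return expected_clicks * conversion_rate


# e.g. 10,000 exact-match searches per month -> roughly 90 conversions
# under these assumptions
print(forecast_monthly_conversions(10_000))
```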


I used this method a decade ago and I think now it should be considered somewhat obsolete. Here are 5 new ways to estimate volume and potential ROI for your effort.

  • Ads on Google – Running an ads campaign on Google is by far the easiest way to get an accurate gauge of potential search volume AND ROI. If you bid on the keywords you hope to generate traffic from and send that traffic to a replica of the SEO page you are planning to build, you will have immediate benchmarks.

    Google will tell you what your maximum budget would be if you removed the financial shackles, and that gives a good sense of how much search volume there is for your chosen keywords. More importantly, the conversion rate you see from this paid channel should be similar to what you might get from organic, considering these are the same people coming from search engines.

    To truly maximize the value of this test, you can also test potential SEO meta in the ads by putting your title in the headline field and your meta description in the subheadline. You will get immediate feedback on the value prop you plan to use, without needing to wait for the SEO efforts to scale.


  • Traffic Estimation  – Use a traffic estimation tool like Ahrefs, Semrush or Similarweb to look at possible competitor websites and make assumptions on how you might get the same traffic. These don’t have to be direct competitors; rather they should be search competitors. If users are going to Wikipedia for their answers now, look at the traffic of that Wikipedia page. The same goes for any other informational website that you will be competing with for your targeted topics.

    The traffic you see in these tools is more like a total addressable market, so you can't assume that you will get all of it. Apply an expected penetration percentage, and this becomes the gross traffic number to use in your revenue forecasts.

  • Branded keywords – Find the biggest brands in your topic space and use their estimated traffic (from the same tools above) as a predictor of total demand for the product. This is very specific, so message me if I can help calculate it for you, but here's an example.

    If you are building a product in the real estate space and want to estimate total traffic, you might look at how much traffic there is for the keyword "zillow" and then gross that up by a brand vs non-brand split. In my opinion, it would be fair to at least double the number you find. (A rough arithmetic sketch of this gross-up, and of the penetration assumption above, follows this list.)


  • Facebook interest categories – This is a bit of a different idea since it is not exactly search volume, but you can use it to estimate total addressable market. Use the Facebook Ads tool to see how many people are interested in a particular topic, and that would be your TAM for search volume. You will still need to pick keywords, but at least you know what the upper limit might be.
  • Google Search Console – This is a bit of a chicken-and-egg issue because if you don't have search traffic yet, there won't be anything to see here. However, if you are lucky enough to have accidental SEO traffic, then the impression volumes you see for keywords (where you rank on the first page) will be an accurate reflection of what the search demand truly is.

    Using the keywords you already have search traffic on, you would then use a search tool to find additional keywords and make assumptions on traffic based on the keyword volumes you know.
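Here is the rough gross-up and penetration arithmetic referenced in the traffic estimation and branded keywords ideas above; the assumed brand share and penetration rate are placeholders, not benchmarks.

```python
# A rough sketch of the gross-up and penetration math from the ideas above.
def total_demand_from_brand(brand_keyword_traffic: float, brand_share: float = 0.5) -> float:
    """Gross up a big brand's estimated traffic by an assumed brand vs non-brand split."""
    # An assumed 50% brand share is the "at least double it" rule of thumb.
    return brand_keyword_traffic / brand_share


def forecast_from_tam(total_demand: float, expected_penetration: float = 0.05) -> float:
    """Apply an expected penetration percentage to a total addressable market figure."""
    return total_demand * expected_penetration


# e.g. 2M monthly brand searches -> 4M total demand -> 200k visits at 5% penetration
print(forecast_from_tam(total_demand_from_brand(2_000_000)))
```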

As is apparent from all of these ideas, SEO forecasting is very fuzzy, more an art form than a precise science. In my years doing SEO, I have never seen an SEO forecast match reality. Either the numbers fall short for a variety of reasons, or they succeed beyond any prediction.

Regardless, businesses can’t make investment decisions without forecasts on ROI. To be fair, other channels also rely on fuzzy forecasts – TV anyone? – but SEO ends up being even more opaque because there is very little information that is publicly shared and considered to be rock solid reliable.

Unlike advertising channels, where the ad networks (Google Ads included) are incentivized to share performance data, there is no good reason for Google Search to share organic forecasts. Therefore, we have to operate somewhat in the dark with our hands tied behind our backs. That aside, if you do your job right, you will hopefully blow your forecasts out of the water.

These five ideas are just seeds to hopefully get you started with ways you can estimate search potential without having to rely on methods that were never that reliable even when they were common practice. If you have other ways of estimating and forecasting search potential I would love to hear your ideas.

SEO

Does page speed really lead to SEO improvement?

Ever since Google announced that page speed is a critical part of the Google ranking algorithms, SEO audits have included sections on page speed. In many cases, I have seen a good chunk of an audit devoted to page speed, complete with screenshots of homepage page speed, internal pages, and even competitor comparisons. These screenshots from popular page speed measuring tools can be quite ominous, with waterfalls showing how many scripts are called during page load, or a big red pie chart showing a large chunk of the site in the slow category.

The logic behind most of these reports is that if page speed is slow then a site (or page) will experience low visibility in search. The converse is that if page speed is fast, a site (or page) will experience high visibility. If a site moves from slow to fast or vice versa, the visibility impact should be apparent. The goal, which is noble of course, is to improve a site’s visibility by making the necessary improvements.

While it would be great if something as straightforward as making site speed improvements translated into more visibility and search traffic, in reality I have never personally seen this happen. Not only have I never seen speed improvements lead to more search visibility, I have also never been able to pinpoint a site's low visibility as being related to a speed issue.

I posed this question on both Twitter and LinkedIn, asking for case studies, and on both channels I did not see conclusive data showing that page speed moved the needle for the majority of websites. There were a few examples in the responses, but they appeared to be outliers with learnings that would not apply to all websites.

Speed for conversion

While I believe it is critical to have a fast website for conversion purposes, I am skeptical that there could ever be search visibility improvements (defined as more URLs ranking higher in search results) directly attributable to speed. So if there are page speed improvements to make, by all means make them. The difference is where you expect to see the ROI.

If any website were to make speed improvements specifically for SEO ROI, I don’t think this expense would be justified. On the other hand, if the speed improvements were made with the expectation of conversion/revenue improvements this could be justified.

The difference here would come down to the nature of the business. Any website that exists for immediate transactional purposes like ecommerce or any product/service that is sold online to consumers would likely benefit from speed improvements. Faster pages would minimize drop-offs that might happen if the time to sale takes too long.

However, a website in a B2B space where the content exists for more informational purposes, or a content-only site, may not see the same conversion/revenue improvements from page speed. (One exception is an ad-funded content website. If the ads take too long to load, the user may never even see them before they leave.)

One other aspect that should be considered in deciding whether to make a page speed investment is how much of a difference the investment might make. If a site is on shared hosting, moving it to dedicated hosting might be a world of difference. Or, if a site takes multiple seconds to load, cutting that number down by half would probably be a great idea. However, if a site is already fast, but just not fast enough, shaving off milliseconds may not really move the needle at all when it comes to actual engagement.

I have seen this site speed recommendation made to clients who already operate their own data centers. In nearly every scenario, there is really nothing actionable they can do to improve their site speed, and even if there were, it is unlikely to ever be worth the large expense. Certainly, if there are dead scripts to remove from the page load, they should be removed, but refactoring an entire site or upgrading to more data centers should be out of the question.

In summary, if you see a recommendation to revamp your entire website's structure to improve page speed with the promise of better SEO, take it with a grain of salt. If there is low-hanging fruit to improve overall speed, of course make those fixes, but don't spend a significant amount of money hoping that page speed will make enough of a difference in SEO visibility.

If anyone reading this has an example of a website that improved their site’s visibility purely from page speed improvements, I would love to see it!

Note: I deliberately did not address core web vitals because I want to do more research on it.

SEO

Digital PR vs Linkbuilding

On Twitter, Google’s John Mueller made a positive comment about what he referred to as digital PR and said it’s a shame it gets lumped in with spammy link building. His tweet kicked off a whole thread which is certainly worth reading, but I think it brings up some very interesting points.

Before digging into the differences between digital PR and link building, it's important to understand how link building even became an SEO tactic. Decades ago, when search engines were first released onto the newly developed Internet, rankings were driven by how well the words on webpages matched the words that users searched for. As you might imagine, the early sentiment that the Internet was for adult searches leaked into early SEO tactics. Even if a website had nothing whatsoever to do with adult content, it might have used some adult keywords on the site in order to generate some extra visits.

Google and Links

Google’s innovation was that they didn’t just look at the words on a website, but they also incorporated an assessment of a website’s authority based on who linked to that website. The authority of the incoming link was a calculation of all of that site’s inbound links and so on to infinity. While early SEO efforts manipulated keywords, Google’s focus on links directed manipulation efforts into links.

Initially, there were many ways to manipulate links in your favor and gain higher rankings on Google. Cash-flush website owners simply bought links from websites that were willing to sell them. Those with lots of time on their hands developed web properties specifically for the purpose of linking back to themselves. Others bought software that dropped spammy, link-stuffed comments onto websites without comment moderation. The most sophisticated link builders built entire networks of sites that linked to theirs via pyramids, with the dirtiest sites linking to less dirty ones and the money site at the top.

As search engine algorithms – Google's in particular – became more sophisticated, many of these link tactics stopped working. Algorithms became wise to the non-value of links within comments or unrelated pieces of content. As a result, link builders shifted their tactics. Out with the trash went software that placed links on unsuspecting sites, and even pyramid link building stopped working. Google claimed that they incorporated domain registration dates into the ranking of a website, so even the idea of buying expired domains became suspect.

All that was left of the old link building tactics was raw relationship building to generate links. Links can still be, and still are, bought and sold, but it happens over email rather than on a publicly accessible exchange. The net result of the link effort is usually a permanently placed link within the navigation of a website, or a guest post (an authored piece of content contributed by a guest that contains a link).

Even this kind of tactic, when executed poorly, can be flatly ignored by Google's and other search engines' advanced artificial intelligence algorithms. Yet if links are a critical part of a website's visibility and search engines ignore common tactics, how is a site supposed to get links? The answer is digital PR.

Digital PR is a link building tactic, but unlike other link building tactics, the focus is on the PR, not the link. When you conduct a proper PR effort, the goal is to get the right brand mentions, not just any mentions. If all a company wanted to do was get mentions of its brand, it could just pay for a one-time release from a company like PRweb, and the release would appear on many websites for a few hours.

Instead, actual PR requires enticing a reporter or website to WANT to talk about your product, brand, and company. The more reputable the journalist and media outlet, the harder that enticement will be and the greater the barriers to actually getting the mentions. This challenge makes the mentions, and hopefully links, even more valuable.

In short, the goal of link building is not the link; it is getting someone to care enough about you to give the link. The inputs to this process are not easily manipulated, since you need to create something of actual value while at the same time building a relationship with someone who can talk about you.

In this respect, digital PR is not just a type of link building; it is really its own thing. Link building is an outcome of digital PR. Whenever anyone I work with asks how to build links, I always encourage them to think in terms of PR first and links second. If they have an existing PR team, that team should incorporate some SEO best practices into their processes – namely getting links to internal pages – but otherwise I recommend that they hire a PR agency that is great at SEO. (Side note: reach out if I can help with an intro to the greatest SEO PR firm.)

This is why John Mueller was praising digital PR efforts: they create great things worth linking to, and they generate links in the purest way possible.

SEO

Pick the low hanging fruit of international SEO

For many websites, international SEO is sort of the last frontier. The marketers responsible for traffic growth would sooner dig into how to optimize for Google Discover than into a language they don't speak. This makes sense from a logical standpoint, of course, since it's natural to shy away from what is unfamiliar.

However, I think that digging into the exotic and unfamiliar can be even more profitable than sticking with what you already know. If you are only optimizing in English for people in the US, you are potentially giving up some very low hanging fruit.

You might object and say that you don't have any non-English or non-US users, but I do not believe that is ever the case. The web is global, and unless you deliberately block users from outside the US or those without English-language browser settings, you are most definitely getting visitors from around the world.

To prove that this is the case, go into your Search Console and under the performance tab look at countries. You very likely will have the majority of your traffic coming in from the United States (or maybe Canada), but you should see other countries there too. From a conversion standpoint, you can look for these global visitors in your analytics suite and you will most likely observe that their conversion rates and overall engagement are far lower than your other traffic sources.

Aside from the obvious, there is another reason to optimize for international SEO: optimizing content outside of the English language is far simpler! There are a number of reasons for this. First, there is the raw volume of content created in English compared to what exists in other languages. While English is not the most spoken language in the world (Mandarin has more speakers), much of the content on the web existed in English first. When you create content in English, you are competing with that many more websites than if you did the same in Spanish.

As an example, a search for the word “shoes” in English will have many more results than a search in Spanish for “Zapatos”.

Even on a word like shoes, there are just so many possible results. This paradigm holds true for nearly any word you could search in a popular language, and the gap is even greater in a less popular language. While hundreds of millions of people speak Spanish, a significantly smaller group uses the web in German or French, even though the purchasing power of the speakers of those languages is quite substantial.

This ratio is even greater when you drop into even less popular Internet languages, like those in Southeast Asia. There simply is not enough content for all the possible queries.

Furthermore, when you are competing against other sites in other languages, the quality bar you need to clear is a lot lower. Building strong content and links might be a challenge in English, but in another language the quality you need for strong visibility will be much lower.

In addition to the raw numbers, another reason that international SEO can be so profitable is that search algorithms, despite many advancements in language processing and AI, are still not as good at search in other languages as they are in English.

This is a combination of a natural bias towards English in machine learning, because the engineers who create the algorithms speak English, and the fact that there are so many more users searching in English and thereby training the algorithms. Standard features within search, like "did you mean" on spelling mistakes or intent matching on queries rather than straight string matching, will be less robust in other languages. This means there are areas where traditional SEO tactics driven by keyword matching will be successful in ways they have not worked in English for many years. You will have a greater opportunity for high visibility in search if you incorporate these tactics, rather than the way it works in English, where you do your best and hope it is recognized by the algorithms.

In short, if you are neglecting international SEO, just look for easy opportunities to tackle anything simple in other languages.

Here are five things you can try right now:

  1. Use a freelancer to translate your homepage
  2. Translate your contact page into another language
  3. Translate your top blog posts into the non-English languages that are the most popular in your search console
  4. Create a way for users to pay you if they use another currency. This may already be accomplished with your credit card processing but make sure whatever solution you are using does not cause any issues.
  5. Identify keywords that might be popular in other languages and incorporate them into landing pages you create specifically for those other languages.

The efforts above will give you a taste of the potential for global traffic on your website. Finally, I will leave you with this stat: only 12% of the world's population can even read and understand English. If you focus only on English, you are targeting just a sliver of the total Internet market.

SEO

Ranking Factor vs Ranking Signals for Search

I was listening to the Search Off the Record podcast from Google (sidenote: if you don't listen to it yet, go do that now) and heard Gary Illyes refer to something as a ranking signal. I have heard thousands of words from Gary at conferences, in videos, and in personal conversations, but this was the first time his usage of the phrase "ranking signal" rather than the commonly used "ranking factor" really jumped out at me.

I went on Twitter and searched his past tweets and blog posts, and there were hundreds of mentions of the phrase "ranking signal" going as far back as I could see. Yet if you Google the term "ranking signal," Google considers it a synonym of "ranking factor," and the results mention ranking factors rather than signals.

This got me thinking about the distinction between those two words, "signal" vs "factor," and how the usage of "factor" contributes to a common misconception of how search algorithms actually work. Many people look at search ranking factors as a precise mathematical formula for success; if you score highly on each of these factors, you are guaranteed SEO success.

This could not be further from the truth. Let's take a few of these and walk through how this doesn't actually add up. Moz has a great page on ranking factors which they update every year, so I will use it as a source for this discussion. According to Moz, the title tag is the second most important ranking factor for a page. I don't necessarily disagree; however, the importance of this as a ranking factor is absolutely flexible. I have seen pages rank well on valuable terms without any title tag whatsoever; likewise, I have seen pages with fantastic titles rank deep in the search index.

The reason for these results is that a title tag is simply one input taken into account when calculating rankings for keywords relevant to that page – not, of course, for every word that exists in the world.

Now let's look at a second example: page speed. A fast-loading website and page are of course important, but in no way does a fast website guarantee SEO success, just as a slow website does not guarantee SEO purgatory. If you are curious, Google some popular terms and you will notice that the page speeds of top-ranking websites are all over the board. I have even seen a site that consistently gets a score of 1 out of 100 (the lowest) rank #1 on extremely valuable travel terms for every city in the US.

Taken in context, I think these two examples demonstrate the definition of a "signal" rather than a "factor." The word factor is a mathematical term, defined as "a number or quantity that when multiplied with another produces a given number or expression." By that definition, a factor that is present needs to be included as part of the equation, and when I was taught math in elementary school, excluding a number from a math problem was not an option.

However, a signal can be ignored or interpreted in a more flexible way. When you are driving and see a red light signal, you are being told that cross traffic is likely going to be in front of you, but there is nothing actually stopping you from ignoring that signal if you don’t see any traffic. If you are willing to risk a ticket from the police, you can certainly make a choice to use your own judgment and experience in ignoring that signal.

This is how I believe search engine algorithms calculate visibility when choosing which webpages to show for a given query. Having a great title tag might be indicative of what a webpage is about, but the algorithm can simply choose to ignore it if other signals override it. Unlike a factor, which has to be taken into account in a calculation, signals can be magnified or minimized depending on the particular situation. This means that, as elements of how a website ranks, each signal has variable strength rather than a fixed value like a factor.
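To illustrate the distinction, and only as a toy, not as how Google actually works, here is a sketch contrasting a "factor" model, where every input must enter the calculation, with a "signal" model, where inputs can be amplified, dampened, or ignored; the weights and thresholds are made up.

```python
# A toy illustration, not Google's actual algorithm: factors always enter the
# calculation, while signals can be amplified, dampened, or ignored entirely.
def score_as_factors(title_match: float, page_speed: float, link_authority: float) -> float:
    # Every input is multiplied in, so a missing title tag (0.0) zeroes the score.
    return title_match * page_speed * link_authority


def score_as_signals(title_match: float, page_speed: float, link_authority: float) -> float:
    score = 0.0
    # Hypothetical if/else rules of the kind described above.
    if link_authority > 0.8:
        score += 2.0 * link_authority  # strong link evidence is amplified...
        score += 0.2 * title_match     # ...and the title matters less
    else:
        score += link_authority + title_match
    if page_speed >= 0.2:              # a slow page is simply ignored, not fatal
        score += 0.5 * page_speed
    return score


# A page with no title tag but strong links can still score well as signals,
# while the factor model zeroes it out.
print(score_as_factors(0.0, 0.9, 0.95), score_as_signals(0.0, 0.9, 0.95))
```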

This line of thinking is even more appropriate when applied to link building efforts. Rather than approaching links the way some backlink tools do, as a mathematical calculation, remember that only the algorithm knows the true value of each link: each link is a signal of quality that may or may not be counted, rather than an absolute contribution to a site's total value.

When you build your SEO efforts, focus on each of these declared factors as signals rather than ingredients for success. Accrue as many positive signals as you can, but don’t feel like ranking factors are a checklist that you must achieve. How a website earns visibility is a confluence of all of these signals and you can very well rank in strong positions without a perfect score.

To end with one clarification that I hope doesn't confuse things: while it is helpful to think of search engines ranking websites based on signals, the algorithms absolutely do treat these signals as mathematical inputs. Algorithms are long series of code and do not have the intellect or emotion to choose which signals to trust. Rather, they take everything they know about a website and a query and maximize, minimize, or ignore signals based on if/else rules. But knowing exactly how this works is unnecessary; from a practical standpoint, just think of everything as a positive or negative signal that may or may not be counted in a ranking.

SEO

Brand vs Non-Brand SEO

I am a huge proponent of thinking about and sharing SEO performance at an aggregate level. This means that strategies and tactics should always be targeted at improving search visibility across the board, for all URLs and all keywords. This is the flip side of granular SEO, which focuses efforts on improving very specific keywords on a select set of URLs.

Generally, I believe that position tracking of any sort is not a useful KPI, because it is not aligned with overall business performance unless the business somehow manages to monetize visibility on Google that doesn't even require a user to click. Positions on Google may be indicative of SEO performance, but they are not a KPI in their own right. Keyword-level SEO may blind a business to its growth opportunities.

A business may be doing poorly on the keyword it had hoped to be visible for and not notice how well it is doing on other, even better phrases. As a result of this tunnel vision, it will inevitably miss opportunities for improvement in places where it is already doing well. Viewing SEO performance at a holistic level means that all traffic is good and worthy of being measured.

However, there is one area that I strongly believe should be measured at a keyword level, and that is splitting out branded vs non-branded traffic. Branded traffic is organic clicks that come from queries containing the brand name or a misspelling of the brand name. Non-branded traffic is anything that does not contain the brand name.

Measuring brand vs non-brand

If a brand does not rank for its brand name, that is a huge red flag. Google's worst penalty will demote a website far into the results for a search on its own brand name. A friend who worked at Google on the search quality team told me that internally it was referred to as the "Black Penalty". Assuming there is no penalty issue, a brand not ranking for its own name indicates that there is a technical issue that needs to be addressed, but once that is fixed, there likely will not be continued growth on brand terms beyond the natural growth of the brand itself.

Branded traffic is great, but it is not really SEO – this traffic is the result of brand efforts such as PR or word of mouth. Non-branded traffic is SEO; it is the net result of your strategy, tactics, and effort. For the most part, this bucket of traffic is the ROI of your SEO investment.

To reveal what is actually non-brand traffic, you will need to filter out all brand traffic. This will include variations of the brand name, so you will want to distill your filter for branded keywords down to the lowest common denominator. For example, when I was at SurveyMonkey, my brand filter looked only for the string "mon", since that even picked up the misspelling "survey money" and of course caught the most common brand terms, "survey monkey" and "surveymonkey".

Having worked on the SEO campaigns of a number of massive brands as well as startups with no brand, I believe that a healthy brand vs non-brand ratio sits somewhere between 40% and 60% branded. This is a wide range, but a primary driver is how big the brand is relative to the space. That being said, if branded traffic – even for an incredibly well-known brand – is north of 80%, there is inevitably a lot of traffic being left on the table.

When I first joined SurveyMonkey in 2012, global branded traffic was over 90% and over my tenure there we were able to bring that into the ideal range even while branded traffic itself grew double digit percentages year over year.

Whenever I start working on a new SEO project, the very first thing I do is get a sense of brand vs non-brand traffic. Millions of impressions per month are not indicative of a great SEO strategy if most of the queries driving those impressions are branded. I don't stop at impressions either, as people need to be clicking those URLs for the traffic to be meaningful. If the brand vs non-brand ratio looks very different when sliced by clicks, then the non-branded visibility might be on the wrong kinds of queries for acquiring users.

When I find that the brand vs non-brand ratio leans too heavily to the brand, I add improving this ratio to my list of KPI’s that SEO should be measured by. This means that I will check in on how this ratio is changing on a fairly regular basis.

For those who have never looked at this metric before, here's a quick primer on how to do so. The data source to use is Google Search Console, and you can access the data via either the user interface or the API. Obviously, the UI is the far easier option, but beware that there may be data inconsistencies where the sum of the parts does not equal the whole. If this happens to you, you can fall back to using the API.

  1. Within the UI, choose a relatively short time period, like the last month.
  2. Open up the query lookup box, choose "contains", and then enter your shortened brand term.
    1. Grab the total brand clicks.
    2. Grab the total brand impressions.
  3. Go back into the query lookup box, change the option to "not contains", and keep that same brand term in there.
    1. Grab the total non-brand clicks.
    2. Grab the total non-brand impressions.
  4. Divide either of these by your total clicks before filtering, and that is your brand/non-brand percentage. (A scripted version of the same calculation is sketched below.)
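For those who prefer to script it, here is a minimal sketch of the same calculation against a query report exported from Search Console; the column names ("Query", "Clicks") and the brand substring are assumptions you would adjust to match your own export and brand.

```python
# A minimal sketch of the brand vs non-brand split computed from an exported
# Search Console query report. Column names and the brand substring are
# assumptions; "mon" is the SurveyMonkey example from above.
import csv


def brand_split(csv_path: str, brand_substring: str = "mon") -> float:
    """Return the share of clicks coming from queries containing the brand substring."""
    brand_clicks = total_clicks = 0
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            clicks = int(row["Clicks"])
            total_clicks += clicks
            if brand_substring in row["Query"].lower():
                brand_clicks += clicks
    return brand_clicks / total_clicks if total_clicks else 0.0


# e.g. print(f"{brand_split('queries.csv'):.1%} of clicks are branded")
```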

There may be no action at all that you can take from this data, but knowledge is still power. This is a quick temperature check on whether you have SEO traction – or not. You can use this number to justify additional budget, get a raise, or find queries worth optimizing for. I believe this is one of the most important metrics in any SEO effort, and it should be measured on a regular basis.

In addition to the lookups above, I have built some Data Studio dashboards which might be useful for you and I would be happy to share. Reach out and let me know if you would like to copy them!

SEO

Google Lens and Assistant show the power of Google Search AI

There are two underrated – at least when it comes to SEO thought – tools from Google that I believe really highlight the future of Google Search. In addition, these tools give a peek under the hood at the extent of Google’s AI that is always in use for every search on Google.

Google Assistant

The first tool I want to highlight is the Google Assistant, which powers Android phones and Google smart devices. For those who are unfamiliar with the Google smart device ecosystem, the Assistant is Google's Siri or Alexa. (Sidenote: maybe they should give it a cutesy name?) While it is a key component of Made by Google devices and operating systems, it is also available to anyone who wants to conduct a voice search in Chrome.

The Assistant is so much more than just a tool to avoid typing a query into Google. On many queries, you can watch in real time as it listens to what you are saying and then autocorrects itself based on the full set of strings. For example, it might hear you say "Kobe," but when that is paired with the word it thinks it heard as "systems," it will assume you meant "Covid symptoms" and modify the query.

This is much more advanced than just spell check: if you added the word "lighting" after "Kobe," it would then determine, based on your history, whether you meant "lighting," which is a Nike shoe, or "19," which again is related to Covid. I think the query correction alone shows just how powerful Google's algorithms are, both from a query intent detection standpoint and, of course, in matching content with intent. In my opinion, it shows how futile it might be to use obsolete SEO tactics like keyword stuffing.

However, the Assistant is more than just a language processing tool. It can also do things that require more intelligence. It can identify songs from just a tune that you hum, which shows it doesn't even need actual words to conduct a search. It can also take strings and, by understanding query intent, transform them into totally new queries. A query like "I need gas" will be translated to "gas stations near me," and "hungry for lunch" will become a combo query of places to eat nearby that also have "lunch" in their reviews.

These are really simple queries because they add keywords to the query, but there are many that transform into an entirely new query. “Do I need a jacket today” will trigger a weather forecast and “is there traffic on the road to home” will bring up a map of directions towards your home.

Google Lens

The second tool that shows the power of Google’s AI is Google Lens. This is a feature that is a part of all cameras on Android phones, but it can be downloaded as a stand alone app for non-Android devices. This app allows users to point their camera at anything and Google can either search for that object or use optical character recognition to search Google for any words it sees.

Again, this stretches well beyond the way most people approach search. Whereas many years ago it was deemed important to add image alt text to declare to a search engine what an image might be, more than likely Google now knows exactly what an image is regardless. Additionally, Google may even be reading text within images and adding that to its database entries for a page to determine relevancy based on that content.

This is likely not widespread just yet given the high cost of recognizing every image and reading all text, but the capabilities are certainly built within the search algorithms. Once this is a common function of search algorithms we can expect podcasts to be indexed based on the words within the audio (similar to passage indexing) and video – not just YouTube – to be indexed based on the video content.

In short, the algorithms that power these tools are the exact same algorithms that operate on every query conducted via a search box on Google. The AI at work in the Assistant and Lens is so much deeper than the simplistic syntax and text-matching database search engine that Google started with two decades ago. Therefore, the strategies and tactics to succeed in this SEO paradigm have to be just as complex.

Don't fall into the trap of believing that visibility on Google is simply the result of a formulaic approach. Search rankings have always been dynamic, and the inputs are more diverse than they have ever been. Google may only use a select number of ranking variables in its algorithms, but those variables are run against an ever-expanding set of new variables.

opinion

User Generated Content is not an SEO silver bullet

Content creation for search is expensive and complex, so marketers are always looking for shortcuts that will unlock scalability while reducing expense. Throughout my career in search, the one idea that everyone always seems to gravitate to as the "secret" formula for creating content without actually creating content is UGC (user generated content).

Having your users create the content seems like an amazing win-win. Your users will be engaged, and they will be helping you create content – for free. There's only one problem with this logic, and it happens to be a fairly big problem: it hardly ever works. Users rarely feel motivated to create content, and when they do, it's low quality and lacking any substance. Or worse, you get flooded with UGC, but it's all bot-written spam that you need to devote resources to moderating.

But there's an even bigger problem with using UGC for search: in my experience, it never seems to drive traffic. In every attempt I have ever participated in to use UGC as a content hook for search crawlers, even with substantial amounts of content, only dribbles of impressions and clicks came from that content. Additionally, in my research of other sites that have loads of UGC, this content does not seem to be what draws in the majority of their search visibility.

My assumption is that Google is able to parse out what is UGC vs non-UGC and can weight the authority of the strings accordingly. One person’s unqualified opinion in a comment on a page does not earn the level of authority that will outrank another page with similar content but written by an author on the site.

I have only seen two notable exceptions to this rule: Tripadvisor and Amazon. Tripadvisor benefits from the extreme long tail that is hardly ever tackled by other sites. Think queries like "boutique hotels with heated pools" or "hotel beds that could fit 5 children." Tripadvisor helps this indexation along by adding tag categories which highlight reviews that contain a word like "pool" or "platinum".

Likewise, Amazon is able to generate visibility on long tail queries because it is the most authoritative site that might have content around, for example, the "battery life" of an electronics product. It's not that it has authority because it is Amazon; it's simply that it is the only one with the content. Up until recently, I had thought this was completely organic, but then I noticed in my app after a purchase that Amazon was encouraging me to write a review with prompts on keywords to use.

I don't think this is at all sneaky; it is actually something that other sites with a desire to create UGC should learn from. This is even more useful if the UGC helps potential customers who are considering a business, such as on Yelp or Google My Business. Encouraging the review writer to include specific wording beyond generic platitudes will go a long way toward helping people make decisions from the content.

In summary, UGC isn’t to be considered any sort of silver bullet for SEO success that will unlock hidden growth without needing a full content library; however, if you are going to attempt to use UGC for SEO, here are some good best practices to follow.

  1. Encourage users to use key phrases that will be useful for your content purposes. If you want them to review specific aspects of your products or services, prompt them with how you want them to review it.
  2. You can take it even further by asking them to answer questions like "how long did the battery last?" or "did our technician arrive on time?"
  3. Ask for UGC in an offline format such as using an email or survey and then publish the content manually. This obviously is not as scalable as just using straight UGC, but it does eliminate many of the downsides of low quality or spam content.
  4. Incorporate the UGC into the body of the content and do not just leave it at the bottom of the authored content. From an algorithmic standpoint, this might make it more challenging for a search engine to determine which content is UGC should they even want to demote it.
  5. Finally, in everything you do, make sure your basics are covered. A) Have a blacklist for spam and vulgar words. B) Remove any and all links embedded within the content, as there will rarely be a reason for UGC to link elsewhere. C) Moderate your content before it publishes. (A minimal sketch of these basics follows this list.)
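As a starting point for point 5, here is a minimal sketch of those basics: check a blocklist, strip embedded links, and hold every submission for moderation. The blocklist entries are illustrative only.

```python
# A minimal sketch of the basics in point 5: blocklist check, link stripping,
# and manual moderation before anything publishes. Entries are illustrative only.
import re

BLOCKLIST = {"viagra", "casino"}
LINK_PATTERN = re.compile(r"https?://\S+|www\.\S+", re.IGNORECASE)


def prepare_ugc(submission: str) -> tuple[str, bool]:
    """Return (cleaned_text, keep_for_moderation)."""
    lowered = submission.lower()
    if any(term in lowered for term in BLOCKLIST):
        return "", False                 # reject outright
    cleaned = LINK_PATTERN.sub("", submission).strip()
    return cleaned, True                 # still goes to a human moderator


# e.g. prepare_ugc("Great battery life! See www.spam.example for deals")
# -> ("Great battery life! See  for deals", True)
```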

If UGC can be helpful to your KPIs, you should absolutely use it as a strategy; just know its true value and upside potential.

Google

It’s not always Google’s fault.

Early in December Google announced that they were launching a core update of their algorithm which in Googlespeak is just a refresh of their biggest product. In my opinion, algorithm updates are a good thing and benefit all Google users. No one would want to have an old operating system on their mobile device, so similarly we should want our Google searches to be running off the latest and greatest technology.

Every time there is an algorithm update, there are complaints on Twitter and inevitable blog posts about the winners and losers. Obviously, the losers are quite disappointed while the winners remain silent out of fear of igniting the Internet’s wrath against them. In all of this chatter there is a single protagonist: Google. I believe this attention is entirely misplaced and leads to bad decision making.

Across the sites where I have access to Search Console, I see both increases and decreases from this most recent update, and the common theme is the match between intent and content. A site might have been ranking on a query that was not the best match for the intent of that query. Google's algorithm refresh did a better job of identifying the intent, so the formerly ranking URL might have been demoted in favor of a URL that is more of a fit.

Likewise, the sites that I saw benefit from the update didn’t suddenly magically have better content. Rather, other websites that might have been “hogging” the rankings without the right match have now been demoted.

The greatest impact on web traffic, in my opinion, is shifting demand from real users. Google doesn't determine that demand; all they do is direct that demand for content to the right places. Google is a convenient bogeyman, but instead of focusing on Google algorithm updates, I think sites should be hyper-focused on their users and their needs. Sometimes sites will benefit from mismatched intent, but that benefit should be considered temporary and will likely disappear eventually.

The best way to measure the risk of mismatched content to user queries is to look at conversion rates on pages. If there are pages that have a lower-than-average conversion rate when compared to the rest of a website, this might be a sign that the content isn’t the greatest fit for what a user might be looking for on that query. This isn’t a hard and fast rule, of course, but it is an avenue that should be looked at.

Another way to assess content is to look at the other sites that are ranking on the same queries. Do the other pages and results have similar themes or is your site an outlier? In the same vein, you can also look at Google knowledge boxes on your popular queries. If the knowledge box is aimed at a different topic or intent than your page, then this could be something Google might eventually identify as a mismatch to intent that should be demoted.

Regardless of where you stand today in search results, remember that rankings are just a vanity KPI; it is your business metrics that should dictate your strategies. Stay focused on the customer/user, and search engines will hopefully recognize your natural fit for the user.

Note: this is an oversimplified summary of an algo update and is not meant to be a granular look at how any site in particular benefited or lost in the latest update.
