Eli Schwartz

SEO

Google Lens and Assistant show the power of Google Search AI

There are two tools from Google – underrated, at least in SEO circles – that I believe really highlight the future of Google Search. In addition, these tools give a peek under the hood at the extent of the AI that is always at work in every search on Google.

Google Assistant

The first tool I want to highlight is the Google Assistant, which powers Android phones and Google smart devices. For those unfamiliar with the Google smart device ecosystem, the Assistant is Google’s Siri or Alexa. (Sidenote: maybe they should give it a cutesy name?) While it is a key component of Made by Google devices and operating systems, it is also available to anyone who wants to conduct a voice search in Chrome.

The Assistant is so much more than just a tool to avoid typing a query into Google. On many queries you can watch in real time as it listens to what you are saying and then corrects itself based on the full string. For example, it might hear you say “Kobe,” but when that is paired with a word it thinks it heard as “systems,” it will assume that you meant “Covid symptoms” and modify the query.

This is so much more advanced than spell check: if you added the word “lighting” after “Kobe,” it will determine, based on your history, whether you meant “lighting” (a Nike shoe) or “19” (which again is related to Covid). The query correction alone shows how powerful Google’s algorithms are, both at detecting query intent and, of course, at matching content to that intent. In my opinion, it shows how futile it might be to use obsolete SEO tactics like keyword stuffing.

However, the Assistant is more than just a language processing tool; it can also do things that require more intelligence. It can identify songs from tunes that you hum, which shows it doesn’t even need actual words to conduct a search. It can also take a string, understand the query intent, and turn it into a totally new query. A query like “I need gas” will be translated to “gas stations near me,” and “hungry for lunch” becomes a combo query of places to eat nearby that also have “lunch” in their reviews.

These are fairly simple examples because they just add keywords to the query, but there are many that transform into an entirely new query. “Do I need a jacket today” will trigger a weather forecast, and “is there traffic on the road to home” will bring up a map of directions to your home.

Google Lens

The second tool that shows the power of Google’s AI is Google Lens. This feature is part of the camera on Android phones, but it can also be downloaded as a standalone app for non-Android devices. The app allows users to point their camera at anything, and Google can either search for that object or use optical character recognition to search Google for any words it sees.

Again, this stretches well beyond the way most people approach search. Whereas many years ago it was deemed important to add image alt text to declare to a search engine what an image might be, more than likely Google now knows exactly what an image is regardless. Additionally, Google may even be reading text within images and adding it to its database entries for a page to determine relevancy based on that content.

This is likely not widespread just yet given the high cost of recognizing every image and reading all text, but the capabilities are certainly built within the search algorithms. Once this is a common function of search algorithms we can expect podcasts to be indexed based on the words within the audio (similar to passage indexing) and video – not just YouTube – to be indexed based on the video content.

In short, the algorithms that power these tools are the exact same algorithms that operate on every query conducted via a search box on Google. The AI at work in the Assistant and Lens is so much deeper than the simplistic syntax- and text-matching database search engine that Google started with two decades ago. Therefore, the strategies and tactics to succeed in this SEO paradigm have to be just as complex.

Don’t fall into the trap of believing that visibility on Google is simply the result of a formulaic approach. Search rankings have always been dynamic, and the inputs are more diverse than they have ever been. Google may only use a select number of ranking variables in their algorithms, but those variables are run against an ever-expanding set of new inputs.

Opinion

User Generated Content is not an SEO silver bullet

Content creation for search is expensive and complex, so therefore marketers are always looking for shortcuts that will unlock scalability while reducing expense. Throughout my career in search the one idea that everyone always seems to gravitate to as the “secret” formula to create content without actually creating content is UGC (user generated content).

Having your users create the content seems like an amazing win-win. Your users will be engaged, and they will be helping you create content – for free. There’s only one problem with this logic, and it happens to be a fairly big one: it hardly ever works. Users rarely feel motivated to create content, and when they do, it’s low quality and lacking any substance. Or worse, you get flooded with UGC, but it’s all bot-written spam that you need to devote resources to moderating.

But there’s an even bigger problem with using UGC for search: in my experience, it never seems to drive traffic. In every attempt I have ever participated in to use UGC as a content hook for search crawlers, even with substantial amounts of content, only dribbles of impressions and clicks came from it. Additionally, in my research of other sites that have loads of UGC, this content does not seem to be what draws in the majority of their search visibility.

My assumption is that Google is able to parse out what is UGC vs non-UGC and can weight the authority of the strings accordingly. One person’s unqualified opinion in a comment on a page does not earn the level of authority that will outrank another page with similar content but written by an author on the site.

I have only seen two notable exceptions to this rule: Tripadvisor and Amazon. Tripadvisor benefits from the extreme long tail that is hardly ever tackled by other sites. Think queries like “boutique hotels with heated pools” or “hotel beds that could fit for 5 children.” Tripadvisor helps this indexation along by adding in tag categories which will highlight reviews that contain the word “pool” or “platinum”.

Likewise, Amazon is able to generate visibility on those long-tail queries because they are the most authoritative site that might have content around “battery life,” for example, for an electronics product. It’s not that they have authority as Amazon; it’s simply that they are the only ones with the content. Up until recently, I had thought this was completely organic, but then I noticed in my app after a purchase that Amazon was encouraging me to write a review with prompts on keywords to use.

I don’t think this is at all sneaky; it is actually something that other sites with a desire to create UGC should learn from. This is even more valuable when the UGC is useful for potential customers considering a business, such as on Yelp or Google My Business. Encouraging the review writer to include specific wording beyond generic platitudes will go a long way toward helping people make decisions from the content.

In summary, UGC isn’t to be considered any sort of silver bullet for SEO success that will unlock hidden growth without needing a full content library; however, if you are going to attempt to use UGC for SEO, here are some good best practices to follow.

  1. Encourage users to use key phrases that will be useful for your content purposes. If you want them to review specific aspects of your products or services, prompt them with how you want them to review it.
  2. You can take it even further by asking them to answer questions like “how long did the battery last?”  Or “did our technician arrive on time?”
  3. Ask for UGC in an offline format such as using an email or survey and then publish the content manually. This obviously is not as scalable as just using straight UGC, but it does eliminate many of the downsides of low quality or spam content.
  4. Incorporate the UGC into the body of the content and do not just leave it at the bottom of the authored content. From an algorithmic standpoint, this might make it more challenging for a search engine to determine which content is UGC should they even want to demote it.
  5. Finally, in everything you do, make sure your basics are covered: A) have a blacklist for spam and vulgar words; B) remove any and all links embedded within the content, as there will rarely be a reason for UGC to link elsewhere; C) moderate your content before it publishes. (A minimal sketch of these basics follows this list.)
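As a minimal sketch of those basics – the blacklist, the link stripping, and a moderation queue – here is what the pre-publication step could look like in Python. The word list, patterns, and statuses are illustrative placeholders, not a production moderation system.

```python
import re

# Illustrative blacklist -- in practice this would be far larger and maintained over time
BLACKLISTED_WORDS = {"viagra", "casino", "xxx"}

# Matches embedded links so they can be stripped out of submissions
LINK_PATTERN = re.compile(r"https?://\S+|www\.\S+", re.IGNORECASE)


def prepare_ugc_for_review(text: str) -> dict:
    """Apply the basic UGC hygiene steps before a human moderator sees the submission."""
    # B) Remove any embedded links -- UGC rarely has a legitimate reason to link elsewhere
    cleaned = LINK_PATTERN.sub("", text).strip()

    # A) Flag submissions containing blacklisted words so a moderator can reject them quickly
    flagged = any(word in cleaned.lower() for word in BLACKLISTED_WORDS)

    # C) Nothing publishes automatically; everything lands in a moderation queue
    return {"content": cleaned, "flagged": flagged, "status": "pending_review"}


if __name__ == "__main__":
    sample = "Great battery life! Buy cheap at www.spam-example.com"
    print(prepare_ugc_for_review(sample))
```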

If UGC can be helpful to your KPIs, you should absolutely use it as a strategy – just know its true value and upside potential.

Google

It’s not always Google’s fault.

Early in December Google announced that they were launching a core update of their algorithm which in Googlespeak is just a refresh of their biggest product. In my opinion, algorithm updates are a good thing and benefit all Google users. No one would want to have an old operating system on their mobile device, so similarly we should want our Google searches to be running off the latest and greatest technology.

Every time there is an algorithm update, there are complaints on Twitter and inevitable blog posts about the winners and losers. Obviously, the losers are quite disappointed while the winners remain silent out of fear of igniting the Internet’s wrath against them. In all of this chatter there is a single protagonist: Google. I believe this attention is entirely misplaced and leads to bad decision making.

From the sites where I have access to their search console, I see both increases and decreases from this most recent update, and the common theme is the intent–content match. A site might have been ranking on a query even though its content was not the best match for the intent of that query. Google’s algorithm refresh did a better job of identifying the intent, so the formerly ranking URL might have been demoted in favor of a URL that is more of a fit.

Likewise, the sites that I saw benefit from the update didn’t suddenly magically have better content. Rather, other websites that might have been “hogging” the rankings without the right match have now been demoted.

The greatest impact on web traffic, in my opinion, is shifting demand by real users. Google doesn’t determine that demand; all it does is direct that demand for content to the right places. Google is a convenient bogeyman, but instead of focusing on Google algorithm updates, sites should be hyper-focused on their users and their needs. Sometimes sites will benefit from mismatched intent, but that benefit should be considered temporary and will likely disappear eventually.

The best way to measure the risk of mismatched content to user queries is to look at conversion rates on pages. If there are pages that have a lower-than-average conversion rate when compared to the rest of a website, this might be a sign that the content isn’t the greatest fit for what a user might be looking for on that query. This isn’t a hard and fast rule, of course, but it is an avenue that should be looked at.
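As a rough sketch of that check – the page data below is made up purely for illustration – you could compare each page’s conversion rate against the site-wide average and flag the outliers for an intent review:

```python
# Hypothetical page-level data: sessions and conversions per URL (placeholder numbers)
pages = {
    "/blue-widgets": {"sessions": 5000, "conversions": 150},
    "/widget-guide": {"sessions": 8000, "conversions": 40},
    "/checkout-faq": {"sessions": 1200, "conversions": 90},
}

# Site-wide average conversion rate across all pages
site_rate = sum(p["conversions"] for p in pages.values()) / sum(p["sessions"] for p in pages.values())

for url, data in pages.items():
    rate = data["conversions"] / data["sessions"]
    # Arbitrary threshold: flag pages converting at less than half the site average
    if rate < 0.5 * site_rate:
        print(f"{url}: {rate:.2%} vs site average {site_rate:.2%} -- review query intent match")
```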

Another way to assess content is to look at the other sites that are ranking on the same queries. Do the other pages and results have similar themes or is your site an outlier? In the same vein, you can also look at Google knowledge boxes on your popular queries. If the knowledge box is aimed at a different topic or intent than your page, then this could be something Google might eventually identify as a mismatch to intent that should be demoted.

Regardless of where you stand today in search results, remember that rankings are just a vanity KPI and it is your business metrics that should dictate your strategies. Stay focused on the customer/user and search engines will hopefully recognize your natural fit for the user.

Note: this is an oversimplified summary of an algo update and is not meant to be a granular look at how any site in particular benefited or lost in the latest update.

SEO

Technical SEO does not create value – it enhances it

I shared this sentiment on Twitter and received a flurry of responses as to how wrong I am. To be clear, I think there is tremendous value in having a well-optimized site from a technical standpoint; however, I think technical SEO can only help a site improve on the visibility it should already have. A perfect technical SEO score will not drive any additional traffic if there isn’t already a great website that provides value to users. Technical SEO enhances and reveals the value that had been blocked by not following SEO best practices.

Sometimes, there is an expectation that technical SEO fixes will do more for the growth of a site’s traffic than they actually can. In this paradigm, a company may specifically look to make a technical SEO hire when most smaller companies would find that a creative, content-focused SEO manager is a far better fit for their needs.

In this vein, organizations may spend vast sums to improve technical aspects of a site to bring them up to SEO best practices, but in my opinion, it is important to determine if there will be ROI from these efforts before making an investment. Remapping every redirect, removing error pages and even improving site speed may not actually add enough additional users to justify the effort that might have to be invested to make these changes.

The purpose of technical SEO is really to unlock opportunity and value that has been hidden away by inefficient technical SEO infrastructure; however, if there is no value to unlock, SEO improvements will not add user acquisition.

The best way of explaining this is with a home-purchase analogy. Technical SEO is the structure of the home that lives underneath the sheetrock. When a prospective buyer decides how much they are willing to pay for the home, the technical infrastructure may weigh into their calculation, but only by a very small amount. Yes, it is nice to know that if an earthquake or fire hits the home it can withstand the onslaught, but what the buyers are really paying for is the location, the design and the square footage.

Translating this back into SEO, users of the website only care about satisfying whatever their immediate need might be. It is ideal that when they click on links they don’t get redirected many times, but they will never notice whether those redirects are 301s or 302s. The same goes for internal linking. It’s a nice-to-have from a user perspective that they can find additional pages/products within the site, but even if they can’t, they can always use the search box.

Technical SEO will of course add users to a website, but those users should be weighed against other investments that might bring in even more users. The vast sums that might be required to improve a website’s speed might be better used to hire engineers who build additional features that generate more search or other traffic. As an aside, I believe that site speed does not necessarily improve a website’s visibility in search; rather, it is a tool to increase on-page conversions. Therefore, in making an investment case for site speed, I would only use that conversion improvement as part of the ROI calculation and not make any assumptions about additional traffic.

Similarly, there are a number of other areas of technical SEO which might serve a dual purpose of being both technical and content/product improvements. While some might consider the addition of schema or FAQs to be technical, I think those are value adds to a page that are really content efforts implemented technically. (Kind of like posting a blog post via a shell command: yes, you need to be somewhat technical to do that, but the lift comes from the content – not the posting.)
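For illustration only, this is what FAQ markup in the standard schema.org FAQPage format might look like in JSON-LD; the question and answer text are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "How long does the battery last?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "In everyday use, the battery lasts roughly a full day on a single charge."
    }
  }]
}
</script>
```

The script tag is the technical wrapper; the question and answer text are the content doing the actual work, which is the point of the paragraph above.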

I am sure that some of you reading this will disagree with my sentiments around technical SEO, and I would love to hear from you on how I might be approaching this the wrong way. Please contact me!

SEO

5 non-boring ways to use a Robots.txt

Don’t just think of a robots.txt as a file that you use to block search engines from crawling pages on your site; there are creative uses for this file if you allow yourself to think outside the box.

One of the very first technical SEO fires I ever had to extinguish was one started by a robots.txt file. The startup where I was working launched a brand-new redesigned website, and we were excited to see the results. However, instead of traffic going up, it inexplicably went down. For anyone who has ever had the misfortune of screwing up a robots file (spoiler alert: that’s what happened), you will know that a robots noindex directive is not an instant kiss of search death – it’s more like a slow, drawn-out slide, especially on a large site.

Since I had never seen anything like this before, I was frantically checking everything else that might cause this slow dive. While I turned over every rock to find the issue, I chalked it up to bad engineering, an algo update, or a penalty. Finally, I found a clue. A URL where I thought a ranking had slipped was actually no longer indexed. Then I found another URL like that and another one.

Only then did I check the robots.txt and discover that it was set to noindex the entire site. When the brand-new site had been pushed live, everything had moved from the staging site to production, including the robots file, which had been set to noindex the staging site. The next time Googlebot revisited the homepage, it fetched the robots.txt and saw the noindex directive. From the log data, Googlebot continued to fetch pages, but the rate started declining quickly. (For thoughts on why that is, I will be doing another blog post on this topic.)
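As a purely hypothetical illustration (not the actual file from that launch), a staging robots.txt along these lines, copied straight to production, would produce the effect described above. Note that Noindex: in robots.txt was only ever an unofficial directive, and Google has since stopped honoring it:

```
# Hypothetical staging robots.txt -- should never ship to production
User-agent: *
# Unofficial directive that Google once honored and has since deprecated
Noindex: /
```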

We fixed the robots file as fast as we could and then waited for Google to love us again. If you are ever in this situation, this is your warning that this is not an instant fix. Unfortunately, it took longer to recover our positions than it did to lose them, and I have seen this be the case every time I have worked on an issue like this over the last decade. It took about five days to lose all of our rankings and weeks to recover them.

Ever since I dealt with this issue, I have had a very healthy respect for robots files. I carefully consider alternatives to using them to address issues that can be fixed in other ways, and only add folders to them if I really never want them to be indexed.

Robots.txt can have other uses

This healthy fear aside, I have also found some great uses for robots files that are completely harmless to your search visibility and could even cause some confusion for your competitors. Note that if you do any of these, you should add a comment marker (#) in front of anything that isn’t a directive so you don’t inadvertently break your robots file. These are just examples of what you can do with a robots file once you start being creative; a sample file follows the list below.

  1. Recruiting: Use robots files to advertise open SEO roles. The only humans that are ever going to look at a robots file are either search engine engineers or search marketers. This would be a great place to grab their attention and encourage them to apply. For an example of this, check out TripAdvisor’s robots file here.
  2. Branding: Showcase something fun about your brand like Airbnb does. They write out their brand name in ASCII art, which reflects their design sense, and make a casual reference to careers at the company.
  3. Mystery: Anything that is in a robots file as disallowed that appears to be a traditional file (and not a reference to scripts) will inevitably be something that people will want to check out if they find it. If you produce a technical tool, this might be the place where you can offer a discount to only the people that find the link in the robots file with a URL like “secret-discount”. If someone goes to the lengths to explore your robots file, they are most definitely going to navigate to a page that references a secret discount!
  4. Subterfuge: Any good marketer should always check out their competitors’ robots.txt files to see if there is anything the competition is doing that you should keep track of. Assume that your competitors are going to check out your robots file and do the same to you. This is where you can be sneaky by dropping links and folders that don’t exist. Would you love to do a rebrand but can’t afford it? Your competitors don’t need to know that. Put a disallow on a folder called /rebrand-assets. For added fun you can even put a date on it, so the competition might mark their calendars for an event that never happens. When you start being creative along these lines, the ideas are truly endless. You can make references to events you won’t be having, job descriptions you aren’t releasing or even products with really cool names that will never exist. Just make sure you block anyone from seeing a real page with either password protection or a 404, so this remains just a reference to an idea. Also, don’t take this overboard into anything immoral or illegal!
  5. Taxonomy: A robots file really exists to disallow folders and files, not to allow them – unless your default is disallow (like Facebook’s robots file) and you just want to allow a few pages. An exercise where you sit down with content or engineering to decide which folders should be allowed might be a good way to find out if there are folders that should not exist at all. Truthfully, the value here is mostly in the exercise, but you can carry it forward and actually lay out all the allowed folders in the robots file as a way of detailing the taxonomy.
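To make the list above concrete, here is a hedged sample of what a robots.txt using a few of these ideas might look like. Every path, message, and URL below is invented for illustration:

```
# Hello fellow SEO! If you are reading this, we are hiring -- see /careers (illustrative)
User-agent: *
# Mystery: a reward only robots.txt readers will ever find
Disallow: /secret-discount
# Subterfuge: a folder that does not actually exist
Disallow: /rebrand-assets-june/

Sitemap: https://www.example.com/sitemap.xml
```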

As you can see, a robots file is not just a boring requirement – and it is a requirement; every site should have one. There is a lot you can do with it if you think of it as another asset or channel on your website. If you have other ideas on how to make robots files useful, I would love to hear them – please email me!

SEO

Nofollow vs Follow Links – Who cares?

In 2005, Google – along with the other search engines that still mattered at the time – launched a new attribute to describe a link, termed “nofollow.” The nofollow attribute was a way for Google to disincentivize a spam problem that was spiraling out of control: spam placed specifically to acquire links. With the nofollow attribute, websites could in theory negate the SEO value of any outbound link simply by declaring it to be nofollowed.
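For reference, declaring a link nofollowed is just a matter of adding the rel attribute to the anchor tag; the URL below is a placeholder:

```html
<!-- A normal link, which can pass SEO value to the destination -->
<a href="https://example.com">example</a>

<!-- The same link declared nofollow, telling search engines not to credit it -->
<a href="https://example.com" rel="nofollow">example</a>
```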

SEO

Stop using marketing jargon

Jargon kills internal influence and unnecessarily impacts the effectiveness of a marketing team. Jargon, abbreviations, and bullets without explanation have their utility, but they should not be used in a public setting that includes a non-marketing audience.

SEO

Marketers should use strong project code names – even for core parts of the job


In the military, operations are typically given a code name to align everyone to the mission and keep the actual objective secret. As the practice took hold in World Wars I and II, missions began to take on names that connoted strength. Winston Churchill had a role in personally picking the names of missions and even set out guidelines that encouraged planners to use names of heroes of antiquity, figures from Greek and Roman mythology, and names of war heroes.
