SEO

Technical SEO does not create value – it enhances it

I shared this sentiment on Twitter and received a flurry of responses telling me how wrong I am. To be clear, I think there is tremendous value in having a well-optimized site from a technical standpoint; however, technical SEO can only help a site achieve the visibility it should already have. A perfect technical SEO score will not drive any additional traffic if there isn't already a great website that provides value to users. Technical SEO enhances and reveals the value that had been blocked by not following SEO best practices.

Sometimes there is an expectation that technical SEO fixes will do more for the growth of a site's traffic than they actually can. In this paradigm, a company may specifically look to make a technical SEO hire when most smaller companies would find that a creative, content-focused SEO manager is a far better fit for their needs.

In this vein, organizations may spend vast sums bringing the technical aspects of a site up to SEO best practices, but in my opinion it is important to determine whether there will be ROI from these efforts before making the investment. Remapping every redirect, removing error pages, and even improving site speed may not add enough additional users to justify the effort required to make these changes.

The purpose of technical SEO is really to unlock opportunity and value that has been hidden away by inefficient technical infrastructure; if there is no value to unlock, technical improvements will not add user acquisition.

The best way to explain this is with a home-purchase analogy. Technical SEO is the structure of the home that lives underneath the sheetrock. When prospective buyers decide how much they are willing to pay for the home, the underlying structure may weigh into their calculation, but only a very small amount. Yes, it is nice to know that the home can withstand an earthquake or fire, but what buyers are really paying for is the location, the design, and the square footage.

Translating this back into SEO, users of a website only care about satisfying whatever their immediate need might be. It is ideal that clicking a link doesn't bounce them through many redirects, but they will never notice whether those redirects are 301s or 302s. The same goes for internal linking. It's a nice-to-have from a user perspective to be able to find additional pages and products within the site, but even if they can't, they can always use the search box.

Technical SEO will of course add users to a website, but those users should be weighed against other investments that might bring in even more users. The vast sums that might be required to improve a website's speed might be better used to hire engineers who build additional features that generate more search or other traffic. As an aside, I believe that site speed does not necessarily improve a website's visibility in search; rather, it is a tool to increase on-page conversions. Therefore, in making an investment case for site speed, I would only use that conversion improvement in the ROI calculation and not make any assumptions about additional traffic.

Similarly, there are a number of other areas of technical SEO that serve a dual purpose of being both technical and content/product improvements. While some might consider the addition of schema or FAQs to be technical, I think those are value adds to a page: content efforts that happen to be implemented technically. (Kind of like posting a blog post via a shell command. Yes, you need to be somewhat technical to do that, but the lift comes from the content, not the posting.)
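To illustrate, here is a minimal sketch of how FAQ schema is commonly implemented, using schema.org's FAQPage markup in JSON-LD (the question and answer text below are placeholders). The wrapper itself is a thin technical layer; the value comes entirely from the quality of the questions and answers, which is a content effort.

```html
<!-- Hypothetical FAQ schema: the JSON-LD wrapper is trivial; the Q&A content does the work -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "Does technical SEO create traffic on its own?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "It unlocks visibility a site should already have; the content still has to provide the value."
      }
    }
  ]
}
</script>
```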

I am sure that some of you reading this will disagree with my sentiments around technical SEO, and I would love to hear from you on how I might be approaching this the wrong way. Please contact me!

SEO

5 non-boring ways to use a Robots.txt

Don’t just think of robots.txt as a file you use to block search engines from crawling pages on your site; there are creative uses for this file if you allow yourself to think outside the box.

One of the very first technical SEO fires I ever had to extinguish was started by a robots.txt file. The startup where I was working launched a brand-new redesigned website, and we were excited to see the results. However, instead of going up, traffic inexplicably went down. Anyone who has ever had the misfortune of screwing up a robots file (spoiler alert: that’s what happened) will know that a robots noindex directive is not an instant kiss of search death – it’s more like a slow, drawn-out slide, especially on a large site.

Since I had never seen anything like this before, I frantically checked everything else that might cause such a slow dive. While I turned over every rock to find the issue, I chalked it up to bad engineering, an algorithm update, or a penalty. Finally, I found a clue: a URL where I thought a ranking had slipped was actually no longer indexed. Then I found another URL like that, and another.

Only then did I check the robots.txt and discover that it was set to noindex the entire site. When the brand-new site was pushed live, everything had moved from the staging site to production, including the robots file, which had been set to noindex that staging site. The next time Googlebot revisited the homepage, it fetched the robots.txt and saw the noindex directive. From the log data, Googlebot continued to fetch pages, but the rate declined quickly. (I will cover my thoughts on why this happens in another blog post.)
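For illustration only, a staging robots.txt like the one described might have looked something like the sketch below (the directive shown is hypothetical; Google has since dropped support for noindex rules in robots.txt entirely, but shipping any site-wide blocking rule from staging causes the same kind of slow-motion damage):

```
# Hypothetical staging robots.txt that should never have reached production
# (noindex in robots.txt was only ever unofficially honored by Google,
#  and support for it was dropped in 2019)
User-agent: *
Noindex: /
```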

We fixed the robots file as fast as we could and then waited for Google to love us again. If you are ever in this situation, consider this your warning that it is not an instant fix. Unfortunately, it took longer to recover our positions than it did to lose them, and I have seen this be the case every time I have worked on an issue like this over the last decade. It took about five days to lose all of our rankings and weeks to recover them.

Ever since I dealt with this issue, I have had a very healthy respect for robots files. I carefully consider alternatives to using them to address issues that can be fixed in other ways, and only add folders to them if I really never want them to be indexed.

Robots.txt can have other uses

This healthy fear aside, I have also found some great uses for robots files that are completely harmless to your search visibility and could even cause some confusion for your competitors. Note that if you do any of these, you should put a comment marker (#) in front of any non-directive text so you don’t inadvertently break your robots file. These are just examples of what you can do with a robots file once you start being creative; a sketch pulling a few of them together follows the list.

  1. Recruiting: Use robots files to advertise open SEO roles. The only humans who are ever going to look at a robots file are either search engine engineers or search marketers. This is a great place to grab their attention and encourage them to apply. For an example, check out TripAdvisor’s robots file.
  2. Branding: Showcase something fun about your brand like Airbnb does. They write out their brand name in ASCII art, which reflects their design sense, and make a casual reference to careers at the company.
  3. Mystery: Anything disallowed in a robots file that appears to be a normal page (and not a reference to scripts) will inevitably be something people want to check out if they find it. If you produce a technical tool, this might be the place to offer a discount to only the people who find the link in the robots file, with a URL like “secret-discount”. If someone goes to the lengths of exploring your robots file, they are most definitely going to navigate to a page that references a secret discount!
  4. Subterfuge: Any good marketer should check competitors’ robots.txt files to see if there is anything the competition is doing that you should keep track of. Assume that your competitors will check your robots file and do the same to you. This is where you can be sneaky by dropping links and folders that don’t exist. Would you love to do a rebrand but can’t afford it? Your competitors don’t need to know that. Put in a disallow for a folder called /rebrand-assets. For added fun you can even put a date on it, so the competition might mark their calendars for an event that never happens. Once you start being creative along these lines, the ideas are truly endless. You can make references to events you won’t be having, job descriptions you aren’t releasing, or even products with really cool names that will never exist. Just make sure you block anyone from seeing a real page with either password protection or a 404, so this remains just a reference to an idea. Also, don’t take this overboard into anything immoral or illegal!
  5. Taxonomy: A robots file really exists to disallow folders and files, not to allow them, unless your default is to disallow everything (like Facebook’s robots file) and you just want to allow a few pages. An exercise where you sit down with content or engineering to list the folders that should be allowed can be a good way to find out whether there are folders that should not exist at all. Truthfully, the value is mostly in the exercise itself, but you can carry it forward and actually lay out all the allowed folders in the robots file as a way of documenting the taxonomy.
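For a sense of how a few of these ideas might look in practice, here is a short, entirely hypothetical robots.txt sketch (every path, message, and the sitemap URL are made up; comment lines start with #):

```
# We read our robots.txt -- do you? We're hiring SEOs: https://www.example.com/careers
User-agent: *
# A teaser for anyone curious enough to look in here
Disallow: /secret-discount/
# A rebrand that exists only in this file
Disallow: /rebrand-assets-2025/
Sitemap: https://www.example.com/sitemap.xml
```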

As you can see, a robots file is not just a boring requirement (and it is a requirement; every site should have one). There is a lot you can do with it if you think of it as another asset or channel on your website. If you have other ideas on how to make robots files useful, I would love to hear them – please email me!

SEO

Nofollow vs Follow Links – Who cares?

In 2005, Google, along with the other search engines that still mattered at the time, launched a new attribute to describe a link: the “nofollow”. The nofollow attribute was a way for Google to disincentivize link spam, specifically spam aimed at acquiring links, which was spiraling out of control. With the nofollow attribute, websites could in theory negate the SEO value of any outbound link simply by declaring it nofollowed.
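In practice, the attribute is just a rel value on an anchor tag; a minimal illustration (the URL and anchor text are placeholders):

```html
<!-- An ordinary link, which search engines may treat as an endorsement -->
<a href="https://example.com/some-page">Some page</a>

<!-- The same link marked nofollow, signaling that no endorsement is intended -->
<a href="https://example.com/some-page" rel="nofollow">Some page</a>
```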

SEO

Stop using marketing jargon

Jargon kills internal influence and unnecessarily undermines the effectiveness of a marketing team. Jargon, abbreviations, and bullets without explanation have their utility, but they should not be used in a public setting that includes a non-marketing audience.

SEO

Marketers should use strong project code names – even for core parts of the job


In the military, operations are typically given a code name to align everyone to the mission and keep the actual objective secret. As the practice took hold in World War I and II, the naming convention of missions began to take on names that connoted strength. Winston Churchill had a role in personally picking the names of missions and even set out guidelines that encouraged planners to use names of heroes of antiquity, figures from Greek and Roman mythology, and names of war heroes.

SEO

5 Lessons from Google’s Super Bowl Commercial

In this year’s Super Bowl, as in all the past years, Google ran a commercial. In prior years, Google highlighted search, but this year they decided to put the focus on the Google Assistant. Based on immediate Twitter responses as well as recaps of the ads that ran this year, Google definitely hit the mark by provoking an emotional response.
