
TurboTax’s Robots NoIndex Brings Regulatory Attention to SEO

Update May 22, 2019:  

Additional fallout from this issue includes the City of Los Angeles suing Intuit, Senators including Bernie Sanders and Elizabeth Warren launching an investigation, and the state of New York launching a probe. All of these efforts refer to the robots directive as a deceptive search practice.

“These companies’ actions in hiding Free File [Alliance] from search engine results — and therefore from consumers — in order to artificially inflate profits and deprive low-income consumers of cheaper product merit investigation as unfair and deceptive practices,” 

As of today, there is still a robots directive on the internal pages mentioned in the earlier post.

Last week, ProPublica broke the news that both Intuit and H&R Block placed a robots noindex directive on their free tax filing landing pages. This discovery came out of a wider investigative effort that revealed how TurboTax was allegedly steering people eligible to file their taxes for free under an IRS program into a funnel where they had to pay to file.

As a result of ProPublica’s post, Intuit (the parent company of TurboTax) announced that they had updated the robots file on their free filing site and they were:


“undertaking a thorough review of our search practices to ensure we are achieving our goal of increasing eligible taxpayers’ awareness of the IRS Free File Program and its availability.”

Spoiler alert: At the time of this writing on April 30th, they had only removed the robots directive from the homepage of the free file site, which has very little content, but not from the support and about pages with the content that might actually rank.

Here’s the robots file on about us:

Interestingly, even if they were to remove that robots directive, they would still have the canonical in place which also might lead to Google not indexing the page.
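To make the two signals concrete, here is a minimal, illustrative sketch (the markup below is hypothetical, not Intuit's actual source) of how a crawler-side check might read the two things discussed above: a robots noindex meta tag and a rel=canonical link pointing at a different URL.

```python
# Illustrative sketch: detect a robots noindex meta tag and a canonical link
# in a page's HTML. The sample markup at the bottom is hypothetical.
from html.parser import HTMLParser

class IndexSignalParser(HTMLParser):
    """Collects <meta name="robots"> content and <link rel="canonical"> href."""
    def __init__(self):
        super().__init__()
        self.robots = None
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots = attrs.get("content", "")
        elif tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonical = attrs.get("href")

def index_signals(html):
    p = IndexSignalParser()
    p.feed(html)
    noindex = p.robots is not None and "noindex" in p.robots.lower()
    return {"noindex": noindex, "canonical": p.canonical}

# Hypothetical page resembling the situation described: noindex plus a
# canonical pointing away from the page itself.
page = """
<html><head>
<meta name="robots" content="noindex,nofollow">
<link rel="canonical" href="https://turbotax.intuit.com/">
</head><body>About us</body></html>
"""
print(index_signals(page))
# -> {'noindex': True, 'canonical': 'https://turbotax.intuit.com/'}
```

Either signal on its own can keep a page out of Google's index, which is why removing only the noindex directive might not be enough here.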

As a result of these robots directives, here’s what this free filing subdomain looks like in search.

As you can see, the homepage is now indexed thanks to the quick response by Intuit, but those other results are buried in the supplemental results and don’t have a snippet. (Supplemental means that you would need to request to see “omitted results”.)

Tangent aside, this incident has kicked off what could be a fascinating debate in Congress, the FTC, and at the IRS on what exactly a robots directive does and why it may have been used. I am willing to give the SEO team at Intuit the benefit of the doubt that this was an oversight that stemmed from sculpting crawl budget rather than a nefarious attempt to squeeze taxpayers.
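The crawl-budget explanation hinges on a distinction regulators may miss: a robots.txt Disallow rule stops a page from being *crawled* at all, while a noindex directive lets it be crawled but stops it from being *indexed*. A short sketch using Python's standard robots.txt parser shows the crawling side (the paths and domain below are hypothetical, not Intuit's actual rules):

```python
# A minimal sketch of how a robots.txt Disallow rule blocks crawling of one
# path while leaving others open. Paths and domain are hypothetical.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /internal/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/internal/report"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/freefile/"))        # True
```

A team trimming crawl budget would normally reach for rules like these, or for noindex on genuinely low-value pages; applying noindex to the pages meant to attract searchers is where the oversight theory gets strained.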

Regardless of the intent, regulatory bodies are now going to turn a spotlight on SEO. Representative Katie Porter has called for both the IRS and the FTC to investigate.

Her letter has some remarkable paragraphs, as she specifically references the issues around the robots file. The letter seems to imply that without being indexed by search engines, users will be inconvenienced. This steps into a whole new argument over whether being indexed by Google is a right or just a privilege. If sites are required to be indexed by Google, shouldn’t Google be required to rank them? And what if they break Google’s TOS?!

She concludes the letter by stating that this “misdirection” violates the law as unfair competition.

Without weighing in on whether Intuit did this deliberately and whether it was right or wrong, the fact that choosing which pages to index on a site could remotely be considered illegal is a very scary prospect for anyone conducting any SEO efforts.

The reality is that Google is not required to rank or even index any site, and choosing what and how to expose pages to Google is a vital tool in an SEO toolkit.

Unless this somehow goes away, expect there to be some sort of congressional discussion on how Google interprets a robots directive and whether sites should be required to make pages available to Google.