SEO can’t be replaced by software


There are several tools that I love using daily, and without them I would not be able to work on any SEO projects. Yet, I think tools are just that – tools to help complete a project; they are not the solution in and of themselves. I believe there are many areas where tools will supplant humans and do a job from beginning to end without human intervention, but anything in the realm of marketing, in my opinion, must have human input.

Analogy to construction

It’s best to explain my opinion with an analogy to home construction. While any construction worker would likely disagree that their job could be completely automated, it is not far-fetched to see how robots could take over once each task is broken down. There are devices which hammer in nails automatically, hold walls straight, and even follow a schematic to build a frame.

For now, there is no single robot hammer that can crawl across a house frame and know exactly where to hammer in a nail, but years ago there wasn’t a robot that could vacuum a house either. It is entirely possible that one day an inventor will realize that it is cheaper and safer to use robots for monotonous, labor-intensive tasks that can be guided by a repetitive algorithm.

However, there is one area of construction that will never be replaced by a robot, and that is design and architecture. A robot will never be able to understand the human emotions and personal choices that go into deciding how a home should look, where the front door should be and how big the walk-in closet can be. A robot can certainly build a blueprint to spec, but it can’t translate desire into a plan.

Human marketing

I believe this same concept applies to SEO and marketing. One day there may be tools that construct the perfect website based on findings on what works in search, generate the best keyword ideas and maybe even write the content, but the human element will always be missing. At best, software can mimic what others seem to be doing well, but it would be impossible for it to have a creative idea on how to get ahead. Even further, while software can write content based on keywords that may even perform well in search, it will lack the human emotion that is necessary to resonate with the humans who need to engage with that content.

Automatic SEO

Every once in a while there’s an article about a tool that does SEO “automatically” getting a round of funding. There’s usually some breathless proclamation about how it will disrupt the entire industry, but these articles always neglect to mention the AI factor that already exists. Google is already using AI to understand and rank content; the way to “beat” Google’s AI is not to have a duel with another AI tool but to put a human in the mix.

If all of SEO could be distilled down to doing keyword research and structuring web pages, then SEO could be disrupted by software. However, successful SEO is so much more. An SEO effort includes knowing how to architect a website, the types of content to create, the personas of the potential users, learning from performance to optimize for growth and, most of all, building a product that resonates with real users. Until we live in a world where robots do all our shopping, none of this could ever be disrupted by software.

In my opinion, all the sites that try to use software alone to manage their SEO leave a gaping hole for a human-driven SEO campaign to beat them in search visibility.

Think about all of the most successful sites on the web, and then imagine if it were possible to replicate their success. Could a machine have built Wikipedia? Would automated reviews have helped Yelp, TripAdvisor and Amazon win their categories? Would Google News be a dominant source of news if all it did was index machine-written content?

I don’t think humans would have considered these sites key sources of information if machines had built the strategy, websites and content. Anyone looking to replicate their success would be better served by finding the smartest humans rather than looking for the next automated shiny object.


SEO should be viewed as a product

One of the reasons a company may leave its SEO potential unfulfilled is that it inadvertently boxes in the person or people responsible for SEO. The responsible party is left on their own as an individual contributor, forced to go through their manager to get anything done.

This structural flaw exists because SEO is viewed as a marketing function, with tasks structured as campaigns relying on other marketing contributions such as content and design. Technical tasks like building or launching a page happen somewhere else, in another org within the company.

Instead, if SEO were viewed as a product, engineering tasks would be a part of the product roadmap and launch process from the start. Product roles are always reliant on other teams and are inherently cross-functional.

This does not at all need to change the reporting structure of the individual, as in many cases it makes perfect sense for SEO to be on the marketing team. Rather, approaching SEO as a product function helps clarify inputs and outcomes on multiple levels.

Planning – When planning for SEO goals, it is critical that all required resources from other teams be allocated at the exact same time. It wouldn’t make a whole lot of sense to plan to launch a number of pages or a new microsite but not pre-allocate the design time and engineering plan. On the product management side, new initiatives are never approved with a hope and a prayer that everything will just work out when the time is right. All products that are prioritized will get the resources to complete the project.

Budgeting – When it comes to budgeting on a marketing plan, SEO usually falls to the bottom of the pile since the story connecting investment to output is harder to tell. This means that SEO will get the short end of the stick on hiring, software and contractors, whereas paid marketing teams might be flush with cash. Thinking of SEO as a product instead realigns the expectations: it is a product that gets investment because it is a priority. Typically, product teams aren’t resourced because they have a direct line to ROI but because their work is a business necessity.

Output and reporting – On the same note, when SEO is thought of as marketing, its KPIs need to be similar to other marketing KPIs. Paid teams have LTV goals (hopefully), brand teams have impression share, and so SEO ends up being measured on rankings. This is a terrible way of looking at SEO, as rankings are just a vanity metric. Instead, the same way any product is measured by adoption and engagement, the same lens should be applied to SEO. For SEO, this would be measured by growth in impressions on search pages. Obviously clicks are important too, but clicks are a result of on-page structure, which is not necessarily SEO itself.

Resourcing – Making the case to add more headcount for SEO can be very difficult if the metrics for success are too hard to reach or are inappropriate for the channel. Viewing SEO as a product, the primary headcount metric moves from KPI-driven to deadline-driven. The question that should be asked is what headcount is necessary to meet the goal within the desired time frame.

Not much really has to change on reporting, salary and even titles to make SEO more aligned with product; it is really just an exercise in awareness and management. If the current method for managing SEO is leaving value on the table, it may be helpful to change the structure of how SEO is conducted in the company.


In large companies winning is all about incrementality

The best way to be successful in a process-driven large company is to always keep the focus on small incremental wins. Little wins ladder up into bigger wins as the small wins begin to add up in the impact they have within a company.

Building a detailed plan for incrementality is far more effective than creating a plan that will never be executed unless there is executive buy-in and a dearth of competing initiatives across the company – which of course there never will be.

Proposing a complete website revamp is a surefire way to the backburner and the purgatory of no budget, but a refresh of a particular page is a far easier sell. The page refresh might need to be implemented piecemeal, but at least it’s not a project size that makes stakeholders recoil in fear.

Small wins

When setting these small win targets, it’s really important to make the little wins as small as possible. It could be something like just changing the title of a page, which surprisingly can be difficult at a large company. It may even be a smaller goal, like getting buy-in from a cross-functional counterpart – again, not a given in a culture focused on individual team goals.

This method of growing through incremental wins requires acknowledging that, by their very nature, large companies are just very different from smaller organizations. Within small companies with only a handful of employees, culture can be set by founders and a company can be oriented towards results. As an organization grows, process is introduced, which can add levels of complexity to getting things done. Much of the process will be vital for the future success of the company, but inevitably it will also lead to bureaucracy. For a growth-minded product manager or marketer, the bureaucracy can be negated by embracing the process.

Big vs. small comparison

As an example of the key differences between large and small companies, just compare the planning processes. In big organizations, a product plan that is blessed by all stakeholders can take a lot of time to develop; however, for a company of this size, the alternative is untenable. A startup can pivot on a dime, with founders, board members, and internal leaders reorienting teams on a whim, but in a large company behavior like this is what leads to employee attrition. Employees will not feel valued if they put a lot of work into a project and are then directed on to a brand-new initiative with no warning.

People

Things are also very different on the employee side. Employees at smaller companies can have fuzzy titles and responsibilities and goals that change at the speed of the business, while in a large company it is the complete opposite. Titles are typically specific and aren’t changed unless there is a business need. Responsibilities are narrowly defined, and while adjacent responsibilities might be added via projects, big responsibility changes only happen in a job update. Within this environment, quarterly and annual goals are set long in advance and employees are expected to march towards those goals.

The types of people that end up in a company will have their own biases towards an atmosphere with more or less process. There are even extremists at both ends who could not imagine themselves in the opposite environment: a startup employee who shudders at needing to do the same thing for an entire quarter or the large corporate employee who gets night terrors from the thought of a loosely defined job structure that could change weekly.

Summary

There is no right and wrong when it comes to processes, and each company will have its own way of doing things that is right for it.

The gap between freestyle and rigid is very broad, and many companies will fall at various places along the spectrum. Companies can even change when something big like a reorganization or layoff forces it. Whatever the culture of the company may be, there will be a way to work within that system to make things happen.

Keep in mind that while it is easier to “get things done” in a smaller company, it is incorrect to just throw up one’s hands at a large company and give up. Initiatives absolutely do get executed at bigger companies; it’s just that the pathway to making things happen is a bit more winding, and there are a lot more rest breaks along the way than at a fast-moving startup.

Realistically, even smaller companies need to walk before they run; it’s just that they are a lot more nimble on their feet.


SEO is top of the funnel and is the assist on a conversion

Within marketing teams, the most attention, both good and bad, is paid to the initiatives that cost significant sums of money. There will be frequent executive check-ins, quarterly reviews, detailed reporting and, of course, an attribution system that relies on something a lot more sophisticated than a gut belief. In fact, the entire company-wide attribution system might be tightly tuned to have a deep line of sight into all paid efforts at the expense of other channels.

In this world view, organic search channels can end up with the short end of the stick both from a resourcing standpoint and on attribution. Everyone sort of believes that SEO works and is overall beneficial to the bottom line, but there’s not as strong a drive to understand exactly how the traffic performs. Without accurate reporting, executives and SEO teams could end up falling back on useless metrics like rankings.

Even worse, a natural consequence is that when budgets are tight, the channel that “kind of” works will fall behind the channel or channels with deep visibility. This leaves SEO teams always strapped for resources and scrambling to prove their efforts are worthwhile. In a weird script twist, the paid team only has to defend their budgets, not their jobs, while the SEO team without the budget has more existential issues.

I think the root of this issue comes from a fundamental lack of understanding of where SEO fits in the marketing mix. Unlike a performance channel which is designed to go direct to conversion, SEO is a hybrid between branding and performance traffic. Judging it purely as a brand channel would overlook the tremendous impact it will have on the bottom line, but at the same time it can’t be viewed as just a performance channel.

SEO in the marketing mix?

By its very nature, SEO will typically live a lot higher in the buyer funnel, and in many cases users will not have any buying intent whatsoever. Stepping back from being marketers for a moment and thinking about our own search activities, much of it is just research and curiosity. Queries about weather, information, sports scores, stock prices and the like have no commercial intent.

On the flip side, organic traffic on the brand name will be a lot lower in the funnel, but to be totally honest, it’s not really even organic traffic. A brand should rank for its own brand name, or something is very wrong.

The real SEO

True SEO efforts will have a site earning significant visibility on the long tail – the types of words that would hardly be profitable to put paid dollars behind simply because they would take too long to convert. As the user moves down the funnel, their queries will skew closer to head terms, and this is when they might engage with paid advertising.

Once the user gets to the bottom of the funnel and has buyer intent, they are more likely to click a paid ad – either on the brand name or retargeting on another site. A last-click-driven attribution system will then give 100% of the conversion credit to the paid channel and completely discount all the organic clicks that happened over the prior time period.

Organic is an assist

Applying this to a sports metaphor, that last click might be the basketball slam dunk or the hockey goal, but it was all the other prior clicks that set up the perfect sequence for someone to bring the ball or puck home.

In reality, changing attribution systems is complex and unlikely to happen in a short period of time just because someone wants it to. However, there is still no excuse for not having a better view of the organic channel’s performance and a clearer case for investing more in it. To that end, executives need to be aware of where SEO fits in the funnel and manage expectations accordingly.

To illustrate this with an example, let’s look at someone using search to plan a vacation.

The first query might be very general just to get ideas.

As they move further down the funnel they settle on a place to travel.

Assuming they know the dates they want to travel they start exploring transportation.

They also check out their hotel options.

Throughout this entire process they may have visited many various sites from local chambers of commerce, review sites, hotel sites, online travel agencies and aggregators.

As they finally decide on their options and get any necessary traveling partners on board, they are ready to purchase. They search directly for the site where they found the best deal.

If a paid ad comes up first, so be it; they are clicking. In the last-click attribution world most less sophisticated sites use, all of the credit would have gone to that very last click. The potentially months’ worth of effort on planning that vacation through various pathways would have all fallen by the wayside from an attribution standpoint.

Multi-touch attribution is the goal

The ultimate goal of every site should be to use a multi-touch attribution model, but getting to this ideal is not as simple as changing a t-shirt. There is a significant amount of effort to gather data, build data lakes, test out models and buy the tools necessary to support the process.
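To make the contrast concrete, here is a minimal sketch, in Python, of last-click versus linear multi-touch credit assignment. The journey below is the hypothetical vacation planner’s, and real models weight touchpoints far more cleverly than an even split:

```python
# A minimal sketch contrasting last-click attribution with a simple linear
# multi-touch model. The touchpoint labels are invented for illustration.
journey = ["organic: destination ideas", "organic: flight options",
           "organic: hotel reviews", "paid: brand search ad"]

def last_click(touchpoints):
    # 100% of the conversion credit goes to the final touch
    return {tp: (1.0 if i == len(touchpoints) - 1 else 0.0)
            for i, tp in enumerate(touchpoints)}

def linear(touchpoints):
    # Credit is split evenly across every touch in the journey
    share = 1.0 / len(touchpoints)
    return {tp: share for tp in touchpoints}

print(last_click(journey))  # the paid ad takes everything
print(linear(journey))      # organic's three assists each earn 25%
```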

There may never be a perfect way to attribute organic traffic, but at least with the knowledge of where SEO traffic really fits in the marketing mix, the best integrated marketing strategy can be built. SEO should carry the baton on all the deeper research efforts, but the baton can be passed to the performance channel when customers are ready to pull out their credit cards.


Visualize internal linking like an airline route map

Links are a critical part of Google’s ranking algorithms, as a link to a page is a vote of popularity and, at times, contextual relevance. The authority lent by an inbound link doesn’t just apply to external sites linking in; the same applies to internal links (links between pages within a site) too. A website draws its overall authority score – PageRank, as Google’s ranking patents refer to it – from the sum of the authority of all the sites that link into it.

The best way of explaining this is to use the words from Sergey Brin and Larry Page’s original research:

Academic citation literature has been applied to the web, largely by counting citations or backlinks to a given page. This gives some approximation of a page’s importance or quality. PageRank extends this idea by not counting links from all pages equally, and by normalizing by the number of links on a page. PageRank is defined as follows: We assume page A has pages T1…Tn which point to it (i.e., are citations). The parameter d is a damping factor which can be set between 0 and 1. We usually set d to 0.85. There are more details about d in the next section. Also C(A) is defined as the number of links going out of page A. The PageRank of a page A is given as follows: PR(A) = (1-d) + d (PR(T1)/C(T1) + … + PR(Tn)/C(Tn)) Note that the PageRanks form a probability distribution over web pages, so the sum of all web pages’ PageRanks will be one.

To the layman, this is saying that each page begins with a base score, but its final score is a function of the scores of the pages that link into it, with each linking page’s score divided among its outbound links.

In this calculation, the most linked page on a website will tend to be its homepage, which then distributes that authority throughout the rest of the website. Pages that are close to the homepage, or linked to more frequently from pages linked from the homepage, will score higher. In this regard, achieving the right mix via internal linking is critical.
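To make the arithmetic concrete, here is a minimal sketch in Python of the power iteration implied by the formula quoted above; the five-page site and its link graph are invented for illustration:

```python
# A minimal PageRank sketch following the formula quoted above:
# PR(A) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))
links = {
    "home":     ["about", "blog", "products"],
    "about":    ["home"],
    "blog":     ["home", "products", "post-1"],
    "products": ["home"],
    "post-1":   [],  # weakly linked: one inbound link, no outbound links
}

d = 0.85                            # damping factor, as in the original paper
pr = {page: 1.0 for page in links}  # every page starts with a score of 1

for _ in range(50):  # iterate until the scores settle
    pr = {
        page: (1 - d) + d * sum(pr[src] / len(out)
                                for src, out in links.items() if page in out)
        for page in links
    }

print(sorted(pr.items(), key=lambda kv: -kv[1]))  # the homepage scores highest
```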

Inbound link authority

Additionally, the homepage will never be the only page that receives authoritative external links, so if an internal page is the recipient of a powerful external link but doesn’t link to other pages, that external link is essentially wasted. When pages link to each other the authority of all external links is funneled around a site to the overall benefit of all pages.

For sites with a flat architecture or only a handful of pages, a proper internal link structure is simple and straightforward, but on large sites improving internal links can be as powerful as acquiring authoritative external links. (A large site can even be one that only has one hundred pages.)

Large site challenge

Due to the nature of how many large sites are structured, there are invariably going to be orphaned pages – defined as pages that don’t have any (or many) links pointing into them. Even a media site like a blog or daily news site with a very clean architecture – each post/article lives under a specific day – will have an internal linking challenge.

More than likely, the site will desire organic traffic that isn’t just someone searching out that day’s or recent news. There will be posts that it might hope would be highly visible many years into the future. Think of the review of a product on its launch day, which is relevant as long as the product is on the shelf. Or a well-researched item which explains how something works, like the electoral college. Granted, these posts were published on a certain day, but they are relevant for many queries essentially forever.

Ideal link architecture

As you might imagine, for all sites with this challenge, creating an ideal link architecture that flows links around the site can have a huge impact on overall traffic as these orphaned or weakly linked pages join the internal site link graph. 

How to improve the link graph

Implementations of related-page algorithms on each page – quite simply, a module with related links – to crosslink other pages can go a long way toward supporting this link flow, but only if the algorithm isn’t tightly tuned to a specific relationship. Sometimes when these algorithms are developed, they key off specific connections between pages, which has the effect of creating heavy internal linking between popular topics while still leaving pages orphaned or near-orphaned.

There are three possible ways to overcome this effect (a sketch of the first follows the list):

  • Add a set of random links in the algorithm and either hard-code these random offerings into the page or refresh the set of random pages whenever the cache updates. Updating on every page load might be resource-intensive, so refreshing as infrequently as every day would achieve the same outcome.
  • In addition to related pages, include a linking module for ‘interesting’ content – driven by pure randomization – also refreshed as in the first recommendation.
  • Include a module on every page for the most recent content, which ensures that older pages are linking into new pages.
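
Here is a minimal sketch, in Python, of the first recommendation: a related-links module that supplements algorithmically related pages with a few random ones so that no page stays orphaned. The page names are invented, and in practice the candidate pool and the related-page scores would come from your CMS or page index:

```python
import random

def related_links_module(page, related, all_pages, n_related=4, n_random=2):
    """Return link targets for `page`: related pages plus random extras."""
    links = related.get(page, [])[:n_related]
    # Draw random pages, excluding the page itself and anything already chosen
    pool = [p for p in all_pages if p != page and p not in links]
    links += random.sample(pool, min(n_random, len(pool)))
    return links

# Hypothetical usage: cache this output and refresh it daily (e.g., via cron)
# rather than on every page load, which would be resource-intensive.
all_pages = ["electoral-college", "launch-day-review", "how-it-works",
             "old-post-1", "old-post-2"]
related = {"launch-day-review": ["how-it-works"]}
print(related_links_module("launch-day-review", related, all_pages))
```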

As an aside, I also like to always build an HTML sitemap for all large sites, as this gives one place where every single page is linked. If the sitemap is linked in the footer, it will achieve the goal of having most pages just two clicks from the homepage. Transparently, Google’s John Mueller has suggested that HTML sitemaps aren’t necessary, but I have always found that on large sites they can be very powerful.

Visualizing internal link graphs

To visualize what a desired structure of internal linking should be, I tend to think of a site’s link graph like an airline route map.

Singapore Airlines

The least effective internal link graph looks like the route map of a national carrier for a small country. These air carriers will have a single hub in their capital city and then spokes pointing around the world from that hub. Consider the route map of Singapore Airlines, which has impressive reach for a flag carrier; with only a few exceptions, all their flights terminate in Singapore.

Flipping this visual over to websites, think of the hub as the homepage. The homepage links out to all of the other pages, but very few of the internal pages link to other pages.   

United Airlines

The most common type of link graph looks like the route map of a large global carrier. Look at United Airlines as an example. There are very clear hubs (San Francisco, Los Angeles, Chicago, Newark, Houston, Denver…) and these hubs connect to each other and other smaller satellite cities.


Again, flipping this over to websites, the homepage would be the biggest city on the route map: Newark, which links to all the other big cities in addition to all the hubs. The other hubs would be important category pages which have a lot of inbound links and then link out to all the other smaller pages. In this link graph, important but smaller pages would only have one pathway to get to them. (As an example, Mumbai is only connected to Newark.)

The ideal internal link graph looks like the route map of a budget airline that thrives on point-to-point connections. To the bicoastal business traveler this route map makes no sense, but the wandering tourist can get anywhere they need to go as long as they can handle many stopovers. Southwest Airlines is a great example of this structure.

Southwest Airlines

Southwest has such a complicated route map that they don’t even show it on their website. You would have to choose a particular city and then see all the places you can get to directly. There are certainly some more popular cities within their route map, but their direct flights almost seem to be random. A traveler can fly directly from Cleveland to major travel gateways like Atlanta, Chicago and Dallas, but they can also go to Nashville, St. Louis, Tampa and Milwaukee.

This is how a website should be structured. Pages should link to important pages, but also link to other pages that seem to be random. And, those pages should link back to important pages, and link to other random pages.

Summary

To summarize, think of a search engine crawler passing from one page to another calculating authority as a traveler intent on flying to every city on an airline’s route map without ever needing to go to a single city more than once.

On Singapore Airlines, a traveler could get from Mumbai to Frankfurt via Singapore, but to get to Paris (without a codeshare) they would need to go back through Singapore.

On United Airlines, a traveler could get from Portland to Dallas via Denver and then could go on to Fort Lauderdale via Houston. They would certainly make it to a number of cities, but at some point they would find themselves connecting through Houston or Denver again.

On Southwest Airlines, a traveler could begin their journey in Boise, Idaho on any one of the ten non-stop flights and make it to nearly every city without ever needing to repeat a city.

Build your internal link architecture like the Southwest Airlines route map and you will never have an orphaned or sub-optimally linked page again.


SEO tools: My full list

With the trillion-dollar-plus annual value of SEO traffic, there is no lack of tools that help to understand Google and optimize websites accordingly. This is not meant to be an exhaustive list of tools that can be used to help with SEO; rather, it is a list of tools that I personally use and find useful. I will continuously update this list as I discover more tools.

Crawling

When working on a large domain, understanding the size of a site and how it might be viewed by Google is critical. A good crawler will crawl a site similarly to how a search engine would discover all the pages on a site. The crawl of the site should surface any technical SEO issues that exist and should be fixed. For crawling I have a handful of go-to options, each with specific pros and cons.

Screaming Frog’s desktop crawler should be a staple for anyone doing technical SEO. For smaller sites the free version should be perfectly fine, and for larger sites there is a nominal fee to access the pro version. Screaming Frog allows users to crawl a site and then manipulate the data in a spreadsheet, which is, in my opinion, the preferable way to handle large data sets.

My new favorite desktop crawler is SiteBulb, which offers many of the same crawling capabilities as Screaming Frog but also has some amazing visualization tools that negate the need to build charts in Excel. Additionally, it doesn’t have the memory leak issues of Screaming Frog, and I have crawled hundreds of thousands of pages without needing to chunk out pieces of the site like I have done with Screaming Frog.

For cloud based crawling, I use both Oncrawl and Deepcrawl. They both offer similar capabilities and the decision over which one to use would be one of personal preference.

Backlink research

I learned SEO by doing backlink research on Yahoo Site Explorer and then replicating my competitors’ backlinks on my own site, so backlink research is an SEO process near and dear to me. Throughout my SEO career I have experimented with nearly every backlink tool, from Moz to Majestic to SEMRush. My current favorite tool for backlink research (and many other functions) is Ahrefs. Again, though, this is a personal preference, as most backlink tools will have similar enough data to take action on for a link building campaign.

Competitive Research

Most link building tools will help with basic competitor research, showing the types of links that competitors have as well as the keywords they rank on. For more specific competitor research I like to use Similar Web, which gives me data around total traffic and the percentage of organic traffic. I also use Alexa.com (yes, it’s still around) to show keyword intersections between websites.

Keyword Research

When I first started in SEO, there were only a few keyword research tools and none were very good. We are lucky that there are now so many options for generating keywords that real users search. I typically try to optimize towards users and the queries I need for my sites rather than on search volume alone, but having this data can be very helpful for prioritization.

When I am working with a large site and a big Google Ads spend, I find the data in search query reports from Google Ads and the Google Keyword Planner to be very useful.

I also use the competitive tools mentioned above to find new potential keyword ideas as well as keywords that competitors rank on.

For gathering Google suggest terms – which are terms that users actually search, and that is why Google suggests them – I use KeywordTool.io. This tool will also offer suggested ideas from other search engines, including YouTube and Amazon.

To build huge lists of keywords, I have used Scrapebox, KeywordSheeter, and Kwfinder.

As people view Google more as a friend than a search engine, they ask more questions within search, and AnswerThePublic is a great tool to get question-based queries.

International

International SEO efforts will be similar to domestic SEO except that they are focused on another country or language. Most of the SEO tools on this list will work for international SEO as long as the language or country setting is changed.

For sites that need to build out hreflang tags, this tool from Aleyda Solis can be very handy, but larger enterprises that might need more help can use hreflangbuilder.com.
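
For illustration, here is a minimal sketch in Python of the kind of output these tools generate: a set of hreflang link tags built from a locale-to-URL map. The locales and URLs are placeholders, and every listed page should carry the full reciprocal set in its head section:

```python
# A minimal hreflang tag generator; the alternates map is hypothetical.
def hreflang_tags(alternates):
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in alternates.items()
    )

print(hreflang_tags({
    "en-us":     "https://example.com/",
    "es":        "https://example.com/es/",
    "x-default": "https://example.com/",
}))
```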

Site speed

Site speed is a part of the Google ranking algorithm, but not a major part. The main reason anyone should care about the speed of a website is that users will bounce if a site is too slow. For site speed, I usually use multiple tools because they are free and quick, so more data can’t hurt. Pingdom.com gives a waterfall of how a page loads. Gtmetrix.com displays helpful information on what can be fixed to improve a page’s or site’s speed. I also use Google’s page speed tool, which adds some additional info on mobile page speed.

My favorite site speed tool is simply throttling my mobile device to 3G and seeing how fast a webpage loads. If it takes too long to load, then I can assume a real user would bounce on a similarly slow connection.

Rankings

As I have written previously, I am absolutely not a fan of checking search rankings, because I think it is the wrong metric to look at. However, there are specific use cases where I find rankings to be very helpful. If a site made a number of changes, I might download a prior month’s queries and then put them into a rankings tool to see if there has been a massive shift from previously reported Google Search Console positions.

The two tools I use for rankings are Link Assistant’s Rank Tracker, which runs ranking queries off my desktop until I get captcha-blocked by Google, and my favorite: Rank Ranger. In addition to rankings, Rank Ranger has a ton of other features, including competitive insights, schema creators, and social analytics.

Data

My go-to SEO tool is Google Search Console, which is free and which everyone should be using, even if they don’t trust the data as much as they should. Google Search Console data can be pivoted in multiple ways to find insights, but the UI is still a bit limited. The way around this is to pull down data via the API. To access the API, I use a Google Sheets plugin called Search Analytics for Sheets. I have found that this tool has had issues recently with maxing out its API calls, so I have had to fall back on building my own lookup tools in R Studio, which you can do too if you follow this guide.
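
For anyone going the API route directly, here is a hedged sketch in Python using the google-api-python-client library’s Search Console (webmasters v3) service. It assumes OAuth credentials are already set up, and the site URL and dates are placeholders:

```python
from googleapiclient.discovery import build

def top_queries(creds, site_url="https://example.com/"):
    # Build the Search Console service and pull query-level performance data
    service = build("webmasters", "v3", credentials=creds)
    response = service.searchanalytics().query(
        siteUrl=site_url,
        body={
            "startDate": "2019-01-01",
            "endDate": "2019-01-31",
            "dimensions": ["query"],
            "rowLimit": 1000,
        },
    ).execute()
    return [(row["keys"][0], row["clicks"], row["impressions"], row["position"])
            for row in response.get("rows", [])]
```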

Optimization and testing

Most of my testing and experimentation happens manually, because getting the keys to the kingdom that is the codebase of a website is challenging. If you are able to get access to either Cloudflare or another CDN, you can use tools like Distilled ODN and Clickflow to do SEO learning at scale. Absent that, you should absolutely be testing by making single-variable changes on multiple pages and then recording the clicks/impressions over a long time period to discover if there are any statistically significant learnings.
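
As a sketch of what “statistically significant” can mean here, the following Python snippet runs a two-proportion z-test on CTR before and after a single-variable change, such as a new title; the click and impression counts are invented placeholders:

```python
from math import sqrt
from statistics import NormalDist

def ctr_z_test(clicks_a, impr_a, clicks_b, impr_b):
    # Two-proportion z-test: did CTR change significantly between periods?
    p_a, p_b = clicks_a / impr_a, clicks_b / impr_b
    p = (clicks_a + clicks_b) / (impr_a + impr_b)  # pooled CTR
    se = sqrt(p * (1 - p) * (1 / impr_a + 1 / impr_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-tailed
    return z, p_value

# Before: 420 clicks / 15,000 impressions. After: 530 clicks / 15,500.
z, p_value = ctr_z_test(420, 15_000, 530, 15_500)
print(f"z = {z:.2f}, p = {p_value:.4f}")  # p < 0.05 suggests a real change
```

This treats impressions as independent trials, which is a simplification; longer windows and stable impression counts make the comparison fairer.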

Enterprise

For SEO teams that need quick access to tools that can do everything all in one place, there are enterprise SEO tools that can also help with writing bug requests, tracking and dashboards. Searchmetrics is the enterprise SEO tool I have been using for many years, and they now also have tools which help content teams draft content that includes all the related keywords ranking on other sites.

The goal of any SEO tool is to make SEO less manual and more efficient. These are the tools that I use on most SEO projects, and I love that new tools keep being produced to make SEO even easier! If there are any tools I missed or might not have heard about, please let me know.


SEO Personas as the foundation for keyword research

The idea of building elaborate customer personas is very popular on design teams and in various marketing teams, but so few people actually use them in their daily work that the investment in creating them is hardly worth it. Many times, when companies build out these personas, they go overly deep into developing exactly who these customers might be and all of their character traits.

Personas for SEO

Personas might be passé, but when it comes to SEO I think that some sort of persona research must be the foundation of any good keyword research. Too many people begin a process of keyword research by firing up their favorite keyword tool and then picking keywords off the list with high monthly search volume that are relevant to their business.

Starting with keywords sorted by volume puts the emphasis on the wrong metric and leads to creating content that might not match the intent of a user or the needs of a website. In my opinion, it makes the most sense to prioritize exactly the kind of content that is needed to help a website monetize. Some keyword tools might show this after the content has already been written by doing a “keyword gap analysis” vs a competitor, but it is a lot easier to just determine what content is necessary at the outset.

The easiest way to figure out exactly what content is necessary is to go through a persona exercise to understand exactly how, why and what users want from the website. Only once the users’ needs are taken into account does it make sense to distill those topics into precise keywords.

Persona research should answer questions such as where in the buying funnel a user might be. This will guide the depth of content a user expects to see. It is also important to understand the devices that a user will be using to access the website. Is it a desktop? A mobile device? Or maybe the user can be served just with a voice-enabled device. Knowing this can quickly help decide whether long-form or image-heavy content is even appropriate.

Before embarking on this effort, it is worth acknowledging that existing personas likely will not be detailed enough for you to use for SEO and it is not a wasted effort to build personas from scratch just for SEO. The current personas that the company might be using will have details that are not necessarily helpful for SEO like age, gender, and career details.

Steps to build SEO personas

With that in mind, here are the best practices on developing personas specifically for SEO.

  • Identify all potential users of a website or product
    1. This is where keyword research as the start of an SEO effort typically falls short. Just because a website or product exists doesn’t mean that users will automatically want to search for it. Taking a step back to think about who the users of the website might be gives a good foundation for what kind of content and keywords to focus on. For example, an ecommerce website might want to target people that have a specific need, and the focus of SEO should be on solving that need rather than just optimizing the product page. A SaaS product might have a similar phenomenon, and targeting the problem rather than the solution would yield more search traffic.

  • Determine how the users might search based on where they are in the funnel.
    1. Again, traditional keyword research would only identify the popular terms for a vertical but not how the targeted users will search. Users very high in a funnel will be searching for a solution to a problem while users at the very bottom will be looking for the brand plus pricing info.


  • Slot them into the type of content that they might expect.
    1. There is a lot of advice around what kind of content is best for SEO, but none of that advice considers the granular needs of a specific user. If a user wants just a price or a list of features, they will be ill-served by a long form piece of content while a user that wants a detailed product review would similarly not be helped by a quick list of bullets.

  • Match them with a specific call to action that is relevant for the place they are in the buying funnel.
    1. Search traffic is a means to an end and is rarely the end itself. Even on a media site that targets readership, an increasing user count is of no benefit if the users don’t take a follow-on engagement action. Where the users are in a buying funnel should determine the appropriate call to action (CTA) for the content. A reader that is very low in the buying funnel might be looking for a way to contact a salesperson, while a user high in the funnel should be encouraged to just read more or maybe subscribe to a mailing list. When content is written for users rather than keywords, it becomes a lot easier to have a targeted action type for users to take.

  • Classify the types of devices they would be using to access the content.
    1. While we constantly hear the refrain that the mobile web is dominant, this is not necessarily carried forward into executing SEO efforts. If it were, long-form content would have fallen by the wayside in favor of short, punchy, shareable bits. Even though nearly every web user has a mobile device, there are some things that will always be done on a desktop. Buying business software or expensive shopping is probably going to involve a bigger screen somewhere in the buying cycle. Where users are in a buying cycle should be a key factor in the screen size they will potentially be using to access the content.

  • As a bonus, pigeonhole them into a precise language or culture for internationalized content
    1. One last thing for sites that have international audiences: it’s critical to know what language users might expect to see content in and whether there are any cultural nuances that should be addressed. What many people that have never done international marketing might not know is that it’s OK to have just English content for an international audience. Users might not expect a translated page, so it is better to just give them content in English that contains the international options they need, like shipping or currency. Understanding the users will prevent a website from creating language-specific content unnecessarily.

With these best practices in mind, hopefully you will be able to develop SEO-specific personas that will guide keyword research. Keyword research, like everything in SEO, should be targeted at real users – not search engines – and a persona exercise will go a long way toward knowing who those real users might be.


Crawl Budget – It’s really a simple SEO concept

The phrase “crawl budget” is an SEO term that is frequently included in discussions about technical SEO, but it is typically used incorrectly. Most of the time when people refer to crawl budget, they consider it a technical SEO enhancement to improve the way Google understands a website. In fact, it is far simpler than that: it is simply a budget.

The best way of understanding various aspects of Google’s algorithms is to view them from a financial standpoint. Crawling and indexing the web is a very expensive proposition and Google was able to beat out every search engine to dominance because they figured out how to do that before the money ran out.

While it would be ideal for Google’s crawlers to simply gobble up the entire web in one fell swoop, that would be technically impossible. Crawlers need to literally crawl through the web, discovering link after link, and as they land on a page they need to build a copy of the page into the database.

It’s about the Benjamins!

In the early days of search, while Google was still living on venture capital money, the engineers needed to come up with a way to efficiently crawl the web without going broke in the process. That way was to decide how much “budget” each site was allocated based upon its importance to Google and the web as a whole. That is crawl budget.

If a site is very important to the ecosystem, Wikipedia for example, Google would have wanted to allocate a lot of their hypothetical dollars to crawling as much of the site as they could. Alternatively, a brand new website with no authority on the web would be allocated a significantly smaller amount of budget.

New websites

This all makes logical sense. Taking this logic one step further, if a brand new website had thousands of pages but only a few of them were valuable, it would be very likely that its budget would be eaten up by the crawler ingesting the lower quality pages without ever seeing the good ones.

The best approach for a website in this position is to simply declare – via noindex directives or canonical tags – which pages are the lower quality ones, so the crawlers can just skip them.
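
As a quick illustration of what “declaring” looks like in practice, here is a hedged sketch in Python (using the requests and beautifulsoup4 libraries, with placeholder URLs) that checks whether a page carries a robots noindex meta tag or a canonical pointing elsewhere:

```python
import requests
from bs4 import BeautifulSoup

def crawl_directives(url):
    # Fetch the page and read the two signals a crawler would respect
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    noindex = robots is not None and "noindex" in robots.get("content", "").lower()
    canonical = soup.find("link", attrs={"rel": "canonical"})
    return noindex, canonical.get("href") if canonical else None

for url in ["https://example.com/", "https://example.com/filters?page=2"]:
    noindex, canonical = crawl_directives(url)
    print(url, "->", "noindex" if noindex else "indexable", "| canonical:", canonical)
```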

A happy example

To illustrate this with an example, think about a website like a Happy Meal with a toy inside. You have a certain amount of daily budget to buy Happy Meals, but you only need the unique toys to complete a series. The only way to find out whether the toy is unique is by buying the meal and opening the box. So, every time a Happy Meal is bought and a duplicate toy shows up, that day’s budget is wasted – unless you were very hungry. The most efficient way to do the toy collecting is simply to show the name of the toy on the outside of the box, and then you would choose only that box.

Continuing this Happy Meal to website analogy, those noindex directives and canonical tags are the best way of informing a search engine to ignore a particular box. The crawler then has more awareness of how to most efficiently spend its limited budget.

Crawl budget summary

This idea of crawl budget applies to every website on the web regardless of authority; it’s just that more authoritative websites have more budget to be expended by the crawler. As websites gain authority – likely via links or other user engagement signals – their budget will expand, but without that there is no other way to get more budget.

Google refers to this as “crawl demand” and while they don’t specifically mention authority in their blog post on crawl budget, they sort of beat around it by calling it “popularity.”

Even if the crawl rate limit isn’t reached, if there’s no demand from indexing, there will be low activity from Googlebot. The two factors that play a significant role in determining crawl demand are:

  • Popularity: URLs that are more popular on the Internet tend to be crawled more often to keep them fresher in our index.
  • Staleness: our systems attempt to prevent URLs from becoming stale in the index.

This idea of budget was a key component of Google’s crawling algorithm, and it still exists today, although the budget is vastly expanded. Google now has lots more money and resources to crawl the web, but the web is also bigger and more complicated.

Crawl budget today

One other change is that budget was likely initially calculated in small amounts of kilobytes, which equated to a number of pages. That budget can be eaten faster if a site has dynamic scripts that are more expensive for the crawler to run.


Annual and quarterly planning for SEO

In addition to shorter days and colder nights (at least in the Northern Hemisphere), September brings a special gift to the office which everyone claims to welcome – but in truth despises: annual planning. Many large companies will have some sort of quarterly planning process where, in the last couple of weeks of a quarter, teams have to detail how they have progressed on the current quarter’s goals and what they hope to accomplish in the coming quarter.

Annual planning

While this process isn’t necessarily the most fun, it can be straightforward, as it has the benefit of making predictions about things that are already in flight. The same can’t be said for an annual planning process, which is typically conducted in the closing quarter of a year and requires a team to pick a goal for the coming year and negotiate for the resources to achieve said goal.

Winning at annual planning

Teams that “win” the planning process – in the sense that their goals are accepted and their resource requests are met – can in some ways rest for the next 12 months until this process begins again. In most organizations they will not necessarily be held to their goals because, of course, a lot can change over a year. Their real win is that they were given the resources (defined as budget and/or headcount) to meet their goal(s), and if they did not meet that goal for whatever external reasons, they were still able to deploy their resources to make some other flashy business impact.

This business impact – even if it was not the one stated in the annual planning process – positions them in a very good light in the following year to again state lofty goals and request resources to help them get there.

Losing at annual planning

On the flipside of the teams that win the planning process are the teams that “lose.” These are teams whose goals are not adopted as goals worthy of being resourced and their requests will be turned down in favor of other teams. They will still have a goal for the coming year, but it will likely be somewhat watered down due to the lack of resources. This team could find themselves generally deprioritized and locked out of new headcount and budget for the entire next year. When it comes time to develop goals and make a request for resources in the following year, they start off with a significantly weaker hand as they don’t have the benefit of making a huge impact in the prior year.

Looking at planning in this light, the stakes could not be higher. Teams must win at the planning process and successfully pitch executives to get behind their goals or risk becoming a non-essential line item until something fundamentally changes in the business. The only exception to this favored/non-favored child reality is if there is a high-level executive backing a team or if there is a shift in power within a company like a re-org, fundraising event or new product need.

Due to the nature of where SEO teams sit within a large organization, they immediately begin this planning process behind the curve. In most companies, SEO either sits within marketing or product and this placement has a big effect on how they do in a planning process.

SEO in marketing

If an SEO team is a part of the marketing team, they likely report up to a marketing leader of some sort. This leader will be juggling budget and headcount requests from teams that can make far stronger 1+2 = 3 pitches. For example, a paid marketing team can show very clear math of how additional paid budget will impact acquisition, retention or awareness. They can also show how added headcount will improve the efficiency of their spend thereby adding more value to the organization.

The same argument might work for a content team that could show how output (the metric that content teams are measured by) will improve by a factor of how much more is spent on producing content.

Contrasted with the SEO team, which has fuzzy math for how SEO works and even fuzzier math for what more spend might get, a pitch for resources is a losing proposition on a marketing team. This is why most SEO teams, even in companies with other large marketing sub-teams, will still only consist of 1 or 2 people.

SEO in product

When an SEO team sits on a product team, the uphill climb to win resources is slightly easier but not by much. Rather than a pitch for just budget and headcount, the resource ask will also likely include engineering time. In a company that does not prioritize SEO, the engineering time request might get shunted aside in favor of building new products or improving existing products.

Logically, this makes a lot of sense, as what product leader would rather have a roadmap full of improvements to existing products than one with all sorts of exciting new builds?

How to win resources

To win at the annual planning game, the SEO team is going to have to morph into the kind of team everyone loves to fund and support. This will apply regardless of where the SEO team sits in the org.

First and foremost, the whole idea of using wishy-washy data to forecast SEO impact must be completely discarded. There is data within the company and Google Search Console which very clearly tells the story of how valuable SEO traffic is for top-line acquisition. There may even be data on how it impacts the bottom line too. Get that data and use it as the guiding light within an annual goal.

Is the plan to increase SEO traffic by 10%? That’s not a good goal! Back into what a 10% organic traffic increase might mean to top-line revenue and use that number as the goal. An increase of 10% of overall web revenue from organic sources sounds a whole lot sexier than an unclear 10% increase in traffic.
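
A minimal sketch of that back-of-the-envelope math, with every input invented for illustration (pull the real numbers from analytics):

```python
organic_sessions = 200_000   # monthly organic sessions (placeholder)
conversion_rate = 0.02       # organic session-to-order rate (placeholder)
average_order_value = 85.00  # dollars (placeholder)

lift = 0.10  # the proposed 10% organic traffic increase
incremental_revenue = (organic_sessions * lift
                       * conversion_rate * average_order_value)
print(f"~${incremental_revenue:,.0f}/month incremental revenue")  # ~$34,000
```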

Second, many other teams will pitch ideas that they don’t really know how they will implement but want resources to try. They certainly aren’t pitching the process of how they may or may not get there; rather, they are saying they are going to build X, and X requires engineers. The SEO team should do the exact same thing.

Instead of asking for engineers to update a whole bunch of SEO requirements, ask for engineers (or content or money) to build X for SEO.

Third, and this is specific to each company, there is a process for how every other product and marketing team pitches for resources. Make sure that the SEO pitch looks exactly the same and that there is data to show even greater impact than other efforts. Keep the SEO jargon out of the pitch and use the same language that everyone else uses. The last thing an SEO team wants is for their pitch to stand out just because it was different.

Annual planning from a leadership standpoint

The prior advice mostly applies to SEO practitioners pitching for resources or those pitching SEO asks to the C-suite, but it can just as well be modified for leaders being asked for the resources. Know that SEO is incredibly important, and if the pitch for resources doesn’t have the clear 1+2 = 3 approach, then push it back to the team for a revision; don’t just reject it out of hand.

Summary

Annual planning is a process no one really enjoys other than the people who run the process for everyone else. It is a necessary evil, and it isn’t going to go away if it is ignored. If an SEO team does not put their best foot forward, they could risk losing an entire year to no-impact efforts.


PSA: Using rankings to track SEO success is dumb

It is amazing to me that in 2019 anyone would still use rankings as a success metric for an SEO campaign. Rankings are a vanity metric and do not directly or even indirectly contribute to the success of a business.

Using a rankings report as the only way to measure SEO progress is as asinine as using a paid marketing budget total as a metric of success. All a big budget shows is that someone can spend money; it says nothing about whether it is profitable. Rankings just exhibit the ability to be ON Google, not whether anyone even clicks or buys.

Rankings in the past

When I first started my career in search engine optimization, the critical metric of success was ranking in search results, and most importantly how many number one positions one occupied. It did not matter whether those were useful positions or even if anyone clicked; just having them was a bragging right. As an added bonus, Google wasn’t the only search engine that anyone cared about, so having a top result on MSN.com or Yahoo also generated some SEO applause.

Keywords to rank in a top position for were chosen using keyword research tools, with preference given to exact match words that had high average monthly search volume and no heed paid to intent. Those keywords were then plugged into on-page meta, spun into content at a high keyword density and, most significantly, used in anchor text for deliberately built external links.

To keep track of these positions, SEOs had to use a slew of tools whose primary function was to scrape search engines on a weekly, daily or even hourly basis for the latest rankings. Executives asked to see these reports, and hence having a huge list of prominent top positions was a key component of an SEO role. This whole process worked, and then it didn’t, because things changed.

It wasn’t just one change, EVERYTHING changed.

For starters, Google became the dominant search engine, if not the only search engine anyone cares about. Google earned this role by rapidly improving the search user’s experience, which was a direct result of rooting out the kinds of practices that made pages rank undeservingly.

The list of changes on the search side over the last ten years is endless, but some of the key highlights are:

  • Panda – This Google update from 2011 deprioritizes thin content that is targeted at just generating search clicks. It used to be a separate algorithm that ran alongside the core algorithm, but now it is just a part of the ranking algorithm.
  • Penguin –  This algorithm update from 2012 negates the impact of illegitimate links and penalizes sites that engage in these practices intentionally.
  • Local –  A national/global result is irrelevant if Google determines that there’s a local intent behind a query.
  • Google suggest – Google’s query suggestions continue to become richer as they use real-time data and trends to direct people into a search funnel.
  • Intent matching – Results for synonyms, misspellings, and pluralizations are nearly identical provided that the intent behind the phrasing is the same. Choosing word order and pluralization is no longer a necessary SEO task, as results will usually be similar if the intent is the same.
  • Entity matching – Google can parse the entity a user is seeking and will show relevant results. Additionally, the number of entities that can be highlighted in search continues to grow.
  • Mobile – The rise of mobile search changes everything with regard to how people search, whether with touch-only keyboards or voice search. Users with Android phones can already conduct searches with their cameras through Google Lens, and expect the ways that searches are conducted to continue to grow as device technology improves.
  • Artificial Intelligence – Google is not just reliant on words that are in the query or even on the page. They can parse meaning and intent without any apparent match between content and query.
  • Artificial Intelligence and deep learning – Whether Google uses engagement metrics in rankings is up for debate, but Google certainly has enough machine learning capability to know how content should perform in search results without even needing to gather real engagement metrics.
  • Ranking signals – In years prior, it was thought that content + metadata + links were the key components of ranking. They are still important, but Google claims to use hundreds of other signals, so even with the most optimized content and links, it’s impossible to force a result.
  • Zero click results – To minimize information arbitrage and to provide an even better user experience, Google will put answers directly into search results. This negates the value of a top-ranking result, as many users will choose not to click any results at all.
  • Size of the web – For anything that is of high value, there are now hundreds to thousands of sites chasing the same traffic. Generating search traffic requires being far more creative than just picking keywords off a list from a keyword research tool.

With all of this in mind, manipulating a particular ranking is virtually impossible, and even if one were successful, all that effort might be for naught.

SEO success metrics

The primary success metric for SEO is, and should always have been, the same as for every marketing channel: the amount of revenue, leads, visitors, etc. that the business needs to be successful. If every other marketing channel is contributing to the bottom line, or at least the top line, and organic traffic is not, there is an issue that needs to be addressed. Patting oneself on the back for great rankings in this situation will be little solace if the business goes under for a lack of cash.

Some businesses, especially those with long sales pipelines, may have attribution challenges in tracking channel performance back to organic traffic, as search traffic will typically be mostly top of funnel. In this case, the fallback should be clicks from search engines, but effort should still be made to determine that the clicks are of value.

Even if it’s impossible to determine the business outcomes, the business should still be looking at the engagement rates – bounce rate, pages per visit, and time on site – from this source of traffic. If the engagement rates are too low to ever lead to a conversion event, there is an issue, and the rankings leading to the clicks are of no value.

Summary

Rankings alone as a KPI for SEO are a vanity metric and should never be used in budgeting, financial modeling or any other important business conversation. SEO should be judged in the same vein as every other marketing channel, and if it can’t be, then appropriate proxies that correlate to business KPIs should be used. It is 2019, and Google has robotaxis on the road; we should stop pretending that they are the same search engine as in 2009.
