Eli Schwartz


The head keyword is obsolete

Initially, Google’s core algorithms were focused on ranking its index of websites in an ordered list in relation to a user’s query. As the algorithms matured, Google incorporated artificial intelligence to try to better understand what a user is seeking and to help them search better.

(Tangential note: Google recently announced they have deployed a neural network algorithm called BERT that is focused on better query matching.)

With this goal in mind, Google uses a few very visible tools:

  1. “Did you mean” – When Google believes you meant to search for something other than what you typed, they will suggest another query. Depending on how certain they are of this other query, they might show the new query’s results by default or just give a clickable link to run that new query. This feature frequently comes up on misspellings, but it will also be triggered by other signals like word combinations or location.
  2. Google suggest – As a user types a query, Google will be one step ahead of the user and try to determine what the user is seeking. Naturally, this will push users down certain query funnels that they might not have used if they were left to their own devices. Google suggest is constantly running, and you can see how useful it is just by typing one letter into Google and not hitting enter. The engine for Google suggest comes from real-time queries of other users, not simply a guess at what Google thinks users should be searching. This feature was recently dissected in a Wall Street Journal investigative report which claimed that Google scrubbed Suggest to push people down paths that Google wanted. In my opinion, this is highly unlikely, but the report nonetheless highlights the power of this feature in directing search users.
  3. Related queries – Very similar to Suggest, Google helps people discover new queries that might better help them find what they seek, but instead of doing it in real time, Google just links to other queries that will kick off a new search.
  4. People also ask – This is a new feature in Google’s results which both kicks off a new search and will also (many times) display a featured snippet response to the question. This is a particularly interesting feature in Google search and highlights the answering feature of search that Google might prefer.


In the early days of search and SEO, websites were very focused on ranking at the top of the results page on specific terms which were assumed to have high monthly search volume. Due to the immature (at the time) algorithms of search engines, users had been trained to only use those big head terms if they wanted to find useful results.

With all of Google’s features aimed at getting users to search better, I would argue that the entire idea of a head keyword is obsolete. Generally, super head terms like “hotel,” “car,” “restaurant,” and similar will yield such useless results that Google already modifies the results for these queries based on location. This means that no single website could rank nationally (or globally) on these terms for all searches.

Head search is a waste of time

Additionally, if a user were to search these terms, Google would push them down a more specific path that better matches what they are seeking. I have also noticed that all of these search suggestions are completely personalized based on my past search behavior.

There was a time when Google personalized search results based on a specific user’s past searches, but it deemed that unsuccessful. Instead, Google uses past search behavior to help a user search better.

Here’s an example of personalized “People also ask”. If I search for things to do nearby on a rainy day, Google will help me to refine my query with locations I have actually been.

If I conduct the same query in an incognito window, my suggested questions are completely different.

The same would also apply for Google suggest. Suggested queries will change based on time of day:


And past search behavior: since I had just searched for food, the first suggestions are food related.

I have not seen related queries change that much, but that is likely because they are part of a query set. Once Google pushes a user into one query set, the related queries are already relevant for that query and don’t need any further personalization.

What this all means is that trying to rank on a single popular head term would likely not work out as the website intended. Due to the non-specific nature of their search, the users that might click through on such results would just be tire kickers rather than actual buyers.

Rather than trying to rank on head terms, websites should focus on understanding their users just the same and target the keywords that they would search in reality. A novel concept for sure, targeting users instead of search engines.


SEO is a continuous process

Within large organizations where SEO efforts belong to a dedicated team, there’s a common misconception that SEO is an action that needs to be conducted as a one-time event during a product process. It may be that products “need to be cleared with the SEO team,” or once they are complete they are “sent to SEO,” as if SEO just needs to be sprinkled on like seasoning.

SEO is not marketing

Part of the reason that there are those who think SEO needs to happen at the end of a product building process is that SEO is considered to be marketing instead of product or engineering. Typically, when a product is ready to be launched, it is shipped over to a product marketing team to create a marketing plan, placing responsibility with the marketing team to generate users. While at times product marketers might participate in the product creation phase, it would never occur to anyone to bring traditional marketers into the tent at that time. There’s likely not much that a paid, email, or brand marketer might add to the product plan.

This is not the case at all for SEO. The misconception that SEO should be brought in at the end is founded on a common lack of understanding of what SEO is. Before understanding what SEO is, it’s important to clarify what it is not.

SEO is not magic – If there is no search interest in a particular topic, no amount of SEO can create search volume. There’s also no silver bullet that can ever guarantee a page or site will generate search traffic. Applying SEO processes to something does not equate to generating traffic.

SEO is not a singular task – How to optimize a page or site will vary widely depending on what is being optimized. Therefore, there is no fixed time frame for how much SEO effort there can or should be.

SEO does not operate in a vacuum – The very process of optimizing a page or site for search is not an independent action that can be divorced from everything else that goes into writing content, constructing a site, and laying out a page.

With this in mind, it’s easy to elaborate on what SEO is in the context of building something new. SEO is a process of building in best practices for how a piece of content or website will get the highest visibility and traffic from search engines. These best practices could be anything from researching the best way to word content, to how lengthy content should be, and even the grade level of the content.

These best practices can’t be sprinkled in after the content is already written. Sending something over to another team for an SEO approval after the fact is a recipe for internal conflict. The SEO team will provide recommendations that make the product team that initiated the request feel like the SEO team is creating unnecessary bottlenecks.

On the engineering side, sending something over to the SEO team after the fact is even worse. Finding out that a page or website will not generate any search traffic after hundreds of hours have been invested is not ideal. The engineering team might want to shoot the messenger – the SEO team – but that will not change the reality.


The solution is to incorporate SEO best practices at every stage of any process. If search traffic is at all a priority, knowing the best practices for achieving that traffic is paramount before it’s too late. Some of the decisions I have seen made before anyone knowledgeable about SEO was consulted have either caused expensive redos or have forced the product and engineering teams to accept that they will likely never see search traffic to their pages.

One example was a product intended to drive all of its users from organic search that was built using client-side scripting. It had not occurred to the team to check with anyone knowledgeable on SEO until the product had been completed – one year after it started!

The solutions offered at that time were to rebuild entirely, use a headless browser, or create static versions of some of the content. All of these options were considered too expensive, and the issues were deferred to the beta version of the product. It took six years to revamp the product for SEO! Since the product didn’t generate any search traffic, it was unable to get the engineering resources it needed to be fixed. It was caught in a vicious cycle that would have been avoided had the product been built right from the start.
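For reference, the middle option above (serving crawlers a prerendered snapshot, a pattern often called dynamic rendering) can be sketched in a few lines. This is only an illustration of the idea, not the team's actual fix; the bot markers and file names are hypothetical.

```python
# Sketch of "dynamic rendering": crawlers receive a prerendered HTML
# snapshot, while regular visitors get the client-side application shell.
# The bot list and file names below are illustrative assumptions.

BOT_MARKERS = ("googlebot", "bingbot", "duckduckbot")

def is_crawler(user_agent: str) -> bool:
    """Crude user-agent sniff; real systems use maintained bot lists."""
    ua = user_agent.lower()
    return any(marker in ua for marker in BOT_MARKERS)

def choose_response(user_agent: str) -> str:
    if is_crawler(user_agent):
        return "prerendered.html"  # static snapshot a crawler can index
    return "app.html"              # JavaScript application shell

print(choose_response("Mozilla/5.0 (compatible; Googlebot/2.1)"))
```

Deciding at design time whether a crawler gets indexable HTML is cheap; retrofitting it after launch is what took this team six years.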

Where SEO can help

Some of the areas where the SEO team can provide input early on might be very simple at the outset but complex later on. These are some examples:

URL structure – There are best practices for URLs that can be incorporated as the product is being built, but changing them after launch can be very complicated.
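As one concrete illustration of URL guidance that is cheap at build time, a slug generator can enforce lowercase, hyphen-separated, punctuation-free paths from day one. A minimal sketch (the rules here are common conventions, not an exhaustive standard):

```python
import re

def slugify(title: str) -> str:
    """Turn a page title into a readable URL slug:
    lowercase, hyphen-separated, no punctuation."""
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # collapse runs of other chars
    return slug.strip("-")

print(slugify("10 Best Hotels in New York!"))  # 10-best-hotels-in-new-york
```

Baking a rule like this into the CMS at launch avoids the redirect-mapping exercise that a post-launch URL change requires.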

Content structure – Knowing what content will achieve the most visibility from search before content is written is a lot better than writing content for search that will not generate traffic.

Forecasting – Many times product managers will make unrealistic forecasts for how much search traffic a particular product might capture, but they may lack the knowledge to understand the inputs of their forecast. Partnering with someone who spends their entire working day in search could help them build a more bulletproof estimation of growth potential.
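The inputs of such a forecast can be made explicit. A toy model multiplies monthly search volume by an assumed click-through rate for a target ranking position; the CTR curve below is a hypothetical example, since real curves vary widely by query type:

```python
# Hypothetical CTR-by-position curve (illustrative numbers only).
CTR_BY_POSITION = {1: 0.28, 2: 0.15, 3: 0.10, 5: 0.05, 10: 0.02}

def forecast_monthly_clicks(search_volume: int, position: int) -> int:
    """Estimate clicks as volume x CTR at the assumed ranking position."""
    ctr = CTR_BY_POSITION.get(position, 0.01)
    return round(search_volume * ctr)

# 40,000 monthly searches at an assumed position 3:
print(forecast_monthly_clicks(40_000, 3))  # 4000
```

Debating which inputs are realistic (volume, achievable position, CTR) is exactly where an SEO partner adds value to the forecast.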

Engineering choices – As a product is being scoped, this is the time when engineers weigh in on how they suggest the product be built. If there is an approach that is not friendly to search engines, someone knowledgeable about SEO can flag it for rejection immediately. This will allow the product and engineering teams to focus only on solutions that will achieve their desired objectives.

SEO process

SEO is an optimization and debugging process just like software engineering. The same way a product can’t be built and never debugged, SEO should be a constant at all phases of development. Understanding that SEO is a process containing a touch of engineering, product, and marketing should lead to a different approach to and expectations of SEO.


SEO is not dead nor is it black magic

What is SEO?

When search engines were first unleashed onto the world, early technology adopters quickly realized the tremendous economic windfall that could come from search engines sending traffic to their websites. Initial versions of search engines were essentially online yellow pages, but early web users were infinitely more curious than yellow page readers and were apt to click on lots of results.

The convergence of a growing swath of users and the profit potential gave birth to search engine optimization. At its core, the effort behind search engine optimization was as benign as a Wall Street trader hunting for a trend that will lead to profits. However, unlike stock trading, search engine optimization earned itself a negative reputation, as many times the people on the receiving end of these optimization strategies were regular people just trying to use the Internet.

Someone looking for a vendor on a search engine could potentially land on an unscrupulous website simply because that website employed optimization tactics that allowed them to compete against the legitimate vendors. In this case, the characterization of SEO (search engine optimization) is entirely fair; however, it has been more than a decade since these kinds of operators dominated the web.

Today’s search landscape is dominated almost entirely by Google, which spends its vast artificial intelligence resources neutering illegitimate tactics to win search visibility. SEO in this paradigm is both incredibly different than it was in the early days and vastly more valuable for nearly every website.

SEO of today

The SEO individual or team is responsible for understanding what search engines seek in websites and for translating that knowledge into recommendations and actions for the product managers and engineers creating web interfaces that will be consumed by search engines. In smaller companies, the person who understands SEO might also be the product manager or engineer, but in larger organizations this will be a role of its own.

Paralleled with the misconception that SEO is a dark art is the idea that Google and SEO practitioners are at odds with each other. This may have been somewhat true many years ago, but not now. Today Google relies on SEO practitioners to incorporate the best practices Google needs in order to have the best search engine. Without the conduit of SEO, Google would have to work much harder to index a web that is not in line with the way it crawls.

As an example, Google has repeatedly said for years that it crawls JavaScript. However, Google also wink-nods at the SEO community while sharing that it may not yet crawl JavaScript as efficiently as it would like. As a result, SEO practitioners have done Google the favor of steering web designers and product managers away from pure JavaScript websites. This allows much of the web to still be produced in a way that Google can efficiently index.

The value of SEO

The collective value of all organic traffic in the world is a trillion-dollar-plus opportunity that cannot and should not be neglected. For many websites and products, organic is one of the only ways to generate web traffic – short of building a massive brand that drives direct traffic. Even building a brand is prohibitively expensive and may never be profitable. Organic search efforts, on the other hand, are significantly less costly than any other acquisition channel.

Google is not just going to automatically start sending boatloads of free traffic to a website simply because it exists. One day that may be the case, but we are a long way from that. The solution is SEO.

Using SEO methodology, teams construct a website in the ways that will most effectively maximize its visibility in organic search.

Relying on one or more people to help guide the building of a website that will be best positioned for search traffic will ensure that users are not left on the table. Without SEO, a site would just be relying on dumb luck and Google’s good graces.

SEO in the future

For as long as people use search engines to find information, there will be a need for SEO efforts. Search is very much zero sum, so if one site is getting the click, inevitably another site can’t. One day there may be even more artificial intelligence involved in the search ranking process, which will make optimization even harder, but why would a website give in to the AI? Even in that world, sites will want to understand how the algorithm works and put their best efforts into getting the traffic.

The complaint that SEO is only getting harder is a byproduct of all the AI already included in the algorithm. Whereas early in the history of search and SEO it was somewhat easy to “game” search with creative strategies or budget, AI and better search algorithms have negated these tactics.

Google’s algorithms continue to improve toward an ultimate goal of ranking the web as a human might. SEO becoming more difficult means that the loopholes and hacks that are a feature of software-driven ranking will continue to close as Google becomes smarter. Yet SEO is still necessary, because someone does need to translate the search engine’s desired state into a coherent SEO effort.

SEO will never cease to exist; rather the efforts that make up SEO will change.


SEO can’t be replaced by software


There are several tools that I love using daily, and without them I would not be able to work on any SEO projects. Yet I think tools are just that – tools to help complete a project, not the solution in and of themselves. I believe there are many areas where tools will supplant humans and be able to do a job beginning to end without human intervention, but anything in the realm of marketing, in my opinion, must have human input.

Analogy to construction

It’s best to explain my opinion with an analogy to home construction. While any construction worker would likely disagree that their job could be completely automated, it is not farfetched to see how the robots could take over once each task is broken down. There are devices which hammer in nails automatically, hold walls straight and even follow a schematic to build a frame.

For now, there is no single robot hammer that can crawl across a house frame and know exactly where to hammer in a nail, but years ago there wasn’t a robot that could vacuum a house either. It is entirely possible that one day an inventor will realize that it is cheaper and safer to use robots for monotonous, labor-intensive tasks that can be guided by a repetitive algorithm.

However, there is one area of construction that will never be replaced by a robot, and that is design and architecture. A robot will never be able to understand the human emotions and personal choices that go into deciding how a home should look, where the front door should be, and how big the walk-in closet can be. A robot can certainly build to a blueprint perfectly, but it can’t translate desire into a plan.

Human marketing

I believe this same concept applies to SEO and marketing. One day there may be tools that construct the perfect website based on findings about what works in search, generate the best keyword ideas, and maybe even write the content, but the human element will always be missing. At best, software can mimic what others seem to be doing well, but it would be impossible for it to have a creative idea on how to get ahead. Even further, while software can write content based on keywords, and that content may even appear well in search, it will lack the human emotion necessary to resonate with the humans who need to engage with it.

Automatic SEO

Every once in a while there’s an article about a tool that does SEO “automatically” getting a round of funding. There’s usually some breathless proclamation about how it will disrupt the entire industry, but these articles always neglect to mention the AI factor that already exists. Google is already using AI to understand and rank content; the way to “beat” Google’s AI is not to have a duel with another AI tool but to put a human in the mix.

If all of SEO could be distilled down to doing keyword research and structuring web pages, then SEO could be disrupted by software. However, successful SEO is so much more. An SEO effort includes knowing how to architect a website, the types of content to create, the personas of the potential users, learning from performance to optimize for growth and most of all building a product that resonates with real users. Until we live in a world where robots do all our shopping, none of this could ever be disrupted by software.

In my opinion, all the sites that try to use software only to manage their SEO, leave a gaping hole for a human driven SEO campaign to beat them in search visibility.

Think about all of the most successful sites on the web, and then imagine if it were possible to replicate their success. Could a machine have built Wikipedia? Would automated reviews have helped Yelp, TripAdvisor, and Amazon win their categories? Would Google News be a dominant source of news if all it did was index machine-written content?

I don’t think humans would consider these sites key sources of information if machines had built the strategy, websites, and content. Anyone looking to replicate their success would be better served by finding the smartest humans rather than looking for the next automated shiny object.


SEO should be viewed as a product

One of the reasons a company may leave its SEO potential unfulfilled is because they inadvertently box in the person or people responsible for SEO. They leave the responsible party on their own as an individual contributor forced to go through their manager to get anything done.

This structural flaw exists because SEO is viewed as a marketing function, with tasks structured as campaigns relying on other marketing contributions such as content and design. Technical tasks like building or launching a page happen elsewhere, in another org within the company.

Instead, if SEO were viewed as a product, engineering tasks would be a part of the product roadmap and launch process from the start. Product roles are always reliant on other teams and are inherently cross-functional.

This does not at all need to change the reporting structure of the individual, as in many cases it makes perfect sense for SEO to be on the marketing team. Rather, approaching SEO as a product function helps clarify inputs and outcomes on multiple levels.

Planning – When planning for SEO goals, it is critical that all required resources from other teams be allocated at the exact same time. It wouldn’t make a whole lot of sense to plan to launch a number of pages or a new microsite without pre-allocating the design time and engineering plan. On the product management side, new initiatives are never approved with a hope and a prayer that everything will just work out when the time is right. All products that are prioritized will get the resources to complete the project.

Budgeting – When it comes to budgeting in a marketing plan, SEO usually falls to the bottom of the pile, since the story of investment to output is harder to tell. This means that SEO will get the short end of the stick on hiring, software, and contractors, whereas paid marketing teams might be flush with cash. Thinking of SEO as a product instead realigns the expectations on investment: it is a product that needs investment because it is a priority. Typically, product teams aren’t resourced because they have a direct line to ROI but because they are a business necessity.

Output and reporting – On the same note, when SEO is thought of as marketing, its KPIs need to be similar to other marketing KPIs. Paid teams have LTV goals (hopefully), brand teams have impression share, so SEO ends up being measured on rankings. This is a terrible way of looking at SEO, as rankings are just a vanity metric. Instead, the same way any product is measured by adoption and engagement, the same lens should be applied to SEO. For SEO, this would be measured by growth in impressions on search pages. Obviously clicks are important too, but clicks are a result of on-page structure, which is not necessarily SEO itself.

Resourcing – Making the case to add more headcount for SEO can be very difficult if the metrics for success are too hard to reach or are inappropriate for the channel. Viewing SEO as a product, the primary headcount metric moves from KPI driven to deadline driven. The question that should be asked is: what is the headcount necessary to meet the goal within the desired time frame?

Not much really has to change in reporting, salary, or even titles to make SEO more aligned with product; it is really just an exercise in awareness and management. If the current method for managing SEO is leaving value on the table, it may be helpful to change the structure of how SEO is conducted in a company.


In large companies winning is all about incrementality

The best way to be successful in a process-driven large company is to always keep the focus on small incremental wins. Little wins will ladder up into bigger wins as the small wins begin to add up in the impact they have within a company.

Building a detailed plan for incrementality is far more effective than creating a plan that will never get executed unless there is executive buy-in and a dearth of competing initiatives across the company – which of course there never will be.

Proposing a complete website revamp is a surefire way to land on a backburner in the purgatory of no budget, but a refresh of a particular page is a far easier sell. The page refresh might need to be implemented piecemeal, but at least it’s not a project size that makes stakeholders recoil in fear.

Small wins

When setting these small win targets, it’s really important to make the little wins as small as possible. It could be something like just changing the title of a page, which, surprisingly, can be difficult at a large company. It may even be a smaller goal, like getting buy-in from a cross-functional counterpart – again not a given in a culture focused on individual team goals.

This method of growing by using incremental wins requires acknowledging that, by their very nature, large companies are just very different from smaller organizations. Within small companies with only a handful of employees, culture can be set by founders and a company can be oriented toward results. As an organization grows, process is introduced, which can add levels of complexity to getting things done. Much of the process will be vital for the future success of the company, but inevitably it will also lead to bureaucracy. For a growth-minded product manager or marketer, the bureaucracy can be negated by embracing the process.

Large vs. small comparison

As an example of the key differences between large and small companies, just compare the planning processes. In big organizations, a product plan that is blessed by all stakeholders can take a lot of time to develop; however, for a company of this size, the alternative is untenable. A startup can pivot on a dime, with founders, board members, and internal leaders reorienting teams on a whim, but in a large company behavior like this is what leads to employee attrition. Employees will not feel valued if they put a lot of work into a project and are then directed to a brand-new initiative with no warning.


Things are also very different on the employee side. Employees at smaller companies can have fuzzy titles and responsibilities and goals that change at the speed of the business, while in a large company it is the complete opposite. Titles are typically specific and aren’t changed unless there is a business need. Responsibilities are narrowly defined, and while adjacent responsibilities might be added via projects, big responsibility changes only happen in a job update. Within this environment, quarterly and annual goals are set long in advance, and employees are expected to march toward those goals.

The types of people that end up in a company will have their own biases towards an atmosphere with more or less process. There are even extremists at both ends who could not imagine themselves in the opposite environment: a startup employee who shudders at needing to do the same thing for an entire quarter or the large corporate employee who gets night terrors from the thought of a loosely defined job structure that could change weekly.


There is no right and wrong when it comes to processes, and each company will have its own way of doing things that is right for them.

The gap between freestyle and rigid is very broad, and many companies fall at various places within the spectrum. Companies can even change when something big like a reorganization or layoff forces it. Whatever the culture of the company may be, there will be a way to work within that system to make things happen.

Keep in mind that while it is easier to “get things done” in a smaller company, it is incorrect to just throw up one’s hands at a large company and give up. Initiatives absolutely do get executed at bigger companies; it’s just that the pathway to making things happen is a bit winding, with a lot more rest breaks along the way than at a fast-moving startup.

Realistically, even smaller companies need to walk before they run; it’s just that they are a lot more nimble on their feet.


SEO is top of the funnel and is the assist on a conversion

Within marketing teams, the most attention, both good and bad, is paid to the initiatives that cost significant sums of money. There will be frequent executive check-ins, quarterly reviews, detailed reporting, and of course an attribution system that relies on something a lot more sophisticated than a gut belief. In fact, the entire company-wide attribution system might be tightly tuned to have a deep line of sight into all paid efforts at the expense of other channels.

In this world view, organic search could end up with the short end of the stick, both from a resourcing standpoint and on attribution. Everyone sort of believes that SEO works and is beneficial to the bottom line, but there’s not as strong a drive to understand exactly how the traffic performs. Without accurate reporting, executives and SEO teams could end up falling back on useless metrics like rankings.

Even worse, a natural consequence is that when budgets are tight, the channel that “kind of” works will fall behind the channel or channels with deep visibility. This leaves SEO teams always strapped for resources and scrambling to prove their efforts are worthwhile. In a weird script twist, the paid team only has to defend its budgets, not its jobs, while the SEO team without the budget has more existential issues.

I think the root of this issue comes from a fundamental lack of understanding of where SEO fits in the marketing mix. Unlike a performance channel which is designed to go direct to conversion, SEO is a hybrid between branding and performance traffic. Judging it purely as a brand channel would overlook the tremendous impact it will have on the bottom line, but at the same time it can’t be viewed as just a performance channel.

SEO in the marketing mix?

By its very nature, SEO will typically live a lot higher in the buyer funnel, and in many cases users will not have any buying intent whatsoever. Stepping back from being marketers for a moment and thinking about our own search activities, much of it is just research and curiosity. Queries about weather, information, sports scores, stock prices, and the like have no commercial intent.

On the flip side, organic traffic on the brand name will be a lot lower in the funnel, but to be totally honest, it’s not really even organic traffic. A brand should rank for its own brand name, or something is very wrong.

The real SEO

True SEO efforts will have a site earning significant visibility on the long tail – the types of words it would hardly be profitable to put paid dollars behind, simply because they would take too long to convert. As the user moves down the funnel, their queries will skew closer to head terms, and this is when they might engage with paid advertising.

Once the user gets to the bottom of the funnel and has buyer intent, they are more likely to click a paid ad – either on the brand name or retargeting on another site. A last-click attribution system will then give 100% of the conversion credit to the paid channel and completely discount all the organic clicks that happened over the prior time period.

Organic is an assist

Applying a sports metaphor, that last click might be the basketball slam dunk or the hockey goal, but it was all the prior clicks that set up the perfect sequence for someone to bring the ball or puck home.

In reality, changing attribution systems is complex and unlikely to happen in a short period of time just because someone wants to. However, there is still no excuse for not having a better view on the performance of the organic channel and why to invest more into it. To that end, executives need to be aware of where SEO fits in the funnel and manage expectations accordingly.

To illustrate this with an example, let’s look at someone using search to plan a vacation.

The first query might be very general just to get ideas.

As they move further down the funnel they settle on a place to travel.

Assuming they know the dates they want to travel they start exploring transportation.

They also check out their hotel options.

Throughout this entire process they may have visited many various sites from local chambers of commerce, review sites, hotel sites, online travel agencies and aggregators.

As they finally decide on their options and get any necessary traveling partners on board, they are ready to purchase. They search directly for the site where they found the best deal.

If a paid ad comes up first, so be it; they are clicking. In the last-click attribution world most less sophisticated sites use, all of the credit would have gone to that very last click. The potentially months’ worth of effort spent planning that vacation through various pathways would have fallen by the wayside from an attribution standpoint.

Multi-touch attribution is the goal

The ultimate goal of every site should be to use a multi-touch attribution model, but getting to this ideal is not as simple as changing a t-shirt. There is a significant amount of effort to gather data, build data lakes, test out models and buy the tools necessary to support the process.
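To make the contrast concrete, here is a minimal sketch of how last-click and a simple linear multi-touch model divide credit differently. The conversion path and channel names below are illustrative, not from any real data set:

```python
# Hypothetical comparison of last-click vs. linear multi-touch attribution
# for a single conversion path. Channel names are illustrative.

def last_click(touchpoints):
    """All credit goes to the final touch before conversion."""
    return {touchpoints[-1]: 1.0}

def linear_multi_touch(touchpoints):
    """Credit is split evenly across every touch in the path."""
    share = 1.0 / len(touchpoints)
    credit = {}
    for channel in touchpoints:
        credit[channel] = credit.get(channel, 0.0) + share
    return credit

# Three organic research visits, then a paid brand click at purchase time.
path = ["organic", "organic", "organic", "paid_brand"]
```

Under last-click, the paid brand ad gets 100% of the credit; under the linear model, organic keeps 75% of it, which is much closer to the reality of the research journey described above.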

There may never be a perfect way to attribute organic traffic, but at least with the knowledge of where SEO traffic really fits in the marketing mix, the best integrated marketing strategy can be built. SEO should carry the baton on all the deeper research efforts, but the baton can be passed to performance channel when customers are ready to pull out their credit cards.


Visualize internal linking like an airline route map

Links are a critical part of Google’s ranking algorithms, as a link to a page is a vote of popularity and, at times, contextual relevance. The authority lent by an inbound link doesn’t just apply to external sites linking in; the same applies to internal links (pages within a site) too. A website draws its overall authority score – PageRank, as Google’s ranking patents refer to it – from the sum of the authority of all the sites that link into it.

The best way of explaining this is to use the words from Sergey Brin and Larry Page’s original research:

Academic citation literature has been applied to the web, largely by counting citations or backlinks to a given page. This gives some approximation of a page’s importance or quality. PageRank extends this idea by not counting links from all pages equally, and by normalizing by the number of links on a page. PageRank is defined as follows: We assume page A has pages T1…Tn which point to it (i.e., are citations). The parameter d is a damping factor which can be set between 0 and 1. We usually set d to 0.85. There are more details about d in the next section. Also C(A) is defined as the number of links going out of page A. The PageRank of a page A is given as follows: PR(A) = (1-d) + d (PR(T1)/C(T1) + … + PR(Tn)/C(Tn)) Note that the PageRanks form a probability distribution over web pages, so the sum of all web pages’ PageRanks will be one.

To the layman, this is saying that each page begins with a baseline score, and its final score is a function of the scores of the pages linking into it, with each inbound score normalized by the number of outbound links on the linking page.
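As an illustration, the paper’s formula can be computed iteratively. The three-page site below is a hypothetical example to show the mechanics, not anything from the patent itself:

```python
# Minimal iterative PageRank sketch using the formula from the original paper:
# PR(A) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))
def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links out to."""
    pages = list(links)
    pr = {p: 1.0 for p in pages}  # every page starts with a score of 1
    for _ in range(iterations):
        new_pr = {}
        for page in pages:
            # Sum the score of each page linking in, normalized by the
            # number of outbound links on that linking page.
            inbound = sum(
                pr[src] / len(targets)
                for src, targets in links.items()
                if page in targets
            )
            new_pr[page] = (1 - d) + d * inbound
        pr = new_pr
    return pr

# A hypothetical three-page site: the homepage is the most linked-to page.
site = {
    "home": ["about", "blog"],
    "about": ["home"],
    "blog": ["home", "about"],
}
scores = pagerank(site)
```

Note that in this (non-normalized) formulation the scores sum to the number of pages, and the most linked page – the homepage – ends up with the highest score, which is exactly the dynamic described next.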

In this calculation, the most linked page on a website will tend to be its homepage, which then distributes that authority throughout the rest of the website. Pages that are close to the homepage, or linked frequently from pages that are themselves linked from the homepage, will score higher. In this regard, achieving the right mix via internal linking is critical.

Inbound link authority

Additionally, the homepage will never be the only page that receives authoritative external links, so if an internal page is the recipient of a powerful external link but doesn’t link to other pages, that external link is essentially wasted. When pages link to each other the authority of all external links is funneled around a site to the overall benefit of all pages.

For sites with flat architecture or only a handful of pages, a proper internal link structure is simple and straightforward, but on large sites improving internal links can be as powerful as acquiring authoritative external links. (A large site can even be one that has only a hundred pages.)

Large site challenge

Due to the way many large sites are structured, there are invariably going to be orphaned pages – pages that have few or no links pointing to them. Even a media site like a blog or daily news site with a very clean architecture – each post/article lives under a specific day – will have an internal linking challenge.

More than likely, the site will want organic traffic that isn’t just someone searching for that day’s or recent news. There will be posts that it hopes will remain highly visible many years into the future. Think of the review of a product on its launch day, which stays relevant as long as the product is on the shelf. Or a well-researched piece that explains how something works – the electoral college, as an example. Granted, these posts were published on a certain day, but they are relevant for many queries essentially forever.

Ideal link architecture

As you might imagine, for all sites with this challenge, creating an ideal link architecture that flows links around the site can have a huge impact on overall traffic as these orphaned or weakly linked pages join the internal site link graph. 

How to improve the link graph

Implementing a related-pages algorithm on each page – quite simply, a module with related links – to crosslink other pages can go a long way toward supporting this link flow, but only if the algorithm isn’t tightly tuned to a specific relationship. Sometimes when these algorithms are developed, they key off specific connections between pages, which has the effect of creating heavy internal linking between popular topics while still leaving other pages orphaned or near-orphaned.

There are three possible ways to overcome this effect:

  • Add a set of random links in the algorithm and either hard code these random offerings into the page or refresh the set of random pages whenever the cache updates. Updating on every page load might be resource intensive, so refreshing as infrequently as once a day would achieve the same outcome.
  • In addition to related pages, include a linking module for ‘interesting’ content – driven by pure randomization – also refreshed as in the first recommendation.
  • Include a module on every page for the most recent content, which ensures that older pages link into new pages.
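As a sketch of the first two recommendations, a related-links module might mix topical matches with a few seeded-random picks. Everything here – the function name, parameters, and page names – is illustrative, not a real implementation:

```python
# Hypothetical related-links module that mixes topical matches with a few
# random pages, so weakly linked pages still receive internal links.
import random

def related_links(page, related_by_topic, all_pages,
                  n_related=4, n_random=2, seed=None):
    """Pick topical links plus random ones. Seeding with something like
    today's date means the random set refreshes on a cache update (e.g.
    daily) rather than on every page load, which is cheaper."""
    rng = random.Random(seed)
    # Topical matches first, capped at n_related.
    topical = [p for p in related_by_topic.get(page, []) if p != page][:n_related]
    # Fill the rest from the whole site, excluding the page itself
    # and anything already chosen.
    pool = [p for p in all_pages if p != page and p not in topical]
    randoms = rng.sample(pool, min(n_random, len(pool)))
    return topical + randoms
```

Because the random slots draw from the entire page inventory, even pages with no topical relationship to anything popular eventually get linked into the graph.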

As an aside, I also like to build an HTML sitemap for all large sites, as this gives one place where every single page is linked. If the sitemap is linked in the footer, it achieves the goal of having most pages just one click from the homepage. In full transparency, Google’s John Mueller has suggested that HTML sitemaps aren’t necessary, but I have always found that on large sites they can be very powerful.

Visualizing internal link graphs

To visualize the desired structure of internal linking, I tend to think of a site’s link graph like an airline route map.

Singapore Airlines

The least effective internal link graph looks like the route map of a national carrier for a small country. These air carriers will have a single hub in their capital city and spokes pointing around the world from that hub. Here is the route map for Singapore Airlines, which has impressive reach for a flag carrier; with only a few exceptions, all of its flights terminate in Singapore.

Flipping this visual over to websites, think of the hub as the homepage. The homepage links out to all of the other pages, but very few of the internal pages link to other pages.   

United Airlines

The most common type of link graph looks like the route map of a large global carrier. Look at United Airlines as an example. There are very clear hubs (San Francisco, Los Angeles, Chicago, Newark, Houston, Denver…) and these hubs connect to each other and other smaller satellite cities.


Again, flipping this over to websites, the homepage would be the biggest city on the route map – Newark – which links to all the other big cities in addition to all the hubs. The other hubs would be important category pages which have a lot of inbound links and then link out to all the other smaller pages. In this link graph, important but smaller pages would have only one pathway to them. (As an example, Mumbai is connected only to Newark.)

The most ideal internal link graph looks like the route map of a budget airline that thrives on point-to-point connections. To the bicoastal business traveler this route map makes no sense, but the wandering tourist can get anywhere they need to go as long as they can handle many stopovers. Southwest Airlines is a great example of this structure.

Southwest Airlines

Southwest has such a complicated route map that they don’t even show it on their website. You have to choose a particular city and then see all the places you can fly to directly. There are certainly some more popular cities within their route map, but their direct flights almost seem random. A traveler can fly directly from Cleveland to major travel gateways like Atlanta, Chicago and Dallas, but they can also go to Nashville, St. Louis, Tampa and Milwaukee.

This is how a website should be structured. Pages should link to important pages, but also to other pages that seem random. And those pages should link back to important pages and on to other random pages.


To summarize, think of a search engine crawler passing from one page to another calculating authority as a traveler intent on flying to every city on an airline’s route map without ever needing to go to a single city more than once.

On Singapore Airlines, a traveler could get from Mumbai to Frankfurt via Singapore, but to then get to Paris (without a codeshare) they would need to go back through Singapore.

On United Airlines, a traveler could get from Portland to Dallas via Denver and then could go on to Fort Lauderdale via Houston. They would certainly make it to a number of cities, but at some point they would find themselves connecting through Houston or Denver again.

On Southwest Airlines, a traveler could begin their journey in Boise, Idaho on any one of the ten non-stop flights and make it to nearly every city without ever needing to repeat a city.

Build your internal link architecture like the Southwest Airlines route map and you will never have an orphaned or sub-optimally linked page again.
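The crawler-as-traveler analogy can be made concrete: a breadth-first walk of the internal link graph from the homepage reveals each page’s click depth and exposes any orphans. The tiny site below is hypothetical:

```python
# Sketch of finding orphaned or hard-to-reach pages by walking the internal
# link graph from the homepage, the way a search engine crawler would.
from collections import deque

def crawl_depths(links, start="home"):
    """Breadth-first search from the homepage; returns the click depth of
    every reachable page. Pages missing from the result are orphans."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site: a hub-and-spoke structure with one orphaned page.
site = {
    "home": ["category"],
    "category": ["post-a", "post-b"],
    "post-a": [],
    "post-b": [],
    "orphan": [],  # no page links here, so the crawler never finds it
}
depths = crawl_depths(site)
orphans = set(site) - set(depths)
```

Running a check like this periodically (most crawlers in the tools section below report the same thing) makes it easy to confirm that the Southwest-style cross-linking is actually reaching every page.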


SEO tools: My full list

With the trillion-dollar-plus annual value of SEO traffic, there is no lack of tools that help to understand Google and optimize websites accordingly. This is not meant to be an exhaustive list of tools that can be used to help with SEO; rather, it is a list of tools that I personally use and find useful. I will continuously update this list as I discover more tools.


When working on a large domain, understanding the size of a site and how it might be viewed by Google is critical. A good crawler will crawl a site much the way a search engine would, discovering all of its pages. The crawl should surface any technical SEO issues that exist and should be fixed. For crawling I have a handful of go-to options, each with specific pros and cons.

Screaming Frog’s desktop crawler should be a staple for anyone doing technical SEO. For smaller sites the free version should be perfectly fine, and for larger sites there is a nominal fee to access the pro version. Screaming Frog allows users to crawl a site and then manipulate the data in a spreadsheet, which is, in my opinion, the preferable way to handle large data sets.

My new favorite desktop crawler is SiteBulb which offers many of the same crawling capabilities as Screaming Frog, but also has some amazing visualization tools that negate the need to build charts in Excel. Additionally, it doesn’t have the memory leak issues of Screaming Frog and I have crawled hundreds of thousands of pages without needing to chunk out pieces of the site like I have done with Screaming Frog.

For cloud based crawling, I use both Oncrawl and Deepcrawl. They both offer similar capabilities and the decision over which one to use would be one of personal preference.

Backlink research

I learned SEO by doing backlink research on Yahoo Site Explorer and then replicating my competitors’ backlinks on my own site, so backlink research is an SEO process near and dear to me. Throughout my SEO career I have experimented with nearly every backlink tool, from Moz to Majestic to SEMrush. My current favorite tool for backlink research (and many other functions) is Ahrefs. Again, though, this is a personal preference, as most backlink tools will have similar enough data to act on for a link building campaign.

Competitive Research

Most link building tools will help with basic competitor research, showing the types of links competitors have as well as the keywords they rank for. For more specific competitor research I like to use SimilarWeb, which gives me data on total traffic and the percentage of organic traffic. I also use Alexa.com (yes, it’s still around) to show keyword intersections between websites.

Keyword Research

When I first started in SEO there were only a few keyword research tools, and none were very good. We are lucky that there are now so many options for generating keywords that real users search. I typically try to optimize for users and the queries I need for my sites rather than on search volume alone, but having this data can be very helpful for prioritization.

When I am working with a large site and a big Google Ads spend, I find the data in search query reports from Google Ads and the Google Keyword Planner to be very useful.

I also use the competitive tools mentioned above to find new potential keyword ideas as well as keywords that competitors rank on.

For gathering Google Suggest terms – terms that users actually search, which is why Google suggests them – I use KeywordTool.io. This tool will also offer suggested ideas from other search engines, including YouTube and Amazon.

To build huge lists of keywords, I have used Scrapebox, KeywordSheeter, and Kwfinder.

As people view Google more as a friend than a search engine, they ask more questions within search, and AnswerThePublic is a great tool for getting question-based queries.


International SEO efforts are similar to domestic SEO except that the focus is on another country or language. Most of the SEO tools on this list will work for international SEO once the language or country is changed.

For sites that need to build out hreflang tags, the hreflang tag generator from Aleyda Solis can be very handy.
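For reference, hreflang annotations are reciprocal `<link>` elements in each page’s `<head>`. A minimal sketch for an English and a Spanish version of the same page (the URLs are illustrative):

```html
<!-- Placed in the <head> of both the English and the Spanish page -->
<link rel="alternate" hreflang="en" href="https://example.com/page" />
<link rel="alternate" hreflang="es" href="https://example.com/es/page" />
<!-- x-default tells Google which version to serve for unmatched locales -->
<link rel="alternate" hreflang="x-default" href="https://example.com/page" />
```

Each language version must list all of the alternates, including itself; tags that are not reciprocal are ignored, which is what makes generators useful at scale.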

Site speed

Site speed is a part of the Google ranking algorithm, but not a major part. The main reason anyone should care about the speed of a website is that users will bounce if a site is too slow. For site speed, I usually use multiple tools because they are free and quick, so more data can’t hurt. Pingdom.com gives a waterfall view of how a page loads. GTmetrix.com displays helpful information on what can be fixed to improve a page’s or site’s speed. I also use Google’s PageSpeed tool, which adds some additional info on mobile page speed.

My favorite site speed tool is simply throttling my mobile device to 3G and seeing how fast a webpage loads. If it takes too long, I can assume a real user on a similarly slow connection would bounce.


As I have written previously, I am absolutely not a fan of checking search rankings, because I think it is the wrong metric to look at. However, there are specific use cases where I find rankings very helpful. If a site has made a number of changes, I might download a prior month’s queries and put them into a rankings tool to see if there has been a massive shift from previously reported Google Search Console positions.

The two tools I use for rankings are Link Assistant’s Rank Tracker, which runs ranking queries from my desktop until I get captcha-blocked by Google, and my favorite: Rank Ranger. In addition to rankings, Rank Ranger has a ton of other features, including competitive insights, schema creators, and social analytics.


My go-to SEO tool is Google Search Console, which is free and which everyone should be using, even if they don’t fully trust the data. Google Search Console data can be pivoted in multiple ways to find insights, but the UI is still a bit limited. The way around this is to pull down data via the API. To access the API, I use a Google Sheets plugin called Search Analytics for Sheets. I have found that this tool has had issues recently with maxing out its API calls, so I have had to fall back on building my own lookup tools in R Studio, which you can do too if you follow this guide.

Optimization and testing

Most of my testing and experimentation happens manually, because getting the keys to the kingdom that is the codebase of a website is challenging. If you are able to get access to either Cloudflare or another CDN, you can use tools like Distilled ODN and Clickflow to do SEO learning at scale. Absent that, you should absolutely be testing by making single-variable changes on multiple pages and then recording the clicks/impressions over a long time period to discover whether there are any statistically significant learnings.


For SEO teams that need quick access to tools that can do everything in one place, an enterprise SEO tool can also help with writing bug requests, tracking and dashboards. Searchmetrics is the enterprise SEO tool I have been using for many years, and they now also have tools which help content teams draft content that includes all the related keywords ranking on other sites.

The goal of any SEO tool is to make SEO less manual and more efficient. These are the tools that I use on most SEO projects, and I love that new tools keep being produced to make SEO even easier! If there are any tools I missed or might not have heard about, please let me know.


SEO Personas as the foundation for keyword research

The idea of building elaborate customer personas is very popular in design and various marketing teams, but so few people actually use them in their daily work that the investment in creating them is hardly worth it. Many times, when companies build out these personas, they go overly deep into developing exactly who these customers might be and all of their character traits.

Personas for SEO

Personas might be passé, but when it comes to SEO I think that some sort of persona research must be the foundation of any good keyword research. Too many people begin a process of keyword research by firing up their favorite keyword tool and then picking keywords off the list with high monthly search volume that are relevant to their business.

Starting with keywords sorted by volume puts the emphasis on the wrong metric and leads to creating content that might not match the intent of a user or the needs of a website. In my opinion, it makes the most sense to prioritize exactly the kind of content that is needed to help a website monetize. Some keyword tools might show this after the content has already been written by doing a “keyword gap analysis” vs a competitor, but it is a lot easier to just determine what content is necessary at the outset.

The easiest way to figure out exactly what content is necessary is to go through a persona exercise to understand exactly how, why and what users want from the website. Only once the users’ needs are taken into account does it make sense to distill those topics into precise keywords.

Persona research should answer questions such as where in the buying funnel a user might be. This will guide the depth of content a user expects to see. It is also important to understand the devices that a user will be using to access the website. Is it a desktop? A mobile device? Or maybe the user can be served just with a voice enabled device. Knowing this can quickly help decide whether long form or image heavy content is even appropriate.

Before embarking on this effort, it is worth acknowledging that existing personas likely will not be detailed enough to use for SEO, and it is not a wasted effort to build personas from scratch just for SEO. The current personas a company might be using will have details that are not necessarily helpful for SEO, like age, gender, and career details.

Steps to build SEO personas

With that in mind, here are the best practices on developing personas specifically for SEO.

  • Identify all potential users of a website or product
    1. This is where keyword research as the start of an SEO effort typically falls short. Just because a website or product exists doesn’t mean that users will automatically search for it. Taking a step back to think about who the users of the website might be gives a good foundation for what kind of content and keywords to focus on. For example, an ecommerce website might want to target people who have a specific need, and the focus of SEO should be on solving that need rather than just optimizing the product page. A SaaS product might have a similar phenomenon, and targeting the problem rather than the solution would yield more search traffic.

  • Determine how the users might search based on where they are in the funnel.
    1. Again, traditional keyword research would only identify the popular terms for a vertical but not how the targeted users will search. Users very high in a funnel will be searching for a solution to a problem while users at the very bottom will be looking for the brand plus pricing info.

  • Slot them into the type of content that they might expect.
    1. There is a lot of advice around what kind of content is best for SEO, but none of that advice considers the granular needs of a specific user. If a user wants just a price or a list of features, they will be ill-served by a long form piece of content while a user that wants a detailed product review would similarly not be helped by a quick list of bullets.

  • Match them with a specific call to action that is relevant for the place they are in the buying funnel.
    1. Search traffic is a means to an end and is rarely the end itself. Even on a media site that targets readership, an increasing user count is of no benefit if the users don’t take a follow-on engagement action. Where the users are in a buying funnel should determine the appropriate call to action (CTA) for the content. A reader that is very low in the buying funnel might be looking for a way to contact a salesperson, while a user high in the funnel should be encouraged to just read more or maybe subscribe to a mailing list. When content is written for users rather than keywords, it becomes a lot easier to have a targeted action for users to take.

  • Classify the types of devices they would be using to access the content.
    1. While we constantly hear the refrain that the mobile web is dominant, this is not necessarily carried forward into executing SEO efforts. If it were, long form content would have fallen by the wayside in favor of short, punchy, shareable bits. Even though nearly every web user has a mobile device, there are some things that will always be done on a desktop. Buying business software or expensive shopping is probably going to involve a bigger screen somewhere in the buying cycle. Writing content for where users are in the buying cycle should factor in the screen size they will potentially use to access the content.

  • As a bonus, pigeonhole them into a precise language or culture for internationalized content
    1. One last thing for sites with international audiences: it’s critical to know what language users expect to see content in and whether there are any cultural nuances that should be addressed. What many people who have never done international marketing might not know is that it’s OK to have just English content for an international audience. They might not expect a translated page, so it is better to give them English content that contains the international options they need, like shipping or currency. Understanding the users will prevent a website from creating language-specific content unnecessarily.

With these best practices in mind, hopefully you will be able to develop SEO-specific personas that guide keyword research. Keyword research, like everything in SEO, should be targeted at real users – not search engines – and a persona exercise will go a long way toward knowing who those real users might be.
