Eli Schwartz

SEO

Mobile SEO is just SEO

With the rise of mobile in the collective marketing consciousness, some might think they need a separate mobile SEO strategy. For most sites, this approach is entirely unnecessary and might even force them to split their resources. At its core, any mobile marketing strategy is just a traditional web marketing effort made for a smaller screen.

When it comes to SEO, the reality is the same. Google ranks websites on mobile the same way it does on desktop. The nuances between SEO for desktop and mobile lie in how users interact with search and with websites after they click.

From a search perspective, websites that rank highly for a query on desktop are going to rank similarly on a mobile screen; the downside is that there are fewer results on the screen, meaning the number five slot is essentially like being on page two of results.

Google announced last year that it is using a mobile-first index, which isn't as ominous as it sounds. This just means that Google ranks the content of a website that is visible to a crawler emulating a mobile browser. If a website has content that is only visible to desktop users and the website is included in Google's mobile-first index, that desktop-only content will be invisible to Google.

Google's motivation behind having a mobile-first index is that in a mobile-first world, webmasters should make every effort to make all of their content, or at least their best content, visible to mobile browsers. Google recommends a responsive site that will look and function great in a mobile, tablet or desktop environment. Longer menus should be collapsed rather than hidden completely. Content should be paginated or scaled rather than removed. Although it might take some effort to implement a responsive site, this approach can end up costing significantly less than maintaining both a mobile-only and a desktop-only site, or worse, not having a great mobile experience at all.

In this light, a mobile SEO strategy is really just optimizing content as anyone might on a desktop-only site, while ensuring that the technology serving the content is friendly to mobile users.

User experience optimization

The second part of optimizing for mobile is where the focus is on the user, and this is far more critical. If a significant portion of the users are going to be using mobile devices, the entire layout and content have to be mobile user friendly. This means that buttons have to be easy to tap, images should load quickly, and the page should scale to the size of the screen.

Optimizing for user experience is a key part of any SEO strategy and this should be done for any user regardless of the device they use. In scenarios where there are more mobile than desktop users, optimizing for smaller devices should take precedence.

In this light, there are going to be sites that should not bother with any mobile optimization at all. If the users are primarily on desktop, as with a complex web utility or a B2B tool, it may not be worth the effort and resources to optimize for mobile. That is not to say that mobile should play no role in a website started from scratch; it's more that for an existing website, the tradeoffs of optimizing for mobile may not be worth the expense.

I find it hard to believe that there will be sites that could completely ignore mobile, but if expenses and resources are a concern it is worthwhile to calculate the ROI before making a substantial investment in mobile.

SEO

Voice search is not going to take over text search

Given all the attention voice search gets in the media, you wouldn’t be alone in thinking that traditional search is going to cease to exist in the very near future. This prospect is terrifying for many that rely on organic or even paid search as a primary source of online users and customers.

In my opinion, we will never see a day where search has completely moved to voice only, and there are a number of reasons why.

Financial

First and foremost is the profit motive of Google and other search engines. If Google were to give only a single result in response to a voice query and that result was organic, Google could no longer monetize those queries. From a raw financial perspective, it is unlikely that Google would ever give up any of their juggernaut of paid advertising.

In addition, Google has increasingly moved in the direction of more paid search options, not fewer. Over the last few years, Google has placed more of the search results page layout in the hands of advertisers. Throughout the history of search, and especially paid search, there have rarely been instances of just one paid result for a query. Even if Google were to give a paid response to a voice query (and it's unclear how that might work), that would have to be a single paid advertiser rather than the multiple advertisers shown now. Google would essentially be making the top advertiser the only advertiser.

Auction

This could possibly increase the price per click a site might pay to be that top advertiser, but it could just as likely destroy the auction, as advertisers unwilling to pay for the top spot would not participate in the auction at all.

Second, while there is a lot of pressure in the organic world to obtain a top ranking on a Google query, by no means do the lower-ranking results get zero clicks. There are even clicks on search result pages beyond the first one. This is because search is far from perfect. Even though Google uses artificial intelligence to read minds and understand what a user wants, many times the user doesn't even know what they are looking for.

Refined searches

A user will search, click a result, go back, click another result or even conduct another search in their quest to find the information they seek. The very diversity of multiple results is what helps the user determine the best result. This process cannot ever be reproduced purely by voice, simply because giving a single result to a query would mean that Google would have to know EXACTLY what the user wants.

This is a very easy thing to do when there's only one possible result, as with a query on weather, numbers, directions or the names of sports players. It gets much harder when the results are completely subjective, like finding the best vacation spot, the latest play-by-play of a game or an opinion piece on the news.

Even with full personalization, it is theoretically impossible to know exactly what a user wants unless the user explicitly says what they want. This of course happens at times, but usually that is the final query in a series.

Imagine this query train: “Best hotel in Miami”, “Best Marriott hotel in Miami”, “Marriott hotel in Miami with free parking”, “Marriott hotels with suites and free parking”, “Marriott hotel with suites and free parking that has a lounge” and finally you might get to “address of Marriott Biscayne Bay Miami”.

What you might notice is that all of the queries in that chain have multiple answers, and it would be completely impossible for Google to give just one response.

In the future, we might see voice search prompts after a query is done, but that might only be applicable in a place where a user can't do a full desktop or mobile search, like in a car. More than likely, this whole clarification process will take so long and be so cumbersome that users will prefer a visual search with multiple responses over a smart device that prompts for clarifications.

Essentially, the number one reason that voice search is never going to replace multiple results is that voice must be perfect, and perfect is never possible in our changing world. Perfect will always change as users realize how much information is possible to obtain just by conducting an online search.

Ten years ago, who could have imagined that people would be able to ask their phones to read them a recipe or tell them whether they need to go out with an umbrella? In the future, we may be able to ask our devices if we have the flu based on a number of symptoms, but we are not going to be able to ask for the perfect gift idea for a special someone.

SEO

The head keyword is obsolete

Initially, Google’s core algorithms were focused on ranking its index of websites in an ordered list in relation to a user’s query. As the algorithms matured, Google incorporated artificial intelligence to try to better understand what a user is seeking and to help them search better.

(Tangential note: Google recently announced it has deployed a neural-network-based algorithm called BERT which is focused on better query matching.)

With this goal in mind, Google uses a few very visible tools:

  1. “Did you mean” – When Google believes you meant to search for something other than what you typed, it will suggest another query. Depending on how certain it is of this other query, it might show the new query's results by default or just give a clickable link to run the new query. This feature frequently comes up on misspellings, but it can also be triggered by other signals like word combinations or location.
  2. Google Suggest – As a user types a query, Google will be one step ahead of the user and try to determine what the user is seeking. Naturally, this will push users down certain query funnels that they might not have used if left to their own devices. Google Suggest is constantly running, and you can see how useful it is just by typing one letter into Google and not hitting enter. The engine for Google Suggest comes from real-time queries of other users, not simply a guess at what Google thinks users should be searching. This feature was recently dissected in a Wall Street Journal investigative report which claimed that Google scrubbed Suggest to push people down paths that Google wants. In my opinion, this is highly unlikely, but the claim nonetheless highlights the power of this feature in directing search users.
  3. Related queries – Very similar to Suggest, Google helps people discover new queries that might better help them find what they seek, but instead of doing it in real time, Google just links to other queries that will kick off a new search.
  4. People also ask – This is a new feature in Google's results which both kicks off a new search and will also (many times) display a featured snippet response to the question. It is a particularly interesting feature and highlights the answering function of search that Google might prefer.

Ranking

In the early days of search and SEO, websites were very focused on ranking at the top of the results page on specific terms which were assumed to have high monthly search volume. Due to the immature (at the time) algorithms of search engines, users had been trained to only use those big head terms if they wanted to find useful results.

With all of Google's features aimed at getting users to search better, I would argue that the entire idea of a head keyword is obsolete. Generally, super-head terms like “hotel”, “car”, “restaurant” and similar yield such useless results that Google already modifies the results for these queries based on location. This means that no single website could rank nationally (or globally) on these terms for all searches.

Head search is a waste of time

Additionally, if a user were to search these terms, Google would push them down a more specific path that better matches what they are seeking. I have also noticed that all of these search suggestions are completely personalized based on my past search behavior.

There was a time when Google personalized search results based on a specific user's past searches, but it deemed that unsuccessful. Instead, Google uses past search behavior to help a user search better.

Here's an example of personalized “People also ask”. If I search for things to do nearby on a rainy day, Google will help me refine my query with locations I have actually been to.

If I conduct the same query in an incognito window, my suggested questions are completely different.

The same would also apply to Google Suggest. Suggested queries will change based on time of day, location and past search behavior; since I had just been searching for food, the first suggestions were food related.

I have not seen related queries change that much, but that is likely because they are part of a query set. Once Google pushes a user into one query set, the related queries are already relevant for that query and don't need any further personalization.

What this all means is that trying to rank on a single popular head term would likely not work out as the website intended. Due to the non-specific nature of their search, the users that might click through on such results would just be tire kickers rather than actual buyers.

Rather than trying to rank on head terms, websites should focus on understanding their users just the same and target the keywords that they would search in reality. A novel concept for sure, targeting users instead of search engines.

SEO

SEO is a continuous process

Within large organizations where SEO efforts belong to a dedicated team, there's a common misconception that SEO is an action to be conducted as a one-time event during a product process. It may be that products “need to be cleared with the SEO team” or, once they are complete, they are “sent to SEO” as if SEO just needs to be sprinkled on like seasoning.

SEO is not marketing

Part of the reason some think SEO needs to happen at the end of a product-building process is that SEO is considered to be marketing rather than product or engineering. Typically, when a product is ready to be launched it is shipped over to a product marketing team to create a marketing plan, which then places responsibility with the marketing team to generate users. While product marketers might at times participate in the product creation phase, it would never occur to anyone to bring traditional marketers into the tent at that point. There's likely not much that a paid, email, or brand marketer might add to the product plan.

This is not the case at all for SEO. The misconception that SEO comes in at the end is founded on a common lack of understanding of what SEO is. Before understanding what SEO is, it's important to clarify what it is not.

SEO is not magic – if there is no search interest in a particular topic, no amount of SEO can create search volume. There's also no silver bullet that guarantees a page or site will generate search traffic. Applying SEO processes to something does not equate to generating traffic.

SEO is not a singular task – How to optimize a page or site varies widely depending on what is being optimized. Therefore, there is no fixed time frame for how much SEO effort there can or should be.

SEO does not operate in a vacuum – The very process of optimizing a page or site for search is not an independent action that can be divorced from everything else that goes into writing content, constructing a site and laying out a page.

With this in mind, it's easy to elaborate what SEO is in the context of building something new. SEO is a process of building in best practices for how a piece of content or website will get the highest visibility and traffic from search engines. These best practices could be anything from researching the best way to word content, to how lengthy content should be, to even the grade level of the content.

These best practices can't be sprinkled in after the content is already written. Sending something over to another team for an SEO approval after the fact is a recipe for internal conflict. The SEO team will provide recommendations that make the product team who initiated the request feel like the SEO team is creating unnecessary bottlenecks.

On the engineering side, sending something over to the SEO team after the fact is even worse. Finding out that a page or website will not generate any search traffic after hundreds of hours have been invested is not ideal. The engineering team might want to shoot the messenger – the SEO team – but that will not change the reality.

Solution

The solution is to incorporate SEO best practices at every stage of any process. If search traffic is at all a priority, knowing the best practices for achieving that traffic should be paramount to include before it's too late. Some of the decisions I have seen made before the SEO team even knew about the product have either caused expensive redos or forced the product and engineering teams to accept that they will likely never see search traffic to their pages.

One example was a product intended to drive all of its users from organic search that was built entirely with client-side scripting. It had not occurred to the team to check with anyone knowledgeable about SEO until the product had been completed – one year after it started!

The solutions offered at that time were to rebuild entirely, use a headless browser to prerender pages, or create static versions of some of the content. All of these options were considered too expensive, to be fixed instead in the beta version of the product. It took six years to revamp the product for SEO! Since the product didn't generate any search traffic, it was unable to get the engineering resources it needed to fix the problem. It was caught in a vicious cycle that would have been avoided had the product been built right from the start.
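As a rough illustration of the "static versions" option, a sketch like the following renders the data a client-side app would normally fetch into plain HTML that a crawler can index without executing JavaScript. All names here are hypothetical, for illustration only, and are not from the actual product described above:

```python
# Sketch of "static snapshot" prerendering: render the data that a
# client-side app would fetch into plain HTML a crawler can index.
# All names are hypothetical and for illustration only.
from string import Template

PAGE_TEMPLATE = Template(
    "<html><head><title>$title</title>"
    '<meta name="description" content="$description"></head>'
    "<body><h1>$title</h1><p>$body</p></body></html>"
)

def prerender(page: dict) -> str:
    """Return a static HTML snapshot for one page's worth of data."""
    return PAGE_TEMPLATE.substitute(
        title=page["title"],
        description=page["description"],
        body=page["body"],
    )

snapshot = prerender({
    "title": "Example Product",
    "description": "A crawlable, static version of a JS-rendered page.",
    "body": "Content that would otherwise only exist after scripts run.",
})
# The content now exists in the raw HTML, before any JavaScript runs
print("<h1>Example Product</h1>" in snapshot)
```

Building this in from the start is trivial; retrofitting it onto a finished client-rendered product is exactly the kind of expensive redo described above.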

Where SEO can help

Some of the areas where the SEO team can provide input early on might be very simple at the outset but complex later on. These are some examples:

URL structure – There are best practices on URLs that can be incorporated as the product is being built, but changing them after launch can be very complicated.

Content structure – Knowing what content will achieve the most visibility from search before content is written is a lot better than writing content for search that will not generate traffic.

Forecasting – Many times product managers will make unrealistic forecasts of how much search traffic a particular product might capture, but they may lack the knowledge to understand the inputs of their forecast. Partnering with someone who spends their entire working day in search can help them build a more defensible estimate of growth potential.

Engineering choices – As a product is being scoped, this is the time when engineers weigh in on how they suggest the product be built. If an approach is not friendly to search engines, someone knowledgeable about SEO can flag it for rejection immediately. This allows the product and engineering teams to focus only on solutions that will achieve their desired objectives.
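To make the URL-structure point concrete, here is a minimal sketch of the kind of conventions an SEO reviewer might push for early on. The rules shown (lowercase paths, hyphenated slugs, no session parameters) are generic conventions used as an assumption here, not an official checklist:

```python
# Sketch of generic URL conventions applied while a product is being
# built: readable hyphenated slugs and one canonical URL per page.
import re
from urllib.parse import urlsplit, urlunsplit

def normalize_path(title: str) -> str:
    """Turn a page title into a clean, readable URL slug."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    return f"/{slug}/"

def strip_tracking(url: str) -> str:
    """Drop the query string so crawlers see one canonical URL."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path.lower(), "", ""))

print(normalize_path("Marriott Hotels in Miami!"))
# /marriott-hotels-in-miami/
print(strip_tracking("https://example.com/Page?sessionid=123"))
# https://example.com/page
```

Agreeing on rules like these before launch is a one-line code review comment; migrating thousands of live, indexed URLs afterwards means redirects and lost equity.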

SEO process

SEO is an optimization and debugging process, just like software engineering. Just as a product can't be built and never debugged, SEO should be a constant at all phases of development. Understanding that SEO is a process containing a touch of engineering, product and marketing should lead to a different approach to, and different expectations of, SEO.

SEO

SEO is not dead nor is it black magic

What is SEO?

When search engines were first unleashed on the world, early technology adopters quickly realized the tremendous economic windfall that could come from search engines sending traffic to their websites. Initial versions of search engines were essentially online yellow pages, but early web users were infinitely more curious than yellow-page readers and were apt to click on lots of results.

The convergence of a growing swath of users and the profit potential gave birth to search engine optimization. At its core, the effort behind search engine optimization was as benign as a Wall Street trader hunting for a trend that will lead to profits. However, unlike stock trading, search engine optimization earned itself a negative reputation, as many times the people on the receiving end of these optimization strategies were regular people just trying to use the Internet.

Someone looking for a vendor on a search engine could potentially land on an unscrupulous website simply because that website employed optimization tactics that allowed it to compete against legitimate vendors. In that era, the negative characterization of SEO was entirely fair; however, it has been more than a decade since these kinds of operators dominated the web.

Today's search landscape is dominated almost entirely by Google, which spends its vast artificial intelligence resources on neutering illegitimate tactics to win search visibility. SEO in this paradigm is both incredibly different than it was in the early days and vastly more valuable for nearly every website.

SEO of today

The SEO individual or team is responsible for understanding what search engines seek in websites and for translating that knowledge into recommendations and actions for the product managers and engineers creating web interfaces that will be consumed by search engines. In smaller companies, the person who understands SEO might also be the product manager or engineer, but in larger organizations this will be a role of its own.

Paralleling the misconception that SEO is a dark art is the idea that Google and SEO practitioners are at odds with each other. This may have been somewhat true many years ago, but not now. Today Google relies on SEO practitioners to incorporate the best practices Google needs in order to have the best search engine. Without the conduit of SEO, Google would have to work much harder to index a web that is not in line with the way it crawls.

As an example, Google has repeatedly said for years that it crawls JavaScript. However, it also wink-nods at the SEO community while sharing that it may not yet crawl JavaScript as efficiently as it would like. As a result, SEO practitioners have done Google the favor of steering web designers and product managers away from pure JavaScript websites. This allows much of the web to still be produced in a way that Google can efficiently index.

The value of SEO

The collective value of all organic traffic in the world is a trillion-dollar-plus opportunity that cannot and should not be neglected. For many websites and products, organic is one of the only ways to generate web traffic, short of building a massive brand that drives direct traffic. Even building a brand is prohibitively expensive and may never be profitable. Organic search efforts, on the other hand, are significantly less costly than any other acquisition channel.

Google is not just going to automatically start sending boatloads of free traffic to a website just because it exists. One day that may be the case, but we are a long way from it. The solution is SEO.

Using SEO methodology, site owners construct their websites in the ways that most effectively maximize visibility in organic search.

Relying on someone (or several people) to help guide the building of a website that will be best positioned for search traffic ensures that users are not left on the table. Without SEO, a site would just be relying on dumb luck and Google's good graces.

SEO in the future

For as long as people use search engines to find information, there will be a need for SEO efforts. Search is very much zero-sum, so if one site is getting the click, inevitably another site can't. One day there may be even more artificial intelligence involved in the search ranking process, which will make optimization even harder, but why would a website simply give in to the AI? Even in that world, sites will want to understand how the algorithm works and put their best efforts into getting the traffic.

The complaint that SEO is only getting harder is a byproduct of all the AI already included in the algorithm. Whereas early in the history of search and SEO it was somewhat easy to “game” search with creative strategies or budget, AI and better search algorithms have negated these tactics.

Google's algorithms continue to improve toward an ultimate goal of ranking the web as a human might. SEO becoming more difficult means that the loopholes and hacks that are a feature of software-driven ranking will continue to close as Google becomes smarter. Yet SEO is still necessary, because someone does need to translate the search engine's desired state into a coherent SEO effort.

SEO will never cease to exist; rather the efforts that make up SEO will change.

SEO

SEO can’t be replaced by software

There are several tools that I love using daily, and without them I would not be able to work on any SEO projects. Yet I think tools are just that – tools to help complete a project; they are not the solution in and of themselves. I believe there are many areas where tools will supplant humans and be able to do a job beginning to end without human intervention, but anything in the realm of marketing, in my opinion, must have human input.

Analogy to construction

It's best to explain my opinion with an analogy to home construction. While any construction worker would likely disagree that their job could be completely automated, it is not farfetched to see how robots could take over once each task is broken down. There are devices which hammer in nails automatically, hold walls straight and even follow a schematic to build a frame.

For now, there is no single robot hammer that can crawl across a house frame and know exactly where to hammer in a nail, but years ago there wasn't a robot that could vacuum a house either. It is entirely possible that one day an inventor will realize that it is cheaper and safer to use robots for monotonous, labor-intensive tasks that can be guided by a repetitive algorithm.

However, there is one area of construction that will never be replaced by a robot, and that is design and architecture. A robot will never be able to understand the human emotions and personal choices that go into deciding how a home should look, where the front door should be and how big the walk-in closet can be. A robot can certainly build to a blueprint's spec perfectly, but it can't translate desire into a plan.

Human marketing

I believe this same concept applies to SEO and marketing. One day there may be tools that construct the perfect website based on findings about what works in search, generate the best keyword ideas and maybe even write the content, but the human element will always be missing. At best, software can mimic what others seem to be doing well, but it would be impossible for it to have a creative idea on how to get ahead. Further, while software can write content based on keywords that may even appear to do well in search, it will lack the human emotion necessary to resonate with the humans who need to engage with that content.

Automatic SEO

Every once in a while there's an article about a tool that does SEO “automatically” getting a round of funding. There's usually some breathless proclamation about how it will disrupt the entire industry, but these articles always neglect to mention the AI factor that already exists. Google is already using AI to understand and rank content; the way to “beat” Google's AI is not to have a duel with another AI tool but to put a human in the mix.

If all of SEO could be distilled down to doing keyword research and structuring web pages, then SEO could be disrupted by software. However, successful SEO is so much more. An SEO effort includes knowing how to architect a website, the types of content to create, the personas of the potential users, learning from performance to optimize for growth and, most of all, building a product that resonates with real users. Until we live in a world where robots do all our shopping, none of this can be disrupted by software.

In my opinion, all the sites that try to use software alone to manage their SEO leave a gaping hole for a human-driven SEO campaign to beat them in search visibility.

Think about all of the most successful sites on the web, and then imagine if it were possible to replicate their success. Could a machine have built Wikipedia? Would automated reviews have helped Yelp, TripAdvisor and Amazon win their categories? Would Google News be a dominant source of news if all it did was index machine-written content?

I don't think humans would consider these sites key sources of information if machines had built the strategy, websites and content. Anyone looking to replicate their success would be better served by finding the smartest humans rather than looking for the next automated shiny object.

SEO

SEO should be viewed as a product

One of the reasons a company may leave its SEO potential unfulfilled is because they inadvertently box in the person or people responsible for SEO. They leave the responsible party on their own as an individual contributor forced to go through their manager to get anything done.

This flaw in structure exists because SEO is viewed as a marketing function, with tasks structured as campaigns relying on other marketing contributions such as content and design. Technical tasks like building or launching a page happen somewhere else, in another org within the company.

If SEO were instead viewed as a product, engineering tasks would be part of the product roadmap and launch process from the start. Product roles are always reliant on other teams and are inherently cross-functional.

This does not at all need to change the reporting structure of the individual, as in many cases it makes perfect sense for SEO to be on the marketing team. Rather, approaching SEO as a product function helps clarify inputs and outcomes on multiple levels.

Planning – When planning for SEO goals, it is critical that all required resources from other teams be allocated at the same time. It wouldn't make much sense to plan to launch a number of pages or a new microsite without pre-allocating the design time and engineering plan. On the product management side, new initiatives are never approved with a hope and a prayer that everything will just work out when the time is right. All products that are prioritized will get the resources to complete the project.

Budgeting – When it comes to budgeting in a marketing plan, SEO usually falls to the bottom of the pile since the story connecting investment to output is harder to tell. This means that SEO gets the short end of the stick on hiring, software and contractors while paid marketing teams might be flush with cash. Thinking of SEO as a product instead realigns expectations on investment: it is a product that needs investment because it is a priority. Typically, product teams aren't resourced because they have a direct line to ROI but because the product is a business necessity.

Output and reporting – On the same note, when SEO is thought of as marketing, its KPIs need to be similar to other marketing KPIs. Paid teams have LTV goals (hopefully), brand teams have impression share, so SEO ends up being measured on rankings. This is a terrible way of looking at SEO, as rankings are just a vanity metric. Instead, the same way any product is measured by adoption and engagement, the same lens should be applied to SEO. For SEO, this would be measured by growth in impressions on search pages. Obviously clicks are important too, but clicks are a result of on-page structure, which is not necessarily SEO itself.
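As a toy illustration of that product-style metric, impressions growth is just a period-over-period comparison; the numbers below are made up for the example:

```python
# Toy illustration of a product-style SEO metric: period-over-period
# growth in search impressions rather than a rankings snapshot.
def impressions_growth(previous: int, current: int) -> float:
    """Fractional growth in impressions between two reporting periods."""
    return (current - previous) / previous

# e.g. 120,000 impressions last quarter vs 150,000 this quarter
growth = impressions_growth(120_000, 150_000)
print(f"{growth:+.0%}")  # +25%
```

A trend like this reads the same way a product team reads adoption: direction and rate, not position on a leaderboard.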

Resourcing – Making the case to add more headcount for SEO can be very difficult if the metrics for success are too hard to reach or inappropriate for the channel. Viewing SEO as a product, the primary headcount metric moves from KPI-driven to deadline-driven. The question that should be asked is: what is the headcount necessary to meet the goal within the desired time frame?

Not much really has to change in reporting, salary, or even titles to make SEO more aligned with product; it is really just an exercise in awareness and management. If the current method for managing SEO is leaving value on the table, it may be helpful to change the structure of how SEO is conducted in a company.

SEO

In large companies winning is all about incrementality

The best way to be successful in a process-driven large company is to always keep the focus on small incremental wins. Little wins ladder up into bigger wins as their impact within the company begins to add up.

Building a detailed plan for incrementality is far more effective than creating a plan that will never be executed unless there is executive buy-in and a dearth of competing initiatives across the company – which of course there never will be.

Proposing a complete website revamp is a surefire way to land on the back burner in the purgatory of no budget, but a refresh of a particular page is a far easier sell. The page refresh might need to be implemented piecemeal, but at least it's not a project whose size makes stakeholders recoil in fear.

Small wins

When setting these small-win targets, it's really important to make the little wins as small as possible. It could be something like just changing the title of a page, which surprisingly can be difficult at a large company. It may even be a smaller goal, like getting buy-in from a cross-functional counterpart – again, not a given in a culture focused on individual team goals.

This method of growing through incremental wins requires acknowledging that, by their very nature, large companies are just very different from smaller organizations. Within small companies with only a handful of employees, culture can be set by founders and a company can be oriented towards results. As an organization grows, process is introduced, which adds levels of complexity to getting things done. Much of the process will be vital to the future success of the company, but inevitably it will also lead to bureaucracy. For a growth-minded product manager or marketer, the bureaucracy can be negated by embracing the process.

Big vs. small comparison

As an example of the key differences between large and small companies, just compare the planning processes. In big organizations, a product plan that is blessed by all stakeholders can take a lot of time to develop; however, for a company of this size, the alternative is untenable. A startup can pivot on a dime, with founders, board members, and internal leaders reorienting teams on a whim, but in a large company this behavior is what leads to employee attrition. Employees will not feel valued if they put a lot of work into a project and are then directed onto a brand new initiative with no warning.

People

Things are also very different on the employee side. Employees at smaller companies can have fuzzy titles, responsibilities, and goals that change at the speed of the business, while in a large company it is the complete opposite. Titles are typically specific and aren't changed unless there is a business need. Responsibilities are narrowly defined, and while adjacent responsibilities might be added via projects, big responsibility changes only happen with a job change. Within this environment, quarterly and annual goals are set long in advance, and employees are expected to march towards those goals.

The types of people that end up in a company will have their own biases towards an atmosphere with more or less process. There are even extremists at both ends who could not imagine themselves in the opposite environment: a startup employee who shudders at needing to do the same thing for an entire quarter or the large corporate employee who gets night terrors from the thought of a loosely defined job structure that could change weekly.

Summary

There is no right and wrong when it comes to processes, and each company will have its own way of doing things that is right for it.

The gap between freestyle and rigid is very broad, and many companies will fall at various places along the spectrum. Companies can even change when something big like a reorganization or layoff forces it. Whatever the culture of the company may be, there will be a way to work within that system to make things happen.

Keep in mind that while it is easier to "get things done" in a smaller company, it is incorrect to just throw up one's hands at a large company and give up. Initiatives absolutely do get executed at bigger companies; it's just that the pathway to making things happen is a bit more winding, and there are a lot more rest breaks along the way than at a fast-moving startup.

Realistically, even smaller companies need to walk before they run; it's just that they are a lot more nimble on their feet.

SEO

SEO is top of the funnel and is the assist on a conversion

Within marketing teams, the most attention – both good and bad – is paid to the initiatives that cost significant sums of money. There will be frequent executive check-ins, quarterly reviews, detailed reporting, and of course an attribution system that relies on something a lot more sophisticated than a gut belief. In fact, the entire company-wide attribution system might be tightly tuned to have a deep line of sight into all paid efforts at the expense of other channels.

In this worldview, organic search channels could end up with the short end of the stick, both from a resourcing standpoint and on attribution. Everyone sort of believes that SEO works and is beneficial to the bottom line, but there's not as strong a drive to understand exactly how the traffic performs. Without accurate reporting, executives and SEO teams could end up falling back on useless metrics like rankings.

Even worse, a natural consequence is that when budgets are tight, the channel that "kind of" works will fall behind the channel or channels with deep visibility. This leaves SEO teams always strapped for resources and scrambling to prove their efforts are worthwhile. In a weird script twist, the paid team only has to defend its budgets, not its jobs, while the SEO team without the budget has more existential issues.

I think the root of this issue comes from a fundamental lack of understanding of where SEO fits in the marketing mix. Unlike a performance channel which is designed to go direct to conversion, SEO is a hybrid between branding and performance traffic. Judging it purely as a brand channel would overlook the tremendous impact it will have on the bottom line, but at the same time it can’t be viewed as just a performance channel.

SEO in the marketing mix

By its very nature, SEO will typically live a lot higher in the buyer funnel, and in many cases users will not have any buying intent whatsoever. Stepping back from being marketers for a moment and thinking about our own search activities, much of it is just research and curiosity. Queries about weather, information, sports scores, stock prices, and the like have no commercial intent.

On the flip side, organic traffic on the brand name will be a lot lower in the funnel, but to be totally honest, it's not really even organic traffic. A brand should rank for its own brand name, or something is very wrong.

The real SEO

True SEO efforts will have a site earning significant visibility on the long tail – the types of words it would hardly be profitable to put paid dollars behind, simply because they would take too long to convert. As the user moves down the funnel, their queries will skew closer to head terms, and this is when they might engage with paid advertising.

Once the user gets to the bottom of the funnel and has buyer intent, they are more likely to click a paid ad – either on the brand name or via retargeting on another site. A last-click-driven attribution system will then give 100% of the conversion credit to the paid channel and completely discount all the organic clicks that happened over the prior time period.

Organic is an assist

Applying this to a sports metaphor, that last click might be the basketball slam dunk or the hockey goal, but it was all the other prior clicks that set up the perfect sequence for someone to bring the ball or puck home.

In reality, changing attribution systems is complex and unlikely to happen in a short period of time just because someone wants to. However, there is still no excuse for not having a better view on the performance of the organic channel and why to invest more into it. To that end, executives need to be aware of where SEO fits in the funnel and manage expectations accordingly.

To illustrate this with an example, let’s look at someone using search to plan a vacation.

The first query might be very general just to get ideas.

As they move further down the funnel they settle on a place to travel.

Assuming they know the dates they want to travel they start exploring transportation.

They also check out their hotel options.

Throughout this entire process they may have visited many various sites from local chambers of commerce, review sites, hotel sites, online travel agencies and aggregators.

As they finally decide on their options and get any necessary traveling partners on board, they are ready to purchase. They search directly for the site where they found the best deal.

If a paid ad comes up first, so be it; they are clicking. In the last-click attribution world most less sophisticated sites use, all of the credit would have gone to that very last click. The potentially months' worth of effort on planning that vacation through various pathways would have fallen by the wayside from an attribution standpoint.

Multi-touch attribution is the goal

The ultimate goal of every site should be to use a multi-touch attribution model, but getting to this ideal is not as simple as changing a t-shirt. There is a significant amount of effort to gather data, build data lakes, test out models and buy the tools necessary to support the process.
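To make the contrast concrete, here is a minimal sketch comparing last-click attribution with a simple linear multi-touch model for a single conversion. The journey and the booking value are hypothetical, mirroring the vacation-planning example above; real multi-touch systems are far more involved, as the paragraph above notes.

```python
# A minimal sketch contrasting last-click and linear multi-touch
# attribution for one conversion. The journey below is a hypothetical
# vacation-planning path: four organic research clicks, one paid click.

journey = ["organic", "organic", "organic", "organic", "paid"]
conversion_value = 500.00  # hypothetical booking value

def last_click(touches, value):
    # All credit goes to the final touchpoint.
    credit = {channel: 0.0 for channel in touches}
    credit[touches[-1]] = value
    return credit

def linear_multi_touch(touches, value):
    # Credit is split evenly across every touchpoint in the journey.
    share = value / len(touches)
    credit = {channel: 0.0 for channel in set(touches)}
    for channel in touches:
        credit[channel] += share
    return credit

print(last_click(journey, conversion_value))          # paid gets all $500
print(linear_multi_touch(journey, conversion_value))  # organic gets $400
```

Even this crude even-split model surfaces the organic assists that last-click attribution erases entirely.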

There may never be a perfect way to attribute organic traffic, but at least with the knowledge of where SEO traffic really fits in the marketing mix, the best integrated marketing strategy can be built. SEO should carry the baton on all the deeper research efforts, but the baton can be passed to the performance channel when customers are ready to pull out their credit cards.

SEO

Visualize internal linking like an airline route map

Links are a critical part of Google's ranking algorithms, as a link to a page is a vote of popularity and, at times, contextual relevance. The authority lent by an inbound link doesn't come only from external sites linking in; the same applies to internal links (pages within a site) too. A website draws its overall authority score – PageRank, as Google's ranking patents refer to it – from the sum of the authority of all the sites that link into it.

The best way of explaining this is to use the words from Sergey Brin and Larry Page’s original research:

Academic citation literature has been applied to the web, largely by counting citations or backlinks to a given page. This gives some approximation of a page’s importance or quality. PageRank extends this idea by not counting links from all pages equally, and by normalizing by the number of links on a page. PageRank is defined as follows: We assume page A has pages T1…Tn which point to it (i.e., are citations). The parameter d is a damping factor which can be set between 0 and 1. We usually set d to 0.85. There are more details about d in the next section. Also C(A) is defined as the number of links going out of page A. The PageRank of a page A is given as follows: PR(A) = (1-d) + d (PR(T1)/C(T1) + … + PR(Tn)/C(Tn)) Note that the PageRanks form a probability distribution over web pages, so the sum of all web pages’ PageRanks will be one.

To the layman, this is saying that each page starts with a base score, but its final score is a function of the scores of the pages that link into it, with each inbound page's score divided by the number of links going out of that page.
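The quoted formula can be computed iteratively. Below is a minimal sketch on a hypothetical four-page site (the page names and link graph are illustrative only); it ignores refinements like the normalization that makes all PageRanks sum to one, but it follows the formula term by term.

```python
# A minimal sketch of the PageRank calculation from the quoted formula,
# run on a hypothetical four-page site.

DAMPING = 0.85  # the "d" parameter from the paper

# Internal link graph: page -> pages it links out to (hypothetical)
links = {
    "home":    ["about", "blog", "contact"],
    "about":   ["home"],
    "blog":    ["home", "about"],
    "contact": [],  # no outbound links: its authority flows nowhere
}

def pagerank(links, damping=DAMPING, iterations=50):
    # Every page starts with a score of 1, then scores are refined
    # iteratively: PR(A) = (1 - d) + d * sum(PR(T)/C(T)) over inbound T,
    # where C(T) is the number of links going out of page T.
    scores = {page: 1.0 for page in links}
    for _ in range(iterations):
        new_scores = {}
        for page in links:
            inbound = (
                scores[src] / len(outs)
                for src, outs in links.items()
                if page in outs
            )
            new_scores[page] = (1 - damping) + damping * sum(inbound)
        scores = new_scores
    return scores

ranks = pagerank(links)
# The homepage ends up with the highest score because every other
# page links back to it.
print(sorted(ranks, key=ranks.get, reverse=True))
```

Running this, "home" converges to the top score, which previews the point made below: the most linked page on a site tends to be its homepage.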

In this calculation, the most linked page on a website will tend to be its homepage, which then distributes that authority throughout the rest of the website. Pages that are close to the homepage, or linked frequently from pages linked from the homepage, will score higher. In this regard, achieving the right mix via internal linking is critical.

Inbound link authority

Additionally, the homepage will never be the only page that receives authoritative external links, so if an internal page is the recipient of a powerful external link but doesn’t link to other pages, that external link is essentially wasted. When pages link to each other the authority of all external links is funneled around a site to the overall benefit of all pages.

For sites with flat architecture or only a handful of pages, a proper internal link structure is simple and straightforward, but on large sites, improving internal links can be as powerful as acquiring authoritative external links. (A large site can even be one with only one hundred pages.)

Large site challenge

Due to the nature of how many large sites are structured, there are invariably going to be orphaned pages – defined as pages that don't have any (or many) links pointing into them. Even a media site like a blog or daily news site with a very clean architecture – each post/article lives under a specific day – will have an internal linking challenge.

More than likely, the site will want organic traffic that isn't just someone searching for that day's or recent news. There will be posts they might hope will be highly visible many years into the future. Think of the review of a product on its launch day, which is relevant as long as the product is on the shelf. Or a well-researched piece that explains how something works – the electoral college, for example. Granted, these posts were published on a certain day, but they are relevant for many queries essentially forever.
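Finding these orphaned or weakly linked pages is straightforward once the internal link graph is in hand. Below is a minimal sketch of flagging them from a page-to-outbound-links mapping, as might be exported from a site crawl; the URLs and graph are hypothetical.

```python
# A minimal sketch of flagging orphaned or weakly linked pages from an
# internal link graph (page -> outbound internal links). The graph
# below is hypothetical, modeled on a small blog/review site.

links = {
    "/": ["/blog", "/reviews"],
    "/blog": ["/", "/blog/2020/01/05/news"],
    "/reviews": ["/"],
    "/blog/2020/01/05/news": ["/"],
    "/reviews/product-x": ["/"],  # evergreen review; nothing links in
}

def weakly_linked(links, threshold=1):
    # Count inbound links per page, then flag anything at or below the
    # threshold (0 inbound links means a true orphan).
    inbound = {page: 0 for page in links}
    for outs in links.values():
        for target in outs:
            if target in inbound:
                inbound[target] += 1
    return sorted(page for page, n in inbound.items() if n <= threshold)

print(weakly_linked(links))  # "/reviews/product-x" is among the flagged pages
```

With `threshold=0`, only true orphans are returned; a higher threshold surfaces the weakly linked pages that deserve more internal links.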

Ideal link architecture

As you might imagine, for all sites with this challenge, creating an ideal link architecture that flows links around the site can have a huge impact on overall traffic, as these orphaned or weakly linked pages join the internal site link graph.

How to improve the link graph

Implementing related-page algorithms on each page – quite simply, a module with related links – to crosslink other pages can go a long way toward supporting this link flow, but only if the algorithm isn't tightly tuned to a specific relationship. Sometimes when these algorithms are developed, they key off specific connections between pages, which has the effect of creating heavy internal linking between popular topics while still leaving pages orphaned or near-orphaned.

There are three possible ways to overcome this effect:

  • Add a set of random links in the algorithm, and either hard-code these random offerings into the page or refresh the set of random pages whenever the cache updates. Updating on every page load might be resource intensive, so refreshing as infrequently as once a day would achieve the same outcome.
  • In addition to related pages, include a linking module for ‘interesting’ content – driven by pure randomization – also refreshed as in the first recommendation.
  • Include a module on every page for the most recent content, which ensures that older pages link into new pages.

As an aside, I also like to build an HTML sitemap for all large sites, as this gives one place where every single page is linked. If the sitemap is linked in the footer, it achieves the goal of having most pages just one click from the homepage. Transparently, Google’s John Mueller has suggested that HTML sitemaps aren’t necessary, but I have always found that on large sites they can be very powerful.

Visualizing internal link graphs

To visualize what a desired internal linking structure should be, I tend to think of a site’s link graph like an airline route map.

Singapore Airlines

The least effective internal link graph looks like the route map of a national carrier for a small country. These carriers have a single hub in their capital city, with spokes pointing around the world from that hub. Consider the route map for Singapore Airlines: it has impressive reach for a flag carrier, but with only a few exceptions, all of its flights terminate in Singapore.

Flipping this visual over to websites, think of the hub as the homepage. The homepage links out to all of the other pages, but very few of the internal pages link to other pages.   

United Airlines

The most common type of link graph looks like the route map of a large global carrier. Look at United Airlines as an example. There are very clear hubs (San Francisco, Los Angeles, Chicago, Newark, Houston, Denver…) and these hubs connect to each other and other smaller satellite cities.


Again, flipping this over to websites, the homepage would be the biggest city on the route map – Newark – which links to all the other big cities in addition to all the hubs. The other hubs would be important category pages that have a lot of inbound links and link out to all the other smaller pages. In this link graph, important but smaller pages would have only one pathway to reach them. (As an example, Mumbai is only connected to Newark.)

The most ideal internal link graph looks like the route map of a budget airline that thrives on point-to-point connections. To the bicoastal business traveler, this route map makes no sense, but the wandering tourist can get anywhere they need to go as long as they can handle many stopovers. Southwest Airlines is a great example of this structure.

Southwest Airlines

Southwest has such a complicated route map that they don’t even show it on their website; you have to choose a particular city and then see all the places you can get to directly. There are certainly some more popular cities within their route map, but their direct flights almost seem random. A traveler can fly directly from Cleveland to major travel gateways like Atlanta, Chicago, and Dallas, but they can also go to Nashville, St. Louis, Tampa, and Milwaukee.

This is how a website should be structured. Pages should link to important pages, but also to other pages that seem random. And those pages should link back to important pages and on to other random pages.

Summary

To summarize, think of a search engine crawler passing from one page to another calculating authority as a traveler intent on flying to every city on an airline’s route map without ever visiting a single city more than once.

On Singapore Airlines, a traveler could get from Mumbai to Frankfurt via Singapore, but to get to Paris (without a codeshare) they would need to go back through Singapore.

On United Airlines, a traveler could get from Portland to Dallas via Denver and then could go on to Fort Lauderdale via Houston. They would certainly make it to a number of cities, but at some point they would find themselves connecting through Houston or Denver again.

On Southwest Airlines, a traveler could begin their journey in Boise, Idaho on any one of the ten non-stop flights and make it to nearly every city without ever needing to repeat a city.

Build your internal link architecture like the Southwest Airlines route map and you will never have an orphaned or sub-optimally linked page again.
