The purpose of an SEO strategy hasn’t changed in all these years: to help a website earn the highest possible degree of organic visibility in search engines, attract qualified traffic, and thus increase the total volume of conversions and micro-conversions generated by the ranking pages.
What is ever-changing, however, is the strategic planning necessary to achieve this goal and, consequently, the actions and tactics that must be implemented.
However, before we get to the heart of the matter, we need to clear up a misconception that I see many SEOs still hold.
We rightly talk about creating quality content or optimizing a site with the needs and tastes of our potential users in mind, but we tend to forget that between us and our users there is a filter, or medium: the search engine (or Google, in the vast majority of cases).
This means that not only must we be able, like any other marketing discipline, to identify the needs and desires of our audience and offer them the best possible answers and solutions, but we must also correctly identify how Google interprets and responds to those same needs and desires of that same audience.
Only in this way, by creating a convergence between the two analyses, will we be able:
- to be considered by Google as a reliable and authoritative source capable of responding and solving those needs,
- to get approval, via conversions and growth in returning traffic and brand recognition, from our target audience.
The strategy that Google suggests to us
Google has always been pretty clear in telling us what its strategy is for getting the most people to use it as The Search Engine.
The first draws a general outline for us:

to organize the world’s information and
make it universally accessible and useful.

The second one goes into the details of this general strategy:
- Organizing the content of the web:
- Finding information by crawling;
- Organizing information by indexing
- Google Search Index;
- Knowledge Graph.
- Matching searches (algorithms):
- Meaning of the intent behind a query
- Relevance of webpages
- Keyword matching and synonyms (Neural Matching and RankBrain);
- Aggregated and anonymized interaction data used to assess whether search results are relevant to queries; these data are transformed into signals fed to machine-learning algorithms to estimate relevance more objectively (search quality tests, side-by-side experiments, live traffic experiments).
- Quality of content
- Aggregated feedback from the Search Quality team;
- Spam analysis (the quality guidelines section in the Guidelines linked above).
- Usability of webpages.
- Core Web Vitals.
- Context and Setting
- Personalization (geolocation, search history…);
- Settings (language, SafeSearch…).
- Presenting results in helpful ways.
If we look at this detailed list, we can easily see how each one of the items could be included in one of the three groups composing the famous E-A-T, even if Google ultimately tends to talk about E-A-T almost only in terms of “content”:
- Expertise, which means having the ability to solve something that involves some difficulty with success, ease, and speed. It implies answering the right search intent with clarity, usability, and velocity.
- Authoritativeness means being recognized as an authority by other people and institutions independent of one another, which, in search terms, translates to being cited, linked, and referred to as an authority by other entities (websites, persons…).
- Trustworthiness, finally, not only means security (remember the old but still valid “Panda questions”), but it is also linked to authoritativeness: if we are cited as an authority, then we will more likely be trusted too.
As you can see, the strategy that Google is suggesting to us is a mix of technical and marketing objectives.
Therefore, to achieve the evergreen purpose of any SEO strategy that I defined at the beginning of this post, we must set ourselves these goals:
- To close, or reduce as much as we can, the technical gap that may be preventing our website from being efficiently crawled and indexed.
- To collaborate with developers and web designers so that our website successfully meets the Core Web Vitals, metrics that cover both speed and usability.
- Conducting a Search Intent gap analysis, and reviewing whether the pages of our website correctly respond to the search intents Google has identified for the query sets those same pages are targeting.
- Filling the content gap we may have identified when conducting a competitive content audit (the competitors being those web pages that rank on the first page for the queries we target) and a SERP audit.
- Implementing all those content and technical actions that can make us prominently visible both in the organic search results (Rich Results and Featured Snippets) and in the Google Search Features (Images Box, Video Carousel, Top News, Interesting Finding…).
- Working on so-called Brand Search, that is, improving the visibility of our online identity in the search results (Knowledge Graph and Discover).
- Dedicating special attention to creating content targeting our Audience Personas, i.e., those people and websites that are not potential customers themselves but have a proven influence on them.
Seven goals may seem like many, especially if we consider all the tactical tasks that must be done to fulfill them.
However, many of them, if not all, are not goals you must achieve starting from zero because, as you surely noted, none of them is really new, and many are things you should have been working on for a long time already.
Crawling and Indexing
Tracking and fixing crawling and indexing problems is part of the daily work of an SEO.
However, there are two events already fixed in the calendar that should be at the center of our attention, especially in the first half of this year:
- March 2021: the definitive transition to Mobile-First Indexing;
- May 2021: Core Web Vitals become a full-fledged ranking factor.
Even if our website is already under Mobile-First Indexing, we need to check that its mobile version correctly meets these points:
- That it is seen correctly by Googlebots;
- That there are no disparities between the content of the desktop version and that of the mobile version;
- That the audiovisual resources of our website are optimized; and
- That structured data is implemented in the mobile version just as it is in the desktop version.
Regarding the visibility of the mobile version, and in case there is a robots.txt dedicated to it, we will have to check that there are no inconsistencies between the desktop robots.txt and the mobile robots.txt.
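A quick way to surface such inconsistencies is to diff the rule sets of the two files. The sketch below is a simplified, stdlib-only parser (my own assumption of how to model the check: it ignores wildcard semantics, `Sitemap` and `Crawl-delay` lines, and assumes you have already fetched both files); it groups `Allow`/`Disallow` rules by user-agent and reports the rules present in only one version:

```python
def parse_rules(robots_txt: str) -> dict:
    """Group Allow/Disallow rules by user-agent: {agent: {(directive, path), ...}}."""
    rules: dict = {}
    agents: list = []
    expecting_agent = True  # consecutive User-agent lines form one group
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments
        if ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            if not expecting_agent:
                agents = []  # a rule line ended the previous group
                expecting_agent = True
            agents.append(value)
            rules.setdefault(value, set())
        elif field in ("allow", "disallow"):
            expecting_agent = False
            for agent in agents:
                rules[agent].add((field, value))
    return rules


def robots_diff(desktop_txt: str, mobile_txt: str) -> dict:
    """Report rules that exist in only one of the two robots.txt versions."""
    desktop, mobile = parse_rules(desktop_txt), parse_rules(mobile_txt)
    report = {}
    for agent in set(desktop) | set(mobile):
        only_d = desktop.get(agent, set()) - mobile.get(agent, set())
        only_m = mobile.get(agent, set()) - desktop.get(agent, set())
        if only_d or only_m:
            report[agent] = {"only_desktop": only_d, "only_mobile": only_m}
    return report
```

An empty report means the two files agree, rule for rule, for every user-agent.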
Furthermore, we will have to check the correctness of the implementation of features such as lazy loading, which, if poorly implemented, can make the relevant content of our pages practically invisible to crawlers (go to this page to review lazy-loading best practices).
Identical desktop and mobile content
This problem is obvious, but it can be more complex to solve than we can imagine at first sight.
This is especially so if the textual content of the desktop version was partially “inflated” by a misguided conception of on-page SEO, in which case replicating the same huge amount of content would be counterproductive for usability in the mobile version.
In this case, we will have to completely review the textual content of our pages, trying to maintain the same degree of relevance they possessed in the previous version so as not to affect the organic visibility already obtained.
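To get a first, rough measure of desktop/mobile content parity, we can extract the visible text of both versions and check how much of the desktop vocabulary survives on the mobile page. This is a stdlib-only sketch of my own (a word-set comparison, not a rendering-aware diff, so it will not see JavaScript-injected content):

```python
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collect visible text, skipping script/style/noscript content."""
    SKIP = {"script", "style", "noscript"}

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.parts.append(data)


def visible_words(html_doc: str) -> set:
    parser = TextExtractor()
    parser.feed(html_doc)
    return set(" ".join(parser.parts).lower().split())


def parity(desktop_html: str, mobile_html: str):
    """Share of desktop words also present on mobile, plus the missing ones."""
    d, m = visible_words(desktop_html), visible_words(mobile_html)
    missing = d - m
    score = 1.0 if not d else 1 - len(missing) / len(d)
    return score, missing
```

A score well below 1.0 flags pages whose mobile version lost content that the desktop version still carries, and the `missing` set tells you which words to go look for.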
Visual SEO and Structured Data consistency
Well… very often, in reality, the degree of optimization of visual elements (images and videos) is practically nil even on desktop sites, as we know well.
However, this does not mean that it should remain so.
So, with the excuse of Google’s definitive move to Mobile-First Indexing, we should correct once and for all the deficiencies existing in this neglected area and, in the event that correct image SEO has been done in the desktop version, make sure it is replicated exactly in the mobile version.
Finally, let’s remember that structured data is essential for two reasons:
- To help search engines better understand the content of a page;
- To obtain Rich Results, which, as we well know, can contribute to higher CTRs.
Therefore, it is essential to check that structured data is not only present in our mobile version but also correctly implemented.
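One way to run this check at scale is to extract the JSON-LD blocks from the rendered HTML of both versions and compare the `@type` values found. A stdlib-only sketch (it handles only JSON-LD, not Microdata or RDFa, skips malformed blocks silently, and does not descend into `@graph` containers):

```python
import json
from html.parser import HTMLParser


class JSONLDExtractor(HTMLParser):
    """Collect parsed JSON-LD blocks from <script type="application/ld+json">."""

    def __init__(self):
        super().__init__()
        self._in_jsonld = False
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self._in_jsonld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_jsonld = False

    def handle_data(self, data):
        if self._in_jsonld:
            try:
                self.blocks.append(json.loads(data))
            except json.JSONDecodeError:
                pass  # malformed block: worth flagging in a real audit


def schema_types(html_doc: str) -> set:
    """Return the set of @type values declared in a page's JSON-LD."""
    parser = JSONLDExtractor()
    parser.feed(html_doc)
    types = set()
    for block in parser.blocks:
        for item in (block if isinstance(block, list) else [block]):
            t = item.get("@type")
            if isinstance(t, list):
                types.update(t)
            elif t:
                types.add(t)
    return types
```

Running `schema_types` on the desktop and mobile HTML of the same URL and diffing the two sets immediately shows any markup that was dropped in the mobile version.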
Core Web Vitals (and the importance of UX)
There is little I can add that has not already been said in dozens of posts and presentations, as well as in the official Google guide.
The most important thing, which we must once and for all make our own, is that SEO and user experience now go hand in hand, not only because in May the UX-related metrics will become a ranking factor, but above all because all the negative aspects that accompany a bad user experience, such as:
- slow page speed,
- jumbled page design and
- annoying presence of elements that distract or even make it impossible to use the content of a page correctly
all negatively impact the memory of those people whom we want to visit our site not just once but several times throughout its life, and therefore contribute to a negative perception of our brand.
We most likely won’t see a Core Web Vitals apocalypse next May (unless we already have a horrible website) or a huge positive jump in rankings just for having an emerald-green CLS. But if we work to optimize the CWV metrics, we will see better on-site permanence data, better long-click and dwell-time figures, better conversion and micro-conversion metrics, and a better short-, medium- and long-term impact on the satisfaction of our users; in doing so, we will contribute to building and consolidating returning traffic to our site.
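As a reference while auditing, the thresholds Google has published for the three metrics can be encoded in a small helper. The numbers below are the documented good/poor boundaries for the May 2021 rollout; field values for a URL can be pulled from PageSpeed Insights or the CrUX API and fed to it:

```python
# Published Core Web Vitals thresholds: (good boundary, poor boundary).
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # Largest Contentful Paint, seconds
    "FID": (100, 300),    # First Input Delay, milliseconds
    "CLS": (0.10, 0.25),  # Cumulative Layout Shift, unitless
}


def rate(metric: str, value: float) -> str:
    """Classify a field value the way Google's docs do."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"
```

Anything rated "needs improvement" or "poor" on a template that drives organic traffic is a candidate for the collaboration with developers and designers described among the seven goals above.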
Search Intent and Content Quality
That introducing Search Intent and Content Quality into an SEO strategy is important is something we should have known ever since Core Updates were called Quality Updates or “Unnamed Updates” (or Fred).
The two concepts, although not synonymous, are linked to each other: in fact, the better we know how to identify the Search Intent of a query set, the better the quality of the content we create to target that same query set will tend to be.
I talked about how to identify the correct Search Intent in my last post, so I avoid repeating myself.
What I would focus on, however, is this aspect: a search almost always implies more than one search intent, and the prevalence of these different search intents for the same keyword depends on factors that are sometimes not taken into account, such as seasonality.
For example, if we search for “winter tires” in spring, the prevailing search intent will most likely be purely informational, as spring is a period of low commercial seasonality for that type of product. We will therefore mostly see articles and posts.
Conversely, if we do the same search in the fall, we will see Google begin to present search results that respond to a different search intent (“I need help deciding what to buy”), so results of the “Best X…” type or product category pages will predominate.
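A first pass at this seasonal intent check can be automated: collect the top-10 result titles for the query in each season and classify them with simple markers. The marker lists below are purely illustrative heuristics of mine, not any Google taxonomy, but the shift in the resulting mix between spring and fall is exactly the signal described above:

```python
import re
from collections import Counter

# Illustrative marker words (my assumption, not an official taxonomy).
INFORMATIONAL = {"how", "what", "when", "why", "guide", "tips"}
COMMERCIAL = {"best", "top", "review", "reviews", "buy", "price", "cheap"}


def classify_title(title: str) -> str:
    """Heuristically label a SERP result title by the intent it answers."""
    words = set(re.findall(r"[a-z0-9]+", title.lower()))
    if words & COMMERCIAL:
        return "commercial"
    if words & INFORMATIONAL:
        return "informational"
    return "other"


def serp_intent_mix(titles) -> Counter:
    """Count how many results answer each (heuristically detected) intent."""
    return Counter(classify_title(t) for t in titles)
```

Comparing `serp_intent_mix` for the same query across two crawl dates makes the seasonal swing visible at a glance.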
What does this mean?
It means that we must focus on identifying the ambivalent and changing nature of the implicit main search intent that exists for nearly all queries and, therefore, on creating (or improving) content capable of targeting each of these different search intents uniquely.
By doing so, not only will we have content that can respond correctly to specific search intent, but we will also be able to be visible in search results with multiple pages of our website for the same search and, in so doing, reach a wider spectrum of potential users.
At the same time, however, we must also check that we do not have more than one piece of content targeting the same search intent for a specific query set. This, in fact, is the real cannibalization we must avoid: Search Intent Cannibalization.
Focus more on your audience persona
Authoritativeness means being recognized as an authoritative source on a topic.
In Search, this still mostly means being linked by websites that are themselves considered authoritative on that same topic.
Since these sites are authoritative, they are also considered trustworthy and, therefore, have a great ability to influence the decision-making of those who trust them.
So, it seems logical to devote part of our SEO strategy (and our SEO budget) to precisely identifying those sites and people who:
- can help generate a positive opinion towards our site and our brand;
- can create a flow of referral traffic to our website (follow or nofollow links are indifferent in this case);
- can create signals capable of clearly indicating to Google that we have authoritativeness on a specific theme (this time, yes, with followed links);
- can contribute to the creation, both in our final target audience and in Google, of co-occurrences, which are the basis for the so-called branded searches (keyword + brand);
- can, by a mechanism similar to the previous one, create associations between our brand / entity and other entities with which we want it to be associated.
These sites and people are not Buyer Personas but Audience Personas, or “Amplifiers”.
Is this SEO? Yes, because one of the three groups of ranking factors depends on the link graph, as Paul Haahr once said.
Structured data, On Serps SEO, and optimizing for Entity Search
We should think of Google as a pig: nothing of it should go to waste.
Being visible organically with only a simple search result doesn’t make much sense anymore, and the only way to be truly successful and grow organically is to “occupy” as much space as possible in the SERPs.
This translates into analyzing which Search Features Google presents for the keywords (and related query sets) we are targeting and answering the following questions.
In the case of Rich Results:
1) If there are any, what are they?
2) Have we correctly implemented in our pages the structured data that can generate the Rich Results we have observed in the SERPs (always use the Google Search Gallery as your guide)?
3) Can really only that type (or those types) of Rich Results be shown for that kind of SERP, or is there room to experiment with others?
4) Do we regularly audit the structured data implementation with crawlers like Sitebulb?
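Questions 1 to 3 boil down to a gap analysis: which structured-data types are the ranking pages using that we are not? Once the `@type` values of our page and of each first-page competitor have been collected (with a crawler or a JSON-LD extractor), the comparison itself is simple; here is a sketch, where the example types in the usage are hypothetical:

```python
from collections import Counter


def rich_result_gap(our_types: set, competitor_types_per_page: list) -> dict:
    """Types used by ranking competitors but missing from our page,
    with the number of competitor pages using each one."""
    freq = Counter(t for page in competitor_types_per_page for t in set(page))
    return {t: n for t, n in freq.most_common() if t not in our_types}
```

A type used by many of the ranking pages but absent from ours (say, `FAQPage` on seven of the top ten) is a strong candidate for the experimentation that question 3 invites.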
[to be continued]