To understand what constitutes modern keyword research today, we must look at how keyword research was done years ago.
This isn’t to say that traditional keyword research no longer works. You can still rank web pages using the methodologies involved in this process, but they’re just not as effective as before.
By discussing the traditional approach, more experienced SEOs will quickly learn if their tactics are outdated as well.
Identify Low-Hanging Fruit Keywords
A low-hanging fruit keyword is typified by having a relatively high search volume and low competition.
Regarding search volume, there is no fixed threshold that qualifies as “high,” since it depends on your website’s niche and industry. Popular niches have keywords with hundreds of thousands of monthly searches, and the monthly searches of a niche’s main keywords dwindle as the niche decreases in popularity.
Low competition, on the other hand, is identified by keyword difficulty (KD). The higher the value, the stiffer the competition and the less desirable the keyword becomes.
Most SEO tools can show you the KD in a few clicks. In Ahrefs, for example, open Keywords Explorer, type in the keyword, and you can see the KD of that search term. You can even expand your search by looking at keywords related to your initial query and checking their respective KDs.
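If you export keyword data from a tool like this, the filtering step can also be done programmatically. Below is a minimal sketch, assuming an illustrative list-of-dicts format; the field names and thresholds are invented for the example, not values any specific tool prescribes.

```python
# Toy sketch: filtering "low-hanging fruit" keywords from an exported
# keyword list. Field names and thresholds are illustrative assumptions.

keywords = [
    {"keyword": "guard dog", "volume": 12000, "kd": 71},
    {"keyword": "best guard dog for families", "volume": 900, "kd": 18},
    {"keyword": "small guard dog breeds", "volume": 1500, "kd": 34},
]

MAX_KD = 30       # "low competition" cutoff (niche-dependent)
MIN_VOLUME = 500  # "high enough" monthly searches (niche-dependent)

# Keep only keywords that clear both bars.
low_hanging = [
    kw for kw in keywords
    if kw["kd"] <= MAX_KD and kw["volume"] >= MIN_VOLUME
]

# Easiest targets first.
for kw in sorted(low_hanging, key=lambda k: k["kd"]):
    print(kw["keyword"], kw["kd"], kw["volume"])
```

The point of the sketch is that “low-hanging fruit” is just a two-variable filter, and the right cutoffs depend entirely on your niche.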
This simple process has been common practice among SEOs through the years. However, “low-hanging fruit” isn’t as straightforward as it seems.
For one, different SEO tools use different data sets to generate the insights behind your keyword research. Ahrefs is one of the most popular tools out there because it has the largest keyword and backlink database of the bunch, with over 11.7 billion keywords across ten search engines and 229 countries as of this writing. For comparison’s sake, SEMrush has 300 million keywords from over 120 databases.
Therefore, if a tool doesn’t have enough data about your search queries, its metrics may not give you the information you need.
Second, each tool computes the KD of a keyword differently. While the scores may be similar, don’t expect the KD for the same keyword to be identical across different SEO tools.
Running the same keyword above, here is its KD according to SEMrush:
There is a slight difference in the KD and the local search volume, but they are not far off each other.
Now, compare the keyword data from SEO tool SE Ranking:
While SE Ranking produced a lower KD than Ahrefs and SEMrush, does that mean the latter two are better keyword tools? To an extent, yes, but this brings us to the last point:
Researching low-hanging fruit keywords forces people to depend on the metrics that tools show them. As a result, critical thinking is removed from the equation as people start fixating on variables like KD, Domain Rating (DR), and the like.
Here’s the thing: even if Ahrefs is the most popular choice among SEOs, it can only provide you with an estimation of a keyword’s data.
These tools amass as much data as they can. But as long as they don’t know exactly how Google ranks pages for a keyword, even the best tools can only offer you the closest approximation of a keyword’s data. We’ll talk about this more once we tackle zero-search-volume keywords.
As you can see, you can’t put blind faith in keyword tools, as they can only provide you with so much data. While tools help speed up the research process, you must put some thought into finding these keywords and discern which ones to optimize for on your website.
Optimize For Long Tail Keywords
The perfect embodiment of a “low-hanging fruit” keyword is the long-tail keyword.
According to industry experts, a long-tail keyword contains three or more words and is highly specific. In fact, the more words the keyword has, the more specific it tends to be.
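The word-count heuristic is simple enough to express in a few lines. This is a toy sketch of that rule of thumb only; real long-tail status depends on specificity and intent, not length alone.

```python
# Toy sketch: classifying keywords as long-tail by word count alone.
# A crude heuristic; specificity matters more than length in practice.

def is_long_tail(keyword: str, min_words: int = 3) -> bool:
    """Return True if the keyword has at least `min_words` words."""
    return len(keyword.split()) >= min_words

print(is_long_tail("dog"))                                   # False
print(is_long_tail("guard dog"))                             # False
print(is_long_tail("best breed of guard dog for families"))  # True
```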
“Specific” is what makes long-tail keywords vital to your SEO strategy. When optimizing your web pages, you need to target keywords that answer the question, “What do people want to find on a page when searching for this keyword on Google?”
Image Source: Web FX
For example, “dog” as a search query is vague and doesn’t provide publishers like yourself with enough information about what a page optimized for “dog” should cover. As a result, you’d probably have to talk about everything about dogs, which could span tens of thousands of words.
To help narrow down their search, users add qualifiers to their search query. In this case, “guard dog” should provide you with more precise information about what pages users want to see for this keyword. At the very least, it’s not just about dogs in general, but a particular type of dog.
However, the keyword is still too broad to optimize for in your content. “Guard dog” may require you to discuss big and small guard dogs, the best guard dogs for families and first-time owners, and more.
Again, you’ll be forced to cover as much ground as possible about guard dogs simply because the search query isn’t specific enough.
This is where long-tail keywords come in. They are search queries narrowed down to a particular subtopic within a subtopic.
In the example above, “best breed of guard dog for families with children” is as narrowed-down as they come. You’ll be hard-pressed to find a similar keyword that’s more specific than this one.
When optimizing content for this keyword, you simply have to name the best breed and explain why, which is precisely what users are asking when they type in this keyword.
The only caveat of long-tail keywords is their low search volume. Due to the precise nature of the query, you can expect that only a few users will be searching for this query. Here is its search volume on Ahrefs:
Now, compare this to “best breed of guard dog:”
Both are long-tail keywords, but the first is much more specific than the second. As a result, the first has far fewer monthly searches, if any at all.
In this case, the solution is to optimize for as many long-tail keywords as possible. By covering as much ground about a topic with this keyword type, you can build traffic incrementally from the content you create, which should amount to thousands of organic visits down the line if done correctly.
However, the biggest issue that long-tail keywords present is that site owners focus on optimizing one keyword per page. By putting too much emphasis on specific keywords, people keep chasing the tail. In other words, they search for keywords that’ll bring them traffic over time instead of approaching keyword research topically.
For example, it’s possible to exhaust the long-tail keywords about a subtopic. But instead of moving on to broader keywords about the same topic, some people pivot to long-tail keywords about a different matter. As a result, they cannot build the topical relevance necessary to help their website rank at the top of organic search.
We’ll discuss how modern keyword research has replaced this approach by researching and optimizing for keywords on a topical level.
Creating Skyscraper Content
Just as crucial as the keywords you research is what you intend to do with them.
Of course, you’re supposed to create content optimized for each one. However, you must take this approach to a different level, since your competitors are expected to do the same.
This is what Brian Dean did with the approach he coined the “Skyscraper Technique.”
The underlying idea wasn’t new at that point, and Brian covered it here in great depth. It is essentially creating content that’s much better than your best-ranking competitors’.
However, Brian dangled the content created using the Skyscraper Technique as linkbait to generate backlinks and push the page to the top of organic search for its keyword.
Here’s a general overview of the Skyscraper Technique:
- Find link-worthy content – Find content that has the most backlinks for your target keywords using Ahrefs.
- Make something even better – Recreate the same content on your website but add more information, provide better examples, and feature rich media — anything that can make your content stand out from the rest.
- Reach out to other people – Email people asking them to link to your much better content.
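The first step of the workflow above can be sketched in code if you have a backlink export to work with. The data format here is an illustrative assumption, not any tool’s real export schema.

```python
# Toy sketch of step 1 (find link-worthy content): sort exported
# competitor pages by referring domains. The URLs and counts below
# are invented for illustration.

pages = [
    {"url": "example.com/guard-dogs", "referring_domains": 42},
    {"url": "example.com/dog-training", "referring_domains": 7},
    {"url": "example.com/best-guard-dog-breeds", "referring_domains": 95},
]

# Most-linked pages first: these are the ones worth outdoing.
link_worthy = sorted(pages, key=lambda p: p["referring_domains"], reverse=True)
print(link_worthy[0]["url"])  # the strongest link magnet in the set
```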
Again, while this technique involves more than keyword research, the research itself nonetheless plays a vital role in the success of your campaign. Choosing the right keyword can trigger a domino effect for your content if everything falls in your favor.
However, while the technique was massively popular for some time and lots of people found success with this approach, it also has some flaws.
For one, the technique does not apply to all industries and niches. If the top content in your niche has few to no backlinks, then you’re better off using a different approach to your keyword research and content creation.
Also, bigger doesn’t always mean better. Just because you create “much better” content than your competitors doesn’t mean you’ll get backlinks from your outreach campaign, let alone rank at the top of Google.
As mentioned, some factors need to play in your favor for this technique to work. Aside from being in the right niche, you also need brand equity to help convince bloggers to replace your competitor’s link with yours.
At the same time, the content should answer the question posed by the search query. Going back to “best breed of guard dog for families with children,” you can’t discuss the history of dogs just to pad your content, no matter how interesting that information is, because it’s irrelevant to the query.
Ultimately, while the Skyscraper Technique has a place in your SEO strategy, it is so deeply rooted in the traditional keyword research approach that its effectiveness has decreased over time.
Instead of researching keywords to create one-off content for your site, you need to develop a strategy that gives all the keywords you research and the content you create and publish a greater sense of purpose in the grand scheme of things.
Why Move To Modern Keyword Research?
To reiterate, traditional keyword research still works to a degree. These aren’t tactics that you should abandon altogether because they still make sense under the right circumstances.
However, if you’re expecting to achieve the results of those who leveraged these tactics years ago, you’re out of luck.
The paradigm shift away from traditional keyword research is due to fundamental changes in how Google works.
The search engine has changed so much that ranking for keywords is no longer as simple as placing your keyword in the right spots in your content.
As mentioned above, building topical relevance allows you to create website authority not by targeting individual keywords but by targeting topics.
Let’s put it this way: if you’re going to create a website about guard dogs, make sure to create informative content that shines a light on this type of dog. Researching keywords and publishing content that isn’t about guard dogs dilutes the site’s relevance for the topic.
Aside from the algorithm changes through the years, keyword research is shaped by Google’s ability to read contextual information and provide users with the correct details.
The Knowledge Graph, released in 2012, laid the groundwork for how people should approach SEO as a practice. Living by the saying “things, not strings,” Google can provide users with search results relevant to their query even when the query itself can mean many things.
For example, if you search for “the rock,” Google knows whether you’re looking for information about Dwayne Johnson’s wrestling persona, the 1996 movie starring Sean Connery and Nicolas Cage, or naturally forming minerals.
The search results provide more than enough information to help you find what you’re looking for in a few clicks.
When it comes to optimizing your website, targeting one keyword per post is a relic of the era that began to end around 2012. As mentioned, a single page can rank for as many keywords as it can earn. Therefore, you must look at the bigger picture and consider opportunities to rank for other relevant keywords as well.
Most search rankings are determined by achieving a healthy balance of great content and backlinks. However, other signals have been added through the years, such as mobile-first indexing, loading speed, and the like.
In 2015, Google RankBrain was introduced to the public. It is the search engine’s machine-learning artificial intelligence system that helps Google process results.
It is not a deciding factor that single-handedly determines which pages rank for which keywords. Considered the “third-most important signal,” according to a Bloomberg article, it helps Google interpret the meaning behind long-tail search queries and, in conjunction with search features like the Knowledge Graph, provide users with the best results.
RankBrain signals Google’s concerted effort to understand the relationships among words. That’s why, when you ask Google the age of a public figure or when their birthday is, you can get the answer straight from the search engine results pages (SERPs) instead of having to visit a page.
More importantly, Google will continue to provide users with better search results thanks to RankBrain’s machine-learning capabilities. As words take on new usages, Google’s AI system will eventually pick them up.
In 2019, Google took machine learning to a new level with Bidirectional Encoder Representations from Transformers, or BERT for short.
In a nutshell, BERT involves two processes:
- Evaluate meaning from context using pre-trained models
- Make sense of it by applying natural language processing (NLP) to search results
In essence, BERT is the vehicle that makes enriching search results with NLP possible. It is the logical next step from RankBrain, which primarily focuses on providing better results for long-tail keywords and can only connect the meanings of concepts on a word level.
With NLP via BERT, Google can fully leverage machine learning by understanding context. It can interpret search queries at a more profound level using these variables:
- Sentiment – determines whether the query is positive, neutral, or negative.
- Entity – refers to a word in the page representing an object that can be categorized, classified, and identified.
- Salience – measures the importance of the entity on the page.
- Category – organizes entities into groups of related entities to form a topic.
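To make the four variables concrete, here is a deliberately toy, rule-based sketch of what such an analysis might return for a query. Real systems like BERT use learned models, not word lists; every list, category label, and formula below is invented purely for illustration.

```python
# Toy, rule-based sketch of the four NLP variables described above.
# All word lists, categories, and the salience formula are invented
# for illustration; real systems learn these from data.

POSITIVE = {"best", "great", "love"}
NEGATIVE = {"worst", "bad", "hate"}
ENTITY_CATEGORIES = {"guard dog": "Pets & Animals", "dog": "Pets & Animals"}

def analyze(query: str) -> dict:
    words = query.lower().split()
    # Sentiment: positive, neutral, or negative, based on cue words.
    if any(w in POSITIVE for w in words):
        sentiment = "positive"
    elif any(w in NEGATIVE for w in words):
        sentiment = "negative"
    else:
        sentiment = "neutral"
    # Entity: the longest known phrase appearing in the query.
    entity = next(
        (e for e in sorted(ENTITY_CATEGORIES, key=len, reverse=True)
         if e in query.lower()),
        None,
    )
    # Salience: share of the query the entity occupies (crude proxy).
    salience = len(entity.split()) / len(words) if entity else 0.0
    # Category: the topic group the entity belongs to.
    category = ENTITY_CATEGORIES.get(entity)
    return {"sentiment": sentiment, "entity": entity,
            "salience": round(salience, 2), "category": category}

print(analyze("best breed of guard dog for families"))
```

For the sample query, the sketch reports a positive sentiment, picks out “guard dog” as the entity, and files it under a pets-related category, which is the shape of understanding the four variables describe.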
Using these four variables, Google can provide users with more contextual and accurate search results based on the queries they enter. In other words, NLP acknowledges the intent behind the keyword, which tells search engines why a user searched for the query.
To recap, below are the changes that Google has undergone to revamp the quality of its search results:
- Knowledge Graph – from strings of characters to things, i.e., entities
- RankBrain – from entities to relationships among entities
- BERT – from relationships among entities to context and intent
Through these different changes, it is evident that Google’s methodology has grown more and more sophisticated. It can now extract various factors from entities to provide users with much better and contextually relevant search results.
Its advancements in search bring up this particular question:
Is it possible for Google to move away from backlinks as a ranking factor?
Think about it: Google has gotten smarter at determining which pages should appear on SERPs for a keyword using NLP. And with a perpetually learning AI system, its technology can seamlessly continue to provide users with high-quality search results for years to come.
Google’s NLP can be so good that it can use words mentioned on pages as “lexical references that connect topics and keywords to a brand, website, or page.”
This is what Rand Fishkin says in his thinkpiece on SparkToro. He hypothesizes that “inferred links” will replace the link graph as a ranking factor.
Image source: SparkToro
Instead of dofollow links built from authoritative sites that are “financially motivated,” inferred links are more authentic because they come from an unbiased place, at least for now. People build inferred links unknowingly through reviews written on Reddit or Amazon, which shows how impartial they are by nature.
Backlinks hold a lot of value as a ranking signal because they serve as recommendations in the eyes of search algorithms. The fact that a website links back to your site means it trusts you enough to refer to you as a resource about a topic.
But as mentioned, most backlinks are motivated by the desire to rank at the top of Google search for a keyword. By building more links from high-authority sites, your site climbs in the rankings.
This is an issue Google may have with backlinks as a ranking factor. They favor site owners with the budget to launch full-scale link building campaigns to grow their organic traffic, leaving newer websites with the slimmest of chances to outrank their well-established competitors.
However, this all changes if Google goes the way of the inferred links Rand described in his post. Instead of relying on a site’s link profile to determine where a page ranks on Google search, using inferred links could provide a more objective and arguably even better search experience for users.
And this could be possible thanks to NLP’s ability to analyze context not only in search queries but also in content in general. The capacity to process content using the different variables available in NLP paves the way for Google to put more weight on this model instead of link profiles.
Again, this is just Rand’s theory. Some people disagreed with it in his post’s comments, as it is a drastic shift from how people think Google decides which pages appear on SERPs. Then again, wilder things have happened, so who knows at this point.
Now, you may be asking: what does this have to do with keyword research?
Walking through the different phases Google’s methodology has gone through sheds light on how Google is becoming more contextual in its approach. Now, this isn’t to say that “inferred links” will take precedence over traditional ranking signals, but it’s an example of where Google could take NLP with respect to SERPs.
As for keyword research, how you apply context, intent, and sentiment as you search for keywords to optimize your website will play a crucial role in your success. Mapping out the different keywords to use in your content creation campaign, and how you will structure them into your website, will also aid in building your site’s topical relevance.
Request a free website audit today or call us at +91-9717778130 to start using SEO for your website!