How to Rank for a Keyword
Learn how to rank higher on Google for a keyword (42 minute video):
Ecommerce SEO expert Ed Baker reverse engineers Google Search, provides SEO best practices for ranking, and shows the future of SEO with keyword clusters for your content creation process.
Updated: 12/4/2020.
Core Algorithm Update: December 2020
Later today, we are releasing a broad core algorithm update, as we do several times per year. It is called the December 2020 Core Update. Our guidance about such updates remains as we’ve covered before. Please see this blog post for more about that: https://t.co/e5ZQUAlt0G
— Google SearchLiaison (@searchliaison) December 3, 2020
As I am finishing my edits to this article, Google pushed the December 2020 Core Algorithm update live! I couldn't have been luckier on timing. While everyone else enters quarterly meltdown mode over their website rankings fluctuating again, I am looking at green plus marks & dollar signs. Despite my industry (special events) crawling away from death, I am still grateful to be waking up to customer e-mails, voice mails, and new paid orders.
If my fellow SEOs take one piece of advice this year on how to tackle SEO in 2021, please fight the macro war, not the micro battle. If you are optimizing and getting nowhere, have you ever considered you may be working in the wrong sequence? Get your content aligned to the top 10 SERP results first and then optimize further!
Please keep in mind: The content on this page may seem confusing or conflicting without step 1 - watching the video!
These concepts have been driving my success in search results and rankings for a full calendar year now and I'm ready to share them with you:
As of October 2020, Google's BERT algorithm update has changed the crawler bot's core functionality. The bot utilizes natural language processing (NLP) to break down, understand, and catalog written language. This capability has been enhanced from a Word2Vec-era model (one-directional reading) to a new bidirectional model called BERT. This gives Googlebot greater contextual relevance, a fuller understanding of content topics, and the ability to grow organically via automated machine learning (ML). This update completely redefines the search ecosystem and will drive the future of SEO. Yet again, there are changes to the Google ranking factors you should be paying attention to in order to create pages that rank!
How Googlebot sees your website has changed! To simplify it: the bot went from reading left to right (and right to left) to derive meaning from your content, to converting every word and/or phrase on a page into identifiable entities which it can then prioritize, catalog, and compare to further its understanding. Related keywords (NLP terms) are on a whole new level of importance now...
Websites are collections of entities. Google analyzes, catalogs, and prioritizes these entities continually feeding an automatic AI powered / machine learning search engine brain & the online ecosystem. Entities are basically the 5 W's: who, what, where, when, and why of keywords. The more diverse & higher quality citations you can stack, the stronger your entity becomes.
This stacking creates more disambiguation and encourages the bot to award you a higher rank than you could have achieved without it. Please make sure to utilize the best entities for each individual page you create and then also consider where it belongs within a cluster of pages and/or within each content silo you create.
Types of Entities Google can Identify:
Person
Location
Organization
Event
Work of Art
Consumer Good
Other
Phone Number
Address
Date
Number
Price
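To make this concrete, here is a toy Python sketch (my own illustration, not Google's actual NLP pipeline) that tags a few of the simpler entity types from the list above using regular expressions:

```python
import re

# Toy patterns for a few of the entity types listed above.
# This is a teaching simplification: Googlebot's real NLP models
# are vastly more sophisticated than regular expressions.
ENTITY_PATTERNS = {
    "PHONE_NUMBER": r"\(?\d{3}\)?[-.\s]\d{3}[-.\s]\d{4}",
    "PRICE": r"\$\d+(?:\.\d{2})?",
    "DATE": r"\b\d{1,2}/\d{1,2}/\d{4}\b",
    "NUMBER": r"\b\d+\b",
}

def tag_entities(text):
    """Return (entity_type, matched_text) pairs found in the text."""
    found = []
    for entity_type, pattern in ENTITY_PATTERNS.items():
        for match in re.finditer(pattern, text):
            found.append((entity_type, match.group()))
    return found

sample = "Call (555) 123-4567 before 12/4/2020 to get red heels for $49.99."
for entity_type, value in tag_entities(sample):
    print(entity_type, "->", value)
```

Google's real models go far beyond pattern matching, resolving people, organizations, and works of art from context, but even this toy version shows how raw text becomes a catalog of typed entities.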
Off-Page Citations & Local SEO
Entities and entity citations apply both internally on-page and externally off-page via high authority domains like Amazon S3 bucket pages, Google My Business pages, web 2.0 / social media properties, and reputable directories.
Entities are the reason why local SEOs are selling NAP (name, address, phone number) citation stacking services for GMBs (Google My Business Pages) and offering to boost local businesses into the top 3 “map pack” on sites like Fiverr and UpWork for as little as $5.
Reinforcing your physical location, products, services, and topics has been magnetic for how BERT prioritizes and pairs results within organic search traffic. No, I don’t endorse this map stuffing technique, I just wanted to point out this is why services like this even exist. “I will get you 9,000,000 citations for $3.” Marketers ruin everything.😂
Entities are prioritized from most prominent to least via the Google NLP API. Ensuring your top 1-10 entities on a page align with the existing top search results helps you rank better too.
Use entities in the same ratio as your competitors to rank on the first page for your keywords. You can gauge this step via salience scores - a measure of on-page prominence.
Salience scores will be applied to entities found in any piece of written content as well: any written text copied & pasted into the API, books, essays, product descriptions, etc.
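Google does not publish its salience formula, so treat the following as a crude, hypothetical approximation: term frequency weighted by how early an entity first appears, normalized so the scores sum to 1. It is only meant to build intuition for what the NLP API returns.

```python
def crude_salience(text, entities):
    """Crude, illustrative salience: term frequency weighted by how
    early the entity first appears. NOT Google's real formula."""
    words = text.lower().split()
    total = len(words)
    scores = {}
    for entity in entities:
        e = entity.lower()
        count = sum(1 for w in words if e in w)
        if count == 0:
            scores[entity] = 0.0
            continue
        first = next(i for i, w in enumerate(words) if e in w)
        position_weight = 1.0 - (first / total)  # earlier mention = heavier
        scores[entity] = count * position_weight
    norm = sum(scores.values()) or 1.0
    return {e: round(s / norm, 3) for e, s in scores.items()}

text = ("Red high heels for women are back in style. "
        "Our red heels ship free. Women love these heels.")
print(crude_salience(text, ["heels", "women", "red"]))
```

Run it on your own copy and you can at least sanity-check which entity a paragraph emphasizes most before pasting it into the real API.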
NLP with BERT is not exclusively useful for search engines and websites, nor is the Googlebot learning from web pages alone. Why do you think Gmail is FREE and can sort your incoming mail into Primary, Social, or Promotion message tabs? When the product is FREE, you are the product!
Sometimes a writer focuses on a negative slice of a topic, uses undesirable word choices, or is just a "Debbie Downer" in tone to read & learn from. Increasing the positivity and aligning the tone of the article with other sites in the top 10 for your keywords can help you rank when all other items are comparable. Analyzing tone is also helpful in AI-powered chatbots and advanced automated customer service, as it enables social listening triggers. Learn more from IBM Watson's Tone Analyzer.
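For intuition, here is a tiny lexicon-based tone scorer. This is a stand-in illustration only; IBM Watson and the other tone analyzers use trained models far more nuanced than counting words from a hand-made list.

```python
# Toy lexicon-based tone scorer. A stand-in illustration, not
# IBM Watson's actual Tone Analyzer, which uses trained ML models.
POSITIVE = {"great", "love", "best", "easy", "helpful", "success"}
NEGATIVE = {"terrible", "hate", "worst", "hard", "useless", "failure"}

def tone_score(text):
    """Return a score in [-1, 1]: negative = downer, positive = upbeat."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    hits = pos + neg
    return 0.0 if hits == 0 else (pos - neg) / hits

print(tone_score("This guide is great and easy to love."))
print(tone_score("SEO is hard and these tools are useless."))
```

Even a toy like this is enough to compare your draft's tone against the pages already ranking in the top 10.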
What is the best kept secret to ranking on Google? Correlative SEO.
Be careful: correlation flows and trends both ways. It is quite literally a double-edged sword. If you choose the wrong competitors to mimic or correlate your content towards, you can hurt your content score, topical authority, and drop in rank! Picking the most relevant, best fitting, highest intent competitors that match the search results is the key to your rankings & success.
Correlation SEO compares any URL to competitor search results for any target audience keyword or popular keyword phrase. Correlation software like Surfer allows you to reverse engineer and model yourself after much larger and better pages whose entities already provide contextual understanding to the bot about the niche topic you want to rank for. By building focused, semantic, content themes you can build trust and authority around a specific topic and stand beside the existing websites and brands in the top ten!
🐐🐐🐐🐐🐐
Surfer SEO is the GOAT! (Greatest Of All Time. 5 stars...or rather 5 goats!)
It makes it easy to utilize qualified NLP terms at the correct prominence via the salience score. No spreadsheet calculations or data manipulation required anymore. It saves tons of time!
It makes writing or outsourcing brand new top quality articles & content easier than ever before.
It helps with keywords, relevancy, and now even features content planner - a keyword clustering suggestion engine to help you plan out your pages and blog post sequences.
Surfer is the first tool I have ever done affiliate marketing for because it has worked beyond my wildest expectations for my business.
I used to use CORA Reports for my correlation analysis, but I'm not a mathematically inclined person or the best at manipulating data in Excel. The Surfer interface instantly won me over! They enhanced the right side dashboard and I’m truly enjoying writing for the first time in my career (as you can tell).
Surfer took an article I wrote from #18 on Google to #3 in an ultra competitive niche and for a high search volume keyword. I am now the first result below Amazon. I estimate that this single article optimization has brought in enough new revenue to pay for my Surfer license for the next (5) years and it's just getting started as a piece of content that passively farms sales & leads for me.
Other pages I have optimized with Surfer’s Audit have done phenomenally well too! Surfer even helped my company to break a World record by showing us how to rank a product page above Amazon & Party City at #1. This resulted in our largest, single, six figure order to date. We sold an entire semi truck worth of product!
All that being said, I'm pretty sure the tool has paid for itself for the span of my lifetime. Not opinion, facts.
So in short - thank you to Michal, Tomasz, Slawek, Lucjan, Karolina, and the rest of the support team for bringing so much value to on page content & rankings! And most importantly for ALWAYS listening to the customer to make the tool better!
The last thing I wanted to create is "another affiliate video for a paid software." I don't make my living off of being an affiliate marketer - I think I have made $377 to date? My primary income comes from the (3) ecommerce stores I own in the special events industry. We distribute equipment to major events, festivals, concerts, artists on tour, trade shows, casinos, venues, etc.
The SEO process I'm sharing here is simply how I grew my company's revenue 150% per year for the past (4) consecutive years, before COVID-19 nerfed the events industry and I found myself with nothing to do...but make a video and write this article.
I am a small business owner who got tired of the agencies peddling snake oil and the gurus filling Facebook groups with scams. So, I dedicated (2) hours a day to reading about SEO for the past (5) years and began to execute against what I learned daily.
I learned to rank SEO with content ONLY because I was broke and could not afford to advertise my business much like many of you. Hopefully, I just explained how the Googlebot and SEO works in a way that others can truly understand and take advantage of the latest algorithm update. I am working on a keyword research & keyword mapping video course next, but until then, just know: Surfer with NLP credits is like entering cheat codes for the search engines!
The primary objective of my walkthrough video was to show you how the search engines, specifically Googlebot, are changing up the game with artificial intelligence (AI) and automated machine learning (AutoML).
After a 1-year head start executing against this algorithmic change, I can only preach how badly you need a correlation tool like Surfer to increase your rankings & organic traffic.
At the time I am writing this, there are 2,800 people in the Surfer SEO Facebook Group. With so few people in the World even aware of it as a tool and so few people aware of how Google actually ranks pages, you have a runway of opportunity in front of you.
Use NLP & BERT as an opportunity to be one of the successful SEOs you hear about. Guys like me, who somehow, despite all odds, with crappy & ugly little sites, manage to get themselves to the top, do millions of dollars in sales, and break all of the rules that once existed.
I'm not going to break down the top ranking factors for on page SEO like every other article. Instead, I'm going to explain why on page SEO works so well. It all starts with a technical topic called disambiguation.
Website owners must practice disambiguation: the removal of ambiguity by making something repetitively clear.
Removing ambiguity from the equation removes your chances of confusing the bot. Anytime the bot is confused, you face scrutiny by enabling it to use its subjectivity.
Your primary goal should be to always make it impossible for the bot to think anything otherwise.
This is why many local ranking experts are obsessed with mirroring / reinforcing your NAP (name, address, phone) on as many free, high authority, web 2.0 properties as possible. As Google crawls the web and it finds the same NAP information over and over and over - this reinforces the informational accuracy.
An entity with reinforced, validated, and frequently similar published information will become an authoritative entity in that geolocation and/or a niche over time. Each crawl, every pass, Googlebot finds more citations and more references and expands understanding of your entity.
The largest, most authoritative, and dominating entity in online sales is Amazon, which captures roughly half of all U.S. online transactions. It is a huge entity, and this is exactly why Amazon dominates shopping intent SERPs and related searches. To outrank them, we must disambiguate our entities better!
Disambiguation starts from the basics of building your page.
I grew up near the horse track and would gamble on horses with my Uncle Sam. If you pick (3) horses in a row the bet is called a trifecta. I adapted the slogan and coined the term Positioning Trifecta which consists of:
Page Title
URL Address (Slug)
H1 Title Tag
The concept of the positioning trifecta is to match your page's keywords across the page title, website URL, and H1 on page title tag. Ideally, these should always match exactly, 100%, without any deviation. Any variance in word sequence, additional words, or formatting inconsistencies opens subjectivity for the bot to interpret another, undesired entity as the primary idea or topical focus. To prevent this, you must always remove the chance that it can think anything else about your page but what it truly is.
I expect many SEOs to disagree with me here immediately. You can still rank, and may in fact rank better, for multiple keywords with variations in your title, address, and H1 names. The point I am trying to make is that disambiguation starts with the basics: repetitive keyword positioning in your page naming & setup.
If you 100% want to rank #1 for a specific keyword, then 100% disambiguate for the bot - use exact match naming across the board. If you have greater goals, or if you are a more advanced SEO, feel free to get creative. This targeting practice gets more complex with each keyword you add. It gets challenging, fun, and shows how this really becomes The Art of SEO!
Example: If I write a page about red high heels for women it should be:
Red High Heels for Women (Page Title)
/red-high-heels-for-women (URL Address / Slug)
<h1> Red High Heels for Women </h1> (On Page H1 Tag)
The <h1> tag is the ONLY place I recommend including other marketing lingo or query variations. If you must add extra words, build them around the exact match phrase so the phrase stays intact in its most organic form.
Exact Match: Designer Red High Heels for Women Who Stand All Day
Don't break up your target keyword phrase into a partial match keyword - this will dilute the potency of your positioning.
Partial Match: Designer Heels for Women Who Stand All Day
And don't believe that just because the words appear somewhere in the sentence, the bot is smart enough to put it all together. It still requires words clustered together into phrases to derive contextual understanding. It needs the precise exact match entity.
Keyword Rich: Designer Heels in Red for Women Standing All Day
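If you want to enforce the trifecta programmatically, a small helper can derive all three pieces from one target keyword so they never drift apart. The function name and slug rules here are my own convention, not an official standard:

```python
import re

def trifecta(keyword):
    """Generate an exact-match page title, URL slug, and H1 tag
    from one target keyword so all three always align (the trifecta)."""
    title = keyword.strip()
    slug = "/" + re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    h1 = f"<h1>{title}</h1>"
    return title, slug, h1

title, slug, h1 = trifecta("Red High Heels for Women")
print(title)  # Red High Heels for Women
print(slug)   # /red-high-heels-for-women
print(h1)     # <h1>Red High Heels for Women</h1>
```

Generating all three from a single source keyword is the simplest way to guarantee zero variance in word sequence or formatting.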
This same methodology applies to domains. Many SEOs will advise against an exact match domain or partial match domain due to the potential of over-optimizing content. However, if you are truly out to rank a keyword, there is nothing more overpowered than a domain with baked-in core search volume. Google claims to have diminished the power of this, but it is still extremely overpowered for dominating a SERP or niche because of NLP. For as advanced as the bot is, it can still behave like a very basic word parser.
This talk may seem like an amateur hour lesson, but hang with me - understand that it is this repetitive stacking that provides the bot its first disambiguation of what you are talking about on the page. Naming correctly is the critical first step that sets the tone to rank you higher than competitors. The name stacking confirms your topic is precisely your topic, with no chance of it being about anything else!
A garbage page with exact match page title, keyword rich URL, and H1 title tag can rank in the top 100 merely by this trifecta of disambiguation.
If you stake your position carefully you can almost instantly guarantee yourself a spot in the top 100 results of less competitive search queries. Once your keyword rankings are inside the top 100 results, you can tweak your way to the top 10 with a correlation tool like Surfer.
Google's company mission is: to organize the World's information and make it universally accessible and useful.
How do you do that? Accuracy.
How do you become the most accurate search engine to ever exist? You get granular and dissect languages entity by entity, word by word, phrase by phrase, page by page, cluster by cluster, silo by silo, brand by brand, site by site, city by city, state by state, country by country, planet by...well not yet...but you get me... (Go Go Elon Musk!)
If your keyword position won't rank top 100 for a keyword with the positioning trifecta, you continue the entity stack, and utilize the positioning superfecta. (My Uncle Sam & I always placed the superfecta bet later in the race day to try to win big before going out to dinner. It consisted of picking four consecutive horses typically, but depending on the race the number of horses you wished to stack into the superfecta bet could be modified.) Create a superfecta by continuously stacking your page entity:
Add an image to the page: name an image with alt tag title, description, and SEO friendly filename (seo-friendly-file-name.jpg) that exact matches your primary keyword.
Embed a video on your page with an exact match, search-friendly video title. Link from the video back to the article page. Google Search Console's (GSC) manual URL submission tool is disabled at the time I am writing this. The quick hack I hear is to post YouTube videos with links to your pages. The link then lives on a Google property and can't/won't be ignored for indexing. The same concept applies to publishing GMB posts in local search engine optimization. (Submitting sitemaps is still possible.)
When the bot sees a continuous sequence of page title, URL address, H1 title tag, photo, video, and other rich media, embeds, & citations (maps, podcasts, PDF documents, quotes, etc.), it further understands and can relate the topic. Each element you stack on the page should be exact match named, which further disambiguates your content topic and drives your position.
Positioning Superfecta
Page Title
URL Address (Slug)
H1 Title Tag
Image
Video
Rich Media (SlideShare, SoundCloud, PDF Document, Google Maps, etc)
Meta descriptions no longer count towards SEO and in fact are often auto-generated or rewritten by Googlebot to better fit the user intent. This is exactly what we DON'T want to happen to our page's position and exactly why I practice the positioning trifecta and superfecta! Keep stacking the ultra-specific keyword, phrase, or subject matter you wish to be noticed for, write an article of similar length & intent to the top 10 SERP, and you will rank!
When you have figured out how to rank a single page for a target keyword, SEO becomes about creating lots of top ranking pages! If you build your pages in a semantic and topically relevant way, together they will rank higher than any one individual page could on its own.
These groups of pages begin to act as "keyword clusters" or mini "entity stacks" which validate your domain to become one of the top ranking pages for specific topics, regardless of whether the user makes an exact match search query or not.
This is why behemoth brands like Amazon and Walmart appear in the SERPs over and over again for keywords, even when their results are not always the best. They are shown because they have reputable entities and hold authority on many types of goods, services, and topics. With the sheer number of pages and content silos they have, Google favors their reputation, and this is why they always see more SERP rankings, buoyancy, and traffic than our much smaller sites.
To combat their polarizing and magnetic ways we can optimize clusters of keywords and/or clusters of content to help drive the correct rankings and traffic faster.
Did you know that Google knows the fruit, an orange, is round, and nobody ever told it that? Googlebot learned this through machine learning (ML). Whenever oranges were described, words like "round", "rounded", and "circle" came up.
It connected the contextual relevance and figured out that an orange's appearance/shape is round. If you were going to rank a page for "orange" and you didn't include any reference to its shape being round, having rounded edges, or being circular, you could not possibly rank higher than others in the SERP. Semantically, your page would be ineligible for the top 10 because Google must follow its mission of delivering the most useful information at all times. If your information lacks a commonly prevailing keyword, phrase, idea, or concept, your piece of content will not meet the criteria and will not be listed in the top search results. It's as simple as that!
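You can see the mechanics of this in miniature with simple co-occurrence counting. The three-sentence corpus below is a hypothetical stand-in for the web pages Googlebot crawls:

```python
from collections import Counter

# Tiny corpus standing in for the web pages Googlebot crawls.
corpus = [
    "an orange is a round citrus fruit",
    "the orange has a rounded circular shape",
    "peel the round orange and enjoy the fruit",
]

def cooccurrence(docs, target):
    """Count which words appear in the same document as the target."""
    counts = Counter()
    for doc in docs:
        words = set(doc.split())
        if target in words:
            counts.update(words - {target})
    return counts

counts = cooccurrence(corpus, "orange")
# Shape-related words surface near the top, so a model can infer
# that an orange is round without ever being told so explicitly.
print(counts.most_common(5))
```

Real models work over billions of pages with far richer statistics, but the principle is the same: words that consistently travel together become part of an entity's expected context.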
This is also why LSI keywords (latent semantic indexing keywords) were all the rage a few years ago after the knowledge graph was introduced and people wanted to rank their brand within the infobox.
Enter Surfer’s Content Planner, a keyword clustering tool with (1) click page template creation. This is the tool that content writers dream of. Enter a keyword and Surfer crunches away (it takes up to an hour) to research topically relevant keywords and cluster them automatically for your one click activation. If you are trying to rank for a specific niche - this is the roadmap to domination.
Another tool I can’t help but be excited about is Keyword Cupid.
Keyword Cupid’s keyword clustering tool helps design website architecture that ranks like no other.
Imagine if you knew all of the best tasting combinations of food in the World and then could open a restaurant?
Or imagine if you launched a clothing line of only the most popular products so you don’t lose money on inventory ever again?
Or imagine you knew every topic the Googlebot wanted to hear about in order to rank your site?
Since you, the user, upload your data to the tool - I’m not quite sure what the limits are just yet. Using organic online models to launch real world businesses seems like it could be fun! If you try Keyword Cupid, tell Leo the founder “Ed sent me!”
It’s like a crystal ball for keyword mapping and content planning. A MUST use before any execution. Between great keyword research and content clustering you prevent wasting time & money like never before.
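To show the idea behind keyword clustering (not the actual algorithm Surfer or Keyword Cupid uses, which cluster by shared SERP results), here is a toy sketch that groups keywords by word overlap:

```python
def jaccard(a, b):
    """Word-overlap similarity between two keyword phrases."""
    wa, wb = set(a.split()), set(b.split())
    return len(wa & wb) / len(wa | wb)

def cluster_keywords(keywords, threshold=0.3):
    """Greedy toy clustering: group keywords whose word overlap with a
    cluster's seed keyword exceeds the threshold. Real clustering tools
    compare shared SERP results instead. This is only an illustration."""
    clusters = []
    for kw in keywords:
        for cluster in clusters:
            if jaccard(kw, cluster[0]) >= threshold:
                cluster.append(kw)
                break
        else:
            clusters.append([kw])
    return clusters

keywords = [
    "red high heels",
    "red high heels for women",
    "buy red heels online",
    "blue suede shoes",
    "blue suede shoes for men",
]
for cluster in cluster_keywords(keywords):
    print(cluster)
```

Each resulting cluster maps naturally to one page (or one silo), which is exactly the planning step these tools automate for you.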
Ranking content in clusters is faster than ranking pages individually, one by one, just as ranking multiple linked websites at once is easier than ranking a single site at a time. I am not talking about using PBNs (private blog networks). I am suggesting that overall, in general, it is much easier to rank a large cluster of pages because it gives the bot a multi-page target to land upon. It has multiple entry points and internal links that hold the bot's attention longer. Relevant content enables bot disambiguation to occur faster!
Most sites get crawled for a measly 40 seconds or less. Then the site owner must wait for the bot to return again in order for it to find anything new about their entity or for their authority to grow. A year ago, some black hat SEOs I know showed me how they managed to do so much entity stacking, so much content mirroring, so much linking, embedding, and looping that the bot literally got stuck crawling their website in a 15-minute loop.
If the average website gets crawled for less than 40 seconds and then the bot crawls one site for 15 minutes, what does it think after? “Wow, that's a much bigger entity than I remember.”
It was seeing this duplicate content and ranking madness first hand that made me realize the bot loves a larger entity: more page clusters, more entity stacks, more stacked citations, and more content silos all interlinked together. It gives the bot massive content & substance to chew on. The entire purpose of the bot is to crawl, so getting it to crawl longer seems to be helpful!
Learning all this, I realized it was much faster to rank once I put links between all of my ecommerce stores. Linked together, they are stronger than they ever were ranking independently of each other.
At first, I was hesitant to link them up due to the products & services being different and wanting to eventually sell off certain stores. They are not the best clusters to go together, but surprisingly their SERP buoyancy was activated in a way I never expected, because I made it clear that one umbrella company owns them all. (Not even with schema markup, just NAP in the footer and map embeds.) This one common similarity transcended each site to help rank all (3). Later, I found I could talk about custom printed goods commonly between them for more topical relevance.
Now together, these sites help protect each other from major core algorithmic updates and changes from Google. It's a much larger target to disrupt, tank, and sink. Together, they soften the blow when ranks drop, and together they continue to climb and help each other rank despite selling different products & services, in different industries & niches!
Each site has its own tight & semantic content silos which help create a massive web of relevancy about the many topics and niches we now rank for. I'm not sure how tightly multiple clusters actually need to relate on a branch of clusters, if that makes sense? It seems to be forgiving so long as each individual cluster is high quality content.
Your website structure (blueprint / architecture) should be made up of stacks of content called silos. Silo doesn't mean a web of pages that runs ten links deep - don't do that. Keep page navigation depths wider & flatter rather than deeper. Silo just refers to the fact that there is indeed a page hierarchy, with all pages pointing to a central informational page or content page that acts as a topical navigational hub for the user and for the bot to branch off from.
Nobody explains SEO Silos better than Bruce Clay - so I'm not even going to try. Just understand it is important to plan out your page clusters and then stack / silo these topics on your website appropriately with thoughtful navigation, anchor text, and internal linking.
Tip #1: It takes a minimum of five pages to establish a relevant topic silo on any site.
Tip #2: Two internal links hold the same juice as an external link.
This is why it is so important to do keyword mapping, URL mapping, and to create an internal linking strategy to go along with your content silos.
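A silo is simple enough to model as a hub page plus the pages linked beneath it. The sketch below (the URLs are made up) checks Tip #1 by counting the silo's pages and measures how deep the click-path runs, so you can keep the structure wide and flat:

```python
# Toy model: a silo is a hub page plus the pages linked beneath it.
# URLs are hypothetical examples, not real pages.
silo = {
    "/party-supplies": ["/party-supplies/balloons",
                        "/party-supplies/banners",
                        "/party-supplies/tablecloths",
                        "/party-supplies/centerpieces"],
    "/party-supplies/balloons": ["/party-supplies/balloons/latex"],
}

def all_pages(silo):
    """Collect every unique page in the silo."""
    pages = set(silo)
    for children in silo.values():
        pages.update(children)
    return pages

def depth(silo, root):
    """Longest click-path from the hub. Keep this shallow and flat."""
    children = silo.get(root, [])
    return 1 + max((depth(silo, c) for c in children), default=0)

pages = all_pages(silo)
print(len(pages), "pages")  # Tip #1: aim for at least five
print("depth:", depth(silo, "/party-supplies"))
```

Sketching your silos as data before you build them makes it trivial to verify the five-page minimum and catch a structure drifting toward "ten links deep" before any content is written.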
Mirroring content silos on Social Media platforms and web 2.0 properties reinforces their structure and relevance. Stacking videos with keyword targets & optimized titles in a YouTube playlist and then linking each individual video back to the corresponding web page, with the video also embedded on that page, is a powerful way to reinforce the content silo structure and create a loop.
This mirroring action creates stacked keywords within a hyperlinked loop that reinforces the entity by utilizing a high authority domain property: YouTube. Mirroring content silos works on other social networks and free blogs as well.
Stacking images on Pinterest, in exact match named boards, mirrored to match your precise categories, products, services, and blog post topics is a powerful way to reinforce your collection of entities. (Your website.)
Creating albums on Facebook and mirroring with descriptions, links, and keywords reinforces your website's content silos and your entities.
Achieving a #1 SERP result happens because your on page entity is portrayed in the most relevant and useful way for the user query.
Second to that, the bot assesses the website and collection of entities vs. all off page citations of it to best understand where it belongs positioned in the World. Think of it just like BERT, except instead of words, every page cluster is being compared against every other cluster out there.
The website entity that owns / publishes the best content cluster or collection of entities gains priority, reputation, relevancy, and dominates the topical niche.
I have never built backlinks. Never once.
This might be a profound statement to make, but let me be clear: I am not saying link building isn't important or useful. I want to point out that it is just not AS IMPORTANT as it once was; it has become a micro topic of a much greater macro issue!
Macro vs. micro is a point I'm continuously preaching! SEOs have it all wrong right now! Getting your content aligned to the existing top 10 SERP results is what matters most and what will make the most impact quickest.
After the content alignment, then all of the link building, load time optimizations, image size reductions, minifying code, schema markup, and optimization extras stack above and beyond that baseline of importance.
Link building is a secondary step and factor. Links are the tiebreaking votes and decorations of authority - not what drives the needle when you want to rank about a specific topic!
Link building was once an SEO's primary focus. Now, diversely and intelligently stacking your entity will be the primary way to get search engines' attention. Link building has become a smaller part of your overall disambiguation effort.
Create content full of labeled sections and citations, and target a niche audience on every platform and vertical you can tolerate the effort to build on. It's easier to rise to the top of search engines with keywords and content alone! Obtaining quality links is actually much harder.
Despite what you might hear an SEO guru say, ranking target keywords is still very much possible without:
domain authority
page authority
link building
internal links
private blog network (PBN)
blog posts
press releases
social media engagement
optimized images
fast loading pages & hosting speeds
attractive website design
or even a great user experience.
Why is this? BERT makes it possible! I have been feeding this new algorithm the right keywords and NLP terms for my target audience since October 2019, when Google implemented a test on just 10% of all search results. Since then, getting my sites ranking on the first page has been faster and easier than ever before! This means the previous NLP model, Word2Vec, was still able to process and appreciate much of my content improvements accurately. BERT is simply going to expedite, expand, and solidify the future of SEO by cataloging and prioritizing every entity.
Many people will disagree and immediately argue with me over how important backlinks are or page speed or schema or whatever. They are missing my overarching point to be made. In fact, I agree all the other stuff is important too! It is the sequence in which search engine marketers focus and deploy their efforts that is entirely backwards and incorrect!
Get your content marketing into alignment first and then worry about your technical SEO. In a sundae, the ice cream is placed down before hot fudge, nuts, whipped cream, and the cherry are all placed! Stop skipping the foundation and complaining you don't rank for your keyword.
The first page comes so much easier if you plan, prioritize, and design your content first. Attack ranking in a contra competitive, backwards, unorthodox, reverse engineered, correlative kind of way.
If you are trying to rank, treat your piece of content like a doctoral thesis or text book. Include a linked table of contents for the user to click and deep dive into your content with. Honestly, this is less for the user and more for the bot. Hide it in an accordion tab if you wish to keep it minimal and prevent it from taking up too much real estate above the fold. (Mine is obnoxiously long right now and should be reduced in size.)
A "back to top" or "top" button is most useful for the user / reader if your web page or guide is extremely long. Implementing a mini on-page navigation enables long form content pages to become stacked content silos themselves.
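If you want to build that mini on-page navigation automatically, a minimal sketch looks like this. (The `slugify` and `build_toc` helpers are made up for illustration and assume plain-string headings; they are not part of any specific CMS or plugin.)

```python
import re

def slugify(heading):
    """Turn a heading into a URL-safe #anchor id, e.g. 'Keyword Research' -> 'keyword-research'."""
    return re.sub(r"[^a-z0-9]+", "-", heading.lower()).strip("-")

def build_toc(headings):
    """Render a linked table of contents as an HTML list of jump links to on-page anchors."""
    items = [f'<li><a href="#{slugify(h)}">{h}</a></li>' for h in headings]
    return "<ul>\n" + "\n".join(items) + "\n</ul>"

print(build_toc(["Keyword Research", "Content Silos & Clusters"]))
```

Each heading on the page would then carry the matching `id` attribute, so every table-of-contents entry doubles as a deep link the bot can treat as its own passage.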
Anchor Links, Jump Links, Deep Links - whatever you want to call them...these # sections or "passages" of content will soon be a major focus of optimization in the SEO industry. Don't be surprised if you see Passage SEO or Passage Optimization on a resume one day soon!
SEOs are going to want to understand a website as a collection of entities, written about in topical passages, which are all stacked within content silos, and then silos are positioned with clusters of related keywords to provide ranking buoyancy. Clusters, and silos, and stacks - OH MY!
Through the bot's eyes: each jump link topic is treated as its own individual, stand-alone page / snippet / passage that can rank as a result.
Top of silo pages are huge organic traffic farms and provide exceptional user experience by offering jumps into deeper content, products, services, or can deep link into a unique piece of content or guide.
Passages and quotes become internal link anchors and citations and should be used to drive traffic and engagement internally. More importantly, they can also be used as direct links for external traffic to arrive at specific parts of blog posts and landing pages.
Google is offering more "no click" answers & information than ever before. Your paragraph or sentence may be used in SERP because it believes you offer the most useful part of an answer. Expect to find Google ranking snippets, quotes, or passages from your web pages soon!
Be prepared for Passage Optimization: one single paragraph, phrase, or fact published on your page may rank exceptionally well while another part of the page is buried and never shown. Just as website owners and SEOs alike complain that Google rewrites their meta descriptions, expect Google to be making a lot more editorial decisions with content on the SERP pages to get the best CTR and provide the best user experience.
On page optimization experts will focus their scope even tighter and micro analyze written language, storytelling sequences, and optimize the conversion rate of the words on your page. Copywriting and content marketing is emerging as the art of stacking relevance & value at epic proportions!
The more often a website is able to define, label, and compare itself: topic by topic, paragraph by paragraph, entity by entity, the more disambiguation and precise direction it provides for the bot for rankings. At a certain saturation level, every keyword and every SERP has a tipping point and can be won because of great contextual content and thousands of stacked citations reinforcing the decision to make it rank as the top page!
Ranking a keyword on Google is now the process of:
optimizing your written content about a specific entity
reinforcing the entity through disambiguation
creating relevancy by clustering pages & creating topical content silos.
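As a toy illustration of the clustering step above, here is a sketch that groups long tail keywords under a shared seed term. (The substring-matching rule, the `cluster_keywords` helper, and the sample keywords are all assumptions for illustration; real tools cluster by semantic similarity, not substrings.)

```python
from collections import defaultdict

def cluster_keywords(keywords, seeds):
    """Group long tail keywords under the first seed term they contain."""
    clusters = defaultdict(list)
    for kw in keywords:
        for seed in seeds:
            if seed in kw:
                clusters[seed].append(kw)
                break
    return dict(clusters)

kws = ["buy wedding tent", "wedding tent rental cost", "party chair covers"]
print(cluster_keywords(kws, ["wedding tent", "chair covers"]))
```

Each resulting cluster would then map to one silo: a top-of-silo page targeting the seed term, with supporting pages targeting the long tail variants beneath it.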
The more complete, thorough, and accurate the engineering you do - the more trust and authority you will gain from the bot. Topical authority polarizes your site to rank higher for relevant search queries you didn't even design pages or write content specifically for!
Example: Amazon makes up 50% of all online sales - this is also why it shows up in nearly every shopping / ecommerce SERP that there is. In order for you to outrank Amazon, you must have the stronger entity.
Forget the Google Keyword Planner when you want to find keywords! I wrote an article Keyword Tools for Google that outlines every tool available as of October 2020. This guide includes both paid research tools and FREE keyword tools. It highlights those tools that give Google search volumes data for FREE. In addition, it has high quality keyword resources to make sure you can find all of the long tail keywords, related keywords, synonyms, and right keywords (NLP terms) that Google loves! Having all of the best and semantic keywords reinforces your positioning, drives rankings, and gets your site profitable online.
Keyword research always starts with a single seed keyword.
Prefixes and suffixes are added to this root term or keyword phrase, expanding its word count; once the phrase reaches 3+ words, it is considered a long tail keyword.
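That prefix/suffix expansion can be sketched in a few lines of Python. (The `expand_seed` helper and the sample word lists are made up for illustration; a real tool would pull modifiers from search suggestion data.)

```python
def expand_seed(seed, prefixes, suffixes):
    """Expand a seed keyword into variants by attaching prefixes and suffixes,
    keeping only phrases of 3+ words (the long tail threshold)."""
    variants = [f"{p} {seed}" for p in prefixes] + [f"{seed} {s}" for s in suffixes]
    return [v for v in variants if len(v.split()) >= 3]

print(expand_seed("wedding tent", ["buy", "cheap"], ["rental", "for sale"]))
```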
No two pages should target the same primary keywords or long tail keywords otherwise your pages may cannibalize each other in the rankings. If cannibalization does occur canonical tags or internal links can help fix this issue. This is a common occurrence for ecommerce store owners who are trying to get their category pages ranked, but have a single product outranking an entire collection.
This is why serious SEOs are now spending time researching clusters and designing content silos that are semantically perfect - to prevent contextual complications like this from happening. Planning prevents the bot from ranking something undesired!
Seed keywords will be farmed into long tail keywords. Again, my keyword research tools article is the best place to go for tools. Take a look at Long Tail Pro if you need keyword ideas. It has been my old faithful for spitting out keywords and analyzing them for competitiveness with its famous KC (keyword competitiveness) score. I also love the +20 feature to push out your keywords around specific words - this gives you control over your keyword farming vs. getting random garbage you must spend time sifting and filtering through.
Correlation SEO is the fastest way to rank a keyword because it pits your website against other top ranking sites to identify patterns, trends, and gaps. "Correlation doesn't equal causation," but it is the most helpful data driven process we can utilize to rank. Years ago, ranking for a keyword on Google seemed impossible - that is, until I found Surfer SEO. It is hands down the best correlative SEO tool & investment I have made for ranking higher on Google for a particular keyword.
Using a keyword rank tracker! My favorite keyword research tool, Long Tail Pro, has a keyword rank tracking tool built into the dashboard. (30% Off with my LTP affiliate coupon.) It’s not as elaborate as others, since it is not a primary feature of the application; there are other keyword trackers that focus entirely on tracking. It’s a great bonus to have, because if you are going to get better at keyword research, then you will need to be monitoring your keywords’ positions with a paid tool! It is the only way to monitor success.
No budget? No problem! Small SEO Tools has a FREE Keyword Rank Checker that will let you run your domain against 10 keyword phrases at a time. It also lets you select the country you are searching from, to help with geolocation variations in ranking due to customized search. This is a great way to step into your clients' shoes and see rankings the way searchers in other countries do, or to check international / multi-country websites.
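Whichever checker you use, once you export an ordered list of SERP URLs for a query, computing your own position is simple. This `find_rank` helper is a hypothetical sketch, not part of any tool mentioned here:

```python
from urllib.parse import urlparse

def find_rank(serp_urls, domain):
    """Return the 1-based position of the first result on the given domain, or None if absent."""
    for pos, url in enumerate(serp_urls, start=1):
        if urlparse(url).netloc.endswith(domain):
            return pos
    return None

serp = ["https://example.com/guide", "https://mysite.com/tents", "https://other.com/"]
print(find_rank(serp, "mysite.com"))  # → 2
```

Running this over the same keyword list week after week gives you the position history that paid trackers chart for you.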
Thanks for reading!