It’s 2024. Here’s how Google uses AI in Search — and what we learnt from the DOJ vs Google trial.
Confused about all this AI stuff in Google Search? I don’t blame you. It’s confusing. So to set things straight, I’m clarifying things for you — in the simplest terms possible.
In this article, thanks to insights gleaned from the DOJ vs Google trial, I cover Google's AI-driven learning models, how they affect your business in the search results, and why they are so different from how we approached SEO in the past.
NB: If you’re a business owner with a website, please know that SEO has changed significantly in the last couple of years. A lot of what used to work no longer works. So, if you hire an SEO, they must stay on top of these changes and do the right thing for you. Or, if you’ve had an SEO company focus on link building and placing clumsy-looking keywords throughout your site, you might want to take note, or call me.
Search has changed. And we need to keep up.
First, what drives Google’s search results?
Google uses core algorithms. (That bit’s not new)
These core algorithms rank websites or pages based on their relevancy to a search query. They determine which web pages will appear at the top of your search results and which will be buried somewhere in Hell (aka anywhere not on page 1).
Next, deep learning is applied.
Deep learning is a method of artificial intelligence (AI) that teaches computers to process data similar to the way that the human brain does. (And this stuff is new-ish, like 5 or 6 years new.)
NB: This information recently came to light with the US DOJ vs Google trial, where Pandu Nayak, vice president of Google Search, explained how Google uses AI in Search.
The main three deep learning models Google uses in rankings are:
· RankBrain
· DeepRank
· RankEmbed BERT.
Now, hang with me; I won’t get too technical – I promise.
What does each learning model do?
· RankBrain looks at the top 20 or 30 documents the core algorithms deliver and ‘ranks’ them again.
· DeepRank is a version of BERT (more on BERT below). It delves deeper into understanding natural language, refining search results based on user intent and context.
· RankEmbed BERT is an embedding-based model that helps with ranking tasks. The ‘BERT’ bit refers to its ability to contextualise the meaning of a word using the words on both sides of it.
All three learning models are trained partly on click and query data (what people type into Google and how they behave on the screen when the search results are delivered). ‘How they behave’ is stuff like clicking, mouse hovers, time on page and pogo-sticking – that’s when users jump back and forth until they find what they want. It’s all taken into account by these models. Additionally, ratings from quality raters* also train these algos; those assessments include E-E-A-T (Experience, Expertise, Authoritativeness, and Trustworthiness).
(*If you’re not familiar already, I recommend you learn about Google’s Quality Raters Guidelines)
So, these AI systems:
1. analyse these signals from users
2. turn them into algorithms
3. then use those algorithms to rinse and repeat, if you like.
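To make that loop a bit more concrete, here’s a toy sketch of my own (not Google’s actual code – the real systems blend hundreds of signals, and every name and number below is hypothetical). It shows the basic idea: results arrive with a relevance score, user clicks are observed, and the ordering is adjusted accordingly.

```python
# Toy illustration of a click-informed re-ranking step.
# All field names, weights and numbers are hypothetical -
# this is NOT how Google's systems are actually built.

def rerank(results, clicks):
    """Re-order results by blending the original relevance
    score with the observed click-through rate."""
    total = max(clicks.get("_total", 1), 1)

    def score(r):
        ctr = clicks.get(r["url"], 0) / total
        return 0.7 * r["relevance"] + 0.3 * ctr  # arbitrary blend

    return sorted(results, key=score, reverse=True)

# The core algorithms thought a.com was more relevant...
results = [
    {"url": "a.com", "relevance": 0.9},
    {"url": "b.com", "relevance": 0.8},
]
# ...but out of 10 searchers, 9 clicked b.com.
clicks = {"a.com": 1, "b.com": 9, "_total": 10}

for r in rerank(results, clicks):
    print(r["url"])
```

Run it and b.com climbs above a.com: the users’ behaviour has overridden the original ordering, which is the ‘rinse and repeat’ idea in miniature.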
It’s an ongoing learning process, fine-tuning along the way. This is important to note because, before AI, Search was very literal and limited: it relied heavily on matching keywords and having loads of links.
And that’s why certain SEO agencies that take your money, buy links and pepper your pages with clumsily placed keywords are wasting your time: Search simply doesn’t work like that anymore.
But I digress.
I’ve mentioned BERT a bit...
Historically speaking, BERT is super important in the world of Search. BERT is a language model introduced in October 2018 by researchers at Google. It stands for Bidirectional Encoder Representations from Transformers. And it was groundbreaking for its ability to analyse the words on either side of a given word at once (that’s the bidirectional bit).
BERT helped Google enormously by understanding the nuances of natural language better.
A lot better.
Within a couple of years, BERT had become a ubiquitous baseline for Natural Language Processing experiments throughout the world.
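Why does reading both sides matter? Here’s a deliberately simple sketch of my own (this is not BERT – real models learn context from data rather than hand-written clue lists, and every name here is made up). The word ‘bank’ is ambiguous, and a model reading strictly left-to-right can’t resolve it until it’s too late; looking at the right-hand context settles it.

```python
# Toy illustration of bidirectional context. This is NOT BERT:
# real models learn these associations from data. The clue words
# below are hand-picked, hypothetical examples.

RIGHT_CLUES = {
    "river": "riverbank", "fishing": "riverbank",
    "withdraw": "financial", "deposit": "financial",
}

def guess_sense(words, target="bank"):
    """Disambiguate `target` by scanning the words to its RIGHT.
    A purely left-to-right reading of 'i sat on the bank' could
    not choose a sense - both fit equally well."""
    i = words.index(target)
    for w in words[i + 1:]:          # right-hand context
        if w in RIGHT_CLUES:
            return RIGHT_CLUES[w]
    return "unknown"

print(guess_sense("i sat on the bank to go fishing".split()))
print(guess_sense("i went to the bank to withdraw cash".split()))
```

The left context is identical-ish in both sentences; only the words after ‘bank’ distinguish the riverbank from the financial institution. That, in a nutshell, is what bidirectionality buys you.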
Wanna see BERT in action?
QUERY 1: “2019 Brazil Traveler to USA need a Visa”
Pre-BERT search result: WASHINGTON POST: US citizens can travel to Brazil without the red tape of a visa...
(This is just a news article. It doesn’t help a Brazilian traveller get a visa quickly.)
Post-BERT search result: Tourism & Visitor | US Embassy & Consulates in Brazil: In general, tourists travelling to the United States require valid B-2 visas...
(This gives more specific info on what a Brazilian traveller will need)
QUERY 2: “Can you get medicine for someone pharmacy”
Pre-BERT search result: MedlinePlus – getting a prescription filled – your healthcare provider may give you a prescription in… Writing a paper prescription that you take to a local pharmacy…
(Nope, Google. That was lousy. You didn’t get it at all.)
Post-BERT search result: HHS.gov - Can a patient have a friend or family member pick up a prescription
(Bingo! Well done, Google. You clever thing, BERT!)
What does this mean for SEO?
The introduction of these language models has emphasised to writers and SEOs the importance of creating high-quality content relevant to what users are searching for.
Instead of focusing on keywords (although they are still important) and being ridiculously pedantic about their order, we must focus on answering the user’s intent. We need to be natural and informative, not robotic. Focus your content on the user. What do they really want to get from the page? What question brought them to the page? What were they concerned about?
Make your content comprehensive enough to fulfil their needs — but be concise and informative.
But I repeat myself.
Does this mean everything changes?
No. Not at all.
Your pages still need to be search optimised to be found on Google. Remember, the first step in Search involves Google’s core algorithms (which consist of hundreds of factors) ranking your webpages. So, things like keywords, metadata, good links, quality content and technical SEO still matter. Now, though, there’s more flexibility and intelligence, allowing writers to write more naturally, like the living, breathing human beings we are.
I hope this has helped. In my next post, I will explain how I optimise my pages for Google’s AI models in 2024.
Oh, and it’s not over. Changes will keep coming — including the really big one — SGE. But let’s take it one step at a time. Good luck!
So who am I, anyway?
Hi, I’m Abi White. I’m an SEO consultant, SEO copywriter and conversion copywriter with over 14 years of online experience. My online career started by creating and directing one of Australia's first pure-play international e-commerce businesses, abi and joseph.
These days, I write for Colgate-Palmolive and their associated brands, e.g. Elmex, via my Paris-based agent. I also freelance for SaaS, medical and dental. From 2021, I was the senior copywriter and SEO for Retail Express by Maropost, heading Australia and Canada before pushing into other international markets. I’ve been the senior copywriter for a Melbourne digital agency for five years, and online marketing manager and SEO for a chain of Perth dental practices.
Read more about me here.
Need help with your SEO, content strategy or copywriting?
I can help. I keep myself up-to-date with Google’s updates and have completely adapted my work to their new search systems and models. Get in touch, and let’s see how I can help you.