Google Algorithm LEAK - (DON'T TRUST GOOGLE)
Summary
TL;DR: In this video, the host discusses a significant leak of Google's search algorithm documentation, shared with SEO expert Rand Fishkin. The document confirms long-suspected SEO theories, such as the existence of a 'sandbox' for new websites, site authority's impact on rankings, and the use of click data to adjust rankings. It also reveals Google's use of Chrome's clickstream data and the prioritization of certain domains during critical events. The video highlights the importance of building site authority and credibility, and the potential challenges small websites face in ranking, urging viewers to download and analyze the leaked document for deeper insights.
Takeaways
- 📜 A massive Google search algorithm document was leaked, revealing internal operations that were previously undisclosed.
- 🔍 The existence of a 'sandbox' for new websites was confirmed, where they are treated differently until they gain credibility, contrary to Google's past denials.
- 🏆 The concept of 'site authority' is recognized internally at Google, indicating the overall authority of a site and its impact on search rankings.
- 👀 Click-through rate (CTR) and user interaction data are used to adjust rankings, a practice Google had previously denied.
- 🌐 Google Chrome's clickstream data is utilized in search rankings, despite Google's previous statements to the contrary.
- 🔗 The importance of links is detailed, including freshness, the tier of linking pages, and detection of link spam.
- 🛑 Google has the ability to manually adjust rankings for critical events, such as during the COVID-19 pandemic or elections, to prioritize credible information.
- 👤 Google tracks and measures authorships and entities, suggesting that building a reputation as an author can positively impact content ranking.
- 🏠 Small websites may struggle to rank despite SEO best practices, highlighting the need to build site authority and credibility.
- 📉 Various demotion factors were revealed, such as anchor mismatch, exact match domain devaluation, product review quality, and SERP demotion based on user dissatisfaction.
Q & A
What is the significance of the leaked Google search algorithm document?
-The leaked document is significant because it confirms many aspects of Google's search engine operations that SEOs have long speculated about but were previously denied by Google. It provides insights into how Google treats new websites, evaluates links, uses click data, and more.
What is the 'sandbox' mentioned in the document?
-The 'sandbox' is a term used to describe how Google treats new websites differently until they earn a specific amount of credibility. This means that results for new websites may not appear immediately in search rankings.
What does the document reveal about Google's stance on domain authority?
-Contrary to Google's public statements, the document shows that they do measure an overall 'site authority' which impacts search rankings. This confirms what many SEOs have long believed.
How does click-through rate (CTR) influence search rankings according to the leaked document?
-The document reveals that Google uses systems like 'NavBoost' and 'Glue' that utilize click data to adjust rankings. They analyze user behavior metrics such as good clicks, bad clicks, and long clicks, which play a crucial role in the search algorithm.
What role does Google Chrome data play in search rankings?
-Despite previous denials, the document shows that Google uses Chrome's clickstream data, including user interaction, site visits, views, and engagement duration, to influence the site quality score and the generation of site links in search results.
Why are links important in Google's search algorithm?
-Links are crucial as Google evaluates their freshness and the specific tier of the linking page. They also detect link spam velocity, which can help them counteract negative SEO attacks.
What is special priority and whitelisting in the context of Google search?
-Google prioritizes specific domains for certain searches, such as during critical events like COVID-19 or elections, to ensure that trustworthy sites appear at the top. This manual adjustment of rankings is aimed at ensuring the reliability of information.
How does Google track authorships and entities?
-Google explicitly tracks and measures authorships and entities. Building a reputation as an author and ensuring proper markup can positively impact the ranking of content, as Google treats recognized authors as entities in their system.
What impact does the leaked document suggest Google might have on small websites?
-The document suggests that small, personal sites may struggle to rank despite following SEO best practices. It is important for these sites to focus on building their site authority and credibility.
What are some specific demotions that can impact a site's ranking?
-Demotions can occur due to factors like anchor mismatch (where the link doesn't match the target site), exact match domain issues, product review quality, and potential user dissatisfaction measured by click data.
Outlines
🔍 Google Search Algorithm Leak Analysis
This paragraph discusses a significant leak of Google's search algorithm, which was shared with Rand Fishkin, a co-founder of Moz. The document is massive and confirms many SEO theories previously denied by Google. It reveals the existence of a 'sandbox' effect for new websites, the importance of domain authority, and the use of click data to influence search rankings. It also touches on the use of Chrome's clickstream data and the detection of link spam. The speaker recommends downloading the document and using AI tools for in-depth analysis.
🚫 Impact of Google's Algorithm on Small Websites
The second paragraph delves into the potential negative impact of Google's search algorithm on small websites. Despite following SEO best practices, small sites may struggle to rank. The document suggests that Google may treat small personal sites differently, and the speaker emphasizes the importance of building site authority and credibility. It also covers various demotion factors such as anchor mismatch, the devaluation of exact match domains, and the recent update on product review quality. The paragraph concludes with a mention of SERP demotion based on user dissatisfaction, measured through click tracking.
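The speaker's suggested workflow is to download the leaked document, convert it to a PDF, and feed it to ChatGPT or Gemini. Because the document is very large, it may exceed an AI tool's context window, so splitting the text into overlapping pieces first can help. Below is a minimal, illustrative sketch of that chunking step; the function name, chunk sizes, and overlap are assumptions for illustration, not anything from the video or the leak itself.

```python
def chunk_text(text: str, chunk_size: int = 8000, overlap: int = 500) -> list[str]:
    """Split a large text dump into chunks of at most `chunk_size`
    characters, overlapping by `overlap` characters so that context
    is not lost at chunk boundaries when each piece is analyzed
    separately by an AI tool."""
    if chunk_size <= overlap:
        raise ValueError("chunk_size must be larger than overlap")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        # Advance by less than a full chunk so consecutive
        # chunks share `overlap` characters.
        start += chunk_size - overlap
    return chunks
```

Each chunk can then be pasted into a prompt one at a time, asking the tool to summarize or extract ranking-related attributes from that slice of the document.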
Keywords
💡Google search algorithm leak
💡Sandbox
💡Domain Authority
💡CTR (Click-Through Rate)
💡Chrome data
💡Links and Link Spam
💡Priority and Whitelisting
💡Authorships and Entities
💡Impact on small websites
💡Demotions
💡Product reviews
Highlights
A massive document revealing Google's internal search engine operations was leaked and shared with Rand Fishkin, co-founder of Moz.
The document confirms many SEO theories previously denied by Google, such as the existence of a 'sandbox' for new websites.
Google treats new websites differently until they earn credibility, contrary to previous statements by John Mueller.
The document reveals a metric called 'site authority', confirming Google measures the overall authority of a site.
Click-through rate (CTR) matters and influences search rankings, as shown by systems like 'NavBoost' and 'Glue'.
Google uses Chrome's clickstream data in search rankings, despite previous denials.
User interaction, such as site visits and engagement duration, influences the site quality score.
Google evaluates links based on freshness and the tier of the linking page, and can detect link spam.
Google can manually adjust rankings to ensure the reliability of information during critical events, such as the COVID-19 pandemic.
Google tracks and measures authorships and entities, suggesting that building a reputation as an author can impact rankings.
Small websites may struggle to rank despite SEO best practices, highlighting the importance of building site authority.
Specific demotions can impact a site's ranking, such as anchor mismatch and exact match domain issues.
Product review quality and trustworthiness can affect rankings, especially after recent updates.
Google tracks user dissatisfaction with search results, potentially demoting pages based on click data.
The document's analysis can be enhanced by using tools like ChatGPT or Gemini to delve deeper into its content.
The document's insights are significant for SEO professionals and can help in understanding and adapting to Google's search algorithm.
Transcripts
What's up guys. Not sure if you've heard, but there's been a pretty big Google search algorithm leak. Someone shared it with Rand Fishkin, one of the co-founders of Moz and one of the OGs in the space. What was shared with him looks to be a massive document revealing Google's internal search engine operations. Here's the document right here; it's absolutely massive. I'm going to link to a bunch of blog posts that analyze it extremely well, but if you're interested in taking a look yourself, you'll have the link to that too. I recommend downloading this version, turning it into a PDF, and then feeding it to either ChatGPT or Gemini; they'll go really in depth for you and help you analyze all of it. I've been using that approach and it's super helpful. But anyway, let's talk about some of the things that stood out to me.
The main thing is that there are so many things SEOs have been talking about for some time, things we believe exist based on experiments and different tests that we run, that Google has always said don't exist. Luckily, these documents now confirm a lot of those things, plus a bunch of others. So let's go through what stood out to me.
The first is the sandbox. There's a section in the document that proves Google treats new websites differently until they earn a specific amount of credibility. Here's John Mueller saying there is no sandbox; good for him, good for us. So yes, there is a sandbox. I'm not exactly sure how long it lasts; people say three months, six months. But if you have a new website, it's very important to know that results will not come right away, so be patient. Just the fact that there is a sandbox will, I think, help a lot of new website owners.
Next, domain authority. Here's Gary saying they don't have an overall domain authority, but within the internal documents there is a metric called 'site authority' which seems to be exactly that. So it does mean that Google does measure the overall authority of a site, and that does seem to impact rankings, which we also already knew.
CTR matters. Again, another thing we've known for a while. They've always denied using click data to influence search rankings, but the leaked documents show there are systems like NavBoost and Glue that use click data to adjust rankings. They basically look at 13 months of click data to boost or lower rankings based on user behavior metrics; good clicks, bad clicks, and long clicks are all considered. So this proves that user interaction within the search results plays a crucial role inside the Google search algorithm, which we also already knew.
The next thing is Chrome data inside the rankings. Google has denied using Google Chrome for their ranking algorithms, but the leaked documents reveal that Google does use Chrome's clickstream data in search rankings. That includes things like user interaction, site visits, views, and engagement duration. All of this influences the site quality score and also the generation of sitelinks within the search results. So again, it shows that Chrome data is an important factor in how search results are determined.
Next, links are very important. Again, we've known this for quite a while, but the documents do go into detail about how Google evaluates these links: freshness and the specific tier of the linking page. They also seem to detect link spam velocity. So if you are, for example, running some type of negative SEO attack on a website, Google can see the amount of link spam being sent to that site, and it can detect and probably nullify the attack, which is kind of interesting but also makes a lot of sense.
Now, some things I was not expecting: special priority and whitelisting of specific domains. It does seem that Google prioritizes specific domains for certain searches. For example, during COVID-19 they prioritized COVID-related information from credible sources, and the same goes for election-related queries, ensuring that trustworthy sites appear at the top. This shows that Google can manually adjust rankings to ensure the reliability of information during critical events, but it's also kind of scary how easily they can manipulate specific searches.
Next, authors and entities. Google explicitly tracks and measures authorships and entities. I wasn't really sure how they could do that at scale, but this means that building a reputation as an author and ensuring you have proper markup will also positively impact your rankings. Google identifies authors and treats them as entities in the system, so having a recognized author can boost the credibility and ranking of your content.
Next, and this one is also quite scary: the impact on small websites. It does seem that Google might in some way be treating small websites differently. The documents suggest that small personal sites can struggle to rank despite following the same SEO best practices that massive sites follow. So it's very important to focus on building your site's authority and credibility, which, again, we also already knew.
Then we have demotions; this one's kind of interesting. There are specific demotions that can impact your site's ranking. For example, anchor mismatch: if a link doesn't match the target site it's linking to, that link is demoted. Exact match domains: we've known this for a while; they don't receive as much value as they historically did. Back in the early days you could just get an exact match domain and rank for whatever you'd like. Product reviews: this one's probably related to the recent update around the quality and trustworthiness of product reviews. And then SERP demotion, which is interesting: potential user dissatisfaction with a page, likely measured by clicks. So again, they're tracking clicks; they know whether people like a specific result, whether they click on it, and how long they engage with it. If people don't like a result, it's probably going to get demoted within the SERPs.
There's so much more, guys; I've barely covered the main things. These are just the points that stood out to me. I'm going to add links to all the resources in the description. I highly recommend downloading the document, turning it into a PDF, and then adding it to ChatGPT or Gemini; it's really going to help you with the analysis. Let me know if you have any comments or questions, and I'll see you in the next one.