Google Phantom / Core Update History
What is the Phantom Update?
The term “Phantom update” was coined by Glenn Gabe back in May 2015. It refers to a specific update in which Google changed how it assessed the quality of a site. The terms “core ranking update”, “the quality update” and “the Phantom update” are all used interchangeably.
Google did confirm the first update in May 2015: http://searchengineland.com/the-quality-update-google-confirms-changing-how-quality-is-assessed-resulting-in-rankings-shake-up-221118 and a subsequent update in January 2016: https://www.seroundtable.com/google-core-ranking-21460.html
There have been a number of unconfirmed updates / refreshes since that point in time, all of which share very similar characteristics and are listed in more detail below.
From our understanding, it appears as if the algorithm needs to be refreshed before any noticeable changes, positive or negative, are realised. It is not an algorithm that currently runs in real time. That doesn’t mean you have to be impacted by a previous update to see movement. Sites that make changes and address issues can see positive movement during subsequent core updates.
The Phantom update affects the whole domain and sites impacted often see ranking changes across the board. However, it can impact URLs in different ways. For example, popular landing pages can all have high quality content that serves the user well, but if an intrusive popup is built into the site template, or ads are excessive, it could trigger the Phantom algorithm.
Which sites are affected?
A large proportion of sites hit by Phantom were previously hit by the Panda algorithm, which led many people to believe that this was the new real-time Panda. However, Phantom is a separate algorithm altogether.
Phantom impacts sites with heavy use of intrusive advertising and popups.
Phantom impacts sites with a poor UX, caused by broken user interfaces, excessive pagination, infinite scrolling and poor site design.
Phantom impacts sites with low quality content, including pages with thin content, duplicate content and sites that block certain content from appearing on mobile.
When does it update?
The first three updates to the Phantom algorithm were made in the month of May, in three consecutive years.
However, updates now appear to arrive at irregular intervals and can take approximately a month to roll out fully (the resulting fluctuations are known as tremors).
What is the difference between Phantom and Panda?
The Panda algorithm focuses on content quality, whereas Phantom focuses on barriers to user engagement.
For example, a site with great, original and comprehensive content would not be flagged by the Panda algorithm. However, if it simultaneously used deceptive means to disguise adverts or spam the user with intrusive popups, it would be flagged by the Phantom algorithm.
Sites that have previously been affected by Panda often get hit by Phantom because poor content quality and poor user engagement commonly coexist on low-quality sites. For example, a scraper site is also likely to flood its pages with ads, leading to poor user engagement.
May 20th, 2013 (PHANTOM 1)
Four large websites were hit by the first core update.
The first site was linking to too many sites with followed links (where they should have been nofollowed). This site was an authority in its space.
Two of the sites were sister sites that were crosslinking to each other using exact match anchor text.
All four sites had pockets of spammy links in their profiles, including directories, comment spam and spun articles.
Two of the sites were scraping content and publishing to fill out their pages.
All four sites had already been hit by Panda in the past.
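As a reference for the followed-link issue above, a minimal HTML sketch (the partner URL is purely illustrative):

```html
<!-- A paid or partner link that should not pass authority:
     rel="nofollow" tells Google not to treat it as an endorsement
     (the URL is hypothetical) -->
<a href="https://partner.example.com/" rel="nofollow">Partner site</a>

<!-- The same link left followed passes PageRank, which is what
     caused problems for the sites described above -->
<a href="https://partner.example.com/">Partner site</a>
```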
May 23rd, 2014 (PHANTOM 2)
Six of the 27 sites initially analysed after being hit by Phantom recovered in May. Five of these actively made changes after being hit.
Changes made to these sites addressed low-quality content, such as thin pages.
May 11th, 2015
Phantom has nothing to do with the mobile-friendly update.
The most recent update focused on sites with content quality problems.
Many sites analysed that were hit by this update were again hit by Panda in the past.
Tag pages were hit hard by this Phantom update; these pages provided large lists of links pointing to other pages on the domain, often using infinite scroll.
Sites using popups that triggered automatically on landing were hit.
Pages using stacked videos were hit.
Sites with syndication errors were hit.
Poor site design, a large number of links on one page and small fonts were factors.
Low quality comments (old, with no relevancy to the page content) were factors.
Lots of ad-heavy pages and adverts blending with on-page style were factors.
The sites impacted had huge backlink profiles, containing some followed links from partner sites.
June 5th, 2015
Google confirms Phantom is not Panda or Penguin and that it addresses “quality signals”.
Both Phantom and Panda appear to target similar problems (content).
Phantom changed how Google’s core ranking algorithm assesses these “quality signals”.
Google confirms that Phantom is a page-level algorithm but evidence suggests it is site-wide.
Phantom recoveries are achieved by addressing traditional Panda issues relating to content.
Impacted sites that utilised infinite scroll had a huge number of thin pages.
Directory sites were hit with thin pages and lots of on-page ads. All links on these directories were also followed.
Pages that covered their subject in detail and had great user engagement experienced a partial recovery.
Forums that provided complex and accurate responses to queries experienced a rankings boost.
Ultra-thin local listing pages experienced a rankings drop.
July 17th, 2015
Phantom tremors are being detected, suggesting that the algorithm is being tweaked.
Phantom could be the real-time Panda algorithm as every site analysed had content quality problems that would have been picked up by Panda.
December 2nd, 2015 (PHANTOM 3)
Phantom affects rich snippets in the SERPs: sites hit by Phantom lost their rich snippets, while recovering sites regained them.
Pure Spam manual action penalties were issued on the same day as a Phantom algorithm tremor (28/11/15).
Ad deception is a Phantom problem: impacted sites blended their ads in with the page content and navigation. Google mentioned ad deception in a newly released guideline around this time.
Clumsy user experience, including bad interfaces, are a Phantom problem too, particularly infinite scrolling and overlays that are hard to manoeuvre.
Excessive pagination is a Phantom problem, with sites that make the user jump through multiple pages in order to access the content badly hit.
Low quality and disorganised supplementary content is a Phantom problem. For example, pages with scattered links and lists in the sidebars – bad for UX.
Popups and prompts that appear as you scroll are a Phantom problem.
Sites that successfully recovered from Phantom added author information to improve their content quality.
Use Fetch and Render in GSC to ensure on-page ads are not rendering differently for search engine bots than for users.
Some sites recovered by ensuring their mobile and desktop versions were properly connected using rel=canonical and rel=alternate.
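The desktop/mobile connection described above can be sketched as follows, assuming a site that serves mobile users from separate m-dot URLs (all URLs are illustrative):

```html
<!-- On the desktop page (https://www.example.com/page):
     point search engines at the mobile equivalent -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the mobile page (https://m.example.com/page):
     canonicalise back to the desktop URL so both versions
     are treated as one document -->
<link rel="canonical" href="https://www.example.com/page">
```

The annotations are bidirectional: the rel=alternate on the desktop page and the rel=canonical on the mobile page must point at each other for Google to consolidate signals correctly.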
August 29th, 2016
Phantom impacted sites that obstructed the user experience, for example by preventing the user from using the back button to leave the site, or by inhibiting the user with excessive popups.
Phantom impacted sites with aggressive advertising that pushed the main content down the page or hid it completely.
Sites with rendering issues were impacted by Phantom.
Deceptive adverts woven into the main content and forced downloads were present on sites affected by Phantom.
Phantom impacted sites with thin content, leading to very low dwell time.
November 16th, 2016
Sites affected by this core update had mobile problems relating to bad UX.
UX problems included popups, interstitials, rendering issues and broken UI.
December 14th, 2016
The switch to a mobile-first index means that sites blocking their desktop content from appearing on mobile are more likely to be hit by Phantom.
Sites that cleaned up their mobile UX were more likely to experience a recovery from Phantom.
February 5th, 2017
Google’s Quality Rater Guidelines were tweeted out by Gary Illyes right after the algorithm update; they reiterate the danger of misleading advertising, suggesting that it is an important algorithmic factor.
Relevancy for search queries greatly increased after this update, leading to sites with poor relevancy losing rankings.
Sites with confusing navigation and broken interfaces were hit by this most recent update.
If rich snippets were lost during this update, it is likely that a site was hit by Phantom.
Sites that removed thin pages from the index saw a recovery from this update.
Sites that enhanced their mobile UX saw improvements in rankings from this update.
Sites that fixed technical SEO problems relating to canonicalization, robots.txt and meta robots that were causing quality problems saw recoveries from this update.
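A sketch of the kind of meta robots and canonicalization fixes referred to above (page names and URLs are illustrative, not taken from any audited site):

```html
<!-- On a thin or duplicate page (e.g. a tag page) that should be
     dropped from the index while its links remain crawlable: -->
<meta name="robots" content="noindex, follow">

<!-- On a duplicate variant (e.g. a URL with tracking parameters),
     point search engines at the preferred version instead: -->
<link rel="canonical" href="https://www.example.com/category/widgets/">
```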
Below is a checklist of key issues that a website can audit to identify risk factors associated with the Phantom update.
Targeting keywords that are relevant
Thin pages being indexed
Poor desktop & mobile UX
Technical SEO problems affecting site quality
Perform crawl analysis to highlight any quality problems
Run a Panda Report to analyse URLs with most volatility
Large number of external followed links
Unnatural crosslinking using exact match anchors
Unnatural links in profile
Publishing scraped content
Previous Panda penalty
Tag pages used
Automatic popups used? Intrusive on mobile
Poor site design
Low quality comments
Advertising issues? Advert blending
Content quality problems
Use of Infinite Scrolling
Disorganised & low-quality supplementary content
Author information used
Do dropped landing pages have low-quality content?
Interstitial issues on mobile
Rendering issues on mobile
Broken mobile UI
Content being blocked on mobile