Google Algorithm Update History

Learn how Google's algorithm has developed over time, what drove changes, and what it means for search and your own web content.

Intro

The Google algorithm is constantly changing. In 2018 alone, Google ran 15,096 live traffic experiments and launched 3,234 updates to its search algorithm.

 

Three variations of google search result layouts being tested with users.


Not all updates have significant impact on the search results. This page covers the top 150 updates to how search results function from 2000-2019. Updates are a blend of changes to:

 

  • Algorithms
  • Indexation
  • Data (aka Data Refreshes)
  • Google Search UIs
  • Webmaster Tools
  • Changes to ranking factors and signals

Before we get into the timeline of individual Google updates, it's going to be helpful to define a handful of things upfront for any SEO newbies out there:

Google’s Core Algorithm

SEO experts, writers, and audiences will often refer to “Google’s Core Algorithm” as though it is a single item. In reality, Google’s Core Algorithm is made up of millions of smaller algorithms that all work together to surface the best possible search results to users. What we mean when we say “Google’s Core Algorithm” is the set of algorithms that are applied to every single search, which are no longer considered experimental, and which are stable enough to run consistently without requiring significant changes.

Google Panda (2011-2016)

The Panda algorithm focused on removing low quality content from search by reviewing on-page content itself. It targeted thin content, content dominated by ads, and poor quality content (spelling/grammar mistakes), while rewarding unique content. Google Panda was updated 29 times before finally being incorporated into the core algorithm in January of 2016.

Google Penguin (2012-2016)

The Penguin algorithm focused on removing sites engaging in spammy tactics from the search results. Penguin primarily filtered sites engaging in keyword stuffing and link schemes out of the search results. Google Penguin was updated 10 times before being integrated into Google’s core algorithm in September of 2016.

RankBrain (2015-Present)

This machine-learning based AI helps Google process and understand the meaning behind new search queries. RankBrain works by being able to infer the meaning of new words or terms based on context and related terms. RankBrain began rolling out across all of Google search in early 2015 and was fully live and global by mid-2016. Within three months of full deployment RankBrain was already the 3rd most important signal contributing to the results selected for a search query.

Matt Cutts

One of the first 100 employees at Google, Matt Cutts was the head of Google's Web Spam team for many years and interacted heavily with the webmaster community. He spent a lot of time answering questions about algorithm changes and providing webmasters high-level advice and direction.

Danny Sullivan

Originally a Founding Editor, Advisor, and Writer for Search Engine Land (among others), Danny Sullivan now communicates with the SEO community as Google’s Public Search Liaison. Mr. Sullivan frequently finds himself reminding the community that the best way to rank is to create quality content that provides value to users.

Gary Illyes

A Google Webmaster Trends Analyst who often responds to the SEO community when they have questions about Google algorithm updates and changes. Gary is known for his candid (and entertaining) responses, which usually have a heavy element of sarcasm.

Webmaster World

Frequently referenced whenever people speak about Google algorithm updates, webmasterworld.com is one of the most popular forums for webmasters to discuss changes to Google's search results. A popular community since the early 2000s, it is still where webmasters flock to discuss theories whenever major fluctuations are noticed.


 

2021 Google Search Updates

2021 December – Local Search Update

From November 30th – December 8th, Google runs a local search ranking update. This update rebalances the various factors used to generate local results. Primary ranking factors for local search remain the same: Relevance, Distance, and Prominence. 

Additional Reading:

2021 November – Core Quality Update

From November 17th – November 30th, Google rolls out another core update. As with all core updates, this one is focused on improving the quality and relevance of search results. 

Additional Reading:

2021 August – Title Tag Update

Starting August 16th, Google starts rewriting page titles in the SERPs. After many SEOs saw negative results from the update, Google rolls back some of the changes in September. Google emphasizes that it still uses content from the <title> tag over 80% of the time. 

Additional Reading:

2021 July – Link Spam Update

Google updates its link spam fighting algorithm to improve the effectiveness of identifying and nullifying link spam. The update is particularly focused on affiliate sites and websites that monetize through links.

Additional Reading:

2021 June – Page Experience Update

Google announced in late 2020 that its upcoming 2021 Page Experience update would introduce core web vitals as new Google ranking factors. Core web vitals are a set of user experience criteria covering page load performance, interactivity, and visual stability. Google evaluates them through the following metrics:

  1. Largest Contentful Paint (LCP) – The time it takes a web page to load the largest piece of content on the page
  2. First Input Delay (FID) – The time from a user's first interaction with the page to the moment the browser can begin responding to that interaction
  3. Cumulative Layout Shift (CLS) – Measures visual stability: how much the page's content unexpectedly shifts while loading and scrolling

With this update, Google also evaluates page experience signals like mobile friendliness, safe browsing, HTTPS security, and intrusive interstitial guidelines when ranking web pages.
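
If you want to see where your own pages stand on these metrics, the browser exposes the underlying data through the PerformanceObserver API. The sketch below (TypeScript, run in the browser) is a minimal illustration rather than a production monitoring setup; Google also publishes a small open-source web-vitals JavaScript library that handles the edge cases for you.

```typescript
// Minimal sketch: observing the three core web vitals with PerformanceObserver.
// 'largest-contentful-paint', 'first-input', and 'layout-shift' are standard
// performance timeline entry types (supported in Chromium-based browsers).

// Largest Contentful Paint: the last candidate entry reported before user input.
const lcpObserver = new PerformanceObserver((list) => {
  const entries = list.getEntries();
  const lastCandidate = entries[entries.length - 1];
  console.log(`LCP candidate: ${lastCandidate.startTime.toFixed(0)} ms`);
});
lcpObserver.observe({ type: 'largest-contentful-paint', buffered: true });

// First Input Delay: time between the first interaction and when the browser
// could start running its event handlers.
const fidObserver = new PerformanceObserver((list) => {
  const first = list.getEntries()[0] as PerformanceEntry & { processingStart: number };
  console.log(`FID: ${(first.processingStart - first.startTime).toFixed(0)} ms`);
});
fidObserver.observe({ type: 'first-input', buffered: true });

// Cumulative Layout Shift: running sum of unexpected layout-shift scores
// (shifts that happen right after user input are excluded).
let clsScore = 0;
const clsObserver = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    const shift = entry as PerformanceEntry & { value: number; hadRecentInput: boolean };
    if (!shift.hadRecentInput) {
      clsScore += shift.value;
      console.log(`CLS so far: ${clsScore.toFixed(3)}`);
    }
  }
});
clsObserver.observe({ type: 'layout-shift', buffered: true });
```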

Additional Reading:

2021 February – Passage Ranking

Google introduces Passage Ranking and starts indexing passages of web content. Google now hones in on specific passages of long-form content and ranks those passages in the SERPs. Google highlights the relevant passage and takes users directly to it after they click on the blue link result. 

Additional Reading:

2020 Google Search Updates

2020 October – Indexing Bugs

From early September to the beginning of October, Google experienced multiple bugs affecting mobile indexing, canonicalization, news indexing, the Top Stories carousel, and sports scores. The bugs impacted about .02% of searches. Google fully resolved all impacted URLs by October 9th.

Additional Reading:

2020 August 11 – Google Glitch

On Tuesday, August 11th, Google experienced a massive, worldwide indexing glitch that impacted search results. Search results were very low-quality or irrelevant to search queries, and ecommerce sites in particular reported significant impacts on rankings. Google resolved the glitch within a few days.

Additional Reading:

2020 June – Google Bug Fix

A Google representative confirmed an indexing bug temporarily impacted rankings. Google was struggling to surface fresh content.

Additional Reading:

2020 May – Core Quality Update

This May 2020 core update was one of the more significant broad core updates, coinciding with the introduction of core web vitals and an increased emphasis on E-A-T. This update was a continuation of an effort to improve the quality of SERP results for COVID-related searches. The update most significantly impacted sites with low-quality or unnatural links. However, some sites with lower domain authority did appear to see positive ranking improvements for pages with high-quality, relevant content. 

Many SEOs reacted negatively, particularly because of the timing of the update, which occurred at the height of economic shutdowns to slow the spread of coronavirus. Concerns about the May 2020 core quality update ranged from social media domination of the SERPs to better results for larger, more dominant brands like Amazon and Etsy. Some analysis noted these changes may have been reflecting user intent from quarantine, particularly because the update focused on providing better results for queries with multiple search intents. Google responded to the complaints by reaffirming its existing content-quality guidance. 

Additional Reading:

2020 March – COVID-19 Pandemic

Although not an official update, the coronavirus outbreak led to an unprecedented level of search queries that temporarily changed the landscape of search results. Google made several changes to adjust to the trending searches such as:

  • Increased user personalization to combat misinformation
  • Removed COVID-19 misinformation across YouTube and other platforms
  • Added “Sticky Menu” for COVID related searches
  • Added temporary business closures to the Map Pack
  • Temporarily banned ads for respirators and medical masks
  • Created COVID-19 Community Mobility Reports
  • Temporarily limited certain Google My Business listing features

Additional Reading:

2020 February 7 – Unannounced Update

In February of 2020, many SEOs reported seeing significant changes to rankings, although Google had not announced a broad core update and denied that one had occurred. Various analyses of the update showed no clear pattern between websites that were impacted. 

Additional Reading:

2020 January 22 – Featured Snippet De-duplication

Prior to this January 2020 update, those sites that earned the featured snippet, or “position zero,” also appeared as the subsequent organic search result. This update de-duplicated search results to eliminate this double exposure. This impacted 100% of searches worldwide and had significant impacts on rank tracking and organic CTR.

Additional Reading:

2020 January – Broad Core Update

On January 13th, 2020, Google started rolling out another broad core update. Google did not provide details about the update, but did emphasize existing webmaster guidelines about content quality.

Additional Reading:

2019 Google Search Updates

2019 November – Local Search Update

In November of 2019 Google rolled out an update to how local search results are formulated (ex: map pack results). This update improved Google’s understanding of the context of a search, by improving its understanding of synonyms. In essence, local businesses may find they are showing up in more searches.

 

2019 October 26 – BERT

In October, Google introduced BERT, a deep-learning algorithm focused on helping Google understand the intent behind search queries. BERT (Bidirectional Encoder Representations from Transformers) gives context to each word within a search query. The “bidirectional” in BERT refers to how the algorithm looks at the words that come before and after each term before assessing the meaning of the term itself.

Here’s an example of bi-directional context from Google’s Blog:

In the sentence “I accessed the bank account,” a unidirectional contextual model would represent “bank” based on “I accessed the” but not “account.” However, BERT represents “bank” using both its previous and next context — “I accessed the… account” — starting from the very bottom of a deep neural network, making it deeply bidirectional.

The introduction of BERT marked the most significant change to Google search in half a decade, impacting 1 in 10 searches — 10% of all search queries.

Additional Reading:

2019 September – Entity Ratings & Rich Results

If you place reviews on your own site (even through a third party widget), and use schema markup on those reviews – the review stars will no longer show up in the Google results. Google applied this change to entities considered to be Local Businesses or Organizations.

The reasoning? Google considers these types of reviews to be self-serving. The logic is that if a site is placing a third party review widget on their own domain, they probably have some control over the reviews or review process.

Our recommendation? If you’re a local business or organization, claim your Google My Business listing and focus on encouraging users to leave reviews with Google directly.
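
For context, the markup in question looks something like the hypothetical example below: a LocalBusiness entity carrying its own aggregate rating, expressed as a schema.org JSON-LD object (written here as a TypeScript literal; the business name and numbers are invented). After this change, self-serving markup of this shape on your own domain stops producing review stars, even though it remains valid structured data.

```typescript
// Hypothetical example of the "self-serving" review markup affected by this change:
// a LocalBusiness describing itself with its own aggregate rating.
// (Business name, address, and rating values are invented for illustration.)
const selfServingReviewMarkup = {
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  name: "Example Coffee Roasters",
  address: {
    "@type": "PostalAddress",
    streetAddress: "123 Main St",
    addressLocality: "Springfield",
  },
  aggregateRating: {
    "@type": "AggregateRating",
    ratingValue: "4.8",
    reviewCount: "213",
  },
};

// The object would normally ship inside a JSON-LD script tag in the page's HTML:
const jsonLdTag =
  `<script type="application/ld+json">${JSON.stringify(selfServingReviewMarkup)}</script>`;
console.log(jsonLdTag);
```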

Additional Reading:

2019 September – Broad Core Update

This update included two components.

First, it hit sites exploiting a 301 redirect trick from expired sites. In this trick, users would either buy expired sites with good SEO metrics and redirect the entire domain to their site, or pay a 3rd party to redirect a portion of pages from an expired site to their domain. Note: sites with relevant 301 redirects from expired sites were still fine.

Second, video content appears to have gotten a boost from this update. June’s update brought an increase in video carousels in the SERPs. Now in September, we’re seeing video content bumping down organic pages that previously ranked above them.

 

We can see this at an even greater scale looking at two purely text and purely video sites – YouTube and Wikipedia. We can see that for the first time, YouTube has eclipsed Wikipedia in the Google search results.

 

Additional Reading:

2019 June – Broad Core Update

This is the first time that Google has pre-announced an update. Danny Sullivan, Google’s Search Liaison, stated that they chose to pre-announce the changes so webmasters would not be left “scratching their heads” about what was happening this time.

What happened?

  • We saw an increase in video carousels in the SERPs
  • Low quality news sites saw losses

What can sites do to respond to this broad core update? It looks like Google is leaning into video content, at least in the short-term. Consider including video as one of the types of content your team creates.

Additional Reading:

2019 May 22-26 – Indexing Bugs

On Wednesday, May 22nd, Google tweeted that there were indexation bugs causing stale results to be served for certain queries; this bug was resolved early on Thursday, May 23rd.

By the evening of Thursday, May 23rd, Google was back to tweeting, stating that they were working on a new indexing bug that was preventing new pages from being captured. On May 26th Google followed up to confirm that this indexation bug had also been fixed.

Additional Reading:

2019 April 4-11 – De-Indexing Bugs

In April of 2019, a technical error caused a bug that de-indexed a massive set of webpages, knocking about 4% of stable URLs off of the first page.

Additional Reading:

2019 March 12 – Broad Core Update

Google was deliberately vague about this update and just kept redirecting people and their questions to the Google quality guidelines. However, the webmaster community noticed that the update seemed to have a heavier impact on YMYL (your money or your life) pages.

YMYL sites with low quality content took a nose-dive, and sites with heavy trust signals (well known brands, known authorities on multiple topics, etc) climbed the rankings.

Let’s take two examples:

First, Everydayhealth.com lost 50% of its SEO visibility from this update. Sample headline: Can Himalayan Salt Lamps Really Help People with Asthma?

Next, Medicinenet.com saw a 12% increase in their SEO visibility from this update. Sample headline: 4 Deaths, 141 Legionnaires’ Infections Linked to Hot Tubs.

This update also seemed to factor in user behavior more strongly. Domains where users spent longer on the site, had more pages per visit, and had lower bounce rates saw an uptick in their rankings.

Additional Reading:

2019 March 1 – Extended Results Page

For one day, on March 1st, Google displayed 19 results on the first page of SERPs for all queries, 20 if you count the featured snippet. Many hypothesize it was a glitch related to in-depth articles, a results type from 2013 that has long since been integrated into regular organic search results.

Additional Reading:

2018 Google Algorithm Updates

2018 August – Broad Core Update (Medic)

This broad core update, known by its nickname “Medic” impacted YMYL (your money or your life) sites across the web.

SEOs had many theories about what to do to improve rankings after this update, but both Google and the larger SEO community ended up at the same messaging: make content users are looking for, and make it helpful.

This update sparked a lot of discussion around E-A-T (Expertise, Authoritativeness, Trustworthiness) for page quality, and the importance of clear authorship and bylines on content.

Additional Reading:

2018 July – Chrome Security Warning

Google begins marking all http sites as “not secure” and displaying warnings to users.

 

Google views security as one of their core principles, so this change makes sense as the next step to build on their October 2017 update that began warning users about unsecured forms.

 

Looking forward, Google is planning on blocking mixed content from https sites.

What can you do? Purchase an SSL certificate and make the move from http to https as soon as possible. Double check that all of your subdomains, images, PDFs and other assets associated with your site are also being served securely.
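
What the move looks like in practice depends on your stack, but the core of it is a permanent (301) redirect from every http URL to its https equivalent once the certificate is installed. As a rough sketch only, assuming a Node/Express app sitting behind a proxy or load balancer that terminates TLS and sets the x-forwarded-proto header, it could look like the following; on Apache or Nginx the equivalent rule lives in the server configuration instead.

```typescript
import express from "express";

const app = express();

// Rough sketch (not a drop-in config): permanently redirect any plain-http
// request to the https version of the same URL. Assumes the app runs behind
// a proxy/load balancer that terminates TLS and sets x-forwarded-proto,
// which "trust proxy" tells Express to honor when computing req.secure.
app.enable("trust proxy");
app.use((req, res, next) => {
  if (req.secure) {
    return next(); // already served over https
  }
  res.redirect(301, `https://${req.headers.host}${req.originalUrl}`);
});

app.get("/", (_req, res) => {
  res.send("Served over HTTPS");
});

app.listen(3000);
```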

Additional Reading:

2018 July – Mobile Speed Update

Google rolled out the mobile page speed update, making page speed a ranking factor for mobile results.

Additional Reading:

2018 June – Video Carousels

Google introduces a dedicated video carousel on the first page of results for some queries, and moves videos out of regular results. This change also led to a significant increase in the number of search results displaying videos (+60%).

Additional Reading:

2018 April – Broad Core Update

The official line from Google about this broad core update is that it rewards quality content that was previously under-rewarded. Sites that had content that was clearly better than the content of their organic competitors saw a boost; sites with thin or duplicative content fell.

2018 March – Broad Core Update

March’s update focused on content relevance (how well does content match the intent of the searcher) rather than content quality.

What can you do? Take a look at the pages Google is listing in the top 10-20 spots for your target search term and see if you can spot any similarities that hint at how Google views the intent of the search.

Additional Reading:

2018 March – Mobile-First Index Starts to Roll Out

After months of testing Google begins rolling out mobile-first indexing. Under this approach, Google crawls and indexes the mobile version of website pages when adding them to their index. If content is missing from mobile versions of your webpages, that content may not be indexed by Google.

To quote Google themselves,

“Mobile-first indexing means that we’ll use the mobile version of the page for indexing and ranking, to better help our – primarily mobile – users find what they’re looking for.”

Essentially the entire index is going mobile-first. This process of migrating over to indexing the mobile version of websites is still underway. Websites are notified in Search Console when they've been migrated under Google's mobile-first index.

 

Additional Reading:

 

2017 Google Search Updates

2017 December – Maccabees

Google states that a series of minor improvements are rolled out across December. Webmasters and SEO professionals see large fluctuations in the SERPs.

 

Danny Sullivan's Maccabees Tweets about how there are always multiple daily updates, no single update.
Barry Schwartz gave this set of updates the Maccabees nickname as he noted the most fluctuation around December 12 (occurring during Hanukkah). However, updates occurred from the very beginning until the very end of December.

 

What were the Maccabees changes?

Webmasters noted that doorway pages took a hit. Doorway pages act as landing pages for users, but don’t contain the real content – users have to get past these initial landing pages to access content of any value. Google considers these pages barriers to a user.

A writer at Moz who dissected a slew of site data from mid-December noted one key observation: when two pages ranked for the same term, the one with better user engagement saw its rankings improve after this update, while the other page saw its rankings drop. In many instances, what happened for sites that began to lose traffic is that blog pages were being shown/ranked where product or service pages should have been displayed.

A number of official celebrity sites fall in the rankings including (notably) Channing Tatum, Charlie Sheen, Kristen Stewart, Tom Cruise, and even Barack Obama. This speaks to how Google might have rebalanced factors around authoritativeness vs. content quality. One SEO expert noted that thin celebrity sites fell while more robust celebrity sites (like Katy Perry’s) maintained their #1 position.

Multiple webmasters reported a slew of manual actions on December 25th and 26th, and some also reported seeing jumps on the 26th for sites that had been working on site quality.

Additional Reading:

2017 November – Snippet Length Increased

Google increases the character length of meta descriptions to 300 characters. This update was not long-lived as Google rolled back to the original 150-160 character meta descriptions on May 13, 2018.

2017 May – Quality Update

Webmasters noted that this update targeted sites and pages with:

  • Deceptive advertising
  • UX challenges
  • Thin or low quality content

Additional Reading:

2017 March – Fred

In early March webmasters and SEOs began to notice significant fluctuations in the SERPs, and Barry Schwartz from SEJ began tweeting Google to confirm algorithm changes.

The changes seemed to target content sites engaging in aggressive monetization at the expense of users. Basically sites filling the internet up with low-value content, meant to benefit everyone except the user. This included PBN sites, and sites created with the sole intent of generating AdSense income.

Fred got its name from Gary Illyes, who, when asked by an SEO expert whether he wanted to name the update, suggested we should start calling all updates without names “Fred.”

 

The joke, for anyone who knows the webmaster trends analyst, is that he calls everything unnamed Fred (fish, people, EVERYTHING).

 

The SEO community took this as a confirmation of recent algorithm changes (note: literally every day has algorithm updates), validating their digging into the SERP changes.

Additional Reading:

2017 January 10 – Pop Up Penalty

Google announces that intrusive pop ups and interstitials are going to be factored into their search algorithm moving forward.

“To improve the mobile search experience, after January 10, 2017, pages where content is not easily accessible to a user on the transition from the mobile search results may not rank as highly.”

This change caused rankings to drop for sites that forced users to get past an ad or pop up to access relevant content. Not all pop ups or interstitials were penalized; for instance, the following pop ups were still okay:

  • Pop ups that helped sites stay legally compliant (ex: accepting cookies, or verifying a user’s age).
  • Pop ups that did not block content on load.

Additional Reading:

2016 Google Search Updates

2016 September – Penguin 4.0

The Google announcement of Penguin 4.0 had two major components:

  • Penguin had been merged into the core algorithm, and would now have real-time updates.
  • Penguin would be more page-specific moving forward rather than impacting entire domains.

SEOs also noted one additional change. Penguin 4.0 seemed to just remove the impact of spam links on SERPs, rather than penalizing sites with spammy links. This appeared to be an attempt by Google to mitigate the impact of negative SEO attacks on sites.

That being said, today in 2019 we still see a positive impact from running disavows for clients who have seen spammy links creep into their backlink profiles.

Additional Reading:

2016 September – Possum Update

This update targeted duplicate and spammy results in local search (Local Pack and Google Maps). The goal was to provide users more diverse results when they're searching for a local business, product, or service.

Prior to the Possum update Google was filtering out duplicates in local results by looking for listings with matching domains or matching phone numbers. After the Possum update Google began filtering out duplicates based on their physical address.

Businesses who saw some of their listings removed from the local pack may have initially thought their sites were dead (killed by this update), but they weren't – they were just being filtered (playing possum). The term was coined by Phil Rozek.

SEOs also noted that businesses right outside of city limits also saw a major uptick in local rankings, as they got included in local searches for those cities.

Additional Reading:

2016 May – Mobile Friendly Boost

Google boosts the effect of the mobile-friendly ranking signal in search.
Google took time to stress that sites which are not mobile friendly but which still provide high quality content will still rank.

Additional Reading:

2016 February 19 – Adwords Change

Google removes sidebar ads and adds a fourth ad to the top block above the organic search results.
This move reflects the search engine giant continuing to prioritize mobile-first experiences, where sidebar ads are cumbersome compared to results in the main content block.

2016 January – Broad Core Update + Panda Is Now Core

Google confirms a core algorithm update in January, right after confirming that Panda is now part of Google's core algorithm.

Not many conclusions could be drawn about the update, but SEOs noticed significant fluctuations with news sites/news publishers. Longform content with multimedia got a boost, and older articles took a bit of a dive for branded terms. This shift could reflect Google tweaking current-event related results to show more recent content, but the data was not definitive.

Additional Reading:

2015 Google Search Updates

2015 December – SSL/HTTPS by Default

Google starts indexing the https version of pages by default.

Pages using SSL are also seeing a slight boost. Google holds security as a core component of surfacing search results to users, and this shift becomes one of many security-related search algo changes. In fact, by the end of 2017 over 75% of the page one organic search results were https.

2015 October 26 – RankBrain

In testing since April 2015, RankBrain was officially introduced by Google on this date. RankBrain is a machine learning algorithm that filters search results to help give users the best answer to their query. Initially, RankBrain was used for about 15 percent of queries (mainly new queries Google had never seen before), but now it is involved in almost every query entered into Google. RankBrain has been called the third most important ranking signal.

Additional Reading:

2015 October 5 – Hacked Sites Algorithm

Google introduces an algorithm specifically targeting spammy results in search that were gaining search equity from hacked sites.

This change was significant: it impacted 5% of search queries. The algorithm hides sites benefiting from hacked sites in the search results.

 

Interactions with Gary Illyes at #pubcon and on Twitter suggested that this algo only applies to search queries traditionally known to be spammy.

 

 

The update came right after a September message from Google about cracking down on repeat spam offenders. Google’s blog post notified SEOs that sites which repeatedly received manual actions would find it harder and harder to have those manual actions reconsidered.

Additional Reading:

2015 August 6 – Google Snack Pack

Google switches from displaying seven results for local search in the map pack to only three.

Why the change? Google is switching over (step-by-step) to mobile-first search results, aka prioritizing mobile users over desktop users.

On mobile, only three local results fit onto the screen before a user needs to scroll. Google seems to want users to scroll to then access organic results.

Other noticeable changes from this update:

  • Google only displays the street (not the complete address) unless you click into a result.
  • Users can now filter snack pack results by rating using a dropdown.

Additional Reading:

2015 July 18 – Panda 4.2 (Update 29)

Roll out of Panda 4.2 began on the weekend of July 18th and affected 2-3% of search queries. This was a refresh, and the first one for Panda in about 9 months.

Why does that matter? The Panda algorithm acts like a filter on search results to sort out low quality content. Panda basically gets applied to a set of data – and decides what to filter out (or down). Until the data for a site is refreshed, Panda’s ruling is static. So when a data refresh is completed, sites that have made improvements essentially get a revised ruling on how they’re filtered.

Nine months is a long time to wait for a revised ruling!

2015 May – Quality Update / Phantom II

This change is an update to the quality filters integrated into Google’s core algorithm, and alters how the algorithm processes signals for content quality. This algorithm is real-time, meaning that webmasters will not need to wait for data refreshes to see positive impact from making content improvements.

What kind of pages did we see drop in the rankings?

  • Clickbait content
  • Pages with disruptive ads
  • Pages where videos auto-played
  • How-to sites with thin or duplicative content (this ended up impacting a lot of how-to sites)
  • Pages that were hard to navigate/had UI barriers

In hindsight, this update feels like a precursor to Google’s 2017 updates for content spam and intrusive pop ups.

Additional Reading:

2015 April 21 – Mobilegeddon

Google boosts mobile-friendly pages in mobile search results.

This update was termed Mobilegeddon as SEOs expected it to impact a huge number of search queries, maybe more than any other update ever had. Why? Google was already seeing more searches on mobile than on desktop in the U.S. in May 2015.

In 2018 Google takes this a step further and starts mobile-first indexing.

Additional Reading:

2014 Google Algorithm Updates

2014 December – Pigeon Goes International

Google’s local algorithm, known as Pigeon, expands to international English speaking countries (UK, Canada, Australia) on December 22, 2014.

In December Google also releases updated guidelines for local businesses representing themselves on Google.

Additional Reading:

2014 October – Pirate II

Google releases an “improved DMCA demotion signal in Search,” specifically designed to target and downrank some of the sites most notorious for piracy.

In October Google also released an updated report on how they fight piracy, which includes changes they made to how results for media searches were displayed in search. Most of these user interface changes were geared towards helping users find legal (trusted) ways to consume the media content they were seeking.

 


Additional Reading:

 

2014 October 17 – Penguin 3.0

This update impacted 1% of English search queries, and was the first update to Penguin’s algorithm in over a year. This update was both a refresh and a major algorithm update.

2014 September – Panda 4.1 (Update 28)

Panda 4.1 is the 28th update for the algorithm that targets poor quality content. This update impacted 3-5% of search queries.

To quote Google:

“Based on user (and webmaster!) feedback, we’ve been able to discover a few more signals to help Panda identify low-quality content more precisely. This results in a greater diversity of high-quality small- and medium-sized sites ranking higher, which is nice.”

Major losers were sites with deceptive ads, affiliate sites (thin on content, meant to pass traffic to other monetizing affiliates), and sites with security issues.

2014 September – Known PBNs De-Indexed

This change impacted search, but was not an algorithm change, data refresh, or UI update.

Starting mid-to-late September 2014, Google de-indexed a massive number of sites being used to boost other sites and game Google's search rankings.

Google then followed-up on the de-indexing with manual actions for sites benefiting from the PBN. These manual actions went out on September 18, 2014.

Additional Reading:

2014 August – Authorship Removed from Search Results

Authors are no longer displayed (name or photo) in the search results along with the pieces that they’ve written.

Almost a year later Gary Illyes suggested that sites with authorship markup should leave the markup in place because it might be used again in the future. However, at a later date it was suggested that Google is perfectly capable of recognizing authorship from bylines.

Additional Reading:

2014 August – SSL becomes a ranking factor

Sites using SSL began to see a slight boost in rankings.

Google would later go on to increase this boost, and eventually provide warning to users when they were trying to access unsecure pages.

Additional Reading:

2014 July 24 – Google Local Update (Pigeon)

Google’s local search algorithm is updated to include more signals from traditional search (knowledge graph, spelling correction, synonyms, etc).

Additional Reading

2014 June – Authorship Photos Removed

Photos of Authors are gone from SERPs.

This was the first step towards Google decommissioning Authorship markup.

2014 June – Payday Loan Update 3.0

Where Payday Loans 2.0 targeted spammy sites, Payday Loans 3.0 targeted spammy queries, or more specifically the types of illegal link schemes seen disproportionately within high-spam industries (payday loans, porn, gambling, etc).

What do you mean illegal? We mean link schemes that function off of hacking other websites or infecting them with malware.

This update also included better protection against negative SEO attacks.

Additional Reading:

2014 May 17-18 – Payday Loan Update 2.0

Payday Loan Update 2.0 was a comprehensive update to the algorithm (not just a data refresh). This update focused on devaluation of domains using spammy on-site tactics such as cloaking.

Cloaking is when the content/page that Google sees for a page is different from the content/page that a human user sees when they click on that page from the SERPs.
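
A crude way to illustrate (and spot-check) cloaking is to request the same URL twice, once with a Googlebot user-agent string and once with a normal browser string, and compare what comes back. The sketch below does only that naive comparison and is purely illustrative; real cloaking detection, including Google's, is far more involved because it must account for personalization, geolocation, and legitimate dynamic content.

```typescript
// Naive illustration of what cloaking means: the same URL returning
// materially different content to a crawler user-agent vs. a browser.
// Uses the global fetch available in Node 18+ and modern browsers.

const GOOGLEBOT_UA =
  "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";
const BROWSER_UA =
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36";

async function fetchAs(url: string, userAgent: string): Promise<string> {
  const res = await fetch(url, { headers: { "User-Agent": userAgent } });
  return res.text();
}

async function spotCheckCloaking(url: string): Promise<void> {
  const [asGooglebot, asBrowser] = await Promise.all([
    fetchAs(url, GOOGLEBOT_UA),
    fetchAs(url, BROWSER_UA),
  ]);

  // Extremely rough signal: a large difference in response size between the
  // two requests is worth a manual look. Pages vary for many benign reasons.
  const sizeDelta = Math.abs(asGooglebot.length - asBrowser.length);
  console.log(`Googlebot response: ${asGooglebot.length} characters`);
  console.log(`Browser response:   ${asBrowser.length} characters`);
  console.log(`Difference:         ${sizeDelta} characters`);
}

spotCheckCloaking("https://example.com/").catch(console.error);
```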

2014 May – Panda 4.0 (Update 27)

Google had stopped announcing changes to Panda for a while, so when they announced Panda 4.0 we knew it was going to be a larger change to the overall algorithm.

Panda 4.0 impacted 7.5% of English queries, and led to a drastic nose dive for a slew of prominent sites like eBay, Ask.com, and Biography.com.

 

Sites that curated information from other sources without posting info or analysis of their own (aka coupon sites, celebrity gossip sites) seemed to take a big hit from this update.

 

 

2014 February 6 – Page Layout 3.0 (Top Heavy 3.0)

This is a refresh of Google’s algorithm that devalues pages with too many above-the-fold ads, per Google’s blog:

We’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away.

So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience.

The Page Layout algorithm was originally launched on January 19, 2012, and has only had one other update in October of the same year (2012).

 


2013 Google Algorithm Updates

2013 December – Authorship Devalued

Authorship gets less of a boost in the search results. This is the first step Google took in beginning to phase out authorship markup.

2013 October – Penguin 2.1

Technically the 5th update to Google's link-spam fighting algorithm, this minor update affected about 1% of search queries.

 

2013 August – Hummingbird

Hummingbird was a full replacement of the core search algorithm, and Google’s largest update since Caffeine (Panda and Penguin had only been changes to portions of the old algorithm).

Hummingbird helped most with conversational search for results outside of the knowledge graph — where conversational search was already running. Hummingbird was a significant improvement to how Google interpreted the way text and queries are typed into search.

This algorithm was named Hummingbird by Google because it’s “precise and fast.”

 

Additional Reading:

 

2013 July – Expansion of Knowledge Graph

Knowledge Graph Expands to nearly 25% of all searches, displaying information-rich cards right above or next to the organic search results.

 

 

Additional Reading:

2013 July – Panda Dance (Update 26)

Panda begins going through monthly refreshes, also known as the “Panda Dance,” which caused monthly shifts in search rankings.

The next time Google would acknowledge a formal Panda update outside of these refreshes would be almost a year later in May of 2014.

2013 June – Roll Out Anti-Spam Algorithm Changes

Google rolled out an anti-link-spam algorithm in June of 2013 targeting sites grossly violating webmaster guidelines with egregious unnatural link building.

Matt Cutts even acknowledged one target – ‘Citadel Insurance’ which built 28,000 links from 1,000 low ranking domains within a single day, June 14th, and managed to reach position #2 for car insurance with the tactic.

By the end of June sites were finding it much harder to exploit the system with similar tactics.

 

 

2013 June 11 – Payday Loans

This update impacted 0.3% of queries in the U.S., and as much as 4% of queries in Turkey.

This algorithm targets queries that have abnormally high incidents of SEO spam (payday loans, adult searches, drugs, pharmaceuticals) and applies extra filters to these types of queries specifically.

 

2013 May 22 – Penguin 2.0

Penguin 2.0 was an update to the Penguin algorithm (as opposed to just a data refresh); it impacted 2.3% of English queries.

What changed?

  • Advertorials will no longer flow pagerank
  • Niches that are traditionally spammy will see more impact
  • Improvements to how hacked sites are detected
  • Link spammers will see links from their domains transfer less value.

One of the biggest shifts with Penguin 2.0 is it also analyzed linkspam for internal site pages, whereas Penguin 1.0 had looked at spammy links specifically pointing to domain home pages.

This marked the first time in 6 months that the Penguin algorithm had been updated, and the 4th update to Penguin that we’ve seen:

  • April 24, 2012 – Penguin 1.0 Launched
  • May 25, 2012 – Penguin 1.1 Data Refresh
  • October 5, 2012 – Another Penguin Data Refresh

 

Additional Reading:

 

2013 May – Domain Diversity

This update reduced the amount of times a user saw the same domain in the search results. According to Matt Cutts, once you’ve seen a cluster of +/- 4 results from the same domain, the subsequent search pages are going to be significantly less likely to show you results from that domain.

 

Additional Reading:

 

2013 May 8th – Phantom I

On May 8th, 2013 SEOs over at Webmaster World noticed intense fluctuation in the SERPs.

Lots of people dove into the data – some commenting that sites who had taken a dive were previously hit by Panda, but there were no conclusive takeaways. With no confirmation of major changes from Google, and nothing conclusive in the data – this anomaly came to be known as the “Phantom” update.

2013 March 14-15 – Panda Update 25

This is the 25th update for Panda, the algorithm that devalues low quality content in the SERPs. Matt Cutts confirmed that moving forward the Panda algorithm was going to be part of regular algorithm updates, meaning it would be a rolling update instead of a pushed update process.

2013 January 22 – Panda Update 24

The 24th Panda update was announced on January 22, 2013 and impacted 1.2% of English search queries.

 

2012 Google Algorithm Updates

2012 December 21 – Panda Update 23

The 23rd Panda update hit on December 21, 2012 and impacted 1.3% of English search queries.

2012 December 4 – Knowledge Graph Expansion

On December 4, 2012 Google announced a foreign language expansion of the Knowledge Graph, their project to “map out real-world things as diverse as movies, bridges and planets.”

 


Variations of Knowledge Graph in Search Results for Different Languages (Russian, Japanese, etc)

2012 November – Panda Updates 21 & 22

In November 2012 Panda had two updates in the same month – one on November 5, 2012 (1.1% of English queries impacted in the US) and one on November 22, 2012 (0.8% of English queries impacted in the US).

2012 October 9 – Page Layout Update

On October 9, 2012 Google rolled out an update to their Page Layout filter (also known as “Top Heavy”) impacting 0.7% of English-language search queries. This update rolled the Page Layout algorithm out globally.

Sites that made fixes after Google’s initial Page Layout Filter hit back in January of 2012 saw their rankings recover in the SERPs.

2012 October – Penguin Update 1.2

This was just a data refresh affecting 0.3% of English queries in the US.

 

2012 September – Panda Updates 19 & 20

Panda update 19 hit on September 18, 2012 affecting 0.7% of English search queries, followed just over a week later by Panda update 20 which hit on September 27, 2012 affecting 2.4% of English search queries.

Panda update 20 was an actual algorithm update rather than just a data refresh, accounting for the higher percentage of affected queries.

2012 September – Exact Match Domains

At the end of September Matt Cutts announced an upcoming change: low quality exact match domains were going to be taking a hit in the search results.

Up until this point, exact match domains had been weighted heavily enough in the algorithms to counterbalance low quality site content.

Additional Reading:

2012 August 19 – Panda Update 18

Panda version 3.9.1 rolled out on Monday, August 19th, 2012, affecting less than 1% of English search queries in the US.

This update was a data refresh.

 

2012 August – Fewer Results on Page 1

In August Google began displaying 7 results for about 18% of the queries, rather than the standard 10.

Upon further inspection it appeared that Google had reduced the number of organic results so they'd have more space to test a suite of potential top-of-search features including: expanded site links, images, and local results.

This change, in conjunction with the knowledge graph, paved the way for the top-of-search rich snippet results we see in search today.

Additional Reading:

2012 August 10 – Pirate/DMCA Penalty

Google announces they’ll be devaluing sites that repeatedly get accused of copyright infringement in the SERPs. As of this date the number of valid copyright removal notices is a ranking signal in Google’s search algorithm.

Additional Reading:

2012 July 24 – Panda Update 17

On July 24, 2012 Google announces Panda 3.9.0 – a refresh for the algorithm affecting less than 1% of search queries.

2012 July 27 – Webmaster Tool Link Warnings

Not technically an algorithm update, but it definitely affected the SEO landscape.

On July 27, 2012 Google posted an update clarifying topics surrounding a slew of unnatural link warnings that had recently been sent out to webmasters:

  • Unnatural link warnings and drops in rankings are directly connected
  • Google doesn’t penalize sites as much when they’re the victims of 3rd party bad actors

Additional Reading:

2012 June – Panda Updates 15 & 16

In June Google made two updates to its Panda algorithm fighting low quality content in the SERPs:

  • Panda 3.7 rolled out on June 8, 2012 affecting less than 1% of English search queries in the U.S.
  • Panda 3.8 rolled out on June 25, 2012 affecting less than 1% of queries worldwide.

Both updates were data refreshes.

2012 June – 39 Google Updates

On June 7, 2012 Google posted an update providing insight into search changes made over the course of May. Highlights included:

  • Link Spam Improvements:
    • Better hacked sites detection
    • Better detection of inorganic backlink signals
    • Adjustments to Penguin
  • Adjustments to how Google handles page titles
  • Improvements to autocomplete for searches
  • Improvements to the freshness algorithm
  • Improvements to rankings for news and recognition of major news events.

Additional Reading:

2012 May 25 – Penguin 1.1

A data refresh for the Penguin algorithm was released on May 25, 2012 affecting less than 0.1% of search queries.

Additional Reading:

2012 May 16 – Knowledge Graph

On May 16, 2012 Google introduced the knowledge graph, a huge step forward in helping users complete their goals faster.

First, the knowledge graph improved Google’s understanding of entities in Search (what words represented — people, places, or things).

Second, it surfaced relevant information about these entities directly on the search results page as summaries and answers. This meant that users, in many instances, no longer needed to click into a search result to find the information they were seeking.

Additional Resources:

2012 May 4 – 52 April Updates

On May 4, 2012 Google posted an update providing insight into search changes made over the course of April. Highlights included:

  • 15% increase in the base index
  • Removed the freshness boost for low quality content
  • Increased domain diversity in the search results.
  • Changes to Sitelinks
    • Sub sitelinks
    • Better ranking of expanded sitelinks
    • Sitelinks data refresh
  • Adjustment to surface more authoritative results.

Additional Reading:

2012 April – Panda Updates 13 & 14

In April Google made two updates to its Panda algorithm fighting low quality content in the SERPs:

  • Panda 3.5 rolled out on April 19, 2012
  • Panda 3.6 rolled out on April 27, 2012 affecting 1% of queries.

Panda 3.5 seemed to target press portals and aggregators, as well as heavily-templated websites. This makes sense as these types of sites are likely to have a high number of pages with thin or duplicative content.

Additional Reading:

2012 April 24 – Penguin

The Penguin Algorithm was announced on April 24, 2012 and focused specifically on devaluing sites that engage in spammy SEO practices.

The two primary targets of Penguin 1.0? Keyword stuffing and link schemes.

Additional Reading:


2012 April – Parked Domain Bug

After a number of webmasters reported ranking shuffles, Google confirmed that a data error had caused some domains to be mistakenly treated as parked domains (and thereby devalued). This was not an intentional algorithm change.

Additional Reading:

2012 April 3 – 50 Updates

On April 3, 2012 Google posted an update providing insight into search changes made over the course of March. Highlights included:

  • Sitelinks Data Refresh
  • Better handling of queries with navigational and local intent
  • Improvements to detecting site quality
  • Improvements to how anchor text contributes to relevancy for sites and search queries
  • Improvements to how search handles synonyms

Additional Reading:

2012 March – Panda Update 12

On March 23, 2012 we saw the Panda 3.4 update, a data refresh affecting 1.6% of queries.

 

2012 February 27 – Panda Update 11

Panda Update 3.3 was a data refresh that was announced on February 27, 2012.

2012 February 27 – Series of Updates

On February 27, 2012 Google posted an update providing insight into search changes made over the course of February. Highlights included:

  • Travel related search improvements
  • International launch of shopping rich snippets
  • Improved health searches
  • Google changed how it was evaluating links, dropping a method of link analysis that had been used for the past several years.

Additional Reading:

2012 February – Venice

The Venice update changed the face of local search forever, as local sites now show up even without a geo modifier being used in the keyword itself.

Additional Reading:

2012 January – Page Layout Update

This update devalued pages in search that had too many ads “above-the-fold.” Google said that ads that prevented users from accessing content quickly provided a poor user experience.

Additional Reading:

2012 January 10 – Personalized Search

On January 10, 2012 Google announced Search, plus Your World. Google had already expanded search to include content personally relevant to individuals with Social Search; Your World was the next step.

This update pulled in information from Google+ such as photos, profiles, and more.

Additional Reading:

2012 January 5 – 30 Google Updates

On January 5, 2012 Google posted an update providing insight into search changes made over the course of December of 2011. Highlights included:

  • Landing page quality became a signal for image search, beyond the image itself
  • Soft 404 detection (when a page returns a "success" status code, but the content still won't be accessible to a user).
  • More rich snippets
  • Better infrastructure for autocomplete (ex: spelling corrections)
  • More accurate byline dates
  • Related queries improvements
  • Upcoming events at venues
  • Faster mobile browsing – skipped the redirect phase of sending users to a mobile site m.domain.com

Additional Reading:

2011 Google Algorithm Updates

2011 December 1 – 10 Google Updates

On December 1, 2011 Google posted an update providing insight into search changes made the two weeks prior. Highlights included:

  • Refinements to the inclusion of related queries so they’d be more relevant
  • Expansion of indexing to include more long tail keywords
  • New parked domain classifier (placeholder sites hosting ads)
  • More complete (fresher) blog results
  • Improvements for recognizing and rewarding whichever sites originally posted content
  • Top result selection code rewrite to avoid “host crowding” (too many results from a single domain in the search results).
  • New verbatim tool
  • New google bar

Additional Reading:


2011 November – Panda 3.1 (Update 9)

On November 18th, 2011 Panda Update 3.1 goes live, impacting <1% of searches.

 

2011 November – Automatic Translation & More

On November 14, 2011 Google posted an update providing insight into search changes made over the couple preceding weeks. Highlights included:

  • Cross language results + automatic translation
  • Better page titles in search results by de-duplicating boilerplate anchors (referring to google-generated page titles, when they ignore html title tags because they can provide a better one)
  • Extending application rich snippets
  • Refining official page detection, adjusted how they determine which pages are official
  • Improvements to date-restricted queries

Additional Reading:

2011 November 3 – Fresher Results

Google puts an emphasis on more recent results, especially on time-sensitive queries.

  • Ex: Recent events / hot topics
  • Ex: regularly occurring/recurring events
  • Frequently updated/outdated types of info (ex: best SLR camera)

Additional Reading:

2011 October – Query Encryption

On October 18, 2011 Google announced that they were going to be encrypting search data for users who are signed in.

The result? Webmasters could tell that users were coming from Google search, but could no longer see the queries being used. Instead, webmasters began to see "(not provided)" showing up in their analytics.

This change followed a January roll out of the SSL encryption protocol to Gmail users.

Additional Reading:

2011 October 19 – Panda Update 8 (“Flux”)

In October Matt Cutts announced there would be upcoming flux from the Panda 3.0 update affecting about 2% of search queries. Flux occurred throughout October as new signals were incorporated into the Panda algorithms and data was refreshed.

Additional Reading:

2011 September 28 – Panda Update 7

On September 28, 2011 Google released their 7th update to the Panda algorithm – Panda 2.5.

2011 September – Pagination Elements

Google added pagination elements – link attributes to help with pagination crawl/indexing issues.

  • rel="next"
  • rel="prev"

Note: this is no longer an indexing signal.

2011 August 16 – Expanded Site Links

On August 16, 2011 Google announced expanded display of sitelinks from a max of 8 links to a max of 12 links.

 


Additional Reading:

 

2011 August 12 – Panda Update 6

Google rolled out Panda 2.4 expanding Panda to more languages August 12, 2011, impacting 6-9% of queries worldwide.

Additional Reading:

2011 July 23 – Panda Update 5

Google rolled out Panda 2.3 in July of 2011, adding new signals to help differentiate between higher and lower quality sites.

2011 June 28 – Google+

On June 28, 2011 Google launched their own social network, Google+. The network was sort of a middle ground between LinkedIn and Facebook.

Over time, Google+ shares and +1s (likes) would become a temporary personalized search ranking factor.

Ultimately though, Google+ ended up being decommissioned in 2019.

Additional Reading:

2011 June 16 – Panda Update 4

According to Matt Cutts, Panda 2.2 improved scraper-site detection.

What’s a scraper? In this context, a scraper is software used to copy content from a website, often to be posted to another website for ranking purposes. This is considered a type of webspam (not to mention plagiarism).

This update rolled out around June 16, 2011.

2011 June 2 – Schema.org

On June 2, 2011 Google, Yahoo, and Microsoft announced a collaboration to create “a common vocabulary for structured data,” known as Schema.org.
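
In practical terms, the shared vocabulary means a page can describe itself with the same types and property names no matter which search engine reads it. Below is a small hypothetical example written as a TypeScript object for readability; the values are invented, but the type and property names come from the schema.org vocabulary. (Schema.org originally launched around microdata markup; the JSON-LD form shown here became the commonly recommended syntax later.)

```typescript
// Hypothetical Article markup using the shared schema.org vocabulary.
// The same @type and property names are understood across the search engines
// that back the schema.org collaboration.
const articleMarkup = {
  "@context": "https://schema.org",
  "@type": "Article",
  headline: "Google Algorithm Update History",
  author: { "@type": "Person", name: "Jane Example" },
  datePublished: "2011-06-02",
};

// Embedded in a page as a JSON-LD script block:
console.log(
  `<script type="application/ld+json">${JSON.stringify(articleMarkup, null, 2)}</script>`
);
```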

Additional Reading:

2011 May 9 – Panda Update 3

Panda 2.1 rolled out in early May, and was relatively minor compared to previous Panda updates.

2011 April 11 – Panda Update 2

On April 11, 2011 Panda 2.0 rolled out globally to English users, impacting about 2% of search queries.

What was different in Panda 2.0?

  • Better assessment of site quality for long-tailed keywords
  • This update also begins to incorporate data around sites that users manually block

Additional Reading:

2011 March 28 – Google +1 Button

Google introduces the +1 Button, similar to the Facebook "like" button or the Reddit upvote. The goal? Bring trusted content to the top of the search results.

Later in June Google posted a brief update that they made the button faster, and in August of 2011 it also became a share icon.

 

Additional Reading:

 

2011 February – Panda Update (AKA Farmer)

Panda was released to fight thin content and low-quality content in the SERPs. Panda was also designed to reward unique content that provides value to users.

 

Panda impacted a whopping 12% of search results, and virtually wiped out content farms, sites with low quality content, thin affiliate sites, sites with large ad-to-content ratios and over optimization.

 

As a result, sites with less intrusive ads started to do better in the search results, while sites with "thin" user-generated content went down, as did harder-to-read pages.

Per Google:

As “pure webspam” has decreased over time, attention has shifted instead to “content farms,” which are sites with shallow or low-quality content.

Additional Reading

2011 January – Attribution Update

This update focused on stopping scraper sites from receiving benefit from stolen content. The algorithm worked to establish which site initially created and posted content, and boost that site in the SERPs over other sites which had stolen the content.

Additional Reading:

2011 January – Overstock.com & JCPenney Penalty

Overstock and J.C. Penney receive manual actions due to deceptive link building practices.

Overstock offered a 10% discount to universities, students, and parents — as long as they posted anchor-text rich content to their university website. A competitor noticed the trend and reported them to Google.

JC Penney had thousands of backlinks built to its site targeting exact match anchor text. After receiving a manual action, they cleaned up the spammy links and largely recovered.

Additional Reading:

2010 Google Algorithm Updates

2010 December – Social Signals Incorporated

Google confirms that it uses social signals, including shares, when evaluating news stories, and that it also weighs author quality.

2010 December – Negative Reviews

In late November a story broke about how businesses were soaring in the search results and growing exponentially by being as terrible to customers as possible.

Enraged customers were leaving negative reviews on every major site they could, linking back to these bad-actor businesses to warn others. But in search, all those backlinks were giving the bad actors more and more search equity, enabling them to show up as the first result for a wider and wider range of searches.

Google responded to the issue within weeks, making changes to ensure businesses could not abuse their users in that manner moving forward.

Per Google:

“Being bad is […] bad for business in Google’s search results.”

Additional Reading:
NYT – Bullies Rewarded in Search

2010 November – Instant Visual Previews

This temporary feature allowed users to see a visual preview of a website in the search results. It was quickly rolled back.

 

Additional Resources:
Google Blog – Beyond Instant Results, Instant Previews

 

2010 September – Google Instant

Google begins displaying search results before a user actually finishes typing their query, building on Google Suggest.

This feature lived for a long time (in tech years, anyway) but was sunset in 2017 as mobile search became dominant and Google realized it might not be the optimal experience for on-the-go mobile users.

2010 August – Brand Update

Google made a change to allow some brands/domains to appear multiple times on page one, depending on the search.

This feature ends up undergoing a number of updates over time as Google works to get the right balance of site diversity when encountering host-clusters (multiple results from the same domain in search).

2010 June – Caffeine Roll Out

On June 10, 2010 Google announced Caffeine.

Caffeine was an entirely new indexing system with a new search index. Where before there had been multiple indexes, each updated and refreshed at its own rate, Caffeine enabled continuous updating of small portions of the search index. Under Caffeine, newly indexed content was available within seconds of being crawled.

Per Google:

“Caffeine provides 50 percent fresher results for web searches than our last index, and it’s the largest collection of web content we’ve offered. Whether it’s a news story, a blog or a forum post, you can now find links to relevant content much sooner after it is published than was possible ever before.”

Additional Reading:

2010 May 3 – MayDay

The May Day update occurred between April 28th and May 3rd 2010. This update was a precursor to Panda and took a shot at combating content farms.

Google’s comment on the update? “If you’re impacted, assess your site for quality.”

Additional Resources:

2010 April – Google Places

In April of 2010 Local Business Center became Google Places. Along with this change came the introduction of service areas (as opposed to just a single address as a location).

Other highlights:

  • Simpler method for advertising
  • Google offered free professional photo shoots for businesses
  • Google announced another batch of favorite places

By April of 2010, 20% of searches were already location-based.

Additional Reading:

2009 Google Algorithm Updates

2009 December – Real Time Search

Google announces search features related to newly indexed content: Twitter Feeds, News Results, etc. This real time feed was nested under a “latest results” section of the first page of search results.

 

Additional Reading:

 

2009 August 10 – Caffeine Preview

On August 10 Google begins to preview Caffeine, requesting feedback from users.

Additional Reading:

2009 February – Vince

Essentially the Vince update boosted brands.

Vince focused on trust, authority and reputation as signals to provide higher quality results which could push big brands further to the top of the SERPs.

Additional Resources:
Watch – Is Google putting more weight on brands in rankings?
Read – SEO Book – Google Branding

2008 Google Search Updates

2008 August – Google Suggest

Google introduces “suggest” which displays suggested search terms as the user is typing their query.

Additional Reading:

2008 April – Dewey

The Dewey update rolled out in late March/early April. The update was called Dewey because Matt Cutts chose the (relatively uncommon) term as one that would allow comparison between results from different data centers.

2007 Google Algorithm Updates

2007 June – Buffy

The Buffy update caused fluctuations for single-word search results.

Why Buffy? Google Webmaster Central product manager and long-time head of operations Vanessa Fox, famously an avid Buffy fan, announced she was leaving Google.

Vanessa earned intense respect from webmasters over her tenure, both for her product leadership and for her responsiveness to the community – the people using Google’s products daily. The webmaster community named this update after her interest as a sign of respect.

Additional Reading:

2007 May – Universal Search

Old school organic search results are integrated with video, local, image, news, blog, and book searches.

Additional Reading:

2006 Google Search Updates

2006 November – Supplemental Update

An update to how the filtering of pages stored in the supplemental index is handled. Google went on to scrap the supplemental index label in July 2007.

Additional Reading:

2005 Google Search Updates

2005 November – Big Daddy

This was an update to the Google search infrastructure and took three months to roll out: January, February, and March of 2006. This update also changed how Google handled canonicalization and redirects.

Additional Reading:

2005 October 16 – Jagger Rollout Begins

The Jagger Update rolled out as a series of October updates.

The update targeted low quality links, reciprocal links, paid links, and link farms. The update helped prepare the way for the Big Daddy infrastructure update in November.

Additional Reading:

2005 October – Google Local / Maps

In October of 2005, Google merged Local Business Center data with Maps data.

2005 September – Gilligan / False Alarm

A number of SEOs noted fluctuations in September which they originally named “Gilligan.” It turns out there were no algorithm updates, just a data refresh (index update).

Given the news, many SEOs renamed their posts “False Alarm.” However, moving forward many data refreshes are considered updates by the community. So we’ll let the “Gilligan” update stand.

Additional Reading:

2005 June – Personalized Search

Google relaunches personalized search. This time it helps shape future results based on your past selections.

Additional Reading:

2005 June – XML sitemaps

Google launches the ability to submit XML sitemaps via Google Webmaster Tools. This update bypassed old HTML sitemaps and gave webmasters some influence over indexation and crawling, allowing them to feed pages to the index with this feature.
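For illustration, a bare-bones XML sitemap looks something like the following (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/sample-page/</loc>
        <lastmod>2005-06-01</lastmod>
      </url>
    </urlset>

Once the file is live on your site, it can be submitted through Webmaster Tools (now Google Search Console) so Google knows where to find your pages.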

Additional Reading:

2005 May – Bourbon

The May 2005 update, nicknamed Bourbon, seemed to devalue sites/pages with duplicate content and affected 3.5% of search queries.

2005 February – Allegra

The Allegra update rolled out between February 2, 2005 and February 8, 2005. It caused major fluctuations in the SERPs. While nothing has ever been confirmed, these are the most popular theories amongst SEOs for what changed:

  • LSI being used as a ranking signal
  • Duplicate content is devalued
  • Suspicious links are somehow accounted for

Additional Reading:

2005 January – NoFollow

In early January, 2005 Google introduced the “Nofollow” link attribute to combat spam, and control the outbound link quality. This change helped clean up spammy blog comments: comments mass posted to blogs across the internet with links meant to boost the rankings of the target site.
Future Changes:

  • On June 15, 2009 Google changed the way it views NoFollow links in response to webmasters manipulating pages with “page rank sculpting”.
  • Google suggests webmasters use “nofollow” attributes for ads and paid links.
  • On September 10, 2019 Google announced two additional link attributes, “sponsored” and “ugc.”
    • Sponsored is for links that are paid or advertorial.
    • UGC is for links which come from user generated content.
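For reference, all three are simply values of the rel attribute on a standard anchor tag (the URLs below are placeholders):

    <!-- Classic nofollow: ask Google not to associate your site with the link -->
    <a href="https://www.example.com/" rel="nofollow">Example</a>

    <!-- Introduced in 2019: paid or advertorial links -->
    <a href="https://www.example.com/" rel="sponsored">Example sponsor</a>

    <!-- Introduced in 2019: links inside user generated content such as blog comments -->
    <a href="https://www.example.com/" rel="ugc">Commenter's site</a>

Google also allows the values to be combined, for example rel="ugc nofollow", when more than one applies.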

Additional Reading:

2004 Google Algorithm Updates

2004 February – Brandy

The Brandy update rolled out the first half of February and included five significant changes to Google’s algorithmic formulas (confirmed by Sergey Brin).

Over this same time period Google’s index was significantly expanded, by over 20%, and dynamic web pages were included in the index.

What else changed?

  • Google began shifting importance away from Page Rank to link quality, link anchors, and link context.
  • Attention is being given to link neighborhoods – how well your site is connected to others in your sector or space. This meant that outbound links became more important to a site’s overall SEO.
  • Latent Semantic Indexing increases in importance. Tags (titles, metas, H1/H2) took a back seat to LSI.
  • Keyword analysis gets a lot better. Google gets better at recognizing synonyms using LSI.

2004 January – Austin

Austin followed up on Florida continuing to clean up spammy SEO practices, and push unworthy sites out of the first pages of search results.
What changed?

  • Invisible text took another hit
  • Meta-tag stuffing was a target
  • FFA (Free for all) link farms no longer provided benefit

Many SEOs also speculated that this had been a change to Hilltop, a page rank algorithm that had been around since 1998.

Additional Reading:

2003 Google Algorithm Updates

2003 November 16 – Florida

Google’s Florida update rolled out on November 16, 2003 and targeted spammy seo practices such as keyword stuffing. Many sites that were trying to game the search engine algorithms instead of serve users also fell in the rankings.

 


Additional Reading:

 

2003 September – Supplemental Index

Google split their index into main and supplemental. The goal was to increase the number of pages/content that Google could crawl and index. The supplemental index had less restrictions on indexing pages. Pages from the supplemental index would only be shown if there were very few good results from the main index to display for a search.

When the supplemental index was introduced some people viewed being relegated to the supplemental index as a penalty or search results “purgatory”.

Google retired the supplemental index tag in 2007, but has never said that it retired the supplemental index itself. That being said, it’s open knowledge that Google maintains multiple indices, so it is within the realm of reason that the supplemental index may still be one of them. While the label disappeared, many wonder if the supplemental index has continued to exist and morphed into what we see today as “omitted results.”

Sites found they were able to move from the supplemental index to the main index by acquiring more backlinks.

Additional Reading:

2003 July – Fritz (Everflux)

In July 2003, Google moved away from monthly index updates (often referred to as the Google Dance) to a rolling model in which a portion of the index was updated every day. These regular updates came to be referred to as “everflux.”

2003 June – Esmerelda

Esmerelda was the last giant monthly index update before Google switched over to daily index updates.

2003 May – Dominic

Google’s Dominic update focused on battling spammy link practices.

2003 April – Cassandra

Google’s Cassandra update launched in April of 2003 and targeted spammy SEO practices including hidden text, heavily co-linked domains, and other low-link-quality practices.

Google began allowing banned sites to submit a reconsideration request after manual penalties in April of 2003.

Additional Reading:

2003 February – Boston

Google’s first named update was Boston which rolled out in February of 2003. The Google Boston Update improved algorithms related to analyzing a site’s backlink data.

2002 Google Algorithm Updates

2002 September – 1st Documented Update

Google’s first documented search algorithm update happened on September 1, 2002. It was also the kickoff of “Google Dance” – large-scale monthly refreshes of Google’s search index.

SEOs were shocked by the update, with some claiming “PageRank [is] DEAD.” The update was a little imperfect and included issues such as 404 pages showing up on the first page of search results.

Additional Reading:

2000 Google Search Updates

2000 December – Google Toolbar

Google launches its search toolbar for browsers. The toolbar highlighted search terms within webpage copy and allowed users to search within websites that didn’t have their own site search.

Additional Reading:

 

The post Google Algorithm Update History appeared first on LinkGraph.

16 Link Building Tips to Improve Landing Page SEO https://linkgraph.io/blog/link-building-tips/ https://linkgraph.io/blog/link-building-tips/#respond Thu, 05 May 2022 00:06:59 +0000 https://linkgraph.io/?p=13591

Why do link building?

Link building is one of the most important aspects of SEO. By building links to your site, you can improve your site’s authority and relevance in search engine results pages (SERPs). This, in turn, can help you to increase traffic to your site, boost your online visibility, and improve your ROI.

But why is link building so important for SEO?

First and foremost, links are one of the main factors that search engines use to determine a site’s authority and relevance. The more high-quality links your site has, the more likely it is to rank higher in SERPs. In fact, a recent study by Moz found that the number of links a site has is the second-most important ranking factor, after the quality of its content.

Links also play a major role in driving traffic to your site. In fact, a recent study by Ahrefs found that the number of referring domains (links from other websites) is the top factor in determining how much traffic a site receives.

Moreover, links are an important part of building your brand’s online visibility. The more links you have from high-quality websites, the more likely it is that people will see your site when they search for relevant keywords.

Finally, links are an important part of generating ROI for your business. The more high-quality links you have, the more likely it is that people will click through to your site from SERPs. This, in turn, can help you to increase sales and conversions, and boost your bottom line.

Is Link Building Safe?

There’s a lot of debate surrounding the safety of link building. Some people believe that it’s a safe and effective way to improve SEO, while others think that it can lead to penalization from Google. So, what’s the truth?

In a word, it depends.

There are a number of ways to build links safely and effectively, but there are also a number of ways to get penalized by Google. In general, you’ll want to avoid any black hat techniques like buying links or spamming forums with links to your site. These techniques may improve your rankings in the short term, but they’ll likely lead to a penalty from Google in the long run.

Instead, focus on creating high-quality content that people will want to link to naturally. This may take longer, but it will be more effective in the long run. You can also build links through social media, PR, and other forms of marketing.

When it comes to link building, there’s no one-size-fits-all answer. You need to consider the individual circumstances of your website and use a variety of safe and effective techniques to build links that will improve your SEO.

1. Identify which landing pages will benefit from link building

There are a few key pages on your website that will benefit the most from link building. Start by identifying which pages have the most potential to bring in traffic and boost your SEO efforts. Once you know which pages to focus on, you can start targeting high-quality links that will improve your website’s overall ranking.

Some of the most important pages on your website are your homepage, product pages, and blog posts. Your homepage is typically the first page that potential customers see, so it’s important to make a good first impression. Product pages are also important, as they can help you convert website visitors into buyers. And finally, blog posts are a great way to attract new visitors and keep them coming back for more.

Once you’ve identified which pages need the most help, it’s time to start building links. There are a variety of methods you can use to get links, including guest posting, link building tools, and contacting bloggers and other website owners. However, you should always focus on quality over quantity. Only target high-quality links from reputable websites to ensure that your website gets the most benefit.

By focusing on link building for your key pages, you can improve your website’s ranking and increase your chances of success.

2. Use Link Analyzer Tools

There are a number of link analyzer tools that you can use to help you improve your website’s link popularity. These tools can help you to identify the quality and quantity of links pointing to your website.

One of the most popular link analyzer tools is the Alexa Toolbar, a free browser add-on that allows you to check the Alexa Rank of any website. The Alexa Toolbar also includes a link popularity tool that shows the number of links pointing to a website.

Other popular link analyzer tools include SearchAtlas, Semrush, Ahrefs, and others.

3. See where your Competitors Are Earning Their Backlinks

One of the best ways to determine where to allocate your own link-building efforts is to see what sites are linking to your competitors.

By taking a look at the backlinks of your top competitors, you can get an idea of the types of sites that are linking to them, as well as the quality of those links.

If you see that a competitor has links from high-quality sites, you may want to try to get links from similar sites.

Likewise, if you see that a competitor has a lot of links from low-quality sites, you may want to focus on getting links from high-quality sites instead.

To find the backlinks of your competitors, you can use a variety of different tools.

Finding Backlinks with Moz

One of the most popular tools for doing this is Moz’s “Open Site Explorer” (since renamed Link Explorer).

  1. The Open Site Explorer allows you to see the backlinks of any website, as well as the anchor text and authority metrics of those links.
  2. To use the Open Site Explorer, simply enter the URL of one of your competitors into the search bar and hit “Search.”
  3. You will then be taken to a page that shows all of the backlinks that competitor has.
  4. You can also see the anchor text and authority metrics of each of those links.
  5. If you want to see the backlinks of a specific page on a competitor’s website, you can also do that.
  6. Simply enter the URL of the page into the search bar and hit “Search.”
  7. You will then be taken to a page that shows all of the backlinks to that page, as well as the anchor text and authority metrics of those links.

You can also export the data from the Open Site Explorer to a CSV file, which will allow you to analyze the data in a more in-depth way.

Analyzing Links with Majestic

Another popular tool for finding backlinks is Majestic SEO.

Majestic SEO allows you to see the backlinks of any website, as well as the number of links and the quality of those links.

  1. To use Majestic SEO, simply enter the URL of one of your competitors into the search bar and hit “Search.”
  2. You will then be taken to a page that shows all of the backlinks that competitor has.
  3. You can also see the number of links and the quality of those links.
  4. If you want to see the backlinks of a specific page on a competitor’s website, you can also do that.
  5. Simply enter the URL of the page into the search bar and hit “Search.”
  6. You will then be taken to a page that shows all of the backlinks to that page, as well as the number of links and the quality of those links.
  7. You can also export the data from Majestic SEO to a CSV file, which will allow you to analyze the data in a more in-depth way.

By taking a look at the backlinks of your competitors, you can get a good idea of the types of links that are helping them to rank well.

Then, you can focus on getting links from similar sites to help improve your own rankings.

4. Get your Business Listed in Online Directories

Almost every business today has a website, and with a website comes the need to get it listed in online directories. There are a number of online directories that you can list your business in, and many of them are free. The most important thing to remember when listing your business in online directories is to make sure that your business information is accurate and up to date.

Some of the most popular online directories are Google My Business, Yelp, and Bing Places for Business. Google My Business is a free online directory that is owned by Google. It allows businesses to create a business profile, which includes information such as the business name, address, phone number, website, and hours of operation. Google My Business also allows businesses to add photos and videos and to respond to reviews.

Yelp is a free online directory that allows businesses to create a business profile, which includes information such as the business name, address, phone number, website, and hours of operation. Yelp also allows businesses to add photos and videos and to respond to reviews.

Bing Places for Business is a free online directory that is owned by Microsoft. It allows businesses to create a business profile, which includes information such as the business name, address, phone number, website, and hours of operation. Bing Places for Business also allows businesses to add photos and videos and to respond to reviews.

5. Invest in Content Marketing

Wanna build more links? Then it’s time to start investing in content marketing. The more content assets you have, the more opportunities you have for link building.

But where should you start?

First, you need to create quality content. This means investing in a good writer or team of writers. You’ll also need to make sure that your website and other marketing materials are up to date and look professional.

If you’re willing to invest the time and money in content marketing, you’ll be able to see great results. But it’s important to remember that it takes time and effort to see results, so be patient and keep up the hard work.

6. Become a Thought Leader in your Industry

The best way to become a thought leader in your industry is to start blogging and writing articles. By sharing your insights and knowledge with others, you’ll position yourself as an expert and build a following of people who look to you for advice.

To get started, think about the topics that are most important to you and your industry. Brainstorm a list of ideas, and then start writing. Be sure to share your articles on social media and other platforms, so that you can reach a larger audience.

In addition to writing, you can also become a thought leader by speaking at events and conferences. This gives you a chance to share your ideas with a larger group of people, and it can also help you build relationships with other leaders in your industry.

Finally, always be willing to share your knowledge with others. When you provide helpful information and advice, you’ll be seen as a thought leader in your industry.

7. Write a Guest Post

Writing a guest post for a blog that already has a large following can help you earn backlinks back to your website.

So, why should you write a guest post? Here are some of the benefits:

  • You’ll gain exposure to a new audience. When you write a guest post for a popular blog, you’ll get exposure to a new audience. This is a great way to grow your blog and attract new readers.
  • You’ll build relationships with other bloggers. This can be a valuable opportunity to connect with other bloggers in your niche and share ideas.
  • You’ll earn links. Lots of them. Writing a guest post is a great way to earn contextual links to your web pages in a Google-compliant way.

8. Conduct Original Research, Case Studies, or Surveys

Conducting original research, case studies, or surveys can give you a wealth of information to include in your blog. It can also help you to establish yourself as an expert in your field. When conducting research, be sure to use reputable sources, and always cite your sources.

If you are conducting a survey, be sure to make it as accurate as possible. You may want to use a tool like Survey Monkey to create your survey. When you are ready to publish your survey, be sure to promote it widely to ensure that as many people as possible take it.

When writing about your research, be sure to present your findings in a clear and concise manner. Use graphs and charts to illustrate your findings, and be sure to explain the significance of your findings.

If you are writing about a case study, be sure to include all the pertinent details. Present the case study in a clear and concise manner, and explain the significance of your findings.

When writing about original research, be sure to explain the methodology that you used. Present your findings in a clear and concise manner, and explain the significance of your findings.

9. Distribute and Syndicate Press Releases

Press releases are a great way to get the word out about your latest product, service, or announcement. They can help to drive traffic to your website, and can even help to generate leads and sales. But how do you go about distributing and syndicating your press releases?

There are a number of different ways to distribute and syndicate your press releases, and the best way to do it will vary depending on your business and your target audience. Here are a few of the most popular methods:

1. Send your press release to local newspapers and magazines.

If you’re targeting a local audience, it’s a good idea to send your press release to the local newspapers and magazines. This is a great way to get your story in front of a wider audience, and it can also help to generate some publicity for your business.

2. Upload your press release to online newswires.

Online newswires are a great way to distribute your press releases to a wider audience. There are a number of different online newswires, and most of them are free to use. They can help to get your story in front of a large number of people, and they can also help to generate some publicity for your business.

3. Publish your press release on your website.

Another great way to distribute and syndicate your press releases is to publish them on your website. This is a great way to get your story in front of your target audience, and it can also help to generate some traffic and leads for your business.

4. Send your press release to industry-specific publications.

If you’re targeting a specific industry, it’s a good idea to send your press release to industry-specific publications. This is a great way to get your story in front of your target audience, and it can also help to generate some publicity for your business.

5. Post your press release on social media.

Finally, another great way to distribute and syndicate your press releases is to post them on social media. This is a great way to get your story in front of your target audience, and it can also help to generate some traffic and leads for your business.

10. Create Useful Tools and Resources

Creating useful tools and resources can be a great way to help people learn about your topic and to keep them coming back for more information. When creating these tools and resources, make sure they are of the highest quality and provide value to your audience.

Some great tools and resources to create include:

  • Infographics
  • Video tutorials
  • Quizzes or polls
  • Cheat sheets
  • Comparison charts
  • Blog post series

11. Try Broken Link Building

Broken link building is a process that can help you find link opportunities from pages that are no longer available. This technique can be used to find websites that have recently discontinued their services, moved to a new domain, or even deleted their website entirely.

To get started with broken link building, you’ll need a few tools:

  • A link prospecting tool like Ahrefs, Majestic, or Moz Open Site Explorer
  • A broken link checker like Xenu’s Link Sleuth or Screaming Frog SEO Spider

Once you have these SEO tools in place, you can start finding broken link opportunities for your target pages.

  1. Enter a dead or discontinued page in your niche into your link prospecting tool to see which sites still link to it.
  2. Export the list of linking pages to a CSV file.
  3. Use the broken link checker to confirm which of those outbound links are still broken.
  4. Create (or identify) a page on your own site that covers the same topic as the missing resource.
  5. Reach out to the website owners, point out the broken link, and suggest your page as a replacement.

This process can be tedious, but it can be a great way to find high-quality links.

12. Try HARO Link Building

HARO (Help a Reporter Out) is a service that connects journalists with sources, and it’s a great way to earn links from high-quality websites.

To use Haro, simply sign up for an account and submit a query. You’ll then be connected with journalists who are looking for sources for their stories. You can then pitch your story ideas to the journalists, and if they’re interested, they’ll contact you for more information.

If the journalist uses your information in their story, they’ll typically include a link to your website. This is a great way to get links from high-quality websites, and it can help improve your SEO rankings.

13. Share your Content on Social Media

Once your content is created, it’s important to share it on social media. This will help you to reach a wider audience and generate more leads.

When sharing your content, be sure to use relevant hashtags and post it on the correct social media platform. For example, if you’re sharing a blog post, you’ll want to post it on your company’s blog and then share it on Twitter, LinkedIn, and Facebook.

You can also use social media to drive traffic to your website. For example, you can post a link to your latest blog post on Twitter and ask your followers to click the link and read the post.

When sharing your content on social media, be sure to:

  • Use relevant hashtags
  • Post it on the correct social media platform
  • Post a link to your website

14. Get Interviewed on Podcasts

Being a guest on a podcast is a great way to share your message with a wider audience.

Reach out to some of your favorite podcasts and see if they’d be interested in having you on as a guest. Podcasts are a great way to build your credibility and reach a new audience. Often, those podcasters will include a link to your website in the episode description.

15. Sign up for Conferences and Events

There are many great conferences and events out there for entrepreneurs. It can be a great opportunity to learn more about the industry, network with others, and earn backlinks back to your website.

Here are a few events to consider that have relevance to multiple industries:

  • The Entrepreneur’s Summit is a three-day event that brings together entrepreneurs from all over the world to share their stories and advice.
  • TED is a series of conferences that covers a wide range of topics, from technology to business to global issues.
  • The Ignite Conference is a series of events that features 20-minute talks from a variety of entrepreneurs and thought leaders.
  • Chicago Ideas Week is a week-long event that features a variety of talks and workshops from entrepreneurs and thought leaders.
  • The American Express OPEN Forum is a series of events that covers a wide range of topics for small business owners.
  • The Small Business Expo is an annual event that provides small business owners with the opportunity to learn from industry experts and network with other entrepreneurs.

16. Use Contextual Anchor Text

Anchor text is the clickable text in a hyperlink. When you click on an anchor text, you are taken to the destination URL specified in the hyperlink.

Contextual anchor text is anchor text that appears naturally within the surrounding body copy. Most webmasters prefer contextual anchor text because it reads more naturally and looks less spammy. However, there are times when you might want to use naked anchor text (the bare URL) to make the hyperlink more visible.

When choosing anchor text, you should always consider your target audience and the keywords that they are likely to use. You should also use a variety of anchor text to avoid getting penalized by Google.
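To make the distinction concrete, here is a rough sketch of the same backlink written three ways (the URL and wording are illustrative only):

    <!-- Contextual anchor text: descriptive words inside a natural sentence -->
    <p>This guide to <a href="https://www.example.com/link-building/">white hat link building</a> covers outreach in more depth.</p>

    <!-- Naked anchor text: the bare URL is the clickable text -->
    <p>Full details are available at <a href="https://www.example.com/link-building/">https://www.example.com/link-building/</a>.</p>

    <!-- Exact match anchor text: a target keyword, which looks spammy when overused -->
    <p><a href="https://www.example.com/link-building/">best link building services</a></p>

Mixing these styles across your backlink profile reads more naturally to both users and Google.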

Start Link Building Today With the Above Tips

There are a lot of reasons why you should start link building today.

When you have a lot of high-quality links from other websites, it can show that your website is a credible source of information.

This can help you prove to Google that your website is reputable and well trusted in your industry.

The post 16 Link Building Tips to Improve Landing Page SEO appeared first on LinkGraph.

7 Tips for Better Information Architecture on Your Website https://linkgraph.io/blog/information-architecture-seo/ https://linkgraph.io/blog/information-architecture-seo/#respond Tue, 23 Nov 2021 11:45:02 +0000 https://linkgraph.io/?p=10894

Just as traditional architecture determines how people will use a building or another structure, information architecture (IA) guides users in how they use information systems. And while there are many information systems out there, the most commonly used are websites. 

Unlike the architecture of bridges and buildings, though, information architecture has more moving parts, a more abstract form of ‘building materials,’ and has only been around for a few decades. Additionally, information systems like websites are more malleable and can be adjusted and improved over time.

If you can master the principles of information architecture, you can build a website that will stand the test of time. Whether you’re in the process of creating your website or want to revamp your user experience and content, this article will provide you insight into how you can transform your website into a shining example of well-designed information architecture.

What Is Information Architecture in Relation to a Website?


Information architecture refers to the process your users go through to gather information about your products or services through a website or other digital platform like an app. Information architecture provides people with a systematic way to navigate from point A to point B in order to achieve an action or gain knowledge. In other words, better information architecture promotes easier accessibility of information through intuitive navigation design.

The best information architecture not only streamlines the user’s journey and goals, but it fulfills specific user needs by organizing a vast amount of information into little, easily digestible categories.

From Where Does Information Architecture Originate?

Many of the methodologies, techniques, and principles used to understand and improve information architecture design come from Peter Morville. Morville is the founding architect of this branch of user experience (UX) and content inventory systems. While he was among the first, there is now a large number of experts in this discipline who develop IA best practices through the Information Architecture Institute and user research.

What Elements Does Information Architecture Include?


Before we dive into how to improve your information architecture, it’s important to have a good sense of what is included in this field of study in relation to your website. While information architecture can apply to library science, spreadsheet science, and even physical structures, we will be focusing on IA in relation to websites.

So where can you find examples of information architecture on a website?

All it takes is for a website to load in order to be flooded with examples of information architecture. Information architecture is the strategic organization and presentation of your website’s content. In fact, nearly every aspect of a website and web design is part of IA. Of course, there is good information architecture and subpar IA, but all of the following are important parts of an IA system that go into your site:

  • UX design/UI design
  • Written content or web copy
  • Graphic design and design patterns
  • Images
  • Buttons
  • Links
  • Layout features
  • Website nomenclature
  • Metadata tags
  • Accessibility features

Good IA comes into play in all of the above. And these elements are often categorized into UX design, content creation strategy, and homepage layout (UI design).

How Do Information Architecture and SEO Work Together?

Search engine optimization (SEO) and information architecture both benefit website owners and web users by improving the internet experience. SEO and IA make quality content easier to find, understand, and navigate. SEO and IA differ in where they fit into the website creation process.

Good IA Supports SEO

SEO has the goal of increasing a website’s visibility through the science of configuring content, front-end web development, and back-end web development in response to search engine algorithms. The result is a website that search engines can find and display as search results to web users’ inquiries. This is an ongoing process. SEO requires a proactive and reactive approach since algorithms often change. Additionally, search engines see value in websites that regularly update their content.

SEO specialists regularly improve a website’s

  • Written content
  • Loading speed & responsiveness
  • Organization
  • Visual design
  • Graphics and photos

Information architecture often works best when established before active web design begins. IA establishes a framework that supports the efforts of SEO specialists for the lifetime of a website. With a well-strategized IA, a website will have a strong foundation of logical organization. This makes a website more enjoyable from the user’s perspective since they can find what they need easily. In turn, this improves the website’s reputation. A better reputation increases the website’s authority and pushes it higher on search engine results pages, so more people can find it.

Graphic of how IA and SEO work together with black background and pink and blue graphics

Good information architecture only has to be designed once.

Like most systems, the best IA only has to be designed once. If an IA system is effective, it will allow a website to scale and respond to changes needed for the most current SEO strategies. As more blogs, products, or landing pages are added to a website for SEO, good IA already has a designated location and system to handle them.

Why is Information Architecture Important in UX?


As your local librarians will tell you, providing easy access to information is priceless. Information is both empowering and vital for the best individual experience and a better society. However, when it comes to your UX, IA has a more specific importance. It increases your brand’s value to potential clients while bolstering your sales.

Good IA structure based on set principles has the power to help people find what they are looking for within seconds. One of the simplest examples of this is concise and accurate folder labels in your Google Drive. This naming or navigation system allows you to access the files and information you’re looking for quickly and effortlessly–leading to less frustration and wasted time.

While more complicated, Google Maps also uses IA to help people find what they’re looking for in the physical world. For instance, if you type “food near me,” your search results will be full of nearby restaurants. This demonstration of IA is a perfect example of helping users find what they are looking for, since the user is most likely looking for businesses that serve food.

How to Improve Your Information Architecture

Improving your information architecture can turn your website from an ordinary e-commerce page into a resource visitors enjoy using. These tips can guide you through how to improve your IA and help you prioritize which tasks to begin with.

1. Utilize wireframes in the prototype stages of your sitemap and IA design development.

Male hands moving elements of a website around on a hand-drawn IA or wireframe website layout

Wireframes serve a multitude of purposes when it comes to developing strong IA and a sitemap. They work superbly as information architecture diagrams that can be moved around and changed before your design is finalized. 

At their very core, wireframes connect your IA to its UX design. In striking similarity to an architectural blueprint, a wireframe functions as a skeletal outline of a site or mobile app. However, this method of UX development is not limited to visual design, unlike a mockup. To accurately determine the logic of your site’s flow and the intended customer journey, this is a necessary step in your IA project timeline. Your site’s intended functions can best be evaluated through wireframing.

Through wireframing, you will have a solid idea of your visual hierarchy when you are ready to move your site to the content strategy phase. Common elements of a wireframe include 

  • Search fields
  • Breadcrumbs
  • Navigation systems
  • Headers and footers

Ideally, you would use wireframes during your initial UX/UI design process. However, you can still utilize them on an existing website.

Identify Paths with Wireframes

Aside from assessing functionality, wireframing is a particularly useful method of identifying paths between web pages. This critical phase of the IA process will allow you to visualize how much space should be allocated for specific content.

When Prototyping Your Visual Hierarchy, Start with a Sketch

Low-fidelity wireframe versions of a website are quick to develop and more abstract because their main focus is on the visual hierarchy of your site. These bare-bones prototypes often implement mock content (like Latin text) as filler for spatial visualization. However, they provide you with a guideline for content volume when the time comes.

Linking concepts to tangible images and links can be a complicated process, even for the seasoned designer. If you have trouble getting your ideas to match your result, consider implementing a mind mapping software like XMind. XMind is a productivity tool used professionally to solidify brainstorming.

Move from Broad to Detailed Wireframes

Laptop sitting on a wood surface, open to high fidelity wireframe with images, text, and a search function

Conversely, high-fidelity wireframes are more detailed. They include metadata about a particular page element, like its behavior or dimensions, which makes them excellent blueprints for previewing your interaction design.

2. Keep your brand personas in mind throughout the UX design and content strategy process.

A 5 by 5 grid showing various women with different styles to demonstrate a variety of brand personas

Unity and consistency across your brand are integral parts of a solid information architectural system. 

Your site is a reflection of your brand, from the elements of your visual design down to each blog post and product page. Accordingly, you should be keeping your brand personas in mind each time you implement a UX feature or post a new content piece. This ensures fidelity between your company and your target audience. Use your personas as a guide to help you, your design team, and your content strategist collaborate on your ideal user perception. 

Define and Implement Your Goal User Perception

Your goal user perception is the way you would like customers or potential customers to view your brand. Before making one of the many decisions IA requires, run your ideas through this line of questioning:

  • Does this align with the image I want to create for my brand?
  • Will this decision affect consistency across my site or organization?
  • Am I appropriately conveying the good qualities of my business?
  • Does this get us closer to our main goal?
  • How does this project fit into the future of our company?

Any content or design elements that do not hold up to this line of questioning can be eliminated. Not only can this process help you avoid inconsistencies, but it reduces the possibility of having too much content on your site. This benefits your web admins, especially those who keep up with content creation for SEO purposes.

3. Your visual hierarchy determines readability, so prioritize your content accordingly.

A laptop with monitor behind on a desk with mouse beside showing website with visual hierarchy

Visual hierarchy is a principle of laying out and sizing visual elements to denote their importance to the viewer. For example, alignment, texture, whitespace, and contrast are a few of the visual design concepts that can help draw users’ attention to the right content. An effective user interface design does more than simply provide information. A quality hierarchy can persuade and impress users.

There are a few aspects of visual hierarchy that are highly beneficial to apply when creating UX design based on cognitive psychology. 

Visual Hierarchy Principles to Keep in Mind:

1: Larger images are perceived as more important

2: Bright colors garner more attention

3: Elements that are aligned are more pleasing to the eye

4: Higher contrast demands more attention

5: Repetition tells the viewer that elements are related

6: Proximity (or closeness) denotes interconnectedness in topic

7: More white space around an element draws more attention to it

Visual unity isn’t just essential to your brand image, it is also a critical part of your UX design. Familiar colors, menu hierarchies, and diagrams promote consistency and fluid usability. Even small distractions like slow-loading graphics or unaligned text columns can interrupt the user experience.

There are several useful pieces of IA software that can assist you in your UI development process, like OmniGraffle. OmniGraffle is used to create visuals and graphics for use in prototypes and mockups. As mentioned above, high-fidelity site frameworks utilize these types of visuals and graphics to help designers strategize where to put information and why it belongs there.

Visual Tidiness Affects More Than Just Usability

If you have ever been to a site that was unattractive, cluttered, or disorganized, you likely formulated a negative opinion of that business or organization. Perhaps you even deemed the information to be less reputable due to the nature or design of the site. This is why it’s important to stick to simple, user-friendly design. Together, a pleasant UX and UI can boost user confidence and solidify your site’s credibility.

In addition to building trust among your users, a quality UX also lets Google and other search engines know that your site is worthy of ranking.

4. Structure and categorization are fundamental.

A screenshot of LinkGraph’s menu to demonstrate how content is structured and categorized

One mistake many people make is putting all of their content in one place. In fact, overstuffing information into a single URL causes your UI to suffer, since there is no hierarchy or sense of organization. With too much information on a single page, it takes users much longer to sort through the content and find a specific piece of information.

Users should be able to locate all desired information on your website quickly and easily. This requires a well-planned site map.


Category is… A Better User Experience

To create a better structure, you must first go through the process of categorization. Categorization is the process of organizing your content into a taxonomy system. Categorization is an integral part of navigation design because it has the ability to guide the user to the right content. 

Start by Finding Commonalities

Begin by grouping your content by similarities in content type. For example, in the image above, you will notice, at LinkGraph, we group our resources by format type (eBooks, blog posts, case studies, and videos).

The most common similarities should land higher on your sitemap since they’re usually the starting place for narrowing down the user flow for optimal navigation.


For example, if your website centers on pet care, you likely will want to first group your products or articles by pet species. From there, you may want to divide the information or products into what aspect of care they provide. As you can see, this would make navigation easier for cat owners looking for a technique or clippers to trim their cat’s nails.

Using tools like SearchAtlas can make long-term organization easier by allowing you to group pages into categories. This allows you to see the category performance, so you can target where you can eliminate or improve content.

A screenshot of SearchAtlas’ page group performance view

Eliminate Unnecessary Content and Categories

While generating new content is extremely important, making sure you have room for this content on your site is also essential. It can be tempting to hold onto content that you have created, but it is best to let it go to make room for site updates.

Omitting unnecessary or irrelevant data can also enhance the user experience. So, don’t be shy to perform a content audit and delete pages that receive little-to-no traffic. A potential customer looking for a specific piece of information may become frustrated or lose interest in your digital product if it is too difficult to find.

 

5. Your homepage shouldn’t be the only local navigation point.


While the ideal destination page is the homepage, users find nearly endless different ways to land on a website. For this reason, the digital design of each page on your website should share the same functions as your homepage.

Your website will likely be backlinked on other websites to enhance reputability and SERP ranking when your business begins implementing a content strategy. Since backlinking incorporates relevant keywords that may bring visitors directly to content, such as blogs or guides, you should ensure that every entry point of your website is equally user-friendly and visually appealing as the homepage in order to make a good first impression and move visitors beyond the landing page.

 A landing page for HGTV on tips for adopting a dog. A pink box around the navigation menu.


For example, if a user enters your site through the contact page URL, it should be easy for them to find navigation elements that will take them to the homepage or digital product browsing section.

Provide tools to make finding resources easier

An efficient search system is the backbone of a great user interaction design. It allows any of your webpage visitors to find what they’re seeking in seconds rather than minutes.

Provide FAQs with links to more specific information. This gives users the choice of how much information they need and an easy way to access it.

Keep a navigation menu at the top of all your subpages. Subpages need to provide access points for other activities you offer. Otherwise, your users may never travel from a subpage to your sales funnel (or another offering on your main page).
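As a simple sketch, a shared header like the one below (page names and paths are placeholders) gives every subpage the same entry points as the homepage, and a breadcrumb trail shows visitors where they are:

    <!-- Site-wide navigation repeated at the top of every subpage -->
    <header>
      <nav aria-label="Main">
        <a href="/">Home</a>
        <a href="/services/">Services</a>
        <a href="/blog/">Blog</a>
        <a href="/contact/">Contact</a>
      </nav>
    </header>
    <!-- Breadcrumb trail on the subpage itself -->
    <nav aria-label="Breadcrumb">
      <a href="/">Home</a> &gt; <a href="/blog/">Blog</a> &gt; Adopting a Dog
    </nav>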

6. Go through the customer journey then map out a blueprint for improvements


The best usability testing you can perform is going through the actions of a potential customer. You can do this yourself by going through your website manually. Mind maps can also make the task of mapping the customer journey easy.

For best results, anticipate how a user will engage with your interaction design. Once you have a clear blueprint of your users’ needs, you can create an information hierarchy and a sitemap. Your sitemap allows Google’s bots to crawl your URLs to identify information used for SERPs.

Keep Speed In Mind

In general, the online community values convenience and speed above all. A recent UX study demonstrated that 53% of visits are abandoned if a mobile app or site takes longer than three seconds to load. This means that from any decision point, your web design has about 3 seconds to sort and present the piece of information or digital product the user is looking for.

This is to say that load time, page speed, and click response are essential parts of your information architecture, and it is important to keep up with their performance. Thankfully, tools like SearchAtlas make it easy to keep tabs on these metrics.

Artificial Intelligence and the Customer Journey

The behavior of an internet user is relatively predictable, and artificial intelligence technology can now mimic user activity for rapid results from AI user testing and other usability testing efforts. In conjunction with heatmaps, you can pinpoint where users tend to get hung up and where decision points turn into exit points.

Perform Regular Performance Audits and Fixes

A screenshot of SearchAtlas’s issues reporting tool for site performance

SearchAtlas can make tracking and monitoring performance simple once your site is live. This can help you improve the customer journey by identifying navigation issues such as broken links.

Identify What Pages Visitors Use Most with GSC Insights

A screenshot of SearchAtlas’s top pages tool

Finding which pages on your site that visitors utilize most can help you prioritize their functionality when auditing your site’s performance. This also gives you insight into which categories of content your target audience is most interested in.

7. Make sure the information part of your information architecture is high quality.


Findability, usability, and graphic design are all essential elements of good IA. However, the content you are managing needs to be as relevant as it is organized. In the same way that an information architect is well-versed in the science of organization, content strategists and content creators are experts in SEO and in how to improve content structure.

Reader engagement is a must when it comes to metrics like engagement time and scroll distance. The easiest way to improve your content and encourage deeper navigation is with clear headings that act as a road map to your content. The first thing many visitors will do is preview your headings and images for relevance to their search terms.

The quality of your metadata and headings will also drive more visitors to your site and reduce your bounce rate.
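As a rough sketch, the metadata and headings being described here boil down to a page skeleton like this (the titles and copy are placeholders, borrowing the pet care example from earlier):

    <head>
      <!-- Metadata surfaced in the search results snippet -->
      <title>How to Trim Your Cat's Nails | Example Pet Care</title>
      <meta name="description" content="A step-by-step guide to trimming your cat's nails safely, plus the clippers we recommend.">
    </head>
    <body>
      <!-- Headings act as a road map for skimming visitors and crawlers -->
      <h1>How to Trim Your Cat's Nails</h1>
      <h2>Choosing the Right Clippers</h2>
      <p>...</p>
      <h2>Step-by-Step Trimming Technique</h2>
      <p>...</p>
    </body>

A logical h1–h2–h3 hierarchy with descriptive wording gives readers a reason to keep scrolling and helps search engines understand the page.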

Structure Your Content for Usability and SEO

The Core Web Vitals update made the structure of content an even higher priority. This change takes into account how long it takes for users to access the most important aspects of your website. As a result, most IA designs now place data-heavy elements below the page fold. And if these elements are vital assets to your brand, you need to give visitors a reason to scroll far enough through your content to move beyond the fold. This is where the quality of your content comes in.

Information Architecture: The Science of Organizing the Customer Journey

The impact of well-strategized information architecture continues to become more and more profound. With information architects, UX experts, and content auditors, websites are better able to provide every user with easier access to their desired outcomes. Through the science of user behavior, cognitive psychology-based UI design, and strict hierarchy patterns, IA is improving the internet for all users.

Better IA can set your business apart from the competition. With LinkGraph’s team of visual designers, content curators and creators, and UX web developers you can turn your site into a top-performing contender on search engines, the worldwide web, and among your loyal base of customers. If you’re ready to watch your business grow, we’re ready to take on your next project.

The post 7 Tips for Better Information Architecture on Your Website appeared first on LinkGraph.

Case study reveals results from LinkGraph’s Best B2B Campaign 2021 https://linkgraph.io/case-study-bright-pattern/ Tue, 06 Jul 2021 08:54:03 +0000 https://linkgraph.io/?page_id=9631