14 Content Upgrades That’ll Skyrocket Your Lead Generation

Lead generation is vital when it comes to getting customers interested in the goods and services you’re selling. One of the best ways to do this – as you probably already know – is to publish consistent blog posts ending in calls to action (CTAs). However, it doesn’t have to stop there. You can use your existing blog posts to increase lead generation by offering content upgrades to readers who sign up for email content. These content upgrades can come in many different forms, and they encourage readers to submit their email addresses and other information about themselves in exchange for exclusive content.

Here are 14 ways to upgrade your content alongside your blog posts to increase lead generation 👇

Content Upgrade #1: Use CTAs in Blog Posts

As mentioned in the introduction, if you’re not already putting CTAs in your blog posts, you really should be. CTAs compel readers to take action once they’ve read your blog to either find out more about the topic or to engage with your goods and services. Include links to other internal blog posts, or to a relevant service you can provide which is connected to the topic of the blog.

This is especially important when we consider that inbound links are one of the most important factors Google takes into account when ranking your website, and play a critical role in SEO.

Content Upgrade #2: Continued Content

Continued content is a type of exclusive content where you simply continue the blog post for email subscribers only. This increases lead generation by getting users to input their email addresses to read the full article. Sometimes it’s as simple as writing a top 10 post instead of a top 15, then including the extra 5 points in the email marketing campaign sent to subscribers.

To do this effectively, you’ll need to include a CTA at the end of your blog post with an email subscription box informing readers they can have the rest of the blog sent directly to their email address.
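The CTA described above boils down to a simple email-capture form at the end of the post. Here is a minimal sketch – the `/subscribe` endpoint, class names, and copy are placeholders, not any specific email provider’s markup:

```html
<!-- Minimal email-capture CTA for the end of a blog post.
     Point the form action at whatever email marketing tool you actually use. -->
<div class="content-upgrade-cta">
  <h3>Want the rest of this article?</h3>
  <p>Enter your email and we’ll send the full post straight to your inbox.</p>
  <form action="/subscribe" method="post">
    <input type="email" name="email" placeholder="you@example.com" required>
    <button type="submit">Send me the full article</button>
  </form>
</div>
```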

Content Upgrade #3: Exclusive Discounts

If you have a landing page where your ultimate CTA is to sell a product, offering exclusive ‘members only’ discounts to email subscribers is a great way to increase lead generation.

The discounts don’t have to be huge – even 5-10% will be enough to get a portion of your readers to sign up with their email address. Nowadays, this is a common practice even in large companies to improve lead generation.

Content Upgrade #4: Cheat Sheets

Cheat sheets are an excellent form of content upgrade because they provide a huge amount of value to both you and your subscribers. They are especially useful for blogs covering practical topics such as coding. For example, you could create a cheat sheet of common Linux terminal commands, meaning subscribers can use it in their day-to-day work. Connecting this to a blog post on the same topic is a great CTA and should drastically increase lead generation.

Content Upgrade #5: Checklists

Checklists, like cheat sheets, are a great way for your subscribers to practically engage with your content and to make their own lives easier. It also gives your audience an opportunity to keep themselves on track with whatever they’re working on with a list outlining every step they need to complete to achieve their goals.

Checklists are also a great way to show off your creativity and visual flair to customers by making an aesthetically pleasing piece of content which matches the style of your brand.

Content Upgrade #6: Taster Courses

If you’re offering some kind of course or service alongside your blog posts, why not curate a small, free course for your email subscribers to increase lead generation towards your paid courses?

The course could take almost any form – from e-books and video tutorials to an entire email series. Keep this flexibility in mind when deciding what your course actually covers. For example, if your blog or company provides advice for start-ups, you may want to offer a free email course on social media marketing.

Content Upgrade #7: Printable PDF Guides

A great way to get your subscribers to practically engage with your brand is by sending printable planners, worksheets or goal trackers for them to fill in using pens and pencils. Although this may not be for everyone, if your business provides advice, goods or services for creatives it may just be the best option. 

The best place to start when creating a printable, downloadable PDF guide is to think about what people enjoy writing down or crossing off and how this could relate to your blog content. You can even create a basic Google Doc that readers can print and use.

Content Upgrade #8: Audio Blogs

Multiple newspaper websites already provide audio options for readers who are visually impaired or who need their eyes elsewhere while they listen. Giving your readers the option of listening to your blog posts rather than reading them is an excellent way to both boost inclusivity and capitalize on those who would rather listen than read.

Additionally, if your organization holds livestreams or webinars these can be re-uploaded and distributed to your customers via email. The easiest way to do this would be by uploading the content as an unlisted video to YouTube, meaning only those with the link from the email would have access to it.

Content Upgrade #9: Script Templates

Scripts are great for anyone looking to send out mass emails, or who email the same types of people very frequently. This could be super helpful for PR managers reaching out to journalists, or anyone who is struggling to find the right words for certain situations. 

A great way to implement this would be as a CTA on articles which are designed to target people who are new to a certain employment status, e.g. new PR managers or anyone else who needs to get to grips with the basics of their job.

Example: Prowly

Content Upgrade #10: Case Studies

Case studies are a great way to connect simultaneously with both customers and those on your email list. Showcasing your product or service’s success is a great way to increase lead generation, since it shows customers exactly how your product or service could be used in situations that may be similar to their own. 

Example: LinkGraph

Case studies are usually accessible on an organization’s website, but using them as a way to generate leads is also a really smart idea.

Content Upgrade #11: Challenges

Another interactive way to improve lead generation is through challenges. These could run over email or via avenues such as Facebook groups, so everyone can participate and interact with one another. This works great as a CTA in a blog post dedicated to a specific goal, such as growing your email list to a certain number of subscribers or creating a set amount of content within a set time.

This simultaneously allows your readers to connect with you as a creator whilst creating a fun and productive challenge to generate new leads and email signups. As a challenge reward, your readers can also get bonus content such as free templates or checklists.

Content Upgrade #12: Infographics

Creating infographics might sound a bit daunting, but infographics can be made relatively easily using certain graphic design tools such as Canva or Visme. Infographics are great for providing statistics and quotes in a visually appealing style, and the format is easy-to-understand and digestible. 

Infographics can also be used for data visualization using charts and graphs to share significant numbers and data points. These can be shared as a CTA at the end of a blog post, telling the reader that if they sign up to the email newsletter they can find all of the statistics behind your blog post in one place.

Content Upgrade #13: Ebooks

Free ebooks are an excellent way to provide further information on a relevant topic covered in a blog post. As a CTA, they can be useful in persuading people who want to learn more about the topic to subscribe to your email list. The great thing about these is they can be as long or as short as you want – there really is no standard length for an ebook (you can even write a mini ebook). 🙂

Example: Leadfeeder

Content Upgrade #14: Free Trial

Free trials are used ubiquitously across all industries, and for good reason. Giving customers a sneak peek into content they could have access to for a monetary fee makes them more likely to pay for the full experience. Even if your organization offers a free version of the paid product, giving email subscribers access to the full experience is a great way to increase your conversion rate and convert free users to paid users.

Increasing your Business’s Lead Generation with Content Upgrades

These 14 content upgrade ideas should give your business everything it needs to build more lead magnets and improve its lead generation efforts. Better CTAs in your blog posts, combined with more people signing up to your mailing list, should drastically improve your organization’s lead generation.

 

About the Author: Romana Hoekstra is a Content Marketing Lead at Leadfeeder, a B2B visitor identification software that tracks and identifies companies that visit your website. Currently, she is leading the remote-first content marketing team and crafts high-performing content marketing strategies with a focus on organic growth, SEO, and high-quality content production and distribution. You can connect with Romana on LinkedIn.

Law Firm SEO – A 20 Step Action Plan for Attorneys

Are you trying to win new clients for your law firm? Prospective customers are already using search engines to find you, and having an effective SEO (search engine optimization) plan for your law firm is the best way to take advantage of this!

Need proof? Check out these stats:

  • 96% of people use a search engine to seek legal advice.
  • 38% of people use the internet to find an attorney.
  • 62% of legal searches are non-branded (for example, Miami car accident attorney).

Additionally, your law firm website is a great spot to generate new leads for your firm. 74% of consumers who visit a law firm’s website end up taking action, such as contacting the firm by phone.

Also, the lawyer SEO competition doesn’t necessarily reflect the legal market you’re in. Only 35% of law firm websites have been updated in the last 3 years, and 40% of law firms don’t even have a law firm website.

In short, by applying an effective law firm SEO strategy, you’ll leap ahead of most of your competitors.

To help you put together your own SEO campaign, I’ll show you how to rank your law firm #1 in Google – step-by-step.

ARTICLE CONTENTS

TECHNICAL SEO
Step 1. Determine Website Structure
Step 2. Set Up Your GMB Listing
Step 3. Improve Your Site Speed
Step 4. Mobile Optimization
Step 5. Implement SSL
KEYWORD RESEARCH
Step 6. Understand Search Intent
Step 7. Find the right keywords
PAGES & CONTENT
Step 8. Identify User Content Goals
Step 9. Format Your Pages Properly
Step 10. Optimize Your Home Page
Step 11. Create Practice Pages
Step 12. Rank Better with Blog Posts
Step 13. Fix Zombie Pages
DOMINATE LOCAL SEARCH
Step 14. Tailor Pages to Markets
Step 15. Legal Directory Citations
Step 16. Claim & Manage Reviews
LINKS
Step 17. Outbound Links
Step 18. Inbound Links
MEASURE RESULTS
Step 19. Tools to Use
Step 20. KPIs

TECHNICAL SEO FOR ATTORNEYS

Step 1. Determine Your Website Structure

Structure your own website so your users (and Google) can find everything. Your website needs to have a defined structure. Without one, it’s difficult for users to navigate and difficult for search engines to crawl and discover your web pages.

Structuring your site for your users

Users need to be able to easily find what they’re looking for. This means understanding what information people seek out when visiting your law firm’s website, then putting that important information on the homepage or making it easy to access from the navigation bar.

For example, if a prospective client is looking for a personal injury attorney in Miami, they may search your firm’s website for practice areas, office location, reviews, and the about section.

Look at how this law firm’s website quickly addresses those needs with their navigation bar.

Putting critical items in the navigation bar makes them quick and easy to access. Take a look at these three examples of law firms ranking on the first page for “personal injury attorney” in NYC, and you’ll notice they include each of the items above in their main nav.

EXAMPLE 1

EXAMPLE 2

EXAMPLE 3

For any practice area, it’s a good idea to have these items in your navigation menu:

  • Practice areas, either directly on your navigation menu or as a dropdown if you have multiple services. This gets your law firm’s service-based keywords on every page of your site, sending strong relevancy signals to Google crawlers.
  • Location information, either directly on your navigation menu or as a dropdown if you have multiple locations. This gets your location-based keywords on every page of your site, sending strong relevancy signals to search engine algorithms about the geographic area you serve.
  • A link to your attorneys or about page, which should give an overview of the years of experience of your whole legal team.
  • A link to your reviews or testimonials page – to build trust.
  • A clearly labeled “Contact Us” link in a unique color, where visitors can reach you via phone number, email, or an embedded contact form. This is a call to action.
  • Your phone number. This is another call to action. Even if you already have a “contact us” button that links to a contact page, 74% of people who land on a law firm’s website are likely to contact you via phone, so making this form of contact as simple as 1 click is to your advantage.

If you’re unsure about what users are likely to look for on your website, search Google for your practice areas and look at the top-ranking competitors’ sites to get ideas for your navigation and site layout.

Finally, it’s important to make sure your navigation menu is usable both on desktop and mobile.

In fact, 31% of all law firm related website traffic comes from mobile, so a large share of your leads are likely to come from a mobile device.

Take a look at how this law firm’s website made its navigation menu easy to access and use on mobile phones.

You’ll notice that the area between the buttons is large enough that everything is easy to touch – even if you have a small screen and big fingers.

This is referred to as the “tap area” of a button and is a key component of converting on mobile devices. Make sure this is sized appropriately for phones and fingers of all sizes. Users can become easily frustrated if they have a difficult time tapping the correct button on a mobile device and may leave your site.

Structuring your site for Successful SEO

Google also uses your law firm’s website structure to determine what website content is important and relevant information. Here are a few ways to help Google crawl your law firm website in a more effective way.

Use proper page and URL Structure

Ideally, your website as a whole should be structured like a pyramid, with your home page at the top, your category pages (the ones in your navigation menu) beneath that, and your individual pages beneath your category pages.

Not only does this make it very easy for users to find relevant content on your site, but also makes it easier for search engines to index each page of your website.

When formatting your URLs, this means that any pages linked to in the main navigation menu are only one folder deep from the homepage.

This means that they should only have one slash after the .com, .net, etc. (a.k.a. the “top-level domain”).

So, your about page should look like https://yourdomain.com/about

Any individual pages that are a subset of your category pages, like blog articles, should only have two slashes after the top-level domain.

For example, blog articles would look like this: https://yourdomain.com/blog/how-to-hire-a-personal-injury-lawyer

Clear URL structure makes it easy for search engine crawlers to find pages on your law firm website.

Clear linking and navigation titles

The placement of navigation items is an important factor for users and search engines alike. While users are more likely to pay attention to navigation titles, search engines use the anchor text of these navigation items to determine the topical relevance of a page.

What is anchor text?
Anchor text is the clickable text in a hyperlink.
Here’s what it looks like in your site’s code
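Reconstructed from the example described below, a basic anchor element pairs a destination URL with its clickable text:

```html
<!-- The href is the destination URL; the text between the tags is the anchor text -->
<a href="https://www.jonwye.com">Jon Wye's Custom Designed Belts</a>
```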

With this code in place, the anchor text “Jon Wye’s Custom Designed Belts” would link to the URL “https://www.jonwye.com”.

If we inspect the code on Harrell & Harrell’s site, we can see this in action. Here’s the navigation menu item’s anchor text for the user.

And here is the URL structure for the link the navigation menu item points to.

The anchor text of your navigation items is important because it sends “link signals” to search engines that tell them “Hey, these pages are very important!” By having these links on every page of your law firm’s website, you’ll be sending strong link signals to search engines and helping them understand what these pages are about – because of the anchor text.

These same link signals can be leveraged in the footer of your law firm website as well. Adding links to pages such as your blog, privacy policy, or sitemap in the footer can help boost the link signals to these pages without taking up space in the main navigation menu.

Proper use of H tags – How to use H tags for SEO

Header tags (commonly called H tags) outline the structure of your page. Often, an H tag is used as the title displayed on the page, while the page title is what’s displayed in the organic search results.

These tags are often followed by a number – H1, H2, H3, etc. This is to show where they lie in the hierarchy of your page structure.

Common H tag page formatting looks like this:
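As a sketch in HTML (the headings below are invented for illustration):

```html
<h1>Law Firm SEO Guide</h1>          <!-- page title: one H1 per page -->
  <h2>Technical SEO</h2>             <!-- subtopic of the page -->
    <h3>Site speed</h3>              <!-- subtopic of the H2 above -->
    <h3>Mobile optimization</h3>
  <h2>Keyword Research</h2>          <!-- another subtopic of the page -->
```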

See how they outline the hierarchy structure of a page? H1 would be the page title, H2 would be a subtopic of the page, and H3 would be a subtopic of the H2 header.

Notice the difference between these two articles. One uses H tags properly, while the other writes its headlines in plain text.

Header Tags Used Properly

Header Tags Not Used Correctly

Using H tags for your headlines helps search engines understand the structure of your page and makes it easier for your users to find what they’re looking for more quickly.

When writing your H tags, keep a few things in mind:

  • Only use 1 H1 tag on your page.
  • Use H2, H3, and other H tags to segment out the content of your page.
  • Use related keywords in your H tags.

Step 2. Create and optimize your law firm’s Google My Business listing

85% of people use online maps, such as Google Maps, to find legal services.

Google Maps is a huge part of local SEO. If your firm largely targets local clients, then getting listed on Google Maps is a must.

How to add your law firm to Google Maps

So, how do you get listed on Google Maps?
By creating a Google My Business listing.

Here’s how.

Google My Business best practices

Google uses information from Google My Business to display information for searches that have local area intent.

Not only that, but rather than listing information from your website on search results, Google often pulls business information from your Google My Business listing as well.

The information for Morgan & Morgan in the above screenshot is coming from their Google My Business listing.

Clearly, it’s important that this information is up-to-date, accurate, and fully optimized.

How to optimize your law firm’s Google My Business listing

Here’s how to optimize your firm’s Google My Business account:

  • Enter your business information correctly on the map so users can easily find you.
  • List the official website of your law firm.
  • Include your opening hours.
  • Make sure your business name, address, and phone number are EXACTLY the same as listed on your website. Google aggregates this information from across the web.
  • Choose the most appropriate and specific category for your firm so that you show up in the right search terms.
  • Add photos of your office, staff or anything else you’d like that’s relevant and professional.
  • Describe your law firm. Include links and relevant keywords in the introduction.

If you’re interested in seeing how users behave with your listing, check out Google My Business insights.

Step 3. Make your law firm website as fast as possible

Google is now mobile-first, which means it primarily crawls and indexes the mobile version of your site – and assumes many users are browsing on slower mobile connections.

They want to provide users with a great page experience. Presenting users with slow websites doesn’t accomplish this, so if you want higher rankings, your website needs to be fast.

Due to its impact on user experience, website speed is one of the most important SEO ranking factors.

If a website takes a long time to load, the user will click back to Google to find a better choice. Google will simply think the user didn’t find what they were looking for and your website rank will drop.

Google found clear correlations between page speed and bounce rate: as load time stretches from one second to three, users are 32% more likely to leave.

Google takes page speed and bounce rate into consideration when ranking your website, so it’s important to make your site as fast as possible.

To make your site as fast as possible, use Google’s PageSpeed Insights tool to see how your site loads on both desktop and mobile. This tool is a simple way to discover any issues that you can address to make your site faster.

Step 4. Make sure your site is mobile friendly

Consider this – you’re a personal injury attorney, and a potential client just got into a car accident.

They try to access your site to call you, but they have a poor mobile connection.

Or worse, they’re nervous – their adrenaline is pumping – and they’re having trouble tapping their screen with accuracy.

Your website takes too long to load, and when it finally does, the user pushes the wrong button by accident, so they move on to the next listing in Google.

This is why mobile optimization is important for attorney websites.

At a minimum, you should make sure that:

  • Your website loads quickly on mobile.
  • Your buttons are sized so that people with small screens or big fingers can tap them without accidentally hitting a different button.
  • Important information stays above the fold – i.e. your call to action is visible without requiring users to scroll down.
  • You have a click-to-call button for mobile.

Check out Lawrence Law Group’s site as an example of doing this correctly.
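A click-to-call button is just a telephone link styled as a button; a minimal sketch (the number and class name are placeholders):

```html
<!-- tel: links open the phone dialer on mobile devices -->
<a href="tel:+13055550123" class="call-button">Call Now: (305) 555-0123</a>
```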

Finally, you should make sure your design is great. 57% of users won’t even consider your firm if the website is poorly designed on mobile.

On top of this, Google prioritizes mobile experiences when ranking websites.

Step 5. Secure your law firm’s website with SSL (Secure Socket Layer)

Ever come across a website and see something like this?

Or worse, this?

Do these websites encourage trustworthiness or make you feel that your data would be safe?

As an attorney, you know that trust between you and your target audience is important, so why would this be any different online?

This is what happens when a website isn’t secured with an SSL certificate.

SSL stands for Secure Sockets Layer, and is essentially a form of validation for your website that confirms there aren’t any intermediaries between a page and the web host that could potentially steal a user’s information.

Basically, an SSL certificate proves that a website is who it says it is. This is shown by a site having https instead of http at the beginning of its domain.

Google has also confirmed that it is, in fact, taken into consideration for rankings.

Often, you can get an SSL certificate through your web hosting provider. Certificates are usually available for an annual fee and will fix all of the issues associated with “website not secure” popups or messages.

If you prefer to go the route of free, or would rather have your SSL certificate not tied to your web hosting provider, you can use a service like Let’s Encrypt instead.

Once you get your certificate set up, plug your homepage https URL into Why No Padlock? to have their tool crawl your site and make sure it’s implemented correctly.

A few things to note about getting your site SSL certified:

  • Google will treat this as if you’re moving your site to a new domain name, which means you may temporarily lose search rankings and organic traffic until Google crawls your site again and reindexes your new https pages.
  • You could end up with a lot of broken links, so it’s important to make sure you properly 301 redirect your http links to your new https links when migrating to https.
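On an Apache host, the http-to-https 301 redirect can be sketched in your .htaccess file like this (assuming mod_rewrite is enabled; nginx and other servers use a different mechanism):

```apache
# Permanently (301) redirect every http request to its https equivalent
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```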

KEYWORD RESEARCH


Step 6. Understand searcher intent and keyword types

Before you start optimizing your law firm’s website, you need to know what kind of keywords you’re going to go after.

In attorney SEO, as in all SEO, a keyword is really just a search term that Google’s users type to find what they’re looking for.

What they’re looking for is described as searcher intent – and can be broken down into three categories:

  • Awareness
  • Evaluation
  • Purchase

Searcher intent addresses the question “what are the searchers really looking for?”
Let’s assume a musician is trying to copyright their music. Here are some search terms they might use in each stage.

  • Awareness – This is where the musician is looking for answers, resources, educational material, and insights. Example search terms (or keywords) may include:
      • How can I protect my music?
      • Do I copyright or trademark a song?
      • How to copyright a song
  • Evaluation – This is the middle stage where the musician knows what needs to be done and is researching options. Example search terms (or, again, keywords) in this stage may include:
      • Music lawyers near me
      • Who are the best copyright lawyers in Nashville
      • Copyright lawyer reviews
  • Purchase – This is the final stage where the musician is figuring out what it would take to become a customer. Keywords used in this stage are likely to be very specific:
    • Law firm name contact info
    • Lawyer name contact info

In this example, the musician wanted to protect their music, learned more about what’s involved, then narrowed down the options until they found the best one for them, then took steps to contact the appropriate firm.

The closer a user gets to a purchase, the longer and more specific their keywords usually become. Highly specific, lower-volume queries like these are known as long-tail keywords.

Step 7. Find the right keywords

Now that you understand searcher intent and know about how people use Google to make purchases, let’s dive into some keyword research.

To find new keyword phrase ideas, just head over to Google’s Keyword Planner, log in, and click “Discover new keywords.”

Next, enter your website or a keyword of your choice to get started. For this example, I’m going to enter a keyword.

Finally, click “Get Results” and you’ll be able to browse a huge list of keywords Google’s tool has generated for you!

You’ll notice that you have columns that show you the monthly search volume and the cost-per-click bid range if you were to run ads.

If a keyword has a high bid, that means advertisers are bidding high amounts for that search query in PPC advertising campaigns – likely because it drives sales.

That means these keywords are likely to have high purchase intent. These are the keywords you’ll likely want to target with pages that have lots of calls to action.

If you want more keyword ideas, you can leverage Google. Just take one of your chosen keywords, plug it into Google’s search box, and look at the “People also ask” section.

If you click one of the questions, Google automatically generates more of them.

It can literally be an endless supply of keyword ideas!

When you find your keywords, remember to use your primary keyword within the H1 and title tags of your page. This gives the search engines a clear indication as to what the page is about. For more information on keywords and keyword research check out Keywords 101: A Beginner’s Guide.

PAGES & CONTENT

Step 8. Understand Google’s content preferences

Google prioritizes pages based on how it views search intent for different terms. You won’t be able to effectively rank a product page for an informational search.

Google often prioritizes long-form content, but content that meets a user’s need always wins out.

It’s the difference between “how to find a good accident lawyer” and “accident lawyer near me” searches. One will land on a blog post/long form content, the other on a directory or services page.

Think about it like this – someone with a broken faucet is looking for contact info for an available plumber, not a long-form article on plumbing.

Step 9. Format your pages properly

When formatting your page, there are a few things that need consideration.

  • Formatting your titles
  • Using H tags
  • Writing meta descriptions
  • Formatting your content

Let’s go over each of these.

How to write your page titles for SEO

The page title is the clickable headline of your page that appears on search engine results pages (commonly referred to as SERPs).

It also appears in browser tabs, like this:


In the HTML code, these are usually surrounded by title tags, which look like this:
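As a sketch, the title tag sits in the page’s head (the title text below is invented):

```html
<head>
  <!-- Shown in browser tabs and as the clickable headline in search results -->
  <title>Miami Car Accident Lawyer | Free Consultation | Example Law Firm</title>
</head>
```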

In most website editors, including WordPress, you won’t need to actually access or write the code. They’ll automatically apply the title tags for you when you write the title of your page.

When writing your title tags, keep a few things in mind.

  • Keep your titles about 55-60 characters long. Too short and they aren’t detailed enough, but too long and they’ll be cut off at the end in search engines, meaning people won’t be able to read them.
  • Use your target keyword in the title as close to the beginning as possible.
  • Describe your page content in the best way possible.
  • Keep your titles unique to the specific page. Otherwise, multiple pages may compete for the same keyword.
  • Use your brand name wisely. In most cases, your brand name should be left to the end of the title.

Here are some examples of well formatted page titles, and one that’s not as well formatted:

While Gunster, Morgan & Morgan, and Dunlap Bennett & Ludwig follow the above guidelines, Gibney Law made a few mistakes:

  • Their title is too long, which caused it to be cut off at the end.
  • They didn’t use a searchable keyword at the beginning of the title. Instead, they led with their brand name.

How to write your page descriptions for SEO

Page descriptions are short snippets of text placed in the HTML that describe the contents of a page. These are known as “meta descriptions” and will show under your page title in the organic search results.

In your code, it will look something like this:
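A hypothetical example of what that markup might look like – the description text is a placeholder:

```html
<head>
  <!-- Hypothetical example: aim for roughly 135-160 characters -->
  <meta name="description" content="Injured in a Denver car accident? Our attorneys offer free consultations and have recovered millions for clients. No fee unless we win. Call today.">
</head>
```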

If you use WordPress or any other website editor, you won’t need to edit the code itself. You can easily control the meta description with plugins like Yoast SEO.

Google has specifically stated that they do not use the meta description as a ranking signal. However, the number of people who click on your website vs. others is a ranking signal, and the meta description influences a user’s decision to click on your website.

Because of this, the meta description indirectly influences your rankings.

So, when writing your meta descriptions, do so with the goal of convincing users to click on your listing rather than stuffing keywords in there.

Here are a few things that can accomplish this:

  • Keep your description between 135 and 160 characters so that it doesn’t get cut off at the end.
  • Don’t duplicate your meta descriptions. Write unique ones for every webpage.
  • Use your keyword in the description. This is important not because search engines use this as a ranking signal, but because the keyword is often highlighted in bold, which can draw attention to your organic listing.
  • Treat the meta as an advertisement for your page. Make it compelling and relevant. It should match the contents of your page while being as appealing as possible.

Step 10. Create a winning home page

Your homepage is the most valuable page on your site. Here are some ways to give it a better chance of securing top rankings.

Optimize for the most competitive terms

As far as search engines are concerned, your home page carries the most weight in terms of value. Because of this, it’s best to optimize the page for your most competitive keyword.

Boyd Law does this very well.

It’s clear what their target keyword is.

Your page may not rank right away, but as you build your domain authority and visibility, it will climb closer to the top of the search results.

Feature reviews to build trust

You need to establish credibility and trust as quickly as possible. The best way to do this is to feature reviews or testimonials on your homepage.

Take a look at how Morgan & Morgan features powerful video testimonials on their homepage.

Use images or videos to boost engagement

Engagement signals like dwell time are widely believed to influence rankings, so it’s in your best interest to keep users engaged on your homepage as long as possible. Videos and other visual elements accomplish this.

Look at how The Law Offices of Peter C. Bronstein does this.

In your video, address the key pain points of your target audience and how you can help with those.

Craft a compelling call-to-action (CTA)

A clear, consistent call-to-action is what generates leads.

When writing your CTA, you want to keep 3 things in mind.

  1. Use action words and be specific – Phrases that encourage users to do something are much more powerful than generic phrases. A CTA like “Call Now for a Free Consultation” is much more compelling than “Click Here to Call.”
  2. Create a sense of urgency – By simply telling users to do something now or that time is running out, suddenly your CTA seems more urgent. You can accomplish this without hard-selling by using the word “Now” or pointing out that users can “reap the benefits today.”
  3. Use contrast in your design – If your CTA is the same color as the rest of your website, it isn’t going to stand out. Use a contrasting color so users are quickly drawn to it.
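Putting the three points above together, a CTA button might look something like this – a sketch with hypothetical colors, link path, and copy:

```html
<!-- Hypothetical CTA: action words, urgency ("Now"), and a contrasting color -->
<a href="/free-consultation/"
   style="background:#e63b2e; color:#ffffff; padding:12px 24px; font-weight:bold;">
  Call Now for a Free Consultation
</a>
```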

Let’s look at an example of a good CTA vs. one that could use some work.

Notice how May Personal Injury Lawyers uses actionable language and tells users what they get from calling – a free consultation. The colors of the CTA contrast with the rest of the design.

Contrast that with the Law Offices of Peter C. Bronstein and you’ll notice that, while the call to action itself is compelling, the CTA button doesn’t stand out from the rest of the design.

Step 11. Create practice area pages

After your homepage, your practice area pages are going to be the next most valuable in your SEO efforts.

It’s important to make individual pages for each practice area because it gives you more opportunity to go after keywords related to those practice areas by addressing the specific needs of that audience.

If we look at Morgan & Morgan’s website, we’ll see that they have a dropdown listing all of their practice areas.

For each of those practice areas, they have a unique page.

On your practice area pages, you want to include the following:

  • The purchase intent keyword related to this practice area as identified in your keyword research (discussed above in steps 6 and 7).
  • Page titles and meta descriptions with the keyword’s searcher intent in mind (discussed above in step 9).
  • Answers to common questions about this practice area. You can identify common questions by typing your target keyword into Google and looking at the “People also ask” suggestions. Each question you address on this page should use an H tag so that search engines understand the structure of your page.
  • Testimonials from clients that you’ve helped in this specific area of practice.
  • A call-to-action that’s specific to this area of practice.
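Search engines read heading tags to understand page structure, so a practice area page following the checklist above might be outlined like this – all of the headings are hypothetical:

```html
<!-- Hypothetical heading outline for a car accident practice area page -->
<h1>Denver Car Accident Lawyer</h1>
  <h2>What should I do after a car accident?</h2>    <!-- "People also ask" question -->
  <h2>How much does a car accident lawyer cost?</h2> <!-- "People also ask" question -->
  <h2>What our clients say</h2>                      <!-- testimonials section -->
```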

Step 12. Become an authority with epic blog content

You already have lots of legal info in your head from your experience. Creating high-quality content on your site that communicates this effectively to potential clients positions you as an authority in your legal space.

Consider your practice areas and think about how you can create great content and super-detailed blog articles that help potential clients.

Things like step-by-step guides, simplifying otherwise complex topics, or even simple blog posts on key pain points your audience may face are great examples of this.

Look at how Peterson Watts Law Group does this with their music copyright article.

You can come up with content marketing ideas by:

  • Looking into your clients’ most commonly asked questions.
  • Reviewing Google’s “People also ask” questions for related search terms.
  • Reviewing questions on Quora in your space.
  • Using tools like Google Trends to find topics that are trending online.

When you create your content, keep in mind that you’re writing for the internet – which means you should make your content easy to skim. Here’s how:

  • Write short, concise sentences using simple language. Try to keep your writing at a 10th grade reading level or below. Tools like Hemingway can help you determine the readability of your content.
  • Use lots of white space by keeping your paragraphs 1-3 sentences long.
  • Use the right font size. A larger font – around 22px is often recommended – provides a better reading experience online.
  • Use bullet points whenever possible.

While it’s important to intersperse your keywords throughout your content, Google’s algorithms are getting better at understanding language and will recognize synonyms related to the topic. Covering the topic in full matters more – keyword stuffing won’t work. The primary objective should be that your content fully addresses the needs of its audience.

Step 13. Fix zombie pages

What are zombie pages, and why are they bad for SEO?

Zombie pages are those that exist on your website but provide no value whatsoever – meaning they don’t bring you any traffic.

They usually take on one of the following forms:

  • Duplicate content (ex. News stories copy and pasted from other sites – don’t do this. It can actually make your site appear more spammy and can cause your rankings to drop.)
  • Outdated blog posts
  • Aging press releases
  • Pages that shouldn’t be indexed
  • Archive pages
  • Category and tag pages (often found on WordPress blogs)
  • Search result pages
  • Thin content (<50 words)
  • Boilerplate content

These pages are often indexed by Google, but rank poorly because they provide no value to users.

Because search engines are believed to consider engagement metrics like pages viewed per session and dwell time on a page, thin pages and zombie pages can give Google the impression your site is low-quality.

Should you delete zombie pages?

If you can’t bring a zombie page to life by improving the content and making these pages useful to users, then redirecting the page to more useful content may be the best alternative.

Redirects are especially important if a page you are deleting has any links pointing to it from other websites.

Links pointing to your site from others are referred to as backlinks. We’ll touch more on these later, but in short, Google counts links as votes of confidence to your site and uses them to help determine rankings for a page or pages of a domain. Any high ranking web page likely has lots of links pointing to it.

You can check backlinks to a page with tools like Ahrefs, SemRush, or Majestic.

In most cases, since zombie pages provide little to no value to your users, it’s unlikely you’ll find any backlinks pointing to them.

However, if you do, you should redirect these pages to another relevant page on your site to retain whatever search equity the page had acquired.

How to redirect pages

The best way to redirect pages on your site is using 301 redirects.

A 301 redirect is a permanent redirect from one URL to another. They essentially send visitors and search engines to a different URL than what they clicked on from a search engine page or typed into their browser.

Let’s put this into practice.

If you type google.com (without the “www”) into your browser, you’ll be directed to www.google.com.

That’s because google.com 301 redirects to www.google.com, since Google wanted that to be its primary domain.

Here’s a step-by-step video showing you how to set up 301 redirects in WordPress.
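If you’re not on WordPress – or prefer to work at the server level – a 301 can also be added directly. On an Apache server, for example, a single line in the .htaccess file does it (the paths here are hypothetical):

```apache
# Hypothetical example: permanently redirect an old zombie page to a relevant live page
Redirect 301 /old-press-release/ https://www.example-firm.com/blog/firm-news/
```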

How to properly delete pages

If you don’t have any relevant pages to redirect a zombie page towards, and the page in question has no backlinks, then deleting it may be your best option.

When you delete a page, make sure your server returns the HTTP status code “410 Gone.”

This tells users – and search engines – that you intentionally deleted the content, and will result in Google removing it from its index sooner.
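On an Apache server, this can be done with one line in .htaccess – the path below is hypothetical:

```apache
# Hypothetical example: tell browsers and crawlers this page is permanently gone (HTTP 410)
Redirect gone /outdated-blog-post/
```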

You can use this plugin to do this on WordPress.

Use Lawyer SEO to dominate local search

Step 14. Create pages for specific markets

71% of people looking for an attorney believe it’s important to have a local one.

This means that they’re likely looking for lawyers within their specific geographic area using the name of a town or county that may otherwise be underserved.

If other law firms aren’t targeting these smaller towns or counties, this could be a great opportunity for you.

Just look at how Morgan & Morgan creates a specific page for the small town of Tavares just outside of Orlando, Florida.

You won’t receive as much traffic for these pages as you will on your homepage, but if you target enough areas, it adds up.

Just look for nearby cities, counties, or towns and create pages tailored specifically to each of them with a customized page title, meta description, and page copy.

Do this with 5 surrounding areas for 10 practice areas and that’s 50 new pages that can attract a very targeted audience!

Finally, make sure other pages on your site link to these pages to help improve their link signals. For example, if you write a blog post about car accidents in Los Angeles, California, link to your “Los Angeles Car Accident Attorney” page.


Step 15. Get citations in popular legal directories

If you serve local clients, quality citations – mentions of your business name, address, and/or phone number – are important.

Google considers citations from relevant, reliable websites an important ranking factor in local search results.

Not only that, but lots of people still find lawyers through online directories.

In fact, legal directories often rank for competitive search terms in the legal industry.

Getting citations from targeted directories adds credibility, context, and authenticity to your law firm, and allows you to be found by search engine users who click the directory listings in the search results.

How to find citation sources

The best places to get cited are prominent legal directories and data aggregators.

Legal directories

A good way to think about directory placements is to go after ones that you think you can actually get clients from.

The best way to find these directories is to type all of your target keywords into Google and simply look at the directory listings on the first page. Anything here is worth getting listed in because you can potentially grab second-hand search traffic – i.e., people will click the directory listing in Google, then find your firm.

Some of the most popular legal directories worth getting listed in are:

  • Avvo
  • FindLaw
  • Lawyers.com (paid)
  • Justia
  • Hg.org
  • Nolo

For a full list of directories, check out this page from Moz that organizes citation sources by city. Remember to look at nearby cities as well.

Data aggregators

You also want to make sure your information is correct with all key data aggregators because search engines pull data from these sources.

Most search engines pull their local business data from a handful of major data aggregators, which in turn collect it from other primary sources.

So it’s important to make sure your information is always up-to-date in these sources. Otherwise, your rankings can drop if out-of-date information is passed along.

Step 16. Get reviews on Google, Yelp, and law directories

Reviews are important for Google rankings, click through rates, and creating a perception of trust.

This is true for your Google listings and your law directory listings.

According to Bright Local’s Customer Review Survey:

  • 85% of people trust online reviews as much as a personal recommendation.
  • 73% of people trust a business more because of a review.
  • Yelp and Google are some of the review sources people trust most.

Needless to say, reviews are an essential part of your law firm’s SEO strategy.

Here are some ways you can get reviews on Google, Yelp, and law directories of your choice.

Ask your clients

If you’ve given excellent service to a client, a great way to earn reviews is to simply ask!

Amazon does this through email, so why not do this with your clients as well?

Just send them an email explaining how you want to hear more about their experience with a link to your preferred review source. If you’ve given them great service and have developed a strong relationship with the client, they’ll see your request as a good thing and be more than happy to do this.

Keep in mind that you want to make sure you ask happy clients for reviews since they’re more likely to leave good ones.

Add review links to your site

In some areas of legal practice, clients are likely to revisit your website frequently.

In these cases, leaving links to your preferred review sources can encourage repeat clients to share their experiences.

Take a look at how Johanson Law Group does this.

Just link the “write a review” button on your site to Google, Yelp, Avvo, or whatever your preferred directory is.

Use review generation tools

There are tools that can help automate the customer feedback process to make it easier.

These tools handle client follow-ups on your behalf via text or email which frees you up to handle more important things.

While these are great, if your firm is relatively new, I’d recommend calling or emailing each client individually until you have a consistent inflow of clients to ask for reviews.

LINKS

Step 17. Link to authority sources from your site

Linking out is a great way to show Google that you’re interested in providing value to your users.

When Google analyzes links, they look at them like neighborhoods. If you’re linking out to lots of high quality, high domain authority sites in your industry, and lots of high quality sites are linking to you, Google considers your site as part of a good “link neighborhood.”

The opposite can also be true. If you link out to low quality sites and lots of low quality sites are linking to you, this is a bad “link neighborhood.”

Linking to non-competing legal sites can help enhance a reader’s understanding of a topic you may be writing about on your own site.

This will improve user experience on your site – which can lead to better SERP rankings.

For example, Yavitch & Palmer’s site links out to a number of legal resources related to their areas of practice.

The rel=“nofollow” tag

The rel=”nofollow” tag is a value that can be added to a link’s HTML that tells search engines not to follow that link.

In the code for a URL, it looks like this:
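A generic example of the markup – the URL and anchor text are placeholders:

```html
<!-- The rel="nofollow" value tells search engines not to follow this link -->
<a href="https://example.com/some-page/" rel="nofollow">example anchor text</a>
```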

Google introduced this in 2005 to stop people from spamming blog comments in an effort to get links to their sites that would influence their rankings.

This tag should only be used if a link is paid for or can be easily added by the public (such as in comments or reviews).

Otherwise, you don’t really need to worry about it.

Increase dwell time with outbound and internal links

If people can find what they’re looking for on your site, they’re more likely to stick around.

This includes when you give them what they’re looking for by linking to it.

So if you’re writing a blog article and mention a resource that readers may want to learn more about, link to it!

Peterson Watts Law Group does this regularly in their blog articles.

When you do link out, make sure the pages you link to open in a new browser tab when clicked so users can easily come back to your site. Here’s how to do this in WordPress.
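In the HTML itself, opening a link in a new tab is done with the target attribute – a generic example with a placeholder URL:

```html
<!-- target="_blank" opens the link in a new tab; rel="noopener" is a recommended security addition -->
<a href="https://example.com/resource/" target="_blank" rel="noopener">a helpful legal resource</a>
```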

Follow these same guidelines with internal links – links from one page on your site to another – to help search engines better understand the structure of your site and rank it on an ongoing basis.

Step 18. Increase your rankings with backlinks

One of the best ways to increase your Google search results is to get other sites to link to yours.

Google counts links as votes of confidence. If other reputable sites are linking to you, Google trusts your site more and pushes you up in the search results pages.

These are known as backlinks (i.e. another site is “linking back” to you), and the process of trying to get these backlinks is known as “link building.”

A lot of sites pay for backlinks, but this is against Google’s webmaster guidelines and is known as “black hat” SEO.

“White hat” link building uses methods that follow Google’s guidelines. These methods often require a lot of time and hard work – creating new content like guest blogs or long-form articles that include links back to your website, then pitching that content to other webmasters to publish on their sites. This is the safest way to build links, even more so when working with a reputable SEO agency.

Two of the best ways to get high-quality backlinks for your site are guest posting and HARO.

Guest Posting

Guest posting is the easiest way to get other sites to link to you. It basically works like this:

  1. You find other sites that accept guest articles.
  2. You pitch them an idea.
  3. You write it and send it to them with a link to your site in the body of the content.

Simple, right?

Let’s break down the steps.

Find other sites that accept guest posts

To find sites that accept guest posts, we’ll use Google.

Simply enter search terms like these:

  • Your keyword + “write for us”
  • Your keyword + “guest post by”
  • Your Keyword + “contribution by”

Make sure you keep the quotation marks. They tell Google to only find pages that contain that exact phrase.

When you find a site that looks like a good fit, you’re ready to craft your pitch.

Pitch 3 article ideas

For each of the sites you find, look around at the types of articles they write and come up with 3 similar ones that they haven’t covered yet.

Once you have your article ideas, send them an email that looks something like this:

Hi [Name],

My name is [Your Name] and I’m [Your Company and Role]

I’m contacting you because I’d love to contribute a guest post on [Website].

Here are some ideas I’ve come up with that I think your readers would get a ton of value from:

[Idea #1]

[Idea #2]

[Idea #3]

I’ll make sure the piece overflows with information that can’t be found anywhere else.

To give you an idea of the quality I’ll bring to your site, here’s a link to a guest post that I recently published on [Other Website].

Cheers,

[Your First Name]

Once you hear back from a site, the next step is to write and send your article to them!

When you write your article, make sure it provides real value to their readers and isn’t just an article written in an attempt to get a link. Any good site will see right through this and will reject your article once they get it.

Make sure you link to your site within the body of the content – ideally to a blog post you have. Most sites will link to your site in your bio, but Google usually doesn’t count these.

If you do a good job, they may ask you to contribute content about your main practice area or a specific topic on a monthly basis. That means long-term SEO success for your law firm’s website and elevates you and your team of lawyers as industry experts.

HARO

HARO (or Help A Reporter Out) is a great source of links and press mentions.

Basically, they send you 3 emails each day with a list of topics reporters are writing about for news sites and need some help with – like this:

All you need to do is scroll through the list of topics, pick one out where you can offer value, and write your reply.

Here’s an example of one that’s fit for attorneys.

Just click on the query to be taken down to the section of the email where you can read it in full.

Finally, just click the email address listed with the query and draft your response!

Remember, with HARO, the more helpful information you can provide, the better. Oftentimes, reporters will use only part of what you say, so giving them more to work with gives you a better chance at landing a placement. In some cases, they may include a website link with your comments.

MEASURING RESULTS

Step 19. Make sure you have the right tools.

In order to measure your SEO results, you need to install the correct analytics tools.

Google Analytics is important because it lets you see how much organic traffic you’re getting and gives you insight into how users are using your website. You can leverage this data to improve your user experience.

Here’s a video that walks you through how to set up Google Analytics for your website.


The second tool you’ll need is Google Search Console or LinkGraph’s Google Search Console Tool.

These tools let you analyze ranking data and give you a look at your position for different keywords as well as how many impressions and clicks you’re getting from search.

Here’s a video that walks you through how to set it up for your website.

Step 20. Understand how to measure SEO performance

Once you have your tools installed, it’s important to start measuring your SEO performance over time.

Specifically, you want to look at the following key performance indicators (KPIs):

  • Rankings – How many keywords and keyword phrases are you ranking for? How have your rankings changed for those keywords over the last few months?
  • Traffic – How many visitors are you getting from organic search?
  • Conversions – How many new leads are you getting from organic search traffic?

Here’s how to look at each of these.

Use Google Search Console to monitor keyword rankings

The best way to look at your keyword rankings is with Google Search Console.

If you log in to Google Search Console, click “Search Results” on the left. This will show you a report of all the keywords you rank for and your position over time for those keywords.

Don’t look at rankings over weeks – look over a period of months. Legal SEO work takes a while to kick in.

Use Google Analytics to monitor traffic and conversions

Traffic can be measured in Google Analytics.

If you open your Google Analytics account and go to Audience -> Overview, then scroll down and select the “Organic Search” option, you’ll be able to see all of the traffic that comes from search engines.

Again, make sure you measure this over a period of months – not days. The nature of good SEO is that it takes time for search engines to react to your efforts, especially in a competitive landscape or given keyword phrase like car accident lawyer, divorce lawyer, or dui lawyer.

As well as simply looking at the traffic, you’ll also want to look into a variety of factors:

  • What pages visitors are landing on.
  • Visitor demographics.
  • What they’re doing on your site.
  • How frequently they’re revisiting.
  • Site speed.
  • Any differences in behavior between mobile and desktop users.

CONCLUSION

There you have it – a 20 step action plan to dominate the search results! Whether you’re a car accident lawyer, divorce attorney, criminal defense lawyer, family law attorney, or another type of lawyer, lawyer SEO is the fastest way to break into competitive markets by securing top rankings for your law practice website.

Hopefully this gave you valuable insight into the inner workings of search engines and how they prioritize organic results for sites that give users what they’re looking for.

A solid SEO strategy is something every digital marketing campaign should include. These strategies have worked for hundreds of other websites, so they can work for yours too! If you need help, working with an SEO company and SEO experts can help you build domain authority and site visibility faster through a comprehensive digital marketing strategy. Reach out to one of our law firm SEO experts to learn more.


The post Law Firm SEO – A 20 Step Action Plan for Attorneys appeared first on LinkGraph.

Google Algorithm Update History

Published Fri, 21 Oct 2022. Learn how Google’s algorithm has developed over time, what drove changes, and what it means for search and your own web content.
Intro

The Google algorithm is constantly changing. In 2018 alone, Google ran 15,096 live traffic experiments, and launched 3,234 updates to its search algorithm.

Three variations of Google search result layouts being tested with users.


Not all updates have significant impact on the search results. This page covers the top 150 updates to how search results function from 2000-2019. Updates are a blend of changes to:
  • Algorithms
  • Indexation
  • Data (aka Data Refreshes)
  • Google Search UIs
  • Webmaster Tools
  • Changes to ranking factors and signals

Before we get into the timeline of individual Google updates, it’s helpful to define a handful of things upfront for any SEO newbies out there:

Google’s Core Algorithm

SEO experts, writers, and audiences will often refer to “Google’s Core Algorithm” as though it is a single item. In reality, Google’s Core Algorithm is made up of millions of smaller algorithms that all work together to surface the best possible search results to users. What we mean when we say “Google’s Core Algorithm” is the set of algorithms that are applied to every single search, which are no longer considered experimental, and which are stable enough to run consistently without requiring significant changes.

Google Panda (2011-2016)

The Panda algorithm focused on removing low quality content from search by reviewing on-page content itself. This algorithm focused on thin content, content dominated by ads, poor quality content (spelling/grammar mistakes), and rewarded unique content. Google Panda was updated 29 times before finally being incorporated into the core algorithm in January of 2016.

Google Penguin (2012-2016)

The Penguin algorithm focused on removing sites engaging in spammy tactics from the search results. Penguin primarily filtered sites engaging in keyword stuffing and link schemes out of the search results. Google Penguin was updated 10 times before being integrated into Google’s core algorithm in September of 2016.

RankBrain (2015-Present)

This machine-learning based AI helps Google process and understand the meaning behind new search queries. RankBrain works by being able to infer the meaning of new words or terms based on context and related terms. RankBrain began rolling out across all of Google search in early 2015 and was fully live and global by mid-2016. Within three months of full deployment RankBrain was already the 3rd most important signal contributing to the results selected for a search query.

Matt Cutts

One of the first 100 employees at Google, Matt Cutts was the head of Google’s Web Spam team for many years and interacted heavily with the webmaster community. He spent a lot of time answering questions about algorithm changes and providing webmasters with high-level advice and direction.

Danny Sullivan

Originally a Founding Editor, Advisor, and Writer for Search Engine Land (among others), Danny Sullivan now communicates with the SEO community as Google’s Public Search Liaison. Mr. Sullivan frequently finds himself reminding the community that the best way to rank is to create quality content that provides value to users.

Gary Illyes

Google Webmaster Trends Analyst who often responds to the SEO community when they have questions about Google algorithm updates and changes. Gary is known for his candid (and entertaining) responses, which usually have a heavy element of sarcasm.

Webmaster World

Frequently referenced whenever people discuss Google algorithm updates, webmasterworld.com is one of the most popular forums for webmasters to discuss changes to Google’s search results. A popular community since the early 2000s, it’s still where webmasters flock to discuss theories whenever major fluctuations are noticed.


2021 Google Search Updates

2021 December – Local Search Update

From November 30th – December 8th, Google runs a local search ranking update. This update rebalances the various factors used to generate local results. Primary ranking factors for local search remain the same: Relevance, Distance, and Prominence. 

Additional Reading:

2021 November – Core Quality Update

From November 17th – November 30th, Google rolls out another core update. As with all core updates, this one is focused on improving the quality and relevance of search results. 

Additional Reading:

2021 August – Title Tag Update

Starting August 16th, Google starts rewriting page titles in the SERPs. After many SEOs saw negative results from the update, Google rolls back some of the changes in September. Google emphasizes that it still uses the content of the <title> tag over 80% of the time.

Additional Reading:

2021 July – Link Spam Update

Google updates its link spam fighting algorithm to improve the effectiveness of identifying and nullifying link spam. The update is particularly focused on affiliate sites and websites that monetize through links.

Additional Reading:

2021 June – Page Experience Update

Google announced in late 2020 that its upcoming 2021 Page Experience update would introduce core web vitals as new Google ranking factors. Core web vitals are a set of user experience criteria that include page load times, mobile responsiveness, visual responsiveness, and more. Google evaluates these metrics through the following criteria:

  1. Largest Contentful Paint (LCP) – The time it takes a web page to render the largest piece of content on the page
  2. First Input Delay (FID) – The delay between a user’s first interaction with the page and the browser’s response to that interaction
  3. Cumulative Layout Shift (CLS) – A measure of visual stability, quantifying unexpected layout shifts as the page loads and the user scrolls

With this update, Google evaluates page experience signals like mobile friendliness, safe browsing, HTTPS security, and intrusive interstitial guidelines when ranking web pages.
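As a rough illustration of how these metrics are bucketed, the good/needs improvement/poor thresholds Google documents for Core Web Vitals can be sketched as a small classifier. The code is our illustration, not part of Google's algorithm:

```python
# Illustrative sketch of Google's published Core Web Vitals thresholds
# (per Google's web.dev documentation: LCP in seconds, FID in
# milliseconds, CLS unitless).
THRESHOLDS = {
    "LCP": (2.5, 4.0),   # good <= 2.5s, poor > 4.0s
    "FID": (100, 300),   # good <= 100ms, poor > 300ms
    "CLS": (0.1, 0.25),  # good <= 0.1, poor > 0.25
}

def rate_vital(metric: str, value: float) -> str:
    """Classify a single Core Web Vitals measurement."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"

print(rate_vital("LCP", 2.1))  # a 2.1-second LCP rates as "good"
print(rate_vital("CLS", 0.3))  # a 0.3 CLS rates as "poor"
```

In practice these values are measured in the browser (e.g. via the PerformanceObserver API or Google's reporting tools) rather than supplied by hand.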

Additional Reading:

2021 February – Passage Ranking

Google introduces Passage Ranking and starts indexing passages of web content. Google now hones in on specific passages of long-form content and ranks those passages in the SERPs. Google highlights the relevant passage and takes users directly to it after they click on the blue link result. 

Additional Reading:

2020 Google Search Updates

2020 October – Indexing Bugs

From early September to the beginning of October, Google experienced multiple bugs with mobile indexing, canonicalization, news indexing, the top stories carousel, and sports scores breaking. The bugs impacted about 0.02% of searches. Google fully resolved all impacted URLs by October 9th.

Additional Reading:

2020 August 11 – Google Glitch

On Tuesday, August 11th, Google experienced a massive, worldwide indexing glitch that impacted search results. Search results were very low-quality or irrelevant to search queries, and ecommerce sites in particular reported significant impacts on rankings. Google resolved the glitch within a few days.

Additional Reading:

2020 June – Google Bug Fix

A Google representative confirmed an indexing bug temporarily impacted rankings. Google was struggling to surface fresh content.

Additional Reading:

2020 May – Core Quality Update

This May 2020 core update was one of the more significant broad core updates, with the introduction of core web vitals and increased emphasis on E-A-T. This update was a continuation of an effort to improve the quality of SERP results for COVID-related searches. The update most significantly impacted sites with low-quality or unnatural links. However, some sites with lower domain authority did appear to see positive ranking improvements for pages with high-quality, relevant content. 

Many SEOs reacted negatively, particularly because of the timing of the update, which occurred at the height of economic shutdowns to slow the spread of coronavirus. Concerns ranged from social media domination of the SERPs to better results for larger, more dominant brands like Amazon and Etsy. Some analysis noted these changes may have reflected user intent during quarantine, particularly because the update focused on providing better results for queries with multiple search intents. Google responded to the complaints by reinforcing existing content-quality signals. 

Additional Reading:

2020 March – COVID-19 Pandemic

Although not an official update, the coronavirus outbreak led to an unprecedented level of search queries that temporarily changed the landscape of search results. Google made several changes to adjust to the trending searches such as:

  • Increased user personalization to combat misinformation
  • Removed COVID-19 misinformation across YouTube and other platforms
  • Added “Sticky Menu” for COVID related searches
  • Added temporary business closures to the Map Pack
  • Temporarily banned ads for respirators and medical masks
  • Created COVID-19 Community Mobility Reports
  • Temporarily limited certain Google My Business listing features

Additional Reading:

2020 February 7 – Unannounced Update

In February of 2020, many SEOs reported seeing significant changes to rankings, although Google had not announced an update and denied that any broad core update had occurred. Various analyses of the update showed no clear pattern among the websites that were impacted. 

Additional Reading:

2020 January 22 – Featured Snippet De-duplication

Prior to this January 2020 update, those sites that earned the featured snippet, or “position zero,” also appeared as the subsequent organic search result. This update de-duplicated search results to eliminate this double exposure. This impacted 100% of searches worldwide and had significant impacts on rank tracking and organic CTR.
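The de-duplication logic can be modeled simply: once a URL is promoted to the featured snippet, its ordinary blue-link listing is dropped from the page. A simplified sketch, with invented example URLs:

```python
def dedupe_serp(featured_snippet_url: str, organic_results: list[str]) -> list[str]:
    """Simplified model of featured-snippet de-duplication: the URL that
    wins "position zero" no longer repeats as an organic blue link."""
    return [url for url in organic_results if url != featured_snippet_url]

serp = [
    "https://a.example/answer",   # also holds the featured snippet
    "https://b.example/guide",
    "https://c.example/post",
]
print(dedupe_serp("https://a.example/answer", serp))
# The featured URL is removed from the organic listings below the snippet
```

This is why rank trackers had to be updated: the snippet winner effectively lost its duplicate organic position.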

Additional Reading:

2020 January – Broad Core Update

On January 13th, 2020, Google started rolling out another broad core update. Google did not provide details about the update, but did emphasize existing webmaster guidelines about content quality.

Additional Reading:

2019 Google Search Updates

2019 November Local Search Update

In November of 2019, Google rolled out an update to how local search results are formulated (ex: map pack results). This update improved Google’s understanding of the context of a search by improving its handling of synonyms. In essence, local businesses may find they show up in more searches.

 

2019 October 26 BERT

In October, Google introduced BERT, a deep-learning algorithm focused on helping Google understand the intent behind search queries. BERT (Bidirectional Encoder Representations from Transformers) gives context to each word within a search query. The “bidirectional” in BERT refers to how the algorithm looks at the words that come before and after each term before assessing the meaning of the term itself.

Here’s an example of bi-directional context from Google’s Blog:

In the sentence “I accessed the bank account,” a unidirectional contextual model would represent “bank” based on “I accessed the” but not “account.” However, BERT represents “bank” using both its previous and next context — “I accessed the… account” — starting from the very bottom of a deep neural network, making it deeply bidirectional.

The introduction of BERT marked the most significant change to Google search in half a decade, impacting 1 in 10 searches — 10% of all search queries.

Additional Reading:

2019 September – Entity Ratings & Rich Results

If you place reviews on your own site (even through a third party widget), and use schema markup on those reviews – the review stars will no longer show up in the Google results. Google applied this change to entities considered to be Local Businesses or Organizations.

The reasoning? Google considers these types of reviews to be self-serving. The logic is that if a site is placing a third party review widget on their own domain, they probably have some control over the reviews or review process.

Our recommendation? If you’re a local business or organization, claim your Google My Business listing and focus on encouraging users to leave reviews with Google directly.
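For context, the kind of self-serving markup this change affected looks roughly like the JSON-LD below – a schema.org LocalBusiness carrying an aggregateRating on the business’s own site (the business name and figures here are invented for illustration):

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.8",
    "reviewCount": "213"
  }
}
```

After this update, markup like this on your own domain no longer earns review stars in the SERPs.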

Additional Reading:

2019 September – Broad Core Update

This update included two components. First, it hit sites exploiting a 301 redirect trick with expired sites: users would either buy expired domains with good SEO metrics and redirect the entire domain to their site, or pay a third party to redirect a portion of pages from an expired site to their domain. Note: sites with relevant 301 redirects from expired sites were still fine.

Second, video content appears to have gotten a boost from this update. June’s update brought an increase in video carousels in the SERPs. Now in September, we’re seeing video content bumping down organic pages that previously ranked above them.

 

We can see this at an even greater scale by looking at a purely text site and a purely video site – Wikipedia and YouTube. For the first time, YouTube eclipsed Wikipedia in the Google search results.

 

Additional Reading:

2019 June – Broad Core Update

This is the first time that Google has pre-announced an update. Danny Sullivan, Google’s Search Liaison, stated that they chose to pre-announce the changes so webmasters would not be left “scratching their heads” about what was happening this time.

What happened?

  • We saw an increase in video carousels in the SERPs
  • Low quality news sites saw losses

What can sites do to respond to this broad core update? It looks like Google is leaning into video content, at least in the short-term. Consider including video as one of the types of content your team creates.

Additional Reading:

2019 May 22-26 – Indexing Bugs

On Wednesday, May 22nd, Google tweeted that there were indexation bugs causing stale results to be served for certain queries; this bug was resolved early on Thursday, May 23rd.

By the evening of Thursday, May 23rd, Google was tweeting again, stating that they were working on a new indexing bug that was preventing the capture of new pages. On May 26th, Google followed up that this indexation bug had also been fixed.

Additional Reading:

2019 April 4-11 De-Indexing Bugs

In April of 2019, an indexing bug caused about 4% of stable URLs to fall off of the first page. What happened? A technical error caused a massive set of webpages to be de-indexed.

Additional Reading:

2019 March 12 – Broad Core Update

Google was deliberately vague about this update, and just kept redirecting people and questions to the Google quality guidelines. However, the webmaster community noticed that the update seemed to have a heavier impact on YMYL (your money or your life) pages.

YMYL sites with low quality content took a nose-dive, and sites with heavy trust signals (well known brands, known authorities on multiple topics, etc) climbed the rankings.

Let’s take two examples:

First, Everydayhealth.com lost 50% of its SEO visibility from this update. Sample headline: Can Himalayan Salt Lamps Really Help People with Asthma?

Next, Medicinenet.com saw a 12% increase in their SEO visibility from this update. Sample headline: 4 Deaths, 141 Legionnaires’ Infections Linked to Hot Tubs.

This update also seemed to factor in user behavior more strongly. Domains where users spent longer on the site, had more pages per visit, and had lower bounce rates saw an uptick in their rankings.

Additional Reading:

2019 March 1 – Extended Results Page

For one day, on March 1st, Google displayed 19 results on the first page of SERPs for all queries, 20 if you count the featured snippet. Many hypothesize it was a glitch related to in-depth articles, a results type from 2013 that has long since been integrated into regular organic search results.

Additional Reading:

2018 Google Algorithm Updates

2018 August – Broad Core Update (Medic)

This broad core update, known by its nickname “Medic” impacted YMYL (your money or your life) sites across the web.

SEOs had many theories about what to do to improve rankings after this update, but both Google and the larger SEO community landed on the same message: make the content users are looking for, and make it helpful.

This update sparked a lot of discussion around E-A-T (Expertise, Authoritativeness, Trustworthiness) for page quality, and the importance of clear authorship and bylines on content.

Additional Reading:

2018 July – Chrome Security Warning

Google begins marking all http sites as “not secure” and displaying warnings to users.

 

Google views security as one of their core principles, so this change makes sense as the next step to build on their October 2017 update that began warning users about unsecured forms.

 

Looking forward, Google is planning on blocking mixed content from https sites.

What can you do? Purchase an SSL certificate and make the move from http to https as soon as possible. Double check that all of your subdomains, images, PDFs and other assets associated with your site are also being served securely.

Additional Reading:

2018 July – Mobile Speed Update

Google rolled out the mobile page speed update, making page speed a ranking factor for mobile results.

Additional Reading:

2018 June – Video Carousels

Google introduces a dedicated video carousel on the first page of results for some queries, and moves videos out of regular results. This change also led to a significant increase in the number of search results displaying videos (+60%).

Additional Reading:

2018 April – Broad Core Update

The official line from Google about this broad core update is that it rewards quality content that was previously under-rewarded. Sites with content that was clearly better than that of their organic competitors saw a boost; sites with thin or duplicative content fell.

2018 March – Broad Core Update

March’s update focused on content relevance (how well does content match the intent of the searcher) rather than content quality.

What can you do? Take a look at the pages Google is listing in the top 10-20 spots for your target search term and see if you can spot any similarities that hint at how Google views the intent of the search.

Additional Reading:

2018 March – Mobile-First Index Starts to Roll Out

After months of testing Google begins rolling out mobile-first indexing. Under this approach, Google crawls and indexes the mobile version of website pages when adding them to their index. If content is missing from mobile versions of your webpages, that content may not be indexed by Google.

To quote Google themselves,

“Mobile-first indexing means that we’ll use the mobile version of the page for indexing and ranking, to better help our – primarily mobile – users find what they’re looking for.”

Essentially the entire index is going mobile-first. The process of migrating over to indexing the mobile version of websites is still underway. Websites are notified in Search Console when they’ve been migrated under Google’s mobile-first index.

 

Additional Reading:

 

2017 Google Search Updates

2017 December – Maccabees

Google states that a series of minor improvements are rolled out across December. Webmasters and SEO professionals see large fluctuations in the SERPs.

 

Danny Sullivan’s Maccabees tweets note that there are always multiple daily updates, not one single update.
Barry Schwartz gave this set of updates the Maccabees nickname as he noted the most fluctuation around December 12 (occurring during Hanukkah). However, updates occurred from the very beginning until the very end of December.

 

What were the Maccabees changes?

Webmasters noted that doorway pages took a hit. Doorway pages act as landing pages for users, but don’t contain the real content – users have to get past these initial landing pages to access content of any value. Google considers these pages barriers to a user.

A writer at Moz dissected a slew of site data from mid-December and noted one key observation: when two pages ranked for the same term, the one with better user engagement saw its rankings improve after this update, while the other page saw its rankings drop. In many instances, sites that began to lose traffic found blog pages being shown/ranked where product or service pages should have been displayed.

A number of official celebrity sites fall in the rankings including (notably) Channing Tatum, Charlie Sheen, Kristen Stewart, Tom Cruise, and even Barack Obama. This speaks to how Google might have rebalanced factors around authoritativeness vs. content quality. One SEO expert noted that thin celebrity sites fell while more robust celebrity sites (like Katy Perry’s) maintained their #1 position.

Multiple webmasters reported a slew of manual actions on December 25th and 26th, and some webmasters also reported seeing jumps on the 26th for sites that had been working on quality.

Additional Reading:

2017 November – Snippet Length Increased

Google increases the length of meta descriptions in the SERPs to up to 300 characters. This update was not long-lived, as Google rolled back to the original 150-160 character meta descriptions on May 13, 2018.
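As a rough utility for staying inside the restored limit, a sketch like the following trims a draft description to a character budget. Note this is an approximation of our own – Google actually truncates snippets by pixel width, not an exact character count:

```python
import textwrap

def trim_meta_description(text: str, limit: int = 160) -> str:
    """Trim a draft meta description to a ~150-160 character budget
    (the limit is illustrative; Google truncates by pixel width)."""
    return textwrap.shorten(text, width=limit, placeholder="…")

long_desc = "An exhaustive, year-by-year history of Google algorithm updates, " * 4
short_desc = trim_meta_description(long_desc)
print(len(short_desc) <= 160)  # True: trimmed at a word boundary
```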

2017 May – Quality Update

Webmasters noted that this update targeted sites and pages with:

  • Deceptive advertising
  • UX challenges
  • Thin or low quality content

Additional Reading:

2017 March – Fred

In early March webmasters and SEOs began to notice significant fluctuations in the SERPs, and Barry Schwartz from SEJ began tweeting Google to confirm algorithm changes.

The changes seemed to target content sites engaging in aggressive monetization at the expense of users – basically, sites filling the internet with low-value content meant to benefit everyone except the user. This included PBN sites and sites created with the sole intent of generating AdSense income.

Fred got its name from Gary Illyes, who, when asked by an SEO expert whether he wanted to name the update, suggested that we start calling all updates without names “Fred.”

 

The joke, for anyone who knows the webmaster trends analyst, is that he calls everything unnamed fred (fish, people, EVERYTHING).

 

The SEO community took this as confirmation of recent algorithm changes (note: literally every day has algorithm updates), validating their digging into the SERP changes.

Additional Reading:

2017 January 10 – Pop Up Penalty

Google announces that intrusive pop ups and interstitials are going to be factored into their search algorithm moving forward.

“To improve the mobile search experience, after January 10, 2017, pages where content is not easily accessible to a user on the transition from the mobile search results may not rank as highly.”

This change caused rankings to drop for sites that forced users to get past an ad or pop up to access relevant content. Not all pop ups or interstitials were penalized; for instance, the following pop ups were still okay:

  • Pop ups that helped sites stay legally compliant (ex: accepting cookies, or verifying a user’s age).
  • Pop ups that did not block content on load.

Additional Reading:

2016 Google Search Updates

2016 September – Penguin 4.0

The Google announcement of Penguin 4.0 had two major components:

  • Penguin had been merged into the core algorithm, and would now have real-time updates.
  • Penguin would be more page-specific moving forward rather than impacting entire domains.

SEOs also noted one additional change. Penguin 4.0 seemed to just remove the impact of spam links on SERPs, rather than penalizing sites with spammy links. This appeared to be an attempt for Google to mitigate the impact of negative SEO attacks on sites.

That being said, today in 2019 we still see a positive impact from running disavows for clients who have seen spammy links creep into their backlink profiles.
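If you do run a disavow, Google’s documented disavow file format is a plain text file uploaded through Search Console: one full URL or `domain:` entry per line, with `#` marking comments. A minimal sketch (domains invented for illustration):

```text
# Disavow file uploaded via Google Search Console.
# Lines beginning with "#" are comments.

# Disavow every link from an entire domain:
domain:spammy-links.example.com

# Disavow a single page:
https://low-quality.example.net/paid-links.html
```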

Additional Reading:

2016 September – Possum Update

This update targeted duplicate and spammy results in local search (Local Pack and Google Maps). The goal being to provide more diverse results when they’re searching for a local business, product, or service.

Prior to the Possum update Google was filtering out duplicates in local results by looking for listings with matching domains or matching phone numbers. After the Possum update Google began filtering out duplicates based on their physical address.
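A simplified model of the new behavior – deduplicating listings on a normalized street address rather than on domain or phone number. Field names and data here are hypothetical:

```python
def filter_local_listings(listings: list[dict]) -> list[dict]:
    """Sketch of Possum-style filtering: keep only the first listing
    seen at each (normalized) physical address."""
    seen_addresses = set()
    kept = []
    for listing in listings:
        address = listing["address"].lower().strip()
        if address not in seen_addresses:
            seen_addresses.add(address)
            kept.append(listing)
    return kept

offices = [
    {"name": "Smith Law", "address": "100 Main St"},
    {"name": "Smith Injury Law", "address": "100 Main St"},  # same office, filtered out
    {"name": "Jones Law", "address": "200 Oak Ave"},
]
print([o["name"] for o in filter_local_listings(offices)])
```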

Businesses who saw some of their listings removed from the local pack may have initially thought their sites were dead (killed by this update), but they weren’t – they were just being filtered (playing possum). The term was coined by Phil Rozek.

SEOs also noted that businesses right outside of city limits also saw a major uptick in local rankings, as they got included in local searches for those cities.

Additional Reading:

2016 May – Mobile Friendly Boost

Google boosts the effect of the mobile-friendly ranking signal in search.
Google took time to stress that sites which are not mobile friendly but which still provide high quality content will still rank.

Additional Reading:

2016 February 19 – Adwords Change

Google removes sidebar ads and adds a fourth ad to the top block above the organic search results.
This move reflects the search engine giant continuing to prioritize mobile-first experiences, where sidebar ads are cumbersome compared to results in the main content block.

2016 January – Broad Core Update + Panda Is Now Core

Google Confirms core algorithm update in January, right after confirming that Panda is now part of Google’s core algorithm.

Not a lot of conclusions were able to be drawn about the update, but SEOs noticed significant fluctuations with news sites/news publishers. Longform content with multi-media got a boost, and older articles took a bit of a dive for branded terms. This shift could reflect Google tweaking current-event related results to show more recent content, but the data was not definitive.

Additional Reading:

2015 Google Search Updates

2015 December – SSL/HTTPS by Default

Google starts indexing the https version of pages by default.

Pages using SSL are also seeing a slight boost. Google holds security as a core component of surfacing search results to users, and this shift becomes one of many security-related search algo changes. In fact, by the end of 2017 over 75% of the page one organic search results were https.

2015 October 26 – RankBrain

In testing since April 2015, Google officially introduced RankBrain on this date. RankBrain is a machine learning algorithm that filters search results to help give users the best answer to their query. Initially, RankBrain was used for about 15 percent of queries (mainly new queries Google had never seen before), but now it is involved in almost every query entered into Google. RankBrain has been called the third most important ranking signal.

Additional Reading:

2015 October 5 – Hacked Sites Algorithm

Google introduces an algorithm specifically targeting spammy sites in the search results that were gaining search equity from hacked sites.

This change was significant: it impacted 5% of search queries. The algorithm hides sites benefiting from hacked sites in the search results.

 

Interactions with Gary Illyes at #pubcon and on Twitter suggest that this algorithm only applies to search queries traditionally known to be spammy.

 

 

The update came right after a September message from Google about cracking down on repeat spam offenders. Google’s blog post notified SEOs that sites which repeatedly received manual actions would find it harder and harder to have those manual actions reconsidered.

Additional Reading:

2015 August 6 – Google Snack Pack

Google switches from displaying seven results for local search in the map pack to only three.

Why the change? Google is switching over (step-by-step) to mobile-first search results, aka prioritizing mobile users over desktop users.

On mobile, only three local results fit onto the screen before a user needs to scroll. Google seems to want users to scroll to access the organic results.

Other noticeable changes from this update:

  • Google only displays the street (not the complete address) unless you click into a result.
  • Users can now filter snack pack results by rating using a dropdown.

Additional Reading:

2015 July 18- Panda 4.2 (Update 29)

Roll out of Panda 4.2 began on the weekend of July 18th and affected 2-3% of search queries. This was a refresh, and the first one for Panda in about 9 months.

Why does that matter? The Panda algorithm acts like a filter on search results to sort out low quality content. Panda basically gets applied to a set of data – and decides what to filter out (or down). Until the data for a site is refreshed, Panda’s ruling is static. So when a data refresh is completed, sites that have made improvements essentially get a revised ruling on how they’re filtered.

Nine months is a long time to wait for a revised ruling!

2015 May – Quality Update / Phantom II

This change is an update to the quality filters integrated into Google’s core algorithm, and alters how the algorithm processes signals for content quality. This algorithm is real-time, meaning that webmasters will not need to wait for data refreshes to see positive impact from making content improvements.

What kind of pages did we see drop in the rankings?

  • Clickbait content
  • Pages with disruptive ads
  • Pages where videos auto-played
  • How-to sites with thin or duplicative content (this ended up impacting a lot of how-to sites)
  • Pages that were hard to navigate/had UI barriers

In hindsight, this update feels like a precursor to Google’s 2017 updates for content spam and intrusive pop ups.

Additional Reading:

2015 April 21 – Mobilegeddon

Google boosts mobile-friendly pages in mobile search results.

This update was termed Mobilegeddon as SEOs expected it to impact a huge number of search queries, maybe more than any other update ever had. Why? Google was already seeing more searches on mobile than on desktop in the U.S. in May 2015.

In 2018 Google takes this a step further and starts mobile-first indexing.

Additional Reading:

2014 Google Algorithm Updates

2014 December – Pigeon Goes International

Google’s local algorithm, known as Pigeon, expands to international English speaking countries (UK, Canada, Australia) on December 22, 2014.

In December Google also releases updated guidelines for local businesses representing themselves on Google.

Additional Reading:

2014 October – Pirate II

Google releases an “improved DMCA demotion signal in Search,” specifically designed to target and downrank some of the sites most notorious for piracy.

In October Google also released an updated report on how they fight piracy, which includes changes they made to how results for media searches were displayed in search. Most of these user interface changes were geared towards helping users find legal (trusted) ways to consume the media content they were seeking.

 


Additional Reading:

 

2014 October 17 – Penguin 3.0

This update impacted 1% of English search queries, and was the first update to Penguin’s algorithm in over a year. This update was both a refresh and a major algorithm update.

2014 September – Panda 4.1 (Update 28)

Panda 4.1 is the 28th update for the algorithm that targets poor quality content. This update impacted 3-5% of search queries.

To quote Google:

“Based on user (and webmaster!) feedback, we’ve been able to discover a few more signals to help Panda identify low-quality content more precisely. This results in a greater diversity of high-quality small- and medium-sized sites ranking higher, which is nice.”

Major losers were sites with deceptive ads, affiliate sites (thin on content, meant to pass traffic to other monetizing affiliates), and sites with security issues.

2014 September – Known PBNs De-Indexed

This change impacted search, but was not an algorithm change, data refresh, or UI update.

Starting mid-to-late September, 2014 Google de-indexed a massive amount of sites being used to boost other sites and game Google’s search rankings.

Google then followed-up on the de-indexing with manual actions for sites benefiting from the PBN. These manual actions went out on September 18, 2014.

Additional Reading:

2014 August – Authorship Removed from Search Results

Authors are no longer displayed (name or photo) in the search results along with the pieces that they’ve written.

Almost a year later Gary Illyes suggested that sites with authorship markup should leave the markup in place because it might be used again in the future. However, at a later date it was suggested that Google is perfectly capable of recognizing authorship from bylines.

Additional Reading:

2014 August – SSL becomes a ranking factor

Sites using SSL began to see a slight boost in rankings.

Google would later go on to increase this boost, and eventually provide warning to users when they were trying to access unsecure pages.

Additional Reading:

2014 July 24 – Google Local Update (Pigeon)

Google’s local search algorithm is updated to include more signals from traditional search (knowledge graph, spelling correction, synonyms, etc).

Additional Reading

2014 June – Authorship Photos Removed

Photos of Authors are gone from SERPs.

This was the first step towards Google decommissioning Authorship markup.

2014 June – Payday Loan Update 3.0

Where Payday Loans 2.0 targeted spammy sites, Payday Loans 3.0 targeted spammy queries, or more specifically the types of illegal link schemes seen disproportionately within high-spam industries (payday loans, porn, gambling, etc).

What do you mean illegal? We mean link schemes that function off of hacking other websites or infecting them with malware.

This update also included better protection against negative SEO attacks.

Additional Reading:

2014 May 17-18 – Payday Loan Update 2.0

Payday Loan Update 2.0 was a comprehensive update to the algorithm (not just a data refresh). This update focused on the devaluation of domains using spammy on-site tactics such as cloaking.

Cloaking is when the content/page that google can see for a page is different than the content/page that a human user sees when they click on that page from the SERPs.
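Conceptually, cloaking can be caught by comparing what a crawler is served against what a regular user is served for the same URL. A toy sketch using a plain text-similarity ratio – our illustration, not Google's actual detection method:

```python
import difflib

def similarity(googlebot_html: str, user_html: str) -> float:
    """Compare the page served to a crawler with the page served to a
    regular user; a low ratio suggests the two audiences see different
    content (a possible cloaking signal)."""
    return difflib.SequenceMatcher(None, googlebot_html, user_html).ratio()

crawler_view = "<h1>Best running shoes: reviews and buying guide</h1>"
user_view = "<h1>Win big at our online casino!</h1>"

print(similarity(crawler_view, crawler_view))  # identical pages score 1.0
print(similarity(crawler_view, user_view) < 1.0)  # cloaked page scores lower: True
```

Real crawlers would fetch both versions (e.g. with different User-Agent headers) and use far more robust comparisons, but the principle is the same.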

2014 May – Panda 4.0 (Update 27)

Google had stopped announcing changes to Panda for a while, so when they announced Panda 4.0, we knew it was going to be a larger change to the overall algorithm.

Panda 4.0 impacted 7.5% of English queries, and led to a drastic nose dive for a slew of prominent sites like eBay, Ask.com, and Biography.com.

 

Sites that curated information from other sources without posting info or analysis of their own (aka coupon sites, celebrity gossip sites) seemed to take a big hit from this update.

 

 

2014 February 6 – Page Layout 3.0 (Top Heavy 3.0)

This is a refresh of Google’s algorithm that devalues pages with too many above-the-fold ads, per Google’s blog:

We’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away.

So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience.

The Page Layout algorithm was originally launched on January 19, 2012, and has only had one other update in October of the same year (2012).

 

Tweet from Matt Cutts Announcing Panda 4.0

2013 Google Algorithm Updates

2013 December – Authorship Devalued

Authorship gets less of a boost in the search results. This is the first step Google took in beginning to phase out authorship markup.

2013 October – Penguin 2.1

Technically the 5th update to Google’s link-spam fighting algorithm, this minor update affects about 1% of search queries.

 

2013 August – Hummingbird

Hummingbird was a full replacement of the core search algorithm, and Google’s largest update since Caffeine (Panda and Penguin had only been changes to portions of the old algorithm).

Hummingbird helped most with conversational search for results outside of the knowledge graph – where conversational search was already running. Hummingbird was a significant improvement to how Google interpreted the way text and queries are typed into search.

This algorithm was named Hummingbird by Google because it’s “precise and fast.”

 

Additional Reading:

 

2013 July – Expansion of Knowledge Graph

Knowledge Graph Expands to nearly 25% of all searches, displaying information-rich cards right above or next to the organic search results.

 

 

Additional Reading:

2013 July – Panda Dance (Update 26)

Panda begins going through monthly refreshes, also known as the “Panda Dance,” which caused monthly shifts in search rankings.

The next time Google would acknowledge a formal Panda update outside of these refreshes would be almost a year later in May of 2014.

2013 June – Roll Out Anti-Spam Algorithm Changes

Google rolled out an anti-link-spam algorithm in June of 2013 targeting sites grossly violating webmaster guidelines with egregious unnatural link building.

Matt Cutts even acknowledged one target – ‘Citadel Insurance’ – which built 28,000 links from 1,000 low-ranking domains in a single day (June 14th) and managed to reach position #2 for “car insurance” with the tactic.

By the end of June sites were finding it much harder to exploit the system with similar tactics.

 

 

2013 June 11 – Payday Loans

This update impacted 0.3% of queries in the U.S., and as much as 4% of queries in Turkey.

This algorithm targets queries that have abnormally high incidents of SEO spam (payday loans, adult searches, drugs, pharmaceuticals) and applies extra filters to these types of queries specifically.

 

2013 May 22 – Penguin 2.0

Penguin 2.0 was an update to the Penguin algorithm itself (as opposed to just a data refresh). It impacted 2.3% of English queries.

What changed?

  • Advertorials will no longer flow PageRank
  • Niches that are traditionally spammy will see more impact
  • Improvements to how hacked sites are detected
  • Links from link spammers’ domains will transfer less value

One of the biggest shifts with Penguin 2.0 is it also analyzed linkspam for internal site pages, whereas Penguin 1.0 had looked at spammy links specifically pointing to domain home pages.

This marked the first time in 6 months that the Penguin algorithm had been updated, and the 4th update to Penguin that we’ve seen:

  • April 24, 2012 – Penguin 1.0 Launched
  • May 25, 2012 – Penguin 1.1 Data Refresh
  • October 5, 2012 – Another Penguin Data Refresh

 


2013 May – Domain Diversity

This update reduced the amount of times a user saw the same domain in the search results. According to Matt Cutts, once you’ve seen a cluster of +/- 4 results from the same domain, the subsequent search pages are going to be significantly less likely to show you results from that domain.

 


2013 May 8th – Phantom I

On May 8th, 2013 SEOs over at Webmaster World noticed intense fluctuation in the SERPs.

Lots of people dove into the data – some commenting that sites who had taken a dive were previously hit by Panda, but there were no conclusive takeaways. With no confirmation of major changes from Google, and nothing conclusive in the data – this anomaly came to be known as the “Phantom” update.

2013 March 14-15 – Panda Update 25

This is the 25th update for Panda, the algorithm that devalues low quality content in the SERPs. Matt Cutts confirmed that moving forward Panda would be part of regular algorithm updates, meaning it would become a rolling update instead of a pushed update process.

2013 January 22 – Panda Update 24

The 24th Panda update was announced on January 22, 2013 and impacted 1.2% of English search queries.

 

2012 Google Algorithm Updates

2012 December 21 – Panda Update 23

The 23rd Panda update hit on December 21, 2012 and impacted 1.3% of English search queries.

2012 December 4 – Knowledge Graph Expansion

On December 4, 2012 Google announced a foreign language expansion of the Knowledge Graph, their project to “map out real-world things as diverse as movies, bridges and planets.”

 


Variations of Knowledge Graph in Search Results for Different Languages (Russian, Japanese, etc)

2012 November – Panda Updates 21 & 22

In November 2012 Panda had two updates in the same month: one on November 5, 2012 (1.1% of English queries impacted in the US) and one on November 22, 2012 (0.8% of English queries impacted in the US).

2012 October 9 – Page Layout Update

On October 9, 2012 Google rolled out an update to their Page Layout filter (also known as “Top Heavy”) impacting 0.7% of English-language search queries. This update rolled the Page Layout algorithm out globally.

Sites that made fixes after Google’s initial Page Layout Filter hit back in January of 2012 saw their rankings recover in the SERPs.

2012 October – Penguin Update 1.2

This was just a data refresh affecting 0.3% of English queries in the US.

 

2012 September – Panda Updates 19 & 20

Panda update 19 hit on September 18, 2012 affecting 0.7% of English search queries, followed just over a week later by Panda update 20 which hit on September 27, 2012 affecting 2.4% of English search queries.

Panda update 20 was an actual algorithm refresh, accounting for the higher percentage of affected queries.

2012 September – Exact Match Domains

At the end of September Matt Cutts announced an upcoming change: low quality exact match domains were going to be taking a hit in the search results.

Up until this point, exact match domains had been weighted heavily enough in the algorithms to counterbalance low quality site content.


2012 August 19 – Panda Update 18

Panda version 3.9.1 rolled out on Monday, August 19th, 2012, affecting less than 1% of English search queries in the US.

This update was a data refresh.

 

2012 August – Fewer Results on Page 1

In August Google began displaying 7 results for about 18% of the queries, rather than the standard 10.

Upon further inspection it appeared that Google had reduced the number of organic results so they’d have more space to test a suite of potential top-of-search features including expanded sitelinks, images, and local results.

This change, in conjunction with the knowledge graph, paved the way for the top-of-search rich snippet results we see in search today.


2012 August 10 – Pirate/DMCA Penalty

Google announces they’ll be devaluing sites that repeatedly get accused of copyright infringement in the SERPs. As of this date the number of valid copyright removal notices is a ranking signal in Google’s search algorithm.


2012 July 24 – Panda Update 17

On July 24, 2012 Google announced Panda 3.9.0, a data refresh for the algorithm affecting less than 1% of search queries.

2012 July 27 – Webmaster Tool Link Warnings

Not technically an algorithm update, but it definitely affected the SEO landscape.

On July 27, 2012 Google posted an update clarifying topics surrounding a slew of unnatural link warnings that had recently been sent out to webmasters:

  • Unnatural link warnings and drops in rankings are directly connected
  • Google doesn’t penalize sites as much when they’re the victims of 3rd party bad actors


2012 June – Panda Updates 15 & 16

In June Google made two updates to its Panda algorithm fighting low quality content in the SERPs:

  • Panda 3.7 rolled out on June 8, 2012 affecting less than 1% of English search queries in the U.S.
  • Panda 3.8 rolled out on June 25, 2012 affecting less than 1% of queries worldwide.

Both updates were data refreshes.

2012 June – 39 Google Updates

On June 7, 2012 Google posted an update providing insight into search changes made over the course of May. Highlights included:

  • Link Spam Improvements:
    • Better hacked sites detection
    • Better detection of inorganic backlink signals
    • Adjustments to Penguin
  • Adjustments to how Google handles page titles
  • Improvements to autocomplete for searches
  • Improvements to the freshness algorithm
  • Improvements to rankings for news and recognition of major news events.


2012 May 25 – Penguin 1.1

A data refresh for the Penguin algorithm was released on May 25, 2012 affecting less than 0.1% of search queries.


2012 May 16 – Knowledge Graph

On May 16, 2012 Google introduced the knowledge graph, a huge step forward in helping users complete their goals faster.

First, the knowledge graph improved Google’s understanding of entities in Search (what words represented — people, places, or things).

Second, it surfaced relevant information about these entities directly on the search results page as summaries and answers. This meant that users in many instances, no longer needed to click into a search result to find the information they were seeking.


2012 May 4 – 52 April Updates

On May 4, 2012 Google posted an update providing insight into search changes made over the course of April. Highlights included:

  • 15% increase in the base index
  • Removed the freshness boost for low quality content
  • Increased domain diversity in the search results.
  • Changes to Sitelinks
    • Sub sitelinks
    • Better ranking of expanded sitelinks
    • Sitelinks data refresh
  • Adjustment to surface more authoritative results.


2012 April – Panda Updates 13 & 14

In April Google made two updates to its Panda algorithm fighting low quality content in the SERPs:

  • Panda 3.5 rolled out on April 19, 2012
  • Panda 3.6 rolled out on April 27, 2012 affecting 1% of queries.

Panda 3.5 seemed to target press portals and aggregators, as well as heavily-templated websites. This makes sense as these types of sites are likely to have a high number of pages with thin or duplicative content.


2012 April 24 – Penguin

The Penguin Algorithm was announced on April 24, 2012 and focused specifically on devaluing sites that engage in spammy SEO practices.

The two primary targets of Penguin 1.0? Keyword stuffing and link schemes.


2012 April – Parked Domain Bug

After a number of webmasters reported ranking shuffles, Google confirmed that a data error had caused some domains to be mistakenly treated as parked domains (and thereby devalued). This was not an intentional algorithm change.


2012 April 3 – 50 Updates

On April 3, 2012 Google posted an update providing insight into search changes made over the course of March. Highlights included:

  • Sitelinks Data Refresh
  • Better handling of queries with navigational and local intent
  • Improvements to detecting site quality
  • Improvements to how anchor text contributes to relevancy for sites and search queries
  • Improvements to how search handles synonyms


2012 March – Panda Update 12

On March 23, 2012 we saw the Panda 3.4 update, a data refresh affecting 1.6% of queries.

 

2012 February 27 – Panda Update 11

Panda Update 3.3 was a data refresh that was announced on February 27, 2012.

2012 February 27 – Series of Updates

On February 27, 2012 Google posted an update providing insight into search changes made over the course of February. Highlights included:

  • Travel related search improvements
  • international launch of shopping rich snippets
  • improved health searches
  • Google changed how it was evaluating links, dropping a method of link analysis that had been used for the past several years.


2012 February – Venice

The Venice update changed the face of local search forever: local sites now show up even without a geo modifier being used in the keyword itself.


2012 January – Page Layout Update

This update devalued pages in search that had too many ads “above-the-fold.” Google said that ads that prevented users from accessing content quickly provided a poor user experience.


2012 January 10 – Personalized Search

On January 10, 2012 Google announced Search, plus Your World. Google had already expanded search to include content personally relevant to individuals with Social Search, Your World was the next step.

This update pulled in information from Google+ such as photos, profiles, and more.


2012 January 5 – 30 Google Updates

On January 5, 2012 Google posted an update providing insight into search changes made over the course of December of 2011. Highlights included:

  • Landing page quality became a signal for image search, beyond the image itself
  • Soft 404 detection (when a page whose content isn’t actually available to the user still returns a success status code instead of a 404)
  • More rich snippets
  • Better infrastructure for autocomplete (ex: spelling corrections)
  • More accurate byline dates
  • Related queries improvements
  • Upcoming events at venues
  • Faster mobile browsing – skipped the redirect phase of sending users to a mobile site m.domain.com


2011 Google Algorithm Updates

2011 December 1 – 10 Google Updates

On December 1, 2011 Google posted an update providing insight into search changes made the two weeks prior. Highlights included:

  • Refinements to the inclusion of related queries so they’d be more relevant
  • Expansion of indexing to include more long tail keywords
  • New parked domain classifier (placeholder sites hosting ads)
  • More complete (fresher) blog results
  • Improvements for recognizing and rewarding whichever sites originally posted content
  • Top result selection code rewrite to avoid “host crowding” (too many results from a single domain in the search results).
  • New verbatim tool
  • New google bar


2011 November 18 – Panda Update 10

The Panda 3.1 update rolled out on November 18th, 2011 and affected less than 1% of searches.

 


2011 November – Automatic Translation & More

On November 14, 2011 Google posted an update providing insight into search changes made over the couple preceding weeks. Highlights included:

  • Cross language results + automatic translation
  • Better page titles in search results by de-duplicating boilerplate anchors (referring to google-generated page titles, when they ignore html title tags because they can provide a better one)
  • Extending application rich snippets
  • Refining official page detection, adjusted how they determine which pages are official
  • Improvements to date-restricted queries


2011 November 3 – Fresher Results

Google puts an emphasis on more recent results, especially on time-sensitive queries.

  • Ex: Recent events / hot topics
  • Ex: regularly occurring/recurring events
  • Frequently updated/outdated types of info (ex: best SLR camera)


2011 October – Query Encryption

On October 18, 2011 Google announced that they were going to be encrypting search data for users who are signed in.

The result? Webmasters could tell that users were coming from Google search, but could no longer see the queries being used. Instead, webmasters began to see “(not provided)” showing up in their analytics reports.

This change followed a January roll out of SSL encryption protocol to gmail users.


2011 October 19 – Panda Update 8 (“Flux”)

In October Matt Cutts announced there would be upcoming flux from the Panda 3.0 update affecting about 2% of search queries. Flux occurred throughout October as new signals were incorporated into the Panda algorithm and data was refreshed.


2011 September 28 – Panda Update 7

On September 28, 2011 Google released their 7th update to the Panda algorithm – Panda 2.5.

2011 September – Pagination Elements

Google added pagination elements – link attributes to help with pagination crawl/indexing issues.

  • rel="next"
  • rel="prev"

Note: these attributes are no longer used as an indexing signal.
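
As a sketch of how this worked (the URLs below are placeholders), the attributes went on `link` elements in the `<head>` of each page in a paginated series:

```html
<!-- Hypothetical <head> markup for page 2 of a 3-page article series -->
<link rel="prev" href="https://www.example.com/articles?page=1">
<link rel="next" href="https://www.example.com/articles?page=3">
```

Each page pointed at its neighbors, which helped Google understand the series as one piece of content while the signal was still in use.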

2011 August 16 – Expanded Site Links

On August 16, 2011 Google announced expanded display of sitelinks from a max of 8 links to a max of 12 links.

 



2011 August 12 – Panda Update 6

Google rolled out Panda 2.4 expanding Panda to more languages August 12, 2011, impacting 6-9% of queries worldwide.


2011 July 23 – Panda Update 5

Google rolled out Panda 2.3 in July of 2011, adding new signals to help differentiate between higher and lower quality sites.

2011 June 28 – Google+

On June 28, 2011 Google launched their own social network, Google+. The network was sort of a middle ground between LinkedIn and Facebook.

Over time, Google+ shares and +1s (likes) became a temporary personalized search ranking factor.

Ultimately though, Google+ ended up being decommissioned in 2019.


2011 June 16 – Panda Update 4

According to Matt Cutts Panda 2.2 improved scraper-site detection.

What’s a scraper? In this context, a scraper is software used to copy content from a website, often to be posted to another website for ranking purposes. This is considered a type of webspam (not to mention plagiarism).

This update rolled out around June 16, 2011.

2011 June 2 – Schema.org

On June 2, 2011 Google, Yahoo, and Microsoft announced a collaboration to create “a common vocabulary for structured data,” known as Schema.org.
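
As an illustrative sketch (not taken from the announcement itself; every value below is a placeholder), Schema.org vocabulary is commonly embedded today as a JSON-LD block in a page’s HTML:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Article Title",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2011-06-02"
}
</script>
```

At launch the vocabulary was typically expressed with microdata attributes on HTML elements; JSON-LD support came along later and is now the syntax Google recommends.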


2011 May 9 – Panda Update 3

Panda 2.1 rolled out in early May, and was relatively minor compared to previous Panda updates.

2011 April 11 – Panda Update 2

On April 11, 2011 Panda 2.0 rolled out globally to English users, impacting about 2% of search queries.

What was different in Panda 2.0?

  • Better assessment of site quality for long-tailed keywords
  • This update also began to incorporate data around sites that users manually block


2011 March 28 – Google +1 Button

Google introduces the +1 button, similar to Facebook’s “like” button or the Reddit upvote. The goal? Bring trusted content to the top of the search results.

Later in June Google posted a brief update that they made the button faster, and in August of 2011 it also became a share icon.

 


2011 February – Panda Update (AKA Farmer)

Panda was released to fight thin content and low-quality content in the SERPs. Panda was also designed to reward unique content that provides value to users.

 

Panda impacted a whopping 12% of search results, and virtually wiped out content farms, sites with low quality content, thin affiliate sites, sites with large ad-to-content ratios and over optimization.

 

As a result, sites with less intrusive ads started to do better in the search results, while sites with “thin” user-generated content went down, as did harder-to-read pages.

Per Google:

“As ‘pure webspam’ has decreased over time, attention has shifted instead to ‘content farms,’ which are sites with shallow or low-quality content.”


2011 January – Attribution Update

This update focused on stopping scraper sites from receiving benefit from stolen content. The algorithm worked to establish which site initially created and posted content, and boost that site in the SERPs over other sites which had stolen the content.


2011 January – Overstock.com & JCPenney Penalty

Overstock and J.C. Penney receive manual actions due to deceptive link building practices.

Overstock offered a 10% discount to universities, students, and parents — as long as they posted anchor-text rich content to their university website. A competitor noticed the trend and reported them to Google.

JC Penney had thousands of backlinks built to its site targeting exact match anchor text. After receiving a manual action they disavowed the spammy links and largely recovered.


2010 Google Algorithm Updates

2010 December – Social Signals Incorporated

Google confirms that they use social signals, including accounting for shares when looking at news stories, as well as author quality.

2010 December – Negative Reviews

In late November a story broke about how businesses were soaring in the search results, and seeing their businesses grow exponentially, by being as terrible to customers as possible.

Enraged customers were leaving negative reviews on every major site they could, linking back to these bad-actor businesses to warn others. But in search, all those backlinks were giving the bad actors more and more search equity, enabling them to show up as the first result for a wider and wider range of searches.

Google responded to the issue within weeks, making changes to ensure businesses could not abuse their users in that manner moving forward.

Per Google:

“Being bad is […] bad for business in Google’s search results.”

Additional Reading:
NYT – Bullies Rewarded in Search

2010 November – Instant Visual Previews

This temporary feature allowed users to see a visual preview of a website in the search results. It was quickly rolled back.

 

Additional Resources:
Google Blog – Beyond Instant Results, Instant Previews

 

2010 September – Google Instant

Building on Google Suggest, Google Instant starts displaying results before a user actually completes typing their query.

This feature lived for a long time (in tech-years anyways) but was sunset in 2017 as mobile search became dominant, and Google realized it might not be the optimal experience for on-the-go mobile users.

2010 August – Brand Update

Google made a change to allow some brands/domains to appear multiple times on page one depending on the search.

This feature ends up undergoing a number of updates over time as Google works to get the right balance of site diversity when encountering host-clusters (multiple results from the same domain in search).

2010 June – Caffeine Roll Out

On June 10, 2010 Google announced Caffeine.

Caffeine was an entirely new indexing system with a new search index. Where before there had been multiple indexes, each updated and refreshed at its own rate, Caffeine enabled continuous updating of small portions of the search index. Under Caffeine, newly indexed content was available within seconds of being crawled.

Per Google:

“Caffeine provides 50 percent fresher results for web searches than our last index, and it’s the largest collection of web content we’ve offered. Whether it’s a news story, a blog or a forum post, you can now find links to relevant content much sooner after it is published than was possible ever before.”


2010 May 3 – MayDay

The May Day update occurred between April 28th and May 3rd 2010. This update was a precursor to Panda and took a shot at combating content farms.

Google’s comment on the update? “If you’re impacted, assess your site for quality.”


2010 April – Google Places

In April of 2010 Local Business Center became Google Places. Along with this change came the introduction of service areas (as opposed to just a single address as a location).

Other highlights:

  • Simpler method for advertising
  • Google offered free professional photo shoots for businesses
  • Google announced another batch of favorite places

By April of 2010, 20% of searches were already location-based.


2009 Google Algorithm Updates

2009 December – Real Time Search

Google announces search features related to newly indexed content: Twitter Feeds, News Results, etc. This real time feed was nested under a “latest results” section of the first page of search results.

 


2009 August 10 – Caffeine Preview

On August 10 Google begins to preview Caffeine, requesting feedback from users.


2009 February – Vince

Essentially the Vince update boosted brands.

Vince focused on trust, authority and reputation as signals to provide higher quality results which could push big brands further to the top of the SERPs.

Additional Resources:
Watch – Is Google putting more weight on brands in rankings?
Read – SEO Book – Google Branding

2008 Google Search Updates

2008 August – Google Suggest

Google introduces “suggest” which displays suggested search terms as the user is typing their query.


2008 April – Dewey

The Dewey update rolled out in late March/early April. The update was called Dewey because Matt Cutts chose the (slightly unique) term as one that would allow comparison between results from different data centers.

2007 Google Algorithm Updates

2007 June – Buffy

The Buffy update caused fluctuations for single-word search results.

Why Buffy? Google Webmaster Central product manager and long-time head of operations Vanessa Fox, notoriously an avid Buffy fan, announced she was leaving Google.

Vanessa garnered intense respect from webmasters over her tenure, both for her product leadership and for her responsiveness to the community – the people using Google’s products daily. The webmaster community named this update after her interest as a sign of respect.


2007 May – Universal Search

Old school organic search results are integrated with video, local, image, news, blog, and book searches.


2006 Google Search Updates

2006 November – Supplemental Update

An update to how the filtering of pages stored in the supplemental index is handled. Google went on to scrap the supplemental index label in July 2007.


2005 Google Search Updates

2005 November – Big Daddy

This was an update to the Google search infrastructure and took 3 months to roll out: January, February, and March of 2006. This update also changed how Google handled canonicalization and redirects.


2005 October 16 – Jagger Rollout Begins

The Jagger Update rolled out as a series of October updates.

The update targeted low quality links, reciprocal links, paid links, and link farms. The update helped prepare the way for the Big Daddy infrastructure update in November.


2005 October – Google Local / Maps

In October of 2005, Google merged Local Business Center data with Maps data.

2005 September – Gilligan / False Alarm

A number of SEOs noted fluctuations in September which they originally named “Gilligan.” It turns out there were no algorithm updates, just a data refresh (index update).

Given the news, many SEOs renamed their posts “False Alarm.” However, moving forward many data refreshes are considered updates by the community. So we’ll let the “Gilligan” update stand.


2005 June – Personalized Search

Google relaunches personal search. This time it helps shape future results based on your past selections.


2005 June – XML sitemaps

Google launches the ability to submit XML sitemaps via Google Webmaster tools. This update bypassed old HTML sitemaps. It gave Webmasters some influence over indexation and crawling, allowing them to feed pages to the index with this feature.
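
A minimal sitemap following the sitemaps.org protocol looks like this (the URL and date below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2005-06-01</lastmod>
  </url>
</urlset>
```

Each `<url>` entry lists one page; webmasters submit the file in Google’s webmaster tooling (or reference it from robots.txt) to feed pages to the crawler.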


2005 May – Bourbon

The May 2005 update, nicknamed Bourbon, seemed to devalue sites/pages with duplicate content, and affected 3.5% of search queries.

2005 February – Allegra

The Allegra update rolled out between February 2, 2005 and February 8, 2005. It caused major fluctuations in the SERPs. While nothing has ever been confirmed, these are the most popular theories amongst SEOs for what changed:

  • LSI being used as a ranking signal
  • Duplicate content is devalued
  • Suspicious links are somehow accounted for


2005 January – NoFollow

In early January, 2005 Google introduced the “Nofollow” link attribute to combat spam, and control the outbound link quality. This change helped clean up spammy blog comments: comments mass posted to blogs across the internet with links meant to boost the rankings of the target site.
Future Changes:

  • On June 15, 2009 Google changed the way it views NoFollow links in response to webmasters manipulating pages with “PageRank sculpting”.
  • Google suggests webmasters use “nofollow” attributes for ads and paid links.
  • On September 10, 2019 Google Announced two additional link attributes “sponsored” and “ugc.”
    • Sponsored is for links that are paid or advertorial.
    • UGC is for links which come from user generated content.
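
In markup, the three attributes look like this (a sketch; the URLs below are placeholders):

```html
<a href="https://www.example.com/untrusted" rel="nofollow">untrusted or unvouched-for link</a>
<a href="https://www.example.com/promo" rel="sponsored">paid or advertorial link</a>
<a href="https://www.example.com/forum-post" rel="ugc">link from user generated content</a>
```

The attributes can also be combined (for example `rel="ugc nofollow"`) when a link fits more than one category.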


2004 Google Algorithm Updates

2004 February – Brandy

The Brandy update rolled out the first half of February and included five significant changes to Google’s algorithmic formulas (confirmed by Sergey Brin).

Over this same time period Google’s index was significantly expanded, by over 20%, and dynamic web pages were included in the index.

What else changed?

  • Google began shifting importance away from PageRank to link quality, link anchors, and link context.
  • Attention was given to link neighborhoods: how well your site is connected to others in your sector or space. This meant that outbound links became more important to a site’s overall SEO.
  • Latent Semantic Indexing (LSI) increased in importance. Tags (titles, metas, H1/H2) took a back seat to LSI.
  • Keyword analysis got a lot better: Google got better at recognizing synonyms using LSI.

2004 January – Austin

Austin followed up on Florida continuing to clean up spammy SEO practices, and push unworthy sites out of the first pages of search results.
What changed?

  • Invisible text took another hit
  • Meta-tag stuffing was a target
  • FFA (Free for all) link farms no longer provided benefit

Many SEOs also speculated that this had been a change to Hilltop, a page rank algorithm that had been around since 1998.


2003 Google Algorithm Updates

2003 November 16 – Florida

Google’s Florida update rolled out on November 16, 2003 and targeted spammy seo practices such as keyword stuffing. Many sites that were trying to game the search engine algorithms instead of serve users also fell in the rankings.

 



2003 September – Supplemental Index

Google split their index into main and supplemental. The goal was to increase the number of pages/content that Google could crawl and index. The supplemental index had less restrictions on indexing pages. Pages from the supplemental index would only be shown if there were very few good results from the main index to display for a search.

When the supplemental index was introduced some people viewed being relegated to the supplemental index as a penalty or search results “purgatory”.

Google retired the supplemental index tag in 2007, but has never said that it retired the supplemental index itself. That being said, it’s open knowledge that Google maintains multiple indices, so it is within the realm of reason that the supplemental index may still be one of them. While the label disappeared, many wonder if the supplemental index has continued to exist and morphed into what we see today as “omitted results.”

Sites found they were able to move from the supplemental index to the main index by acquiring more backlinks.


2003 July – Fritz (Everflux)

In July, 2003 Google moved away from monthly index updates (often referred to as the google dance) to daily updates in which a portion of the index was updated daily. These regular updates came to be referred to as “everflux.”

2003 June – Esmerelda

Esmerelda was the last giant monthly index update before Google switched over to daily index updates.

2003 May – Dominic

Google’s Dominic update focused on battling spammy link practices.

2003 April – Cassandra

Google’s Cassandra update launched in April of 2003 and targeted spammy SEO practices including hidden text, heavily co-linked domains, and other low-link-quality practices.

Google began allowing banned sites to submit a reconsideration request after manual penalties in April of 2003.


2003 February – Boston

Google’s first named update was Boston which rolled out in February of 2003. The Google Boston Update improved algorithms related to analyzing a site’s backlink data.

2002 Google Algorithm Updates

2002 September – 1st Documented Update

Google’s first documented search algorithm update happened on September 1, 2002. It was also the kickoff of “Google Dance” – large-scale monthly refreshes of Google’s search index.

SEOs were shocked by the update, with some claiming “PageRank [is] DEAD.” The update was also a little imperfect and included issues such as 404 pages showing up on the first page of search results.


2000 Google Search Updates

2000 December – Google Toolbar

Google launched its search toolbar for browsers. The toolbar highlighted search terms within webpage copy and allowed users to search within websites that didn’t have their own site search.


 

The post Google Algorithm Update History appeared first on LinkGraph.

]]>
https://linkgraph.io/blog/google-algorithm-update-history/feed/ 47
Mobile SEO – The Complete Guide 2022 https://linkgraph.io/blog/mobile-seo/ https://linkgraph.io/blog/mobile-seo/#respond Fri, 21 Oct 2022 15:33:43 +0000 https://linkgraph.io/?p=2918 –Updated for 2022– As of 2020 over 58% of site visits now come from mobile search traffic. If you aren’t taking mobile into account heavily enough, it’s […]

The post Mobile SEO – The Complete Guide 2022 appeared first on LinkGraph.

]]>
–Updated for 2022–

As of 2020, over 58% of site visits come from mobile search traffic. If you aren’t taking mobile into account heavily enough, it’s likely hurting your business.

The use of mobile devices is rapidly changing the way customers are searching, engaging, and buying. Consumers have access to faster Internet while they’re on-the-go. That means Internet traffic is increasing through mobile devices. Beyond social engagement and consuming content, they’re also making buying decisions.

Mobile Search is Often the First Step for Purchases

According to Morgan Stanley, 91% of adults keep their smartphones within arm’s reach. That’s ninety-one percent of ALL adults, and it’s shifting both business culture and research practices. Rather than dedicating time to researching a topic, users now perform micro-searches on the go, then follow up on those initially discovered options or solutions later.

How big is this trend? An IDG Global Solutions survey found 92% of senior execs own a smartphone used for business; 77% of those research business purchases from their mobile device, with 95% then finalizing related purchases via laptop/desktop. That’s a huge portion of the B2B purchase pool starting their journey from mobile. Missing a user during their initial mobile-based exploration may mean your business is losing out on a huge portion of the market.

Mobile Search is Often Location-Oriented

This trend is even more pronounced for local businesses: 58% of mobile users search for local businesses daily, and 89% of those users search for a local business at least once per month. We also learn from HubSpot that when consumers do a local search, 72% of them visit a store within five miles. What does this mean for a business with an Internet presence? It’s time to make it mobile-friendly.

What Does the Rise of Mobile Search Mean for Businesses?

Websites now need to be responsively designed so they can serve mobile users just as well as desktop users. Responsive design is a design that adapts to the size of the user’s viewport (i.e. screen) by changing font sizes, adjusting images, and even collapsing page elements to make navigation simpler. Responsive websites that follow modern design standards help users access and understand the information they need more quickly.

Additionally, users now view responsive functionality as a trust signal. A study conducted by socPub indicates that 57% of Internet users will not recommend a business with a poorly designed mobile site.

Because mobile users comprise an increasing share of searches and site visits, they now represent the largest source of traffic in a slew of markets, with new industry segments falling into this bucket each month. Our clients regularly pick up market share with simple mobile-friendly design updates, especially within industries that are traditionally late adopters.

Your Website is Now Your Storefront

Your site is now your storefront. If your site looks terrible or functions poorly, users will leave instead of working to get at your information – it costs a user nothing to click the next result in search.

Google Prioritizes Mobile-Optimized Sites

Google has switched over to mobile-first indexing, which prioritizes mobile-friendly sites over other sites in the organic search results. Even if your target consumers aren’t heavy mobile users yet, your site still needs to be mobile-optimized if you want to show up higher in the search results (even for desktop-based searches).

Users Are Making Purchase Decisions from Search Alone

With mobile devices rapidly changing the way consumers access information, your offsite optimizations are also becoming critical. For example, most users performing local searches never go past the search results themselves (i.e. they never click into a website at all). Local search users are typically able to surface the information they want directly within the search results through features like the local Map Pack.

How Can I Improve My Mobile SEO?

The first step toward reaching mobile users is having a mobile-friendly website, and responsive web design is currently the best approach to mobile-friendliness because:

  • You will serve the same content to both mobile and desktop users
  • The content will adapt responsively to all screen sizes and mobile device types
  • Search equity is centralized to a single URL for all pages
  • It’s a better user experience
  • Google prefers responsive design

What exactly is responsive design?

Responsive design is an approach for creating web pages in which layouts and content dynamically adapt to the size and orientation of the screen or viewport being used.

In the example below, you can see that in the desktop version of this responsive site the text and video are displayed side-by-side, while in the mobile version those elements have been stacked.

[Figure: the same responsive page shown in its desktop layout and its mobile layout]

This responsive theme adjusts to the width of different devices from smartphones to tablets, even large wide-screen viewports, by rearranging and resizing the design elements.
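The stacking behavior described above is typically implemented with CSS media queries. A minimal, hypothetical sketch (the class name and breakpoint are illustrative, not from any particular theme):

```css
/* Mobile-first default: stack text and video vertically */
.feature-row {
  display: flex;
  flex-direction: column;
}

/* On viewports 768px and wider, place them side by side */
@media (min-width: 768px) {
  .feature-row {
    flex-direction: row;
  }
}
```

For the breakpoints to behave correctly on phones, the page also needs a viewport meta tag such as `<meta name="viewport" content="width=device-width, initial-scale=1">`; without it, mobile browsers render the page at a zoomed-out desktop width.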

There have been a few ways to handle mobile sites since the invention of smartphones. The first two mobile design waves were plagued with usability issues and were hard to maintain. Let’s take a look at what didn’t work, and why you should consider migrating to a responsive design if you’re still employing one of these outdated mobile design tactics.

Outdated Approach #1: Mobile Subdomain, Separate Mobile Website

The first wave of design involved creating a different site entirely to serve as the mobile site. This approach involved serving a mobile version of the site using a different URL, a mobile URL. For those of you who have been around long enough, you may remember pages you visited from a mobile device redirecting from domain.com to m.domain.com.

This approach required setting up canonical tags for every page, as each mobile web page contained content duplicative of the desktop page. It also split the search equity for each page, as desktop users interacted with the desktop site while mobile users interacted with the mobile site.

When users shared pages from the site, the resulting backlinks were split between the mobile subdomain and the main domain, since separate URLs were served to each user group. It also meant that every time an edit was made to content on the desktop site, a second round of edits had to be made on the separate mobile site. Mobile pages under this paradigm often provided a worse user experience, as they typically served less content than the full desktop site did for desktop users.
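For reference, Google’s documented setup for separate mobile URLs relied on a pair of annotations (shown here with a hypothetical example.com domain): the desktop page declared its mobile counterpart with rel="alternate", and the mobile page pointed back with rel="canonical":

```html
<!-- On the desktop page: https://www.example.com/page -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the mobile page: https://m.example.com/page -->
<link rel="canonical" href="https://www.example.com/page">
```

Getting this pairing right on every single page was part of what made the separate-site approach so costly to maintain.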

Outdated Approach #2: Dynamic Serving of Mobile Sites

The next wave of design consolidated pages under a single URL, but dynamically served cached pages based on the user’s device type using a Vary HTTP response header.
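A sketch of what such a response looked like; the Vary header told caches and crawlers that the HTML returned for this URL differs by device:

```http
HTTP/1.1 200 OK
Content-Type: text/html; charset=UTF-8
Vary: User-Agent
```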

This iteration of mobile design allowed sites to consolidate search equity between their desktop site and mobile site. It also did away with the need for canonical tags on virtually every site page.

However, it meant that every time a device came out with new dimensions, a new instance of the site had to be spun up, formatted, and tested before it could be served to users. This system became increasingly difficult to maintain as the market diversified and mobile screen dimensions became rapidly non-standard. Dynamically serving a mobile version of your site was plagued with issues, including repeated problems with serving the desktop version to mobile users.

Current Best Practice: Responsive Design

Responsive design consolidates the mobile version of a webpage and the desktop version of a webpage under a single URL. It also serves the same instance of code, regardless of the size of the mobile screen or desktop viewport.

This allows site owners to combine their desktop SEO and Mobile SEO efforts, employing a single set of SEO best practices and strategies. Responsive design is easier to maintain as you don’t have to manage different content or code for a single page.

Instead, all elements fluidly rearrange to suit mobile visitors and desktop visitors as needed. If a user resizes their browser from full screen to half-screen, the design elements will shift accordingly so the user experience is largely unchanged.

How to Check If Your Mobile Site is Google-Friendly

In July 2019, there were over 1.69 billion more mobile searches than desktop searches performed in the US alone. Search itself has become mobile-first. The first place to start when checking your site for mobile optimization is checking out how Google views your site.

Mobile SEO Strategy is All About Google

Google holds over 90% of the market share for mobile search traffic in the U.S., in part because it has spent years optimizing search specifically for mobile users. Many of Google’s search results are so well optimized that mobile users don’t even need to click into an actual result to find the information they need.

Rich snippets and rich results now display enough information for users to take action based off of the search results alone, from finding movie times to the addresses of local businesses, to how to troubleshoot tech problems.

How did Google get so far ahead of the competition with mobile search? They started testing and prioritizing mobile features years ago, and as mobile search volume overtook desktop search volume, Google shifted to prioritizing mobile users over desktop users.

A Brief History of Google’s Mobile Search Results

In 2015 Google rolled out mobile-friendly search results, serving a separate set of search results to mobile devices. This update, often called Mobilegeddon, prioritized mobile-friendly websites in the search results.


Source: Google/SOASTA Research, 2017.

In 2016 Google began to experiment with mobile-first indexing, cataloguing the mobile version of page content, rather than the desktop version.

In March of 2018 Google formally began rolling out mobile-first indexing, migrating sites it had already indexed as desktop versions over to the mobile versions of their pages. To quote Google themselves, “Mobile-first indexing means that we’ll use the mobile version of the page for indexing and ranking, to better help our – primarily mobile – users find what they’re looking for.” Essentially the entire index is going mobile-first. This migration is still underway, and websites are notified in Search Console when they’ve been moved under Google’s mobile-first index.

In July of 2018 Google rolled out page speed as a mobile ranking factor, ranking sites with slow load times lower in the search results.

Figuring Out Which Trends Will Last

Over the past decade Google has also continually rolled out additional data-rich mobile-first search features from movie times, to reviews, to product images. Google often pivots when rolling out new features, as it continually tests and then prioritizes what works best for serving users the most valuable information.

For example, Google originally published a guide helping webmasters create separate mobile sites under the m.domain.com URL – a tacit approval of the process, only to pivot within a year to formally recommending responsive design under a single unified URL.

Similarly, the AMP (Accelerated Mobile Pages) standard has been pushed heavily in the past few years. AMP pages, which load in a fraction of the time of normal pages, seem to be struggling with many of the issues that m.domain.com mobile pages had back in the day.

Sites using AMP pages are often managing two sets of page content, with one set slimmed down to meet the AMP standard. There are also challenges with AMP pages being served from a Google URL rather than the site’s own domain. While Google recently addressed some of these concerns with signed exchanges, questions remain around whether link equity is being split between the AMP viewer URL, the original AMP source, and the AMP cache URL.

Trends that are here to stay? Responsive design, quality content that gets right to the point, and making sites as fast as humanly possible.

Check if Google is Flagging Mobile Issues

So what should you pay the most attention to in terms of Mobile optimization? If you already have a website, start with Google’s Mobile Friendly Test. This tool will give you an aggregate rating for whether or not Google thinks your site is mobile-friendly. The tool will also prompt you to view a full usability report in Google Search Console.

If you want to access this report on your own directly from Search Console, login to your account for the domain, and use the left-hand navigation to click into “mobile usability” under Enhancements.

Here you will find a list of the mobile issues that Google has detected on your site. Examples include text that is too small to read, clickable elements that are too close together, and content that is wider than the screen.

Click into any of these issues, and you’ll see more granular information to help you improve your mobile SEO, such as the pages where the errors are found. You’ll also see a space to validate that the error has been fixed once you make adjustments to your site.

These are errors Google is specifically recognizing and calling out for your site. From a search rankings perspective, these should be at the top of your list to fix.

Check if Google Is Indexing Your Webpages

Google can’t serve pages in the search results that it can’t see. Make sure that Google is indexing your pages for search.

Enable Crawl by Googlebot

Check your robots.txt file, and make sure that it’s not blocking Googlebot. Your robots.txt file can be used to block certain types of bots and crawlers, but if you’re trying to rank highly in the SERPs, Googlebot should not be one of them.

To check if your robots.txt file is blocking Googlebot, you can either use a free robots.txt tester or use the URL inspection feature in Search Console.
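For reference, a minimal robots.txt that leaves Googlebot free to crawl while fencing off a private area might look like this (the domain and paths are illustrative):

```text
# https://www.example.com/robots.txt
User-agent: *
# Block a private area, allow everything else
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```

The directive to watch out for is `Disallow: /` under `User-agent: *` (or `User-agent: Googlebot`), which blocks the entire site from being crawled.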

NoIndex

A few years ago you could check blocked resources straight from Google Search Console in a consolidated view, but as these issues became less prevalent, Google dropped the aggregate view. Secondary tools like Screaming Frog can still give you a full list of NOINDEX and NOFOLLOW pages from your site. Alternatively, you can check the status of individual URLs straight from Search Console using the URL inspection tool.
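The directives these tools detect are set either in a page’s HTML or, for non-HTML files, in an HTTP response header; for reference:

```html
<!-- In the page's <head>: keep this page out of the index
     and don't follow its links -->
<meta name="robots" content="noindex, nofollow">

<!-- Non-HTML resources (e.g. PDFs) use the equivalent HTTP header:
     X-Robots-Tag: noindex -->
```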

This tool also allows you to manually submit links and request indexing of new pages, revised pages, and pages that crawlers have yet to discover.

Checking if Your Mobile Site is User-Friendly

Now that you’ve resolved a majority of the technical usability issues, it’s a good idea to check for issues mobile users face that may not have been caught by Google.

How Does Your Site Appear on Mobile?

Start by taking a look at how your site appears on different devices. This free tool will let you select from a variety of mobile devices and desktops to give you a full sense of how your site looks on each.

You should quickly be able to see any major issues with formatting that could be hindering the mobile user experience, or making your site look unprofessional. Examples include poorly formatted text, grainy or stretched images, or overlapping page elements.

Work with your webmaster or web development team to clean up any design elements that aren’t displaying well on mobile. Once your site layout is mobile optimized, you’ll want to check that your site is compelling to mobile searchers on the Google search results page.

Are the Visible Portions of Page Titles and Metas Compelling?

Users only click into a site from search if the rich snippet, page title, and/or meta description are compelling. Your page’s title tag needs to front-load your target keyword(s), and your meta description should lead with the most pertinent information about your page.

Page titles can be very similar between pages, so meta descriptions can often make the difference for which result or results site visitors click.

Also keep in mind that rich snippets can provide even less space for title tags and meta descriptions. In the example below you can see how each result only displays about 3-4 words from the page title.

If you use a major platform like WordPress, there are SEO plugins that will help you manage your title tags and meta tags. If your site is custom, you may need to edit this information directly in the HTML code.
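A hedged sketch of what a front-loaded title tag and meta description look like in a page’s HTML head (the keyword, copy, and brand are invented for illustration):

```html
<head>
  <!-- Target keyword first, brand last; roughly 50-60 characters display in search -->
  <title>Mobile SEO Guide: Optimize Your Site for Mobile Search | Example Co.</title>
  <!-- Lead with the most pertinent information; roughly 150-160 characters display -->
  <meta name="description"
        content="Learn how to make your site mobile-friendly: responsive design, page speed fixes, and Google's mobile-first index, step by step.">
</head>
```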

If you’re seeing a good amount of organic traffic from your target keywords, the next step is to make sure that traffic is actually seeing your mobile optimized content.

Are You Losing Visitors to Page Speed?

Over half of mobile searchers will abandon a page that takes longer than three seconds to load. Separately, for every additional second it takes a page to load, conversions fall by 12% (Google, 2018).

To check your mobile page speed use Google’s PageSpeed Insights Tool, and see how quickly your site loads on a 4G connection. This tool will give you a granular breakdown of all speed issues you can address to improve your site speed.

Most major website platforms (WordPress, Squarespace, Wix, etc.) have native features and plugins that will automatically optimize image files for mobile devices to reduce page load times.
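If your platform doesn’t handle this automatically, the standard HTML mechanism is the srcset/sizes pair, which lets the browser download an appropriately sized image file for the device (filenames and widths are illustrative):

```html
<img src="/img/hero-800.jpg"
     srcset="/img/hero-400.jpg 400w,
             /img/hero-800.jpg 800w,
             /img/hero-1600.jpg 1600w"
     sizes="(max-width: 600px) 100vw, 800px"
     alt="Hero image for the landing page">
```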

Do Any Pages Have Super High Mobile Bounce Rates?

Bounce rates are a great indicator that a page is not providing value to users. If bounce rates are much higher on specific pages for mobile users than for desktop users, this is a sign that the page may have issues with mobile formatting or mobile load times, or that the relevant content takes too long to scroll to on mobile.

To check bounce rates, simply login to your Google Analytics dashboard. You’ll be able to view aggregate bounce rates for your site, bounce rates by page, and track how bounce rates change as you make adjustments to webpage content.

Avoid Intrusive Popups

Intrusive or poorly designed pop-ups can increase your bounce rates on mobile and tablet devices. Intrusive pop-ups can also hurt your organic search rankings, especially with Google: an update Google rolled out in 2016 devalues mobile pages that have intrusive pop-ups, lowering their rankings in the search results.

There are two major pop-up issues that can increase bounce rates and get a page devalued in the SERPs. Pop-ups that have not been optimized for mobile traffic can be impossible to close on small screens and may cause mobile searchers to bounce from your site. Pop-ups that prevent a user from accessing content on load will hurt your mobile SEO, especially with Google, which considers pop-ups that block site visitors from content to be “intrusive.”

Examples of intrusive pop-ups and interstitials:

  • A pop up that displays immediately, or while the user is trying to read through content
  • An interstitial that has to be exited before the user can access the main content
  • A full-screen interstitial that has to be scrolled past to access the main content

That doesn’t mean you should abandon popups entirely. Used correctly, and designed with mobile UX in mind, pop ups can help improve your conversion rate. These pop ups are ones that help the mobile user along their journey, are contextually relevant to the content, or are a legal requirement. Pop ups that appear as a user is looking to complete the next step in their journey are generally fine as well.

Examples of pop-ups and interstitials that are okay:

  • Pop ups that notify mobile searchers that a site uses cookies.
  • Pop ups that confirm a user’s age for restricted content or services.
  • Pop ups that take up a reasonable amount of room and are easy to dismiss.

Optimize Your Site for Voice Search

A report issued by PwC states that 71 percent of respondents prefer voice search to conducting a traditional search. Now that we know users prefer voice search, let’s look at how to optimize our websites to reach them.

1. Be concise. The average voice search answer is less than 30 words long. Avoid filler or unnecessary terms like “however” or “thus” and be as direct as possible while completely answering the question. Google has an entire guide outlining the type of responses selected for voice searches, and the biggest takeaway is that answers should be brief and direct.

2. Optimize for featured snippets. Voice searches pull in part from “featured snippets”: when someone asks a question using voice search, roughly 30 percent of the answers Google serves are drawn from these snippets.

3. Consider the user’s intent. When crafting your content, ask yourself what users are searching for before landing on your site. Doing this will help enhance the content’s relevance. Therefore, if you’re optimizing your page for a specific featured snippet, your goal should be understanding your visitor’s intent and providing them with an answer immediately.

4. Use long tail keywords and questions in headers. Often, voice searches occur as though the user is speaking to a human. Short, choppy keywords are rarely in use. Long-tail keywords and phrases are how people talk. So, when optimizing your site, consider using these phrases in conjunction with questions. That way, your website will pop up more often when users are trying to solve a problem, find a product, or use a service.

5. Optimize for local searches. Users are going to search with local intent. According to Small Business Trends, 58 percent of mobile users find local businesses using voice searches. Adding phrases to your content like “near me” or the name of your geographic area will help boost your rankings.

Are You Addressing the Customer’s Journey?

Mobile-friendly websites must think through the customer’s journey. Ask yourself these three questions:

  • What types of users hit my site? (Who are they, how old are they, what are their roles?)
  • What would those users want from my site? (e.g. to establish pricing, to find my business location, to complete an online purchase, to share a story)
  • Can each user easily complete their journey using only the main nav?

Your main navigation should help users quickly and easily get what they want from your site, without a user needing to use site search or “click around.” Once you have a handle on your audience segmentation and goals, you should confirm that your users are not facing any major barriers along each journey.

There are a few ways to do that, here are two:

  • If you have a program like Hotjar or Lucky Orange installed that allows you to view your own users’ onsite journeys – you can watch user recordings to see if users are struggling to complete tasks.
    • Ex: Users abandon scrolling because information is too far down a page
    • Ex: Users have a lot of “U-Turns” – pressing back almost immediately because what they wanted wasn’t on the page they clicked into.
    • Ex: Users rage-click an element that’s not opening or functioning correctly.
    • Ex: You see error messages displayed to the user from your site.
    • Ex: You see users begin conversion, but abandon forms or carts.
  • You can conduct direct user research:
    • Recruit users that you’re able to interact with directly
    • Request they complete specific tasks on the site
    • Have them explain their thinking and reactions as they interact with your site

Your marketing shouldn’t be only about what devices your potential customer is using, it should be about the journey they’re taking. What are their lifestyles, habits, and device preferences? Conduct research, surveys, and interviews with your current audience. This marketing tactic is an excellent opportunity to develop a relationship with your existing customer base. Offer incentives and prizes to those who choose to participate.

Create Journey-Driven Designs

Designing websites focused on mobile users means we have drastically less real estate, so minimalism is critical. The last thing a user wants to do is scroll endlessly through or resize your pages. According to a scrolling and attention study conducted by the Nielsen Norman Group, 74 percent of users indicated their viewing time is spent on the first two screens of content. Responsive, minimal design is the solution. You can accomplish this in a variety of ways, including:

  • Hiding content under sliders
  • Using sticky live chat or feedback widgets
  • Implementing mobile pop-ups
  • Redirecting to social media
  • Creating a bare-bones presentation
  • Eliminating sidebars
  • Taking advantage of banner space
  • Replacing graphics with a search bar

Pro-Tip: For mobile-users, one often overlooked difference is that tap-areas need to be large enough for users to click on interactive elements (links, buttons, drop-downs) with precision.
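Google’s mobile usability checks flag tap targets smaller than roughly 48x48 CSS pixels without spacing around them. A minimal sketch of enforcing that in CSS (the selectors are illustrative):

```css
/* Make interactive elements comfortably tappable on touch screens */
.nav-link,
button {
  min-height: 48px;
  min-width: 48px;
  padding: 12px 16px;
}

/* Keep adjacent targets from crowding each other */
.nav-link + .nav-link {
  margin-left: 8px;
}
```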

Mobile User Experience Optimization Recap

For local business:

  • Make sure to include NAP (name, address or service area, phone number) on your website.
  • Claim and complete your Google My Business (GMB) listing and your Bing Places account.
  • Optimize pages to include names of local cities and landmarks
  • Focus on location-based rich snippets like the Map Pack
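The NAP details above can also be exposed as structured data so search engines can read them unambiguously. A minimal JSON-LD sketch using schema.org’s LocalBusiness type (all values are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Coffee Co.",
  "telephone": "+1-555-555-0100",
  "url": "https://www.example.com",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St.",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701"
  }
}
</script>
```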

For all businesses:

  • Make use of structured data to leverage Google Search’s rich snippet features.
  • Confirm your responsive design is acting as-expected.
    • You can use a tool like this Responsive Design Checker to confirm how your site looks at the most common breakpoints
    • You can check out alerts and mobile feedback directly from Google through your site’s Google Search Console
    • Install a user-session recording software
    • Hotjar, for example, will let you see if your users are struggling in any areas (ex: pages are too long and users abandon before hitting content critical to conversion).
  • Focus on SPEED:
    • Optimize images for mobile (reduce file size). Pro-tip: start out with a responsive design or theme and it should handle this for you.
    • Minify CSS
    • Leverage caching
    • Enable Accelerated Mobile Pages (AMP)
    • Switch anything you still have in Flash over to HTML5

Final Thoughts

Mobile search remains the leader because everyone loves the convenience of using their devices. Your audience is busy, on the go, and living in a digitally driven world. As a result, their mobile queries will continue to rise. Even though mobile searches are similar to those on a desktop, your site must be optimized for your audience’s visits. Your site should be easy to use and support your customer’s journey. A mobile-friendly design that responds to the volume of mobile searches you receive should be your goal.

The post Mobile SEO – The Complete Guide 2022 appeared first on LinkGraph.

]]>
https://linkgraph.io/blog/mobile-seo/feed/ 0
SEO for Nonprofits: How to Improve Online Visibility and Donations https://linkgraph.io/blog/seo-for-nonprofits/ https://linkgraph.io/blog/seo-for-nonprofits/#respond Tue, 27 Sep 2022 18:27:26 +0000 https://linkgraph.io/?p=18218 For nonprofits looking to increase the online visibility of their charity or organization, nonprofit SEO can be a strategic, impactful, and affordable investment.  Nonprofit marketing and development […]

The post SEO for Nonprofits: How to Improve Online Visibility and Donations appeared first on LinkGraph.

]]>
For nonprofits looking to increase the online visibility of their charity or organization, nonprofit SEO can be a strategic, impactful, and affordable investment. 

Nonprofit marketing and development teams rely on a variety of digital marketing strategies, including content marketing, email marketing, and social media, to engage with their donor base.

Although these channels are important, they don’t necessarily get your content in front of the eyes of new users like search engine optimization can.

So if your nonprofit organization has not made SEO an integral part of your marketing efforts, this guide will help you get started. Here’s how to implement a nonprofit SEO strategy.

What is Nonprofit SEO?

Generally speaking, nonprofit SEO is the process of optimizing a nonprofit website or web page so that it appears as high as possible in the search engine results pages (SERPs) for relevant keywords.

This can be done through a variety of methods, including optimizing the website’s content, structure, and on-page elements like titles, metatags, and anchor text; building backlinks from high-quality websites; and optimizing the website for mobile devices.

Although SEO can benefit any industry, it is particularly helpful for those like nonprofit organizations which may have limited marketing budgets and need to get the most value for their marketing spend.

Why is SEO Important for Nonprofits?

With 3.8 million Google searches happening every minute, it’s clear that Internet users turn to search engines on a daily basis to answer their questions or find information.

That is no different when it comes to finding organizations they want to donate to, volunteer for, or simply learn more about their specific issues or causes.

For example, there are 7,000 searches every month for the keyword “cancer charities”:

[Screenshot: SearchAtlas keyword data for “cancer charities”]

And that is just one of the hundreds (to thousands!) of different keywords users rely on when looking for information about cancer-related nonprofits.

So what does that mean for your charity? It means there are endless opportunities for your nonprofit website to rank in search engines, connect with wider audiences, and grow support and awareness for your organization.

Benefits of SEO for Nonprofits and Charities

Here are some of the tangible ways that ranking in search results can impact your nonprofit.

Increase Awareness for your Organization

How can users support your organization if they don’t know you exist? 

Search engines are a great way to help users discover your organization and learn about the great work you do with specific causes.

Gain More Financial Support

Nonprofit SEO is especially important for organizations that rely on financial support from the general public.

By appearing as high as possible in the SERPs, a nonprofit can attract more visitors to its website. This can result in more donations.

Often people want to donate, but they don’t necessarily know where. They turn to search engines to give them answers.

[Screenshot: keyword research in Ahrefs]

If your organization is not ranking on the first page of the SERPs for relevant keywords, your potential donors will turn to other organizations to give their time, resources, and money.

Educate People About your Causes

Maybe your organization is doing innovative work that you want to share with users, journalists, and other organizations alike.

Having optimized content on your website can help you share that work with a wider audience. In addition, it can help you get more people invested in your causes, research, or advocacy. 

[Screenshot: SERP results from a nonprofit]

Journalists often want to link to research reports or studies in their articles. If your nonprofit website has this content, it can be a part of your SEO strategy.

Getting Started with SEO for Nonprofit Organizations

SEO can feel complicated or intimidating for those who don’t understand it. Ultimately, it’s all about creating high-quality content for users and giving them a great website experience.

Keyword Research for Nonprofits

The foundation of SEO is keyword research. Keywords in SEO are the words and phrases that users type into search engines to find content like yours.

You can look at keywords as the roads and bridges that connect your nonprofit organization to your target audience. 

Ideally, your web pages should rank for keywords that have strong relevance to what services your nonprofit organization provides.

If you aren’t quite sure how to do keyword research, you can always order keyword research from our SEO experts in our builder. But if you want to do keyword research on your own, you can register for our SEO software and perform keyword research yourself. 

Here’s how to get started in finding the best keywords for your nonprofit.

1. Brainstorm Relevant Keywords and Subtopics

One of the best places to start is simply to brainstorm what users might be searching for in Google that has relevance to your nonprofit.

Here are some questions to ask during your brainstorming:

  • What is the primary purpose of my organization?
  • Who do we serve?
  • What are common questions people ask about our organization?
  • What questions do users ask about our primary cause or issues?
  • Where is my target audience located? 
  • What events does my organization regularly put on?
  • What issues do our donors care about?
  • What volunteer or service opportunities are available at my nonprofit?
  • And others!

These are just a few examples to get you thinking about what searches might be relevant to your website content.

2. Make a Keyword List Using a Keyword Tool

Once you have some ideas to start with, you can use a keyword research tool to get more information on what people are actually searching for in Google.

For example, let’s say my charity puts on an annual 5k walk that raises awareness and money for my organization. By using a keyword tool, I can see what relevant keywords users search for in relation to these events.

[Screenshot: keyword research in SearchAtlas]

If I have a landing page on my website about this event, I may want to optimize it for one of the above keywords. 

Here’s another example. Let’s say my organization is an animal shelter and we have the below landing page that describes our volunteer opportunities.

Ranking for relevant keywords below could mean more potential volunteers discovering this web page in search engines and submitting their information or signing up!

Once you find keywords that you know have relevance to your content, add them to a list in your keyword tool. You may want to make separate lists for each of the individual pages that you have on your website that you want to rank in search engines.

3. Choose Your Keyword Targets

Once you have a keyword list, you will then want to move forward with choosing keyword targets.

Ideally, each rank-worthy web page on your nonprofit website will target one primary keyword, along with some secondary keywords that have semantic and topical relevance. In the SEO world, this is referred to as a keyword cluster.

What makes a keyword the right choice? Here are the most important metrics to remember.

  • Search Volume: This is the number of times that users are searching this keyword in Google every month. You will want to choose keyword targets with higher search volume so you have more opportunities for your content to be seen by users.
  • Cost-per-click: This is the price that advertisers pay to target this keyword in a Google Ads campaign. If this number is high, it’s a good sign that the keyword is valuable and brings qualified traffic.
  • Keyword Difficulty: This is a keyword metric that estimates how competitive it is to rank for a keyword. If you are a newer nonprofit, you may want to consider keywords that have lower KD scores. This will give you a better chance of getting on page one. 

Choosing keyword targets is arguably one of the more strategic parts of SEO, so if you’re not sure how to move forward with keyword targeting, you may want to consider working with an SEO strategist.

You can even book a free call with one of our SEO professionals to talk through your keyword goals.

Onpage Optimization for Nonprofits

The next stage of your nonprofit SEO strategy is optimizing your web pages for your target keywords.

On-page optimization means improving the content signals on your web pages so Google sees them as high-quality and relevant to the target keyword.

You will do the process of on-page optimization for each web page that you want to rank in search results. I’ll model the process with this homepage from the Humane Society of Houston.

We want this page to rank for the below keywords related to animal shelters.

These keywords were chosen because they have high search volume, low Keyword Difficulty, and strong relevance to our website’s content.

1. Create High-Quality, In-Depth Content

The first step of the optimization process is making sure that your web page content is valuable, high-quality, and relevant to users’ search intent.

What is search intent? Well, it’s the intention behind the keyword.

For example, someone searching for “animal shelter houston” may want information about a variety of things. Maybe they want to adopt a pet. Maybe they want to volunteer. They might just be looking for a place to drop off a dog they found in their backyard.

SEO strategists would classify this particular keyword as having “informational” search intent. The user is simply trying to find more information.

The Humane Society of Houston does a great job of including a broad picture of their various services on their homepage. This means users of all three intentions listed above can find what they are looking for.

2. Optimize your Meta Data

Meta tags are HTML tags that live in the source code of your web page. They tell search engines what your content is about.

They are also visible to users in the SERPs, as they form the text of your search engine result.

There are a variety of tags that you can optimize on your nonprofit website to increase your chances of ranking for your target keywords.

They can also make your content more engaging and clickable for users.

Title Tag

The title tag should give your users a good idea of what your web page is about. It should also include the primary keyword that you want your website to rank for. 

Here is what the title tag looks like in HTML:
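Here’s a hedged sketch for a hypothetical animal shelter page (the organization name and wording are illustrative, not a real example):

```html
<head>
  <!-- Primary keyword near the front, total length under 60 characters -->
  <title>Houston Animal Shelter | Adopt, Volunteer &amp; Donate</title>
</head>
```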

Title tags should also not exceed 60 characters, as this is roughly the maximum that Google will display in the SERPs.

Meta Description

The meta description gives more information to the user about your web page content.

However, Google will sometimes show its own meta description based on what it thinks the user is looking for. Still, you should make sure that your meta description includes your target keywords and meets character-count best practices.

For meta descriptions, that means 120-160 characters!
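Here’s a hedged sketch of a meta description for a hypothetical animal shelter:

```html
<!-- Includes target keywords and a clear call to action -->
<meta name="description" content="Houston Animal Shelter helps rescued dogs and cats find loving homes. Learn how to adopt a pet, volunteer, or donate today.">
```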

Robots Tag

The robots tag tells search engine robots whether or not they should crawl the web page content and add it to their index.

If you want search engines to index and promote a web page, then you will want to use the “index, follow” directive in your robots tag.

There may be pages on your website that you don’t want to rank in search engine results, like thank-you or confirmation pages. On those pages, you may want to consider adding a “noindex, nofollow” directive so Google knows not to add the page to its index. You can review the following article for more information on robots tags and directives.
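A sketch of both directives using a standard meta robots tag (page names are hypothetical):

```html
<!-- On pages you want ranked (this is also the default if no tag is present) -->
<meta name="robots" content="index, follow">

<!-- On a thank-you or confirmation page you want kept out of the index -->
<meta name="robots" content="noindex, nofollow">
```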

Header Tags

Header tags divide your web page into sections, making it easier for users to navigate and find the information they are looking for.

For a page like your homepage, your header tags can outline your nonprofit organization’s services, events, or volunteer opportunities.
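A homepage outline for a hypothetical animal shelter might use headers like these (one h1, with h2s for each major section):

```html
<h1>Houston Animal Shelter</h1>
<h2>Adopt a Pet</h2>
<h2>Volunteer Opportunities</h2>
<h2>Upcoming Events</h2>
<h2>Donate</h2>
```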

Nonprofit Content Strategy for SEO

Beyond optimizing your existing web page content, creating new content on a regular basis gives your nonprofit more opportunities to appear in the SERPs. 

Content like blogs, infographics, white papers, annual reports, and more can all rank in search engines and drive website traffic.

Here’s an example of an annual report from the Crohn’s and Colitis Foundation that both provides valuable information to users and has strong ranking potential.

This type of content can also be used in your social media and email marketing efforts, since great content gives people a reason to visit your website.

So make sure that you are doing the work of creating a content calendar and investing in content development. You can use our content planner tool to help you identify relevant keywords and generate blog and article ideas.

Local SEO for Nonprofits

Google will show users local search results if it believes that the user is searching for service offerings in their specific area. 

Ranking for local searches is all about distance, prominence, and relevance. If your nonprofit organization serves your local area, local SEO strategies can help you appear in location-based searches like “pet rescues near me” and in the Google Map Pack.

To optimize for local searches, here are some strategies to consider:

  • Include location information on every page of your website. This includes address, hours of operation, phone number, and essential contact information.
  • Get listed in online directories. Google will look for consistent and accurate information about your charity across these directories when determining whether to rank pages from your website. You can get listed in hundreds of directories for only $20 a month.
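The first strategy above can be as simple as a consistent site-wide footer. A sketch, with a hypothetical name, address, phone number, and hours:

```html
<footer>
  <!-- Keep this information identical across every page and directory listing -->
  <p>Houston Animal Shelter</p>
  <p>123 Main St, Houston, TX 77002</p>
  <p>(555) 555-0123</p>
  <p>Open Mon&ndash;Sat, 9am&ndash;5pm</p>
</footer>
```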

Link Building for Nonprofits

Another major part of your nonprofit SEO strategy will be link building.

Link building is the process of getting other websites on the internet to link to yours.

Why do you need link building? Well, backlinks are Google’s top ranking factor, as they signal to Google crawlers that your website is trusted and reputable in your industry.

Here are a few ways to get started with link building for your nonprofit:

  • Ask your nonprofit partners: Sometimes all you have to do is ask! If your nonprofit has partners in the space, see if they will include a link to your website from one of their own web pages!
  • Reach out to bloggers and journalists: Bloggers and journalists are often looking for high-quality content and relevant stories for their own coverage.
  • Order a link building campaign: You can order backlinks from agencies like LinkGraph that do white hat, Google compliant link building. We will create original content that includes links back to your website and pitch it to relevant websites that are looking to feature high-quality content.

Page Speed Optimization

Another important factor that impacts whether or not your nonprofit organization’s website ranks in searches is how fast and responsive your web pages are for users.

Nobody likes a slow website, which is why Google prioritizes web pages in the SERPs that are high-performing. To test out how your website measures up, you can use Google’s Pagespeed Insights tool.

If your web pages have low scores on mobile and desktop, it could mean that your users are leaving your website when items are not loading quickly enough.

Investing in page speed optimization can help you improve your search engine rankings and create a better experience for users.

Next Steps for your Nonprofit Marketing Team

Now that you understand the positive impact that SEO can have on the growth and visibility of your organization, it’s time to get started with SEO.

Here are a few next steps if you want to start.

  1. Create a Google Search Console Account: This free platform from Google will let you start tracking your keywords, organic traffic, and impressions. 
  2. Reach out to one of our SEO Strategists: Our team of experts can help you get started by identifying keyword opportunities and where your website can see the most organic growth.
  3. Sign up for a free trial of our SEO software: You’ll get access to a range of SEO tools that allow you to start DIYing your SEO. From keyword research to keyword tracking, you can do it all in our platform.

The post SEO for Nonprofits: How to Improve Online Visibility and Donations appeared first on LinkGraph.
