Content Pruning Guide for Content Managers and SEOs

If you’re looking for ways to improve the SEO of your website, content pruning is an effective method to consider. 

Content pruning is all about removing any unnecessary content from your website, which can help your website rank higher in search engine results and can also improve user experience.

In this article, you will learn how to do content pruning for SEO, from identifying what needs to be pruned to taking steps to ensure that it’s done correctly. Keep reading to find out more.

What is Content Pruning?

Content pruning involves a thorough review of all content on a website to identify any pages that search engine algorithms could see as irrelevant or low quality. Once identified, those low-quality pages are removed from the website or replaced with better content. 

A part of regular website maintenance is making sure that all of the content on the website is up-to-date and relevant to the website’s goals. This can help ensure that a website has content that is useful to its visitors and provides value to users who arrive from search engines. 

By removing low-quality or outdated content, websites can see improved search visibility for their highest-value, highest-converting web pages.

Why Remove Content From My Website?

Content production takes time and resources. So you may be wondering: Why remove content from my website after all the work it took to create it?

Although it may feel counterintuitive to your content strategy, content pruning can actually have major benefits to your search engine performance. This is even more true for websites that have a robust SEO content strategy and are publishing new web pages on a regular basis.

Some of those benefits include the following:

  • Improve a website’s visibility by allowing search engines to index your best and highest-converting web pages
  • Ensure visitors are presented with the most up-to-date information
  • Provide higher-quality content and a better user experience
  • Prevent visitors from seeing any low-quality pages
  • Ensure your crawl budget is spent on rank-worthy content

Any content on your website that doesn’t pull its weight in either traffic or conversions isn’t actually bringing value to your business.

By taking the time to regularly prune their content, content managers can ensure that their websites perform at their best. That’s why content pruning is sometimes the right choice for your content strategy.

What Makes a Web Page Prunable?

Here are some of the qualities to look for when searching for content on your website that may need to be pruned.

Low-Quality

Bad content can have a negative impact on a website’s rankings. Low-quality pages include those with duplicate content, thin content, poor usability, or little useful information. Content pruning can help to identify and remove such pages, allowing search engines to easily index the more relevant, higher-quality content on the website.

Duplicate Content

Duplicate content refers to web pages with the same or very similar content that search engine crawlers do not identify as distinct. Google does not want to see duplicate content on a website unless it has been clearly identified as such via a canonical tag.
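For example, a duplicate or near-duplicate page can declare its preferred version with a canonical tag in its <head> (the URL here is illustrative):

```html
<link rel="canonical" href="https://domain.com/preferred-page/">
```

Search engines will then generally consolidate ranking signals onto the canonical URL rather than treating the pages as competing duplicates.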

Thin Content

Thin content is often short-form content that doesn’t provide any real value to the user. Although there is no exact content length that Google privileges in search engine results, experiments have shown that longer, in-depth content tends to rank higher.

Most often, thin content can be combined with other pages on your website to provide a more comprehensive, in-depth answer to a topic, question, or keyword query. Combining content, or redirecting thin pages to more in-depth ones, is also a part of the content pruning process.

Outdated Content

The reality is, the content on your website will become outdated over time. This is why creating evergreen content is important. However, even evergreen long-form content is unlikely to last forever without updating. Trends, technologies, and knowledge change, and web pages should include the most up-to-date, useful information for search engines. 

Also, outdated information can be confusing for visitors and lead to a poor user experience. Removing outdated content can ensure that visitors are presented with the most relevant and useful information.

Under-performing Content

If you have a web page on your website that does not get traffic or conversions, what value is it bringing your business? If the web page does not rank in search results, convert users, or play a vital part in the buyer journey, it doesn’t really have a place on your website unless you take the time to improve its content.

How to Find Pages for Content Pruning

You can use the Page Pruning tool in the SearchAtlas dashboard to discover pages that may be eligible for content pruning.

To find the tool, navigate to Site Auditor > Page Pruning. 

This tool will show you any pages that are eligible for pruning due to a variety of reasons:

  • Low organic impressions
  • Low clicks
  • Indexability
  • Total ranking keywords
  • Average position
  • Content quality/scores

Remember, just because a page appears on this list doesn’t mean that it has to be pruned/deleted, but that it may be eligible based on its performance metrics.
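As a rough illustration of the logic behind such a report, here is a minimal Python sketch that flags prune candidates from performance metrics. The field names and thresholds are hypothetical, not SearchAtlas’s actual output; real tools weigh many more signals.

```python
# Hypothetical sketch: flag pages whose organic performance falls below
# illustrative thresholds. Field names and cutoffs are invented for this example.

def prune_candidates(pages, min_clicks=10, min_impressions=100):
    """Return URLs whose clicks AND impressions are both below the thresholds."""
    return [
        page["url"]
        for page in pages
        if page["clicks"] < min_clicks and page["impressions"] < min_impressions
    ]

pages = [
    {"url": "/old-news-2017/", "clicks": 2, "impressions": 40},
    {"url": "/best-seller/", "clicks": 900, "impressions": 12000},
]
print(prune_candidates(pages))  # ['/old-news-2017/']
```

A page flagged this way is only a candidate: improving, redirecting, or consolidating it may be better than deleting it.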

Next Steps for Page Pruning

Once you have reviewed the software’s suggestions and confirmed that the pages are eligible for pruning, here are your options for next steps.

1. Improve the Content on the Page

The underperformance of the page may rest in the fact that the content is thin or is not providing a comprehensive answer to the user’s questions. 

You can look to the “Boostable” tab in the Page Pruning tool to identify pages that might need just a slight content score boost.

The URLs that are listed here are already earning organic impressions but are not seeing as much success in organic traffic. Most likely, Google sees those pages as relevant but is not ranking them on the first page as a result of the content.  

You can use the SEO Content Assistant in your SearchAtlas dashboard to discover ways to strengthen and improve your content. Or, use the on-page audit tool to see what on-page elements may be impacting your performance.

Follow the guided suggestions for focus terms, headings, questions, and internal links. Include them on the page to make the content more rank-worthy. 

2. Update the Content to be More Evergreen

If your content covers trends or keywords with seasonal search volume, that may explain its underperformance.

Consider updating the content with more evergreen information so the content has a longer shelf life in the SERPs.

Also, make sure that the information on the page is up-to-date with accurate, relevant information. Over time, links may break or content may become outdated. Updating your blogs and articles every 1-2 years should be a part of your regular website maintenance.

3. Build Backlinks to the Page

If both the SEO Content Assistant and the on-page audit tool confirm that your content has high scores and is rank-worthy, you may just need a bit more of a link boost.

Backlinks are among Google’s most important ranking factors. If you don’t have very many backlinks pointing to your web page, that may be a reason why it is not ranking on the first page.

You can use the backlink research tool in your dashboard to see which of your web pages have the least amount of link equity.

Consider investing in a link-building campaign in order to improve the off-site signals of the content. Doing so is likely to improve the overall keyword rankings, impressions, and organic clicks.

4. Reoptimize the Page for a Different Keyword

Another possible explanation for your content’s poor performance may be keyword related. 

Some keywords are more competitive than others. If you optimized the page for a keyword that is out of reach, reoptimization may be your next step.

When choosing keywords for SEO, you want to make sure your website has a realistic chance of ranking for the target keyword. Websites with higher Domain Authority will stand a better chance of ranking for competitive keywords.

At LinkGraph, we suggest keywords that are less than or equal to your website’s Domain Authority.

So once you find a more realistic goal, optimize for that keyword instead. This will likely involve changing metadata, website copy, and headings on the page. But it can make a huge difference in improving organic performance.

5. Redirect the Page to a More High-Quality One

A page may be flagged for pruning because Google is ranking a more helpful piece of content on your website.

This is known as keyword cannibalization. It happens when two pieces of content are very similar and Google doesn’t know which to promote. If there is a similar page that ranks less often, you can do your “content pruning” by adding a 301 redirect from the less comprehensive page to the better-performing one. 
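On an Apache server, for example, such a redirect might be sketched in an .htaccess file like this (the paths are illustrative):

```apache
# Permanently redirect the weaker page to the better-performing one.
Redirect 301 /old-thin-page/ https://domain.com/comprehensive-guide/
```

Other servers and CMS platforms offer equivalent redirect settings; the key point is that the redirect is a 301 (permanent), so link equity is passed to the surviving page.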

6. Combine Thin Content into a More Comprehensive Resource

If you have a series of pages that are thin on content but relate to a similar cluster of keywords, consider combining those pages into a more useful, long-form resource.

Why? Because Google likes to promote content that satisfies users’ search intent. That means not only answering their initial questions but all of the additional questions that might follow regarding that primary topic. 

So before giving up on that set of keywords entirely, combine those various pages into one page. Then, see if the overall keyword rankings improve.

7. Consider Removing the Page Entirely

This is the last step to consider, once you have concluded that none of the above steps help to elevate the content’s performance.

The reality is, if a piece of content is not driving website traffic, converting users, or serving as an essential part of the buying journey, it doesn’t really deserve space on your website. 

Take the ultimate step and trim that content branch off your tree.

Conclusion

Making content pruning a regular part of your website maintenance is a good habit to get into. This is especially true for websites that publish a lot of content and have a robust SEO strategy.

You can also use the same SearchAtlas tools to scale up your content marketing and blog strategy. Connect with one of our product managers to learn more about our enterprise SEO software platform.

Noindex Nofollow and Disallow: Search Crawler Directives

There are three directives (commands) that you can use to dictate how search engines discover, store, and serve information from your site as search results:

  • NoIndex: Don’t add my page to the search results.
  • NoFollow: Don’t look at the links on this page.
  • Disallow: Don’t look at this page at all.

These directives allow you to control which of your site pages can be crawled by search engines and appear in search.

What does No Index mean?

The noindex directive tells search crawlers, like Googlebot, not to include a webpage in their search results.

Indexing is the process by which Google scans, or ‘crawls,’ the internet for new content, which is then added to the search engine’s library of search-accessible content.

How Do You Mark A Page NoIndex?

There are two ways to issue a noindex directive:

  1. Add a noindex meta tag to the page’s HTML code
  2. Return a noindex header in the HTTP request

By using the noindex meta tag on a page, or a noindex HTTP response header, you are essentially hiding the page from search.
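The meta-tag method is shown in the examples below; the HTTP response header method uses the X-Robots-Tag header, which might look like this in an illustrative response:

```http
HTTP/1.1 200 OK
X-Robots-Tag: noindex
Content-Type: text/html
```

The header approach is especially useful for non-HTML resources such as PDFs, where a meta tag cannot be added.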

The noindex directive can also be used to block only specific search engines. For example, you could block Google from indexing a page but still allow Bing:

Example: Blocking All Search Engines

<meta name="robots" content="noindex">

Example: Blocking Only Google

<meta name="googlebot" content="noindex">

Please note: As of September 2019, Google no longer respects noindex directives in the robots.txt file. Noindex now MUST be issued via HTML meta tag or HTTP response header. For more advanced users, disallow still works for now, although not for all use cases.

What is the difference between noindex and nofollow?

It’s the difference between storing content and discovering content:

noindex is applied at the page-level and tells a search engine crawler not to index and serve a page in the search results.

nofollow is applied at the page or link level and tells a search engine crawler not to follow (discover) the links.

Essentially the noindex tag removes a page from the search index, and a nofollow attribute removes a link from the search engine’s link graph.

NoFollow As a Page Attribute

Using nofollow at a page level means that crawlers will not follow any of the links on that page to discover additional content, and the crawlers will not use the links as ranking signals for the target sites.

<meta name="robots" content="nofollow">

NoFollow as a Link Attribute

Using nofollow at a link level prevents crawlers from exploring a specific link, and prevents that link from being used as a ranking signal.

The nofollow directive is applied at the link level using a rel attribute within the <a href> tag:

<a href="https://domain.com" rel="nofollow">

For Google specifically, using the nofollow link attribute will prevent your site from passing PageRank to the destination URLs.


However, Google announced that as of March 1, 2020, the search engine treats nofollow links as “hints” that contribute to a site’s overall search authority.

Why Should You Mark a Page as NoFollow?

For the majority of use cases, you should not mark an entire page as nofollow – marking individual links as nofollow will suffice.

You would mark an entire page as nofollow if you did not want Google to view the links on the page, or if you thought the links on the page could hurt your site.

In most cases, blanket page-level nofollow directives are used when you do not have control over the content being posted to a page (ex: user-generated content).

Some high-end publishers have also applied blanket nofollow directives to their pages to dissuade their writers from placing sponsored links within their content.

How Do I Use NoIndex Pages?

Mark pages as noindex that are unlikely to provide value to users and should not show up as search results. For example, pages that exist for pagination are unlikely to have the same content displayed on them over time.

domain.com/category/results?page=2 is unlikely to show a user better results than domain.com/category/results?page=1, and the two pages would only compete with each other in search. It’s best to noindex pages whose only purpose is pagination.

Here are types of pages you should consider noindexing:

  • Pages used for pagination
  • Internal search pages
  • Ad-Optimized Landing pages
    • Ex: Only displays a pitch and sign up form, no main nav
    • Ex: Duplicate variations of the same content, only used for ads
  • Archived author pages
  • Pages in checkout flows
  • Confirmation Pages
    • Ex: Thank you pages
    • Ex: Order complete pages
    • Ex: Success! Pages
  • Some plugin-generated pages that are not relevant to your site (ex: if you use a commerce plugin but don’t use their regular product pages)
  • Admin pages and admin login pages

Marking a Page Noindex and Nofollow

A page marked both noindex and nofollow will block a crawler from indexing that page, and block a crawler from exploring the links on the page.
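In HTML, for example, both directives can be combined in a single robots meta tag:

```html
<meta name="robots" content="noindex, nofollow">
```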

Essentially, what a search engine will see on a webpage depends on how you’ve used the noindex and nofollow directives.


Marking an Already Indexed Page as NoIndex

If a search engine has already indexed a page, and you mark it as noindex, then next time the page is crawled it will be removed from the search results.

For this method of removing a page from the index to work, you must not be blocking (disallowing) the crawler with your robots.txt file.

If you are telling a crawler not to read the page, it will never see the noindex marker, and the page will stay indexed although its content will not be refreshed.

How do I stop search engines from indexing my site?

If you want to remove a page from the search index, after it has already been indexed, you can complete the following steps:

  1. Apply the noindex directive. Add the noindex attribute to the meta tag or HTTP response header.
  2. Request that the search engine crawl the page. For Google, you can request re-indexing of the page in Search Console. This will trigger Googlebot to crawl the page and discover the noindex directive. You will need to do this for each search engine from which you want the page removed.
  3. Confirm the page has been removed from search. Once you’ve requested the crawler revisit your webpage, give it some time, and then confirm that your page has been removed from the search results. You can do this by going to any search engine and entering a site: search for the target URL (ex: site:domain.com/target-page).

    If your search returns no results, then your page has been removed from that search index.
  4. If the page has not been removed, check that you do not have a “disallow” directive in your robots.txt file. Google and other search engines cannot read the noindex directive if they are not allowed to crawl the page. If you do, remove the disallow directive for the target page, and then request crawling again.
  5. Optionally, set a disallow directive for the target page in your robots.txt file: Disallow: /page$
    You’ll need to put the dollar sign at the end of the URL in your robots.txt file or you may accidentally disallow any pages under that page, as well as any pages that begin with the same string. Ex: Disallow: /sweater will also disallow /sweater-weather and /sweater/green, but Disallow: /sweater$ will only disallow the exact page /sweater.

How to Remove a Page from Google Search

If the page you want removed from search is on a site that you own or manage, most sites can use the Webmaster URL Removal Tool.

The Webmaster URL removal tool only removes content from search for about 90 days. If you want a more permanent solution, you’ll need to use a noindex directive, disallow crawling in your robots.txt file, or remove the page from your site. Google provides additional instructions for permanent URL removal in its documentation.

If you’re trying to have a page removed from search for a site that you do not own, you can request that Google remove the page if it meets one of the following criteria:

  • Displays personal information like your credit card or social security number
  • The page is part of a malware or phishing scheme
  • The page violates the law
  • The page violates a copyright

If the page does not meet one of the criteria above, you can contact an SEO firm or PR company for help with online reputation management.

Should you noindex category pages?

It is usually not recommended to noindex category pages, unless you are an enterprise-level organization spinning up category pages programmatically based on user-generated searches or tags and the duplicate content is getting unwieldy.

For the most part, if you are tagging your content intelligently, in a way that helps users better navigate your site and find what they need, then you’ll be okay.

In fact, category pages can be goldmines for SEO as they typically show a depth of content under the category topics.

Take a look at this analysis we did in December 2018 to quantify the value of category pages for a handful of online publications.

*Analysis performed using AHREFS data.

We found that category landing pages ranked for hundreds of page 1 keywords, and brought in thousands of organic visitors each month.

The most valuable category pages for each site often brought in thousands of organic visitors each.

Take a look at the EW.com data: we measured the traffic to each page (represented by the size of the circle) and the value of the traffic to each page (represented by the color of the circle).

Monthly Organic Traffic to Page = Size
Monthly Organic Value of Page = Depth of Color

Now imagine the same charts, but for product-based sites where visitors are likely to make active purchases.

That being said, if your categories are similar enough to cause user confusion or to compete with each other in search, then you may need to make a change:

  • If you are setting the categories yourself, then we would recommend migrating content from one category to the other and reducing the total number of categories you have overall.
  • If you are allowing users to spin up categories, then you may want to noindex the user generated category pages, at least until the new categories have undergone a review process.

How do I stop Google from indexing subdomains?

There are a few options to stop Google from indexing subdomains:

  • You can add a password using an .htpasswd file
  • You can disallow crawlers with a robots.txt file
  • You can add a noindex directive to every page in the subdomain
  • You can 404 all of the subdomain pages

Adding a Password to Block Indexing

If your subdomains are for development purposes, then adding an .htpasswd file to the root directory of your subdomain is the perfect option. The login wall will prevent crawlers from indexing content on the subdomain, and it will prevent unauthorized user access.

Example use cases:

  • Dev.domain.com
  • Staging.domain.com
  • Testing.domain.com
  • QA.domain.com
  • UAT.domain.com
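As a sketch, the .htaccess file in the subdomain’s document root might look like this on an Apache server (the file path and realm name here are illustrative, and the .htpasswd file must already exist):

```apache
# Require a valid username/password for every request to this subdomain.
AuthType Basic
AuthName "Restricted Development Area"
AuthUserFile /var/www/.htpasswd
Require valid-user
```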

Using robots.txt to Block Indexing

If your subdomains serve other purposes, then you can add a robots.txt file to the root directory of your subdomain. It should then be accessible as follows:

https://subdomain.domain.com/robots.txt

You will need to add a robots.txt file to each subdomain that you are trying to block from search. Example:

https://help.domain.com/robots.txt

https://public.domain.com/robots.txt

In each case, the robots.txt file should disallow crawlers. To block most crawlers with a single command, use the following code:

User-agent: *

Disallow: /

The star (*) after User-agent: is called a wildcard; it matches any sequence of characters. Using a wildcard sends the disallow directive to all user agents regardless of their name, from Googlebot to Yandex.

The forward slash (/) after Disallow: tells the crawler that all pages on the subdomain are included in the disallow directive.

How to Selectively Block Indexing of Subdomain Pages

If you would like some pages from a subdomain to show up in search, but not others, you have two options:

  • Use page-level noindex directives
  • Use folder or directory-level disallow directives

Page level noindex directives will be more cumbersome to implement, as the directive needs to be added to the HTML or Header of every page. However, noindex directives will stop Google from indexing a subdomain whether the subdomain has already been indexed or not.

Directory-level disallow directives are easier to implement, but will only work if the subdomain pages are not in the search index already. Simply update the subdomain’s robots.txt file to disallow crawling of the applicable directories or subfolders.
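For example, a subdomain’s robots.txt that blocks crawling of only certain directories might look like this (the directory names are illustrative):

```
User-agent: *
Disallow: /internal/
Disallow: /search/
```

Everything outside the listed directories remains crawlable.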


How Do I Know if My Pages are NoIndexed?

Accidentally adding a noindex directive to pages on your site can have drastic consequences for your search rankings and search visibility.

If you find a page isn’t seeing any organic traffic despite good content and backlinks, first spot check that you haven’t accidentally disallowed crawlers from your robots.txt file. If that doesn’t solve your issue, you’ll need to check the individual pages for noindex directives.
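Checking can also be scripted. Here is a minimal Python sketch, using only the standard library, that detects a noindex robots meta tag in a page’s HTML (it does not cover the X-Robots-Tag header variant):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Records whether a robots/googlebot meta tag contains 'noindex'."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attr = dict(attrs)
        name = (attr.get("name") or "").lower()
        content = (attr.get("content") or "").lower()
        if name in ("robots", "googlebot") and "noindex" in content:
            self.noindex = True

def has_noindex(html):
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex

print(has_noindex('<meta name="robots" content="noindex">'))       # True
print(has_noindex('<meta name="robots" content="index, follow">')) # False
```

Running this against a page’s HTML source (fetched however you like) gives a quick yes/no before digging into CMS settings.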

Checking for NoIndex on WordPress Pages

WordPress makes it easy to add or remove this tag on your pages. The first step in checking for noindex on your pages is simply toggling the Search Engine Visibility setting within the “Reading” tab of the “Settings” menu.

This will likely solve the problem; however, this setting works as a ‘suggestion’ rather than a rule, and some of your content may end up being indexed anyway.

To ensure absolute privacy for your files and content, you will have to take one final step: password protecting your site, either through cPanel management tools, if available, or through a simple plugin.

Likewise, removing this tag from your content can be done by removing the password protection and unchecking the visibility setting.

Checking for NoIndex on Squarespace

Squarespace pages are also easily noindexed using the platform’s Code Injection capability. Like WordPress, Squarespace pages can be blocked from routine searches using password protection; however, the platform also advises against taking this step to protect the integrity of your content.

By adding the NoIndex line of code within each page you want to hide from internet search engines and to each subpage below it, you can ensure the safety of secured content that should be barred from public access. Like other platforms, removing this tag is also fairly straightforward: simply using the Code Injection feature to take the code back out is all you will need to do.

Squarespace is unique here: its competitors offer this option primarily as part of the settings in their page management tools, while Squarespace allows you to edit the code directly. This means you can see the exact change you are making to your page’s content, unlike with the others in this space.

Checking for NoIndex on Wix

Wix also allows for a simple and fast fix for NoIndexing issues. In the “Menus & Pages” settings, you can simply deactivate the option to ‘show this page in search results’ if you want to NoIndex a single page within your site.

As with its competitors, Wix also suggests password protecting your pages or entire site for extra privacy. However, Wix departs from the others in that the support team does not prescribe parallel action on both fronts in order to secure content from the crawler. Wix makes a particular note about the difference between hiding a page from your menu and hiding it from search criteria.

This is particularly useful advice for less experienced website builders, who may not initially understand the difference: removing a page from your site menu makes it unreachable through site navigation, but not through a Google search.


Great Place to Work® Names LinkGraph One of the Fortune Best Workplaces in Advertising & Marketing™ in 2022

LinkGraph is honored by Great Place to Work® and Fortune as one of the 2022 Best Workplaces in Advertising & Marketing™. LinkGraph is ranked #27 on the list of 50 U.S. companies. Earning a spot means that LinkGraph is one of the best companies to work for in the country. LinkGraph was also Certified™ by Great Place to Work® in 2021. 

The analysis of survey responses from more than 9,000 workers of Great Place to Work-Certified organizations in the advertising and marketing sector served as the foundation for the Best Workplaces in Advertising & Marketing award. In that study, 93% of the workers at LinkGraph believed the company was a wonderful place to work. 

“This is the first year Fortune has created a category for this type of award, and LinkGraph is proud to be one of the first recipients,” said Manick Bhan, Founder and CTO of LinkGraph. “Here at LinkGraph, we really work hard to create an environment that supports and fosters creativity and productivity in an uplifting space.”

About LinkGraph

LinkGraph pairs award-winning SEO strategies with cutting-edge software to help professionals increase organic traffic and conversions. We specialize in technical SEO, link building, paid media management, and conversion rate optimization.

Using our enterprise SEO software platform, SearchAtlas, site owners can execute a comprehensive on-page and off-site strategy, all from the convenience of a single platform.

So whether you want to work with our SEO professionals or take ownership of your SEO with our powerful software, LinkGraph is here to help you take your website to the top of the SERPs.

For more information on LinkGraph’s award-winning campaign strategies, book a meeting with one of our SEO experts.

About the Best Workplaces in Advertising & Marketing™

Great Place to Work® selected the Fortune Best Workplaces in Advertising & Marketing™ by gathering and analyzing confidential survey responses from over 9,000 employees from Great Place to Work-Certified™ companies in the advertising and marketing industry. Company rankings are derived from 60 employee experience questions within the Great Place to Work® Trust Index™ survey. Great Place to Work determines its lists using its proprietary For All™ methodology to evaluate and certify thousands of organizations in America’s largest ongoing annual workforce study, based on over 1 million survey responses and data from companies representing more than 6.1 million employees, this year alone.

How to Optimize Internal Links for SEO

Internal links allow Google to crawl your site more effectively and rank your pages more accurately.

Your website’s internal links not only improve the user experience; they also communicate your site architecture to web crawlers and show how your web content interrelates.

Without a strong, strategic internal linking structure, your site may lose SEO value and struggle to rank in search engines.

Here is a guide on SEO best practices for internal links, and some mistakes you might be making that could be impacting your organic visibility.

What are Internal Links?

An internal link is a hyperlink that points to a different page on the same website.

[Image: a web page with two internal links pointing to two other pages on the same website]

They are commonly used to help users navigate between different pages of a website, but can also be used for SEO purposes.

Internal links help to keep visitors on your website longer, which can improve your site’s SEO performance.
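To make the internal/external distinction concrete, here is a minimal sketch (not from the original article) of how a link can be classified using only Python’s standard library; the `example.com` domain and sample paths are hypothetical.

```python
from urllib.parse import urljoin, urlparse

def is_internal(link: str, site_root: str) -> bool:
    """Return True if `link` points to a page on the same site as `site_root`.

    Relative links (e.g. "/blog/post/") resolve against the site root,
    so they count as internal.
    """
    resolved = urljoin(site_root, link)
    return urlparse(resolved).netloc == urlparse(site_root).netloc

# A relative link resolves to the same host: internal.
print(is_internal("/blog/internal-links-for-seo/", "https://example.com/"))  # True
# An absolute link to another host: external.
print(is_internal("https://other-site.com/page/", "https://example.com/"))   # False
```

Comparing on the hostname (`netloc`) is what makes relative links and same-domain absolute links both count as internal.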

What are the Different Types of Internal Links?

There are a few different types of internal links you likely have on your website right now. 

Some of them will bring more SEO value than others, so it’s good to know the difference between each.

Menu/Navigation

The links in your menu/navigation bar are some of the most important internal links. These links remain consistent no matter where a site visitor travels across your website.

[Image: a homepage with a red box around the navigation menu]

They should point to the most important pages (e.g. product categories, primary services, blog, about, etc.) and should give users a high-level overview of what type of content is on your website.

Because the majority of your link equity most likely sits on your homepage, these internal links distribute a significant amount of PageRank across your website. Make sure the pages linked there are your most important ones, the pages you want to rank.

The internal links you include here also show first-time visitors where to go next.

Footer Links

Footer links sit at the bottom of your web pages. Like the nav bar, the footer is an anchor that remains consistent across your website.

[Image: a web page with a red box around the footer links]

There may be some repetition between the links in your navigation menu and your footer, and that’s okay. Footer links also send quite a bit of link equity from your homepage to the pages linked there.

If users reach the bottom of a web page and have not found a place to click next, you want them to find what they are looking for in the footer.

Buttons/CTA Links

The internal links that you include on your buttons or CTAs are important for shaping the user or buyer journey across your website and for conversion rate optimization.

[Image: a web page with a red box around the button and CTA links on the page]

Most likely, CTA links are pointing to web pages that push users further down the conversion funnel, whether that is to a web page to book a meeting, request a demo, submit an email address, or add an item to a cart.

The anchor text of these internal links will be primarily user- and conversion-focused.

Sidebar Internal Links

Sidebar links are often used to offer users relevant content and suggest which page they could visit next.

For publishers that feature a lot of content, sidebar links help site visitors who are not looking for anything specific but are simply exploring the various content you offer.

[Image: a web page with a red box around the sidebar links]

Sidebar links are very common on news sites, recipe sites, or those that want the opportunity to show users multiple pages (and thus multiple advertisements).

In-Article Links

In-article links are those that are included in the body of blog posts or long-form articles. They point to relevant content that can provide users with more context or information.

[Image: a blog post with red boxes around two internal links]

These types of links are very common because they have loads of SEO value. 

If you are not linking to other relevant articles on your website within each blog post, you’re missing out on opportunities to improve your ranking positions and search engine visibility.

Why are Internal Links Important for SEO?

The SEO benefits of internal links are significant; they can improve your search engine visibility in several ways.

1. Direct Users & Google to your Most Important Pages

Internal links let Google know the most important content on your website. You can use internal links to help Google understand which pages to promote in the SERPs.

2. Help Google Find and Index your Pages

When indexing sites, search engine crawlers begin on your homepage and spread out from there, using internal links as their navigational guide. 

When you have a strong internal linking system, Google is more likely to find and index all your URLs, so your newest content has ranking potential.

3. Communicate Topical Relevance Through Anchor Text

You may wonder how Google knows what your site and landing pages are about. 

Google’s web crawlers use the anchor text from internal linking to understand the purpose and meaning of your content and its relevance to specific search terms.

Anchor text best practices can improve your SEO.
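One way to review how you describe each page is to collect every anchor text used for each internal link target. The sketch below (an illustration, not part of the original guide, with hypothetical sample markup) uses Python’s built-in HTML parser:

```python
from html.parser import HTMLParser

class AnchorTextCollector(HTMLParser):
    """Map each link destination (href) to the anchor texts used for it."""

    def __init__(self):
        super().__init__()
        self.by_target = {}
        self._current_href = None
        self._text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._current_href = dict(attrs).get("href")
            self._text_parts = []

    def handle_data(self, data):
        # Only accumulate text while inside an <a> element.
        if self._current_href is not None:
            self._text_parts.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href:
            text = "".join(self._text_parts).strip()
            self.by_target.setdefault(self._current_href, []).append(text)
            self._current_href = None

collector = AnchorTextCollector()
collector.feed('<p>Read our <a href="/blog/anchor-text/">anchor text guide</a> '
               'or the full <a href="/blog/anchor-text/">guide</a>.</p>')
print(collector.by_target)  # {'/blog/anchor-text/': ['anchor text guide', 'guide']}
```

Grouping by destination makes it easy to spot pages that are always linked with vague anchors like “click here” or “guide”.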

4. Maximized Crawl Budget

Strategic use of noindex and nofollow tags with your internal links can help you ensure that Google is crawling and indexing your most important pages.

For pages that don’t need to appear in search results, like thank-you or confirmation pages, a noindex directive keeps low-value or low-converting pages out of Google’s index, and adding nofollow to the internal links pointing at them discourages crawlers from following those links.

This leaves more of your website’s crawl budget for Google to spend on the pages you do want to rank.
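As an illustration (a sketch, not part of the original guide), links carrying a `rel="nofollow"` attribute can be spotted with Python’s built-in HTML parser; the sample markup below is hypothetical.

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Collect (href, is_nofollow) pairs from an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href:
            rel = attrs.get("rel") or ""
            # rel can hold multiple space-separated tokens, e.g. "nofollow noopener".
            self.links.append((href, "nofollow" in rel.split()))

auditor = LinkAuditor()
auditor.feed("""
<a href="/pricing/">Pricing</a>
<a href="/thank-you/" rel="nofollow">Thank you page</a>
""")
print(auditor.links)  # [('/pricing/', False), ('/thank-you/', True)]
```

A report like this makes it easy to confirm that only low-value destinations (confirmation pages, internal search results, and so on) carry the nofollow attribute.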

5. Better User Experience

Internal links also make your website a better place for site visitors.

Navigation links guide users along a conversion journey after they find you in the SERPs, and in-content links can point them to other relevant pages.

6. Displays Topical Depth and Breadth

Interlinking your topically related pages can turn your website into a topical powerhouse.

Having lots of internal links in your blog posts to related topics or subtopics shows Google crawlers that your website has topical authority, and is a go-to expert source in a particular industry niche or topic area.

How to Analyze Your Internal Links for SEO

If you are not sure whether or not you have internal link issues on your website, a site crawler or site audit tool can help you identify any issues.

To use SearchAtlas’ free site auditor, register for a trial of our SEO software.

To run a site audit, do the following.

  1. Register for an account with SearchAtlas. [Image: SearchAtlas registration page]
  2. Navigate to the Site Audit tool in your dashboard. [Image: SearchAtlas tutorial page with red arrows pointing to the Site Auditor tool]
  3. Enter your homepage URL into the Auditor and click “Audit Site.” [Image: the Site Auditor text field and “Audit Site” button]
  4. Select your preferred User Agent, Crawl Speed, and Crawl Budget. [Image: SearchAtlas Site Auditor settings]
  5. Wait for your audit to generate. Depending on the size of your website, it may take up to a day for the auditor to crawl all of your pages. You’ll receive an email when your site audit is ready. [Image: SearchAtlas Site Auditor email alert]
  6. Look for your homepage in the Sites List and click “View Audit.” [Image: a completed site audit in the SearchAtlas Sites List]

If you are not comfortable using our software on your own, you can also order an Internal Linking Analysis in our order builder. Our technical SEO experts will determine if there are any link issues on your site and provide a roadmap for how to optimize your internal linking profile for better organic visibility.

Common Issues with Internal Links

You can use the SearchAtlas Site Auditor to see whether or not you are utilizing internal linking best practices. 

Our report will flag any internal linking issues that may be preventing your web pages from earning higher keyword rankings in the SERPs.

Not Enough Internal Links

One of the most common mistakes that new or unoptimized websites make is that they do not include enough internal links on their web pages.

If your web pages include too few internal links, the issue will be flagged in your SearchAtlas site audit report.

[Image: the “too few internal links” issue in the SearchAtlas Site Auditor]

This may or may not be an easy fix, depending on the number of web pages you have on your website. 

To resolve the issue, do the following:

  • If you already have relevant content on your website but are simply not linking to it, adding internal links to those pages is the first step to resolving this issue.
  • If your website is newer, you will first need to write and publish relevant, high-quality content; otherwise the links will bring little SEO value. Once that content is live, you can add internal links pointing to it.

Too Many Internal Links

Although you want to include internal links on your web pages, too many outlinks on a page (both external and internal) can look like over-optimization to Google.

[Image: the “too many outlinks” issue in the SearchAtlas site audit report]

Make sure that you are only including links to relevant, helpful content. And don’t overdo it by stuffing your navigation menu or footer with too many internal links. 

Reserve those links for the most important pages on your website – the ones you really want to rank in the SERPs.

Broken Internal Links

Another very common issue that may be flagged in your site audit report is broken internal links. 

[Image: the “broken internal links” issue message in the SearchAtlas Site Auditor]

A broken internal link occurs when you move or delete a page on your website without updating the internal links that pointed to it with a new destination URL.

As a result, those internal links point to 404 pages. Sending Google crawlers and users to a dead page is not good for SEO or for the user experience.

Broken internal links are very common with large enterprise or ecommerce websites that are constantly updating their content. 

To resolve a broken internal link, take one of the following actions:

  1. Restore the dead/deleted page
  2. Update the internal link with a new destination URL
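Conceptually, finding dead links amounts to comparing each page’s internal links against the set of URLs that still exist on the site. Here is a minimal sketch of that comparison (the data below is hypothetical; a real audit tool would crawl the site to build these inputs):

```python
def find_broken_links(page_links, live_urls):
    """Return (source_page, dead_link) pairs for internal links that
    point to URLs no longer on the site (i.e., links that would 404)."""
    return [
        (page, link)
        for page, links in page_links.items()
        for link in links
        if link not in live_urls
    ]

# URLs that currently resolve on the site.
live_urls = {"/", "/blog/", "/blog/internal-links-for-seo/"}

# Internal links found on each page; "/old-guide/" was deleted.
page_links = {
    "/": ["/blog/", "/old-guide/"],
    "/blog/": ["/blog/internal-links-for-seo/"],
}

print(find_broken_links(page_links, live_urls))  # [('/', '/old-guide/')]
```

Reporting the source page alongside the dead link matters, because the fix is applied on the page that contains the link, not on the missing page itself.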

Internal Links with Redirects

Sometimes, webmasters may not be worried about internal links because they use 301 redirects whenever they move or delete a page.

Although 301 redirects preserve the SEO value of links from other websites that point to your web pages, internal links that pass through 301 redirects are not considered an SEO best practice.

[Image: internal links with redirects in the SearchAtlas site audit report]

Why? Because redirected internal links slow down your website and force Google’s crawlers to move through it at a slower pace.

Whenever you move a page, part of your website maintenance should be updating any internal links to point to the new destination URL.

This shows Google crawlers that you are an attentive webmaster, and thus makes them more likely to promote your pages.
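The fix can be pictured as following each redirect chain to its final destination and rewriting the internal link to point there directly. A minimal sketch with a hypothetical redirect map:

```python
def final_destination(url, redirects, max_hops=10):
    """Follow a redirect map (old URL -> new URL) until the final
    destination is reached, so internal links can be updated to point
    there directly instead of passing through one or more 301s."""
    hops = 0
    # max_hops guards against redirect loops (a -> b -> a).
    while url in redirects and hops < max_hops:
        url = redirects[url]
        hops += 1
    return url

# A page was moved twice; both old URLs 301-redirect onward.
redirects = {
    "/old-guide/": "/new-guide/",
    "/new-guide/": "/guides/internal-links/",
}

print(final_destination("/old-guide/", redirects))  # /guides/internal-links/
```

Updating every internal link to the chain’s endpoint removes the extra hops for both users and crawlers.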

Unoptimized Anchor Text

The anchor text that you use to internally link your pages is also important to your keyword rankings and your user experience.

Anchor text lets Google know what your other web pages are about, how your content interrelates, and displays the many valuable pieces of content that live permanently on your website.

For more details on anchor text best practices, read this anchor text guide.

Final Thoughts on Internal Links

Optimizing your website’s internal link profile is essential if you want to rank for high-value keywords in your industry.

Taking the time to audit your internal links and repair any issues can make all the difference in your ranking positions.

Still not sure how to resolve internal linking problems? Connect with our SEO strategists to see how we can help.

The post How to Optimize Internal Links for SEO appeared first on LinkGraph.

LinkGraph Won the Clutch 2022 Award as One of New York’s B2B Leaders
https://linkgraph.io/media/linkgraph-won-the-clutch-2022-award-as-one-of-new-yorks-b2b-leaders/
Mon, 29 Aug 2022

[Image: Clutch Awards 2022]
Clutch Leader Awards Recognizes LinkGraph As A New York B2B Leader

Washington, D.C., August 23, 2022 — Clutch, an independent B2B ratings and reviews site, announces its highly anticipated leaders’ rankings for New York’s top B2B firms in 2022. For the 3rd year in a row, LinkGraph is acknowledged as an exemplary leader in the B2B space.

The Clutch Leader Awards highlight the exceptional contributions of New York’s top service providers across different industries. Every company ranked was thoroughly vetted and investigated by the platform’s analysts over the course of the year, and various performance metrics, such as customer reviews and social media presence, were carefully taken into account for the distinguished honor.

“To be recognized as an exemplary trailblazer and innovative brand by Clutch three years in a row is a prestigious honor,” said Founder and CTO of LinkGraph, Manick Bhan. “Receiving this award affirms our commitment to our team and clients while establishing the efficacy of our culture and business model. We could not do this without our amazing staff and clients.”


About Clutch Awards

Clutch is the leading ratings and reviews platform for IT, marketing, and business service providers. Each month, Clutch releases sets of awards for different localities. Over half a million buyers and sellers of services use the Clutch platform each month, and the user base is growing more than 50% a year. Clutch has been recognized by Inc. Magazine as one of the 500 fastest-growing companies in the U.S. and has been listed as a top 50 startup by LinkedIn.

The post LinkGraph Won the Clutch 2022 award as one of New York’s B2B Leaders appeared first on LinkGraph.
