September 1, 2015 by Roman Ožana

Avoid SEO Disaster, Part 1 – Indexing and Redirection

One of the most important parts of a strong SEO strategy is being able to connect changes on your website to changes in its organic search traffic. In a perfect world, the things we fix on web pages would stay fixed. Unfortunately, this is easier said than done, especially in search engine optimization, where many different factors can affect your search performance.

Regularly monitoring changes on your site can help you isolate some of these factors, verify that fixes stay in place, and steer clear of possible indexing issues.

This post is part one of a series that shows some of the ways Testomato can lend a helping hand in preventing SEO disasters and achieving better website performance. In this post, we’ll go over how Testomato can help prevent accidental noindexing and improper redirects.

SEO No-Nos: Accidental NoIndex and Disallow

In 2014, Groupon ran a risky experiment and completely de-indexed their site for one day to find out how much of their “direct” site traffic was actually organic search.

The results? Roughly 60% of the traffic reported as “direct” was actually organic traffic from Google.

Being indexed is critical for appearing in search results. Making changes that create indexing problems is not only a major SEO issue – it can cost your site traffic it may never get back.

For example, pages that have this meta-robots tag will not be indexed by search engines:

<meta name="robots" content="NOINDEX,FOLLOW" />

And this happens more often than you’d think.

It’s easy to do. Developers often build their sites in staging areas and hide these projects from search engines. In the rush of pushing a new update live, however, it’s easy to forget to re-allow indexing of your pages.
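If you’d like to spot-check a page by hand, a short script can fetch it and look for a noindex directive. Here’s a minimal sketch using only Python’s standard library (the URL is a placeholder):

import re
import urllib.request

URL = "https://www.example.com/"  # placeholder – use a page you care about

html = urllib.request.urlopen(URL).read().decode("utf-8", errors="replace")

# Look for a <meta name="robots" ...> tag anywhere in the page source.
meta = re.search(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.IGNORECASE)

if meta and "noindex" in meta.group(0).lower():
    print("WARNING: page is marked noindex:", meta.group(0))
else:
    print("OK: no noindex directive found")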

In addition, using a Disallow directive in the robots.txt file to tell search engines not to crawl specific content can be very dangerous if used incorrectly. You could accidentally disallow your entire site.

Here’s an example of a file that disallows the entire site:

User-agent: *
Disallow: /

This means your site will not be crawled, your content won’t be indexed, and you won’t show up in search results.
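You can verify this yourself with Python’s built-in robots.txt parser. A minimal sketch, assuming a placeholder domain and a few representative paths:

from urllib import robotparser

SITE = "https://www.example.com"  # placeholder – use your own domain

rp = robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()

# Googlebot stands in here for any search engine crawler.
for path in ["/", "/category/", "/products/"]:
    if not rp.can_fetch("Googlebot", SITE + path):
        print("WARNING:", path, "is disallowed for Googlebot")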

Solution: Monitor for changes to robots.txt

Using Testomato, you can set up rules that monitor specific pages for specific strings of text, including changes to robots.txt.

To check for the inclusion of index meta tags on category pages, you can create the following rule:

index

To check that your robots.txt file isn’t blocking search engines, you can create the following rule:

robots.txt

If either of these checks fails, you’ll immediately receive an email alert or a browser notification via Testomato’s Chrome extension.

This kind of monitoring alerts you to indexing issues automatically, before you notice a drop in site traffic or a fall in search results.

Improper Redirection of Old Pages

A redirect forwards a visitor from the URL they requested to a different URL.

There are several reasons to redirect a page: your domain changes, you redesign your site with new URLs, a URL is broken, and more.

Whatever the reason, it’s critical to make sure you use the correct redirect method to preserve your website’s ranking and make sure your content can still be found by both users and search engines.

This Moz guide does a great job of summing up the different types of redirection and best practices, but here’s a quick overview of its contents.

  • There are 3 types of redirects (illustrated in the sketch below):
    • 301 – “Moved Permanently”
    • 302 – “Found” or “Moved Temporarily”
    • Meta refresh – executed at the page level (instead of the server level)
  • 301 redirects are considered best practice because they pass roughly 90-99% of link juice (ranking power) to the destination page.
  • 301 redirects tell search engines that a page has permanently changed location and that its content can now be found at the new URL.
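To make the difference concrete, here’s a minimal sketch of a server-level 301 versus a page-level meta refresh, using only Python’s standard library (the paths and target URL are placeholders):

from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectDemo(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old-page":
            # Server-level 301: the status line itself tells crawlers
            # the content has moved permanently.
            self.send_response(301)
            self.send_header("Location", "/new-page")
            self.end_headers()
        else:
            # Page-level meta refresh: an ordinary 200 response whose
            # HTML asks the browser to navigate somewhere else.
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b'<meta http-equiv="refresh" content="0; url=/new-page">')

HTTPServer(("localhost", 8000), RedirectDemo).serve_forever()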

Solution: Monitor for 302 redirects

The biggest problem with 302 redirects is that they do NOT tell search engines that a page has moved permanently and therefore, the qualities of the redirected page are not passed on to the new URL. While visitors will end up on the new page, the old version can still be indexed, causing duplicate content and PageRank splitting issues.

To keep an eye on your 301s to make sure they don’t transform into 302s, you can create the following rule in Testomato:

301 Moved Permanently
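Outside of Testomato, you can run the same check with a short script that refuses to follow redirects and inspects the status code instead. A minimal sketch (the old URL is a placeholder):

import urllib.request
import urllib.error

OLD_URL = "https://www.example.com/old-page"  # placeholder – a redirected URL

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None makes urllib raise the redirect as an HTTPError
    # instead of silently following it.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)
try:
    response = opener.open(OLD_URL)
    print("No redirect at all – got status", response.getcode())
except urllib.error.HTTPError as e:
    if e.code == 301:
        print("OK: 301 Moved Permanently ->", e.headers.get("Location"))
    else:
        print("WARNING: got", e.code, "instead of 301")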

Be sure to check out our Help Docs for more information about how to set up Rules in Testomato.

The next post in this series will tackle SEO issues with changes to critical HTML elements and the importance of uptime.

What SEO problems do you check for the most? 

Please leave us a comment here or tweet us @testomatocom.
