Tag Archive for: Google Webmaster Tools

Technical SEO can be interesting, but no one likes coming across the same problems time and time again. That’s why it’s shocking how many websites are struggling with the same issues.

Here are some of the most frequent issues found during a site audit. We also have the solutions, so you can be prepared if you come across any of them.

1) Uppercase vs. Lowercase URLs – This happens most often on sites built on .NET. The server is configured to respond to URLs containing uppercase letters and doesn’t redirect or rewrite them to lowercase versions. The issue is slowly disappearing because search engines have become much better at recognizing the canonical version and disregarding the copies, but just because it is going away doesn’t mean it should be ignored. Search engines still make mistakes here, so don’t rely on them.

Luckily, there is an easy fix for this issue in the form of the URL Rewrite module, which solves the problem on IIS 7 servers. There is a convenient option inside its interface that lets you enforce lowercase URLs. Enabling it adds a rule to the web.config file, and the problem is gone.
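
Once the rule is in place, you can verify it is working by requesting an uppercase variant and confirming it 301s to the lowercase URL. This is only a sketch: the URLs are placeholders and the `requests` library is assumed.

```python
import requests

# Hypothetical uppercase variant and its expected lowercase target; swap in real URLs.
upper_url = "https://www.example.com/About-Us/"
expected = "https://www.example.com/about-us/"

resp = requests.get(upper_url, allow_redirects=False, timeout=10)
location = resp.headers.get("Location", "")

if resp.status_code == 301 and location.lower().rstrip("/") == expected.rstrip("/"):
    print("Lowercase rewrite rule is working.")
else:
    print(f"Check the rule: got {resp.status_code}, Location: {location or 'none'}")
```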

2) Multiple Versions of the Homepage – If you are auditing a .NET website, check whether www.example.com/default.aspx exists. Most likely, it does. The page is a duplicate of the homepage that search engines often find via the navigation or XML sitemaps. Other platforms instead create URLs like www.example.com/index.html or www.example.com/home. Most contemporary search engines fix the problem automatically, but why not make sure there isn’t an issue to be fixed in the first place?

The best way to find these duplicates is to crawl the site, export the crawl to CSV, and sort or filter by the meta title column. Search for the homepage title and you’ll quickly spot duplicates of your homepage. The easy fix is to add a 301 redirect from each duplicate page to the correct version.

You can also do a crawl with a tool like Screaming Frog to find the internal links that point to the duplicate pages, then update those links so they point directly to the correct URL. Internal links that go via a 301 can cost you some link equity.
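
Once the redirects and internal links are cleaned up, a quick script can confirm each duplicate homepage URL now returns a 301 to the canonical version. A sketch only: the variant paths are the common examples mentioned above, and the `requests` library is assumed.

```python
import requests

# Common duplicate-homepage variants (placeholders; adjust for your own site).
CANONICAL = "https://www.example.com/"
VARIANTS = [
    "https://www.example.com/default.aspx",
    "https://www.example.com/index.html",
    "https://www.example.com/home",
]

for url in VARIANTS:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code == 301 and location.rstrip("/") == CANONICAL.rstrip("/"):
        print(f"OK   {url} -> 301 to homepage")
    else:
        print(f"FIX  {url} returned {resp.status_code} (Location: {location or 'none'})")
```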

3) Query Parameters Added to the End of URLs – This issue is most common on database-driven eCommerce websites because they have tons of attributes and filtering options. That means you will often find URLs like www.example.com/product-category?color=12, where the products are filtered by color. Filtering like this can be good for users but bad for search: unless your customers actually search for the specific product by color, the URL is probably not the best landing page to target with keywords.

Another issue that tends to show up in crawls is these parameters being combined, and the worst case is when they can be combined in different orders but return the same content, such as:

www.example.com/product-category?color=12&size=5 

www.example.com/product-category?size=5&color=12

Because these are different URLs that return the same content, they are treated as duplicate content. It is also important to remember that Google allocates crawl budget based on PageRank, so make sure your budget is being used efficiently.
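
One way to spot these duplicates in a crawl export is to normalize the query string so that parameter order no longer matters; URLs that normalize to the same string are candidates for consolidation. A minimal sketch using only the Python standard library and the example URLs above:

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

def normalize(url):
    """Sort query parameters so URLs differing only in parameter order compare equal."""
    parts = urlsplit(url)
    query = urlencode(sorted(parse_qsl(parts.query)))
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, ""))

a = "https://www.example.com/product-category?color=12&size=5"
b = "https://www.example.com/product-category?size=5&color=12"

print(normalize(a) == normalize(b))  # True: different URLs, same content
```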

To begin fixing this issue, you need to decide which pages you want Google to crawl and index. Base that decision on keyword research, and cross-reference all database attributes with your core target keywords to figure out which attributes people actually use to find products. You may find high search volume for certain combinations, for example “Nike” + “Running Shoes.” If so, you want a landing page for “Nike Running Shoes” to be crawlable and indexable. Make sure that database attribute has an SEO-friendly URL and that those URLs are part of the navigation structure of your site, so that PageRank flows to them and users can find the pages easily.

The next step depends on whether you want the specific attribute indexed or not. If the URLs are not already indexed, add the URL structure to your robots.txt file and test your pattern carefully to make sure you don’t block anything by accident, and use the Fetch as Google feature in Webmaster Tools to check. Remember, however, that if the URLs are already indexed, adding them to your robots.txt file will not remove them.
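
Before deploying a Disallow rule, it helps to test the pattern against URLs you definitely want crawled as well as the ones you want blocked. The sketch below uses a plain Python regex as a rough stand-in for a robots.txt wildcard rule; the pattern and URL paths are illustrative only.

```python
import re

# Rough equivalent of a robots.txt rule such as: Disallow: /product-category?
block_pattern = re.compile(r"^/product-category\?")

urls = {
    "/product-category?color=12": True,           # should be blocked
    "/product-category?size=5&color=12": True,    # should be blocked
    "/nike-running-shoes": False,                  # must stay crawlable
}

for path, should_block in urls.items():
    blocked = bool(block_pattern.search(path))
    status = "ok" if blocked == should_block else "MISMATCH"
    print(f"{status}: {path} -> {'blocked' if blocked else 'allowed'}")
```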

If the URLs are already indexed, you will unfortunately need the rel=canonical tag. If you inherit one of these situations and are not able to fix the root cause, the rel=canonical tag at least covers the issue until it can be solved properly. Add the tag to the URLs you do not want indexed and point it at the most relevant URL you do want indexed.
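
To spot-check that the tag is in place, you can fetch a parameterized URL and look for the canonical link element in the HTML. A rough sketch, assuming the `requests` library and that `rel` appears before `href` in the tag; a proper HTML parser would be more robust.

```python
import re
import requests

# Hypothetical filtered URL and the canonical page it should point to.
filtered_url = "https://www.example.com/product-category?size=5&color=12"
expected = "https://www.example.com/product-category"

html = requests.get(filtered_url, timeout=10).text
match = re.search(
    r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
    html,
    re.IGNORECASE,
)

found = match.group(1).rstrip("/") if match else None
print("OK" if found == expected else f"Check canonical: found {found or 'no tag'}")
```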

4) Soft 404 Errors – A soft 404 is a page that looks like a 404 but returns HTTP status code 200. The user sees something like “Sorry, the page you requested cannot be found,” but the 200 code tells search engines the page is working fine. This mismatch means pages can be crawled and indexed when you don’t want them to be, and it makes genuinely broken pages harder to find.

Thankfully, this is an easy fix: any developer can set the error page to return a 404 status code instead of a 200. You can use Google Webmaster Tools to find any soft 404s Google has detected, or perform a manual check by visiting a broken URL and seeing what status code is returned.
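
The manual check is as simple as requesting a URL that should not exist and looking at the status code that comes back. A minimal sketch, assuming the `requests` library and a placeholder URL:

```python
import requests

# A URL that should not exist on your site (placeholder).
missing_url = "https://www.example.com/this-page-should-not-exist"

resp = requests.get(missing_url, allow_redirects=False, timeout=10)
if resp.status_code == 200:
    print("Soft 404: the error page returns 200 instead of 404.")
else:
    print(f"Status code returned: {resp.status_code}")
```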

5) 302 Redirects Instead of 301 Redirects – This is an easy mistake for developers to make, because users can’t tell anything is wrong. A 301 redirect is permanent; search engines recognize this and pass link equity through to the destination URL. A 302 redirect is temporary; search engines expect the original page to return soon, so they leave link equity where it is.

Find 302s by using a deep crawler like Screaming Frog, which lets you filter by 302s and check each one individually. You can then ask your developers to change any that should be 301s.
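
For individual URLs, you can also inspect the redirect chain directly and see which hops are temporary. A small sketch assuming the `requests` library and a placeholder URL:

```python
import requests

# Placeholder URL for a page that has been redirected.
old_url = "https://www.example.com/old-page"

resp = requests.get(old_url, timeout=10)  # follows redirects by default
for hop in resp.history:
    kind = "permanent (301)" if hop.status_code == 301 else f"temporary ({hop.status_code})"
    print(f"{kind}: {hop.url} -> {hop.headers.get('Location')}")
print(f"Final URL: {resp.url} ({resp.status_code})")
```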

6) Broken or Outdated Sitemaps – XML sitemaps may not be essential, but they are very useful for making sure search engines can find all the URLs that matter, and they help show search engines which pages are important. If a sitemap is left to go stale, it ends up containing broken links and missing new content and URLs. Keeping sitemaps updated is especially important for big sites that add new pages frequently, and Bing also penalizes sites with too many issues in their sitemaps.

Audit your current sitemap for broken links, then talk to your developers about making your XML sitemap dynamic so it updates frequently. How frequently depends on your resources, but doing this will save you a lot of trouble later.
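
Auditing a sitemap for broken links is easy to script: fetch the XML, pull out each `<loc>`, and check the status code it returns. A sketch assuming a standard sitemap at a placeholder location and the `requests` library:

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder location
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(f"{status}: {url}")
```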

You may well come across other issues while doing an audit, but hopefully, if you run into any of these, you are now prepared to fix them.

 

For more Technical SEO Problems, read this article by Paddy Moogan at SEOmoz.

 

So you’ve been getting steady traffic on your website for a while now. You are eyeing expansion and everything seems fine. Suddenly, your traffic nose-dives. There are a few reasons this can happen; some are very easy to fix, while others are more problematic. Here is a six-question checklist to help identify what is causing the issue and how to solve it. Before going through the checklist, make sure this is actually a search issue: in Google Analytics, go to Traffic Sources->Sources->Search->Organic and select a range of a few months to see a chart of your organic traffic. If the sudden drop shows up on that chart, you likely have a serious search issue.

  1. Has the Analytics Tool Been Removed or Altered? – This tiny issue is by far the easiest to fix. It happens frighteningly easily and frequently, but an analytics checker or debugger tool will quickly uncover it (a simple manual spot-check is sketched after this list).
  2. Have There Been Any Significant Website Changes? – Sometimes just redesigning your website can shatter your rankings; even removing content or restructuring the order your content appears in can have big effects. You can usually identify this by checking your analytics for changes made right before the drop in traffic.
  3. Have You Been Hacked? – Knowing when you’ve been hacked is not as easy as the movies might make you think. Some spam attacks don’t affect your active pages at all, but instead create new directories with spam content, which makes the hack harder to find; if a search engine picks up on it, the website is often penalized. Google and Bing Webmaster Tools can help you quickly find out if you are vulnerable to hackers. You can also do a manual “site:[your URL here]” search and look for meta descriptions that look like spam. The solutions to being hacked vary, but once you’ve uncovered the issue, you can look for specific fixes; usually they involve loading a clean backup and changing all passwords.
  4. Has There Been a Major Algorithm Update? – If none of the above questions have helped, it’s quite possible your traffic issues are being caused by an update. Look through available and reputable resources to see if other websites are dealing with similar issues. If your problems are caused by algorithmic changes, it is time to seek professional help. Changing code won’t fix the problem. Instead, you will likely have to make large shifts in company practices, strategies, content and link building.
  5. Has the Site Been Hit With a Ranking Penalty? – If an algorithmic update isn’t your problem, chances are your website has been penalized. A penalty usually means there has been a sudden spike in bad links or spam content. Review who is responsible for keeping your site in line with SEO policies and make sure they are up to date on best practices.
  6. Are You The Victim of Negative SEO? – Negative SEO happens when a competitor points automated spammy links at your website, causing site-wide ranking drops. Bing Webmaster Tools has a Disavow Links tool, but unless you’ve seen a sudden rise in spam content or bad links, you can ignore it.
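
For the first question on the list, a quick way to spot-check whether the tracking code is still present is to fetch a few key pages and search the HTML for the analytics snippet. The page URLs are placeholders and the marker strings are just common fragments of the Google Analytics snippet, so adjust both for your own setup; the `requests` library is assumed.

```python
import requests

# Placeholder pages to spot-check; marker strings are common fragments of the
# Google Analytics snippet and may differ depending on which version you use.
pages = ["https://www.example.com/", "https://www.example.com/products/"]
markers = ("google-analytics.com", "_gaq.push", "ga('create'")

for page in pages:
    html = requests.get(page, timeout=10).text
    present = any(m in html for m in markers)
    print(f"{'ok' if present else 'MISSING'}: analytics code on {page}")
```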

These steps can help you fix sudden traffic drops but you may have to hire professional assistance if the first two questions didn’t help. Luckily, these problems are fixable and soon your site will be back up to its steady flow of traffic.

You can read more about identifying search traffic drops in John Lynch’s article over at Search Engine Watch.

 

Google Webmaster Tools has always been a way to see some of the backlinks to pages on a site you control. They’ve recently made a change that adds a “link download option,” letting you download a full list of backlinks to your site, including a column with the date each link was discovered.

This way you can check and see how old your oldest links are as well as what links have surfaced recently.

To find this option, go to your site inside Webmaster Tools and click on Traffic->Links to Your Site. From there, choose “More >>” under either “Who links the most” or “Your most linked content”. On the following page you have three options:

  1. Download this table
  2. Download more sample links
  3. Download latest links
The new option is the third in this list. It gives you a full list of all the sites Google has recorded linking to you, plus the date each link was discovered.
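
Once you have the download, a few lines of Python are enough to sort the links by discovery date and see what has surfaced most recently. The file name and column headers below are assumptions, so check them against your actual export, and parse the dates explicitly if they don’t sort correctly as strings.

```python
import csv

# Placeholder file name; adjust to match your actual download.
with open("latest_links.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# Assumed column names: "Links" for the linking URL, "First discovered" for the date.
rows.sort(key=lambda r: r.get("First discovered", ""))

for row in rows[-10:]:  # ten most recently discovered links
    print(row.get("First discovered", "?"), row.get("Links", "?"))
```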

You can see pictures and other details at Search Engine Roundtable.

The stance Google’s taken on search engine optimization has always been a little hazy.  However, they do acknowledge many elements of SEO as being important to good web design (as implied by all the information given within the Google Webmaster Tools pages).  But it appears they’re taking things a step further.

Google has begun a small project showing how to improve SEO for a few selected pages in countries in northern Europe.  They will then put up a post about each page describing their findings and what they recommend to improve its positions in the listings.

This isn’t exactly offering SEO services for clients (like our own Tulsa SEO services), but it is showing that they acknowledge how important SEO is for improving results, and are even willing to help by giving some tips.

Mark Jackson has a good article detailing more on this in ClickZ.

Google has a number of search operators available right inside the search interface.  You can see all the indexed pages on a single site, search within specific title tags, and with the “link:” command you can see links to a particular page or site.

However, this command is by no means the main tool you should use to research backlinks.  There are several holes in it, and Google themselves advise taking it with a grain of salt.  SEOmoz has a great post about several misconceptions surrounding this command.

So how do you get a decent report on backlinks?  There’s no perfect tool, but the two I’d recommend are Google Webmaster Tools and Yahoo Site Explorer.  Both give far more information than the “link:” command and a much better sense of just what kind of backlinks a site has, which, as we all know, is indicative of the quality of SEO for a particular page.