
Roughly a month after Google announced it would be completely shutting down the Google PageRank Toolbar, the service finally went dark over the weekend.

Now, those who have the PageRank toolbar installed will not be able to see the 1-to-10 rankings of sites they visit, making the toolbar officially useless. The move also cuts data from any third-party toolbar that tries to retrieve a PageRank score.

Of course, if you’ve been using the PageRank toolbar recently, you have been working from outdated data. Google hasn’t updated the public PageRank scores in years, so it makes sense to finally shut it down for good.

It must be noted that PageRank isn’t completely gone. It just isn’t available to the public anymore. Google will keep using the PageRank algorithm internally to help evaluate websites.

For more about the legacy of PageRank and the PageRank toolbar, I recommend reading Danny Sullivan’s RIP Google PageRank score: A retrospective on how it ruined the web.

Source: Robert Scoble / Flickr


Though it was once the gold standard for assessing a site’s authority and optimization, PageRank is being retired: Google announced this week that it is shutting down the public PageRank score, and toolbars featuring the tool will no longer display one.

The algorithm-based tool would assess web pages and rank them on a scale of one to ten based on numerous signals that Google uses to evaluate pages. It was an easy-to-understand way to quickly “score” a website and know if optimization, link building, or other marketing efforts were having a positive effect. However, the tool has not been updated in years.

Many webmasters have been holding onto hope that PageRank would get an algorithm update, but the company has been slowly moving away from it for some time. PageRank scores were never displayed inside Google Chrome, and the scores were dropped from Google Search Console in 2009. The Google Open Directory website, which showed PageRank metrics, was also shut down in 2010.

The last bastion of the PageRank score was the IE Google toolbar, which continued to show scores up until now.

Google has confirmed it will not be updating the tool for the public, but it will continue to be used by Google internally.

The move signals a big shift away from Google’s old way of doing things, but in practical terms it will change very little. Since PageRank hadn’t been updated since 2013, SEOs and webmasters have learned to rely on other tools and methods of assessing their marketing efforts.

There has been quite a bit of speculation ever since Matt Cutts publicly stated that Google wouldn’t be updating the PageRank meter in the Google Toolbar before the end of the year. PageRank has been assumed dead for a while, yet Google refuses to issue the death certificate, assuring us it currently has no plans to outright scrap the tool.

Search Engine Land reports that yesterday, speaking at Pubcon, Cutts finally explained what is going on and why there have been no updates. Google’s ability to update the toolbar is actually broken, and repairing the “pipeline” isn’t a major priority by any means. Google already feels that marketers obsess over PageRank far more than its actual importance warrants.

But, Cutts did give some insight as to why Google has been hesitant to completely kill off PageRank or the toolbar. They have consistently maintained they intend to keep the meter around because consumers actually use the tool almost as much as marketers. However, at this point that data is nearly a year out of date, so suggesting consumers are the main motive for keeping PageRank around is disingenuous.

No, it turns out Google actually uses PageRank internally for ranking pages, and the meter has been consistently updated within the company during the entire period the public has been waiting for an update. It is also entirely possible Google likes keeping the toolbar around because Google wants the data users are constantly sending back to the search engine.

While the toolbar may be useful for the company internally, PageRank has reached the point where it needs to be updated or removed. Data from a year ago isn’t reliable enough to offer anyone much value, and most browsers have done away with installable toolbars anyway. If a repair isn’t a high enough priority for Google to get around to it at all this year, it probably isn’t worth leaving the toolbar lingering around forever.

Google is always making changes and updates, but it seems like the past couple weeks have been especially crazy for the biggest search engine out there. There have been tons of changes both big and small, but best of all, they seem to all be part of one comprehensive plan with a long term strategy.

Eric Enge sums up all the changes when he says Google is pushing people away from a tactical SEO mindset toward a more strategic and valuable approach. To understand exactly what that means going forward, it is best to review the biggest changes. By seeing what has been revamped, it is easier to make sense of what the future looks like for Google.

1. ‘(Not Provided)’

One of the biggest changes for both searchers and marketers is Google’s move to make all organic searches secure starting in late September. For users, this means more privacy when browsing, but for marketers and website owners it means we are no longer able to see keyword data from most users coming to sites from Google searches.

This means marketers and site-owners are having to deal with a lot less information, or they’re having to work much harder to get it. There are ways to find keyword data, but it’s no longer easily accessible from any Google tool.

This was one of the bigger hits for technical SEO, though there are many workarounds for those looking for them.

2. No PageRank Updates

PageRank has long been a popular tool for many optimizers, but it has also been commonly used by actual searchers to get a general idea of the quality of the sites they visit. However, Google’s Matt Cutts has openly said not to expect another update to the tool this year, and it seems it won’t be available much longer on any platform. The toolbar has never been available on Chrome, and with Internet Explorer revamping how toolbars work on the browser, it seems PageRank is going to be left without a home.

This is almost good news in many ways. PageRank has always been considered a crude measurement tool, so if the tool goes away, many will have to turn to more accurate measurements.

3. Hummingbird

Google’s Hummingbird algorithm seemed minor to most people using the search engine, but it was actually a major overhaul under the hood. Google vastly improved its ability to understand conversational search, which entirely changes how people can search.

The most notable difference with Hummingbird is Google’s ability to contextualize searches. If you search for a popular sporting arena, Google will find you all the information you previously would have expected, but if you then search “who plays there”, you will get results that are contextualized based on your last search. Most won’t find themselves typing these kinds of searches, but for those using their phones and voice capabilities, the search engine just got a lot better.

For marketers, the consequences are a bit heavier. Hummingbird greatly changes the keyword game and has huge implications for the future. With the rise of conversational search, we will see that exact keyword matches become less relevant over time. We probably won’t feel the biggest effects for at least a year, but this is definitely the seed of something huge.

4. Authorship

Authorship isn’t exactly new, but it has become much more important over the past year. As Google is able to recognize the creators of content, they are able to begin measuring which authors are consistently getting strong responses such as likes, comments, and shares. This means Google will be more and more able to filter those who are creating the most valuable content and rank them highest, while those consistently pushing out worthless content will see their clout dropping the longer they fail to actually contribute.

5. In-Depth Articles

Most users are looking for quick answers to their questions and needs with their searches, but Google estimates that “up to 10% of users’ daily information needs involve learning about a broad topic.” To reflect that, they announced a change to search in early August, which would implement results for more comprehensive sources for searches which might require more in-depth information.

What do these all have in common?

These changes may all seem separate and unique, but there is an undeniably huge level of interplay between how all these updates function. Apart, they are all moderate to minor updates. Together, they are a huge change to search as we know it.

We’ve already seen how link building and over-attention to keywords can hurt your optimization when improperly managed, but Google seems keen on devaluing these search factors even more moving forward. Instead, they are opting for signals which offer the most value to searchers. Their search has become more contextual so users can find their answers more easily, no matter how they search. But the rankings become less about keywords the more conversational search becomes.

In the future, expect Google to place more and more emphasis on authorship and the value that these publishers are offering to real people. Optimizers will always focus on pleasing Google first and foremost, but Google is trying to synergize these efforts so that your optimization efforts are improving the experience of users as well.

It remains incredibly unclear what Google’s thoughts or plans are for PageRank, as Matt Cutts, Google’s head of search spam, commented on Twitter yesterday that there won’t be any updates to PageRank or the toolbar anytime before 2014.

Niels Bosch asked the esteemed Google engineer whether there would be an update before next year, to which Cutts responded, “I would be surprised if that happened.”

According to Search Engine Land, it has been over 8 months since the last Google Toolbar PageRank update, back on February 4, 2013. Many have proclaimed the toolbar dead, but Cutts has personally defended the toolbar on a Webmaster chat within the past year, and said the toolbar won’t be going away.

However, as Cutts himself explained, Chrome doesn’t have a PageRank extension, Google dropped support for Firefox in 2011, and Internet Explorer 10 doesn’t support toolbar extensions. It seems clear there will be less and less of an audience for the toolbar, so its relevancy and use will likely taper off until it just kind of disappears.

It is always possible that Google might put out a surprise update next year, but don’t expect PageRank to be around forever.

Recently, Google updated the link schemes web page that gives examples of what Google considers to be spammy backlinks. The additions are pretty notable as article marketing or guest posting campaigns with keyword rich anchor text have been included. Advertorials with paid links and links with optimized anchor text in press releases or articles were also added.

With all the new additions, it can be hard to keep up to date with what Google is labeling spammy backlinks or backlink schemes. But, Free-SEO-News’ recent newsletter simply and efficiently lays out the 11 things that Google doesn’t like to see in backlink campaigns.

  1. Paid Links – Buying or selling links that pass PageRank has been frowned upon for a long time. This includes exchanging money for links or posts that contain links, sending ‘free’ products in exchange for favors or links, or directly exchanging services for links. It is pretty simple: buying links in any way will get you in trouble.
  2. Excessive Link Exchanges – While exchanging links with other relevant websites in your industry is absolutely normal, overusing those links or cross-linking to irrelevant topics is a big sign of unnatural linking. Simple common sense will keep you out of trouble; just don’t try to trick the system.
  3. Large-Scale Article Marketing or Guest Posting Campaigns – Similar to the last scheme, posting your articles and guest posts on other websites is perfectly normal. However, doing it in bulk or posting the same articles to numerous websites will look like blogspam to Google. Also, if you do guest posts just to get keyword-rich backlinks, you will see similar penalties. Only publish on other websites when it makes sense and offers value.
  4. Automated Programs or Services to Create Backlinks – There are tons of ads for tools and services that promise hundreds or thousands of backlinks for a low price and very little work. While they may do what they say, Google also easily spots these tools and won’t hesitate to ban a site using them.
  5. Text Ads That Pass PageRank – If you’re running a text ad on another website, you have to make sure to use the rel=nofollow attribute, otherwise it appears to be a manipulative backlink.
  6. Advertorials That Include Links That Pass PageRank – If you pay for an article or ad, always use the rel=nofollow attribute. Simply put, if you paid for an ad or article and don’t use the attribute, the link won’t do you any good and can bring a lot of damage.
  7. Links with Optimized Anchor Text in Articles or Press Releases – Stuffing articles and press releases with optimized anchor text has been a strategy for a long time, but Google has shut it down recently. If your page has a link every four to five words, you’re probably looking at some penalties.
  8. Links From Low Quality Directories or Bookmark Sites – Submitting your site to hundreds of internet directories is an utter waste of time. Most links won’t ever get you a single visitor and won’t help your rankings. Instead, only focus on directories that realistically could get you visitors.
  9. Widely Distributed Links in the Footers of Various Websites – Another older trick that Google has put the squash on is stuffing the footer with tons of keyword-rich links to other websites. These links are almost always paid links and are an obvious sign of link schemes.
  10. Links Embedded in Widgets – It isn’t uncommon for widget developers to offer free widgets that contain links to other sites. It also isn’t uncommon for these developers to reach out to site owners and offer to advertise through these widgets. However, Google hates these links and considers them a scheme. I’d suggest against it, but if you do advertise through these widgets, use the nofollow attribute.
  11. Forum Comments With Optimized Links in the Post – It is very easy to get a tool that automatically posts to forums and include links to websites. It is a pretty blatant form of spam which won’t get any actual visibility on the forums and the links are more likely to get you banned than draw a single visitor.
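Several of the items above come down to using the rel=nofollow attribute on any link you were paid for. As a minimal illustration (the URL and anchor text are placeholders, not from any real campaign), a paid text ad might be marked up like this:

```html
<!-- Hypothetical paid text ad: rel="nofollow" tells Google not to pass PageRank -->
<a href="http://www.example.com/" rel="nofollow">Example sponsored link</a>
```

Without the attribute, the same markup counts as a link that passes PageRank, which is exactly what items 5 and 6 warn against.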

There’s a pretty obvious underlying trend in all of these tactics that Google fights. They all attempt to create artificial links, usually in bulk. Google can tell the quality of a link, and all of these schemes are easily identifiable. Instead, focus on building legitimate quality links, and use respected tools such as SEOprofiler. It will take longer, but your site will do much better.

Last week, Matt Cutts responded to a question he receives fairly regularly concerning the PageRank feature in the Google toolbar. Specifically, why haven’t they removed it? It is apparent that many believe that the PageRank feature is “widely used by link sellers as a link grading system.”

There is, of course, some truth to this. While spammers do take advantage of the PageRank system, Cutts says that it is still relevant to many others. “There are a lot of SEOs and people in search who look at the PageRank toolbar, but there are a ton of regular users as well.” Apparently, many internet users see the PageRank feature as indicative of reputability, and Google doesn’t plan on forcing them to stop.

That doesn’t mean PageRank is here to stay forever. While Google plans to keep supporting it so long as it is relevant to their users, it is telling that the PageRank feature was never built into Chrome. Now, IE 10 is disavowing add-ons, meaning Google’s toolbar will no longer work with the browser.

Considering that Internet Explorer was the only browser supporting the Google toolbar, it is highly likely the PageRank feature, as well as the toolbar as a whole, will fade away before long. As Matt Cutts puts it, “the writing is on the wall” that the new iteration of IE could be the end of PageRank, but we will have to wait and see.

If you’ve ever received a notification from Google about a manual spam action based on “unnatural links” pointing to your webpage, Google has a new tool for you.

Links are one of the best-known factors Google uses to order search results; the company examines the links between sites to decide which pages are reputable. As you probably know, this is the foundation of PageRank, another of the most well-known “signals” Google uses to order search results. Google is concerned about spammers trying to take advantage of PageRank, and often they have to take manual action.
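The core idea behind PageRank can be sketched in a few lines of Python. This is a toy power-iteration example over a made-up link graph; the page names, graph, and damping factor are illustrative only, and Google’s real system is vastly more complex:

```python
# Toy PageRank via power iteration over a tiny, made-up link graph.
DAMPING = 0.85      # standard damping factor from the original PageRank paper
ITERATIONS = 50     # enough steps for this small graph to converge

# Each page maps to the pages it links out to.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # start with a uniform score

for _ in range(ITERATIONS):
    new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = rank[page] / len(outlinks)   # split a page's rank across its links
        for target in outlinks:
            new_rank[target] += DAMPING * share
    rank = new_rank

# Pages with more (and better-ranked) inbound links end up scoring higher.
print(sorted(rank, key=rank.get, reverse=True))
```

Here page "c" collects links from three pages and comes out on top, while "d", which nothing links to, comes out last — the same intuition behind “reputable pages are pages that reputable pages link to.”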

The notification you may have received in Webmaster Tools about those unnatural links suggests you got caught up in linkspam. Linkspam is the use of paid links, link exchanges, and other tactics like those. The best response to the message is to remove as many of the low-quality links pointing to your site as possible. This keeps Google off your back and will improve the reputation of your site as a whole.

If you can’t seem to get rid of all of the links for some reason, Google’s new tool can help you out. The Disavow Links page allows you to input URLs which you would like disavowed, and the “domain:” keyword lets you disavow links from all pages on a specific site.
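A disavow file is just a plain text list, one entry per line. A hypothetical example (all URLs and domains are placeholders) might look like:

```
# Disavow individual spammy URLs
http://spam.example.com/paid-links.html
http://spam.example.com/directory/page2.html
# Disavow every page on an entire domain
domain:shadyseo.example.org
```

Lines starting with # are treated as comments, so you can note why each entry is there.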

Everyone is allowed one disavow file per website, and the file is shared among site owners through Webmaster Tools.

If you need assistance finding bad links pointing to your site, the “Links to Your Site” feature in Webmaster Tools can also help you start your search.

Google’s Webmaster Central Blog included a few quick answers in their announcement for the tool for questions you may have, noting that most sites will not need to use the feature in any way unless they’ve received a notification.

 

It’s hard to keep up with Google’s constant adjustments, and AuthorRank is a future feature that isn’t as understood as it probably should be. Its history dates back to August of 2005 when Google filed a patent for “Agent Rank”.

This patent included ranking “agents” and using the public reception to the content they create to determine their rank. Basically, popular websites with positive responses would rank higher than less-authoritative “agents”.

After the patent, “AgentRank” disappeared for a while, until in 2011 Eric Schmidt made references to identifying agents in order to improve search quality. A month later, they filed a patent for what is assumed to have become Google+, which acts as a digital signature system for identification, which can be tied to content. And that content can be ranked. Hello, AuthorRank.

It has yet to be officially implemented, but there have been rumors all year that AuthorRank is under development, and AJ Kohn has stated it could completely change the search engine game. It would act as a factor in PageRank, ranking high-quality content higher.

Mike Arnesen at SEOmoz says it’s not a matter of “if Google rolls out AuthorRank, but when.” He also has some great suggestions of how to be prepared for when AuthorRank arrives. I highly suggest reading his extensive article, because I agree strongly with the idea AuthorRank will be here sooner rather than later.

With Google’s recent focus on social media, and the natural concept that people want to see quality content in their results, it is just a matter of time before AuthorRank is a serious concern to the SEO industry.

 

Technical SEO can be interesting, but no one likes coming across the same problems time and time again. That’s why it’s shocking how many websites are struggling with the same issues.

Here are some of the most frequent issues that can be found while doing a site audit. We also have the solutions, so you can be prepared if you come across any of these issues.

1) Uppercase vs. Lowercase URLs – This happens most often on sites that use .NET. The server is configured to respond to URLs with uppercase letters and doesn’t redirect or rewrite to the lowercase versions. The issue is slowly disappearing because search engines are getting much better at recognizing canonical versions and disregarding copies, but just because it is going away doesn’t mean it should be ignored. Search engines still make mistakes here, so don’t rely on them.

Luckily, there is an easy fix for this issue in the form of a URL rewrite module, which solves the issue on IIS 7 servers. There is a convenient option inside the interface that allows you to enforce lowercase URLs. If you do this, a rule is added to the web config file and the problem is gone.
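For reference, the rule the URL Rewrite module adds looks roughly like the fragment below. This is a sketch of the commonly used pattern, not a copy of any particular site’s config; it sits inside the `<system.webServer>` section of web.config:

```xml
<!-- Inside <system.webServer> in web.config (IIS 7 with the URL Rewrite module) -->
<rewrite>
  <rules>
    <rule name="Enforce lowercase URLs" stopProcessing="true">
      <!-- Match any URL containing an uppercase letter -->
      <match url="[A-Z]" ignoreCase="false" />
      <!-- 301-redirect to the lowercase version -->
      <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>
```

The permanent (301) redirect is what consolidates any link equity the uppercase versions have picked up.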

2) Multiple Versions of the Homepage – If you are auditing a .NET website, go check to see if www.example.com/default.aspx exists. Most likely, it does. The page is a duplicate that search engines often find via navigation or XML sitemaps. Other platforms will instead make URLs like www.example.com/index.html or www.example.com/home. Most contemporary search engines automatically fix the problem, but why not make sure there isn’t an issue to be fixed?

The best way to solve this problem begins with doing a crawl of the site and exporting it into a CSV filtered by the META title column. Do a search for the homepage title and you’ll quickly spot duplicates of your homepage. An easy fix is to add a 301 redirect from the duplicate version of the page to the correct version.
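On an IIS server, that 301 can be another rewrite rule. The fragment below is a sketch of one common way to do it, assuming the duplicate is /default.aspx and belongs in the same `<rules>` block as any other rewrite rules:

```xml
<!-- Inside <rewrite><rules> in web.config: send /default.aspx to the root -->
<rule name="Redirect default.aspx to root" stopProcessing="true">
  <match url="^default\.aspx$" />
  <action type="Redirect" url="/" redirectType="Permanent" />
</rule>
```

Other platforms have equivalents (an Apache site would use a RewriteRule in .htaccess); the important part is that the redirect is a 301, not a 302.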

You can also do a crawl with a tool like Screaming Frog to find internal links that point to the duplicate pages. Then you can edit those links so they point to the correct URL. Having internal links that go via a 301 can cost you some link equity.

3) Query Parameters Added to the End of URLs – This issue is most common on database-driven eCommerce websites because there are tons of attributes and filtering options. This means you will often find URLs like www.example.com/product-category?color=12. In that example, the product is filtered by color. Filtering like this can be good for users, but bad for search. Unless your customers actually search for the specific product by color, the URL is probably not the best landing page to target with keywords.

Another issue that tends to show up on tons of crawls of sites is when these parameters are combined together. The worst is when the parameters can be combined in different orders but return the same content, such as:

www.example.com/product-category?color=12&size=5 

www.example.com/product-category?size=5&color=12

Because both of these have different paths but return the same content, they are seen as duplicate content. It is important to remember Google allocates crawl budget based on PageRank. Make sure your budget is being used efficiently.

To begin fixing this issue, you need to decide which pages you want Google to crawl and index. Make this decision based on keyword research and cross-reference all database attributes with your core target keywords. You need to figure out which attributes are keywords used to find products. In doing so, it is possible to find high search volume for certain keyword combinations, for example “Nike” + “Running Shoes.” If you find this, you want a landing page for “Nike Running Shoes” to be crawlable and indexable. Make sure the database attribute has an SEO-friendly URL, and ensure that the URLs are part of the navigation structure of your site so that there is a good flow of PageRank and users can find the pages easily.

The next step depends on whether you want the specific attribute indexed or not. If the URLs are not already indexed, add the URL structure to your robots.txt file and test your pattern properly to make sure you don’t block anything accidentally; the Fetch as Google feature in Webmaster Tools is useful here. Remember, however, that if the URLs are already indexed, adding them to your robots.txt file will not remove them.
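As a sketch of what that robots.txt entry might look like, assuming the hypothetical color and size parameters from the example URLs above (Googlebot honors simple * wildcards in Disallow patterns):

```
# Block crawling of filtered URLs such as /product-category?color=12&size=5
User-agent: *
Disallow: /*?*color=
Disallow: /*?*size=
```

Test any pattern like this before deploying it; an over-broad wildcard can block pages you very much want crawled.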

If the URLs are already indexed, you will need the rel=canonical tag. If you inherit one of these situations and are not able to fix the core of the issue, the rel=canonical tag papers over it in the hope that it can be solved properly later. Add the tag to the URLs you do not want indexed and point it at the most relevant URL you do want indexed.
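Concretely, on each filtered page you would add a tag like this to the `<head>`, pointing at the version you want indexed (the URLs here are the same placeholder examples used above):

```html
<!-- In the <head> of www.example.com/product-category?size=5&color=12 -->
<link rel="canonical" href="http://www.example.com/product-category" />
```

Both parameter orderings then tell search engines the same thing: the clean category URL is the one that should rank.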

4) Soft 404 Errors – A soft 404 is a page that looks like a 404 but returns an HTTP 200 status code. When this happens, the user sees something resembling “Sorry, the page you requested cannot be found”, but the 200 code tells search engines that the page is working fine. This disconnect can cause pages to be crawled and indexed when you don’t want them to be, and it also means genuinely broken pages are harder to find.

Thankfully, this problem has a very easy fix for any developer who can set the page to return a 404 status code instead of a 200. You can use Google Webmaster tools to find any soft 404s Google has detected. You can also perform a manual check by going to a broken URL and seeing what status code is returned.
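The manual check boils down to comparing the status code with what the body says. A rough sketch of that logic in Python, assuming a simple phrase-matching heuristic (real soft-404 detection is fuzzier than this, and the phrases below are illustrative):

```python
# Rough soft-404 check: a page that answers 200 OK but whose body reads
# like an error page. The phrases are illustrative heuristics only.
ERROR_PHRASES = ("page not found", "cannot be found", "no longer exists")

def looks_like_soft_404(status_code, body):
    """Return True when a 200 response reads like an error page."""
    if status_code != 200:
        return False  # a real 404/410 is correct behavior, not a *soft* 404
    text = body.lower()
    return any(phrase in text for phrase in ERROR_PHRASES)

# A 200 response with an error message is the problem case:
print(looks_like_soft_404(200, "Sorry, the page you requested cannot be found"))
# A proper 404 is fine:
print(looks_like_soft_404(404, "Not found"))
```

In practice you would feed this the status code and body fetched for each suspect URL; the fix itself still belongs on the server, which should return a real 404.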

5) 302 Redirects Instead of 301 Redirects – Because users won’t be able to tell there is even a problem, this is a pretty easy mistake for developers to make. A 301 redirect is permanent; search engines recognize this and pass link equity through to the new URL. A 302 redirect is temporary, and search engines will expect the original page to return soon, which leaves link equity where it is.

Find 302s by using a deep crawler like Screaming Frog. It allows you to filter by 302s, which you can then check individually. You can then ask your developers to change any that should be 301s.
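If you would rather work from a crawl export than click through a UI, filtering for 302s is a one-liner over the CSV. A sketch, assuming a Screaming Frog-style export with hypothetical column names (address, status, redirect_url):

```python
import csv
import io

# Hypothetical crawl export; a real one would be read from a file on disk.
crawl_export = """address,status,redirect_url
http://www.example.com/old-page,302,http://www.example.com/new-page
http://www.example.com/moved,301,http://www.example.com/final
http://www.example.com/temp,302,http://www.example.com/other
"""

# Flag every temporary redirect so a developer can decide which should be 301s.
suspect_302s = [
    row["address"]
    for row in csv.DictReader(io.StringIO(crawl_export))
    if row["status"] == "302"
]
print(suspect_302s)
```

Each flagged URL then needs a human decision: is the move genuinely temporary, or should the redirect be converted to a 301?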

6) Broken or Outdated Site Maps – XML sitemaps may not be essential, but they are very useful for making sure search engines can find all the URLs that matter; they help show the search engines what is important. Letting your sitemap become outdated causes it to contain broken links and miss new content and URLs. Keeping sitemaps updated is especially important for big sites that add new pages frequently. Bing also penalizes sites with too many issues in their sitemaps.

Audit your current sitemap for broken links. Afterwards, speak to your developers about making your XML sitemap dynamic so that it updates frequently. How frequently depends on your resources, but doing this will save you a lot of trouble later.
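For reference, a minimal valid XML sitemap follows the sitemaps.org protocol and looks like this (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-10-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/product-category</loc>
  </url>
</urlset>
```

A dynamic sitemap simply regenerates this file from the database, so new pages appear and deleted ones drop out without anyone editing XML by hand.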

It is very possible you will come across other issues while doing an audit, but, hopefully, if you come across any of these, you are now prepared to fix the problem.

 

For more Technical SEO Problems, read this article by Paddy Moogan at SEOmoz.