Tag Archive for: matt cutts

Last week, Matt Cutts responded to a question he receives fairly regularly concerning the PageRank feature in the Google toolbar: specifically, why hasn’t Google removed it? Many believe that the PageRank feature is “widely used by link sellers as a link grading system.”

There is, of course, some truth to this. While spammers do take advantage of the PageRank system, Cutts says it is still relevant to many others. “There are a lot of SEO’s and people in search who look at the PageRank toolbar, but there are a ton of regular users as well.” Apparently, many internet users see the PageRank feature as indicative of reputability, and Google doesn’t plan on forcing them to stop.

That doesn’t mean PageRank is here to stay forever. While Google plans to keep supporting it so long as it is relevant to their users, it is telling that Chrome does not have the PageRank feature built in. Now, IE 10 is dropping support for add-ons, meaning Google’s toolbar will no longer work with the browser.

Considering that Internet Explorer was the only browser supporting the Google toolbar, it is highly likely the PageRank feature, as well as the toolbar as a whole, will fade away before long. As Matt Cutts puts it, “the writing is on the wall” that the new iteration of IE could be the end of PageRank, but we will have to wait and see.

There is more than enough talk out there about negative SEO and how to prevent it or fight back against it, but Matt Cutts says actual attempts at negative SEO are extremely rare. He explains that Google designs its algorithms to avoid penalizing innocent sites, and now that Google has added the Disavow Links tool to its repertoire, it is very easy to shut down “black hat” SEO if it does happen to you.

Cutts, the head of the Google Webspam team, took to YouTube to answer the huge number of questions he has received about negative SEO and to further explain the Disavow Links tool, clearing up any misconceptions. Cutts doesn’t think negative SEO should be a concern for the vast majority of website owners, unless they operate in extremely competitive spheres. “There’s a lot of people who talk about negative SEO, but very few people who actually try it, and fewer still who actually succeed,” he said.

SEO as a whole can be split into two different categories: on-page and off-page optimization techniques. On-page optimization is focused on everything you can do to boost your rankings directly on your webpage. Off-page SEO concerns aspects that function elsewhere, like backlink management.

Some might argue that on-page optimization has been weakened by Google updates that seek to weed out pages using methods like keyword stuffing. While this is partly true, it does not fully discredit on-page methods.

You can still take advantage of proper keyword usage, titles, and URL management, but as Matt Cutts puts it, “there’s diminishing returns.” Christian Arno from Search Engine Journal explains what still works in on-page optimization, and while some of the old techniques have been cut down, the most effective techniques are still tried and true.
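To make those basics concrete, here is a minimal sketch of an on-page check in Python. The URL, target keyword, and the specific checks are placeholder assumptions for illustration only, not anything prescribed by Google or by Arno’s article.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class TitleParser(HTMLParser):
    """Collects the text inside the page's <title> element."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

url = "https://example.com/electronic-widgets"  # placeholder URL
keyword = "electronic widgets"                  # placeholder target keyword

page = urlopen(url).read().decode("utf-8", errors="replace")
parser = TitleParser()
parser.feed(page)

print("Title:", parser.title.strip())
print("Keyword in title:", keyword in parser.title.lower())
print("Keyword in URL:", keyword.replace(" ", "-") in url)
```

A simple script like this only confirms the basics are in place; as the “diminishing returns” comment suggests, repeating the keyword further buys little.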

Have you ever wondered how Google handles web spam in other languages or other countries? According to Matt Cutts, head of Google’s webspam team, the company has people placed on the ground across the globe to handle the markets of their native countries.

Cutts was responding to a question asked online when he said, “If an algorithm misses something, they are there to find the spam. They know the lay of the land, they know who the big players are, and they’re really quite expert. So if there’s some really unique kind of link spam going on in Poland, for example, there’s a person there.”

The video is below. The question was poorly phrased (Europe is smaller than the US? Really?), but it helps illustrate just how international a company Google is trying to be. I’ve heard European countries use Google less than the US, but clearly Google is still trying to offer the same experience across the globe.


Many webmasters believed that the 700,000 notifications Google sent out in the first two months of this year were about unnatural links. Not true, says Google’s head of search spam, Matt Cutts.

According to Cutts, 90% of the messages sent out via Google Webmaster Tools are related to black hat issues. Google estimates that only 3% of the messages were about unnatural links on a page. You can find out more from Search Engine Land.

When it comes to improving organic search rankings, business owners will do anything to get ahead. That’s why so many wonder whether using AdWords and becoming paying Google customers will help their rankings. Matt Cutts, Google’s Web Spam boss, says it does not.

Cutts participated in a Google Webmaster Q&A in October and explained that Google attempts to be as fair and even-handed as possible, regardless of who they’re dealing with. This means that those who pay for AdWords and those who don’t get equal treatment.

Check out some of Cutts’ Q&A session at the iNeedHits blog.

Have you ever wondered whether your site was penalized by Google through automated algorithms or by a real person? Now you will almost always know, because Google reports almost 100 percent of manual penalties.

Matt Cutts, head of Google’s web spam team, described this new policy at Pubcon this year, saying, “We’ve actually started to send messages for pretty much every manual action that we do that will directly impact the ranking of your site.”

“If there’s some manual action taken by the manual web spam team that means your web site is going to rank directly lower in the search results, we’re telling webmasters about pretty much all of those situations.”

Cutts did clarify that there may be rare instances where this doesn’t occur, but their aim is to get to 100 percent.

In June, at SMX Advanced, Cutts gave a figure of 99 percent reporting, but Cutts believes they are currently reporting every instance of manual action.

Danny Sullivan from Search Engine Land has more information about the distinction between manual and algorithmic actions.


I recently wrote about the release of Google’s Disavow Links tool, but there are some more questions popping up that need answering. So, let’s cover a little bit more about the tool.

First off, the tool does not take effect immediately. This is one of many reasons Google suggests publishers first try to remove questionable links by working with the site owners hosting them, or with the companies they may have purchased links through.

Instead of disavowing the links immediately, “it can take weeks for that to go into effect,” said Matt Cutts, head of Google’s web spam team, during a keynote at the Pubcon conference. Google has also reserved the right to not use submissions if it feels they are questionable.

It is important to be accurate when making your file to submit to Google. Because of the delay in processing the file, it may take another few weeks to “reavow” links you didn’t mean to discount.

Once you have submitted a file to Google, you can download it, change it, and then resubmit.
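For reference, the disavow file itself is just a plain text list, one entry per line: lines beginning with “#” are comments, a “domain:” prefix disavows every link from an entire domain, and a bare URL disavows links from a single page. The domains and URLs below are placeholders, not examples from the article.

```text
# Placeholder disavow file (domains and URLs are illustrative only)
# Lines beginning with "#" are comments and are ignored.

# Disavow every link coming from an entire domain:
domain:spammy-directory.example.com

# Disavow links from one specific page:
http://www.example.net/paid-links.html
```

Keeping a local copy of this file is what makes the “reavow” step described above practical: remove the entry you disavowed by mistake, resubmit the file, and wait for Google to reprocess it.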

The tool is mainly designed for site owners affected by the Penguin Update, which focused on hitting sites that may have purchased links or gained them through spamming. Previously, Google simply ignored bad links; now they act as a negative mark against the site.

This change prompted fear in parts of the SEO industry that competitors could create bad links pointing to a site, or “negative SEO.” This tool helps ensure that negative SEO is not a worry by allowing you to disavow any such links.

Danny Sullivan from Search Engine Land has even more information about the tool, and Matt Cutts has a 10-minute video answering questions.


The Googlebot is Google’s automated program for searching and indexing content on the Internet. In the realm of SEO, the first part of good optimization is crafting textual content that is visible and makes sense to Googlebot. After Googlebot indexes a page, the Google algorithm takes the content text and automatically ranks it on the search results page according to the search terms that the user enters into Google search. If your optimized website performs well for the term “electronic widgets,” for example, the Google algorithm will place your site near or at the top of the search results whenever someone uses Google to search for “electronic widgets.” But did you know that, in addition to automated components like Googlebot and the algorithm, Google also uses human site raters in the ranking of websites?

Google employs hundreds of site raters who rate a huge number of websites on relevancy. The input collected from this team doesn’t directly influence the search results, but it does guide Google’s engineers in changing the algorithm to serve more relevant results to the search engine user.

In this great video, Google senior software engineer Matt Cutts demystifies this process by explaining how human website raters are used in testing changes to the Google algorithm. Essentially, after a change to the automatic search ranking is made, Google performs many test queries and evaluates what has changed in the results. The new search results are checked against the results before the change, and then presented to the human raters – in what Matt Cutts calls a “blind taste test” – to determine which set of search engine results is more relevant and useful. Only after analyzing and evaluating the feedback of the human raters are the new search results tested with a small, carefully selected number of Internet users. Only if this last round of surveys on the algorithm change proves the results more accurate and useful will the updated algorithm be integrated into Google Search for public use. It’s an exhaustive process, but that’s how much Google wants its search engine to be the most relevant on the web.
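As a toy illustration of the “blind taste test” idea (a sketch, not Google’s actual tooling), a side-by-side evaluation boils down to randomizing which side each ranking appears on so raters can’t tell old from new, then tallying preferences. The queries, result lists, and stand-in rater below are all made up.

```python
import random

# Made-up rankings from a hypothetical "old" and "new" algorithm.
results_old = {"electronic widgets": ["siteA", "siteB", "siteC"]}
results_new = {"electronic widgets": ["siteC", "siteA", "siteD"]}

def blind_trial(query, rater_prefers):
    """Show two rankings with sides randomized; return which variant won."""
    pair = [("old", results_old[query]), ("new", results_new[query])]
    random.shuffle(pair)  # the rater never learns which side is which
    (left_label, left_list), (right_label, right_list) = pair
    side = rater_prefers(left_list, right_list)  # rater picks "left" or "right"
    return left_label if side == "left" else right_label

# Stand-in rater who happens to prefer whichever list contains "siteD".
rater = lambda left, right: "left" if "siteD" in left else "right"
wins = [blind_trial("electronic widgets", rater) for _ in range(100)]
print("new preferred:", wins.count("new"), "out of 100 trials")
```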

Watch the video here:

As most everyone has noticed by this point, Google Instant is now live. Searches provide results in real time, making the things you’re searching for appear more quickly and, in some cases, allowing searchers to find results they may not otherwise have discovered. There is a lot of speculation out there on how this will affect SEO.

Read more