
It remains unclear exactly what Google’s plans are for PageRank, as Matt Cutts, Google’s head of search spam, suggested on Twitter yesterday that there won’t be any updates to PageRank or the toolbar before 2014.

Niels Bosch asked the esteemed Google engineer whether there would be an update before next year, to which Cutts responded, “I would be surprised if that happened.”

According to Search Engine Land, it has been over 8 months since the last Google Toolbar PageRank update, back on February 4, 2013. Many have proclaimed the toolbar dead, but Cutts has personally defended the toolbar on a Webmaster chat within the past year, and said the toolbar won’t be going away.

However, as Cutts himself explained, Chrome doesn’t have a PageRank extension, Google dropped support for Firefox in 2011, and Internet Explorer 10 doesn’t support toolbar extensions. With less and less of an audience for the toolbar, its relevance and use will likely taper off until it quietly disappears.

It is always possible that Google might put out a surprise update next year, but don’t expect PageRank to be around forever.

Have you ever searched for a term only to find a page that says “we have no articles for [your search term]” and a whole bunch of ads? Most people have come across these sites with auto-generated content, often called “Made for AdSense” or MFA sites. These pages exist for the sole purpose of luring people in and hoping they click an AdSense ad to leave the page instead of hitting the back button.

The majority of these websites use a script that automatically generates content by scraping snippets from search results or from web pages containing the targeted keywords. They don’t offer real content of any kind and have no legitimate value, which makes many people wonder why they still encounter these pages in Google’s search results.

One user directly asked Matt Cutts, Google’s head of webspam, whether the search engine is doing anything about these pages, such as penalizing them or removing them from the index. As you would expect, Google already has a policy in place, and Cutts encourages users to report any pages like this they come across. He states:

We are absolutely willing to take action against those sites. We have our rules in our guidelines about auto-generated pages that have very little value and I have put out in the past specific calls for sites where you search for a product – a VCR, a laptop, or whatever – and you think you’re going to get a review, and the first thing you see is ‘0 Reviews found for [blah blah blah].’

As Google sees it, even if these pages are from legitimate search engines, they don’t belong in the rankings. Users don’t really like searching for something and being sent to another page of search results. They want to be directed straight to real content.

There are very few times when search results snippets should be indexed. The only real case where it might be justifiable is if you have exclusive data that no one else has. But there is never a time when a supposed search results page with 0 results should be indexed.

To put it simply, Google is already trying to fight these sites. They aim to find and penalize all they can, but they also want people to file spam reports whenever possible so that as few as possible slip through the cracks.

Don’t say Google doesn’t at least try to listen to webmasters. Though many webmasters have some pretty big (and often legitimate) grudges against the biggest search engine, it can’t be said that the company doesn’t reach out for opinions. One example of Google seeking feedback from site owners appeared last night, as Matt Cutts, Google’s head of webspam, tweeted out a call for webmasters and SEOs to fill out a survey.

Specifically, Cutts called for owners of small but high-quality websites who believe they should be doing better in the rankings than they are. Filling it out won’t affect your rankings immediately, but it may give Google information that helps them keep the playing field reasonably level for small businesses and big companies alike. The form reads:

Google would like to hear feedback about small but high-quality websites that could do better in our search results. To be clear, we’re just collecting feedback at this point; for example, don’t expect this survey to affect any site’s ranking.

The survey asks only two short questions. First, it calls for the name and URL of the small site you believe should be ranking well. Second, Google would like to hear your opinion on why the site should rank higher. It is extremely straightforward and shouldn’t take most webmasters long to complete.

In an attempt to clear up some confusing wording Google has been using, Matt Cutts, Google’s head of webspam, used his latest Webmaster Help video to clarify that page load speed is no more important for rankings in mobile searches than it is in desktop searches.

This comes after Google has been publicly emphasizing the need for sites to load quickly, noting that mobile users are highly likely to leave a page if it doesn’t load fast enough. While Google isn’t backing off that stance, Cutts wanted to make it clear that speed isn’t weighted any differently for mobile than for desktop.

If all things are equal, meaning all other aspects of two sites are ranked evenly, the site that loads faster will almost certainly be given the higher ranking in search results by Google, but that is true on smartphones and desktop computers alike. It is also just a sensible part of the algorithm, as slow pages will likely lose a large number of visitors during loading alone, making them lower-value sites.

But because internet speeds vary across devices and around the globe, Cutts said Google has no plans to specify an exact number of seconds your site should load in. If it becomes obvious to Google that mobile users are more frustrated by slow sites than their desktop counterparts, they may consider weighting load speed more heavily for mobile searches, but that isn’t the case yet, and there are no current plans to make it so.
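
If you want a rough sense of where your own pages stand, timing a plain HTTP request is a quick sanity check. Below is a minimal sketch of that idea, not anything Google itself runs; it assumes the third-party requests library is installed and uses a placeholder URL.

```python
# Minimal load-time check (a rough sketch, not Google's measurement).
import time
import requests  # assumes the third-party "requests" package is installed

def measure_load_time(url: str, timeout: float = 30.0) -> float:
    """Return the seconds taken to download the full response body."""
    start = time.monotonic()
    response = requests.get(url, timeout=timeout)
    response.raise_for_status()  # treat HTTP errors as failures
    return time.monotonic() - start

if __name__ == "__main__":
    seconds = measure_load_time("https://example.com/")  # placeholder URL
    print(f"Fetched in {seconds:.2f} seconds")
```

Keep in mind that real page speed measurements also account for rendering and asset loading in the browser, so a number like this is only a baseline for server response time.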

It has always been a little unclear how Google handles its international markets. We know they have engineers across the world, but anyone who has tried searching from outside the US knows the results can look like what Americans would have seen five years ago: a few good options mixed with a lot of spam. That’s a bit of hyperbole, but Matt Cutts says we can expect things to keep getting better moving forward.

According to Cutts’ recent Webmaster Help video, Google fights spam globally using algorithms as well as manual actions taken by Google employees who cover more than 40 languages and regions around the world. They also try to ensure all of their algorithms work in every language, rather than just English.

Search Engine Roundtable points out that you could see this international attention when Panda originally rolled out: at first it only affected English queries, but it was extended to other languages soon after. With Penguin’s release, however, all countries saw the update on the same day.

Matt Cutts did concede that English-language queries receive more attention from Google, which has always been fairly obvious and understandable. There are far more searches in English, and it is the native language of the majority of the company’s engineers.

Duplicate content has always been viewed as a serious no-no for webmasters and search engines. In general, it is associated with spamming or low-quality content, and thus Google usually penalizes sites with too much duplicate content. But, what does that mean for necessary duplicate content like privacy policies, terms and conditions, and other types of legally required content that many websites must have?

This has been a reasonable point of confusion for many webmasters, and those in the legal or financial sectors especially find themselves worried that their site could be hurt by the sheer number of required disclaimers.

Well, of course, Matt Cutts is here to sweep away all your concerns. He used his recent Webmaster Chat video to address the issue, clarifying that unless you’re actively doing something spammy like keyword stuffing within these sections of legalese, you shouldn’t worry about it.

He said, “We do understand that a lot of different places across the web require various disclaimers, legal information, and terms and conditions, that sort of stuff, so it’s the sort of thing where if we were to not rank that stuff well, that would hurt our overall search quality. So, I wouldn’t stress out about that.”

It isn’t uncommon for webmasters or SEOs who operate numerous sites in a network to ask how many of them they can link together without bringing down the ax of Google. That question finally made its way to Google’s head of webspam, who responded in one of his regular YouTube videos.

The question was phrased, “should a customer with twenty domain names link it all together or not?” While blog networks can easily find legitimate reasons to link together twenty or more sites (though Cutts advises against it), the number makes for an interesting discussion of normal websites. As Cutts put it, “first off, why do you have 20 domain names? […] If it is all, you know, cheap-online-casinos or medical-malpractice-in-ohio, or that sort of stuff… having twenty domain names can look pretty spammy.”

When I think of networks with numerous full sites within them, I think of Gawker or Vice, two online news sources that spread their coverage across multiple sites, each focused on its own topics. For example, Vice also runs Motherboard, a tech-focused website, as well as Noisey, a site devoted to music. Gawker, on the other hand, runs Deadspin, Gizmodo, io9, Kotaku, and Jezebel, among a couple of others. Note that, at most, those networks run 8 unique sites. There is little reason for any network of unique but connected sites to have more parts than that.

However, there are times when having up to twenty distinct domain names can make sense without being spammy. Cutts points out that when your many domain names are all localized versions of your site, it is okay to link to them. Even in that scenario, however, you shouldn’t link them all in the footer. The suggested fix is to place them in a drop-down menu where users can still access them.

On Wednesday, Google Search, Gmail, YouTube, and other Google services went unresponsive for roughly an hour in many parts of the United States. The problem was quickly resolved, but not before Twitter freaked out and the story reached many news outlets.

Now, Google’s head of webspam has used his Webmaster Chat to answer the big question that site owners who have gone through similar experiences have often wondered about: if your site goes down temporarily, does it affect your rankings?

According to Cutts, having a site go offline shouldn’t negatively impact your rankings, so long as you fix the problem quickly. Obviously, Google wants to be directing searchers to sites that are working, so if a site has been offline for days, it makes sense for Google to replace it with a working relevant site. But, Google isn’t so quick to cut out an offline site.

Once Google notices your site is offline, they will attempt to notify anyone registered with Google Webmaster Tools that their site is unreachable. The messages generally say something along the lines of Googlebot being unable to access the site.

Then, roughly 24 hours after Google has noticed your site isn’t working, they will come back to check its status. This means a site can be offline for roughly a full day or more before you can expect any negative effects from the search engines. However, if your site has been down for 48 hours or more, chances are Google is going to delist it, at least temporarily.

Search Engine Land pointed out that there are also tools available that will monitor your site for you and alert you if it becomes unavailable. They suggest the free service Pingdom, though there are plenty of others to choose from.
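
If you’d rather roll your own, the core of such a monitoring tool is just a scheduled request and an alert when it fails. Here is a minimal sketch of that idea using only Python’s standard library; it is not an official Pingdom or Google tool, and the site URL, alert address, and local SMTP server are all placeholder assumptions.

```python
# A bare-bones uptime check (a sketch of the idea, not an official tool).
import smtplib
import urllib.request
from email.message import EmailMessage

SITE_URL = "https://example.com/"        # placeholder: the site to watch
ALERT_ADDRESS = "webmaster@example.com"  # placeholder: who gets the alert

def site_is_up(url: str, timeout: float = 10.0) -> bool:
    """Return True if the URL can be fetched without an HTTP or network error."""
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except Exception:
        return False  # HTTP errors, timeouts, DNS failures, etc.

def send_alert(message: str) -> None:
    """Email a downtime alert through a local SMTP server (assumed to exist)."""
    email = EmailMessage()
    email["Subject"] = "Site down alert"
    email["From"] = ALERT_ADDRESS
    email["To"] = ALERT_ADDRESS
    email.set_content(message)
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(email)

if __name__ == "__main__":
    if not site_is_up(SITE_URL):
        send_alert(f"{SITE_URL} appears to be unreachable.")
```

A real monitoring service adds scheduling, retries, and checks from multiple locations, but run from cron every few minutes this covers the basic idea.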

For those still pushing backlinks as the golden goose of SEO, a recent revision to Google’s Ranking help guidelines could be frightening. But if you’ve been watching the changes in SEO over the past few years, it shouldn’t come as much of a surprise. Google has become more and more strict about backlink quality and link building methods, and links were bound to be dethroned.

As reported by Search Engine Watch, it was spotted late last week that Google updated the Ranking help article to say “in general, webmasters can improve the rank of their sites by creating high-quality sites that users will want to use and share.” Before, it told webmasters that they could improve their rank “by increasing the number of high-quality sites that link to their pages.”

There have been countless signs that Google would officially step back from link building as one of the most important ranking signals. There were widespread complaints for a while about competitors using negative SEO techniques like pointing bad links at websites, and every Penguin iteration that rolls out is a significant event in SEO.

To top it all off, when Matt Cutts, the esteemed Google engineer, was asked about the top 5 basic SEO mistakes, he spent a lot of time talking about the misplaced emphasis on link building.

“I wouldn’t put too much of a tunnel vision focus on just links,” Cutts said. “I would try to think instead about what I can do to market my website to make it more well known within my community, or more broadly, without only thinking about search engines.”

Depending on your skill set, a recent Webmaster video may be good or bad news for bloggers and site owners out there. Most people have never considered whether stock photography or original photography has any effect on search engine rankings. As it happens, not even Matt Cutts has thought about it much.

There are tons of writers out there who don’t have the resources or the talent with a camera to take pictures for every page or article they put out. Rather than deliver countless walls of text that people don’t like looking at, most of us without that artistic talent use stock photos to make pages less boring and help readers understand us better. For now, we have nothing to worry about.

Cutts, the head of Google’s Webspam team, used his latest Webmaster Chat to address this issue, and he says that to the best of his knowledge, original vs. stock photography has no impact on how your pages rank. However, he won’t rule it out for the future.

“But you know what, that is a great suggestion for a future signal that we could look at in terms of search quality. Who knows, maybe original image sites might be higher quality, whereas a site that just repeats the same stock photos over and over again might not be nearly as high quality. But to the best of my knowledge, we don’t use that directly in our algorithmic ranking right now.”

Logically, I would say that if Google does decide to start considering photo originality on web pages, Cutts appears more worried about sites that use the same images “over and over” than about those that search out relevant and unique stock images for their articles. Penalizing every website owner who doesn’t have a hired photographer continuously producing images for every new page would seem like overkill.