Tag Archive for: Webmaster Chat

Google’s Matt Cutts

Even after Google’s big crackdown on spammy link building practices over the past two years, many webmasters are still left with questions about what exactly constitutes a spammy practice. Google has previously advised against using links in forum “signatures” as a means of link building, but what about using a link in a comment when it is topically relevant and contributes to the conversation? That is exactly the question Matt Cutts answered in a Webmaster Chat video on Wednesday.

The short answer is that linking to your own site in your comments is fine the majority of the time. Most people who genuinely contribute to forums link to relevant information as a matter of habit, and that often includes their own blogs. But, like everything, the practice can be abused.

Matt gave some tips to ensure your comments don’t get flagged as spammy by Google or the sites you are commenting on.

  • If you can, use your real name when commenting. Using a company name or anchor text you want to rank for gives the appearance of commenting for commercial marketing purposes, which raises the spam alarm (see the sketch after this list).
  • If leaving links in blog post comments is your primary means of link building and the majority of your links come from blog comments, Google will probably flag you.
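
To make the anchor text tip concrete, here is a minimal HTML sketch of the two kinds of comment links (the name and URL are hypothetical; note that most blog platforms add rel="nofollow" to comment links automatically):

    <!-- Reads like a real person joining the conversation -->
    <a href="http://example.com/blog" rel="nofollow">Jane Smith</a>

    <!-- Reads like commercial link building: exact-match anchor text -->
    <a href="http://example.com/blog" rel="nofollow">cheap running shoes online</a>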

You can see the video below.

If there is one way to concisely explain the changes Google’s search algorithms have gone through in the past couple years, it would boil down to “bigger is not always better.” Gone are the days when you could jam as many keywords as possible into a paragraph of text, or buy up thousands of links, and hope to rank highly.

However, the more you do to offer quality content and information to your users while staying in line with Google’s practices, the more success you’ll see.

Those two ideas are fairly common knowledge now, but they have created their own fair share of questions. Where should the balance between quantity and quality lie? How is this content evaluated? Does quantity of content outweigh quality of content?

In the past, Google has given some insight into how content is evaluated, and it is clear that you won’t get far with an excessive amount of paper-thin content. Still, the number of indexed pages your site has can have an effect on your ranking. So how exactly does this work and where is the balance?

Matt Cutts, Google’s head of Webspam, addressed this type of issue head-on in his most recent Webmaster Chat video. He was asked, “Does a website get a better overall ranking if it has a large amount of indexed pages?”

Cutts explained that having more indexed pages isn’t a magic ticket to higher rankings. He said, “I wouldn’t assume that just because you have a large number of indexed pages that you automatically get a high ranking. That’s not the case.”

However, having more indexed pages does offer some clear benefits. The more pages you have, the more opportunities you have to rank for different keywords. But, this is only because a larger group of pages should cover a wider variety of keywords and topics.

A larger number of indexed pages is also likely to improve your overall links and PageRank, which can affect your ranking. But, the relationship isn’t direct. Simply having more pages won’t improve much on its own. Instead, you have to use those extra pages to deliver valuable content and information to your users. If you’re just filling your site with a wealth of meaningless pages to be indexed, you won’t see any improvement anytime soon.
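
As an aside, for site owners trying to get a larger set of worthwhile pages indexed in the first place, the standard mechanism is an XML sitemap submitted through Google Webmaster Tools. A minimal sketch, with hypothetical URLs:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url><loc>http://example.com/guides/topic-one</loc></url>
      <url><loc>http://example.com/guides/topic-two</loc></url>
    </urlset>

The sitemap only helps Google discover pages; as Cutts makes clear, it is the content on those pages that determines whether they rank.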

It remains unclear what Google’s thoughts or plans are for PageRank, as Matt Cutts, the head of Google’s webspam team, indicated on Twitter yesterday that there likely won’t be any updates to PageRank or the toolbar before 2014.

Niels Bosch asked the esteemed Google engineer whether there would be an update before next year, to which Cutts responded, “I would be surprised if that happened.”

According to Search Engine Land, it has been over 8 months since the last Google Toolbar PageRank update, which came on February 4, 2013. Many have proclaimed the toolbar dead, but Cutts personally defended it in a Webmaster Chat video within the past year, saying the toolbar won’t be going away.

However, as Cutts himself has explained, Chrome doesn’t have a PageRank extension, Google dropped Toolbar support for Firefox in 2011, and Internet Explorer 10 doesn’t support toolbar extensions. It seems clear the toolbar’s audience will only keep shrinking, so its relevance and use will likely taper off until it quietly disappears.

It is always possible that Google might put out a surprise update next year, but don’t expect PageRank to be around forever.

Duplicate content has always been viewed as a serious no-no for webmasters and search engines. In general, it is associated with spamming or low-quality content, and thus Google usually penalizes sites with too much duplicate content. But, what does that mean for necessary duplicate content like privacy policies, terms and conditions, and other types of legally required content that many websites must have?

This has been a reasonable point of confusion for many webmasters, and those in the legal and financial sectors especially worry that their sites could be hurt by the sheer number of disclaimers they carry.

Well, of course Matt Cutts is here to sweep away those concerns. He used his recent Webmaster Chat video to address the issue, clarifying that unless you’re actively doing something spammy like keyword stuffing within these sections of legalese, you shouldn’t worry about it.

He said, “We do understand that a lot of different places across the web require various disclaimers, legal information, and terms and conditions, that sort of stuff, so it’s the sort of thing where if we were not to rank that stuff well, that would hurt our overall search quality. So, I wouldn’t stress out about that.”
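
In other words, no special handling is required. That said, site owners who would still rather keep boilerplate legal pages out of the index entirely can use the standard robots meta tag. A minimal sketch (the page itself is hypothetical, and Cutts’s answer suggests this is optional, not necessary):

    <!-- In the <head> of a boilerplate page such as /terms or /privacy-policy -->
    <meta name="robots" content="noindex, follow">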

Depending on your skill set, a recent Webmaster Chat video may be good news or bad for bloggers and site owners out there. Most people have never considered whether stock photography or original photography has any effect on search engine rankings. As it happens, not even Matt Cutts has thought about it much.

There are tons of writers out there who don’t have the resources or camera skills to take pictures for every page or article they put out. Rather than deliver countless walls of text that people don’t like looking at, most of us without the artistic talent use stock photos to make pages less boring and help readers follow along. For now, we have nothing to worry about.

Cutts, the head of Google’s webspam team, used his latest Webmaster Chat video to address the issue, and he said that, to the best of his knowledge, original vs. stock photography has no impact on how your pages rank. However, he won’t rule it out for the future.

“But, you know what, that is a great suggestion for a future signal that we could look at in terms of search quality. Who knows? Maybe original image sites might be higher quality, whereas a site that just repeats the same stock photos over and over again might not be nearly as high quality. But to the best of my knowledge, we don’t use that directly in our algorithmic ranking right now.”

Logically, I would say that if Google does decide to start considering photo originality on web pages, Cutts appears to be more worried about sites that use the same images “over and over” than about those who seek out relevant, unique stock images for their articles. Penalizing every website owner without a hired photographer continuously producing images for every new page would be overkill.