
Google has been bringing down the hammer on spammy websites quite a bit recently with more specific penalties for sites that aren’t following guidelines. There have been several high-profile cases such as the Rap Genius penalty, and several attacks on entire spammy industries. But, if you are responsible for sites with spammy habits, a single manual action can hurt more than just one site.

It has been suggested that Google may look at your other sites when they issue manual actions, and Matt Cutts has all but confirmed that happens at least some of the time.

Marie Haynes reached out to Cutts for help dealing with a spammy client, and his responses make it clear the client is linked to “several” spammy sites. Over the course of three tweets, Cutts makes it obvious that he has checked out many of the spammer’s sites, not just the one that received a manual action, and he even reveals one way Google can tell the sites are associated.

Of course, Google probably doesn’t review every site a penalized webmaster operates, but this shows they definitely do when the situation calls for it. If your spammy efforts are caught on one site, chances are you are making the same mistakes on almost every site you operate, and they are all susceptible to being penalized. In the case of this client, playing against the rules seems to have created a pretty serious web of trouble.

Leave it to Matt Cutts to always be there to clear the air when there is an issue causing some webmasters confusion. One webmaster, Peter, asked Matt Cutts whether geo-detection techniques are actually against Google’s policies, as it is common for websites to be designed so that users are shown the information (price, USPs) most relevant to them based on geo-location.

In some understandings of Google’s policies, this may be against the rules, but it turns out all is fine, so long as you avoid one issue.

In one of his Webmaster Chat videos, Cutts explained that directing users to a regional version of a site, or delivering specific information based on location, is not spammy or against Google’s policies. It only makes sense to offer viewers information that actually applies to their lives.

What Google does consider spam is directing its crawlers, or Googlebot, to a page of content that users cannot see. Sending Googlebot to a different page than what visitors see is a bad idea, and is considered cloaking, a form of spam. Instead, treat Googlebot as you would any user: check its location information and send the crawler to the normal page reflecting that data.
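The advice above boils down to one rule: choose the page variant from the visitor’s location alone, never from the user agent. A minimal sketch of that idea, assuming a site with per-region pricing pages (the country codes, paths, and function name here are illustrative, not anything from Google):

```python
# Hypothetical mapping from a resolved country code to the page variant
# every visitor from that region sees -- crawlers included.
REGIONAL_PAGES = {
    "US": "/pricing-usd",
    "GB": "/pricing-gbp",
    "DE": "/pricing-eur",
}
DEFAULT_PAGE = "/pricing-usd"


def page_for_visitor(country_code: str, user_agent: str) -> str:
    """Pick the page variant from location alone.

    The user_agent parameter is deliberately ignored: serving Googlebot a
    different page than a human visitor from the same location would be
    cloaking. The crawler simply gets whatever a user crawling from its
    resolved location would get.
    """
    return REGIONAL_PAGES.get(country_code, DEFAULT_PAGE)


# A human in Germany and Googlebot crawling from Germany see the same page.
human_page = page_for_visitor("DE", "Mozilla/5.0")
bot_page = page_for_visitor("DE", "Googlebot/2.1")
assert human_page == bot_page == "/pricing-eur"
```

The point of the sketch is what is *absent*: there is no branch on the user agent, so there is no separate path for the crawler to be sent down, which is the pattern Cutts warns against.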

Google has been very clear about their stance on manipulative or deceptive behavior on websites. While they can’t tackle every shady practice sites have been enacting, they have narrowed their sights on a few manipulative tactics they plan to take down.

The first warning came when Google directly stated their intention to penalize sites that direct mobile users to unrelated mobile landing pages rather than the content they clicked to access. While that frustrating practice isn’t exactly manipulative, it is an example of sites redirecting users without their consent, and it can be terrible to try to escape (clicking back often just leads to the mobile redirect page, ultimately placing you back at the page you didn’t ask for in the first place).

Now, Google is targeting a similar tactic in which site owners insert fake pages into the browser history, so that when users attempt to exit, they are directed to a fake search results page filled entirely with ads or deceptive links, like the one below. It is essentially a twist on the tactic that keeps placing users who try to exit back on the page they clicked. The only way out is a flurry of back clicks that ends up putting you much further back in your history than you intended. You may not have seen it yet, but it has been popping up more and more lately.

Fake Search Results

The quick upswing is probably what drew Google’s interest to the tactic. As Search Engine Watch explains, deceptive behavior on sites has pretty much always been against Google’s guidelines, and for Google to issue a special warning to sites adopting the practice suggests it is spreading rapidly among sites willing to push Google’s limits.