Duplicate content has always been viewed as a serious no-no by webmasters and search engines. In general, it is associated with spamming or low-quality content, and Google usually penalizes sites with too much of it. But what does that mean for necessary duplicate content like privacy policies, terms and conditions, and other legally required text that many websites must carry?

This has been a reasonable point of confusion for many webmasters, and those in the legal or financial sectors especially worry that their sites could be hurt by the sheer volume of disclaimers they are required to display.

Well, of course, Matt Cutts is here to sweep away all your concerns. He used his recent Webmaster Chat video to address the issue, clarifying that unless you're actively doing something spammy like keyword stuffing within these sections of legalese, you shouldn't worry about it.

He said, “We do understand that a lot of different places across the web require various disclaimers, legal information, and terms and conditions, that sort of stuff, so it’s the sort of thing where if we were to not rank that stuff well, that would hurt our overall search quality. So, I wouldn’t stress out about that.”

It isn’t uncommon for webmasters or SEOs who operate numerous sites in a network to ask how many of them they can link together without bringing down the ax of Google. That question finally made its way to Google’s head of Webspam, who responded in one of his regular YouTube videos.

The question was phrased, “should a customer with twenty domain names link it all together or not?” While blog networks can easily find legitimate reasons to link together twenty or more sites (though Cutts advises against it), the number in the question is an interesting starting point for talking about ordinary websites. As Cutts put it, “first off, why do you have 20 domain names? […] If it is all, you know, cheap-online-casinos or medical-malpractice-in-ohio, or that sort of stuff… having twenty domain names can look pretty spammy.”

When I think of networks made up of numerous full sites, I think of Gawker or Vice, two online news sources that spread their coverage across multiple sites, each focused on its own topic. For example, Vice also runs Motherboard, a tech-focused website, as well as Noisey, a site devoted to music. Gawker, on the other hand, runs Deadspin, Gizmodo, io9, Kotaku, and Jezebel, among a couple of others. Note that, at most, those networks run eight unique sites. There is little reason for any network of unique but connected sites to have many more parts than that.

However, there are times when having up to twenty distinct domain names can make sense without being spammy. Cutts points out that when your many domain names are all localized versions of the same site, it is fine to link them together. Even in that scenario, however, you shouldn’t pile the links into the footer; the suggested fix is to place them in a drop-down menu where users can still reach them, along the lines of the sketch below.
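As a purely illustrative sketch (the locale-to-domain mapping and function name are invented here, not anything Google or Cutts prescribes), this is one way the localized versions could be rendered as a user-facing drop-down instead of a wall of footer links:

```python
# Hypothetical example: render links to localized versions of a site as a
# drop-down menu rather than a block of footer links. The locale-to-domain
# mapping below is invented for illustration.
LOCALIZED_DOMAINS = {
    "United States": "https://www.example.com",
    "United Kingdom": "https://www.example.co.uk",
    "Germany": "https://www.example.de",
    "France": "https://www.example.fr",
}

def build_locale_dropdown(domains: dict[str, str]) -> str:
    """Return an HTML <select> element listing each localized domain."""
    options = "\n".join(
        f'  <option value="{url}">{label}</option>'
        for label, url in sorted(domains.items())
    )
    return f'<select onchange="window.location = this.value">\n{options}\n</select>'

if __name__ == "__main__":
    print(build_locale_dropdown(LOCALIZED_DOMAINS))
```

The point of the drop-down is simply that users who want their local version can still find it, without the site presenting search engines with a footer full of cross-domain links.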

Yesterday was the big day. July 22 marked the deadline for the roughly 2 million Adwords campaigns that had held out on converting to Adwords Enhanced Campaigns and would be automatically upgraded. Google had plainly stated that yesterday was a hard deadline for the last 25 percent of Adwords users to migrate, but, as per usual, the process will actually occur over a longer period.

In an Inside Adwords blog post about the change, Google explained, “…starting today, we will begin upgrading all remaining campaigns automatically, bringing everyone onto the new AdWords platform. As with many product launches, the rollout will be gradually completed over several weeks.”

The forced upgrade brings about quite a few changes in how you should manage your campaigns, and to help everyone get started, Search Engine Watch brought together a group of professionals in the field to offer their advice.

Google also offered their own suggestions:

  1. Review your mobile bid adjustments – For most campaigns, the auto-upgrade default is based on bids from similar advertisers. You will need to visit the ‘Settings’ tab to optimize for your business.
  2. Identify unwanted keyword duplication in overlapping campaigns – If you were previously running similar legacy campaigns for each device type, it is suggested you identify the matching campaigns and remove any unwanted duplicate keywords from the enhanced campaign (a quick way to spot that overlap is sketched after this list).
  3. Review Display Network campaigns – You will want to verify that your display ads are reaching users on all desired devices and that you are using the correct bidding strategies.
  4. Explore the Enhanced Campaign features – It is recommended you try out upgraded sitelinks and upgraded call extensions to start. Then you can further boost results by creating mobile preferred ads and setting bid adjustments for location and time.
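To make suggestion 2 concrete, here is a minimal, hypothetical sketch (the campaign names and keywords are invented) of how you might flag keywords that now overlap after folding device-specific legacy campaigns into one enhanced campaign:

```python
# Hypothetical sketch for suggestion 2 above: given keyword lists exported from
# several legacy campaigns that targeted different device types, flag the
# keywords that would now be duplicated in a single enhanced campaign.
from collections import defaultdict

legacy_campaigns = {
    "Brand - Desktop": ["blue widgets", "buy blue widgets", "widget store"],
    "Brand - Mobile":  ["blue widgets", "widget store near me", "widget store"],
    "Brand - Tablet":  ["blue widgets", "cheap widgets"],
}

def find_duplicate_keywords(campaigns: dict[str, list[str]]) -> dict[str, list[str]]:
    """Map each keyword that appears in more than one campaign to those campaigns."""
    seen = defaultdict(list)
    for campaign, keywords in campaigns.items():
        for keyword in set(keywords):
            seen[keyword].append(campaign)
    return {kw: names for kw, names in seen.items() if len(names) > 1}

if __name__ == "__main__":
    for keyword, names in find_duplicate_keywords(legacy_campaigns).items():
        print(f"'{keyword}' appears in: {', '.join(names)}")
```

Anything this flags is a candidate for removal once the campaigns are merged, so the enhanced campaign isn't bidding against itself on the same terms.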

Image Courtesy of Martin Pettitt

Despite telling us that they would no longer confirm when new Panda updates occur, Google announced today that they were rolling out a new update that is “more finely targeted” than previous Panda releases.

Unlike with many Penguin updates, most webmasters actually seem happy to see this new Panda version, as some are already claiming recovery from the original algorithm.

Google has said that their plan is to release Panda algorithm updates monthly, rolled out over a ten-day period, but Matt Cutts, head of Google’s Webspam team, implied there was a delay for this refresh because they wanted to ensure the signals would be loosened up a little from the last release.

The official statement from Google simply says, “In the last few days we’ve been pushing out a new Panda update that incorporates new signals so it can be more finely targeted.”

Search Engine Journal says the update has resulted in:

  • An increase in impressions with roughly the same CTR (viewable when logged into Google Webmaster Tools)
  • Big ranking impacts for informational sites such as Wikipedia and About.com
  • Authority sites appearing more prominently in SERPs
  • Better rankings for sites using Google+

Their suggestions for the future? It’s reaching the point where not using Google+ can hurt your site, and it is time to enable Google Authorship.

Google has been very clear about their stance on manipulative or deceptive behavior on websites. While they can’t tackle every shady practice sites have been enacting, they have narrowed their sights on a few manipulative acts they plan on taking down.

The first warning came when Google directly stated their intention to penalize sites that direct mobile users to unrelated mobile landing pages rather than the content they clicked to access. While that frustrating practice isn’t exactly manipulative, it is an example of sites redirecting users without their consent, and it can be terrible to escape (clicking back often just leads to the mobile redirect page, ultimately placing you back at the page you didn’t ask for in the first place).

Now, Google is aiming at a similar tactic in which site owners insert fake pages into the browser history, so that when users attempt to exit, they are directed to a fake search results page filled entirely with ads or deceptive links, like the one below. It is essentially a twist on the tactic that keeps placing users who are trying to exit back on the page they clicked to; the only way out is a flurry of clicks that ends up putting you much further back in your history than you intended. You may not have seen it yet, but it has been popping up more and more lately.

Fake Search Results

The quick upswing is probably what raised Google’s interest in the tactic. As Search Engine Watch explains, deceptive behavior has essentially always been against Google’s guidelines, and issuing a special warning to sites adopting this practice suggests it is spreading quickly among sites that are comfortable pushing Google’s limits.

Anyone keeping track should know that Google isn’t afraid to shutter a beloved service or tool at their whim. We’re all still mourning the loss of Google Reader, but Eric Siu from Entrepreneur says we should also be gearing up to lose the popular Google Adwords External Keyword Research Tool.

The free Adwords tool is commonly used by site owners to dig up statistics on keyword search volume, estimated traffic volume, and average cost per click, but its most loved capability was determining which specific keywords a site owner should target with future optimization strategies and PPC campaigns.

Google hasn’t announced anything yet, so there isn’t a confirmed shutdown date or any official details, but rumors suggest it could happen at any time. Google has implied the Adwords tool will be folded into a new program referred to as the Keyword Planner, but it won’t necessarily be the same.

The External Keyword Research Tool essentially contained an assortment of disjointed workflows, which gave site owners some freedom in how they used the tool, but the Keyword Planner has one explicit purpose – helping advertisers set up new PPC ad groups and campaigns as quickly as possible. It also doesn’t include ad share statistics or local search trends.

If the External Keyword Research Tool is at all a part of your PPC or SEO campaigns, you should likely begin getting to know the Keyword Planner now. You’ll have to eventually.

On Wednesday, Google, Gmail, YouTube, and other Google services went unresponsive for roughly an hour in many parts of the United States. The problem was quickly resolved, but not before Twitter freaked out and the story reached many news outlets.

Now, Google’s head of Webspam used his Webmaster Chat to answer the big question that site owners who have gone through similar experiences have often wondered. If your site goes down temporarily, does it affect your rankings?

According to Cutts, having a site go offline shouldn’t negatively impact your rankings, as long as you fix the problem quickly. Obviously, Google wants to direct searchers to sites that are working, so if a site has been offline for days, it makes sense for Google to replace it with a working, relevant site. But Google isn’t so quick to cut out an offline site.

Once Google notices your site is offline, they will attempt to notify those registered with Google Webmaster Tools that their site is unreachable. The messages generally say something along the lines of Googlebot not being able to access the site.

Then, roughly 24 hours after Google has noticed your site isn’t working, they will come back to check its status. This means that a site can be offline for roughly a full day before you can expect any negative effects from the search engines. However, if your site has been down for 48 hours or more, chances are Google is going to delist it, at least temporarily.

Search Engine Land pointed out that there are also tools available to monitor sites for you and alert webmasters if their site becomes unavailable. They suggest the free service Pingdom, though there are plenty of others to choose from, and the basic idea is simple enough to sketch in a few lines, as below.
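For illustration only, here is a bare-bones version of what such a monitor does, with a placeholder URL and check interval; a real service like Pingdom would email or text you rather than print to a console:

```python
# Bare-bones uptime check: request the site periodically and report when it
# stops responding. The URL and interval are placeholders; a real monitor
# would alert the webmaster by email or SMS instead of printing.
import time
import urllib.request
import urllib.error

SITE_URL = "https://www.example.com"   # replace with the site to watch
CHECK_INTERVAL_SECONDS = 300           # check every five minutes

def site_is_up(url: str, timeout: int = 10) -> bool:
    """Return True if the URL responds with a non-error status code."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return 200 <= response.status < 400
    except (urllib.error.URLError, OSError):
        return False

if __name__ == "__main__":
    while True:
        if not site_is_up(SITE_URL):
            print(f"ALERT: {SITE_URL} appears to be down at {time.ctime()}")
        time.sleep(CHECK_INTERVAL_SECONDS)
```

Catching an outage within minutes gives you a comfortable head start on Google's roughly 24-hour re-check window described above.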


Image Courtesy of Martin Pettitt

It has been well over a month since Penguin 2.0 was unleashed upon the world, and the search industry is still reeling from the results of the algorithm update aimed at link profiles, low-quality backlinks, and over-optimized anchor text.

The average estimate says that Penguin 2.0 affected over 2 percent of all English queries. That doesn’t sound like much, but when SEO Roundtable took a poll in late May, over half their readers said they had been hit by the changes.

First, it should be said that some portion of those may have been affected by a separate algorithm update released shortly before the new version of Penguin, but that update was aimed at typically spammy sectors like payday loans and pornography.

The majority of those saying they were affected by Penguin, however, were most likely correct about their recent drop in rankings or loss of traffic. Either that, or far too many respondents were misreading their data or somehow unaware that their payday loan site might be targeted by Google. Let’s assume that’s not the case, because it sounds highly unlikely.

But, time has passed since Penguin came out. I’ve seen at least 10 articles detailing how to recover from Penguin, and numerous others focused on all the areas Penguin targeted. We should all be getting back to normal, right?

According to the recent SEO Roundtable poll on the topic, that is not the case. Over 60 percent of respondents said they haven’t recovered from the algorithm update, with only 7.5 percent saying they have fully recovered.

What does this mean? Well, the respondents are clearly SEO-informed people who keep up to date with the latest blogs, since they responded to one of the more reputable sites covering the issue. One major issue is that full recovery from Penguin isn’t possible for many of those affected until the next refresh, and it is hard to know when that refresh will happen; it may not be until the next update is announced.

The other issue is simply that the articles telling SEOs how to recover from Penguin range from completely valid to “how to try to cheat the new system,” which can be confusing for inexperienced or uninformed SEOs. The best suggestion for solving this problem is paying close attention to which sites you are reading and always taking the more conservative advice.

A WebmasterWorld thread from roughly a month ago brings up an interesting question for us SEO professionals. While we focus on the algorithms we know about, such as Penguin or Panda, it has long been suggested that Google could also be using different ranking factors depending on the industry a site fits within. In other words, sites for roofing companies would be reviewed and ranked according to different standards than sites for tech companies.

Well, Matt Cutts, the head of Google’s Webspam team and trusted engineer, took to that thread to dispel all the rumors. He doesn’t deny that Google has “looked at topic-specific ranking.” Instead, he says scaling was the issue. In his answer, Cutts explains, “We have looked at topic-specific ranking. The problem is it’s not scalable. There’s a limited amount of that stuff going on — you might have a very spammy area, where you say, do some different scoring.”

He continued, “What we’re doing better is figuring out who the authorities are in a given category, like health. If we can figure that out, those sites can rank higher.”

While Google says they aren’t using different algorithms for different industries, it has been announced that Google uses Subject Specific Authority Ranking, which helps recognized authorities on a given topic surface as the most reputable sources for that subject.

Of course, looking at the comments on SEO Roundtable, which reported on the WebmasterWorld thread, it is clear many don’t necessarily believe Cutts’ statement. Some say they have “always seen a difference in industry types,” while others argue that different industries necessitate different ranking factors or algorithms because of the resources available to each industry. For example, industrial companies don’t tend to run blogs, which means creating new content through blogging shouldn’t be weighted as heavily as it is in topics like health and tech, where new material comes out constantly.

For now, all we have to go on is Cutts’ word and our own experiences. Do you think Google is using different algorithms depending on industry? Do you think they should be?

Google has made it very clear that mobile SEO is going to play a big part in their plan moving forward. Last month, Google’s webspam team leader Matt Cutts said as much during the SMX Advanced conference in Seattle, and Google’s own Webmaster Central Blog confirmed the changes will be here very soon. A recent update told webmasters, “We plan to roll out several ranking changes in the near future that address sites that are misconfigured for smartphone users.”

It isn’t as if these changes are coming out of nowhere. Analysts have been encouraging site owners and SEO professionals to pay attention to their mobile sites for years, and mobile traffic increases show no signs of slowing down. So you would think most companies with a fair amount of resources would already be ahead of the curve, but a recent assessment run by mobile marketing agency Pure Oxygen Labs shows that most of the top 100 companies on the Fortune 500 list are actually in danger of Google penalties in the near future.

Pure Oxygen Labs used their proprietary diagnostic tools to evaluate sites against Google’s best-practice criteria, according to Search Engine Land. They wanted to see how many sites redirected smartphone users to mobile pages, how those redirects were configured, and how widely responsive design was actually being used to reach mobile users.

Only six of the 100 companies had sites that properly followed Google’s best practices. The report stated that 11 percent of the sites use responsive design techniques, while only 56 percent served any sort of content formatted for mobile users. That means 44 percent had absolutely nothing in the way of mobile-optimized sites or content.

The six that fully complied with Google’s policies included Google itself, which means only five outside companies were safe from future penalties at the moment.

There were multiple reasons sites were ill-equipped, but the most common problems were faulty redirects and a lack of responsive design, both issues Google has recently singled out as primary targets for its coming crackdown on poorly configured mobile sites. A rough way to spot the first of those problems is sketched below.
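As a purely illustrative, simplified sketch (the URL, User-Agent string, and heuristic are assumptions, not Pure Oxygen Labs' actual methodology), one way to spot the classic faulty-redirect pattern is to request a deep page with a smartphone User-Agent and check whether you get dumped on a mobile homepage instead of a mobile version of the same page:

```python
# Rough, illustrative check for the "faulty redirect" pattern: request a deep
# desktop URL with a smartphone User-Agent and see whether the final landing
# page is a homepage rather than a mobile version of the requested page.
# The URL and heuristic are placeholders; real-world detection is more involved.
import urllib.request
from urllib.parse import urlparse

SMARTPHONE_UA = (
    "Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) "
    "AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 Mobile/10A5376e Safari/8536.25"
)

def check_mobile_redirect(desktop_url: str) -> None:
    """Fetch a deep URL as a smartphone and flag redirects that land on a homepage."""
    request = urllib.request.Request(desktop_url, headers={"User-Agent": SMARTPHONE_UA})
    with urllib.request.urlopen(request, timeout=10) as response:
        final_url = response.geturl()   # URL after any redirects were followed
    requested_path = urlparse(desktop_url).path
    final_path = urlparse(final_url).path
    if final_url != desktop_url and final_path in ("", "/") and requested_path not in ("", "/"):
        print(f"Possible faulty redirect: {desktop_url} -> {final_url} (landed on homepage)")
    else:
        print(f"Looks OK: {desktop_url} -> {final_url}")

if __name__ == "__main__":
    check_mobile_redirect("https://www.example.com/products/blue-widgets")
```

A deep page that bounces smartphone visitors to a generic mobile homepage is exactly the kind of misconfiguration Google has said it plans to demote, so checks like this are worth running before the ranking changes arrive.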