Internet security and privacy have been at the forefront of many people’s minds with the recent headlines about the NSA keeping data on the public’s online activity, and the issue has had subtle effects on search engines. We’ve seen a small group of searchers migrating to search engines with stricter privacy policies. You might expect those who are truly outraged by the NSA news to make for a pretty large shift, but so far the change has been slow. It is, however, picking up momentum.

More and more people are learning about how Google actually decides which results to show you, as an individual, and many are a little concerned. While Google sees the decision to collect data on users as an attempt to individually tailor results, more than a few raise their eyebrows at the idea that a search engine run by a huge corporation is keeping fairly detailed tabs on the internet activities of its users. The internet comes with an assumption that our activity is at least fairly private, though that notion is getting chipped away at daily. There is still a widespread assumption that our e-mails and simple search habits are our business alone, an assumption that is also being proven wrong.

These privacy issues have a fair number of people looking for search engines that keep searches completely anonymous and don’t run data collection processes. The most notable option people seem to be moving to is DuckDuckGo.com, a search engine whose privacy policy promises it will not retain any personal information or share that information with other sites. The search engine has seen a traffic increase of close to 2 million searches per day since the NSA scandal broke.

There are numerous debates surrounding these issues. Political discourse focuses on the legality and ethics of the government and large corporations working together to collect information on every citizen of the United States (other companies named in the NSA story include Yahoo, Facebook, and Microsoft). But, as SEO professionals, the bigger question for us is the ethical and practical reality of individually tailored results that rely entirely on data collection.

If you’ve ever taken a look at the ads on the edges of websites, you’ve probably noticed that they are loosely based on your personal information. The ads reflect your gender, age, location, and sometimes bits of your search history. They are chosen based on information your computer relays to almost every site you access. Google works the same way, but it collects this data and combines it with extended data from your search history to deliver search results it believes are more relevant to you.

There is a practicality to this. We all have fine-tuned personal tastes, and we innately want search engines to show us exactly what we want with the first result, every time. While poll responses say that the majority of people don’t want personalized search results, our online actions reveal our true desire for efficient search. The best way to deliver that is to gather data and use it to fine-tune results. On a broad scale, we don’t want results for a grocery store in Los Angeles when we are physically situated in Oklahoma. On a smaller scale, we don’t want Google showing us sites we never visit when our favorite resource for a topic is a few results down the page.

In this respect, the move toward search engines like DuckDuckGo is actually a step back. These privacy-focused search engines essentially act the way Google used to: they use no personal information and simply try to show the best results for a given query. It is a trade of functionality for privacy, which could explain the slow migration to these types of search engines. But people are moving.

The longer the NSA story stays in the news, the more searches DuckDuckGo receives, and this could have a significant effect on the search market in the future. The question is, do we want to sacrifice personal privacy and assumed online anonymity for searches that match our lives? Andrew Lazaunikas recently wrote an article on the debate for Search Engine Journal. He admits DuckDuckGo delivers excellent, unbiased results, but in the end, “when I want to know the best pizza place or car dealer in my area, the local results that Google and Bing shows are superior.”

Lazaunikas isn’t deterred by this, noting, “I can still get the information I need from DuckDuckGo by modifying my search.” He ends by vowing to use DuckDuckGo more in the future, but the question is whether the public at large will follow. For the moment, it seems as though most people prefer quick, easy searches and familiarity over trying out these new search engines.

After two fairly explicit warnings about advertorials this year, Google has added advertorials, along with other popular spammy linking techniques, to the Link Schemes help document in its webmaster guidelines.

Google Continues To Downplay Links

The biggest change is the removal of the entire first paragraph from the help article, which addressed how incoming links influence rankings. Search Engine Journal says the removed paragraph read:

Your site’s ranking in Google search results is partly based on analysis of those sites that link to you. The quantity, quality, and relevance of links influences your ranking. The sites that link to you can provide context about the subject matter of your site, and can indicate its quality and popularity.

Links have been steadily falling out of favor over the past few years, and it appears we are finally reaching a tipping point in Google’s reduction of links’ role in its search algorithms. Or, as Google has been advising, high-quality sites matter much more than links of any quality.

Keyword-Rich/Optimized Anchor Text Links

Google also tackled heavily optimized anchor text in press releases that are distributed across other sites. The technique has enjoyed a quick rise in highly competitive markets, and Google appears to finally be squashing the practice. They did note that guest posting is still a popular practice that can be valuable when done correctly; however, sites that accept guest posts have been using nofollow or plain, non-optimized URL links to avoid issues.

Advertorials

And of course, the final change is the addition of advertorials as an example of unnatural links that violate Google guidelines.

Advertorials or native advertising where payment is received for articles that include links that pass PageRank.

Google has been making swift changes to linking policy and practice, so it is highly likely changes like this will keep occurring. Links can still be a strong weapon in your SEO strategy, but you have to tread carefully, and they maybe shouldn’t be your highest priority when optimizing.

Google has begun the process of pushing the last few stragglers over to AdWords enhanced campaigns, and to reflect the big changes taking place, it has also been updating just about everything related to AdWords. Over the past week, Google has redesigned the AdWords Help Center and made some changes to how AdWords Quality Scores are reported.

AdWords Help Center Redesign

AdWords Help Center Graphic

The AdWords Help Center has always been an important resource for both new and experienced PPC campaign managers. Just as Google offers best practices for SEO, the Help Center breaks down exactly how managing ads works and offers suggestions for those just getting started. The redesign came with three major updates aimed at improving how the Help Center works and updating the information it contains.

  1. Improved Navigation – To start, Google has made the site much easier to get around, making information more readily available. From the main navigation, you can now find portals to setup and basic AdWords information, ad management, community resources, and guides to success.
  2. More Visual Help – Google has openly said it will be making the Help Center more visual by filling it with infographics and screenshots. But the Search Engine Journal report on the update found very few visual additions. It is possible these additions are taking longer to implement, or that Google has stepped back from the plan, but there are some new graphics to help explain AdWords, such as the one above.
  3. Guides to Success – Google has added a collection of instructional guides and tips to help get greener PPC managers started with their AdWords campaigns, but the information can also provide a helpful refresher for AdWords veterans who might not have checked up on Google’s latest suggestions.

Quality Score Reporting Revisions

The more functional change Google has made is an update to how AdWords Quality Scores are reported within accounts. The company says these changes are aimed at making it easier for advertisers to adjust and revise ads based on Quality Score, and at giving users more information on what is and isn’t working.

In their announcement, Google said:

As part of our ongoing efforts to help improve the quality of our ads, we’re announcing an update that changes how each keyword’s 1-10 numeric Quality Score is reported in AdWords. Under the hood, this reporting update will tie your 1-10 numeric Quality Score more closely to its three key sub factors — expected clickthrough rate, ad relevance, and landing page experience. We expect this update to reach all advertisers globally within the next several days.
We’re making this change so that the Quality Score in your reports more closely reflects the factors that influence the visibility and expected performance of your ads. We hope that providing you more transparency into your 1-10 Quality Score will help you improve the quality of your ads.

The way Google calculates Quality Scores hasn’t changed at all, so there isn’t a great need to suddenly change how you’re running your campaigns; Google is simply changing the way these scores are reported and expanding the information available.

However, advertisers who use Quality Scores in automated rules will need to adjust how those rules interface with the new reporting.

Duplicate content has always been viewed as a serious no-no for webmasters and search engines. In general, it is associated with spamming or low-quality content, and thus Google usually penalizes sites with too much duplicate content. But, what does that mean for necessary duplicate content like privacy policies, terms and conditions, and other types of legally required content that many websites must have?

This has been a reasonable point of confusion for many webmasters, and those in the legal and financial sectors especially find themselves concerned that their sites could be hurt by the sheer number of disclaimers they carry.

Well of course Matt Cutts is here to sweep away all your concerns. He used his recent Webmaster Chat video to address the issue, and he clarified that unless you’re actively doing something spammy like keyword stuffing within these sections of legalese, you shouldn’t worry about it.

He said, “We do understand that a lot of different places across the web require various disclaimers, legal information, and terms and conditions, that sort of stuff, so it’s the sort of thing where if we were to not rank that stuff well, that would hurt our overall search quality. So, I wouldn’t stress out about that.”

It isn’t uncommon for webmasters or SEOs who operate numerous sites in a network to ask how many of them they can link together without bringing down the ax of Google. That question finally made its way to Google’s head of webspam, Matt Cutts, who responded in one of his regular YouTube videos.

The question was phrased, “should a customer with twenty domain names link it all together or not?” While blog networks can easily find legitimate reasons to link together twenty or more sites (though Cutts advises against it), it is interesting to apply that number to ordinary websites. As Cutts put it, “first off, why do you have 20 domain names? […] If it is all, you know, cheap-online-casinos or medical-malpractice-in-ohio, or that sort of stuff… having twenty domain names can look pretty spammy.”

When I think of networks with numerous full sites within them, I think of Gawker or Vice, two online news sources that spread their content across multiple sites, each focused on its own topics. For example, Vice also runs Motherboard, a tech-focused website, as well as Noisey, a site devoted to music. Gawker, on the other hand, runs Deadspin, Gizmodo, io9, Kotaku, and Jezebel, among a few others. Note that, at most, those networks run eight unique sites. There is little reason for any network of unique but connected sites to have more parts than that.

However, there are times when having up to twenty distinct domain names can make sense without being spammy. Cutts points out that when your domains are all localized versions of your site, it is fine to link them together. Even in that scenario, however, you shouldn’t link them all in the footer; the suggested fix is to place them in a drop-down menu where users can access them.

Yesterday was the big day. July 22 marked the deadline for the roughly 2 million AdWords campaigns that had held out on converting to enhanced campaigns and would now be automatically upgraded. Google had plainly stated that yesterday was a hard deadline for the last 25 percent of AdWords users to migrate, but, as per usual, the process will actually occur over a longer period.

In an Inside AdWords blog post about the change, Google explained, “…starting today, we will begin upgrading all remaining campaigns automatically, bringing everyone onto the new AdWords platform. As with many product launches, the rollout will be gradually completed over several weeks.”

The forced upgrade brings about quite a few changes in how you should manage your campaigns, and to help everyone get started, Search Engine Watch brought together a group of professionals in the field to offer their advice.

Google also offered its own suggestions:

  1. Review your mobile bid adjustments – For most campaigns, the auto-upgrade default is based on bids from similar advertisers. You will need to visit the ‘Settings’ tab to optimize for your business.
  2. Identify unwanted keyword duplication in overlapping campaigns – If you previously ran similar legacy campaigns for each device type, it is suggested you identify matching campaigns and remove any unwanted duplicate keywords in the enhanced campaign (a minimal sketch of one way to spot duplicates follows this list).
  3. Review Display Network campaigns – You will want to verify that your display ads are reaching users on all desired devices and that you are using the correct bidding strategies.
  4. Explore the Enhanced Campaign features – It is recommended you try out upgraded sitelinks and upgraded call extensions to start. Then you can further boost results by creating mobile preferred ads and setting bid adjustments for location and time.
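
For the keyword-duplication step above, here is a minimal, hypothetical sketch of the idea in TypeScript; the campaign names and keyword lists are invented, and in practice you would export your real keyword lists from AdWords before comparing them.

```typescript
// Hypothetical data: legacy per-device campaigns that now overlap in a single
// enhanced campaign. The names and keywords below are made up for illustration.
const legacyCampaigns: Record<string, string[]> = {
  "Brand - Desktop": ["blue widgets", "buy blue widgets", "widget store"],
  "Brand - Mobile": ["blue widgets", "widget store", "widget store near me"],
};

// Map each keyword to the campaigns it appears in.
const seenIn = new Map<string, string[]>();
for (const [campaign, keywords] of Object.entries(legacyCampaigns)) {
  for (const keyword of new Set(keywords)) {
    seenIn.set(keyword, [...(seenIn.get(keyword) ?? []), campaign]);
  }
}

// Report keywords that show up in more than one legacy campaign, since those
// are the candidates to prune from the merged enhanced campaign.
for (const [keyword, campaigns] of seenIn) {
  if (campaigns.length > 1) {
    console.log(`Duplicate keyword "${keyword}" in: ${campaigns.join(", ")}`);
  }
}
```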

Image Courtesy of Martin Pettitt

Despite telling us that it would no longer confirm when new Panda updates occur, Google announced today that it is rolling out a new Panda update that is “more finely targeted” than earlier releases.

Unlike with many algorithm refreshes, most webmasters actually seem happy to see the new version, as some are already reporting recovery from the original Panda algorithm.

Google has said that its plan is to release Panda algorithm updates monthly, rolled out over a ten-day period, but Matt Cutts, head of Google’s webspam team, implied there was a delay for this refresh because they wanted to ensure the signals would be loosened up a little from the last release.

The official statement from Google simply says, “In the last few days we’ve been pushing out a new Panda update that incorporates new signals so it can be more finely targeted.”

Search Engine Journal says the update has resulted in:

  • An increase in impressions but the same CTR (viewable when logged into Google’s Webmaster Tools)
  • Big ranking impacts for informational sites such as Wikipedia and About.com
  • Authority sites appearing more prominently in SERPs
  • Better rankings for sites using Google+

Their suggestions for the future? It’s reaching the point where not using Google+ can hurt your site, and it is time to enable Google Authorship.

Google has been very clear about its stance on manipulative or deceptive behavior on websites. While it can’t tackle every shady practice sites have been enacting, it has narrowed its sights on a few manipulative tactics it plans to take down.

The first warning came when Google directly stated their intention to penalize sites who direct mobile users to unrelated mobile landing pages rather than the content they clicked to access. While that frustrating practice isn’t exactly manipulative, it is an example of sites redirecting users without their consent and can be terrible to try to get out of (clicking back often just leads to the mobile redirect page, ultimately placing you back at the page you didn’t ask for in the first place).

Now, Google is aiming at a similar tactic where site owners insert fake pages into the browser history, so that when users attempt to exit, they are directed to a fake search results page filled entirely with ads or deceptive links, like the one below. It is basically a twist on the tactic that keeps placing users who try to leave back on the page they clicked to; the only way out is a flurry of clicks that ends up putting you much further back in your history than you intended. You may not have seen it yet, but it has been popping up more and more lately.

Fake Search Results
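
To make the mechanism concrete, here is a minimal, hypothetical sketch of how a page can abuse the browser’s History API in the way described above. The helper name is made up, and the snippet exists only to illustrate what Google is now penalizing, not as something anyone should deploy.

```typescript
// Illustrative only: this is the kind of history manipulation Google is
// warning about. pushState() adds entries to the session history without
// loading a new page, so the Back button lands on an in-page state instead of
// the site the visitor actually came from.
function renderFakeResultsPage(): void {
  // Hypothetical placeholder for the ad-filled "search results" the article
  // describes; an offending site would inject ads and deceptive links here.
  document.body.innerHTML = "<h1>Search results</h1><p>(ads would go here)</p>";
}

// Stack a few extra entries behind the current page.
for (let i = 1; i <= 3; i++) {
  history.pushState({ step: i }, "", `#results-${i}`);
}

// When the visitor presses Back, they hit one of the injected entries and the
// page swaps in the fake results instead of letting them leave.
window.addEventListener("popstate", () => {
  renderFakeResultsPage();
});
```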

The quick upswing is probably what raised Google’s interest in the tactic. As Search Engine Watch explains, deceptive behavior has pretty much always been against Google’s guidelines, and for Google to issue a special warning suggests the practice is spreading quickly among sites willing to push Google’s limits.

Anyone keeping track knows that Google isn’t afraid to shutter a beloved service or tool at its whim. We’re all still mourning the loss of Google Reader, but Eric Siu from Entrepreneur says we should also be gearing up to lose the popular AdWords External Keyword Research Tool.

The free AdWords tool is commonly used by site owners to dig up statistics on keyword search volume, estimated traffic, and average cost per click, but its most loved capability was determining which specific keywords a site owner should target with future optimization strategies and PPC campaigns.

Google hasn’t announced anything yet, so there isn’t a confirmed shutdown date or any other known information, but rumors suggest it could happen at any time. Google has implied the tool will be folded into a new program referred to as the Keyword Planner, but it won’t necessarily be the same.

The External Keyword Research Tool essentially contained an assortment of disjointed workflows, which gave site owners some freedom in how they used it, but the Keyword Planner has one explicit purpose – helping advertisers set up new PPC ad groups and campaigns as quickly as possible. The Keyword Planner also doesn’t include ad share statistics or local search trends.

If the External Keyword Research Tool is at all a part of your PPC or SEO campaigns, you should likely begin getting to know the Keyword Planner now. You’ll have to eventually.

On Wednesday, Google search, Gmail, YouTube, and other Google services went unresponsive for roughly an hour in many parts of the United States. The problem was quickly resolved, but not before Twitter freaked out and the story reached many news outlets.

Now, Google’s head of webspam has used his Webmaster Chat to answer the big question that site owners who have gone through similar experiences often ask: if your site goes down temporarily, does it affect your rankings?

According to Cutts, having a site go offline shouldn’t negatively impact your rankings, so long as you fix the problem quickly. Obviously, Google wants to be directing searchers to sites that are working, so if a site has been offline for days, it makes sense for Google to replace it with a working relevant site. But, Google isn’t so quick to cut out an offline site.

Once Google notices your site is offline, they will attempt to notify those registered with Google Webmaster Tools that their site is unreachable. The messages generally say something along the lines of GoogleBot not being able to access the site.

Then, roughly 24 hours after Google has noticed your site isn’t working, it will come back to check the status of your site. This means a site can be offline for roughly a full day or more before you can expect any negative effects from the search engines. However, if your site has been down for 48 hours or more, chances are Google is going to delist it, at least temporarily.

Search Engine Land pointed out that there are also tools available to monitor sites for you and alert webmasters if their site becomes unavailable. They suggest the free service Pingdom, though there are plenty of others to choose from.
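
If you want a rough sense of how such a monitor works, here is a minimal sketch, assuming a Node 18+ runtime with the built-in fetch; the URL and timeout are placeholders, and a real monitor (or a service like Pingdom) would run the check on a schedule and send an alert instead of just logging.

```typescript
// Minimal availability check (a sketch, not Pingdom's actual service or API):
// fetch the homepage and treat a non-success status, a timeout, or a network
// error as a sign the site may be down.
const SITE_URL = "https://www.example.com/"; // placeholder URL
const TIMEOUT_MS = 10_000;

async function checkSite(url: string): Promise<void> {
  try {
    const res = await fetch(url, {
      redirect: "follow",
      signal: AbortSignal.timeout(TIMEOUT_MS),
    });
    if (res.ok) {
      console.log(`${url} is up (HTTP ${res.status})`);
    } else {
      console.warn(`${url} responded with HTTP ${res.status} -- worth a look`);
    }
  } catch (err) {
    console.error(`${url} appears to be down or unreachable:`, err);
  }
}

checkSite(SITE_URL);
```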