It isn’t uncommon for webmasters or SEOs who operate numerous sites in a network to ask how many of them they can link together without bringing down the ax of Google. That question finally made its way to Matt Cutts, Google’s head of Webspam, who responded in one of his regular YouTube videos.

The question was phrased “should a customer with twenty domain names link it all together or not?” While blog networks can easily find legitimate reasons to link together twenty or more sites (though Cutts advises against it), the number in question is an interesting lens for discussing ordinary websites. As Cutts put it, “first off, why do you have 20 domain names? […] If it is all, you know, cheap-online-casinos or medical-malpractice-in-ohio, or that sort of stuff… having twenty domain names can look pretty spammy.”

When I think of networks with numerous full sites within them, I think of Gawker or Vice, two online news sources that spread their coverage across multiple sites, each focused on a distinct topic. For example, Vice also runs Motherboard, a tech-focused website, as well as Noisey, a site devoted to music. Gawker, on the other hand, runs Deadspin, Gizmodo, io9, Kotaku, and Jezebel, among a couple of others. Note that, at most, those networks run eight unique sites. There is little reason for any network of unique but connected sites to have many more parts than that.

However, there are times when having up to twenty distinct domain names could make sense without being spammy. Cutts points out that when you have many different domain names that are all localized versions of your site, it is okay to link to them. Even in that scenario, however, you shouldn’t stuff the links into the footer. The suggested fix is to place them in a drop-down menu where users can still reach them.
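As a rough illustration, here is a minimal sketch of that kind of drop-down; the domain names and the #locale-picker container are hypothetical placeholders, not anything Cutts or Google prescribes:

```typescript
// Hypothetical list of localized domains; swap in your own.
const localizedSites: Record<string, string> = {
  "United States": "https://example.com",
  "Germany": "https://example.de",
  "France": "https://example.fr",
};

// Build a <select> so visitors can reach each localized site on demand,
// instead of burying twenty links in the footer.
const picker = document.createElement("select");
for (const [label, url] of Object.entries(localizedSites)) {
  const option = document.createElement("option");
  option.textContent = label;
  option.value = url;
  picker.appendChild(option);
}

// Navigate to whichever locale the visitor picks.
picker.addEventListener("change", () => {
  window.location.href = picker.value;
});

document.querySelector("#locale-picker")?.appendChild(picker);
```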

Image Courtesy of Martin Pettitt

Despite telling us that Google would no longer confirm when new Panda updates occur, they announced today that they are rolling out a new update that is “more finely targeted” than the original release of the algorithm.

Unlike with many algorithm updates, most webmasters actually seem happy to see the new version, as some are already claiming recovery from the original algorithm’s effects.

Google has said that their plan is to release Panda algorithm updates monthly, rolled out over a ten-day period, but Matt Cutts, head of Google’s Webspam team, implied there was a delay for this refresh because they wanted to ensure the signals would be loosened up a little from the last release.

The official statement from Google simply says, “In the last few days we’ve been pushing out a new Panda update that incorporates new signals so it can be more finely targeted.”

Search Engine Journal says the update has resulted in

  • An increase in impressions but roughly the same click-through rates (viewable when logged into Google’s Webmaster Tools)
  • Informational sites such as Wikipedia and About.com have seen big impacts on their rankings
  • Authority sites are more prominent in SERPs.
  • Sites using Google+ are getting better rankings

Their suggestions for the future? It’s reaching the point where not using Google+ can hurt your site, and it is time to enable Google Authorship.

Google has been very clear about their stance on manipulative or deceptive behavior on websites. While they can’t tackle every shady practice sites have been enacting, they have set their sights on a few manipulative tactics they plan on taking down.

The first warning came when Google directly stated their intention to penalize sites who direct mobile users to unrelated mobile landing pages rather than the content they clicked to access. While that frustrating practice isn’t exactly manipulative, it is an example of sites redirecting users without their consent and can be terrible to try to get out of (clicking back often just leads to the mobile redirect page, ultimately placing you back at the page you didn’t ask for in the first place).
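To be clear about what Google wants instead, here is a minimal sketch of a deep-linked mobile redirect; it assumes a hypothetical m.example.com mirror and a crude user-agent check, so treat it as an illustration rather than a prescribed implementation:

```typescript
// Sketch: send phones to the mobile version of the page they asked for,
// not to a generic mobile landing page.
function mobileUrlFor(desktopUrl: string): string {
  const url = new URL(desktopUrl);
  url.hostname = "m.example.com"; // keep the path and query string intact
  return url.toString();
}

// Crude user-agent check, just for illustration.
const isPhone = /Mobi|Android/i.test(navigator.userAgent);

if (isPhone && window.location.hostname === "www.example.com") {
  // location.replace() doesn't add a history entry, so the back button
  // returns to the previous site instead of re-triggering the redirect.
  window.location.replace(mobileUrlFor(window.location.href));
}
```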

Now, Google is aiming at a similar tactic in which site owners insert fake pages into the browser history, so that when users attempt to exit, they are directed to a fake search results page that is entirely filled with ads or deceptive links, like the one below. It is a twist on the tactic that keeps placing users who try to exit back on the page they clicked to; the only way out is a flurry of clicks that ends up putting you much further back in your history than you intended. You may not have seen it yet, but it has been popping up more and more lately.

Fake Search Results

The quick upswing is probably what raised Google’s interest in the tactic. As Search Engine Watch explains, deceptive behavior on sites has pretty much always been against Google’s guidelines, and for them to issue a special warning to sites adopting this practice suggests it is spreading quickly among sites that are okay with pushing Google’s limits.

A website redesign can be unbelievably exciting, but it can also be dangerous to your traffic. If you don’t communicate well with your designers and creative teams, they can accidentally throw out all your hard work on optimization in favor of the purely visual aspects of the site. You can lose content, functionality, and all the other optimization that has won you the favor of search engines.

With a few considerations and regular contact with the design team, all of these problems can be prevented. Brad Miller pointed out seven factors you should consider when tackling a redesign. Just don’t get too eager to dive straight into changing how your site looks, and you can end up with a great-looking site that works as well as or better than your old design.

  1. Always start with research – Any design that is going to give you results is built on research. You need to know who you’re targeting, understand current standards and functionality best practices, and do extensive market research. This shouldn’t happen partway through the design or after the site is built. It should always be the very first move you make.
  2. A Redesign Changes Your Site Structure – A quality redesign can be much more than a new coat of paint on an old frame. It gives you the opportunity to change how your site is structured entirely, which should be used as a chance to optimize your site for visibility and conversions. Consider which pages are succeeding and which aren’t, and reassess how you can design your site more efficiently.
  3. Redirects – Before the redesign begins, you should make an inventory of every page and incoming link on your site, including subdomains. As the structure of your site changes, including the URLs, a strategy will need to be put in place for redirects to protect your SEO rankings. Audit where links are coming from and going to, then map out all your pages as well as their new redirects (see the sketch after this list).
  4. Navigation – You need to consider how people will find your site from the start and build that information into your URL structure. Can you shorten URLs or make them more streamlined? As sites grow, URLs can become unwieldy, and they should be trimmed as much as possible. Once you have people on your site, however, you need to understand how they will navigate around it. Where are they entering your site from? What do you want visitors to do? If you know how visitors navigate your site, you can design it to direct them where you want.
  5. How Is the Content Going to Be Presented? – Content is the keystone of a successful online marketing campaign, but it is still an afterthought in many site designs. Content should be visible and worth the attention of your viewers. Decide beforehand whether you will have a blog and how that blog is going to be used.
  6. Technical SEO – Way too many redesigns play with factors that need to be controlled for proper optimization. They build sites that look great but take ages to load, losing visitors and credibility with search engines. However, you can use the redesign to address behind-the-scenes factors, such as ensuring your site complies with the standard best practices of design and SEO and cleaning up your code so that search engine crawlers can easily understand your site.
  7. Testing – Test everything you can afford to. Not only do you gain invaluable data about your consumers and how your site is actually being used, but you get the chance to actively connect with customers and mold your new site to their needs.
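For the redirect inventory in point three, a bare-bones audit script can catch broken mappings before and after launch. This is only a sketch: it assumes Node 18+ for the built-in fetch, and the URL map is a hypothetical, hand-maintained list of old URLs and their new homes.

```typescript
// Hypothetical map of old URLs to the pages that should replace them.
const redirectMap: Record<string, string> = {
  "https://example.com/old-services": "https://example.com/services",
  "https://example.com/blog/old-post": "https://example.com/articles/old-post",
};

async function auditRedirects(): Promise<void> {
  for (const [oldUrl, expected] of Object.entries(redirectMap)) {
    // Don't follow the redirect; we want to inspect the status code
    // and Location header the server actually returns.
    const res = await fetch(oldUrl, { redirect: "manual" });
    const location = res.headers.get("location");

    if (res.status === 301 && location === expected) {
      console.log(`OK   ${oldUrl} -> ${location}`);
    } else {
      console.warn(`FIX  ${oldUrl}: got ${res.status} -> ${location ?? "no redirect"}`);
    }
  }
}

auditRedirects().catch(console.error);
```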

Anyone keeping track should know that Google isn’t afraid to shutter a beloved service or tool at their whim. We’re all still mourning the loss of Google Reader, but Eric Siu from Entrepreneur says we should also be gearing up to lose the popular Google AdWords External Keyword Research Tool.

The free tool for AdWords is commonly used by site owners to dig up statistics on keyword search volume, estimated traffic volume, and average cost per click, but its most loved capability was determining which specific keywords a site owner should target with their future optimization strategies and PPC campaigns.

Google hasn’t announced anything yet, so there isn’t a confirmed shutdown date or any known information, but rumors suggest it could happen anytime. Google has implied the AdWords tool will be folded into a new program referred to as the Keyword Planner, but it won’t necessarily be the same.

The External Keyword Research Tool essentially contained an assortment of disjointed workflows that gave site owners some freedom in how they used it, but the Keyword Planner has one explicit purpose – helping advertisers set up their new PPC ad groups and campaigns as quickly as possible. The Keyword Planner, however, doesn’t include ad share statistics or local search trends.

If the External Keyword Research Tool is at all a part of your PPC or SEO campaigns, you should likely begin getting to know the Keyword Planner now. You’ll have to eventually.

Bing has been regularly growing its market share over the past year, but don’t think it is at the expense of Google. In June, Bing’s share of all searches rose to 17.9 percent, but it was Yahoo that dropped, to 11.4 percent, according to comScore. Yahoo lost exactly as much search share as Bing gained, which may not have been what Yahoo CEO Marissa Mayer was hoping for when they signed the search deal with Microsoft.

Earlier this year, Mayer said, “One of the points of the alliance is that we collectively want to grow share rather than just trading share with each other. We need to see monetization working better because we know that it can and we’ve seen other competitors in the space illustrate how well it can work.”

Meanwhile, as Search Engine Watch reports, Google has held steady with exactly two-thirds of the market share, though it is down .1 percent from last year’s June share of 66.8 percent.

In 2012, Bing held 15.6 percent of the market, but they have been making regular gains, almost exclusively at the expense of smaller search engines. Yahoo on the other hand is at an all-time low, down from 13 percent last year.

Any SEO professional who has been around for a couple of design trends knows what it’s like to bump heads with designers over design methods and usability. Certain innovative design trends can do wonders for usability but are completely at odds with standard SEO practices. According to Janet Driscoll Miller, that doesn’t mean we have to throw out one or the other; we just have to be creative with our solutions for integrating creative design.

Parallax design is the most popular trend that runs into this conflict between usability and SEO. It has actually been around for a few years, but it has recently gained quite a bit of notoriety as designers have used it to animate pages so that scrolling changes what the entire page displays. It’s really easier to show people than to describe.

The most commonly cited example of this design style is the Spotify front page, which essentially moves layers as the viewer scrolls downward.

Spotify Screenshot

What makes parallax design so popular is that it basically allows the site to walk a visitor through a story by scrolling down the page. Google has even used the style for its big “How Search Works” site, which tries to explain how search works to the average internet user. It directs how visitors view the site, rather than letting visitors click around at will.

The big problem is that parallax designs are essentially extremely large one-page websites, which are very difficult to optimize for many search terms. All of your keywords have to be concentrated onto one page, rather than spread out across many as Google is used to. On top of that, inbound links to your site all point to a single page, not specific content.

Another interesting problem is that parallax design doesn’t work on mobile phones of any kind. As mobile traffic rises, that means more and more people aren’t able to use pages in this style. It also means site owners have to basically create an entire separate mobile version of their site. Many companies already do this, as Google did for the “How Search Works” site. Until responsive design popped up, it was common practice to build a second mobile site.
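One common workaround, sketched below, is to wire up the parallax scroll handler only on wider viewports so phones fall back to a plain, static page; the .hero selector and breakpoint here are hypothetical, not taken from Spotify or Google.

```typescript
// Sketch: a simple parallax effect that is only active on wider screens.
function applyParallax(): void {
  const hero = document.querySelector<HTMLElement>(".hero");
  if (!hero) return;
  // Move the hero layer at half the scroll speed for a basic depth effect.
  hero.style.transform = `translateY(${window.scrollY * 0.5}px)`;
}

const wideScreen = window.matchMedia("(min-width: 768px)");

function toggleParallax(enabled: boolean): void {
  if (enabled) {
    window.addEventListener("scroll", applyParallax);
  } else {
    window.removeEventListener("scroll", applyParallax);
  }
}

toggleParallax(wideScreen.matches);
// Re-check if the viewport crosses the breakpoint (e.g., on rotation).
wideScreen.addEventListener("change", (event) => toggleParallax(event.matches));
```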

None of this means we should immediately cut out parallax design. As stated before, parallax design is unparalleled at telling stories, and some sites don’t have to rely heavily on SEO to drive traffic. There is also an approach that lets you combine parallax design with a multi-page site by creating accompanying sub-pages, as Spotify did. The home page is a parallax design, but the links take you to content on separate, static pages. That creates static URLs for different pieces of content and allows keywords to be spread out more.

Deciding whether or not to hop on these trends all depends on what you intend to achieve with your site. If you intend to tell a story or direct visitors through your site in a linear fashion, parallax design is possibly the best answer.

What makes one SEO company successful and another fail? There could be a multitude of factors, but according to a recent study by Ascend2, chances are social integration is the key. Successful SEO companies integrate social media into their SEO plans and strategies far more than companies who report they are struggling.

Ascend2, a popular research agency, surveyed nearly 600 businesses and marketing professionals from around the world and asked the participants to rate their own companies’ SEO success. The survey then compared the answers from the 15 percent who rated their companies as “very successful” with SEO and the 18 percent who reported being “not successful.” This is a relatively small sample size, but the findings are still interesting, and as Matt McGee from Search Engine Land suggests, they would likely hold up with larger samples.

By far the biggest difference between the companies is their use of social media within their SEO strategy. Their charts show that 38 percent of those reporting “very successful” with SEO are doing extensive social integration, while only two percent of the “not successful” companies say they are. On top of that, a full 50 percent of the “not successful” companies report doing no social media integration at all. Frighteningly, when looking at the results of all those surveyed, almost a quarter of the companies said they were not integrating social media into their SEO strategies.

Ascend2 Chart

The full report is available for free, though you do have to give your contact information. You would think that at this point most SEO professionals would be aware of how important social media is to their SEO strategies. These results, however, show just how many companies are working with strategies that are behind the times and dragging them down.

On Wednesday, Google, Gmail, YouTube, and other Google services went unresponsive for roughly an hour in many parts of the United States. The problem was quickly resolved, but not before Twitter freaked out and the story reached many news outlets.

Now, Matt Cutts, Google’s head of Webspam, has used his Webmaster Chat to answer the big question site owners who have gone through similar experiences have often wondered about: if your site goes down temporarily, does it affect your rankings?

According to Cutts, having a site go offline shouldn’t negatively impact your rankings, so long as you fix the problem quickly. Obviously, Google wants to direct searchers to sites that are working, so if a site has been offline for days, it makes sense for Google to replace it with a working, relevant site. But Google isn’t so quick to cut out an offline site.

Once Google notices your site is offline, they will attempt to notify those registered with Google Webmaster Tools that their site is unreachable. The messages generally say something along the lines of GoogleBot not being able to access the site.

Then, roughly 24 hours after Google has noticed your site isn’t working, they will come back to check its status. This means that a site can be offline for roughly a full day before you can expect any negative effects from the search engines. However, if your site has been down for 48 hours or more, chances are Google is going to delist it, at least temporarily.

Search Engine Land pointed out that there are also tools available to monitor sites for you and alert webmasters if their site becomes unavailable. They suggest the free service Pingdom, though there are plenty of others to choose from.
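If you would rather roll your own check than rely on a third-party service, a bare-bones monitor might look like the sketch below; it assumes Node 18+ for the built-in fetch, and the URL and interval are placeholders.

```typescript
// Bare-bones uptime check: request the homepage every five minutes
// and log a warning whenever it doesn't answer with a 2xx status.
const SITE_URL = "https://example.com"; // placeholder
const CHECK_INTERVAL_MS = 5 * 60 * 1000;

async function checkSite(): Promise<void> {
  try {
    const res = await fetch(SITE_URL, { method: "HEAD" });
    if (!res.ok) {
      console.warn(`${new Date().toISOString()} ${SITE_URL} returned ${res.status}`);
    }
  } catch (err) {
    // Network-level failure: DNS error, timeout, connection refused, etc.
    console.error(`${new Date().toISOString()} ${SITE_URL} unreachable:`, err);
  }
}

checkSite();
setInterval(checkSite, CHECK_INTERVAL_MS);
```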

When companies take the leap to establishing their brand’s reputation online, the focus is always on taking advantage of every opportunity Google gives you to try to connect with potential consumers.

However, any SEO or online business who is only paying attention to Google isn’t completely controlling their online reputation. Online reputation management requires understanding a complex ecosystem of sites where users are able to connect with your brand, and those include other search engines, social media, local search platforms such as Yelp, and business accreditation sites like those for the Better Business Bureau.

Of course, taking control of the first page of Google is the best first step for a company hoping to take the reins of its online brand, but it isn’t the only step. Google controls roughly two-thirds of all search traffic, but that also means you’re missing out on the other third of the marketplace if you stop there.

The second most popular search engine is Bing, and they’ve been making notable gains lately, rising to 17.4 percent of the market share from 13 percent last year. Microsoft has been marketing Bing rather strongly and it is clear the search engine will only keep gaining ground for the near future. Once you’ve taken control of the first page of Google, George Fischer suggests trying to capitalize on the often forgotten market of Bing, and he explains how you can do so in his article for Search Engine Watch.