A website redesign can be unbelievably exciting, but it can also be dangerous to your traffic. If you don’t communicate well with your designers and creative teams, they can accidentally throw out all your hard work on optimization in favor of purely visual aspects of the site. You can lose content, functionality, and all the other optimization that has won you the favor of search engines.

With a few considerations and regular contact with the design team, all of these problems can be prevented. Brad Miller pointed out seven factors you should consider when tackling a redesign. If you don’t get too eager and delve straight into changing how your site looks, you can end up with a great-looking site that works as well as or better than your old design.

  1. Always Start With Research – Any design that is going to give you results is built on research. You need to know who you’re targeting, what the best functionality practices are, and what the current standards are, and you need to do extensive market research. This shouldn’t happen partway through the design or after the site is built. It should always be the very first move you make.
  2. A Redesign Changes Your Site Structure – A quality redesign can be much more than a new coat of paint on an old frame. It gives you the opportunity to change how your site is structured entirely, which should be used as a chance to optimize your site for visibility and conversions. Consider which pages are succeeding and which aren’t, and reassess how you can efficiently structure your site.
  3. Redirects – Before the redesign begins, you should make an inventory of every page and incoming link on your site, including subdomains. As the structure of your site changes, including the URLs, a redirect strategy will need to be put in place to protect your SEO rankings. Audit where links are coming from and going to, then map out all your pages along with their new redirects (see the sketch after this list).
  4. Navigation – You need to consider how people will find your site from the start, and build that information into your URL structure. Can you shorten URLs or make them more streamlined? As sites grow, URLs can become unwieldy, and they should be trimmed as much as possible. Once you have people on your site, however, you need to understand how they will navigate around it. Where are they entering your site from? What do you want visitors to do? If you know how visitors navigate your site, you can design it to direct them where you want.
  5. How Is the Content Going to Be Presented? – Content is the keystone of a successful online marketing campaign, but it is still an afterthought in many site designs. Content should be visible and worth the attention of your viewers. Decide beforehand whether you will have a blog and how that blog is going to be used.
  6. Technical SEO – Way too many redesigns play with factors that need to be controlled for proper optimization. They build sites that look great but take ages to load, losing visitors and credibility with search engines. However, you can use the redesign to improve some behind-the-scenes factors: ensuring your site complies with the standard best practices of design and SEO, and cleaning up your code so search engine crawlers can easily understand your site.
  7. Testing – Test everything you can afford to. Not only do you gain invaluable data about your consumers and how your site is actually being used, but you get the chance to actively connect with customers and mold your new site to their needs.
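To make the redirect mapping in item 3 concrete, here is a minimal sketch in Python (using the third-party requests library) of how you might verify a redirect map after launch. The CSV file name and its two-column old-URL/new-URL layout are assumptions for illustration; adapt them to however your inventory is exported.

```python
# A minimal redirect-map check, assuming a CSV with columns old_url,new_url.
# The file name and layout are hypothetical; adjust to your own export.
import csv

import requests  # third-party: pip install requests


def check_redirects(mapping_csv: str) -> None:
    """Verify that each old URL 301-redirects to its planned new URL."""
    with open(mapping_csv, newline="") as f:
        for old_url, new_url in csv.reader(f):
            # Don't follow redirects, so we can inspect the first hop directly.
            resp = requests.get(old_url, allow_redirects=False, timeout=10)
            location = resp.headers.get("Location")
            if resp.status_code != 301:
                print(f"{old_url}: expected a 301, got {resp.status_code}")
            elif location != new_url:
                print(f"{old_url}: redirects to {location}, expected {new_url}")


if __name__ == "__main__":
    check_redirects("redirect_map.csv")
```

Running a check like this against your full inventory before and after launch is a cheap way to catch redirects that were mapped but never actually implemented.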

Anyone keeping track should know that Google isn’t afraid to shutter a beloved service or tool at its whim. We’re all still mourning the loss of Google Reader, but Eric Siu from Entrepreneur says we should also be gearing up to lose the popular Google AdWords External Keyword Research Tool.

The free AdWords tool is commonly used by site owners to dig up statistics on keyword search volume, estimated traffic volume, and average cost per click, but its most loved capability was determining which specific keywords a site owner should target with their future optimization strategies and PPC campaigns.

Google hasn’t announced anything yet, so there isn’t a confirmed shutdown date or much other information, but rumors suggest it could happen at any time. Google has implied the AdWords tool will be folded into a new program referred to as the Keyword Planner, but it won’t necessarily be the same.

The External Keyword Research Tool essentially contained an assortment of disjointed workflows, which gave site owners some freedom in how they used the tool, but the Keyword Planner has one explicit purpose – helping advertisers set up their new PPC ad groups and campaigns as quickly as possible. The Keyword Planner also doesn’t include ad share statistics or local search trends.

If the External Keyword Research Tool is at all a part of your PPC or SEO campaigns, you should likely begin getting to know the Keyword Planner now. You’ll have to eventually.

Bing has been steadily growing its market share over the past year, but don’t think it is at the expense of Google. In June, Bing’s share of all searches rose to 17.9 percent, but it was Yahoo that dropped, to 11.4 percent, according to comScore. Yahoo lost exactly as much search share as Bing gained, which may not have been what Yahoo CEO Marissa Mayer was hoping for when the company signed its search deal with Microsoft.

Earlier this year, Mayer said, “One of the points of the alliance is that we collectively want to grow share rather than just trading share with each other. We need to see monetization working better because we know that it can and we’ve seen other competitors in the space illustrate how well it can work.”

Meanwhile, as Search Engine Watch reports, Google has held steady at exactly two-thirds of the market, down 0.1 points from last June’s share of 66.8 percent.

In 2012, Bing held 15.6 percent of the market, but it has been making regular gains, almost exclusively at the expense of smaller search engines. Yahoo, on the other hand, is at an all-time low, down from 13 percent last year.

Any SEO professional who has been around for a couple of design trends knows what it’s like to butt heads with designers over design methods and usability. There are certain innovative design trends that can do wonders for usability but are completely at odds with standard SEO practices. According to Janet Driscoll Miller, that doesn’t mean we have to sacrifice one for the other; we just have to be creative with the solutions we use to integrate innovative design.

Parallax design is the most popular trend that runs into this tension between usability and SEO. It has actually been around for a few years, but it has recently gained quite a bit of attention as designers have used it to animate pages so that scrolling changes what the entire page is showing. It’s really easier to show people than to describe.

The most commonly cited example of this design style is the Spotify front page, which essentially moves layers as the viewer scrolls downward.

Spotify Screenshot

What makes parallax design so popular is that it allows a site to walk visitors through a story as they scroll down the page. Google has even used the style for its big “How Search Works” site, which tries to explain search to the average internet user. It directs how visitors view the site, rather than letting them click around at will.

The big problem is that parallax designs are essentially very large one-page websites, which are extremely difficult to optimize for many search terms. All of your keywords have to be concentrated on one page, rather than spread out across many as Google expects. On top of that, inbound links to your site all point to a single page, not to specific content.

Another interesting problem is that parallax design doesn’t work on mobile phones of any kind. As mobile traffic rises, that means more and more people are unable to use pages in this style, and site owners have to create an entirely separate mobile version of their site. Many companies already do this, as Google did for the “How Search Works” site; until responsive design came along, building a second mobile site was common practice.

None of this means we should immediately write off parallax design. As stated before, parallax design is unparalleled at telling stories, and some sites don’t have to rely heavily on SEO to drive traffic. There is also an approach that lets you combine parallax design with a multi-page site by creating accompanying sub-pages, as Spotify did. The home page is a parallax design, but its links take you to content on separate, static pages. That creates static URLs for different pieces of content and allows keywords to be spread out more.

Deciding whether or not to hop on these trends all depends on what you intend to achieve with your site. If you intend to tell a story or direct visitors through your site in a linear fashion, parallax design is possibly the best answer.

What makes one SEO company successful and another fail? There could be a multitude of factors, but according to a recent study by Ascend2, chances are social integration is the key. Successful SEO companies integrate social media into their SEO plans and strategies far more than companies who report they are struggling.

Ascend2, a popular research agency, surveyed nearly 600 businesses and marketing professionals from around the world and asked the participants to rate their own companies’ SEO success. The survey then compared the answers from the 15 percent who rated their companies as “very successful” with SEO against the 18 percent who reported being “not successful.” That makes for a relatively small sample, but the findings are still interesting, and as Matt McGee from Search Engine Land suggests, they would likely hold up with larger samples.

By far the biggest difference between the companies is their use of social media within their SEO strategy. The charts show that 38 percent of the “very successful” companies are doing extensive social integration, while only two percent of the “not successful” companies say they are. On top of that, a full 50 percent of the “not successful” companies report doing no social media integration at all. Frighteningly, across all those surveyed, almost a quarter of companies said they were not integrating social media into their SEO strategies.

Ascend2 Chart

The full report is available for free, though you do have to give your contact information. You would think that by now most SEO professionals would be aware of how important social media is to their SEO strategies. These results, however, show just how many companies are working with strategies that are behind the times and dragging their companies down.

On Wednesday, Google, Gmail, YouTube, and the company’s other services went unresponsive for roughly an hour in many parts of the United States. The problem was quickly resolved, but not before Twitter freaked out and the story reached many news outlets.

Now, Matt Cutts, Google’s head of webspam, has used his Webmaster Chat to answer the big question that site owners who have gone through similar experiences have often wondered about: if your site goes down temporarily, does it affect your rankings?

According to Cutts, having a site go offline shouldn’t negatively impact your rankings, so long as you fix the problem quickly. Obviously, Google wants to direct searchers to sites that are working, so if a site has been offline for days, it makes sense for Google to replace it with a working, relevant site. But Google isn’t so quick to cut out an offline site.

Once Google notices your site is offline, they will attempt to notify those registered with Google Webmaster Tools that their site is unreachable. The messages generally say something along the lines of GoogleBot not being able to access the site.

Then, roughly 24 hours after Google has noticed your site isn’t working, it will come back to check the status of your site. This means a site can be offline for roughly a full day or more before you can expect any negative effects from the search engines. However, if your site has been down for 48 hours or more, chances are Google is going to delist it, at least temporarily.

Search Engine Land pointed out that there are also tools available that will monitor sites for you and alert webmasters if their site becomes unavailable. They suggest the free service Pingdom, though there are plenty of others to choose from.
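If you’d rather roll your own alert than sign up for a service, the idea is simple to sketch. Below is a bare-bones example in Python using the requests library; this is not how Pingdom works internally, just an illustration of polling a URL and flagging downtime, with the URL and interval as placeholder values.

```python
# A bare-bones uptime monitor: poll a URL and print an alert when it fails.
# This is only an illustrative sketch; the URL and interval are examples.
import time

import requests  # third-party: pip install requests


def monitor(url: str, interval_seconds: int = 300) -> None:
    """Poll a URL forever, printing an alert whenever it stops returning 200."""
    while True:
        try:
            resp = requests.get(url, timeout=10)
            if resp.status_code != 200:
                print(f"ALERT: {url} returned {resp.status_code}")
        except requests.RequestException as exc:
            print(f"ALERT: {url} is unreachable ({exc})")
        time.sleep(interval_seconds)


if __name__ == "__main__":
    monitor("https://www.example.com/")
```

In practice, a hosted service is still the safer bet, since a monitor running on the same infrastructure as your site goes down right along with it.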

When companies take the leap to establishing their brand’s reputation online, the focus is always on taking advantage of every opportunity Google gives you to try to connect with potential consumers.

However, any SEO or online business that is only paying attention to Google isn’t completely controlling its online reputation. Online reputation management requires understanding a complex ecosystem of sites where users are able to connect with your brand, including other search engines, social media, local search platforms such as Yelp, and business accreditation sites like the Better Business Bureau’s.

Of course, taking control of the first page of Google is the best first step for a company hoping to take the reins of its online brand, but it isn’t the only step. Google controls roughly two-thirds of all search traffic, but that means you’re still missing out on a third of the marketplace.

The second most popular search engine is Bing, and it has been making notable gains lately, rising to 17.4 percent of the market from 13 percent last year. Microsoft has been marketing Bing rather aggressively, and it is clear the search engine will keep gaining ground for the near future. Once you’ve taken control of the first page of Google, George Fischer suggests trying to capitalize on the often forgotten market of Bing, and he explains how you can do so in his article for Search Engine Watch.

Image Courtesy of Martin Pettitt

It has been well over a month since Penguin 2.0 was unleashed upon the world, and the search industry is still reeling from the results of the algorithm update aimed at link profiles, low-quality backlinks, and over-optimized anchor text.

The average estimate says that Penguin 2.0 affected over 2 percent of all English queries. That doesn’t sound like much, but when SEO Roundtable took a poll in late May, over half its readers said they had been hit by the changes.

First, it should be said that some portion of those may have been affected by a separate algorithm update released shortly before the new version of Penguin, but that update was aimed at typically spammy sectors like payday loans and pornography.

The majority of those saying they were affected by Penguin, however, were most likely correct about their recent drop in rankings or loss of traffic. Either that, or far too many respondents were misreading their data or somehow unaware that their payday loan site might be targeted by Google. Let’s assume that’s not the case, because it sounds highly unlikely.

But, time has passed since Penguin came out. I’ve seen at least 10 articles detailing how to recover from Penguin, and numerous others focused on all the areas Penguin targeted. We should all be getting back to normal, right?

According to a recent poll from SEO Roundtable on the topic, that is not the case. Over 60 percent of respondents said they haven’t recovered from the algorithm update, with only 7.5 percent saying they have fully recovered.

What does this mean? Well, the respondents are clearly SEO-informed people who keep up to date with the latest blogs, since they responded to one of the more reputable sites covering the issue. One major issue is that full recovery from Penguin isn’t possible for many of those affected until the next data refresh. It is hard to know when that refresh could happen, though it may not be until the next update is announced.

The other issue is simply that those articles telling SEOs how to recover from Penguin range from completely valid to “how to try to cheat the new system,” which can be confusing for inexperienced or uninformed SEOs. The best way to solve this problem is to pay close attention to which sites you are reading and to always take the more conservative advice.

A WebmasterWorld thread from roughly a month ago brings up an interesting question for us SEO professionals. While we focus on the algorithms we know about, such as Penguin or Panda, it has long been suggested that Google could also be using different ranking factors depending on the industry a site fits within. In other words, sites for roofing companies would be reviewed and ranked according to different standards than sites for tech companies.

Well, Matt Cutts, the head of Google’s Webspam team and trusted engineer, took to that thread to dispel all the rumors. He doesn’t deny that Google has “looked at topic-specific ranking.” Instead, he says scaling was the issue. In his answer, Cutts explains, “We have looked at topic-specific ranking. The problem is it’s not scalable. There’s a limited amount of that stuff going on — you might have a very spammy area, where you say, do some different scoring.”

He continued, “What we’re doing better is figuring out who the authorities are in a given category, like health. If we can figure that out, those sites can rank higher.”

While Google says it isn’t using different algorithms for different industries, it has been announced that Google uses Subject Specific Authority Ranking, which helps authorities on a given topic surface as the most reputable sources on that subject.

Of course, looking at the comments at SEO Roundtable, which reported on the WebmasterWorld thread, it is clear many don’t necessarily believe Cutts’ statement. Some say they have “always seen a difference in industry types,” while others argue that different industries necessitate different ranking factors or algorithms because of the resources available to each industry. For example, industrial companies don’t tend to run blogs, which means creating new content through blogging shouldn’t be rewarded as heavily as it is in topics like health and tech, where new material comes out constantly.

For now, all we have to go on is Cutts’ word and our own experiences. Do you think Google is using different algorithms depending on industry? Do you think they should be?

As always, there are a lot of different opinions about link building across the web. There are still those who offer ways to “dominate” links with schemes that push the boundaries of what Google allows, and some who are beginning to write off link building as a practice entirely.

It is a bit hasty to completely do away with your linking efforts, as the search engines certainly still consider links. But we also live in an entirely different linking climate than that of just a couple of years (or months) ago. Moderation and quality are the key words in the link building discussion these days, and it is important to know when someone is giving bad advice.

If only you knew explicitly which link building tactics to avoid, right? Erin Everhart from Search Engine Land offers just that in her article from last week, laying out exactly which linking techniques we can cut out of our routines and how to pinpoint when people are giving you bad advice.

Of course, it all starts with that special word “quality,” which is now the most important factor in all your link building efforts. Google no longer cares whether you have countless links to your site if none of them are reputable. Actually, it does care: it will penalize your site for trying to use bad links to boost your profile. Natural, quality links from sites people actually read are the only way to get positive results from your link building, and anything else is just as likely to hurt you.

In the vein of quality over quantity, mass article submission is almost as bad as farming huge numbers of shoddy links. It’s the obvious successor to the “content is king” mantra everyone is espousing now, as those who were directly amassing scores of low-quality links turned to submitting the same weak article to hundreds of different sites.

Of course, there are other link problems aside from the various forms of link farming. Though it has become less popular since Penguin, there are still backlink profiles out there where exact-match anchor text makes up a higher percentage of links than the company’s own name. Anchor text is still very important, but there is no reasonable scenario in which you should end up with a percentage that high. You need far fewer exact-match links than you did a couple of years ago, so follow the rule of moderation.
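If you want to check where your own profile stands, a rough audit is easy to script. The sketch below, in Python, assumes you have exported your backlink anchor texts one per line from whatever link tool you use; the file name and the sample brand and keyword anchors are hypothetical.

```python
# A rough anchor-text audit: count how often each anchor appears in an
# exported list (one anchor per line). File name and anchors are examples.
from collections import Counter


def anchor_distribution(path: str) -> Counter:
    """Count occurrences of each anchor text, normalized to lowercase."""
    with open(path) as f:
        return Counter(line.strip().lower() for line in f if line.strip())


if __name__ == "__main__":
    counts = anchor_distribution("anchors.txt")
    total = sum(counts.values())
    # Compare a branded anchor against an exact-match keyword anchor.
    for anchor in ("acme widgets", "buy cheap widgets"):
        share = 100 * counts.get(anchor, 0) / total if total else 0.0
        print(f"{anchor!r}: {share:.1f}% of {total} anchors")
```

If the exact-match share rivals or exceeds the branded share, that is exactly the pattern Penguin was built to catch.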

Guest blogging is even becoming a problem. So many sites are hiring “article writers” who churn out 10 to 15 articles a day that the tactic has become yet another link building scheme. Instead of outright buying links, they are buying writers to build them cheap links. Guest blogging can be great when done correctly, but you have to take the time to ensure your writers know the industry they represent and will provide value to your site.

There are even more link building tactics still happening right now despite Google’s best efforts to shut down the spammier schemes. Everhart covers a few more in her article, but the main point is that any well-intentioned SEO tactic can be corrupted and used to try to trick the search engines. The problem is that Google and Bing have gotten much smarter, and they will almost always catch you if you try to outwit them. Play by the rules and give your sites the attention they need, but don’t try to game the system.