On Wednesday, Google, Gmail, YouTube, and other Google services went unresponsive for roughly an hour in many parts of the United States. The problem was quickly resolved, but not before Twitter freaked out and the story reached many news outlets.

Now, Matt Cutts, Google’s head of Webspam, has used his Webmaster Chat to answer the big question that site owners who have gone through similar experiences have often wondered about. If your site goes down temporarily, does it affect your rankings?

According to Cutts, having a site go offline shouldn’t negatively impact your rankings, so long as you fix the problem quickly. Obviously, Google wants to direct searchers to sites that are working, so if a site has been offline for days, it makes sense for Google to replace it with a working, relevant site. But Google isn’t so quick to cut out an offline site.

Once Google notices your site is offline, it will attempt to notify site owners registered with Google Webmaster Tools that their site is unreachable. The messages generally say something along the lines of Googlebot not being able to access the site.

Then, roughly 24 hours after Google has noticed your site isn’t working, it will come back to check the status of your site. This means sites can be offline for roughly a full day or more before you can expect any negative effects from the search engines. However, if your site has been down for 48 hours or more, chances are Google is going to delist it, at least temporarily.

Search Engine Land pointed out that there are also tools available that will monitor sites for you and alert webmasters when a site becomes unavailable. They suggest the free service Pingdom, though there are plenty of others to choose from.
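
If you would rather script a basic check yourself, the idea behind these monitoring services is straightforward: request the site on a regular schedule and raise an alert when it stops answering. Below is a minimal Python sketch of that idea; the URL, interval, and alert mechanism are illustrative placeholders rather than anything Pingdom or Google actually provides.

```python
import time
import requests

# Hypothetical values -- substitute your own site and check interval.
SITE_URL = "https://www.example.com/"
CHECK_INTERVAL_SECONDS = 300  # check every five minutes

def site_is_up(url: str) -> bool:
    """Return True if the URL answers with a non-server-error status code."""
    try:
        response = requests.get(url, timeout=10)
        return response.status_code < 500
    except requests.RequestException:
        # Covers DNS failures, timeouts, refused connections, and so on.
        return False

if __name__ == "__main__":
    while True:
        if not site_is_up(SITE_URL):
            # Swap this print for an email, SMS, or chat notification.
            print(f"ALERT: {SITE_URL} appears to be unreachable")
        time.sleep(CHECK_INTERVAL_SECONDS)
```

A hosted service is still the safer bet for most site owners, since a script like this fails silently if the machine running it goes down along with the rest of your infrastructure.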

Image Courtesy of Martin Pettitt

It has been well over a month since Penguin 2.0 was unleashed upon the world, and the search industry is still reeling from the algorithm update aimed at link profiles, low-quality backlinks, and over-optimized anchor text.

The average estimate says that Penguin 2.0 affected over 2 percent of all English queries. That doesn’t sound like much, but when SEO Roundtable took a poll in late May, over half of their readers said they had been hit by the changes.

First, it should be said that some portion of those may have been affected by a separate algorithm update released shortly before the new version of Penguin, but that update was aimed at typically spammy sectors like payday loans and pornography.

The majority of those saying they were affected by Penguin, however, were most likely correct about their recent drop in rankings or loss of traffic. It is either that, or far too many respondents were misreading their data or somehow unaware that their payday loan site might be targeted by Google. Let’s assume that’s not the case, because that option sounds highly unlikely.

But, time has passed since Penguin came out. I’ve seen at least 10 articles detailing how to recover from Penguin, and numerous others focused on all the areas Penguin targeted. We should all be getting back to normal, right?

According to a recent poll from SEO Roundtable on the topic, that is not the case. Over 60 percent of those responding said they haven’t recovered from the algorithm update, with only 7.5 percent saying they have fully recovered.

What does this mean? Well, the respondents are clearly SEO-informed people who keep up to date with the latest blogs, since they responded to a poll on one of the more reputable sites covering the issue. One major issue is that full recovery from Penguin isn’t possible for many of those affected until the next refresh. It is hard to know when that refresh will happen; it may not be until the next update is announced.

The other issue is simply that the articles telling SEOs how to recover from Penguin range from completely valid to “how to try to cheat the new system,” which can be confusing for inexperienced or uninformed SEOs. The best way around this is to pay close attention to which sites you are reading and always take the more conservative advice.

A WebmasterWorld thread from roughly a month ago brings up an interesting question for us SEO professionals. While we focus on the algorithms we know about, such as Penguin and Panda, it has long been suggested that Google could also be using different ranking factors depending on the industry a site fits within. In other words, sites for roofing companies would be reviewed and ranked according to different standards than sites for tech companies.

Well, Matt Cutts, the head of Google’s Webspam team and trusted engineer, took to that thread to dispel all the rumors. He doesn’t deny that Google has “looked at topic-specific ranking.” Instead, he says scaling was the issue. In his answer, Cutts explains, “We have looked at topic-specific ranking. The problem is it’s not scalable. There’s a limited amount of that stuff going on — you might have a very spammy area, where you say, do some different scoring.”

He continued, “What we’re doing better is figuring out who the authorities are in a given category, like health. If we can figure that out, those sites can rank higher.”

While Google says it isn’t using different algorithms for different industries, it has been announced that Google uses Subject Specific Authority Ranking, which helps authorities on various topics be recognized as the most reputable sources on those subjects.

Of course, looking at the comments at SEO Roundtable, which reported on the WebmasterWorld thread, it is clear many don’t necessarily believe Cutts’ statement. Some say they have “always seen a difference in industry types,” while others argue that different industries necessitate different ranking factors or algorithms because of the resources available to each industry. For example, industrial companies don’t tend to run blogs, so creating new content through blogging shouldn’t be rewarded as heavily as it is in topics like health and tech, where new information is constantly being published.

For now, all we have to go on is Cutts’ word and our own experiences. Do you think Google is using different algorithms depending on industry? Do you think they should be?

Google has made it very clear that mobile SEO is going to play a big part in its plans moving forward. Last month, Google’s webspam team leader Matt Cutts said as much during the SMX Advanced conference in Seattle, and Google’s own Webmaster Central Blog confirmed the changes will be here very soon. A recent update told webmasters, “We plan to roll out several ranking changes in the near future that address sites that are misconfigured for smartphone users.”

It isn’t like these changes are coming out of nowhere. Analysts have been encouraging site owners and SEO professionals to pay attention to their mobile sites for years and mobile traffic increases show no signs of slowing down. So, you would think most companies with a fair amount of resources would already be ahead of the curve, but a recent assessment run by mobile marketing agency Pure Oxygen Labs shows that the top 100 companies on the Fortune 500 list are actually in danger of Google penalties in the near future.

Pure Oxygen Labs used their proprietary diagnostic tools to evaluate sites against Google’s best-practice criteria, according to Search Engine Land. They hoped to see how many sites redirected smartphone users to mobile pages, how these redirects are configured, and how widely responsive design was actually being used to reach mobile users.
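
To give a rough sense of what a check like this involves, here is a small Python sketch (not Pure Oxygen Labs’ actual tooling, which is proprietary) that fetches a deep desktop URL while identifying itself as a smartphone and reports whether it lands on the homepage or hits an error. The user-agent string and example URL are placeholders.

```python
import requests
from urllib.parse import urlparse

# An illustrative smartphone User-Agent string -- a placeholder, not the
# one Pure Oxygen Labs or Google actually uses for crawling.
SMARTPHONE_UA = (
    "Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) "
    "AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 Mobile Safari/8536.25"
)

def check_mobile_handling(desktop_url: str) -> str:
    """Fetch a deep desktop URL as a smartphone and report where it ends up."""
    response = requests.get(
        desktop_url,
        headers={"User-Agent": SMARTPHONE_UA},
        allow_redirects=True,
        timeout=10,
    )
    requested_path = urlparse(desktop_url).path or "/"
    final_path = urlparse(response.url).path or "/"
    if response.status_code >= 400:
        return f"smartphone-only error: HTTP {response.status_code}"
    if requested_path != "/" and final_path == "/":
        return "faulty redirect: smartphone users are dumped on the homepage"
    return "looks fine: smartphone users reach equivalent content"

# Hypothetical deep URL used only for illustration.
print(check_mobile_handling("https://www.example.com/products/widget-123"))
```

A fuller audit would also compare the result against a desktop fetch of the same URL and look for responsive design signals such as a viewport meta tag, but the redirect check above is the core idea.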

Only six of the top 100 Fortune 500 companies had sites that properly follow Google’s best practices. The report stated that 11 percent of the sites use responsive design techniques, while only 56 percent served any sort of content formatted for mobile users. That means 44 percent had absolutely nothing in the way of mobile-optimized sites or content.

The six companies that fully complied with Google’s guidelines included Google itself, so it should be noted that only five outside companies are currently safe from future penalties.

There were multiple reasons sites were ill-equipped, but the most common problems were faulty redirects and a lack of responsive design, both issues Google has recently singled out as its primary targets for upcoming ranking changes aimed at poorly configured mobile sites.

It took a couple of weeks for everything to even out after the recent Penguin update, and now it’s time to start looking forward to what is coming up in SEO. It is an especially good time to make predictions for the rest of 2013, as we are just now passing the halfway point in the year and Google has made some of its intentions moving forward very clear.

Google has pulled out the big guns in its fight against spam and has publicly stated its interest in user experience through site design and quality content. None of that is a surprise, but at the turn of the year none of it had actually been confirmed by people within the search engine juggernaut. A few months later, Matt Cutts has basically confirmed everything we assumed: focus on the user, don’t try to cheat or loophole your way to the top, and you should be fine.

Still, Google isn’t content to focus on just one or two things at a time, and there are bound to be quite a few other changes in the near future that we haven’t been told about. Jayson DeMers analyzed the evidence from Google’s more subtle changes and announcements over the past few months to predict what we might see in the next year or so of SEO. They are all just guesses based on the information available, but it’s always good to stay ahead of the curve and aware of changes that may be on the horizon.

For those still pushing backlinks as the golden goose of SEO, a recent revision to Google’s Ranking help guidelines could be frightening. But if you’ve been watching the changes in SEO over the past few years, it shouldn’t come as much of a surprise. Google has become more and more strict about backlink quality and link-building methods, and links were bound to be dethroned.

As reported by Search Engine Watch, it was spotted late last week that Google updated the Ranking help article to say “in general, webmasters can improve the rank of their sites by creating high-quality sites that users will want to use and share.” Before, it told webmasters that they could improve their rank “by increasing the number of high-quality sites that link to their pages.”

There have been countless signs that Google would officially step back from linkbuilding as one of the most important ranking signals. There were widespread complaints for a while about competitors using negative SEO techniques like pointing bad links to websites, and every Penguin iteration that comes out is a significant event in SEO.

To top it all off, when Matt Cutts, the esteemed Google engineer, was asked about the top 5 basic SEO mistakes, he spent a lot of time talking about the misplaced emphasis on link building.

“I wouldn’t put too much of a tunnel vision focus on just links,” Cutts said. “I would try to think instead about what I can do to market my website to make it more well known within my community, or more broadly, without only thinking about search engines.”

Depending on your skill set, a recent Webmaster video may be good or bad news for bloggers and site owners. Most people have never considered whether stock photography or original photography has any effect on search engine rankings. As it happens, not even Matt Cutts has thought about it much.

There are tons of writers out there who don’t have the resources or the talent with a camera to take pictures for every page or article they put out. Rather than deliver countless walls of text that people don’t like looking at, most of us without that talent use stock photos to make pages less boring and help readers understand us better. For now, we have nothing to worry about.

Cutts, the head of Google’s Webspam team, used his latest Webmaster Chat to address this issue, and he says that to the best of his knowledge, original vs. stock photography has no impact on how your pages rank. However, he won’t rule it out for the future.

“But you know what that is a great suggestion for a future signal that we could look at in terms of search quality. Who knows, maybe original image sites might be higher quality, whereas a site that just repeat the same stock photos over and over again might not be nearly as high quality. But to the best of my knowledge, we don’t use that directly in our algorithmic ranking right now.”

Logically, I would say that if Google does decide to start considering photo originality on web pages, Cutts appears to be more worried about sites that use the same images “over and over” than about those who search out relevant and unique stock images for their articles. Penalizing every website owner without a hired photographer to continuously produce images for every new page would seem like overkill.

Matt Cutts, head of Google’s Webspam team, recently announced via Twitter that a new ranking update focusing on spammy queries has officially gone live, according to Danny Goodwin from Search Engine Watch. At the same time, Google has made it clear that if you don’t have a quality mobile website, you’re going to start seeing your rankings drop.

Spammy Queries Ranking Update

The ranking update for spammy queries is supposed to affect 0.3 to 0.5 percent of English queries, but it shouldn’t be much of a shock to anyone who has been listening to what Cutts says. It was one of the most notable updates Cutts spoke about in an earlier Google Webmaster video where he discussed what to expect from Google this summer.

Cutts says the update is specifically focused on queries notorious for spam, such as “payday loans” on Google.co.uk as well as pornographic queries. The roll-out will be similar to many of Google’s recent changes in that it is being implemented gradually over the next few months.

Smartphone Ranking Changes

It appears we’ve finally reached the point where slacking on mobile SEO is going to objectively hurt your site as a whole. A recent post on the Google Webmaster Central Blog warns that “we plan to roll out several ranking changes in the near future that address sites that are misconfigured for smartphone users.”

Google named two mobile mistakes as its primary targets: faulty redirects and smartphone-only errors. Faulty redirects happen “when a desktop page redirects smartphone users to an irrelevant page on the smart-phone optimized website,” such as when you get automatically sent to a homepage on a smartphone rather than the actual content you searched for. Smartphone-only errors, on the other hand, occur when a site shows desktop users the content they requested but serves errors to smartphone users.
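
To make the distinction concrete, here is a minimal Python/Flask sketch of the redirect behavior Google is asking for, assuming a hypothetical m.example.com mobile site: a smartphone visitor requesting a deep desktop URL is sent to the equivalent mobile page rather than the mobile homepage. The user-agent check and URLs are simplified placeholders, not code Google has published.

```python
from flask import Flask, redirect, request

app = Flask(__name__)

# Hypothetical separate mobile site. The key point is mapping each desktop
# URL to its equivalent mobile URL, never to the mobile homepage.
MOBILE_HOST = "https://m.example.com"

def is_smartphone(user_agent: str) -> bool:
    """Very rough detection; production sites use a maintained device database."""
    ua = (user_agent or "").lower()
    return "iphone" in ua or ("android" in ua and "mobile" in ua)

@app.route("/articles/<slug>")
def article(slug):
    if is_smartphone(request.headers.get("User-Agent", "")):
        # The faulty version would be redirect(MOBILE_HOST), dumping the
        # visitor on the homepage instead of the article they asked for.
        return redirect(f"{MOBILE_HOST}/articles/{slug}", code=302)
    return f"Desktop article page for {slug}"
```

Sites built with responsive design avoid the problem entirely, since the same URL serves every device and there is nothing to redirect.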

This is Google’s first big move toward making mobile configuration a ranking consideration, and its advice signals an intent to keep paying attention to mobile. They suggest you “try to test your site on as many different mobile devices and operating systems, or their emulators, as possible.” It isn’t acceptable to pay attention to desktop only anymore.

Image Courtesy of Wikipedia Commons

Penguin 2.0 only affected 2.3% of search queries, but you would think it did much more judging from the response online. Setting aside all of the worrying before the release, there have been tons of comments about the first-hand effects many seem to be dealing with on the post-Penguin 2.0 web. Those stung by the new Penguin algorithm have even accused Google of only releasing the update to increase its profitability.

Matt Cutts, head of Google’s Webspam team, used his recent Webmaster Chat video to attack that idea head on. The main question he was asked is what aspect of Google updates he thinks the SEO industry doesn’t understand. While Cutts expresses concern about the number of people who don’t understand the difference between algorithm updates and data refreshes, his main focus is the idea that Google is hurting site owners to improve its profits.

Most notably, the algorithm updates simply aren’t profitable; Google has seen decreases in revenue from almost all of its recent updates. Cutts says money isn’t the focus anyway. Google is aiming at improving the quality of the internet experience, especially search. While site owners using questionable methods are upset, most searchers will hopefully feel that the updates have improved their experience, which will keep them coming back and using Google.

As for the confusion between algorithm updates and data refreshes, Cutts has expanded on the problem elsewhere. The biggest difference is that an algorithm update changes how the system works, while a data refresh leaves the system alone and only changes the information it is using or seeing.

Cutts was also asked which aspect of SEO we are spending too much time on, which led him to one of the main practices Penguin focuses on: link building. Too many SEOs are still putting too much faith in that single practice, even as it is being devalued in favor of areas that more directly affect the quality of users’ experiences, such as creating compelling content. Instead, Cutts urges SEOs to pay more attention to design and speed, emphasizing the need to create the best web experience possible.

Cutts’ video is below, but the message is that Google is going to keep growing and evolving, whether you like it or not. If you listen to what Google tells you about handling your SEO, you may have to give up some old habits, but you’ll spend much less time worrying about the next algorithm update.

You run a small local business with brick-and-mortar locations. What reason do you have to invest in online marketing? Actually, there are quite a few ways local businesses can benefit from it.

You want your business to be reaching out to customers everywhere they are looking for you or services like yours, and more and more people are turning to the internet before they make a purchase. If they aren’t buying straight off the web, they are checking reviews and public perception of the products they are looking for.

A recent BIA/Kelsey report said that 97% of consumers use online media before making local purchases, and Google suggests that 9 out of 10 internet searches lead to follow-up actions such as calling or visiting businesses. That means the majority of consumers are turning to the internet, and if your business isn’t there, they will find others.

Online marketing isn’t as intimidating as many think, either. Search Engine Land says that 50% of small businesses’ online listings are wrong, and the majority of small business owners claim they don’t have the time to keep those listings up to date. Keeping Google’s information on your business updated only takes a few minutes, and that is where most people will find you. You can create a local business listing even if you don’t have a website or sell anything online.

The next step up is to embrace social media. Many smaller businesses focus almost their entire web presence on Facebook, Google+, and Twitter, because these are the places where brands can reach out directly to consumers.

If you do wish to fully capitalize on online marketing but don’t think you have the time, hiring someone to manage your online brand and website eventually pays for itself in public awareness of your brand and a cemented identity as a trusted business in the community. However, you can’t just do a little. A shoddy or out-of-date website can hurt public perception of your company, so keeping your site up to date with current web standards is important to maintaining your brand’s integrity.