On Wednesday, Google, Gmail, YouTube, and the company’s other services went unresponsive for roughly an hour in many parts of the United States. The problem was quickly resolved, but not before Twitter freaked out and the story reached many news outlets.

Now, Matt Cutts, Google’s head of Webspam, has used his Webmaster Chat to answer the big question that site owners who have gone through similar experiences have often wondered: if your site goes down temporarily, does it affect your rankings?

According to Cutts, having a site go offline shouldn’t negatively impact your rankings, so long as you fix the problem quickly. Obviously, Google wants to direct searchers to sites that are working, so if a site has been offline for days, it makes sense for Google to replace it with a working, relevant site. But Google isn’t so quick to cut out an offline site.

Once Google notices your site is offline, they will attempt to notify owners registered with Google Webmaster Tools that their site is unreachable. The messages generally say something along the lines of Googlebot not being able to access the site.

Then, roughly 24 hours after Google has noticed your site isn’t working, they will come back to check its status. This means that a site can be offline for roughly a full day or more before you can expect any negative effects from the search engines. However, if your site has been down for 48 hours or more, chances are Google is going to delist it, at least temporarily.

Search Engine Land pointed out that there are also tools available to monitor sites and alert webmasters if their site becomes unavailable. They suggest the free service Pingdom, though there are plenty of others to choose from.
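
To make the idea concrete, here is a minimal sketch of what such an uptime monitor does. The site URL, the 60-second interval, and the print-based alert are all placeholder assumptions; services like Pingdom layer alert delivery, history, and multi-location checks on top of the same basic request.

```python
# Minimal uptime checker: poll a site and print an alert when it
# stops responding. SITE_URL and the 60-second interval are
# placeholders to adapt for your own site.
import time

import requests


def site_is_up(url):
    """Return True if the site answers with a non-error status."""
    try:
        response = requests.get(url, timeout=10)
        return response.status_code < 400
    except requests.RequestException:
        # Covers DNS failures, refused connections, timeouts, etc.
        return False


if __name__ == "__main__":
    SITE_URL = "https://www.example.com/"  # hypothetical site to watch
    CHECK_INTERVAL = 60  # seconds between checks

    while True:
        if not site_is_up(SITE_URL):
            print(f"ALERT: {SITE_URL} appears to be down")
        time.sleep(CHECK_INTERVAL)
```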

When companies take the leap of establishing their brand’s reputation online, the focus is always on taking advantage of every opportunity Google gives you to connect with potential consumers.

However, any SEO or online business that is only paying attention to Google isn’t completely controlling its online reputation. Online reputation management requires understanding a complex ecosystem of sites where users can connect with your brand, including other search engines, social media, local search platforms such as Yelp, and business accreditation sites like the Better Business Bureau’s.

Of course, taking control of the first page of Google is the best first step for a company hoping to take the reins of its online brand, but it isn’t the only step. Google controls roughly two-thirds of all search traffic, but that also means you’re missing out on a third of the marketplace.

The second most popular search engine is Bing, and it has been making notable gains lately, rising to 17.4 percent of the market from 13 percent last year. Microsoft has been marketing Bing aggressively, and it is clear the search engine will keep gaining ground for the near future. Once you’ve taken control of the first page of Google, George Fischer suggests capitalizing on the often forgotten market of Bing, and he explains how in his article for Search Engine Watch.

Image Courtesy of Martin Pettitt


It has been well over a month since Penguin 2.0 was unleashed upon the world, and the search industry is still reeling from the algorithm update aimed at link profiles, low-quality backlinks, and over-optimized anchor text.

The average estimate says that Penguin 2.0 affected over 2 percent of all English queries. That doesn’t sound like much, but when SEO Roundtable took a poll in late May, over half their readers said they had been hit by the changes.

First, it should be said that some portion of those respondents may have been affected by a separate algorithm update released shortly before the new version of Penguin, one aimed at typically spammy sectors like payday loans and pornography.

The majority of those saying they were affected by Penguin, however, were most likely correct about their recent drop in rankings or loss of traffic. Either that, or far too many respondents were misreading their data or somehow unaware that their payday loan site might be targeted by Google. Let’s assume that’s not the case, because it sounds highly unlikely.

But, time has passed since Penguin came out. I’ve seen at least 10 articles detailing how to recover from Penguin, and numerous others focused on all the areas Penguin targeted. We should all be getting back to normal, right?

According to the recent poll from SEO Roundtable on the topic, that is not the case. Over 60 percent of respondents said they haven’t recovered from the algorithm update, with only 7.5 percent saying they have fully recovered.

What does this mean? Well, the respondents are clearly SEO-informed people who keep up to date with the latest blogs, since they responded to one of the more reputable sites covering the issue. One major issue is that full recovery from Penguin isn’t possible for many of those affected until the next refresh. It is hard to know when that refresh will happen; it may not be until the next update is announced.

The other issue is simply that the articles telling SEOs how to recover from Penguin range from completely valid to “how to try to cheat the new system,” which can be confusing for inexperienced or uninformed SEOs. The best way to solve this problem is to pay close attention to which sites you are reading and always take the more conservative advice.

A WebmasterWorld thread from roughly a month ago brings up an interesting question for us SEO professionals. While we focus on the algorithms we know about, such as Penguin or Panda, it has long been suggested that Google could also be using different ranking factors depending on the industry a site fits within. In other words, sites for roofing companies would be reviewed and ranked according to different standards than sites for tech companies.

Well, Matt Cutts, the head of Google’s Webspam team and a trusted engineer, took to that thread to address the rumors. He doesn’t deny that Google has “looked at topic-specific ranking.” Instead, he says scaling was the issue. In his answer, Cutts explains, “We have looked at topic-specific ranking. The problem is it’s not scalable. There’s a limited amount of that stuff going on — you might have a very spammy area, where you say, do some different scoring.”

He continued, “What we’re doing better is figuring out who the authorities are in a given category, like health. If we can figure that out, those sites can rank higher.”

While Google says they aren’t using different algorithms for different industries, it has been announced that Google uses Subject Specific Authority Ranking, which helps authorities on varying topics be surfaced as the most reputable on a subject.

Of course, looking at the comments at SEO Roundtable, which reported on the WebmasterWorld thread, it is clear many don’t necessarily believe Cutts’ statement. Some say they have “always seen a difference in industry types,” while others argue that different industries necessitate different ranking factors or algorithms because of the resources available to them. For example, industrial companies don’t tend to run blogs, which means creating new content through blogging shouldn’t be rewarded as heavily as it is in fields like health and tech, where a lot of news is constantly coming out.

For now, all we have to go on is Cutts’ word and our own experiences. Do you think Google is using different algorithms depending on industry? Do you think they should be?

For those still pushing backlinks as the golden goose of SEO, a recent revision to Google’s Ranking help guidelines could be potentially frightening. But if you’ve been watching the changes in SEO over the past few years, it shouldn’t come as much of a surprise. Google has become more and more strict about backlink quality and link-building methods, and links were bound to be dethroned.

As reported by Search Engine Watch, it was spotted late last week that Google updated the Ranking help article to say “in general, webmasters can improve the rank of their sites by creating high-quality sites that users will want to use and share.” Before, it told webmasters that they could improve their rank “by increasing the number of high-quality sites that link to their pages.”

There have been countless signs that Google would officially step back from link building as one of the most important ranking signals. There were widespread complaints for a while about competitors using negative SEO techniques like pointing bad links at websites, and every Penguin iteration that comes out is a significant event in SEO.

To top it all off, when Matt Cutts, the esteemed Google engineer, was asked about the top 5 basic SEO mistakes, he spent a lot of time talking about the misplaced emphasis on link building.

“I wouldn’t put too much of a tunnel vision focus on just links,” Cutts said. “I would try to think instead about what I can do to market my website to make it more well known within my community, or more broadly, without only thinking about search engines.”

Depending on your skill set, a recent Webmaster video may be good or bad news for bloggers and site owners out there. Most people have never considered whether stock photography or original photography has any effect on search engine rankings. As it happens, not even Matt Cutts has thought about it much.

There are tons of writers out there who don’t have the resources or the talent with a camera to take pictures for every page or article they put out. Rather than deliver countless walls of text that people don’t like looking at, most of us without the artistic talent use stock photos to make our pages less boring and help our readers understand us better. For now, we have nothing to worry about.

Cutts, the head of Google’s Webspam team, used his latest Webmaster Chat to address this issue, and he says that to the best of his knowledge, original vs. stock photography has no impact on how your pages rank. However, he won’t rule it out for the future.

“But you know what that is a great suggestion for a future signal that we could look at in terms of search quality. Who knows, maybe original image sites might be higher quality, whereas a site that just repeat the same stock photos over and over again might not be nearly as high quality. But to the best of my knowledge, we don’t use that directly in our algorithmic ranking right now.”

Logically, I would say that if Google does decide to start considering photo originality on web pages, Cutts appears to be more worried about sites that use the same images “over and over” than about those who search out relevant and unique stock images for their articles. Penalizing every website owner who doesn’t have a hired photographer continuously producing images for every new page would seem like overkill.

Matt Cutts, head of Google’s Webspam team, recently announced via Twitter that a new ranking update focusing on spammy queries has officially gone live, according to Danny Goodwin of Search Engine Watch. At the same time, Google has made it clear that if you don’t have a quality mobile website, you’re going to start seeing your rankings drop.

Spammy Queries Ranking Update

The ranking update for spammy queries is supposed to affect 0.3 to 0.5 percent of English queries, but it shouldn’t be much of a shock to anyone who has been listening to what Cutts says. It was one of the most notable updates Cutts spoke about in an earlier Google Webmaster video where he discussed what to expect from Google this summer.

Cutts says the updates are specifically focused on queries notorious for spam, such as “payday loans” on Google.co.uk as well as pornographic queries. The roll-out will be similar to many of Google’s recent changes in that it is being implemented gradually over the next few months.

Smartphone Ranking Changes

It appears we’ve finally reached the point where slacking on mobile SEO is going to objectively hurt your site as a whole. A recent post on the Google Webmaster Central Blog warns that “we plan to roll out several ranking changes in the near future that address sites that are misconfigured for smartphone users.”

Google named two mobile mistakes as their primary targets: faulty redirects and smartphone-only errors. Faulty redirects occur “when a desktop page redirects smartphone users to an irrelevant page on the smart-phone optimized website,” such as when you are automatically sent to the homepage on a smartphone rather than the actual content you searched for. Smartphone-only errors, on the other hand, occur when sites let desktop users who reach a page see the content but give smartphone users errors.

This is Google’s first big move toward making mobile configuration a ranking consideration, and their advice signals an intent to keep paying attention to mobile. They suggest you “try to test your site on as many different mobile devices and operating systems, or their emulators, as possible.” It isn’t acceptable to pay attention only to desktop anymore.
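
If you want a quick way to spot the faulty-redirect pattern on your own site, the sketch below requests the same deep URL with a desktop and a smartphone User-Agent and compares where each request ends up. The UA strings, the example URL, and the homepage heuristic are illustrative assumptions, not Google’s actual test.

```python
# Rough check for the "faulty redirect" pattern described above:
# fetch a deep URL as a desktop browser and as a smartphone, follow
# redirects, and flag the case where only mobile lands on the homepage.
from urllib.parse import urlparse

import requests

DESKTOP_UA = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36"
MOBILE_UA = ("Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) "
             "AppleWebKit/536.26 Mobile/10A5376e Safari/8536.25")


def final_url(url, user_agent):
    """Follow redirects and return the URL the request lands on."""
    response = requests.get(url, headers={"User-Agent": user_agent},
                            timeout=10)
    return response.url


def has_faulty_redirect(deep_url):
    """True if mobile users are bounced to the homepage while desktop
    users reach the deep page they asked for."""
    desktop_path = urlparse(final_url(deep_url, DESKTOP_UA)).path
    mobile_path = urlparse(final_url(deep_url, MOBILE_UA)).path
    return mobile_path in ("", "/") and desktop_path not in ("", "/")


if __name__ == "__main__":
    url = "https://www.example.com/articles/some-deep-page"  # placeholder
    print("Faulty redirect?", has_faulty_redirect(url))
```

Running a check like this against a sample of your deepest pages, with each device’s User-Agent you care about, approximates the kind of testing Google is recommending.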

Two years ago, Search Engine Land released their “Periodic Table of SEO Ranking Factors”, but we all know that SEO doesn’t stay the same for that long, especially with the bigger changes that Google has been pushing out lately. That is why the periodic table was recently updated, clarified, and re-branded “The Periodic Table of SEO Success Factors”.

When you hear that Google has over 200 “signals or ranking factors” and over 10,000 “sub-signals,” it is easy to get overwhelmed or confused about where you should focus your efforts. However, those big numbers are usually inflated by speculation, such as whether or not Google pays any attention to Facebook Likes (the truth is, we don’t know).

While there may be a full 200 signals Google uses, there is a hierarchy to how important each signal is, and we have a pretty good idea of the most important ranking factors Google relies on. These bigger signals are also the most likely to stay stable over time. Even if we somehow found the current full list of ranking factors, the system would change again by the time we had their weights and functions mapped out. Heck, they may have changed while I typed this sentence.

Search Engine Land’s periodic table doesn’t attempt to cover the small things, but instead shows you the areas that have the biggest impact on rankings and visibility. As the creators see it, the table is a starting point for new SEOs and a friendly reminder for the veterans. The simple version of the periodic table is below, but you can find the expanded table as well as the key for understanding the image here.

Periodic Table of SEO Success

One of the biggest mistakes you can make in SEO is optimizing your website at the expense of the audience. Optimization may help get people onto your page, but when visitors are met with a page that crams too much content into a bad design, they leave and you lose a sale.

With Google’s latest emphasis on usability, over-optimizing may actually hurt your search engine rankings anyway. Jay Taylor recently shared some tips to make sure you are creating websites that customers and search engines alike will love. If you want people to come to your page and stay, follow these important rules.

  1. Understand Your Customer – First and foremost, the internet is now more about the user than it ever has been. Google includes aspects of usability such as speed, design, and content in their algorithms as strong indicators of quality sites. Not only that, but obviously you should be trying to appeal to your actual customers, not just your own tastes. The way you perceive your brand may not be the same as how your customers understand your product, so you want to find out why they choose you over your competitors. Once you know that, you know what to play up when introducing potential customers to your brand.
  2. Websites Don’t Have To Be Beautiful – That’s a bit of an overstatement, but it is far more important for a website to be usable and interesting to your target audience than for it to look like a work of art. Use visuals that appeal to your customers and solidify your credibility. You want your website to look professional and be extremely usable, not unapproachably artistic.
  3. Create Great Content With a Purpose – The days of creating content stuffed with keywords solely to appease search engines are long gone. Your content should have a purpose for the reader and be aimed at actually informing your audience rather than rambling with specific words to attract crawlers. Poor grammar, unnecessary vocabulary, and awkwardly mechanical text turn people off and can lose you customers. Instead, make sure your content has a purpose, offers value to your customer, and inspires action of some sort.
  4. Provide Easy-to-Use Navigation – User-friendly navigation is essential to allowing your customers to quickly and easily find what they’re looking for on your site, but it also allows search engines to more effectively index your site. There should be navigation in the header and footer so that customers always have access to it, and you might consider a drop-down menu in the top navigation if you have a lot of pages.
  5. Measure and Improve – Keep track of your key performance indicators such as conversions, contacts from the website, and possibly purchases to see how any new changes may be affecting your performance. You should also be using Google Analytics to watch where your customers are coming from, and what may be causing you to lose conversions.
Image Courtesy of Wikimedia Commons


Penguin 2.0 only affected 2.3 percent of search queries, but you would think it did much more from the response online. Setting aside all of the worrying before the release, there have been tons of comments about the first-hand effects many are dealing with on the post-Penguin 2.0 web. Those stung by the new Penguin algorithm have even accused Google of releasing the update only to increase their profitability.

Matt Cutts, head of Google’s Webspam team, used a recent Webmaster Chat video to attack that idea head-on. The main question he was asked was what aspect of Google updates the SEO industry doesn’t understand. While Cutts expresses concern about the number of people who don’t get the difference between algorithm updates and data refreshes, his main focus is the notion that Google is hurting web owners to improve its profits.

Most notably, the algorithm updates simply aren’t profitable. Google experienced decreases in revenue from almost all of its recent updates, but Cutts says money isn’t the focus. Google is aiming at improving the quality of the internet experience, especially search. While site owners using questionable methods are upset, most searchers will hopefully feel that the updates have improved their experience, which will keep them coming back and using Google.

As for the misunderstanding between algorithm updates and data refreshes, Cutts has expanded on the problem elsewhere. The biggest difference is that an algorithm update changes how the system works, while a data refresh only changes the information the system is using or seeing.

Cutts was also asked which aspect of SEO we are spending too much time on, which leads him to one of the main practices Penguin focuses on: link building. Too many SEOs are still putting too much faith in that single practice, even as it is overshadowed by areas that more directly affect the quality of users’ experiences, such as creating compelling content. Instead, Cutts urges SEOs to pay more attention to design and speed, emphasizing the need to create the best web experience possible.

Cutts’ video is below, but the message is that Google is going to keep growing and evolving, whether you like it or not. If you listen to what they tell you about handling your SEO, you may have to give up some of your old habits, but you’ll spend much less time worrying about the next algorithm update.