Say what you will about Google, but it does at least try to listen to webmasters. Many webmasters hold some pretty big (and often legitimate) grudges against the biggest search engine, yet the company does make an effort to reach out for opinions. One example of Google seeking feedback from site owners appeared last night, as Matt Cutts, Google’s head of webspam, tweeted a call for webmasters and SEOs to fill out a survey.

Specifically, Cutts called for owners of small but high-quality websites who believe they should be doing better than they are in the rankings. Filling out the survey won’t affect your rankings immediately, but it may give Google information that helps keep the playing field reasonably level for small businesses and big companies alike. The form reads:

Google would like to hear feedback about small but high-quality websites that could do better in our search results. To be clear, we’re just collecting feedback at this point; for example, don’t expect this survey to affect any site’s ranking.

The survey asks only two short questions. First, it calls for the name and URL of the small site you believe should be ranking well. Second, Google would like to hear your opinion on why the site should rank higher. It is extremely straightforward and shouldn’t take most webmasters long to complete.


In an attempt to clear up some confusing wording from Google, Matt Cutts, Google’s head of webspam, used his latest Webmaster Help video to clarify that page load speed is no more important for rankings in mobile searches than it is in desktop searches.

This comes after Google has been publicly emphasizing the need for sites to load quickly, noting that mobile users are highly likely to leave a page if it doesn’t load fast enough. While Google isn’t backing off that stance, Cutts wanted to make it clear that speed is not weighted differently for mobile than for desktop.

If all other aspects of two sites are ranked evenly, the site that loads faster will almost certainly be given the higher position in search results, but that is true on smartphones and desktop computers alike. It is also just a sensible part of the algorithm: slow pages will likely lose a large number of visitors during the loading time alone, making them lower-value results.

Because internet speeds vary across devices and across the globe, Cutts said Google has no plans to give an exact number of seconds your site should load in. If it becomes obvious that mobile users are more frustrated by slow sites than their desktop counterparts, Google may consider weighting loading speed more heavily for mobile searches, but that isn’t the case yet and there are no current plans to make it so.

Negative SEO has been a topic of debate for a while now, especially since links became dramatically more risky with the introduction of Google Penguin. It is entirely possible for competitors to point hundreds or thousands of low-quality or negative backlinks at your site with the intention of getting it penalized or even completely removed from Google’s index. Buying links went from being a tool for cheating the algorithms to a weapon for destroying, or at least handicapping, competitors.

Google acknowledged this issue with the introduction of the Disavow Links tool, which gives webmasters a way to protect their sites. The problem is, everyone seems to be using it wrong. According to Marcela De Vivo from Search Engine Watch, the Disavow Links tool isn’t best used after a site has been penalized. Instead, it should be used in combination with a link audit before you ever run into trouble.

It makes sense: the entire point of a backlink audit is to check the quality and status of every link in your profile. If you’re auditing properly and regularly, you’re far less likely to be blindsided by algorithm updates or manual actions, because you can catch every low-quality link and disavow it before search engines identify it.

Auditing isn’t difficult either. All you have to do is download your backlinks from your Google Webmaster Tools account or any other backlink tool, then look for any links pointing to your site that you either don’t recognize or that look questionable. Fishy or spammy links are usually easy to pick out. From there, you can take action by emailing the owners of those sites and asking for the links to be removed. If that fails, turn to the disavow tool.
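
If you want to speed up the triage step, a short script can pre-sort an exported link file before you review it by hand. The sketch below is one minimal way to do that in Python; the file name, column layout, and the “suspicious” keywords and TLDs are assumptions you would swap for your own criteria, and the output still needs a human pass before anything goes into the disavow tool.

```python
# Minimal backlink-triage sketch. Assumes a CSV export whose first column is
# the linking URL; the keyword/TLD heuristics are illustrative, not Google's.
import csv
from urllib.parse import urlparse

SUSPECT_WORDS = ("casino", "payday", "pharma")  # placeholder heuristics
SUSPECT_TLDS = (".xyz", ".info")                # placeholder heuristics

def is_suspect(url: str) -> bool:
    host = urlparse(url).netloc.lower()
    return host.endswith(SUSPECT_TLDS) or any(word in host for word in SUSPECT_WORDS)

with open("links.csv", newline="") as f:
    linking_urls = [row[0] for row in csv.reader(f) if row and row[0].startswith("http")]

suspects = sorted({urlparse(url).netloc for url in linking_urls if is_suspect(url)})

# The disavow tool accepts a plain-text file, one "domain:" line (or full URL)
# per line, with "#" comments. Review the list by hand before uploading it.
with open("disavow.txt", "w") as out:
    out.write("# Flagged by the audit script - review before submitting\n")
    for domain in suspects:
        out.write(f"domain:{domain}\n")

print(f"Flagged {len(suspects)} domains out of {len(linking_urls)} links.")
```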

Running these types of audits is like exercise for your website. When done regularly, audits keep your site healthy, remove unhealthy links from the profile, and make it easier to fight off outside attacks. If you’re regularly auditing your links, you’ll quickly spot any negative or blackhat SEO attempts. Everyone living in fear of Penguin updates is spending too much time being reactive; if you proactively manage your backlink profile, penalties will seem far less menacing.

Google AdWords Time Chart

Jon Diorio and the Google Ads account on Google+ announced today that a new feature is available in AdWords that will give you a better look at your data. It is a small addition, but many advertisers will find it very useful.

Beginning today, you can control the time aggregation on AdWords charts to show data down to a day-by-day view, or view it by week, month, or quarter. This way, you can see the big and small pictures with just a couple of clicks and keep track of shorter-term trends.

The announcement read:

Today, we’re making it easier and faster to get a customized view of how your performance is trending with a new button right above your chart in AdWords that lets you toggle between Daily, Weekly, Monthly, or Quarterly data (shown below). We hope this will save you time and make you more efficient while optimizing your search campaigns.

How fast does your website load on mobile devices? Under five seconds? If you said yes to that second question, you are probably pretty happy with your answer. What about under one second? Probably not. But that is how fast Google says sites should load, according to its newest guidelines for mobile phones.

Before you start freaking out at the suggestion that your site is supposed to load in under a second, it should be clear that Google isn’t mandating an insane guideline. They don’t actually expect most websites to completely load that quickly. Instead, they are focusing on the “above the fold” content: users should be able to start interacting with your page quickly, while the rest loads progressively.

It is probably a wise insight, considering most mobile users say they are more likely to leave a site the longer it takes to load. On smartphones, every second really counts, and if you can get the above-the-fold content loaded within a second, most users will happily wait for the rest while they start exploring.

The update reads:

“…the whole page doesn’t have to render within this budget, instead, we must deliver and render the above the fold (ATF) content in under one second, which allows the user to begin interacting with the page as soon as possible. Then, while the user is interpreting the first page of contents, the rest of the page can be delivered progressively in the background.”
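
There’s no substitute for testing on real devices and with Google’s own tools, but a rough first check is to measure how long your server takes to start returning HTML, since that time comes straight out of the one-second budget before any rendering happens. The sketch below is a minimal Python example under that assumption; the URL is a placeholder, and it only captures network and server time from wherever you run it, not rendering or above-the-fold layout.

```python
# Rough server-response check: time to first bytes and full HTML download.
# This says nothing about rendering, scripts, or above-the-fold layout.
import time
from urllib.request import urlopen

URL = "https://www.example.com/"  # placeholder: your page

start = time.monotonic()
with urlopen(URL, timeout=10) as response:
    response.read(1024)                    # first kilobyte of HTML
    ttfb = time.monotonic() - start
    response.read()                        # drain the rest of the document
total = time.monotonic() - start

print(f"Time to first bytes: {ttfb:.2f}s, full HTML download: {total:.2f}s")
print("The one-second figure covers rendering above-the-fold content, so this"
      " network time is only part of the budget.")
```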

To match the new guidelines, Google also updated its PageSpeed Insights tool to emphasize mobile scoring and suggestions over desktop scoring, and updated the scoring and ranking criteria to reflect the guideline changes.

Recently, Google updated the link schemes web page that gives examples of what it considers to be spammy backlinks. The additions are notable: article marketing and guest posting campaigns with keyword-rich anchor text have been included, along with advertorials containing paid links and links with optimized anchor text in press releases or articles.

With all the new additions, it can be hard to keep up to date with what Google is labeling spammy backlinks or backlink schemes. But, Free-SEO-News’ recent newsletter simply and efficiently lays out the 11 things that Google doesn’t like to see in backlink campaigns.

  1. Paid Links – Buying or selling links that pass PageRank has been frowned upon for a long time. This includes exchanging money for links or posts that contain links, sending ‘free’ products in exchange for favors or links, or directly exchanging services for links. It is pretty simple: buying links in any way will get you in trouble.
  2. Excessive Link Exchanges – Exchanging links with other relevant websites in your industry is absolutely normal, but overusing those links or cross-linking to irrelevant topics is a big sign of unnatural linking. Simple common sense will keep you out of trouble; just don’t try to trick the system.
  3. Large-Scale Article Marketing or Guest Posting Campaigns – Similar to the last scheme, posting your articles and guest posts on other websites is perfectly normal. However, doing it in bulk or posting the same articles to numerous websites will look like blogspam to Google. Likewise, if you do guest posts just to get keyword-rich backlinks, you will see similar penalties. Only publish on other websites when it makes sense and offers value.
  4. Automated Programs or Services to Create Backlinks – There are tons of ads for tools and services that promise hundreds or thousands of backlinks for a low price and very little work. While they may do what they say, Google also easily spots these tools and won’t hesitate to ban a site using them.
  5. Text Ads That Pass PageRank – If you’re running a text ad on another website, you have to make sure it uses the rel=nofollow attribute; otherwise it appears to be a manipulative backlink (a quick way to check a page for this is sketched after this list).
  6. Advertorials That Include Links That Pass PageRank – If you pay for an article or ad, always use the rel=nofollow attribute. Simply put, a paid link won’t do you any good without the attribute, and it can bring a lot of damage.
  7. Links with Optimized Anchor Text in Articles or Press Releases – Stuffing articles and press releases with optimized anchor text has been a strategy for a long time, but Google has shut it down recently. If your page has a link every four to five words, you’re probably looking at some penalties.
  8. Links From Low Quality Directories or Bookmark Sites – Submitting your site to hundreds of internet directories is an utter waste of time. Most links won’t ever get you a single visitor and won’t help your rankings. Instead, only focus on directories that realistically could get you visitors.
  9. Widely Distributed Links in the Footers of Various Websites – Another older trick Google has squashed is stuffing the footer with keyword-rich links to other websites. These links are almost always paid and are an obvious sign of a link scheme.
  10. Links Embedded in Widgets – It isn’t uncommon for widget developers to offer free widgets that contain links to other sites, or to reach out to site owners and offer advertising through those widgets. However, Google hates these links and considers them a scheme. I’d advise against it, but if you do advertise through widgets, use the nofollow attribute.
  11. Forum Comments With Optimized Links in the Post – It is very easy to get a tool that automatically posts to forums and includes links to websites. It is a pretty blatant form of spam that won’t get any actual visibility on the forums, and the links are more likely to get you banned than to draw a single visitor.
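
Several of the items above come down to the same mechanical fix: any link you were paid for should carry rel="nofollow". The sketch below is a minimal Python check that lists a page’s external links lacking the attribute; the URL is a placeholder, and the script can’t know which links were actually paid for, so it only produces a list for you to review.

```python
# List external links on a page that lack rel="nofollow". The script cannot
# tell paid links from editorial ones - that judgment is still up to you.
from html.parser import HTMLParser
from urllib.parse import urlparse
from urllib.request import urlopen

PAGE = "https://www.example.com/sponsored-post"  # placeholder: a page you control

class LinkAudit(HTMLParser):
    def __init__(self, own_host):
        super().__init__()
        self.own_host = own_host
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        host = urlparse(href).netloc
        rel = (attrs.get("rel") or "").lower()
        # Flag external links that pass PageRank (no nofollow) for review.
        if host and host != self.own_host and "nofollow" not in rel:
            self.flagged.append(href)

html = urlopen(PAGE).read().decode("utf-8", errors="replace")
audit = LinkAudit(urlparse(PAGE).netloc)
audit.feed(html)

for link in audit.flagged:
    print("No nofollow:", link)
```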

There’s a pretty obvious underlying trend in all of the tactics Google fights: they all attempt to create artificial links, usually in bulk. Google can tell the quality of a link, and all of these schemes are easily identifiable. Instead, focus on building legitimate, quality links and use respected tools such as SEOprofiler. It will take longer, but your site will do much better.

Google is great for finding quick, concise information, such as what you might find on Wikipedia or IMDB. If you have a simple question with an objective answer, the biggest search engine is the perfect tool. But, as anyone who has tried to do real research for school or work will tell you, Google is not so great at surfacing in-depth content.

You might find news, or maybe a couple of books and articles on Google Scholar, but the main search results on Google can be limiting: you get results from whoever has the best short answer to your question. Now Google is trying to change that, estimating that roughly 10% of searches come from people looking for more comprehensive information.

Google has announced that over the next few days it will be rolling out “in-depth articles” within the main search results. When searching for broader topics that warrant more information, such as stem cell research, or abstract topics such as happiness or love, Google will feature a block of results in the middle of the page, as shown below.

In-Depth Articles Screenshot

Source: The Official Google Search Blog

This is yet another way Google is forming results intuitively tailored to fit the type of information you are searching for. As a Google spokesperson told Search Engine Land, “Our goal is to surface the best in-depth articles from the entire web. In general, our algorithms are looking for the highest quality in-depth articles, and if that’s on a local newspaper website or a personal blog, we’d like to surface it.”

It has always been a little unclear how Google handles its international markets. We know they have engineers across the world, but anyone who has tried to search from outside the US knows the results can look like what Americans saw five years ago: a few good options mixed with a lot of spam. That’s a bit of an exaggeration, but Matt Cutts says we can expect international results to keep improving.

According to Cutts’ recent Webmaster Help video, Google fights spam globally using algorithms and manual actions taken by Google employees covering more than 40 regions and languages around the world. They also try to ensure their algorithms work in all languages, rather than just English.

Search Engine Roundtable points out that you could see this international attention when Penguin rolled out: earlier algorithm updates initially affected only English queries and reached other languages some time later, but Penguin launched in all countries and languages on the same day.

Matt Cutts did concede that English-language queries receive more attention, which has always been fairly obvious and understandable: there are far more searches in English, and it is the native language of most of the company’s engineers.

By now, the hacker craze of the ’90s and early 2000s has died down quite a bit. Most people don’t worry much about hackers, so long as they use solid anti-virus software and keep their router protected. Big businesses may have to worry about Anonymous’ hijinks, but the average person doesn’t tend to concern themselves with the issue. At first glance, hacking doesn’t seem like much of an SEO issue either.

But hackers can actually do your site real damage, and can even get it dropped entirely from the Google search index. Sites get blacklisted when hackers inject malicious code onto their servers, as Google seeks to protect searchers’ computers from being compromised.

While Google doesn’t immediately drop sites from its index, being blacklisted leads to a precipitous drop in organic traffic and can be a crisis for SEO. Blacklisting starts as a warning to searchers that a site may be compromised, and few will continue past that alarm.

This has become a rather significant problem for Google. To support the increasing number of webmasters dealing with compromised servers, Google has launched the ‘Webmasters Help for Hacked Sites’ support center, which gives detailed information on how to clean and repair your server and keep your site from being dropped from the Google index entirely.

If you think this sort of hacking isn’t a big deal, check out the charts below. They show just how frequent this type of malicious activity has become. It isn’t just banks and large corporations dealing with it; small businesses are just as much at risk as international franchises. The most common form of attack is an automated set of processes that indiscriminately discovers and exploits vulnerabilities on servers, which are often left completely unprotected.

Search Engine Journal recently explored the issue in more depth, unpacking why it is such a large concern for Google and webmasters alike. Compromised sites can destroy a search engine’s credibility just as easily as your own, so the problem has to be taken very seriously.

Timer

Source: WikiCommons

Everyone working in SEO knows that Google uses a multitude of factors to determine the order of search results, and the majority of these ranking factors are based either on the content of the webpage or on signs of authenticity and reputability. That was the case for a long time, but since 2010 Google has shifted significantly toward a focus on usability, and the harbinger of that change was the addition of website speed to its ranking factors.

The problem is, website speed and other usability issues aren’t objectively defined. What exactly is a slow-loading site? Where is the cutoff? No one has gotten a definitive answer from Google, but in June Matt Cutts explicitly stated that slow-loading sites, especially on mobile platforms, will begin seeing search ranking penalties soon.

Obviously these changes are good for searchers. Searchers want sites that load quickly, offer quality user experience, and deliver great content. And, the emphasis on speed is certainly highlighted on mobile platforms where on-the-go users are likely to go back to the results if the site takes too long for their liking. The issue we face as search optimization professionals is trying to figure out exactly what Google is measuring and how that information is being used.

Matt Peters from Moz, with the help of Zoompf, decided to cut through Google’s intentionally vague statements and figure out exactly how site speed affects rankings. They can’t definitively prove or disprove a causal link between site speed and rankings, given the number of other algorithmic ranking factors that complicate the study, but their results showed very little to no correlation between page load time and ranking.
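
For anyone curious how a study like that quantifies the relationship, the sketch below computes a Spearman rank correlation between load times and ranking positions on a small, made-up sample. The numbers are invented purely to illustrate the calculation; they are not data from the Moz/Zoompf study, and the helper functions are just one straightforward way to do it without external libraries.

```python
# Illustrative Spearman rank correlation between load time and SERP position.
# The sample data is invented; it is not from the Moz/Zoompf study.

def ranks(values):
    """Rank values from 1..n, averaging the ranks of tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    result = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            result[order[k]] = avg_rank
        i = j + 1
    return result

def spearman(xs, ys):
    """Pearson correlation of the rank-transformed values."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

load_times = [1.2, 0.8, 2.5, 1.9, 0.6, 3.1, 1.4, 2.2]  # seconds (made up)
positions = [3, 1, 5, 2, 4, 8, 6, 7]                    # SERP position (made up)

print(f"Spearman correlation: {spearman(load_times, positions):+.2f}")
# A value near zero would suggest, as Moz found, little relationship.
```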

I wouldn’t take this information as gospel, but it does suggest that loading time isn’t a huge consideration in long-tail searches and doesn’t need to be worried about too much. If your site loads quickly enough to please the people coming to it, it will likely pass Google’s expectations as well.