Tag Archive for: Google

Have you ever searched for a term only to find a page that says “we have no articles for [your search term]” and a whole bunch of ads? Most people have come across these sites with auto-generated content, often called “Made for AdSense” or MFA sites. These pages exist for the sole purpose of luring people in and hoping they click an AdSense ad to leave the page rather than hitting the back button.

Most of these websites use a script to automatically generate content by scraping snippets from search results or from web pages containing the target keywords. They don’t offer real content in any way and have absolutely no legitimate value, which makes many people wonder why these kinds of pages show up in Google’s search results at all.

One user directly asked Matt Cutts, Google’s head of webspam, whether the search engine is doing anything about these pages, such as issuing penalties or removing the sites from the index. As you would expect, Google already has a policy in place, and Cutts encourages users to report any pages like this they come across. He states:

We are absolutely willing to take action against those sites. We have our rules in our guidelines about auto-generated pages that have very little value and I have put out in the past specific calls for sites where you search for a product – a VCR, a laptop, or whatever – and you think you’re going to get a review, and the first thing you see is ‘0 Reviews found for [blah blah blah].’

As Google sees it, even search results pages from legitimate search engines don’t belong in the rankings. Users don’t like searching for something and being sent to another page of search results; they want to be directed straight to real content.

There are very few cases where search results snippets should be indexed. The only real exception is when you have exclusive data that no one else has. But there is never a time when a supposed search results page with zero results should be indexed.

To put it simply, Google is already fighting these sites. They aim to find and penalize all they can, but they also want people to file spam reports when possible so that as few as possible slip through the cracks.

Don’t say Google doesn’t at least try to listen to webmasters. Though many webmasters hold some pretty big (and often legitimate) grudges against the biggest search engine, it can’t be said that Google doesn’t try to reach out for opinions. One example of Google seeking feedback from site owners appeared last night, when Matt Cutts, Google’s head of webspam, tweeted a call for webmasters and SEOs to fill out a survey.

Specifically, Cutts called for owners of small but high-quality websites who believe they should be doing better in the rankings. Filling it out won’t affect your rankings immediately, but it may give Google information that helps them keep the playing field roughly even for small businesses and big companies alike. The form reads:

Google would like to hear feedback about small but high-quality websites that could do better in our search results. To be clear, we’re just collecting feedback at this point; for example, don’t expect this survey to affect any site’s ranking.

The survey asks only two short questions. First, it calls for the name and URL of the small site you believe should be ranking well. Second, Google would like to hear your opinion on why the site should rank higher. It is extremely straightforward and shouldn’t take most webmasters long to complete.

 

In an attempt to fix some confusing wording Google has been using, Matt Cutts, the company’s head of webspam, used his latest Webmaster Help video to clarify that page load speed is not any more important for rankings in mobile searches than in desktop searches.

This comes after Google has been publicly emphasizing the need for sites to load quickly, noting that mobile users are highly likely to leave a page if it doesn’t load fast enough. While Google isn’t backing off of that stance, Cutts wanted to make it clear that there isn’t a difference in how this speed is ranked from mobile to desktop.

If all things are equal, meaning all other aspects of two sites are ranked evenly, the site that loads faster will almost certainly be given the higher ranking in search results by Google, but that is true on smartphones and desktop computers alike. It is also just a sensible part of the algorithm, as slow pages will likely lose a large number of visitors just during the loading time, making it a lower-value site.

But, because internet speeds vary across devices and around the globe, Cutts said Google has no plans to give an exact number of seconds your site should load in. If it becomes obvious to Google that mobile users are more frustrated by slow sites than their desktop counterparts, they may consider weighting loading speed more heavily for mobile searches, but that isn’t the case yet, and there are no current plans to make it so.


 

We’ve all seen the cycle of Google updates. Every time there is a change to the algorithms, the blogs all light up with announcements, a fair sized group panics while the rest ride out the storm, and then the “how to recover” posts start rolling in. Eventually the excitement tapers off, and then it is time for a new update.

Probably the most shocking thing about all the commotion is how many people freak out in the first place. While some of Google’s changes are pretty significant, it isn’t as if they don’t warn webmasters ahead of time about the direction they are headed in ranking websites. They won’t give specifics, but they normally denounce a practice well before they start penalizing for it.

That is all my long-winded way of saying we don’t all have to be afraid of the next Penguin or Panda update. By simply following the best practice guidelines and keeping some solid tips in mind, you’ll find you have no reason to worry. Erin Everhart recently shared some great tactics you can use to keep your website in Google’s good graces.

1) Focus on Branding, Not on Ranking

It is no secret that Google isn’t actually a fan of a lot of what constitutes search engine optimization, mostly because of the way many try to take advantage of every loophole to get rankings. The common idea of SEO focuses solely on improving rankings, while Google wants to rank sites based on value to their consumers.

To start thinking like Google, you need to get your mind off of ranking and focus more on building your brand. If you search for any type of product like a flat screen TV, the results will be almost entirely brand names. Google views brand names as trustworthy and valuable parts of their community, and that goes for small businesses as well as large companies. Simply sponsoring events in the community and interacting with users in positive ways go a long way with search engines.

Of course, it would be naive to say the big brands don’t have advantages, but not for the reason you think. Google evaluates them the same way it evaluates everyone else; these brands are simply large enough that they never resort to the keyword stuffing and anchor text over-optimization that so many SEO professionals try to use.

2) Create a Good User Experience

Along the lines of taking your focus off of rankings, Google has been pleading with the SEO community to turn their attention to actually delivering quality experiences for users. The search engine wants to surface great sites that users will enjoy, not low-quality pages with the most optimization. To achieve this, it made site quality more important than link profiles and has been refining its guidelines to push for faster sites with better content.

For marketers and optimizers, this can be a little confusing. Who exactly defines a “high quality site,” and what are the criteria? Well, we know that the faster your site is, the better it will fare, but there are many more amorphous factors to deal with. The only real way to find out what your users will like, and how to make the highest quality site for them, is testing. Run every type of test you can: do user testing, do split testing, research your market.
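To make the testing advice concrete, here is a minimal sketch of the arithmetic behind reading a simple split test: a two-proportion z-test comparing the conversion rates of two page variants. The visit and conversion counts are hypothetical, and this is one common way to judge such a test, not something the article prescribes.

```python
# A minimal sketch of reading a simple split test: a two-proportion
# z-test comparing conversion rates of two page variants.
# All counts below are hypothetical.
from math import erf, sqrt

def split_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return both rates, the z-statistic, and a two-sided p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

p_a, p_b, z, p = split_test(120, 2400, 150, 2350)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}  p = {p:.3f}")
```

With these hypothetical numbers, variant B converts at about 6.4% versus 5.0% for A, with p ≈ 0.04, which most practitioners would read as a significant improvement.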

3) Preserve Your URLs

It may sound like outdated advice at this point, but it remains true that old URLs rank the best. The only reasons to resort to changing your URLs are to fix an absolute mess of site architecture or because you have no other choice. If there is any way you can avoid it, do. Canonicals and 301s reduce the equity you’ve built up, and new pages have to start all over again.

Instead, bigger companies like Apple use the same page for every new product launch, unless they release an entirely new product like the rumored upcoming smartwatch. They simply update the existing page to reflect the new product, while the old iPhone gets pushed to a new page. This way, you can take advantage of the equity you already have.
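If you do have to move a page, the practical follow-up is verifying that every old URL actually returns a 301 to its new home. Below is a minimal sketch assuming the third-party requests library; the URL pairs are hypothetical placeholders.

```python
# A minimal sketch for verifying that moved pages 301-redirect to
# their new homes, preserving the equity discussed above. Assumes
# the third-party `requests` library; the URL pairs are placeholders.
import requests

REDIRECTS = {
    "https://example.com/old-product-page": "https://example.com/product",
}

for old_url, expected in REDIRECTS.items():
    resp = requests.get(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code == 301 and location == expected:
        print(f"OK   {old_url} -> {location}")
    else:
        print(f"FAIL {old_url}: status {resp.status_code}, Location {location!r}")
```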

Conclusion

Focusing your SEO efforts on rankings isn’t sustainable any longer. You may shoot up the rankings more quickly than those building a high-quality campaign, but you’ll live in fear of every algorithm update, and eventually you will get hit. Unless you’re walking the straight and narrow, chances are you’ve already been penalized once.

How fast does your website load on mobile devices? Under five seconds? If you said yes, you are probably pretty happy with your answer. What about under one second? Probably not. But that is how fast Google says sites should load, according to its newest guidelines for mobile phones.

Before you start freaking out at the suggestion that your site is supposed to load in under a second, it should be clear that Google isn’t mandating an insane guideline. They don’t actually expect most websites to load completely that quickly. Instead, they are focusing on the “above the fold” content: they think users should be able to start interacting with your page quickly, while the rest loads progressively.

It is probably a wise insight, considering most mobile users say they are more likely to leave a site the longer it takes to load. On smartphones, every second really counts, and if you can get the above the fold content loaded within a second, most users will be happy to wait for the rest of the content while they start exploring.

The update reads:

“…the whole page doesn’t have to render within this budget, instead, we must deliver and render the above the fold (ATF) content in under one second, which allows the user to begin interacting with the page as soon as possible. Then, while the user is interpreting the first page of contents, the rest of the page can be delivered progressively in the background.”

To match the new guidelines, Google also updated its PageSpeed Insights tool to focus more on mobile scoring and suggestions than on desktop scoring, and updated the scoring and ranking criteria to reflect the guideline changes.
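For checking where your own pages stand, the PageSpeed Insights tool is also scriptable. Here is a minimal sketch that queries it for a mobile score; it assumes the current v5 REST endpoint and response shape (newer than the update described in this post), uses only the standard library, and example.com is a placeholder.

```python
# A minimal sketch that asks the PageSpeed Insights API for a mobile
# performance score. Assumes the v5 endpoint and response shape;
# example.com is a placeholder URL.
import json
import urllib.parse
import urllib.request

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def mobile_speed_score(url: str) -> float:
    query = urllib.parse.urlencode({"url": url, "strategy": "mobile"})
    with urllib.request.urlopen(f"{API}?{query}", timeout=60) as resp:
        data = json.load(resp)
    # Lighthouse reports performance as a score between 0 and 1.
    return data["lighthouseResult"]["categories"]["performance"]["score"]

print(f"mobile score: {mobile_speed_score('https://example.com') * 100:.0f}/100")
```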

Manual Actions Viewer Screenshot

Google has long alerted webmasters when it places a manual action against a site, but last week the company made it even easier to know for sure whether a site’s search rankings are being penalized by one. The search engine has added a new feature to Webmaster Tools called the Manual Actions viewer.

The Manual Actions viewer sits under the “Search Traffic” tab, and it is meant to act as a complementary alert to the email notifications Google already sends to websites receiving a manual action. With the new tool, webmasters don’t have to wait for an email; they can check their site’s condition at any time.

According to Google, less than two percent of all domains within its index are manually removed for spammy practices, so most legitimate webmasters will never see anything within the tool other than a display reading “No manual webspam actions found.”

However, for those who get targeted for spammy practices, the Manual Actions viewer will show existing webspam problems under two headings titled ‘site-wide matches’ and ‘partial matches’. They will also include information on what type of problem exists from a list of roughly a dozen categories including ‘hidden text and/or keyword stuffing’, ‘thin content’, and ‘pure spam’.

For the partial matches listed in the tool, Google also gives access to a list of affected URLs for each type of spam problem. For example, if you have a notification for thin content, you will be able to see all the URLs targeted. There is a limit of 1,000 URLs per problem category, but that should be plenty for all but massive websites like YouTube.

Within the tool, there is also quick access to a new ‘Request a Review’ button that appears any time there are manual actions listed. Clicking the button opens a pop-up window that lets you give Google details on how you have resolved the issues.

Recently, Google updated the link schemes web page that gives examples of what it considers to be spammy backlinks. The additions are pretty notable: article marketing and guest posting campaigns with keyword-rich anchor text are now included, as are advertorials with paid links and links with optimized anchor text in press releases or articles.

With all the new additions, it can be hard to keep up to date with what Google is labeling spammy backlinks or backlink schemes. But, Free-SEO-News’ recent newsletter simply and efficiently lays out the 11 things that Google doesn’t like to see in backlink campaigns.

  1. Paid Links – Buying or selling links that pass PageRank has been frowned upon for a long time. This includes exchanging money for links or posts that contain links, sending ‘free’ products in exchange for favors or links, or directly exchanging services for links. It is pretty simple: buying links in any way will get you in trouble.
  2. Excessive Link Exchanges – While exchanging links with other relevant websites in your industry is absolutely normal, over-using those links or cross-linking to irrelevant topics is a big sign of unnatural linking. Simple common sense will keep you out of trouble; just don’t try to trick the system.
  3. Large-Scale Article Marketing or Guest Posting Campaigns – Similar to the last scheme, posting your articles and guest posts on other websites is perfectly normal. However, doing it in bulk or posting the same articles to numerous websites will look like blogspam to Google. Likewise, if you do guest posts just to get keyword-rich backlinks, you will see similar penalties. Only publish on other websites when it makes sense and offers value.
  4. Automated Programs or Services to Create Backlinks – There are tons of ads for tools and services that promise hundreds or thousands of backlinks for a low price and very little work. While they may do what they say, Google also easily spots these tools and won’t hesitate to ban a site using them.
  5. Text Ads That Pass PageRank – If you’re running a text ad on another website, you have to make sure to use the rel=nofollow attribute; otherwise it appears to be a manipulative backlink. (A quick way to audit a page’s outbound links for this is sketched after this list.)
  6. Advertorials That Include Links That Pass PageRank – If you pay for an article or ad, always use the rel=nofollow attribute. Simply put, if you paid for an ad or article, it won’t do you any good and can bring a lot of damage if you don’t use the attribute.
  7. Links with Optimized Anchor Text in Articles or Press Releases – Stuffing articles and press releases with optimized anchor text has been a strategy for a long time, but Google has shut it down recently. If your page has a link every four to five words, you’re probably looking at some penalties.
  8. Links From Low Quality Directories or Bookmark Sites – Submitting your site to hundreds of internet directories is an utter waste of time. Most links won’t ever get you a single visitor and won’t help your rankings. Instead, only focus on directories that realistically could get you visitors.
  9. Widely Distributed Links in the Footers of Various Websites – Another older trick that Google has squashed is stuffing footers with keyword-rich links to other websites. These links are always paid links and are an obvious sign of link schemes.
  10. Links Embedded in Widgets – It isn’t uncommon for widget developers to offer free widgets that contain links to other sites, nor for these developers to reach out to site owners and offer to advertise through the widgets. However, Google hates these links and considers them a scheme. I’d advise against it, but if you do advertise through these widgets, use the nofollow attribute.
  11. Forum Comments With Optimized Links in the Post – It is very easy to get a tool that automatically posts to forums and includes links to websites. It is a pretty blatant form of spam that won’t get any actual visibility on the forums, and the links are more likely to get you banned than to draw a single visitor.
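Most of these schemes boil down to followed links that should have been nofollowed. As a starting point for checking your own pages, here is a minimal sketch, using only Python’s standard library, that flags outbound links missing rel="nofollow"; the example.com URL is a placeholder, and any hit still needs human judgment about whether the link is paid.

```python
# A minimal sketch that flags outbound links on a page missing
# rel="nofollow", as a starting point for reviewing paid or widget
# links. Standard library only; example.com is a placeholder.
from html.parser import HTMLParser
import urllib.request

class LinkAuditor(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attr_map = dict(attrs)
        href = attr_map.get("href") or ""
        rel = attr_map.get("rel") or ""
        # External links without nofollow deserve a manual review.
        if href.startswith("http") and "nofollow" not in rel:
            print(f"followed link: {href}")

with urllib.request.urlopen("https://example.com") as resp:
    LinkAuditor().feed(resp.read().decode("utf-8", errors="replace"))
```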

There’s a pretty obvious underlying trend in all of these tactics that Google fights: they all attempt to create artificial links, usually in bulk. Google can tell the quality of a link, and all of these schemes are easily identifiable. Instead, focus on building legitimate, quality links, and use respected tools such as SEOprofiler. It will take longer, but your site will do much better.

Google is great for finding quick, concise information, such as what you might find on Wikipedia or IMDB. If you have a simple question with an objective answer, the biggest search engine is the perfect tool. But, as anyone who has tried to do real research for academics or work will tell you, Google is not so great at providing in-depth content.

You might find news, or maybe a couple of books and articles on Google Scholar, but the main search results on Google can be limiting: you get results from the people with the best short answers to your questions. Now Google is trying to change that, as it estimates roughly 10% of searches come from people looking for more comprehensive information.

Google has announced that over the next few days it will be rolling out “in-depth articles” within the main search results. Now, when searching for broad topics that warrant more information, such as stem cell research, or abstract topics such as happiness or love, Google will feature a block of results in the middle of the page, like the one below.

In-Depth Articles Screenshot

Source: The Official Google Search Blog

This is yet another way Google is forming results intuitively tailored to fit the type of information you are searching for. As a Google spokesperson told Search Engine Land, “Our goal is to surface the best in-depth articles from the entire web. In general, our algorithms are looking for the highest quality in-depth articles, and if that’s on a local newspaper website or a personal blog, we’d like to surface it.”

It has always been a little unclear how Google handles its international market. We know they have engineers across the world, but anyone who has tried to search from outside the US knows the results can seem like what Americans would have seen five years ago: a few good options mixed with a lot of spam. That’s a bit of hyperbole, but Matt Cutts says we can expect the results to keep getting better moving forward.

According to Cutts’ recent Webmaster Help video, Google does fight spam globally using algorithms and manual actions taken by Google employees stationed in over 40 different regions and languages around the world. In addition, they also try to ensure all of their algorithms will work in all languages, rather than just English.

Search Engine Roundtable points out that you could see this international attention when Panda originally rolled out: at first it affected only English queries, but it was released for other languages soon after. With Penguin’s release, however, all countries saw the update on the same day.

Matt Cutts did concede that English-language queries in Google receive more attention, which has always been fairly obvious and understandable: there are far more searches in English, and it is the native language of the majority of the company’s engineers.

By now, the hacker craze of the ’90s and early 2000s has died down quite a bit. Most people don’t worry about hackers all that much, so long as they use some solid anti-virus software and keep their router protected. Big businesses may have to worry about Anonymous’ hijinks, but ordinary people don’t tend to concern themselves with the issue. Hacking especially doesn’t seem like a big issue for SEO, at first.

But hackers can actually do your site some damage, and can even get it dropped entirely from the Google search index. Sites get blacklisted when hackers inject malicious code onto their servers, as Google seeks to protect searchers’ computers from being compromised.

While Google doesn’t immediately drop sites from their index, being blacklisted leads to a complete drop in organic traffic and can be a crisis for SEO. Blacklisting starts as a warning to searchers that a site may be compromised, and few will continue past that alarm.

This has become a rather significant problem for Google. To provide wide support for the increasing number of webmasters dealing with compromised servers, Google has launched the ‘Webmasters Help for Hacked Sites‘ support center, which gives detailed information on how to clean and repair your server and keep your site from being dropped from the Google index entirely.
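As a first triage step before the deeper cleanup Google describes, you can scan your web root for markers that injected code commonly leaves behind. The sketch below is illustrative only: the /var/www/html path and the patterns are assumptions rather than an official or exhaustive checklist, and any match is just a lead to inspect by hand.

```python
# A minimal sketch for triaging a possibly hacked site: scan the web
# root for markers injected code commonly leaves behind. The path and
# patterns are illustrative assumptions, not an official checklist;
# treat any hit as a lead to inspect by hand.
import re
from pathlib import Path

WEB_ROOT = Path("/var/www/html")  # hypothetical server document root
SUSPICIOUS = [
    re.compile(rb"eval\s*\(\s*base64_decode"),       # obfuscated PHP payloads
    re.compile(rb"gzinflate\s*\(\s*base64_decode"),  # compressed payloads
    re.compile(rb"<iframe[^>]*display\s*:\s*none"),  # hidden iframes
]

for path in WEB_ROOT.rglob("*"):
    if not path.is_file() or path.suffix not in {".php", ".js", ".html"}:
        continue
    data = path.read_bytes()
    for pattern in SUSPICIOUS:
        if pattern.search(data):
            print(f"suspicious: {path} matched {pattern.pattern!r}")
```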

If you think this sort of hacking isn’t a big deal, consider just how frequent this type of malicious activity has become. It isn’t just banks and large corporations dealing with it; small businesses are just as much at risk as international franchises. The most common form of attack is an automated set of processes that indiscriminately discovers and exploits vulnerabilities on servers, which are often left completely unprotected.

Search Engine Journal recently explored the issue in more depth, unpacking why it is such a large concern for Google and webmasters alike. Compromised sites can destroy a search engine’s credibility just as easily as your own, so the problem has to be taken very seriously.