Posts

Those who have followed search trends over the past couple of years have likely heard that being the first result on the search engine results pages (SERPs) is not as important as it once was, mostly because the results we see are now customized based on location and user habits.

This is absolutely still the case for desktop searching, but a new click-through ranking study conducted by seoClarity suggests the difference between first and second in mobile search results may be the difference between success and failure. Their findings show such a large drop-off between the first and second rankings that there was no notable difference between the second listing and those that followed.

[Graph: seoClarity mobile click-through rate model]

In the graph of mobile click-through rates, you can see the first result receives nearly three times as many clicks as the second, while desktop rankings continue to show a more gradual slope after the first result.
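As a rough illustration of the math behind those graphs, click-through rate is simply clicks divided by impressions. The numbers below are hypothetical, chosen only to mirror the roughly three-to-one mobile drop-off the study describes; they are not taken from the seoClarity report:

```python
# Click-through rate (CTR) = clicks / impressions.
# All figures here are hypothetical, for illustration only.
def ctr(clicks, impressions):
    return clicks / impressions

# Hypothetical mobile results for a single query:
first = ctr(2700, 10000)   # position 1
second = ctr(900, 10000)   # position 2

print(f"Position 1 CTR: {first:.1%}")   # 27.0%
print(f"Position 2 CTR: {second:.1%}")  # 9.0%
print(f"Ratio: {first / second:.1f}x")  # 3.0x
```

On desktop, per the study, the same calculation would show a much gentler decline from one position to the next.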

[Graph: seoClarity desktop click-through rate model]

To conduct the study, seoClarity examined over 2 billion impressions of Google Webmaster Tools data over a 90-day period between June and August. You can download and view the full report here.

Much has been made of the announcement that Google would factor switching from HTTP to HTTPS into their ranking algorithm. Despite Google clearly stating in the initial announcement that the factor would be lightweight, the possibility of a relatively easy rankings boost drove many people to make the switch immediately.

In the aftermath, studies from analytics groups such as SearchMetrics have suggested that any effect switching URLs might have is largely unnoticeable. Now, Google’s John Mueller has essentially admitted that the signal is currently too lightweight to have any noticeable effect, though that may change at some point in the future.

At 22 minutes and 21 seconds into a recent video hangout, Mueller explained that HTTPS is a ranking signal, but only a “very lightweight signal,” and that while it may be strengthened eventually, there are no plans to change it anytime soon.

Jennifer Slegg was the first to report Mueller’s statement and transcribed it:

I wouldn’t expect any visible change when you move from http to https, just from that change, just from SEO reasons. That kind of ranking effect is very small and very subtle. It’s not something where you will see a rise in rankings just from going to https.

I think that in the long run, it is definitely a good idea, and we might make that factor stronger at some point, maybe years in the future, but at the moment you won’t see any magical SEO advantage from doing that.

That said, anytime you make significant changes in your site, change the site’s URLs, you are definitely going to see some fluctuations in the short term. So you’ll likely see some drop or some changes as we recrawl and reindex everything. In the long run, it will settle down to about the same place, it won’t settle down to some place that’s like a point higher or something like that.

You can see the video below:


Yesterday morning, Bill Slawski from SEO By The Sea discovered that Google has been granted a patent which suggests they are working on a method to use information about what is showing on television in your area as a ranking signal in search results.

The patent follows Google’s trend of individualizing search results based on personal tastes and location, and similar functionality is in some ways already in use within Google Now. However, if the method described in the patent is implemented, TV schedules could have a much larger impact on your results.

The specific patent is named “System and method for enhancing user search results by determining a television program currently being displayed in proximity to an electronic device.” It was filed on June 30, 2011.

Here is the abstract for the patent:

A computer implemented method for using search queries related to television programs. A server receives a user’s search query from an electronic device. The server then determines, in accordance with the search query and television program related information for television programs available at a location associated with the electronic device during a specific time window, a television program currently being displayed in proximity to the electronic device, wherein the television program related information includes program descriptions for a plurality of television programs being broadcast for the associated location.

Basically, the patent would allow Google to take note of what you are watching and instantly include that information within their ranking algorithm. Presumably, this would make it easier to search for products shown during commercials or for more information about the show. As explained in the patent:

Someone watching a TV program with a segment about a particular model of Porsche might execute a search query for “Porsche” or “sports cars” instead of the designation of the particular model that was the subject of the segment….

Given that the Porsche model in question is a “911 Turbo,” and that the user executed a search query for “Porsche,” the server can return information about one or more of:

1) the “911 Turbo” model (e.g., a link to information on the Porsche.com website about the “911 Turbo”),

2) information about the TV program that is currently airing with that segment, and

3) suggestions of similar programming that is currently airing or airing in the future and that is available to the user.

In this way, implementations provide enhanced search results to viewers of live TV that are relevant to the content of TV programs that they are watching or are likely to be interested in watching.
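The patent does not publish an implementation, but the matching idea it describes (resolve a vague query like “Porsche” against the descriptions of programs currently airing near the user) can be sketched roughly. Everything below — the `Program` class, the schedule data, and the `match_airing_program` helper — is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Program:
    title: str
    description: str

def match_airing_program(query, airing_programs):
    """Return the first currently-airing program whose description
    mentions the search query, or None if nothing matches.
    A naive sketch; a real system would use far richer matching."""
    q = query.lower()
    for program in airing_programs:
        if q in program.description.lower():
            return program
    return None

# Hypothetical schedule for the user's location and time window:
schedule = [
    Program("Morning News", "Local headlines and weather."),
    Program("Auto Week", "A segment on the Porsche 911 Turbo sports car."),
]

hit = match_airing_program("Porsche", schedule)
print(hit.title)  # Auto Week
```

With a match in hand, the server could then enhance the results as the abstract describes: link to the specific “911 Turbo” model, the airing program, and similar upcoming programming.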

The patent also provides a diagram which explains how the method would work:

[Diagram: Google’s TV-aware search process, from the patent]

Ultimately, it is up to Google whether you can expect to see this idea included in future search algorithms. As Google has said before, just because they have patented something doesn’t mean they will definitely be using it. But, Search Engine Land also pointed out that Google Now can already perform a very similar task.

If you opt in, Google Now is already capable of listening for information about what you’re watching and updating your TV cards accordingly.

Seeing as Google isn’t giving away their search engine ranking factors playbook anytime soon, many people working in the search industry work constantly to discern as much as possible about how the biggest search engine ranks websites. One such group is SearchMetrics, who release a yearly ranking factors study.

As of yesterday, SearchMetrics’ 2014 ranking factors study is available, and they claim this year’s is the largest study they have ever done, with almost 100 pages and dozens of new ranking factors to review, such as time on site, bounce rate, and fresh links, among others.

Most importantly, the study may answer one of the biggest SEO questions of the year: is content really the new king of search marketing? According to this report, the mantra of the SEO industry over the past few months is in fact true, as Marcus Tober comments that content is “no longer an addition to, but is the main focus of, SEO.”

Barry Schwartz broke down the most prominent ranking factors for Search Engine Land if you want the quick version, or you can get the full report directly from the source here.

[Chart: SearchMetrics 2014 ranking factors]

A few weeks ago, Google announced they would begin favoring sites that switch to HTTPS in search results. At the time of the announcement, most of the SEO community was skeptical at best, and few believed the HTTPS ranking factor would have any effect on rankings whatsoever. Well, it has been a couple of weeks, and we have the verdict.

The skeptics were absolutely right.

SearchMetrics decided to evaluate whether HTTPS had any discernible effect on search results of any form. According to Marcus Tober of SearchMetrics, there is no data to prove HTTPS has any effect on Google rankings after the launch of the ranking factor.

In a nutshell: No relationships have been discernible to date from the data analyzed by us between HTTPS and rankings nor are there any differences between HTTP and HTTPS. In my opinion therefore, Google has not yet rolled out this ranking factor – and/or this factor only affects such a small section of the index to date that it was not possible to identify it with our data.

Tober shared his data along with his report, and it matches all the anecdotal evidence available as well. Site owners across the web rushed to switch their sites over to the newly favored HTTPS, but there is nary a single story I could find suggesting it had any ranking influence at all.

At the time of the announcement, Google did suggest that switching over could possibly influence rankings, but they also called it a “very lightweight signal,” so there’s no need to grab your pitchforks. But these results may hold some lessons for those who were expecting an easy and quick rankings boost with minimal work.

For an industry that relies on as much data as the SEO market does, there is never much certainty that the popular optimization tactic being preached at the moment is a legitimate strategy. We rarely have the definitive answers from the source needed to keep all the confusion down, and new myths seem to spring up overnight.

To counter the constant flow of SEO myths, Google’s distinguished engineer Matt Cutts used one of his recent Webmaster Help videos to debunk many of the misconceptions surrounding the world’s most popular search engine.

This isn’t the first time Cutts has used his regular video message to debunk SEO myths, but this time he focuses on a specific type of myth that has become increasingly widespread as Google seems to keep narrowing their guidelines and offering greater space to ads.

Cutts starts by tackling the myth “if you buy ads you’ll rank higher in Google” and the opposing legend that not buying ads is the key to high rankings. In Matt Cutts’ perspective, these fables are tied to the notion that Google makes all of their decisions in an effort to force webmasters to buy more ads.

The problem with that idea is that it doesn’t actually reflect how Google thinks about their operations. The fact is, webmasters are rarely the main priority for the search engine to begin with. Instead, according to Cutts, Google’s rationale behind all changes is simply that they want to return the best search results possible to keep users happy and keep them coming back.

Of course, no one is denying that Google would like users to see ads and generate revenue, but that is never the prime motivation for changes like algorithm updates.

On a similar note, Matt uses the second half of the video to discuss the offers he sees for software packages that claim to help users make money and magically fix their SEO – for a small fee, of course.

Just as you can’t buy your way to high rankings with ads, the chances of a randomly purchased software package making you money are almost zero. Matt lays out another scenario: “If someone had a foolproof way to make money online, they would probably use that way to make money rather than packaging it up in an ebook and selling it to people.”

In the end, most of the myths are born out of a misunderstanding of Google’s goals. Too many SEO professionals think of their job strictly in terms of increasing visibility and rankings, or upping their ROI. But the search engines are just looking for the best content possible. You can spend your time trying to game and cheat to get to the top, or you can align yourself with the search engine and try to provide users something of value. According to Cutts, that should be enough to fix many of the problems less honest SEOs tend to run into.

You can watch the full video below:

Pretty much anything Google’s most popular engineer Matt Cutts says makes headlines in the SEO community, but his Webmaster Chat videos and advice often aren’t mind-blowing by any stretch of the imagination. For instance, we recently covered a video where Cutts explained that bad grammar in the comment section most likely won’t hurt your ranking (unless you allow spam to run rampant).

For content creators, it was a legitimate concern that poorly written comments might negate the hard work put into writing legible and well-constructed content. However, many used this to run headlines claiming that Google doesn’t care about grammar, which is not even close to being confirmed.

As Search Engine Land points out, way back in 2011, Cutts publicly stated that there is a correlation between spelling and PageRank, but Google does not use grammar as a “direct signal.” But, in his latest statement on the issue Cutts specifies that you don’t need to worry about the grammar in your comments “as long as the grammar on your own page is fine.” This suggests Google does in fact care about the level of writing you are publishing.

It is unclear exactly where the line is for Google at the moment, as they imply that grammar within your content does matter, but they have never stated it is a ranking signal. Chances are a typo or two won’t hurt you, but Google may well punish pages with rampant errors and legibility issues.

On the other hand, Bing has recently made it pretty clear that they do care about technical quality in content as part of their ranking factors. Duane Forrester shared a blog post on the Bing Webmaster Blog which states, “just as you’re judging others’ writing, so the engines judge yours.”

Duane continues, “if you [as a human] struggle to get past typos, why would an engine show a page of content with errors higher in the rankings when other pages of error free content exist to serve the searcher?”

In the end, it all comes down to search engines trying to provide the best quality content they can. The search engines don’t want to direct users to content that will be hard to make sense of, and technical errors can severely impact a well thought-out argument.

As always, the best way to approach the issue is to simply write content for your readers. If your content can communicate clearly to your audience, the search engines shouldn’t have any problems with it. But, if a real person has trouble understanding you, the search engines aren’t going to do you any favors.

If there is one way to concisely describe the changes Google’s search algorithms have gone through in the past couple of years, it would boil down to “bigger is not always better.” Gone are the days when you could jam as many keywords as would fit into a paragraph of text, or buy up thousands of links, and hope to rank highly.

However, the more you do to offer quality content and information to your users while staying in line with Google’s practices, the more success you’ll see.

Those two ideas are fairly common knowledge now, but they have created their own fair share of questions. Where should the balance between quantity and quality lie? How is this content evaluated? Does quantity of content outweigh quality of content?

Google has given some insight into how content is evaluated in the past, and it is clear that you won’t get far with an excessive amount of paper-thin content. Still, the number of indexed pages your site has does indeed have an effect on your ranking. So how exactly does this work and what is the balance?

Matt Cutts, Google’s head of Webspam, addressed this type of issue head-on in his most recent Webmaster Chat video. He was asked, “Does a website get a better overall ranking if it has a large amount of indexed pages?”

Cutts explained that having more indexed pages isn’t a magic ticket to higher rankings. He said, “I wouldn’t assume that just because you have a large number of indexed pages that you automatically get a high-ranking. That’s not the case.”

However, having more indexed pages does have some clear benefits. The more pages you have, the more opportunities you have to rank for different keywords. But, this is only because you should be covering a larger variety of keywords and topics across that larger group of pages.

A larger number of indexed pages is also likely to improve your overall links and PageRank, which can affect your ranking. But the link isn’t direct. Simply having more pages won’t improve much for you. Instead, you have to use those extra pages to deliver valuable content and information to your users. If you’re just filling your site with a wealth of meaningless pages to be indexed, you won’t be seeing any improvement anytime soon.

If you ask some marketing professionals, they may act as if it is common knowledge that Google +1’s help raise your rankings in the search engine results. However, that “knowledge” is more an assumption based on a few correlation studies, such as those done by Searchmetrics and Moz. These studies found an extremely high correlation between Google +1’s and high rankings, but as you should know, correlation does not equal causation.

In fact, Google’s most prominent mouthpiece and Distinguished Engineer, Matt Cutts, has openly debunked the theory that more +1’s lead to higher rankings. But that only sparked more debate. Whether there is a causative link between the two is much fuzzier than many might tell you.

In an attempt to get to the bottom of this question, Stone Temple Consulting decided to conduct a real study of the effect Google +1’s have on search rankings. The difference is this study would be a real examination of causation, not correlation. The result: “Google Plus Shares did not drive any material ranking changes that we could detect.”

Eric Enge, leader of the study, did admit there were some possible limitations. One of the biggest issues is the number of links that never show up in the monitoring tools used in the study. By Enge’s estimate, the cumulative links found by Open Site Explorer, Majestic SEO, and Ahrefs cover at best 50 percent of the total links to a site – and possibly as little as 30 percent.

There was also a fair chance that the general ranking movement and algorithm adjustments that are always occurring went unnoticed in the study. In general, all studies of this sort are also very vulnerable to Google’s complexity. There are so many undisclosed factors involved that any number of things may not have been taken into account.

Enge admits to these issues early, but he still stands by his study and its findings. He published a full review and report of the study and its methodology on Stone Temple Consulting’s website earlier this week. You can find all of the dirty details there, but the simplest conclusion is that Google+ shares are not driving up rankings. There will of course be many who still don’t believe this, and the debate will go on, but it tilts the scales away from what many considered conventional wisdom.

Don’t say Google doesn’t at least try to listen to webmasters. Though many webmasters hold some pretty big (and often legitimate) grudges against the biggest search engine, the company does at least reach out for opinions. One example of Google seeking feedback from site owners appeared last night, as Matt Cutts, Google’s head of webspam, tweeted out a call for webmasters and SEOs to fill out a survey.

Specifically, Cutts called for owners of small but high-quality websites who believe they should be doing better in the rankings than they are. It won’t affect your rankings immediately, but it may give Google some information that will help them keep the playing field relatively even for small businesses and big companies alike. The form reads:

Google would like to hear feedback about small but high-quality websites that could do better in our search results. To be clear, we’re just collecting feedback at this point; for example, don’t expect this survey to affect any site’s ranking.

The survey only asks two short questions. First, it calls for the name and URL of the small site you believe should be ranking well. Secondly, Google would obviously like to hear your opinion about why the site should rank higher. It is extremely straightforward, and shouldn’t take all that long for most webmasters to complete.