Tag Archive for: Google algorithms

Panda

Webmasters using “thin” or poor-quality content may have seen a drop in traffic this week, as Google has announced the release of the latest version of its Panda update.

According to a post on Google+, the “slow rollout” began early this week and will continue into next week before being complete.

While those trying to do the bare minimum to improve rankings may have reason for concern, the new update could also be a relief to many who say they were improperly affected by previous updates as this update is intended to be more precise. As the announcement says:

Based on user (and webmaster!) feedback, we’ve been able to discover a few more signals to help Panda identify low-quality content more precisely. This results in a greater diversity of high-quality small- and medium-sized sites ranking higher, which is nice.

Those who were affected by previous updates may also welcome the latest release, as it means anyone who has made the right changes since the last update finally has a chance to bounce back.


Yesterday morning, Bill Slawski from SEO By The Sea discovered that Google has been granted a patent which suggests they are working on a method to use information about what is showing on television in your area as a ranking signal in search results.

The patent follows Google’s trend of individualizing search results based on personal tastes and location, and in some ways the idea is already in use within Google Now. However, if the method described in the patent is implemented, TV schedules could have a much larger impact on your results.

The specific patent is titled “System and method for enhancing user search results by determining a television program currently being displayed in proximity to an electronic device.” It was filed on June 30, 2011.

Here is the abstract for the patent:

A computer implemented method for using search queries related to television programs. A server receives a user’s search query from an electronic device. The server then determines, in accordance with the search query and television program related information for television programs available at a location associated with the electronic device during a specific time window, a television program currently being displayed in proximity to the electronic device, wherein the television program related information includes program descriptions for a plurality of television programs being broadcast for the associated location.

Basically, the patent would allow Google to make note of what you are watching and instantly include that information within their ranking algorithm. Presumably, this would make it easier to search for products shown during commercials or for more information about the show itself. As explained in the patent:

Someone watching a TV program with a segment about a particular model of Porsche might execute a search query for “Porsche” or “sports cars” instead of the designation of the particular model that was the subject of the segment….

Given that the Porsche model in question is a “911 Turbo,” and that the user executed a search query for “Porsche,” the server can return information about one or more of:

1) the “911 Turbo” model (e.g., a link to information on the Porsche.com website about the “911 Turbo”),

2) information about the TV program that is currently airing with that segment, and

3) suggestions of similar programming that is currently airing or airing in the future and that is available to the user.

In this way, implementations provide enhanced search results to viewers of live TV that are relevant to the content of TV programs that they are watching or are likely to be interested in watching.
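The pipeline the patent describes is straightforward: take the user’s query, look up which programs are airing at the user’s location during the current time window, and match the query against those programs’ descriptions. Here is a minimal sketch of that logic — all names, data structures, and the scoring heuristic are hypothetical illustrations, not Google’s actual implementation:

```python
def match_airing_program(query, schedule, location, now):
    """Find the program airing near the user whose description best
    matches the search query, as the patent's abstract describes."""
    # Programs available at the user's location during the time window.
    candidates = [
        p for p in schedule.get(location, [])
        if p["start"] <= now < p["end"]
    ]
    best, best_score = None, 0
    for p in candidates:
        # Naive relevance heuristic: count query terms that appear in
        # the program description (a stand-in for real text matching).
        score = sum(term.lower() in p["description"].lower()
                    for term in query.split())
        if score > best_score:
            best, best_score = p, score
    return best


# Usage: a viewer searches "Porsche" while a car show is airing locally.
schedule = {
    "US-CA": [
        {"title": "Top Autos", "start": 20, "end": 21,
         "description": "A segment about the Porsche 911 Turbo sports car."},
    ]
}
program = match_airing_program("Porsche", schedule, "US-CA", now=20.5)
# If a program matches, the results page could then be augmented with the
# specific model from the segment, the program itself, and similar shows.
```

With a match in hand, the server can return the three kinds of enhanced results the patent lists: the specific product, the airing program, and related programming.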

The patent also provides a diagram which explains how the method would work:

[Diagram: google-tv-process-diagram]

Ultimately, it is up to Google whether this idea shows up in future search algorithms. As Google has said before, just because they have patented something doesn’t mean they will definitely use it. But Search Engine Land also pointed out that Google Now can already do a very similar task.

If you opt in, Google Now is already capable of listening for information about what you’re watching and updating TV cards accordingly.


Last night Google breathed new life into its long-neglected local search algorithm, updating it to provide more useful, relevant, and accurate local search results that are more closely tied to traditional web search ranking signals.

While Google has remained mum on many of the details, we do know the changes are visible within Google Maps search results as well as traditional search results. Judging from the online discussion, local businesses are also seeing significant ranking and traffic boosts, as most responses have been positive.

Most of the changes are behind the scenes, which Google doesn’t want to share with the world. However, Barry Schwartz shared that the new algorithm ties deeper into web search than previously by linking it to search features such as Knowledge Graph, spelling correction, synonyms, and more.

Google also says the new algorithm improves their distance and location ranking parameters.

The algorithm is already rolling out for US English results, but Google wouldn’t say when to expect it to roll out for other regions and languages, nor would they comment on what percentage of queries has been affected or whether web spam algorithms were included in the update.

As a business owner with an eye on your company’s online marketing success, you have likely heard about Google’s search engine algorithms. You may even have a general idea of how they function and affect your business’s online presence and marketing strategies.

But, unless you spend your free time reading all the SEO blogs, you probably have questions about how these algorithms work. If your company operates internationally, one of those questions is very likely whether Google’s algorithms work the same around the world.

While the algorithms largely tackle the same issues, the short answer is that they do not all work the same on an international scale.

As Barry Schwartz recently highlighted, you can find specific examples of when algorithms vary across borders by looking at the Google Panda algorithm. The algorithm was initially launched for English language Google engines in February 2011, but the rest of the globe didn’t see the algorithm roll out for quite some time. Notably, it took 17 months for Google to release Panda in Korea and Japan to target Asian languages.

However, the Google Penguin algorithm didn’t have nearly the same delay. Penguin rolled out globally and impacted sites in any language.

What’s the reason for the difference? It all boils down to purpose. The Panda algorithm focuses on language and content, and those signals have to be customized for the wide variety of languages found around the world. Meanwhile, algorithms like Penguin target off-page technical factors like links, which require far less customization.


No matter what you personally think about Google, there are two undeniable facts about the massive company: they are the number one source of online searches by a wide margin, and they are constantly changing. Trying to keep track of all the individual updates from Google can be dizzying. It seems every time you have almost adjusted to one change, a new update pops up.

But, following the changes over at Google is important for anyone running a website. There are some pretty clear patterns in Google’s updates over the past year, and if you want your website to be successful through 2014, you will need to be prepared for the types of changes on the horizon.

To assist you in reviewing the changes from last year, E2M Solutions produced an infographic that covers a few of the most important updates to Google Search during 2013. As you might expect, Penguin and Panda are both big parts of the infographic. But there are also some lesser-known search updates, such as Google Hummingbird.

The infographic isn’t perfect, however. Search Engine Land points out that Hummingbird was not rolled out on August 20, 2013, as the infographic lists. Also, “Link Devaluation” has never been confirmed by Google, so it remains speculation. It is fairly clear that links have lost some of their power in the past year, but how that was actually implemented is debatable.

You can view the infographic below, or over at E2M’s website.

[Infographic: 11 Most Important Google Algorithm Changes of 2013]

A couple weeks ago, Google released an update aimed directly at the “industry” of websites which host mugshots, which many aptly called the Mugshot Algorithm. It was one of the more specific updates to search in recent history, essentially meant to target sites aiming to extort money from those who had been arrested. Google purposefully targeted sites that were ranking well for people’s names while displaying arrest photos, names, and details.

After a week went by without a response, you could be forgiven for thinking that was the end of the issue, but one of the biggest sites affected, Mugshots.com, has finally responded publicly to Google’s update. Barry Schwartz reported that Mugshots.com published a blog post in which they claim Google is endangering the safety of Americans.

Mugshots.com was among the three sites that suffered the most from the algorithm, the others being BustedMugshots and JustMugshots.

In their statement, they say, “Google’s decision puts every person potentially at risk who performs a Google search on someone.”

If Mugshots.com could tone down the theatrics, they might have been able to make a reasonable argument. However, they also ignore that employers and even ordinary citizens have many other ways to find arrest records and details through less humiliating and more contextualized channels.

A WebmasterWorld thread from roughly a month ago brings up an interesting question for us SEO professionals. While we focus on the algorithms we know about, such as Penguin or Panda, it has long been suggested that Google could also be using different ranking factors depending on the industry a site fits within. In other words, sites for roofing companies would be reviewed and ranked according to different standards than sites for tech companies.

Well, Matt Cutts, the head of Google’s webspam team and a trusted engineer, took to that thread to dispel the rumors. He doesn’t deny that Google has “looked at topic-specific ranking”; instead, he says scaling was the issue. In his answer, Cutts explains, “We have looked at topic-specific ranking. The problem is it’s not scalable. There’s a limited amount of that stuff going on — you might have a very spammy area, where you say, do some different scoring.”

He continued, “What we’re doing better is figuring out who the authorities are in a given category, like health. If we can figure that out, those sites can rank higher.”

While Google says they aren’t using different algorithms for different industries, it has been announced that Google uses Subject Specific Authority Ranking, which helps identify the most reputable authorities on a given subject.

Of course, looking at the comments at SEO Roundtable, which reported on the WebmasterWorld thread, it is clear many don’t necessarily believe Cutts’ statement. Some say they have “always seen a difference in industry types,” while others argue that different industries necessitate different ranking factors or algorithms because of the resources available to each industry. For example, industrial companies don’t tend to run blogs, so creating new content through blogging shouldn’t be weighted as heavily for them as it is in fields like health and tech, where news is constantly coming out.

For now, all we have to go on is Cutts’ word and our own experiences. Do you think Google is using different algorithms depending on industry? Do you think they should be?