Tag Archive for: Panda


Over the past 2 days, the SEO community has received confirmation that Google is rolling out not one, but two webspam-focused algorithm updates: Panda 4.0 and Payday Loan 2.0. Panda 4.0 was confirmed by Matt Cutts on Twitter, while Search Engine Land initially announced the newest Payday Loan update, which was later verified by Cutts.

As with any major algorithm update, there is much more speculation than fact at the moment. However, we do know a little about the rollouts of these updates and what they are focused on.

Panda 4.0

Panda 4.0 is being called the 'softer' update in relation to its predecessors, a characterization stemming from a discussion back in March. It has been stated that the update affects queries in different languages to different extents, but Google estimates the effect on English searches at about 7.5% of queries.

Considering the reports of sites seeing significant recoveries, it is safe to assume this update is a little more generous, and more welcome, than the previous updates to Panda.

Payday Loan 2.0

The Payday Loan algorithm is a bit less well known, as it was first launched last June and only targets 'very spammy queries': primarily the type of spammy queries associated with payday loans, insurance, and accident claims.

A Google spokesperson issued a statement on the update, saying:

“Over the weekend we began rolling out a new algorithmic update. The update was neither Panda nor Penguin – it was the next generation of an algorithm that originally rolled out last summer for very spammy queries.”

So far, estimates say only 0.2% of English queries were affected by this update, though this is also an international rollout affecting different languages to different extents.

Google recently integrated its Panda algorithms into the normal indexing process, and this has raised a whole new batch of questions from webmasters. The most common question is how site owners will know if their site has been hit by Panda. Really, it was only a matter of time before Matt Cutts, the noted Google engineer and head of the webspam team, addressed the issue.

And that is what he did earlier this week, when Cutts used one of his Webmaster Help videos to respond to Nandita B.’s question, “how will a webmaster come to know whether her site is hit by Panda? And, if her site is already hit, how she will know that she has recovered from Panda?”

Now that the Panda algorithm is part of the normal search indexing process, finding out whether you've been affected by Panda won't be nearly as easy. You can't just compare your analytics reports with the recorded dates of Panda rollouts. But Cutts does have some suggestions if you think your site has been affected.
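Before Panda was folded into the core algorithm, that old diagnostic amounted to comparing average organic traffic in windows just before and after a known rollout date. A minimal sketch of the idea, using made-up daily session counts and a hypothetical rollout date rather than real analytics data:

```python
from datetime import date, timedelta

# Hypothetical daily organic-sessions data keyed by date.
# In practice this would come from an analytics export (e.g. a CSV).
sessions = {
    date(2014, 5, 10) + timedelta(days=i): n
    for i, n in enumerate([980, 1010, 995, 1005, 990, 1000, 985,
                           610, 595, 605, 590, 600, 615, 598])
}

def avg_sessions(data, start, days):
    """Average daily sessions over `days` consecutive days starting at `start`."""
    window = [data[start + timedelta(days=i)] for i in range(days)]
    return sum(window) / len(window)

# Assumed rollout date for illustration only.
rollout = date(2014, 5, 17)
before = avg_sessions(sessions, rollout - timedelta(days=7), 7)
after = avg_sessions(sessions, rollout, 7)
change = (after - before) / before * 100
print(f"before: {before:.0f}/day, after: {after:.0f}/day, change: {change:+.1f}%")
```

A sharp, sustained drop aligned with a rollout date was the usual tell; with Panda now updating continuously, that alignment is no longer visible, which is exactly the problem Cutts addresses.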

Cutts said, “basically, we’re looking for high quality content. So if you think you might be affected by Panda, the overriding goal is to make sure that you’ve got high quality content.”

Of course, high quality content in this context means sites that offer real value to users. It appears integrating Panda was actually one of the last steps in a shift toward prioritizing high quality content. Google has been suggesting a focus on value for a long time, and now it is officially a large part of the normal search algorithm.

Two years ago Google unleashed Panda onto the world, and SEO hasn't been the same since, especially when it comes to link building. Hundreds of thousands of sites have been penalized; some have made their way back to where they were, but countless others have perished or are still trudging along trying to recover.

Some of those sites were mostly innocent, getting in trouble for being a little too unscrupulous or not quite knowing what they were doing with link building, but the vast majority of the sites hit by penalties were flagrantly cheating, trying to drive up traffic by amassing a huge quantity of low quality links instead of earning a respectable number of solid ones.

Still, those penalized have had to find a way to restore their site to its former traffic rates and search rankings, and after two years of toiling away, the question eventually arises: "Should I just give up and start over?" Well, Eric Ward answers that with a simple question of his own: "Are you going to do things differently with the new site than you did with the old site? If not, then it really doesn't matter."

Most of the websites unable to reclaim their former “glory” are still struggling because they haven’t wised up. The only way to be able to consistently rank highly on Google and Bing is to run a quality site people will want to visit. You can use SEO to get you there, but you can’t fake a good site.

A recent Google Webmaster Hangout seemed to imply that Google is pushing out Penguin updates without announcing them. Penguin has only been officially updated twice since its initial release, and the last update was in October 2012. In the video, John Mueller from Google makes it appear that Google has been updating Penguin on a regular basis but has not announced every refresh. The comments come at around the four-minute mark in the video below.

When asked for clarification by Search Engine Land, Mueller said he was referring to general "link analysis" refreshes, which do not include the Penguin algorithm. Google also confirmed the last Penguin update was the one announced in October.

One of the reasons some questioned whether Penguin was being refreshed is that Panda, the update always mentioned in association with Penguin, has been updated on roughly a monthly basis. Google didn't confirm another update is coming, but the updates have been arriving steadily, and there are signs a new one should land in the next few days.

It is impossible to overstate just how quickly SEO changes and how important it is to keep up. Strategies change, and search engines update countless times. Google's Penguin and Panda updates are clearly the most talked about, but Google has had plenty of other updates with less catchy names throughout the last year, like the Knowledge Graph (okay, that one has a catchy name too).

Penguin and Panda changed the landscape of searching completely and strategies have had to adapt to them quickly, though SEOs not taking advantage of gray area SEO tactics like link buying were mostly unaffected. That doesn’t mean that they don’t have to follow the new guidelines as well.

Most of these guidelines are fairly broad, however, so Don Pathak, a writer for Search Engine Journal, tried to simplify and explain them, and in doing so came away with a few specific points.

Many writers, usually with vested interests, have argued that SEO success can't be achieved with great content alone, and it is true to an extent: the internet is competitive enough that great content by itself won't quite get you to the top search result. However, Google has also made it very clear that it wants to favor the quality of content over SEO tactics. Keeping a site fresh and relevant will give you as much of a boost as any behind-the-scenes tweak can.

The new Google also favors locality, so if your business has a local presence in a marketplace, optimizing for that location will help customers find your service. You can get started by simply establishing a local profile on Google Places for Business, and encourage customers to give you reviews on the site.

SEO will likely always concern itself with the technical dealings behind the curtain of a website, but Google wants to give preference to those who operate valuable and well made websites, not those manipulating every loophole to get the market advantage. As with anything run mostly through algorithms, there will always be “hacks” or weaknesses, but rather than exploiting them as they open, it is better to just create a website with real value.

Google has made some very big changes in the past couple of weeks, and they're affecting more sites than previously expected. In what way? Depending on how SEO has been done, some sites are dropping a few positions, while others are dropping by multiple pages.
