Tag Archive for: Barry Schwartz

Google has been bringing down the hammer on spammy websites quite a bit recently, with more specific penalties for sites that aren’t following guidelines. There have been several high-profile cases, such as the Rap Genius penalty, and several actions against entire spammy industries. But if you are responsible for sites with spammy habits, a single manual action can hurt more than just one site.

It has been suggested that Google may look at your other sites when they issue manual actions, and Matt Cutts has all but confirmed that happens at least some of the time.

Marie Haynes reached out to Cutts for help dealing with a spammy client, and his responses make it clear the client appears to be linked to “several” spammy sites. Over the course of three tweets, Cutts makes it obvious that he has checked out many of the spammer’s sites, not just the one that received a manual action, and he even reveals one way Google can tell the sites are associated.

Of course, Google probably doesn’t review every site a penalized webmaster operates, but this shows they definitely do when the situation calls for it. If your spammy efforts are caught on one site, chances are you are making the same mistakes on almost every site you operate, and they are all susceptible to being penalized. In the case of this client, it seems playing outside the rules has created a pretty serious web of trouble.

Google isn’t the only search engine waging a war on black hat or manipulative SEO. Every major search engine has been adapting its services to fight those trying to cheat their way to the top of the rankings. This week, Bing made its latest move against devious optimizers by amending its Webmaster Guidelines to include a stern warning against using keyword stuffing to rank highly.

The warning cuts cheating SEOs no slack, cautioning that Bing may demote or entirely delist sites caught keyword stuffing. The change wasn’t officially announced, but Barry Schwartz says it appeared in the guidelines sometime yesterday.

The new section on keyword stuffing reads:

When creating content, make sure to create your content for real users and readers, not to entice search engines to rank your content better. Stuffing your content with specific keywords with the sole intent of artificially inflating the probability of ranking for specific search terms is in violation of our guidelines and can lead to demotion or even the delisting of your website from our search results.

The SEO community is sometimes thought of as a stuffy industry, but we like to have fun like any other group of people. For example, you probably would never have guessed that there are online games specifically aimed at the optimization community.

Yet in the past week two such games have surfaced, both very SEO-centric. They’re a cool novelty, and they offer about as much fun as the games they are based on.

First we have Donkey Cutts, a Donkey Kong knock-off that uses prominent SEO personalities and tech imagery in place of an oversized monkey and barrels. Obviously Matt Cutts from Google is featured, but players can also pick from other SEO personalities (though there is some disagreement about who exactly the characters are).

Donkey Cutts

There is also Madoogle, an Angry Birds clone that lets you attack black hat SEOs using some more easily recognizable SEO faces. This one includes versions of Matt Cutts (again), Rand Fishkin, Lisa Barone, and Barry Schwartz.

Madoogle

They probably won’t help you rank much higher, but these games might allow you to relax for a few minutes while still keeping SEO fresh in your mind.

As social media has grown, there has been a consistent debate as to whether Google considers social signals when ranking websites. Several studies have suggested a correlation between a strong social media presence and high rankings on the search engine, but there are many reasons the two could be related. Well, Google’s head of search spam, Matt Cutts, may have finally put the question to rest in his recent Webmaster Chat video.

According to Cutts, Google doesn’t give any special treatment to websites based on social information. In fact, sites like Facebook and Twitter are treated the same as any other website. The search engine doesn’t do anything special such as indexing the number of likes or shares a page has.

Cutts explains that Google did at one point attempt to index social information. Barry Schwartz suggests Matt is referring to Google’s expired real-time search deal with Twitter. A lot of effort and engineering went into the deal before it was completely blocked, and nothing useful came to fruition. Simply put, Google doesn’t want to invest more time and money only to be blocked again.

Google is also worried about crawling identity information only to have it change long before Google is able to update it again. Social media pages can be incredibly active, and the search engine may not be able to keep up. Outdated information can be harmful to people and to the user experience.

But you shouldn’t count social media out of your SEO plan just because it isn’t directly included in ranking signals. Online marketers have long known about social media’s numerous other benefits, and it is still a powerful tool you can use to boost your online presence and visibility.

A strong social media presence opens up many channels of engagement with your audience that can make or break your reputation. It can also drive huge amounts of traffic directly to your site and your content. By reaching out and interacting with your audience, you make people trust and value your brand, while also encouraging them to explore your site and the content you offer. Google notices all this traffic and activity on your site and rewards you for it as well.

You can see the video below:

Yesterday we reported on the mass hijacking of thousands of Google+ Local listings. In short, over a brief period, a huge number of hotels had their business listings for Google Maps and Search hijacked. The story was broken open by Danny Sullivan of Search Engine Land, who attempted to track down the source of the spam attack but found no concrete evidence of who the culprit actually is.

While the issue could have a big effect on many businesses in the hotel sector, it is more notable for showing that other attacks could happen in the future. Even worse, no one outside of Google has been able to explain how this could occur, especially given the number of big hotel chains affected. The hotels hit with the spam weren’t mom-and-pop bed and breakfasts. Most of the listings were for huge hotel chains, such as the Marriott hotel shown in the example of a hijacked listing below.

If Google does know how this was able to happen, they aren’t telling. In fact, Google has been terribly quiet on the issue. They’ve yet to issue an official public statement, aside from telling Sullivan he could confirm they were aware of the problem and working to resolve it.

The only direct word from Google on the hijackings is a simple response in an obscure Google Business Help thread from Google’s Community Manager, Jade Wang. If it weren’t for Barry Schwartz’s watchful eye, it is possible the statement would never have been widely seen. Wang said:

We’ve identified a spam issue in Places for Business that is impacting a limited number of business listings in the hotel vertical. The issue is limited to changing the URLs for the business. The team is working to fix the problem as soon as possible and prevent it from happening again. We apologize for any inconvenience this may have caused.

Keymaster

Source: Jason Tamez

Does Google control the internet? Of course, no one controls the entirety of the internet, but the major search engine has a huge influence on how we browse the web. So it is interesting to hear a Google representative entirely downplay the company’s role in managing content online.

Barry Schwartz noticed the statement in a Google Webmaster Help forum thread about removing content from showing up in Google. It’s a fairly common question, but the response contained some particularly interesting information. According to Eric Kuan from Google, the search engine doesn’t play a part in controlling content on the internet.

His statement reads:

Google doesn’t control the contents of the web, so before you submit a URL removal request, the content on the page has to be removed. There are some exceptions that pertain to personal information that could cause harm. You can find more information about those exceptions here: https://support.google.com/websearch/answer/2744324.

Now, what Kuan said is technically true. Google doesn’t have any control over what is published to the internet. But, Google is the largest gateway to all that content, and plays a role in two-thirds of searches.

This raises some notable questions for website owners and searchers alike. We rarely consider how much of an influence Google has in deciding what information we absorb, but they hold some very important keys to areas of the web we otherwise wouldn’t find.

As a publisher, you are obliged to follow Google’s guidelines in order to be visible to its huge wealth of searchers. It is an agreement that often toes uncomfortable lines, as the search engine has grown into a massive corporation encompassing many aspects of our lives and future technology.

When you begin marketing and optimizing your site online to become more visible, you should keep this agreement in mind. A lot of people think of Google as a system to take advantage of in order to reach a larger audience. While you can attempt to do that, you are breaking the agreement with the search engine and they can penalize your efforts at any time.


Stop Sign

Source: Sharon Pruitt

Sometimes the source of a problem is so glaringly simple that you would never consider it. This is the case for many webmasters frustrated that their sites aren’t being indexed or ranked by search engines. While there are numerous more technical reasons a search engine might refuse to index your page, a surprising amount of the time the problem is that you told the search engine not to index your site with a noindex tag.

This is frequently overlooked, but it can put a complete halt to your site’s rankings and visibility. Thankfully, it is also very easy to fix. The biggest hassle is actually finding the tag, which can be hard to spot when pages redirect. But you can use an HTTP header checker tool to inspect the response before the page redirects.
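As a rough sketch of what such a check does (this is illustrative code, not any particular checker tool, and the function name is made up), you can look for a noindex directive in either the X-Robots-Tag response header or a robots meta tag in the page’s HTML:

```python
import re

def has_noindex(headers, html):
    """Return True if the response headers or HTML tell search engines not to index."""
    # An X-Robots-Tag header can be set server-side and is invisible in the page source.
    for name, value in headers.items():
        if name.lower() == "x-robots-tag" and "noindex" in value.lower():
            return True
    # A <meta name="robots" content="noindex"> tag can hide in the document head.
    for tag in re.findall(r"<meta[^>]+>", html, flags=re.IGNORECASE):
        if re.search(r"name=[\"']robots[\"']", tag, re.IGNORECASE) and "noindex" in tag.lower():
            return True
    return False
```

Checking both places matters: a page can look perfectly normal in the source while a server-side header quietly blocks indexing.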

Don’t be embarrassed if this small mistake has been keeping you down. As Barry Schwartz mentions on SEO Roundtable, there have been large Fortune 500 companies with these same problems. John Mueller also recently ran into someone with a noindex on their homepage. He noticed a thread in the Google Webmaster Help forums where a site owner had been working to fix his problem all day with the help of the other forum members. John explained the problem wasn’t nearly as complex as everyone else had suggested. It was much more obvious:

It looks like a lot of your pages had a noindex robots meta tag on them for a while and dropped out because of that. In the meantime, that meta tag is gone, so if you can keep it out, you should be good to go :).

When you encounter a problem with your site ranking or being indexed, it is always best to start with the most obvious possible causes before going to the bigger and more difficult mistakes. While we all like to think we wouldn’t make such a simple mistake, we all also let the small things slip by.

It seems something odd is happening over at Google AdSense. While there is a pretty much constant stream of complaints about drops in CTRs (click-through rates), they are usually isolated cases. Most often, an individual is simply experiencing a problem, and their issues are easily resolved.

But over the past week there has been an unusually large number of people complaining, in both the Google AdSense Help and WebmasterWorld forums, that their CTRs have declined significantly in recent weeks. As Barry Schwartz noticed, not only is the number of threads enough to raise an eyebrow, but some are saying this is having a big impact on their earnings. Clearly something is afoot.

Some quotes from commenters include:

My blog traffic still increasing but adsense earnings dropped from three days. I have a message from adsense help as “Your earnings were 76% below our forecast”.

and

At the risk of getting screamed at for asking this question (yet again). My ctr went down the last 3 days (Sunday,Monday, Today) a whopping 75%!

Not everyone is experiencing the drop in CTR (Schwartz himself has seen an increase), but this appears to be a widespread enough issue to cause some alarm. The world isn’t ending, but you should probably check your own CTR to make sure everything is all right.

Have you noticed a difference using Google on your smartphone this past week? Last week Ilya Grigorik, a Google developer advocate, announced Google was making a tiny tweak that should speed up mobile search on both Safari and Chrome by 200-400 milliseconds.

The company implemented an attribute called <a ping>, which allows it to handle the click tracking and the redirect practically at the same time, as Barry Schwartz explained.

You might not actually be experiencing the faster search yet, since Google is “gradually rolling out this improvement to all browsers that support the <a ping> attribute.” Grigorik also took the time to explain exactly how the change works:

What’s the benefit? Whenever the user clicks on a result, typically they are first sent to a Google URL redirector and then to the target site. With <a ping>, the click is tracked using an asynchronous call, meaning that the user sees one less redirect and a faster overall experience!
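As a hypothetical illustration of the markup (the URLs here are made up, and this is not Google’s actual result markup), a link using the ping attribute looks like this:

```html
<!-- On click, the browser navigates straight to the href and, in parallel,
     sends an asynchronous POST to each URL listed in ping — so click
     tracking no longer requires an extra redirect hop. -->
<a href="https://example.com/landing-page"
   ping="https://tracker.example.com/click">Example search result</a>
```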

What’s the best way to rank highly right now, according to Google? Most SEO professionals would say one of two things: creating a quality site will get you ranked highly, and quality content is the most powerful way to improve a site’s quality and value.

According to Ryan Moulton, a software engineer at Google whom Barry Schwartz of SEO Roundtable implies works on search, high-quality content doesn’t necessarily work like that.

The assumption is that the “high quality” content Google favors is the most accurate and informative text available. But, Moulton says we misunderstand or forget about actual usefulness.

He was defending Google in a Hacker News thread over why Google ranks some sites highly despite content that is not entirely accurate, and in some people’s eyes low quality. He explains that some sources may be the most accurate, but they are often far too high-minded for the average searcher.

He states, “there’s a balance between popularity and quality that we try to be very careful with. Ranking isn’t entirely one or the other. It doesn’t help to give people a better page if they aren’t going to click on it anyways.”

Ryan then continues with an example:

Suppose you search for something like [pinched nerve ibuprofen]. The top two results currently are mayoclinic.com and answers.yahoo.com.

Almost anyone would agree that the mayoclinic result is higher quality. It’s written by professional physicians at a world renowned institution. However, getting the answer to your question requires reading a lot of text. You have to be comfortable with words like “Nonsteroidal anti-inflammatory drugs,” which a lot of people aren’t. Half of people aren’t literate enough to read their prescription drug labels: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1831578/

The answer on yahoo answers is provided by “auntcookie84.” I have no idea who she is, whether she’s qualified to provide this information, or whether the information is correct. However, I have no trouble whatsoever reading what she wrote, regardless of how literate I am.

Google has to balance many factors in their search results, and the simple fact is most searchers aren’t looking for comprehensive scientific explanations for most of their problems. They want the most relevant information for their problem in terms they can understand.

It should be noted that Google does surface these academic sources in other areas of its search, but when writing for the main search page, your content needs to be accessible to your audience. The average SEO news source can get away with using technical language to an extent, because its readers have likely already built a vocabulary for the topic.

However, if you are offering a service or attempting to educate the general public about your field, you need to use terms they can easily understand without a dictionary, and address their needs head-on.

There is still certainly a place for more extensive content. For instance, the Mayo Clinic and WebMD still rank higher than Yahoo Answers for most medical searches, simply because they are more reliable.