Tag Archive for: Danny Sullivan

Yesterday we reported on the mass hijacking of thousands of Google+ Local listings. In short, over a short period of time, a huge number of hotels had their business listings for Google Maps and Search hijacked. The story was broken open by Danny Sullivan from Search Engine Land, who attempted to track down the source of the spam attack but found no concrete evidence to suggest who the culprit actually is.

While the issue could have a big effect on many businesses in the hotel sector, it is more notable for showing that other attacks could happen in the future. Even worse, no one outside of Google has been able to explain how this could occur, especially with the number of big hotel chains affected. The hotels hit with the spam weren’t mom-and-pop bed and breakfast places. Most of the listings were for huge hotel chains, such as the Marriott hotel shown in the example of a hijacked link below.

If Google does know how this was able to happen, they aren’t telling. In fact, Google has been terribly quiet on the issue. They’ve yet to issue an official public statement, aside from telling Sullivan that he could confirm they were aware of the problem and working to resolve it.

The only direct word from Google on the hijackings is a simple response in an obscure Google Business Help thread from Google’s Community Manager, Jade Wang. If it weren’t for Barry Schwartz’s watchful eye, it is possible the statement would never have been widely seen. Wang said:

We’ve identified a spam issue in Places for Business that is impacting a limited number of business listings in the hotel vertical. The issue is limited to changing the URLs for the business. The team is working to fix the problem as soon as possible and prevent it from happening again. We apologize for any inconvenience this may have caused.

Yesterday, thousands of hotels with Google+ Local listings had their pages manipulated to replace their links to official sites with links leading to third-party booking services. Google+ Local listings are what Google uses to provide local results in Google Maps and Google Search.

It currently appears to be isolated entirely to hotels, and Google has already said they are aware of and fixing the problem, but Danny Sullivan’s research into who is responsible for the hijacking has yet to turn up anything concrete. What we do know is that thousands of listings were changed to point to either RoomsToBook.Info, RoomsToBook.net, or HotelsWhiz.com.

Source: Search Engine Land

The problem is, we can’t be sure any of these companies are actually directly responsible. Only one person responded to Sullivan’s inquiries. Karim Miwani, listed on LinkedIn as the director of HotelsWhiz.com, replied saying (sic):

We have recently seen this issue and have reported to Google webmaster already. If you have seen any links please forward it to me and I will submit the request.

Our team is already in the process of blocking list of certain domains and IP addresses from back-linking us.

Thank you for pointing this out if you have any more external domains acting in aboce manner please report it to us on

You can get all the details on the hijacking from Danny Sullivan’s investigative report into the issue, but this event has a broader relevance outside of the hotel industry. The mass hijacking of Google’s local listings suggests there is a security flaw in Google+ Local listings that needs to be addressed and resolved. It may also explain why Google has largely remained mum on the subject aside from confirming that it occurred.

You most likely have nothing to worry about with your own local business’s listings, so long as you don’t work in the hotel industry. However, it could have implications for the future of Google+ Local listings. Either the security flaw that allowed this to happen will be fixed, or issues like these could affect other industries on a larger scale.

Considering how important these listings are to Google Maps and Search, a larger attack could be a serious problem for Google.

If you have been reading up on SEO, blogging, or content marketing, chances are you’ve been told to “nofollow” certain links. If you’re like most, you probably didn’t quite understand what that means, and you may or may not have followed the advice blindly.

But, even if you’ve been using the nofollow tag for a while, if you don’t understand what it is or how it works you may be hurting yourself as much as you’re helping.

The nofollow tag is how publishers can tell search engines to ignore certain links to other pages. Normally, these links count as something like votes in favor of the linked content, but in some circumstances they can make search engines think you are abusing optimization or blatantly breaking their guidelines. Nofollowing the right links prevents search engines from thinking you are trying to sell your influence or are involved in link schemes.
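For reference, nofollow isn’t a separate tag so much as a rel attribute added to an individual link. Here is a minimal sketch, using placeholder URLs, of the difference between a normal link and a nofollowed one:

```html
<!-- A normal link: search engines may count it as an endorsement of the target page -->
<a href="https://example.com/partner">Our partner</a>

<!-- A nofollowed link: asks search engines not to pass credit to the target page -->
<a href="https://example.com/sponsored-offer" rel="nofollow">Sponsored offer</a>
```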

To help webmasters and content creators understand exactly when to nofollow, and how it affects their online presence, the team from Search Engine Land put together an infographic explaining when and how to use the tag. They also created a comprehensive guide to the tag for those who prefer long walls of text to nice and easy infographics.

Google made waves last week when they announced the expansion of how “Shared Endorsements” are used in ads, as well as the change to their terms of service to reflect this. The funny thing is, most people don’t understand what is actually changing.

The majority were simply confused when they heard that Google was incorporating social information into ads, because that has been going on for about two years now. But, as Danny Sullivan explains, the devil is in the details.

Throughout 2011, Google made changes which allowed advertisers to begin integrating images of people who liked their pages on Google+ into text and display ads. All that really showed was a small profile picture, and the phrase “+1’d this page.”

Starting on November 11, that won’t quite be the case. More than just the people who +1’d a page will be shown in ads. For example, if you comment, leave a review, or even follow a particular brand, those types of actions can be shown in ads on Google. A mockup of how it will appear is below.

These changes won’t take place until November, but don’t expect a prompt roll-out. You may start seeing the changes on the 11th, but more likely they will appear gradually over the span of a few days or even a couple of weeks.

Not much else is known about how advertisers will be able to create these types of ads yet. Most likely, Google would not have announced the update this early, except they had to get the terms of service updated before they could even begin to implement this feature.

If you don’t want to appear in any of these types of ads, you can go to this page and click the tickbox at the bottom to opt out for all ads in the future.

Two years ago, Search Engine Land released their “Periodic Table of SEO Ranking Factors”, but we all know that SEO doesn’t stay the same for that long, especially with the bigger changes that Google has been pushing out lately. That is why the periodic table was recently updated, clarified, and re-branded “The Periodic Table of SEO Success Factors”.

When you hear that Google has over 200 “signals or ranking factors” and over 10,000 “sub-signals,” it is easy to get overwhelmed or confused as to where you should focus your efforts. However, much of what makes up those big numbers is speculation, such as whether or not Google pays any attention to Facebook Likes (the truth is, we don’t know).

While there may be a full 200 signals Google uses, there is a hierarchy to how important each signal is, and we have a pretty good idea of the most important ranking factors that Google relies on. These bigger signals are also the most likely to stay stable over time. If we somehow were to find out the current full list of ranking factors, the system would change again by the time you had their weight and function mapped out. Heck, they may have changed while I typed this sentence.

Search Engine Land’s periodic table doesn’t attempt to focus on the small things, but instead shows you the areas that have the biggest impact on rankings and visibility. As the creators see it, the table is a starting point for those new to SEO and a friendly reminder for the veterans. The simple version of the periodic table is below, but you can find the expanded table as well as the key for understanding the image here.

Periodic Table of SEO Success

Have you ever wondered whether your site was penalized by Google’s automated algorithms or by a real human? Now you will almost always know, because Google reports nearly 100 percent of manual penalties.

Matt Cutts, head of Google’s web spam team, described this new policy at Pubcon this year, saying, “We’ve actually started to send messages for pretty much every manual action that we do that will directly impact the ranking of your site.”

“If there’s some manual action taken by the manual web spam team that means your web site is going to rank directly lower in the search results, we’re telling webmasters about pretty much all of those situations.”

Cutts did clarify that there may be rare instances where this doesn’t occur, but their aim is to get to 100 percent.

In June, at SMX Advanced, Cutts gave a figure of 99 percent reporting, but he believes they are now reporting every instance of manual action.

Danny Sullivan from Search Engine Land has more information about the distinction between manual and algorithmic actions.


I recently wrote about the release of Google’s Disavow Links tool, but there are some more questions popping up that need answering. So, let’s cover a little bit more about the tool.

First off, the tool does not immediately take effect. This is one of many reasons Google suggests publishers try to remove questionable links first by working with site owners hosting links, or companies that they may have purchased links through.

Instead of disavowing the links immediately, “it can take weeks for that to go into effect,” said Matt Cutts, head of Google’s web spam team at a keynote during the Pubcon conference. Google also has reserved the right to not use submissions if it feels they are questionable.

It is important to be accurate when making your file to submit to Google. Because of the delay in processing the file, it may take another few weeks to “reavow” links you didn’t mean to discount.

Once you have submitted a file to Google, you can download it, change it, and then resubmit.
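For context, the file you submit is just a plain text list with one entry per line: a leading “domain:” disavows every link from that domain, while a full URL disavows a single page. A minimal sketch, using made-up example domains, might look something like this:

```
# Links from a spammy directory we could not get removed
domain:spammy-directory.example.com

# Individual paid links we want Google to ignore
http://link-network.example.net/page1.html
http://link-network.example.net/page2.html
```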

The tool is mainly designed for site owners affected by the Penguin Update, which was focused on hitting sites that may have purchased links or gained them through spamming. Before, Google ignored bad links, but now they act as a negative mark against the site.

This change prompted fears in parts of the SEO industry that someone could deliberately build bad links pointing at a competitor’s site to drag it down, a practice dubbed “negative SEO.” This tool helps ensure that negative SEO is not a worry by allowing you to disavow any of those types of links.

Danny Sullivan from Search Engine Land has even more information about the tool, and Matt Cutts has a 10 minute long video answering questions.


Not really much more to say than that. I found a post on his blog from not too long ago, and it cracked me up.  Even people outside of internet marketing might find it worth checking out.

Now I’m already an iPhone user, but I’ve heard plenty about Google’s Android. And while I’m not able to do a fair comparison, Danny Sullivan (SEO extraordinaire) is, and he did.

He does more of a businessman’s review, looking at how efficiently each phone worked (or didn’t) and what his overall impressions of each were.  Check it out if you’re trying to decide between the two.

SEO is dead!  So come the cries from the doubters, and they do vocalize it occasionally.  The latest is from a man named Robert Scoble.  He doesn’t exactly say SEO is dead, but he questions its validity.

Well, SEO (and internet marketing in general) is not a static element.  It is always changing.  For people who think SEO is just making some on-page tweaks, that alone will not do a lot.  And as more people and businesses get online, the limited results that pure on-page/on-site adjustments can deliver will only shrink.

SEO’s effectiveness has been questioned for years now, and many have said it wouldn’t last – some as long as 12 years ago.  Well, it has lasted, and personally I think it will keep lasting – maybe not in exactly the same fashion, but it’s not going to go away.

Danny Sullivan had a lot of interesting things to say (as well as responding directly to Robert – check his post) in Search Engine Land.  Of all people, I think Danny is worth listening to when it comes to predicting the progress of SEO.  Check out his full post to see more.