Posts

Google is making some changes to its image search results pages by removing details about image sizes and replacing them with icons indicating the type of content the image comes from.

For example, images pulled from recipes show an icon of a fork and knife, those from product pages show a price tag icon, and pictures pulled from videos include a “play” icon.

Google’s Search Liaison Danny Sullivan says the change is coming later this week for desktop search results and shared a few examples of what the icons look like in action:

As you can see, users can mouse over the icons to get additional details, such as the length of a video.

Where To Find Image Size Details

To make room for these new icons, Google is removing the traditional image dimension information provided in the search results.

However, the information is still available to users after clicking on a specific thumbnail and mousing over the larger image preview.

Sullivan also shared an example of this:

Licensing Icons In Beta

Along with the announcement, Sullivan provided an update on a test to include licensing information alongside photos.

Currently, the company is beta testing the ability to pull licensing information from structured data on a website, though it is unclear if or when this feature will be widely available. Interested image owners can find out more about how to mark up their images in Google’s guide.
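For context, image licensing structured data of this kind is expressed with schema.org’s ImageObject type. Below is a minimal sketch of what such markup might look like, assuming the `license` and `acquireLicensePage` properties from schema.org; the URLs are placeholders, not values from Google’s guide:

```html
<!-- JSON-LD placed in the page's <head> or <body> -->
<script type="application/ld+json">
{
  "@context": "https://schema.org/",
  "@type": "ImageObject",
  "contentUrl": "https://example.com/photos/sunset.jpg",
  "license": "https://example.com/image-license",
  "acquireLicensePage": "https://example.com/how-to-license"
}
</script>
```

The `license` property points to the terms under which the image may be used, while `acquireLicensePage` points to a page where a user can obtain rights to use it.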

Last week, Google confirmed they would be pulling all authorship information from their search results pages, but confusion between Google Authorship and Author Rank has been causing some chaos in the SEO world.

Before you start burning bridges that feed into Author Rank and can legitimately help your site, take the time to check out Danny Sullivan’s explanation of the situation. It clears up how Authorship can die while Author Rank remains alive and as important to search as ever.

Every month, comScore releases a “U.S. Search Engine Rankings” report illustrating the market shares of the most commonly used search engines. From month to month the results have stayed largely the same for over a year, with Google taking in almost exactly two-thirds of the market and the other search engines like Bing and Yahoo slowly growing and shrinking by minuscule percentages.

ComScore’s report is widely trusted by most of the online marketing community, but recently analysts from Conductor attempted to challenge comScore’s findings with their own report claiming Google actually rakes in a significantly larger percentage of searches. They even went as far as to title their report “Why You Shouldn’t Trust comScore’s Numbers for Search Engine Market Share.”


For such an obvious attack on another analytics firm, you would assume Conductor was publishing new information or at least comparing the same factors. However, as Danny Sullivan from Search Engine Land shows in his article reviewing the report, Conductor’s findings shouldn’t be news to anyone paying attention, and they don’t disprove comScore’s.

The issue is that when people hear Google controls two-thirds of the search market, many publishers assume they should see roughly the same proportion of their traffic coming from the search engine. Instead, most publishers see significantly more traffic from Google than its market share would seemingly indicate. But market share isn’t a measurement of the traffic sites receive.

The monthly report from comScore reflects the number of actual searches conducted from the major search engines. Most importantly, their report isn’t affected by where the user goes after clicking on a search listing. Sullivan refers to this type of measurement as “before-the-click” behavior. Every search gets counted equally, no matter what the destination is.

Conductor’s analysis instead focuses on “post-click” behavior, or the traffic publishers receive from search engines. In their report, the information that matters most is the post-click activity. If someone does a search and clicks on a link that leads them back into the search engine, it isn’t measured in Conductor’s report.

The discrepancy between these two types of reports isn’t anything new. In fact, Sullivan cites 2006 as the last time it received significant attention, when Rich Skrenta wrote that Google’s “true market share” was 70% while most measurement services were estimating it at 40%. Notably, Sullivan’s response from that time still perfectly explains why such a gap might form. So much changes in search on a daily basis that it is noteworthy when something remains accurate after eight years. As Danny Sullivan wrote at the time:

“But a search for something on Yahoo Sports? That might be counted as a “search” and it is – but it’s not the type of search that would register with site-based metrics. The searcher might stay entirely inside Yahoo.”

Search engines with the largest gaps favor their own services more than others, which would suggest that Bing’s 13% gap indicates they direct searchers to their own services and platforms more than any other search engine. Surprisingly, Google appears to favor themselves the least, with a -18% gap.

Of course, there is always the possibility that this gap could be created or exacerbated by other factors that may not have been in play at the time. When Sullivan asked comScore for its opinion on the difference between its reports and Conductor’s recent study, he was told mobile search could also potentially be an influence. Google has a higher share of mobile search than of desktop search, and comScore’s reports only include data from desktop users.

Both reports serve their own purposes, but both also highlight the same issue. Google has a huge hold on search traffic that should be recognized and planned for. But those who buy into Conductor’s study may be tempted to ignore the other search engines entirely. To each their own, but my opinion still favors an approach that puts the most weight on Google without cutting out the other search engines too much.

We try to keep our readers and clients updated with all of Google’s biggest news, whether it be redesigns, guideline changes, or newsworthy penalties. It makes sense, as Google currently receives over half of all searches made every day.

But, even those of us who keep a careful eye on the best guidelines and policy trends of the biggest search engine can end up outright confused by Google occasionally. A story reported by Danny Sullivan yesterday happens to be one of those situations.

Google has been outspoken against guest blogging or guest posts being used “for SEO purposes”, and they have even warned that sites using these questionable guest posts could be subject to penalties. However, the latest story claims that Google has penalized a moderately respected website for a single guest post. Most interestingly, the post was published well before the guidelines were put into place and seems to be relevant to the site it was posted on.

The penalty was placed against DocSheldon.com, which is run by Doc Sheldon, a long-time SEO professional. Recently, Sheldon was notified that a penalty was placed against his entire site. The penalty report informed Sheldon that Google determined there were “unnatural links” from his site.

So far, this is the typical penalty put against those who are attempting to run link schemes of some form. But, obviously someone who has been around as long as Sheldon knows better than that. So what were the “unnatural links”?

It took an open letter from Doc Sheldon to Google, which he then tweeted to Matt Cutts, one of Google’s most distinguished engineers, to get some answers.

Cutts mentions one blog post published to Sheldon’s site, which appears to have been written in March 2013.

The post is exactly what the title suggests it would be (“Best Practices for Hispanic Social Networking”), but it contains two links at the end, within the author’s bio. One of the links takes you to the author’s LinkedIn page. The other, however, claims to take people to a “reliable source for Hispanic data”, which leads to a page that appears to be closer to a lead-generation pitch about big data.

Source: Search Engine Land

Now, there are a few issues with the link. The page it leads to is suspect, and some would say that the words “Hispanic data” in the anchor text could be potentially too keyword rich. But, Cutts seems to imply that the content of the blog post was as much an issue as the links. As Sullivan puts it, “Apparently, he fired up some tool at Google to take a close look at Sheldon’s site, found the page relating to the penalty and felt that a guest post on Hispanic social networking wasn’t appropriate for a blog post about SEO copywriting.”

That would be a fair criticism, but if you take a closer look at the top of Sheldon’s site, he doesn’t claim the site to be limited to SEO copywriting. In fact, the heading outright states that the site relates to “content strategy, SEO copywriting, tools, tips, & tutorials”. You may take note that social practices for any demographic could certainly be relevant to the topic of content strategy.

So, as the story stands, Google has levied a large penalty against an entire site for a single blog post with one questionable link, all because they decided it wasn’t on-topic. Does that mean Google is now the search police, judge, and jury? Sadly, it appears so for the moment. Little appears to have changed since the story broke yesterday. DocSheldon.com is still dealing with the penalties, and Google hasn’t backed down one bit since the penalty was sent.

It goes without saying, the events have sparked a large amount of debate in the SEO community, especially following the widely followed penalty placed against the guest blog network MyBlogGuest. The wide majority agrees this penalty seems questionable, but for the moment it appears it is best to stay under the radar by following Google’s policies to the letter. Hopefully Google will become a bit more consistent with their penalties in the meantime.

Yesterday we reported on the mass hijacking of thousands of Google+ Local listings. In short, over a short period of time, a huge number of hotels had their business listings for Google Maps and Search hijacked. The story was broken open by Danny Sullivan from Search Engine Land, who attempted to track down the source of the spam attack, though he found no concrete evidence to suggest who the culprit actually is.

While the issue could have a big effect on many businesses in the hotel sector, it is more notable for showing that other attacks could happen in the future. Even worse, no one outside of Google has been able to explain how this could occur, especially given the number of big hotel chains affected. The hotels hit with the spam weren’t mom-and-pop bed-and-breakfasts. Most of the listings were for huge hotel chains, such as the Marriott hotel shown in the example of a hijacked link below.

If Google does know how this was able to happen, they aren’t telling. In fact, Google has been terribly quiet on the issue. They’ve yet to issue an official public statement, aside from telling Sullivan that he could confirm they were aware of the problem and working to resolve it.

The only direct word from Google on the hijackings is a simple response in an obscure Google Business Help thread from Google’s Community Manager, Jade Wang. If it weren’t for Barry Schwartz’s watchful eye, it is possible the statement would never have been widely seen. Wang said:

We’ve identified a spam issue in Places for Business that is impacting a limited number of business listings in the hotel vertical. The issue is limited to changing the URLs for the business. The team is working to fix the problem as soon as possible and prevent it from happening again. We apologize for any inconvenience this may have caused.

Yesterday, thousands of hotels with Google+ Local listings had their pages manipulated to replace their links to official sites with links leading to third-party booking services. Google+ Local listings are what Google uses to provide local results in Google Maps and Google Search.

It currently appears to be isolated entirely to hotels, and Google has already said they are aware of and fixing the problem, but Danny Sullivan’s research into who is responsible for the hijacking has yet to turn up anything concrete. What we do know is that thousands of listings were changed to point to either RoomsToBook.Info, RoomsToBook.net, or HotelsWhiz.com.

Source: Search Engine Land

The problem is, we can’t be sure any of these companies are actually directly responsible. Only one person responded to Sullivan’s inquiries. Karim Miwani, listed on LinkedIn as the director of HotelsWhiz.com, replied saying (sic):

We have recently seen this issue and have reported to Google webmaster already. If you have seen any links please forward it to me and I will submit the request.

Our team is already in the process of blocking list of certain domains and IP addresses from back-linking us.

Thank you for pointing this out if you have any more external domains acting in aboce manner please report it to us on

You can get all the details on the hijacking from Danny Sullivan’s investigative report into the issue, but this event has a broader relevance outside of the hotel industry. The mass hijacking of Google’s local listings suggests there is a security flaw in Google+ Local listings that needs to be addressed and resolved. It may explain why Google has largely remained mum on the subject aside from confirming that it occurred.

You most likely have nothing to worry about with your own local business’s listings, so long as you don’t work in the hotel industry. However, it could have implications about the future of Google+ Local listings. Either the security flaw that allowed this to happen will be fixed, or issues like these could affect other industries on a larger scale.

Considering how important these listings are to Google Maps and Search, a larger attack could be a serious problem for Google.

If you have been reading up on SEO, blogging, or content marketing, chances are you’ve been told to “nofollow” certain links. If you’re like most, you probably didn’t quite understand what that means, and you may or may not have followed the advice blindly.

But, even if you’ve been using the nofollow tag for a while, if you don’t understand what it is or how it works you may be hurting yourself as much as you’re helping.

The nofollow tag is how publishers can tell search engines to ignore certain links to other pages. Normally, these links count like votes in favor of the linked content, but in some circumstances they can make search engines think you are abusing optimization or blatantly breaking their guidelines. Nofollowing the right links prevents search engines from thinking you are trying to sell your influence or are involved in link schemes.
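In practice, nofollow is a value added to a link’s rel attribute rather than a standalone tag. A minimal example, with placeholder URLs:

```html
<!-- A normal link: search engines may treat it as an endorsement -->
<a href="https://example.com/partner">Our partner</a>

<!-- A nofollowed link: tells search engines not to pass endorsement -->
<a href="https://example.com/sponsor" rel="nofollow">Our sponsor</a>
```

Visitors can still click a nofollowed link normally; the attribute only changes how search engines treat it.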

To help webmasters and content creators understand exactly when to nofollow, and how it affects their online presence, the team from Search Engine Land put together an infographic explaining when and how to use the tag. They also created a comprehensive guide to the tag for those who prefer long walls of text to nice and easy infographics.

Google made waves last week when they announced the expansion of how “Shared Endorsements” are used in ads, as well as the change to their terms of service to reflect this. The funny thing is, most people don’t understand what is actually changing.

The majority were simply confused when they heard that Google was implementing the use of social information into ads, because that has been going on for about two years now. But, as Danny Sullivan explains, the devil is in the details.

Throughout 2011, Google made changes which allowed advertisers to begin integrating images of people who liked their pages on Google+ into text and display ads. All that really showed was a small profile picture, and the phrase “+1’d this page.”

Starting on November 11, that won’t quite be the case. Ads will show more than just the people who +1’d a page. For example, if you comment, leave a review, or even follow a particular brand, those types of actions can be shown in ads on Google. A mockup of how it will appear is below.

These changes won’t take place until November, but don’t expect a prompt roll-out. It is possible you may start seeing the changes starting the 11th, but more likely it will gradually appear over the span of a few days or even a couple of weeks.

Not much else is known about how advertisers will be able to create these types of ads yet. Most likely, Google would not have announced the update this early, except they had to get the terms of service updated before they could even begin to implement this feature.

If you don’t want to appear in any of these types of ads, you can go to this page and click the tickbox at the bottom to opt out of all such ads in the future.

Two years ago, Search Engine Land released their “Periodic Table of SEO Ranking Factors”, but we all know that SEO doesn’t stay the same for that long, especially with the bigger changes that Google has been pushing out lately. That is why the periodic table was recently updated, clarified, and re-branded “The Periodic Table of SEO Success Factors”.

When you hear that Google has over 200 “signals or ranking factors” and over 10,000 “sub-signals” it is easy to get overwhelmed or confused as to where you should focus your efforts. However, those big numbers are usually created by speculation such as whether or not Google pays any attention to Facebook Likes (the truth is, we don’t know).

While there may be a full 200 signals Google uses, there is a hierarchy to how important each signal is, and we have a pretty good idea of the most important ranking factors that Google relies on. These bigger signals are also the most likely to stay stable over time. If we somehow were to find out the current full list of ranking factors, the system would change again by the time you had their weight and function mapped out. Heck, they may have changed while I typed this sentence.

Search Engine Land’s periodic table doesn’t attempt to focus on the small things, but instead shows you the areas that have the biggest impact on rankings and visibility. As the creators see it, the table is a starting point for new SEO and a friendly reminder for the veterans. The simple version of the periodic table is below, but you can find the expanded table as well as the key for understanding the image here.

Periodic Table of SEO Success

Have you ever wondered if your site was penalized by Google through automated algorithms or by a real person? Now you will almost always know, because Google reports almost 100 percent of manual penalties.

Matt Cutts, head of Google’s web spam team, described this new policy at Pubcon this year, saying, “We’ve actually started to send messages for pretty much every manual action that we do that will directly impact the ranking of your site.”

“If there’s some manual action taken by the manual web spam team that means your web site is going to rank directly lower in the search results, we’re telling webmasters about pretty much all of those situations.”

Cutts did clarify that there may be rare instances where this doesn’t occur, but their aim is to get to 100 percent.

In June, at SMX Advanced, Cutts gave a figure of 99 percent, but he believes they are currently reporting every instance of manual action.

Danny Sullivan from Search Engine Land has more information about the distinction between manual and algorithmic actions.