Tag Archive for: SEO

There has been quite a bit of speculation ever since Matt Cutts publicly stated that Google wouldn’t be updating the PageRank meter in the Google Toolbar before the end of the year. PageRank has been assumed dead for a while, yet Google refuses to issue the death certificate, assuring us they currently have no plans to outright scrap the tool.

Search Engine Land reports that yesterday, while speaking at Pubcon, Cutts finally explained what is going on and why there have been no updates. Google’s ability to update the toolbar is actually broken, and repairing the “pipeline” isn’t a major priority by any means. Google already feels too many marketers obsess over PageRank, a metric the search engine doesn’t consider very important.

But, Cutts did give some insight as to why Google has been hesitant to completely kill off PageRank or the toolbar. They have consistently maintained they intend to keep the meter around because consumers actually use the tool almost as much as marketers. However, at this point that data is nearly a year out of date, so suggesting consumers are the main motive for keeping PageRank around is disingenuous.

No, it turns out Google actually uses PageRank internally for ranking pages, and the meter has been consistently updated within the company during the entire period the public has been waiting for an update. It is also entirely possible Google likes keeping the toolbar around because it wants the data users are constantly sending back to the search engine.

While the toolbar may be useful for the company internally, PageRank has reached the point where it needs to be updated or removed. Data from a year ago isn’t reliable enough to offer anyone much value, and most browsers have done away with installable toolbars anyway. If a repair isn’t a high enough priority for Google to get around to at all this year, it probably isn’t worth leaving the toolbar lingering around forever.

If you have been reading up on SEO, blogging, or content marketing, chances are you’ve been told to “nofollow” certain links. If you’re like most, you probably didn’t quite understand what that means, and you may or may not have followed the advice blindly.

But, even if you’ve been using the nofollow tag for a while, if you don’t understand what it is or how it works you may be hurting yourself as much as you’re helping.

The nofollow tag is how publishers can tell search engines to ignore certain links to other pages. Normally, these links count as something like votes in favor of the linked content, but in some circumstances this can make search engines think you are abusing optimization or blatantly breaking their guidelines. Nofollowing the right links prevents search engines from thinking you are trying to sell your influence or are involved in link schemes.
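
In practice, the tag is just an extra rel attribute on an ordinary link. As a minimal sketch (the URLs below are placeholders, not real pages):

  <!-- A normal link passes ranking credit to the page it points to -->
  <a href="http://example.com/great-resource/">Great resource</a>

  <!-- Adding rel="nofollow" tells search engines not to count this link as an endorsement -->
  <a href="http://example.com/sponsored-page/" rel="nofollow">Sponsored link</a>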

To help webmasters and content creators understand exactly when to nofollow, and how it affects their online presence, the team from Search Engine Land put together an infographic explaining when and how to use the tag. They also created a comprehensive guide to the tag for those who prefer long walls of text to nice and easy infographics.

Google is always making changes and updates, but it seems like the past couple of weeks have been especially crazy for the biggest search engine out there. There have been tons of changes both big and small, but best of all, they seem to all be part of one comprehensive plan with a long-term strategy.

Eric Enge sums up all the changes when he says Google is pushing people away from a tactical SEO mindset to a more strategic and valuable approach. To understand exactly what that means going forward, it is best to review the biggest changes. By seeing what has been revamped, it is easier to make sense of what the future looks like for Google.

1. ‘(Not Provided)’

One of the biggest changes for both searchers and marketers is Google’s move to make all organic searches secure starting in late September. For users, this means more privacy when browsing, but for marketers and website owners it means we are no longer able to see keyword data from most users coming to sites from Google searches.

This means marketers and site-owners are having to deal with a lot less information, or they’re having to work much harder to get it. There are ways to find keyword data, but it’s no longer easily accessible from any Google tool.

This was one of the bigger hits for technical SEO, though there are many workarounds for those looking for them.

2. No PageRank Updates

PageRank has long been a popular tool for many optimizers, but it has also been commonly used by actual searchers to get a general idea of the quality of the sites they visit. However, Google’s Matt Cutts has openly said not to expect another update to the tool this year, and it may not be available much longer on any platform. The toolbar has never been available on Chrome, and with Internet Explorer revamping how toolbars work in the browser, PageRank seems likely to be left without a home.

This is almost good news in many ways. PageRank has always been considered a crude measurement tool, so if the tool goes away, many will have to turn to more accurate measurements.

3. Hummingbird

Google’s Hummingbird algorithm seemed minor to most people using the search engine, but it was actually a major overhaul under the hood. Google vastly improved its ability to understand conversational search, which entirely changes how people can search.

The most notable difference with Hummingbird is Google’s ability to contextualize searches. If you search for a popular sporting arena, Google will find you all the information you previously would have expected, but if you then search “who plays there”, you will get results that are contextualized based on your last search. Most won’t find themselves typing these kinds of searches, but for those using their phones and voice capabilities, the search engine just got a lot better.

For marketers, the consequences are a bit heavier. Hummingbird greatly changes the keyword game and has huge implications for the future. With the rise of conversational search, we will see that exact keyword matches become less relevant over time. We probably won’t feel the biggest effects for at least a year, but this is definitely the seed of something huge.

4. Authorship

Authorship isn’t exactly new, but it has become much more important over the past year. As Google is able to recognize the creators of content, they are able to begin measuring which authors are consistently getting strong responses such as likes, comments, and shares. This means Google will be increasingly able to identify those who are creating the most valuable content and rank them highest, while those consistently pushing out worthless content will see their clout dropping the longer they fail to actually contribute.

5. In-Depth Articles

Most users are looking for quick answers to their questions and needs with their searches, but Google estimates that “up to 10% of users’ daily information needs involve learning about a broad topic.” To reflect that, they announced a change to search in early August, which would implement results for more comprehensive sources for searches which might require more in-depth information.

What do these all have in common?

These changes may all seem separate and unique, but there is an undeniably huge level of interplay between how all these updates function. Apart, they are all moderate to minor updates. Together, they are a huge change to search as we know it.

We’ve already seen how link building and over-attention to keywords can hurt your optimization when improperly managed, but Google seems keen on devaluing these search factors even more moving forward. Instead, they are opting for signals which offer the most value to searchers. Their search has become more contextual so users can find their answers more easily, no matter how they search. But, the more conversational search becomes, the less rankings are about exact keywords.

In the future, expect Google to place more and more emphasis on authorship and the value that these publishers are offering to real people. Optimizers will always focus on pleasing Google first and foremost, but Google is trying to align those incentives so that your optimization efforts improve the experience of users as well.

There have never been more opportunities for local businesses online than now. Search engines cater more and more to local markets as shoppers make more searches from smartphones to inform their purchases. But, in the more competitive markets that also means local marketing has become quite complicated.

Your competitors may be using countless online tactics aimed at ensuring their online success over yours, and to stand a chance that means you also have to employ a similarly vast set of strategies. When this heats up and online competition grows convoluted, some things get overlooked. The more you have to juggle, the more likely you are to make a serious mistake.

In true Halloween fashion, Search Engine Watch put together the four most terrifying local search mistakes that can frighten off potential customers.

Ignoring the Data Aggregators

A common tactic is to optimize Google+ listings, as well as maybe Yelp, or a few other high-profile local directories. But, why stop there? Google crawls thousands and thousands of sites that contain citations every day, so optimizing only a few listings is missing out on serious opportunities.

The most efficient way to handle this and optimize the sites most visible to customers is to focus on the data sources that Google actually uses to understand local online markets. The best way to do this is to submit business data to the biggest data aggregators, such as Neustar Localeze, InfoUSA, Acxiom, and Factual.

Not Having an Individual Page for Each Business Location

A few years ago Matt Cutts, one of Google’s most respected engineers, said, “if you want your store pages to be found, it’s best to have a unique, easily crawlable URL for each store.” These days organic ranking factors have become much more influential in Google’s method of ranking local businesses, so this advice has become more potent than ever before.

There are also numerous non-ranking reasons you should have an optimized page for each location. If each location doesn’t have its own page, Google isn’t indexing that content separately, and instead only sees the results offered in a business locator. Think of it like optimizing a product site without product pages. If the locations don’t have separate pages, the content loses context and usability.
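
As a hypothetical illustration (the domain and paths are placeholders), a store locator that links each location to its own crawlable URL might look something like this:

  <!-- Each location gets its own indexable page instead of living only inside a locator widget -->
  <ul class="locations">
    <li><a href="http://www.example.com/locations/portland-or/">Portland, OR store</a></li>
    <li><a href="http://www.example.com/locations/austin-tx/">Austin, TX store</a></li>
  </ul>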

Ignoring the Opportunity to Engage Your Customers

Whether you want to face it or not, word of mouth has become more important than ever as consumers talk about businesses on social media. Each opinion has a far larger audience than at any point in history, so a single bad review is seen by hundreds or thousands of potential customers. Thankfully, that one review doesn’t have to be your undoing.

First, if bad reviews get seen by more people, the same can be said for good reviews. If a bad review is an outlier, it might not make much of an impact on viewers. But, more importantly, every mention, review, or interaction with your business gives you the opportunity to engage back. If you see a positive mention online, showing gratitude for the remark opens up an entirely new connection with your brand. Similarly, a bad review can be salvaged by simply asking the reviewer how you can improve their experience in the future.

Not Using Localized Content

Pretty much every local online marketer has heard about the importance of using the relevant keywords in their content so their website ranks for those terms. But, they tend to only use this logic for the products or types of services they offer.

Local keywords including ZIP codes, neighborhoods, or popular attractions can do as much to help you stand out for important searches as product based keywords can. Simply including information about traffic or directions can help you start ranking for search terms your competitors are missing.

Google’s Carousel may seem new to most searchers, but it has actually been rolling out since June. That means enough time has passed for marketing and search analysts to really start digging in to see what makes the carousel tick.

If you’ve yet to encounter it, the carousel is a black bar filled with listings that runs along the top of the screen for specific searches, especially those that are location based or for local businesses such as hotels and restaurants. The carousel includes images, the businesses’ addresses, and aggregated review ratings all readily available at the top, in an order that seems less hierarchical than the “10 pack” listings previously used for local searches.

Up until now, we’ve only been able to guess how these listings were decided based on surface-level observations. But, this week Digital Marketing Works (DMW) published a study which finally gives us a peek under the hood and shows how businesses may be able to take some control of their place in the carousel. Amanda DiSilvestro explains the process used for the study:

  • They examined more than 4,500 search results in the category of hotels in 47 US cities and made sure that each SERP featured a carousel result.
  • For each of the top 10 hotels found on each search, they collected the name, rating, quantity of reviews, travel time from the hotel to the searched city, and the rank displayed in the carousel.
  • They used four hotel search phrases in equal measure: hotels in [city]; best hotels in [city]; downtown [city] hotels; cheap hotels in [city].
  • This earned them nearly 42,000 data points on approximately 19,000 unique hotels.
  • They looked at the correlation between a hotel’s rank in a search result and each of the factors collected above to determine which were the most influential.

Their report goes into detail on many of the smaller factors that play a role, but DMW’s biggest findings were on the four big factors which determine which businesses are shown in the carousel and where they are placed.

1. Google Reviews – The factor which correlated the most with the best placement in the carousel was by far Google review ratings. Both quantity and quality of reviews clearly play a big role in Google’s placement of local businesses, and marketers should be sure to pay attention to reviews moving forward. However, it is unclear how Google is handling paid or fake reviews, so many might be inspired to try to rig their reviews. For long-term success, I would suggest otherwise.

2. Location, Location, Location – Seeing as how the Google Carousel seems built around local businesses, it shouldn’t be a surprise that location does matter quite a bit. Of the 1,900 hotels in the study, 50 percent were within 2 miles of the search destination, while 75 percent were within 13 minutes of travel. Businesses would benefit from urging customers to search for specific landmarks or areas of cities, as you never know exactly where Google will establish the city “center”.

3. Search Relevancy and Wording – According to the findings, Google seems to change the weight of different ranking factors depending upon the actual search. For example, searching “downtown [city] hotels” will result in listings with an emphasis on location, while “best hotels in [city]” gives results most dependent on review rankings.

4. Primary Markets and Secondary Markets – It seems both small and large businesses are on a relatively level playing field when it comes to the carousel. Many small hotels are able to make it into the listings, right next to huge chains. The bigger businesses may have more resources to solicit reviews, but no hotel is too small to be considered for the carousel.

Bing gave people more control over what shows up about them online last week when it partnered with Klout to create Bing Personal Snapshots. Personal Snapshots are an extension of the previously implemented People Snapshots, but they give you some say in how you appear within the Snapshot column on Bing.

Bing and other search engines are among the most common ways to find information about people, but those search engines usually gather that information from social media, which isn’t always full of information we want displayed to everyone who searches our names.

These new Personal Snapshots allow you to ensure the information you want displayed is shown, while your more personal or embarrassing details can be withheld.

This works by allowing you to sign up for Klout and claim a profile, which Bing will then connect to your social networking profiles. From there, you’ll have some ability to manage your digital appearance and persona. The update will also allow Bing to show your most influential moments from social media within the same bar, along with a verified badge.

This isn’t total control over your online identity, but the change gives you more power over your online presence than was previously available.

If you don’t have a profile with Klout already, you should be aware that it is a social ranking website which relies on analytics to evaluate individuals’ online influence over social networks.

Leave it to Matt Cutts to always be there to clear the air when there is an issue causing some webmasters confusion. One webmaster, Peter, asked Matt Cutts whether geo-detection techniques are actually against Google’s policies, as it is common for websites to be designed so that users are given the information (price, USPs) most relevant to their lives based on geo-location.

In some understandings of Google’s policies, this may be against the rules, but it turns out all is fine, so long as you avoid one issue.

In one of his Webmaster Chat videos, Cutts explained that directing users to a version of a site, or delivering specific information based on location are not spammy or against Google’s policies. It only makes sense to offer viewers information that actually applies to their lives.

What Google does consider spam is directing their crawlers or GoogleBot to a web page of content that users cannot see. Sending GoogleBot to a different location than what visitors see is a bad idea, and is considered spam or a form of cloaking. Instead, treat GoogleBot as you would any user, by checking the location information and sending the crawler to the normal page reflecting that data.

Have you noticed a difference using Google on your smartphone this past week? Last week Ilya Grigorik, a Google developer advocate, announced Google was making a tiny tweak which should speed up mobile search on both Safari and Chrome by 200-400 milliseconds.

The company implemented an attribute called <a ping>, which allows them to do the click tracking and the redirect practically at the same time, as Barry Schwartz explained.

You might not actually be experiencing the faster search yet, since Google is “gradually rolling out this improvement to all browsers that support the <a ping> attribute.” Grigorik also took the time to explain exactly how the change works:

What’s the benefit? Whenever the user clicks on a result, typically they are first sent to a Google URL redirector and then to the target site. With <a ping>, the click is tracked using an asynchronous call, meaning that the user sees one less redirect and a faster overall experience!
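
As a rough sketch of what this looks like in markup (the URLs below are placeholders, not Google’s actual endpoints), the ping attribute moves the tracking call out of the navigation path:

  <!-- Old pattern: the href points at a redirector that logs the click, then forwards the user -->
  <a href="http://www.example.com/redirect?to=http://destination.example.com/">Result title</a>

  <!-- With ping: the href goes straight to the destination, and the browser sends the tracking request to the ping URL in the background -->
  <a href="http://destination.example.com/" ping="http://www.example.com/click-tracker">Result title</a>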

It remains incredibly unclear what Google’s thoughts or plans are for PageRank, as Matt Cutts, Google’s head of search spam, commented on Twitter yesterday that there won’t be any updates to PageRank or the toolbar anytime before 2014.

Neils Bosch asked the esteemed Google engineer whether there would be an update before next year, to which Cutts responded, “I would be surprised if that happened.”

According to Search Engine Land, it has been over 8 months since the last Google Toolbar PageRank update, back on February 4, 2013. Many have proclaimed the toolbar dead, but Cutts has personally defended the toolbar on a Webmaster chat within the past year, and said the toolbar won’t be going away.

However, as Cutts himself explained, Chrome doesn’t have a PageRank extension, Google dropped support for Firefox in 2011, and Internet Explorer 10 doesn’t support toolbar extensions. It seems clear there will be less and less of an audience for the toolbar, so its relevancy and use will likely taper off until it just kind of disappears.

It is always possible that Google might put out a surprise update next year, but don’t expect PageRank to be around forever.

Just as with any field, there are plenty of supposed SEO experts who are more than happy to offer you services and guarantees they can’t back up in order to get you to sign a contract. There are a few different ways these scammers operate, but when it boils down to it they all promise online success while stealing your money.

Any time you are hiring a company for online marketing, it is best to do your homework and ensure you’re getting what you’re paying for. You can find great success online, but if an offer sounds too good to be true, it probably is. Jaydeep Dosi from Search Engine Journal shares the most common claims you should be wary of.

We Offer Free Services

Proper SEO is time consuming to manage, the economy is unforgiving, and search engine optimization is a highly competitive field. How could any business with a long-term hope of survival offer its services free of cost? The answer is it can’t. Yes, real SEO professionals are able to offer special rebates or low pricing occasionally. You will even see the odd service offered for free within a larger transaction, but nothing comes entirely for free. SEO “experts” claiming not to charge you are likely more interested in your information and other details you don’t want them getting ahold of.

We Guarantee First Page Ranking

Watch the wording on these types of offers closely. Many SEO professionals emphasize their goal to get your site to the first page of search engine results pages (SERPs), but they can’t honestly guarantee it. They also can’t guarantee any level of traffic, though that is also certainly a goal. The reality is search engines guard their information closely, and they change their algorithms all the time. We work to stay on top of these changes and learn as much as we possibly can to gain exposure and visibility, but nothing is guaranteed.

We Submit Your Site to Hundreds of Search Engines

This isn’t a lie so much as a misrepresentation. Think for a second: how many people do you know using any search engine besides one of the main few? Google, Bing, and Yahoo are all still relevant in their own ways, but there aren’t hundreds of useful search engines. There aren’t even tens of relevant search engines. You really don’t need your site submitted to more than two or three of the most popular engines, so don’t get caught paying for wasteful services.

We Have Connections Within Google!

Any company advertising this way is a downright fraud. The majority have absolutely no connection with actual Google employees. But, more importantly, do you really think a Google employee is going to risk their job to help a friend rank their client’s sites higher? Nope.

We Know Everything About Google’s Algorithms

A company may claim to be an expert on Google’s algorithms, but you should press them to share exactly what they mean. While one might be an “expert” in that they keep up constantly with all the latest news and information about how Google’s search engines operate, it might be hard to consider them a real expert compared to an actual Google engineer. However, an SEO professional claiming to know every detail of Google’s algorithms is blatantly lying. These algorithms are dynamic and ever-evolving, not to mention they are so complex it would be impossible to know and understand the entire system. Search engines aren’t telling us their secrets.

We Have a Secret Formula for Success

The worst snake oil peddlers don’t even try to tell you what they will actually do. Successful SEO practices are no secret, and anyone who will help you achieve your goals will tell you so. To be truly successful in SEO, you just need to work hard and with focus from the very beginning, and take responsibility for keeping up to date with the current best practices and guidelines.