
Google is always making changes and updates, but it seems like the past couple of weeks have been especially busy for the biggest search engine out there. There have been tons of changes, both big and small, but best of all, they all seem to be part of one comprehensive plan with a long-term strategy.

Eric Enge sums up all the changes when he says Google is pushing people away from a tactical SEO mindset toward a more strategic and valuable approach. To understand exactly what that means going forward, it is best to review the biggest changes. By seeing what has been revamped, it is easier to make sense of what the future looks like for Google.

1. ‘(Not Provided)’

One of the biggest changes for both searchers and marketers is Google’s move to make all organic searches secure starting in late September. For users, this means more privacy when browsing, but for marketers and website owners it means we are no longer able to see keyword data from most users coming to sites from Google searches.

This means marketers and site-owners are having to deal with a lot less information, or they’re having to work much harder to get it. There are ways to find keyword data, but it’s no longer easily accessible from any Google tool.

This was one of the bigger hits for technical SEO, though there are plenty of workarounds for those willing to look for them.
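One common workaround is to cross-reference the organic landing-page traffic still reported in Google Analytics with the query data available under Search Queries in Webmaster Tools. The sketch below shows the general idea only; the file names and column headers are hypothetical placeholders for whatever your own exports actually contain.

```python
# A rough sketch of one "(not provided)" workaround: joining organic landing-page
# traffic (exported from Google Analytics) against the query data still reported
# in Webmaster Tools. File names and column headers are hypothetical.
import pandas as pd

# Hypothetical export: landing page + organic sessions from Analytics
landing_pages = pd.read_csv("analytics_organic_landing_pages.csv")  # columns: page, sessions

# Hypothetical export: query + landing page + clicks from Webmaster Tools
search_queries = pd.read_csv("wmt_search_queries.csv")  # columns: query, page, clicks

# Join on landing page, then estimate each query's share of that page's organic traffic
merged = search_queries.merge(landing_pages, on="page", how="left")
merged["estimated_sessions"] = (
    merged["sessions"]
    * merged["clicks"]
    / merged.groupby("page")["clicks"].transform("sum")
)

print(merged.sort_values("estimated_sessions", ascending=False).head(20))
```

This won't recover the exact figures Google no longer passes along, but it gives a defensible estimate of which queries are driving traffic to each page.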

2. No PageRank Updates

PageRank has long been a popular tool for many optimizers, but it has also been commonly used by actual searchers to get a general idea of the quality of the sites they visit. However, Google’s Matt Cutts has openly said not to expect another update to the tool this year, and it seems it won’t be available much longer on any platform. The toolbar has never been available on Chrome, and with Internet Explorer revamping how toolbars work on the browser, it seems PageRank is going to be left without a home.

This is almost good news in many ways. PageRank has always been considered a crude measurement tool, so if the tool goes away, many will have to turn to more accurate measurements.

3. Hummingbird

Google’s Hummingbird algorithm seemed minor to most people using the search engine, but it was actually a major overhaul under the hood. Google vastly improved its ability to understand conversational search, which changes how people can search entirely.

The most notable difference with Hummingbird is Google’s ability to contextualize searches. If you search for a popular sporting arena, Google will find you all the information you previously would have expected, but if you then search “who plays there”, you will get results that are contextualized based on your last search. Most won’t find themselves typing these kinds of searches, but for those using their phones and voice capabilities, the search engine just got a lot better.

For marketers, the consequences are a bit heavier. Hummingbird greatly changes the keyword game and has huge implications for the future. With the rise of conversational search, we will see that exact keyword matches become less relevant over time. We probably won’t feel the biggest effects for at least a year, but this is definitely the seed of something huge.

4. Authorship

Authorship isn’t exactly new, but it has become much more important over the past year. As Google is able to recognize the creators of content, they are able to begin measuring which authors are consistently getting strong responses such as likes, comments, and shares. This means Google will be more and more able to filter those who are creating the most valuable content and rank them highest, while those consistently pushing out worthless content will see their clout dropping the longer they fail to actually contribute.

5. In-Depth Articles

Most users are looking for quick answers to their questions and needs with their searches, but Google estimates that “up to 10% of users’ daily information needs involve learning about a broad topic.” To reflect that, they announced a change to search in early August, which would implement results for more comprehensive sources for searches which might require more in-depth information.

What do these all have in common?

These changes may all seem separate and unique, but there is an undeniably huge level of interplay between how all these updates function. Apart, they are all moderate to minor updates. Together, they are a huge change to search as we know it.

We’ve already seen how link building and over-attention to keywords can hurt your optimization when improperly managed, but Google seems keen on devaluing these search factors even more moving forward. Instead, they are opting for signals which offer the most value to searchers. Their search has become more contextual so users can find their answers more easily, no matter how they search. But the rankings become less about keywords the more conversational search becomes.

In the future, expect Google to place more and more emphasis on authorship and the value that these publishers are offering to real people. Optimizers will always focus on pleasing Google first and foremost, but Google is trying to synergize these efforts so that your optimization efforts are improving the experience of users as well.

A couple weeks ago, Google released an update directly aimed at the “industry” of websites which host mugshots, which many aptly called The Mugshot Algorithm. It was one of the more specific updates to search in recent history, but it was basically meant to target sites aiming to extort money from people who had been arrested. Google purposefully targeted sites that were ranking well for people’s names while displaying arrest photos, names, and details.

After a week went by without a response, you could be forgiven for thinking that was the end of the issue, but one of the biggest sites affected, Mugshots.com, has finally responded publicly to Google’s update. Barry Schwartz reported that Mugshots.com published a blog post in which they claim Google is endangering the safety of Americans.

Mugshots.com was one of the three sites that suffered most from the algorithm, along with BustedMugshots and JustMugshots.

In their statement, they say, “Google’s decision puts every person potentially at risk who performs a Google search on someone.”

If Mugshots.com could tone down the theatrics, they might have been able to make a reasonable argument. However, they also ignore that employers and even ordinary citizens have plenty of other ways to find arrest records and details that are less humiliating and better contextualized.

There have never been more opportunities for local businesses online than there are now. Search engines cater more and more to local markets as shoppers make more searches from smartphones to inform their purchases. But in the more competitive markets, that also means local marketing has become quite complicated.

Your competitors may be using countless online tactics aiming to ensure their online success over yours, and to stand a chance that means you also have to employ a similarly vast set of strategies. When this heats up and online competition begins to grow convoluted, some things get overlooked. The more you have to juggle, the more likely you are to make a serious mistake.

In true Halloween fashion, Search Engine Watch put together the four most terrifying local search mistakes that can frighten off potential customers.

Ignoring the Data Aggregators

A common tactic is to optimize Google+ listings, as well as maybe Yelp, or a few other high-profile local directories. But, why stop there? Google crawls thousands and thousands of sites that contain citations every day, so optimizing only a few listings is missing out on serious opportunities.

To handle this efficiently and optimize the sites most visible to customers, businesses should focus on the data sources Google actually uses to understand local online markets. The best way to do this is to submit business data to the biggest data aggregators, such as Neustar Localeze, InfoUSA, Acxiom, and Factual.

Not Having an Individual Page for Each Business Location

A few years ago Matt Cutts, one of Google’s most respected engineers, said, “if you want your store pages to be found, it’s best to have a unique, easily crawlable URL for each store.” These days organic ranking factors have become much more influential in Google’s method of ranking local businesses, so this advice has become more potent than ever before.

There are also numerous non-ranking reasons you should have an optimized page for each location. If individual locations don’t have their own pages, Google isn’t indexing that content separately, and instead only sees the results offered in a business locator. Think of it like optimizing a product site without product pages. If the results don’t have separate pages, they lose context and usability.
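As a minimal illustration of Cutts’ advice, the sketch below gives each store its own crawlable URL and lists those URLs in an XML sitemap. The domain, URL pattern, and location data are made up; the point is simply one unique page per location.

```python
# A minimal sketch of one crawlable URL per store location, listed in an XML
# sitemap. The domain, URL pattern, and location data are made up for illustration.
from xml.sax.saxutils import escape

locations = [
    {"name": "Downtown Austin", "slug": "austin-downtown"},
    {"name": "Uptown Dallas", "slug": "dallas-uptown"},
    {"name": "Houston Galleria", "slug": "houston-galleria"},
]

# One unique, easily crawlable URL per store
urls = [f"https://www.example.com/locations/{loc['slug']}/" for loc in locations]

sitemap_lines = [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
]
for url in urls:
    sitemap_lines.append(f"  <url><loc>{escape(url)}</loc></url>")
sitemap_lines.append("</urlset>")

with open("sitemap-locations.xml", "w") as f:
    f.write("\n".join(sitemap_lines))
```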

Ignoring the Opportunity to Engage Your Customers

Whether you want to face it or not, word of mouth has become more important than ever as consumers talk about businesses on social media. Each opinion has a larger audience than at any point in history, so a single bad review can be seen by hundreds or thousands of potential customers. Thankfully, that one review doesn’t have to be your undoing.

First, if bad reviews get seen by more people, the same can be said for good reviews. If a bad review is an outlier, it might not make much of an impact on viewers. But, more importantly, every review, mention, or interaction with your business gives you the opportunity to engage back. If you see a positive mention online, showing gratitude for the remark opens up an entirely new connection with your brand. Similarly, a bad review can be salvaged by simply asking how you can improve that customer’s experience in the future.

Not Using Localized Content

Pretty much every local online marketer has heard about the importance of using the relevant keywords in their content so their website ranks for those terms. But, they tend to only use this logic for the products or types of services they offer.

Local keywords including ZIP codes, neighborhoods, or popular attractions can do as much to help you stand out for important searches as product-based keywords can. Simply including information about traffic or directions can help you start ranking for search terms your competitors are missing.

Google’s Carousel may seem new to most searchers, but it has actually been rolling out since June. That means enough time has passed for marketing and search analysts to really start digging in to see what makes the carousel tick.

If you’ve yet to encounter it, the carousel is a black bar filled with listings that runs along the top of the screen for specific searches, especially those that are location based or for local businesses such as hotels and restaurants. The carousel includes images, the businesses’ addresses, and aggregated review ratings all readily available at the top, in an order that seems less hierarchical than the “10 pack” listings previously used for local searches.

Up until now, we’ve only been able to guess how these listings were decided based on surface-level observations. But this week Digital Marketing Works (DMW) published a study which finally gives us a peek under the hood and shows how businesses may be able to take some control of their place in the carousel. Amanda DiSilvestro explains the process used for the study:

  • They examined more than 4,500 search results in the category of hotels in 47 US cities and made sure that each SERP featured a carousel result.
  • For each of the top 10 hotels found on each search, they collected the name, rating, quantity of reviews, travel time from the hotel to the searched city, and the rank displayed in the carousel.
  • They used four hotel search-term patterns in equal measure: hotels in [city]; best hotels in [city]; downtown [city] hotels; cheap hotels in [city].
  • This earned them nearly 42,000 data points on approximately 19,000 unique hotels.
  • They looked at the correlation between a hotel’s rank in the carousel and each of the factors collected above to determine which were the most influential (a rough sketch of this kind of analysis follows below).
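The study doesn’t spell out exactly how those correlations were computed, but the sketch below shows the general shape of that final step, assuming a Spearman rank correlation and made-up column names.

```python
# A rough sketch of the correlation step described above. The column names and
# the choice of Spearman rank correlation are assumptions; the DMW study does
# not publish its exact method.
import pandas as pd

# Hypothetical dataset: one row per hotel per SERP, with its carousel position
hotels = pd.read_csv("carousel_hotels.csv")
# assumed columns: carousel_rank, review_rating, review_count, travel_minutes

factors = ["review_rating", "review_count", "travel_minutes"]
correlations = {
    factor: hotels["carousel_rank"].corr(hotels[factor], method="spearman")
    for factor in factors
}

# The factors whose correlation with carousel rank is largest in magnitude
# are the strongest candidates for "most influential".
for factor, value in sorted(correlations.items(), key=lambda kv: abs(kv[1]), reverse=True):
    print(f"{factor}: {value:+.2f}")
```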

Their report goes into detail on many of the smaller factors that play a role, but DMW’s biggest findings were on the four big factors which determine which businesses are shown in the carousel and where they are placed.

1. Google Reviews – The factor that correlated most strongly with the best placement in the carousel was by far Google review ratings. Both the quantity and quality of reviews clearly play a big role in Google’s placement of local businesses, and marketers should be sure to pay attention to reviews moving forward. However, it is unclear how Google is handling paid or fake reviews, so many might be inspired to try to rig their reviews. For long-term success, I would suggest otherwise.

2. Location, Location, Location – Seeing as how the Google Carousel seems built around local businesses, it shouldn’t be a surprise that location matters quite a bit. Of the hotels in the study, 50 percent were within 2 miles of the search destination, while 75 percent were within 13 minutes of travel. Businesses would benefit from urging customers to search for specific landmarks or areas of cities, as you never know exactly where Google will establish the city “center”.

3. Search Relevancy and Wording – According to the findings, Google seems to change the weight of different ranking factors depending upon the actual search. For example, searching “downtown [city] hotels” will result in listings with an emphasis on location, while “best hotels in [city]” gives results most dependent on review rankings.

4. Primary Markets and Secondary Markets – It seems both small and large businesses are on a relatively level playing field when it comes to the carousel. Many small hotels are able to make it into the listings, right next to huge chains. The bigger businesses may have more capabilities to solicit reviews, but no hotel is too small to be considered for the carousel.

Leave it to Matt Cutts to always be there to clear the air when an issue is causing some webmasters confusion. One webmaster, Peter, asked Matt Cutts whether geo-detection techniques are actually against Google’s policies, as it is common for websites to be designed so that users are given the information (price, USPs) most relevant to their lives based on geo-location.

In some understandings of Google’s policies, this may be against the rules, but it turns out all is fine, so long as you avoid one issue.

In one of his Webmaster Chat videos, Cutts explained that directing users to a version of a site, or delivering specific information based on location are not spammy or against Google’s policies. It only makes sense to offer viewers information that actually applies to their lives.

What Google does consider spam is directing its crawlers, or Googlebot, to a page of content that users cannot see. Sending Googlebot to a different location than what visitors see is a bad idea, and it is considered spam or a form of cloaking. Instead, treat Googlebot as you would any user, by checking the location information and sending the crawler to the normal page reflecting that data.
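A minimal sketch of what that looks like in practice is below, using Flask and a hypothetical country_from_ip() helper standing in for whatever geo-IP lookup you actually use. The key point is that there is no user-agent check: Googlebot goes through the same location logic as everyone else.

```python
# A minimal sketch of geo-based content done the way Cutts describes: every
# visitor, Googlebot included, goes through the same location check.
# country_from_ip() is a hypothetical stand-in for a real geo-IP lookup
# (MaxMind, a CDN geolocation header, etc.).
from flask import Flask, request

app = Flask(__name__)

def country_from_ip(ip_address):
    # Placeholder: always returns "US" here; a real lookup would resolve the IP.
    return "US"

@app.route("/pricing")
def pricing():
    # No user-agent sniffing: the crawler sees exactly the page a visitor
    # from the same location would see.
    country = country_from_ip(request.remote_addr)
    if country == "GB":
        return "Pricing in GBP for visitors in the United Kingdom"
    return "Pricing in USD for visitors in the United States"

if __name__ == "__main__":
    app.run()
```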

Many considered it only a matter of time before advertising would find its way onto Instagram once Facebook purchased the app, though it took much longer than most expected. Instagram has remained ad-free until now, but that is about to change: Instagram announced late last week that advertising would begin rolling out within the Instagram photo stream over the next few months.

This doesn’t mark the first attempt to monetize Instagram. Jennifer Slegg reminds us of late last year, when Instagram altered its terms to suggest that it would hold the rights to all photos posted on the service, implying that Instagram would begin selling those photos to advertisers. The response was massive and overwhelmingly negative, as users began to flee from the service until the terms were reverted.

Since then, the waters have been quiet, but it was heavily expected that Facebook would attempt to turn Instagram into a revenue generating service, seeing as it cost Facebook $1 billion.

This attempt is a little more direct than their change to their terms, but it appears they will be slowly integrating advertisers. They are clearly more cautious this time around – Instagram even emphasized that there would be no changes to how image or video ownership would be viewed.

The company is starting with just a limited number of U.S. advertisers showing only small and occasional ads. All ads are required to use high-quality images and videos, so they should blend in on the feed. In Instagram’s own words:

“Seeing photos and videos from brands you don’t follow will be new, so we’ll start slow. We’ll focus on delivering a small number of beautiful, high-quality photos and videos from a handful of brands that are already great members of the Instagram community.

“Our aim is to make any advertisements you see feel as natural to Instagram as the photos and videos many of you already enjoy from your favorite brands. After all, our team doesn’t just build Instagram, we use it each and every day. We want these ads to be enjoyable and creative in much the same way you see engaging, high-quality ads when you flip through your favorite magazine.”

Expect the ads to be similar to the sponsored posts you see on Facebook, but designed for Instagram. The company will also be heavily soliciting feedback from users about the types of advertising being tested and shown, including the ability to hide them.

It remains incredibly unclear what Google’s thoughts or plans are for PageRank, as Matt Cutts, Google’s head of search spam, commented on Twitter yesterday that there won’t be any updates to PageRank or the toolbar anytime before 2014.

Niels Bosch asked the esteemed Google engineer whether there would be an update before next year, to which Cutts responded, “I would be surprised if that happened.”

According to Search Engine Land, it has been over 8 months since the last Google Toolbar PageRank update, back on February 4, 2013. Many have proclaimed the toolbar dead, but Cutts has personally defended the toolbar on a Webmaster chat within the past year, and said the toolbar won’t be going away.

However, as Cutts himself explained, Chrome doesn’t have a PageRank extension, Google dropped support for Firefox in 2011, and Internet Explorer 10 doesn’t support toolbar extensions. It seems clear there will be less and less of an audience for the toolbar, so its relevancy and use will likely taper off until it just kind of disappears.

It is always possible that Google might put out a surprise update next year, but don’t expect PageRank to be around forever.

Just as in any field, there are plenty of supposed SEO experts who are more than happy to offer you services and guarantees they can’t back up in order to get you to sign a contract. There are a few different ways these scammers operate, but when it boils down to it, they all promise online success while stealing your money.

Any time you are hiring a company for online marketing, it is best to do your homework and ensure you’re getting what you’re paying for. You can find great success online, but if an offer sounds too good to be true, it probably is. Jaydeep Dosi from Search Engine Journal shares the most common claims you should be wary of.

We Offer Free Services

Proper SEO is time-consuming to manage, the economy is unforgiving, and search engine optimization is a highly competitive field. How could any business with a long-term hope of survival offer its services free of cost? The answer is: they can’t. Yes, real SEO professionals are able to offer special rebates or low pricing occasionally. You will even see the odd service offered for free within a larger transaction, but nothing comes entirely for free. SEO “experts” claiming not to charge you are likely more interested in your information and other details you don’t want them getting ahold of.

We Guarantee First Page Ranking

Watch the wording on these types of offers closely. Many SEO professionals emphasize their goal of getting your site to the first page of search engine results pages (SERPs), but they can’t honestly guarantee it. They also can’t guarantee any level of traffic, though that is certainly a goal as well. The reality is search engines guard their information closely, and they change their algorithms all the time. We work to stay on top of these changes and learn as much as we possibly can to gain exposure and visibility, but nothing is guaranteed.

We Submit Your Site to Hundreds of Search Engines

This isn’t a lie so much as a misrepresentation. Think about it for a second: how many people do you know using any search engine besides one of the main few? Google, Bing, and Yahoo are all still relevant in their own ways, but there aren’t hundreds of useful search engines. There aren’t even tens of relevant search engines. You really don’t need your site submitted to more than two or three of the most popular engines, so don’t get caught paying for wasteful services.

We Have Connections Within Google!

Any company advertising this way is a downright fraud. The majority have absolutely no connection with actual Google employees. But, more importantly, do you really think a Google employee is going to risk their job to help a friend rank their client’s sites higher? Nope.

We Know Everything About Google’s Algorithms

A company may claim to be an expert on Google’s algorithms, but you should press them to share exactly what they mean. While one might be an “expert” in that they keep up constantly with all the latest news and information about how Google’s search engines operate, it might be hard to consider them a real expert compared to an actual Google engineer. However, an SEO professional claiming to know every detail of Google’s algorithms is blatantly lying. These algorithms are dynamic and ever-evolving, not to mention they are so complex it would be impossible to know and understand the entire system. Search engines aren’t telling us their secrets.

We Have a Secret Formula for Success

The worst snake oil peddlers don’t even try to tell you what they will actually do. Successful SEO practices are no secret, and anyone who will help you achieve your goals will tell you so. To be truly successful in SEO, you just need to work hard and with focus from the very beginning and be responsible for keeping up to date with the current best practices and guidelines.

Google AdWords announced yesterday that a major reporting update to conversion tracking, called Estimated Total Conversions, will be rolling out over the next few weeks. The new feature provides estimates of conversions which take place over multiple devices and adds them to the conversion reporting we are already accustomed to.

Since enhanced campaigns launched earlier this year, search advertisers have had more control over combining mobile and desktop, with the ability to further modify bids by device and other targeting considerations. But a missing piece was limiting the effectiveness of campaigns: we had limited data on how consumers actually navigate and convert across multiple devices.

What is a Cross-Device Conversion?

The widespread use of mobile and tablet devices to browse and shop online has greatly influenced how we actually interact with businesses. From our couch, we can have three options for achieving our online goals within reach, and it has been shown that we choose different devices for different tasks.

A study from Google last month found that more than 90 percent of multi-device consumers move sequentially through several screens, like mobile to desktop or mobile to tablet, in order to complete transactions. There are even those who move from desktop screen to desktop screen, likely going from work to home computers. Any time a person begins the actions that initiate a conversion on one screen, only to complete the conversion later on another screen, that is a cross-device conversion.

How Estimated Total Conversions Are Calculated

Google calculates these conversions for advertisers based on how their customers convert when they are signed in. Then they extrapolate from that data to estimate what total cross-device conversions may be. According to Search Engine Watch, the data is only used in aggregate and is not personally identifiable.
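Google hasn’t published the exact formula, but the back-of-the-envelope sketch below shows the general kind of extrapolation being described, with invented numbers: measure the cross-device rate among signed-in converters, then apply that rate to all observed conversions.

```python
# A back-of-the-envelope sketch of the extrapolation described above. This is
# NOT Google's actual formula; the figures are invented to show the idea.

observed_conversions = 1_000     # conversions tracked normally for the campaign
signed_in_conversions = 400      # subset where the converter was signed in to Google
signed_in_cross_device = 60      # of those, how many started on a different device

# Cross-device rate measured on the signed-in sample
cross_device_rate = signed_in_cross_device / signed_in_conversions  # 0.15

# Apply that rate to all observed conversions to estimate the unseen remainder
estimated_cross_device = observed_conversions * cross_device_rate   # 150.0
estimated_total = observed_conversions + estimated_cross_device     # 1150.0

print(f"Estimated cross-device conversions: {estimated_cross_device:.0f}")
print(f"Estimated total conversions: {estimated_total:.0f}")
```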

Last week, SEO and online marketing professionals all had a collective freakout as keyword data stopped showing up in Webmaster Tools. They even made memes! Well, there is good news: Google has said the issue was an unintended bug and should be fixed soon.

Google made a very public switch to secure search last week, in an effort to encrypt all search information and provide “extra protection” to searchers. Webmasters immediately noticed nearly all of their keyword referral data disappeared and was replaced with “(not provided)”. The best way to deal with the issue was to access similar keyword data under Search Queries within the Search Traffic section of Google Webmaster Tools.

But there was a problem: when secure search was implemented, that keyword data stopped being reported within Webmaster Tools. Many questioned whether this was a mistake or a change in policy, while the usual anti-Google crowd proclaimed Google had lied and was intentionally hiding the data; Matt Cutts had previously estimated only one to two percent of keyword data would be affected by secure search.

Now John Mueller, a member of the Google Webmaster Tools team in Europe, as well as a separate Google spokesperson have both clarified that the missing data was the result of a bug and that they are working hard to solve the problem.

Mueller posted to the Google Webmaster Central forum, “The team is aware of the problem and working on speeding that data back up again. Thanks for your patience in the meantime.” The spokesperson told Search Engine Watch, “We’ve recently fixed a small bug related to data reporting in Webmaster Tools. We expect reporting to return to normal in the coming days.”