Tag Archive for: Search Engine Watch

Advertisers on Facebook won’t have to go through demand-side platforms (DSPs) to manage their retargeting campaigns for much longer. According to Search Engine Watch, Facebook is building new retargeting options that won’t force advertisers to go through FBX (Facebook Ad Exchange) or any platform other than Facebook’s own interface.

Until now, advertisers using FBX have only been able to serve their ads on desktop, within the news feed or right sidebar, and they have had to buy their ad space through separate DSPs. Considering how many Facebook users access the social media platform via smartphones or tablets, it is surprising it has taken this long for Facebook to let advertisers target individuals on mobile devices.

What’s New?

The big new feature will be Custom Audiences, which will allow advertisers to set up their retargeting campaigns directly through Facebook’s interface. That will include the ability to overlay standard Facebook targeting options as well.

The ability to target mobile devices is, of course, another huge aspect of this update, as it is undeniable that a remarkable percentage of Facebook users primarily use mobile devices for social media.

What is FBX Still Better At?

FBX still has benefits over the options that will be available through the Facebook interface. The most important of those benefits is predictive buying. If an individual continuously browses for a certain product or type of service, FBX’s predictive buying capabilities allow advertisers to show an ad reflecting that interest.

There have never been more opportunities for local businesses online than now. Search engines cater more and more to local markets as shoppers make more searches from smartphones to inform their purchases. But in the more competitive markets, that also means local marketing has become quite complicated.

Your competitors may be using countless online tactics aiming to ensure their success over yours, and to stand a chance you have to employ a similarly vast set of strategies. When competition heats up and online marketing grows convoluted, some things get overlooked. The more you have to juggle, the more likely you are to make a serious mistake.

In true Halloween fashion, Search Engine Watch put together the four most terrifying local search mistakes that can frighten off potential customers.

Ignoring the Data Aggregators

A common tactic is to optimize Google+ listings, as well as maybe Yelp or a few other high-profile local directories. But why stop there? Google crawls thousands and thousands of citation-bearing sites every day, so optimizing only a few listings means missing out on serious opportunities.

To handle this efficiently and optimize the sites most visible to customers, businesses should focus on the data sources Google actually uses to understand local online markets. The best way to do this is to submit business data to the biggest data aggregators, such as Neustar Localeze, InfoUSA, Acxiom, and Factual.

Not Having an Individual Page for Each Business Location

A few years ago Matt Cutts, one of Google’s most respected engineers, said, “if you want your store pages to be found, it’s best to have a unique, easily crawlable URL for each store.” These days organic ranking factors have become much more influential in Google’s method of ranking local businesses, so this advice has become more potent than ever before.

There are also numerous non-ranking reasons you should have an optimized page for each location. If individual locations don’t have their own pages, Google can’t index that content separately and instead only sees the results offered in a business locator. Think of it like optimizing a product site without product pages: if the results don’t have separate pages, they lose context and usability.

Ignoring the Opportunity to Engage Your Customers

Whether you want to face it or not, word of mouth has become more important than ever as consumers talk about businesses on social media. Each opinion has an exponentially larger audience than ever in history, so a single bad review is seen by hundreds or thousands of potential customers. Thankfully, that one review doesn’t have to be your undoing.

First, if bad reviews are seen by more people, the same can be said for good reviews. If a bad review is an outlier, it may not make much of an impact on viewers. More importantly, every review, mention, or interaction with your business gives you the opportunity to engage back. If you see a positive mention online, showing gratitude for the remark opens up an entirely new connection with your brand. Similarly, a bad review can be salvaged by simply asking how you can improve that customer’s experience in the future.

Not Using Localized Content

Pretty much every local online marketer has heard about the importance of using relevant keywords in their content so their website ranks for those terms. But they tend to apply this logic only to the products or types of services they offer.

Local keywords such as ZIP codes, neighborhoods, or popular attractions can do as much to help you stand out in important searches as product-based keywords can. Simply including information about traffic or directions can help you start ranking for search terms your competitors are missing.

AdWords Editor is one of the best tools available for editing and building out campaigns and ad groups, but it has its limits. For example, it would be convenient to be able to break up an ad group, clone a campaign, or copy keywords right in the Web UI. Thankfully, now you can, as Ginny Marvin reports: this week, Google added copy and paste functionality to the Web UI, bringing one of AdWords Editor’s conveniences to the browser.

Users can now copy keywords, ads, ad groups, or entire campaigns directly in the Web UI with the keyboard shortcuts you are already used to: simply press Ctrl-C/Cmd-C and Ctrl-V/Cmd-V. If you prefer, you can also use the Edit drop-down menu.

Marvin suggests the final version of the update might not look exactly like Google’s screenshot. You may see “Copy to” instead of “Copy” in the drop-down, and you might not see the option to “copy keywords as paused.” Other than that, the tool is expected to roll out as planned.

You will be prompted when copying keywords to select both the campaign and ad group where you would like them to be copied.

Google AdWords announced yesterday that a major reporting update to conversion tracking, called Estimated Total Conversions, will be rolling out over the next few weeks. The new feature provides estimates of conversions that take place across multiple devices and adds them to the conversion reporting we are already accustomed to.

Since enhanced campaigns launched earlier this year, search advertisers have had more control, combining mobile and desktop with the ability to modify bids by device and other targeting considerations. But there was a missing piece limiting the effectiveness of campaigns: we had limited data on how consumers actually navigate and convert across multiple devices.

What is a Cross-Device Conversion?

The widespread use of mobile and tablet devices to browse and shop online has greatly influenced how we interact with businesses. From the couch, we may have three devices within reach for achieving our online goals, and it has been shown that we choose different devices for different tasks.

A Google study last month found that more than 90 percent of multi-device consumers move sequentially through several screens, like mobile to desktop or mobile to tablet, to complete transactions. Some even move from desktop screen to desktop screen, likely going from work to home computers. Any time a person begins a conversion on one screen, only to complete it later on another, that is a cross-device conversion.

How Estimated Total Conversion is Calculated

Google calculates these conversions based on how customers convert when they are logged in, then extrapolates from that data to estimate what total cross-device conversions may be. According to Search Engine Watch, the data is only used in aggregate and is not personally identifiable.
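Google hasn’t published the model behind Estimated Total Conversions, but the basic idea of extrapolating from a logged-in sample can be sketched roughly like this (the function name and rate are purely illustrative, not Google’s actual method):

```python
# Illustrative sketch only: extrapolate cross-device conversions from a
# rate measured on logged-in users, where such conversions are observable.

def estimate_total_conversions(reported_conversions, cross_device_rate):
    """Return reported conversions plus an estimated cross-device share.

    cross_device_rate: fraction of additional conversions, measured on
    the logged-in sample, that started on one device and finished on
    another. Applying it to all reported conversions assumes logged-in
    users behave like everyone else.
    """
    estimated_cross_device = reported_conversions * cross_device_rate
    return round(reported_conversions + estimated_cross_device)


# e.g. 1,000 reported conversions and a 12% cross-device rate in the
# logged-in sample would yield an estimate of 1,120 total conversions.
print(estimate_total_conversions(1000, 0.12))
```

The key caveat is the assumption in the docstring: the estimate is only as good as how representative logged-in users are of the whole audience, which is why Google reports it as an aggregate estimate rather than an exact count.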

Last week SEO and online marketing professionals had a collective freakout as keyword data stopped showing up in Webmaster Tools. They even made memes! Well, there is good news: Google has said the issue was an unintended bug and should be fixed soon.

Google made a very public switch to secure search last week, in an effort to encrypt all search information and provide “extra protection” to searchers. Webmasters immediately noticed nearly all of their keyword referral data disappeared, replaced with “(not provided)”. The best workaround was to access similar keyword data under Search Queries within the Search Traffic section of Google Webmaster Tools.

But there was a problem: when secure search was implemented, that keyword data stopped being reported within Webmaster Tools. Many questioned whether this was a mistake or a change in policy, while the usual anti-Google crowd proclaimed Google had lied and was intentionally hiding the data; Matt Cutts had previously estimated only one to two percent of keyword data would be affected by secure search.

Now, John Mueller, a member of the Google Webmaster Tools team in Europe, and a separate Google spokesperson have both clarified that the missing data was the result of a bug, and that they are working hard to solve the problem.

Mueller posted to the Google Webmaster Central forum, “The team is aware of the problem and working on speeding that data back up again. Thanks for your patience in the meantime.” The spokesperson told Search Engine Watch, “We’ve recently fixed a small bug related to data reporting in Webmaster Tools. We expect reporting to return to normal in the coming days.”

Links

Since the introduction of Google’s Penguin algorithm, many have suggested that links are no longer important for SEO. I’ve even seen some misguided folks suggest all links are outright bad. As usual, the truth is more complicated than that.

It has become such a common issue that a veteran SEO writer used his regular column at Search Engine Watch to fully answer whether links are important for SEO these days. The exact question he was asked was, “Do you feel Google is putting less emphasis on links as part of their algorithm?”

The truth is, a variety of types of links have been devalued and now count very little, or are outright poisonous to your SEO. But these links were almost entirely the type “that never should have been counting in the first place.”

You see, the types of links being devalued are being brought down because they are spammy. Google has gotten increasingly smarter and better at its job of helping people find what they want on the internet without running into spam or low-quality sites. The devalued links come from junk directories, link networks, paid link brokers, article databases, link wheels, etc. The list could go on and on. But, this hasn’t brought down the quality links that good SEO professionals have built.

In Ward’s opinion, quality links matter even more now. Google can tell a lot of information about links in your profile, and they are swift to penalize low quality or spammy links, but they are even more rewarding to those who have the “right” kind of links.

Any SEO professional or online marketer you hire to help raise your brand’s profile online should be able to tell the difference between good and bad links. They know what Google doesn’t like, and they stay out of trouble. However, the best online marketers know that organic search traffic and link building are only a part of a much larger system.

It may not come as a surprise to those who have been watching closely, but this week Twitter put the rumors to rest by officially filing for its IPO. Twitter was naturally assumed to be the next major online technology company to go public after Facebook.

Twitter announced their submission of an S-1 to the SEC exactly how you would expect; they tweeted the news yesterday. The announcement read: “We’ve confidentially submitted an S-1 to the SEC for a planned IPO. This Tweet does not constitute an offer of any securities for sale.”

One of the most interesting aspects of the filing is that Twitter filed a “secret” IPO; the terms will be kept confidential under new regulations from the JOBS Act, which allow smaller companies to withhold their financial information from the public.

According to The Verge, Twitter is the first well-known web company to file for a “secret” IPO, but it also confirms that Twitter has less than $1 billion in revenue. They will eventually have to release their financial information, but not until “the road show part of their public offering,” as Search Engine Watch explains.

Unless you’ve been living under a rock for the past year, you’ve heard how important creating content is to your SEO strategy. Larger companies have little trouble finding resources for solid content creation, but smaller businesses face a much larger hurdle. Smaller businesses mean smaller budgets, yet these businesses still need to find a way to market themselves.

Social media and blogs have made it easier than ever to create and share content with your audience, so small businesses have many more feasible options than in the past. This content creates a relationship with your audience and cements your brand as a trusted resource, and it doesn’t have to cost an arm and a leg as long as you focus on the right types of content. Phillip Thune highlighted four of the best ways small businesses can deliver quality content without destroying their budget.

Blogs

No matter what your marketing strategy, if you have an online presence (which you should), you need a blog. A blog is the cornerstone of any SEO or online marketing plan, and it offers a convenient way to share new products, industry news, and interesting facts with your consumers. Not only does a blog give your company a voice, it also improves your SEO so more people can find you. Plus, anything you post can be easily shared to all the most popular social media platforms.

Ebooks

Ebooks are digital books or publications that people can easily receive via the internet and read on a computer, tablet, or smartphone. They share information and establish credibility by showing your expertise to clients. Most businesses request information such as an email address in exchange for an ebook, so they also generate leads. Ebooks require more effort as a single concentrated piece of content, but they often gain more traction than blog posts, so long as you create something valuable to readers and promote it enough to be seen.

Slide Presentations

You have no doubt put together a few slide presentations in your career and are familiar with their easy-to-read format. They are typically used for sales presentations and conferences, but they can also share educational content. Slideshows are easily published on SlideShare, YouTube, or Vimeo, and will help you build trust and credibility within your field.

Press Releases

Press releases have long been the best way to spread information and establish credibility in your market and your local community. They announce new products or services while showcasing your brand as a respected part of its community. Traditionally, press releases are shared with journalists or newswires, but they also encourage bloggers and other publications to pick up your story. Distributing them helps a small business build relationships with local journalists, and you can also post them online to reach your customers directly.

When Facebook announced the introduction of hashtags in June, it seemed like a big deal, especially within the social media marketing industry. Online marketers immediately began investigating how to make the most of hashtags, and whether they are even worth the effort. A few months later, it appears hashtags aren’t faring well.

Facebook Hashtag Graph

In late July, Simply Measured reported that status updates with hashtags weren’t gaining brands any extra exposure; now, Search Engine Watch reports, EdgeRank Checker has similar findings.

According to EdgeRank Checker’s data, viral reach and engagement were lower on posts with hashtags than on those without. The firm studied over 500 pages, then compared its data to a sample of 50 Twitter accounts from Fortune 500 brands. On Twitter, by contrast, 70 percent of brands saw an increase in retweets when using a hashtag, indicating higher engagement.

EdgeRank Checker did have an idea why Facebook users may not be responding to the hashtags:

Our hypothesis is that not many people are clicking on hashtags. If many people were clicking hashtags, we should see an increase in Viral Reach for posts with hashtags. The data is not showing that. If anything, it’s showing a decrease in Viral Reach.

We hypothesize that hashtagged posts don’t have the expected increase in Viral Reach due to how brands are using them. After examining how hashtags are being used, hashtags are often used in promotional material. For some brands, they’ve created campaigns around particular hashtags and use them in all posts associated with the campaign. By nature, campaigns are promotional, therefore more likely to drive less engagement, less clicks, and ultimately less Reach.