
Google has worked for years to ensure speakers of all languages can use and benefit from their search engine. But with the increasing use of conversational and voice search, another issue has arisen.

Millions of people around the world are at least partially multilingual, including up to 20 percent of the U.S. population. Starting today, Google can understand more than one of those languages at the same time.

As announced in a blog post today, multilingual people can change their settings once, then speak in any of up to five of the 50 languages Google understands and be understood. Previously, users could only use a single language at a time, but now they can switch between languages however they are most comfortable.

As the blog post explains:

Now, you can just make a small, one-time change to your settings, and then you can switch back and forth easily. Google will automatically detect which language you’re using. (For now, you need to stick to one language per sentence though.) You can select up to five languages total—enough to satisfy all but the most advanced polyglots. Whether you get a spoken response from Google depends on the language you use and your query (and you’ll see more languages and features added over time).

While this is beneficial for many Americans, it could be downright revolutionary in the many areas of the world where children and immigrants interchangeably speak a native language and an adopted dominant language.
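Google hasn’t said how its detection works under the hood, but the one-language-per-sentence limit suggests sentence-level language identification. As a rough illustration only (not Google’s implementation), here is a sketch using the third-party Python langdetect package, with a hypothetical five-language allowlist standing in for a user’s settings:

```python
# A rough sketch of per-sentence language identification, in the spirit of the
# feature described above. This is NOT Google's implementation; it uses the
# third-party langdetect package (pip install langdetect).
from langdetect import DetectorFactory, detect

DetectorFactory.seed = 0  # make detection deterministic across runs

# A user's five chosen languages (hypothetical allowlist).
ALLOWED = {"en", "es", "fr", "de", "it"}

# One language per sentence, matching the limitation Google describes.
sentences = [
    "What's the weather like today?",
    "¿Dónde está la biblioteca?",
    "Quelle heure est-il ?",
]

for sentence in sentences:
    lang = detect(sentence)  # returns an ISO 639-1 code such as "en"
    status = "handled" if lang in ALLOWED else "not in user's settings"
    print(f"{lang:>2} ({status}): {sentence}")
```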

Last week, many webmasters and SEOs received a scare in the form of extortion emails from a supposed SEO threatening to plague their sites with negative SEO if they did not pay a ransom of $1,500.

It seems the emails concerned even the most prominent members of the SEO community, such as Dan Petrovic and Steve Webb. More interestingly, despite assurances from Google that they would investigate, a fair portion of the community appears to be at least moderately troubled by the threats. This gives an indication of just how easy people perceive negative SEO to be.

The email cuts straight to the point, opening with, “This is an extortion email.” It then explains exactly how the sender(s) will enact specific tactics that can hurt a site’s performance in Google and potentially cause the site to be deindexed by the search engine.

The full text of the emails is as follows:

Hello,

Read this email very carefully.

This is an extortion email.

We will do NEGATIVE SEO to your website by giving it 20,000 XRumer forum profile backlinks (permanent & mostly dofollow) pointing directly to your website and hence your website will get penalised & knocked off the Google’s Search Engine Result Pages (SERP) forever, if you do not pay us $1,500.00 (payable by Western Union).

This is no false claim or a hoax, download the following Notepad file containing 20,000 XRumer forum profile backlinks pointing to http://www.negativeseo.cn.pn/ (this is our website and go and see on this website, you will find our email address [email protected] from which this email right now is being sent to you) :

http://www.mediafire.com/download/eizjwnpq2rsrncu/20000-XRumer-Forum-Profile-Backlinks-Dofollow.txt

Just reply to this email to let us know if you will pay just $1,500.00 or not for us to refrain or not from ruining your precious website & business permanently. Also if you ignore this email and do not reply to this email within the next 24-48 hours, then we will go ahead and build 20,000 XRumer forum profile backlinks pointing directly to your website.

We are awaiting your wise decision.

RS

Thankfully, it appears the entire situation amounted to nothing more than empty threats. Despite several credible SEO figures reporting the extortion emails, no one has reported paying the ransom, and there are no signs that negative SEO is actually being carried out against these sites.

Now that we’ve all hopefully gotten over the “links are dead” hysteria, SEOs and webmasters are beginning to worry about their backlink profiles again. In the past it was easy: you could buy links or enact one of the many now-banned tactics to artificially inflate your backlink profile, and it seemed like no one was the wiser.

Of course, things have changed quite drastically, as you should know by now. Backlinks need to be earned, and they need to be high quality. As many analysts will tell you, building backlinks these days is more about relationship building than about farming as many links as possible. But how are you supposed to earn these prized high-quality links?

SEOChat asked a long list of SEO experts where their most valuable links came from, and each gives an example of how you can earn links yourself simply by providing a service to your users and to important figures in your industry.

In the past, several Google employees have suggested they would like to see site security included as a ranking factor within their search engine. Now, Google has followed through and announced that going HTTPS, or adding an SSL certificate with a 2048-bit key to your site, can potentially give you a small ranking boost.

Don’t expect to propel yourself to the top of the search results by adding HTTPS: Google refers to it as “a very lightweight signal” in the larger scheme of things, one that affects “fewer than 1% of global queries.” However, Google also implied that the new ranking signal may get beefed up in the future in an attempt to encourage all site owners to increase the security of their sites.

The change should come as little surprise to anyone who heard Matt Cutts, Google’s head of search spam, publicly endorse the idea of making SSL a ranking factor just a few months ago.

Unlike many of Google’s ranking changes, the risk of drawbacks here is small. For years, Google has said that switching to HTTPS should not have an effect on SEO, so long as you take a few steps to keep your traffic steady. Mostly, those steps involve communicating with Google so it understands how to read your site.

Google has also said they will be releasing more information and resources for webmasters deciding to adopt HTTPS, but for now all they offer are these tips (a quick self-check sketch follows the list):

  • Decide the kind of certificate you need: single, multi-domain, or wildcard certificate
  • Use 2048-bit key certificates
  • Use relative URLs for resources that reside on the same secure domain
  • Use protocol relative URLs for all other domains
  • Check out our site move article for more guidelines on how to change your website’s address
  • Don’t block your HTTPS site from crawling using robots.txt
  • Allow indexing of your pages by search engines where possible. Avoid the noindex robots meta tag.
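None of these checks require special tooling. As a minimal self-audit sketch, here is how you might spot-check a few of the tips above in Python with the third-party requests package; the domain is a hypothetical placeholder, and the checks are deliberately crude starting points, not an official Google tool:

```python
# A minimal self-audit for a few of the HTTPS tips above, using the
# third-party requests package. example.com is a hypothetical placeholder.
import requests

DOMAIN = "example.com"  # substitute your own domain

# 1. The HTTP version of the site should redirect to HTTPS.
resp = requests.get(f"http://{DOMAIN}/", allow_redirects=False)
print("HTTP status:", resp.status_code, "->", resp.headers.get("Location"))

# 2. robots.txt on the HTTPS site should be reachable and should not
#    block the whole site with a blanket "Disallow: /".
robots = requests.get(f"https://{DOMAIN}/robots.txt")
blanket_block = any(
    line.strip().lower() == "disallow: /" for line in robots.text.splitlines()
)
print("robots.txt blocks everything:", blanket_block)

# 3. Pages should not carry a noindex robots meta tag (crude substring check).
page = requests.get(f"https://{DOMAIN}/")
print("'noindex' found in homepage HTML:", "noindex" in page.text.lower())
```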

It has been a few weeks since Google caught the search world by surprise with the release of its local search algorithm, nicknamed “Pigeon.” Of all Google’s search algorithms, Pigeon was likely the best received at its initial roll-out, but is that still the case now that some time has passed?

While we at TMO still feel that Pigeon has the potential to help local businesses and searchers improve their local results, it is always good to get the opinions of other experts in the search marketing community. Thankfully, Search Engine Land did just that: they compiled the opinions of several authority figures in search marketing, and needless to say, the consensus is mixed.

Much of the criticism relates to bugs likely to be resolved in the near future, but there is also plenty worth discussing and lots of room for improvement. You can find out exactly what the experts had to say here.


Google has been working hard to expand their reviews and ratings systems, and yesterday they took a big step by announcing that they will be introducing product ratings for Product Listing Ads (PLAs).

The announcement, which appeared on the Inside AdWords blog, stated:

Product reviews provide critical information to shoppers making purchase decisions. To help shoppers easily find this information when searching for products, we’re introducing product ratings on Product Listing Ads.

Shoppers browsing on Google will see the typical product listings they have become accustomed to, but beneath those listings, product ratings will also be shown in the form of stars and review counts. For now, it appears the changes will only be seen on search results within the United States.

The data used for these review listings will be gathered from multiple sources, such as merchants, third-party aggregators, and editorial sites.
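Google didn’t publish a feed specification in the announcement, but structured review data of this kind is commonly expressed with schema.org’s AggregateRating vocabulary. As an illustrative sketch only, with hypothetical product values and no claim to match Google’s actual feed format:

```python
# An illustrative sketch of structured product-rating data, expressed with
# schema.org's AggregateRating vocabulary. The field values are hypothetical,
# and this is not Google's published feed format for PLAs.
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",  # hypothetical product
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": 4.4,    # the star rating shown beneath the listing
        "reviewCount": 127,    # the review count shown alongside the stars
    },
}

print(json.dumps(product, indent=2))
```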

If reviews for businesses are any indication of what this change will bring, it seems very likely that businesses offering products with largely positive reviews will be able to leverage the updated listings not only to increase their click-through rates, but also to increase conversions overall.

A few weeks ago, Google finally got around to releasing the iOS version of Google Analytics. The app had been available for Android for quite some time, but the iOS release makes website data available to webmasters at any time, and it is fair to assume some business owners and webmasters may be trying Google Analytics for the first time.

While Analytics is without a doubt one of the most powerful tools for analyzing your website and how others are accessing it, it can also be a bit overwhelming for those who aren’t familiar with the layout and aren’t well versed in the terminology.

To help familiarize new and inexperienced webmasters with Google Analytics, Emma Barnes, who offers Google Analytics training at Branded3, reviewed many of the most common questions she receives and the terminology you can expect to run into when using Analytics.

Once those questions are out of the way, you may find yourself tasked with another question: “just what am I supposed to do with all this information?” For that, you may want to browse the recent article titled “11 Things You Should Be Doing With Google Analytics” from Search Engine Journal.

If you want to be in control of your website, you need all the information possible to make the right choices. Google Analytics can give you the numbers you want, and these resources will help you know what to do with them.


Last night, Google breathed new life into its long-forgotten local search algorithm, updating it to provide more useful, relevant, and accurate local search results that are more closely tied to traditional web search ranking signals.

While Google has remained mum on many of the details, we do know the changes are visible within Google Maps search results as well as traditional search results. Judging from the online discussion, local businesses are also seeing significant ranking and traffic boosts, as most responses have been positive.

Most of the changes are behind the scenes, and Google doesn’t want to share them with the world. However, Barry Schwartz shared that the new algorithm ties more deeply into web search than before, linking local results to search features such as the Knowledge Graph, spelling correction, synonyms, and more.

Google also says the new algorithm improves their distance and location ranking parameters.

The algorithm is already rolling out for US English results, but Google wouldn’t say when to expect it to roll out for other regions and languages, nor would they comment on what percentage of queries has been affected or whether web spam algorithms were included in the update.

As a business owner with an eye on your company’s online marketing success, you have likely heard about Google’s search engine algorithms. You may even have a general idea of how they function and affect your business’s online presence and marketing strategies.

But unless you spend your free time reading all the SEO blogs, you probably have questions about some aspects of how these algorithms work. If your company operates internationally, one of those questions is very likely whether Google’s algorithms work the same around the world.

While the algorithms largely tackle the same issues, the short answer is that they do not all work the same on an international scale.

As Barry Schwartz recently highlighted, you can find specific examples of algorithms varying across borders by looking at the Google Panda algorithm. Panda initially launched for English-language Google search in February 2011, but the rest of the globe didn’t see it roll out for quite some time. Notably, it took Google 17 months to release Panda in Korea and Japan to target Asian languages.

However, the Google Penguin algorithm didn’t have nearly the same delay. Penguin rolled out globally and impacted sites in any language.

What’s the reason for the difference? It all boils down to purpose. Panda focuses on language and content, so it has to be customized for the wide variety of languages found around the world. Algorithms like Penguin, meanwhile, target off-page technical factors like links, which require far less customization across languages.

Duplicate content has been an important topic for webmasters for years. It should be absolutely no secret by now that duplicate content is generally dangerous to a site and usually offers no value, but there are occasional reasons for duplicate content to exist.

Of course, there are very real risks with hosting a significant amount of duplicate content, but often the fear is larger than the actual risk of penalties – so long as you aren’t taking advantage and purposely posting excessive duplicate content.

Google’s John Mueller puts the risk of duplicate content in the best context. According to John, there are two real issues with duplicate content.

The first issue is that Google’s algorithms typically choose one URL automatically to show for a given piece of content in search, and sometimes you don’t get a say. The only way you can effectively let Google know your preference is by using redirects or canonical tags, and even that isn’t foolproof.
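It is easy to spot-check which URL you are actually signaling for a page. Here is a minimal sketch using the third-party requests and beautifulsoup4 Python packages, with a hypothetical URL; it reports where redirects land and what the rel=canonical tag declares:

```python
# A quick check of which URL a page actually signals to search engines,
# via redirects and the rel=canonical tag. Uses the third-party requests
# and beautifulsoup4 packages; the URL below is hypothetical.
import requests
from bs4 import BeautifulSoup

url = "http://example.com/product?ref=campaign"  # a hypothetical duplicate URL

resp = requests.get(url)  # redirects are followed by default
print("Final URL after redirects:", resp.url)

soup = BeautifulSoup(resp.text, "html.parser")
canonical = soup.find("link", rel="canonical")
if canonical and canonical.get("href"):
    print("Declared canonical:", canonical["href"])
else:
    print("No canonical tag found; Google will pick a URL on its own.")
```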

The second issue is that hosting a ton of duplicate content can make crawling overwhelming for the server, which will keep new content from being noticed as quickly as it should be.

Still, John said that in most cases, “reasonable amounts of duplication […] with a strong server” is not a huge problem, as “most users won’t notice the choice of URL and crawling can still be sufficient.”