
Pretty much anything Google’s most popular engineer, Matt Cutts, says makes headlines in the SEO community, but his Webmaster Chat videos and advice often aren’t mind-blowing by any stretch of the imagination. For instance, we recently covered a video where Cutts explained that bad grammar in the comment section most likely won’t hurt your rankings (unless you allow spam to run rampant).

For content creators, it was a legitimate concern that poorly written comments might negate the hard work put into writing legible and well-constructed content. However, many used this to run headlines claiming that Google doesn’t care about grammar, which is not even close to being confirmed.

As Search Engine Land points out, way back in 2011 Cutts publicly stated that there is a correlation between spelling and PageRank, but that Google does not use grammar as a “direct signal.” In his latest statement on the issue, however, Cutts specifies that you don’t need to worry about the grammar in your comments “as long as the grammar on your own page is fine.” This suggests Google does in fact care about the level of writing you are publishing.

It is unclear exactly where Google draws the line at the moment: the company implies that grammar within your content matters, but it has never stated that grammar is a ranking signal. Chances are a typo or two won’t hurt you, but Google may well punish pages with rampant errors and legibility issues.

On the other hand, Bing has recently made it pretty clear that they do care about technical quality in content as part of their ranking factors. Duane Forrester shared a blog post on the Bing Webmaster Blog which states, “just as you’re judging others’ writing, so the engines judge yours.”

Duane continues, “if you [as a human] struggle to get past typos, why would an engine show a page of content with errors higher in the rankings when other pages of error free content exist to serve the searcher?”

In the end, it all comes down to search engines trying to provide the best quality content they can. The search engines don’t want to direct users to content that will be hard to make sense of, and technical errors can severely impact a well thought-out argument.

As always, the best way to approach the issue is to simply write content for your readers. If your content can communicate clearly to your audience, the search engines shouldn’t have any problems with it. But, if a real person has trouble understanding you, the search engines aren’t going to do you any favors.

The SEO community is sometimes thought of as a stuffy industry, but we like to have fun like any other group of people. For example, you probably would never have guessed that there are online games aimed specifically at the optimization community.

Yet in the past week two such games have surfaced, both very SEO-centric. They’re a cool novelty, and they offer about as much fun as the games they are based on.

First we have Donkey Cutts, a Donkey Kong knock-off that uses prominent SEO personalities and tech imagery in place of an oversized monkey and barrels. Obviously Matt Cutts from Google is featured, but players can also pick from other SEO personalities (though there is some disagreement about who exactly the characters are).

Donkey Cutts

There is also Madoogle, a clone of Angry Birds which lets you attack black hat SEOs with some more easily recognizable SEO faces. This one includes versions of Matt Cutts (again), Rand Fishkin, Lisa Barone, and Barry Schwartz.

Madoogle

They probably won’t help you rank much higher, but these games might allow you to relax for a few minutes while still keeping SEO fresh in your mind.

Internet commenters aren’t quite known for their excellent grammar or insightful conversation. While there are plenty who contribute and expand on content with helpful information in the comments, there are also more than a few who struggle with language or actively try to troll other users with garbled borderline nonsense.

For most users, these types of commenters are a simple annoyance at most. But, for those who spend long hours crafting grammatically correct and easily-readable content, the less readable comments can cause some worry. While quality content is most important for users, content creators also aim for high legibility because it can affect rankings.

It is no secret that nonsensical or poorly written content doesn’t tend to perform well on search engines, but what about those comments that aren’t so carefully put together? Can comments with bad grammar or readability hurt your content’s rankings? According to Matt Cutts, Google’s most popular spokesperson, the answer is not really.

“I wouldn’t worry about the grammar in your comments. As long as the grammar on your own page is fine … there are people on the Internet and they write things and it doesn’t always make sense,” Cutts said in a recent Webmaster Chat video. “You can see nonsense comments on YouTube and other large properties and that doesn’t mean that YouTube video will be able to rank.”

The primary exception to this rule is spam comments. While nonsense comments or poorly written comments aren’t much of a problem for you, spam comments should still be removed to protect your SEO and generally improve user experience and site quality.

“Just make sure that your own content is high-quality. You might want to make sure that people aren’t leaving spam comments, if you’ve got a bot, then they might leave bad grammar,” Cutts said. “But if it’s a real person and they are leaving a comment and the grammar is not slightly perfect, that usually reflects more on them than it does on your sites, so I wouldn’t stress out about that.”

So breathe easy, content creators of the internet; as long as you keep putting effort into making great content, you should be safe from any trouble with Google.

Keeping your website design fresh and modern is an important part of your brand, but it is also essential for SEO success. Search engines tend to favor sites that regularly refine their design to offer new features and a better user experience, as Matt Cutts recently confirmed in one of his Webmaster Chat videos.

But, there is a lot to consider before redesigning or modifying your website. A good website should be able to feel modern for at least a couple of years before needing another serious overhaul, and you are investing considerable resources into having the site designed in a way that communicates your brand well while keeping up with modern design styles.

There are also several factors behind the scenes you need to consider. Great usability and style are important, but several modern design practices seemingly go against some of the biggest search engines’ suggested practices. If you aren’t careful, you may do some damage to your SEO while trying to improve your site.

Kannav Chaudhary recently broke down how some of the most popular web design practices of the moment can affect your SEO. Usability and keeping your brand modern are important, but finding the right style for your brand also means choosing the paradigm which won’t hurt your other efforts.

Parallax Design

bagigia

Parallax design recently became popular with web designers for its unique way of restructuring a site in a visually exciting way. You build your entire website on one page, with responsive scrolling that delivers the content in impressive style. Sites with parallax design are incredibly easy for most users to navigate, since they simply scroll through the page, but the approach raises some issues with optimization.

Simply put, most modern SEO practices rely on creating a lot of content over numerous pages to increase the impact of keywords. You show off your skill and reputation through your content, while showing search engines you are relevant for those keywords. When all of your content is on one page, it can dilute the impact of those keywords, and Google can be unsure about how to view your site.

The key is really understanding when to use parallax design. It is great for product or contest pages, because there isn’t much content on those types of sites in the first place. Parallax design can showcase a product and rank for a few key phrases, but it will struggle with presenting a full website to the search engines.

Infinite Scrolling

Etsy

If you are pumping out a lot of content on a regular basis, but want it to be easily available from a single page, infinite scrolling can be the perfect solution. Social media sites like Facebook and Twitter popularized the design practice, but it can be found all over the web these days, especially on blogs.

If you use the wrong method of implementation for infinite scrolling, you may run into some SEO issues, but the current practices avoid the lion’s share of drawbacks. Most web designers use frameworks such as Backbone or Bootstrap with crawlable AJAX so you can present your information on one page, while avoiding the problems of parallax design. Best of all, it loads quickly, so everyone will be happy.
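As a rough illustration of the crawlable approach described above, here is a minimal sketch in which every chunk of infinitely scrolled content corresponds to a real paginated URL reachable by both users and crawlers. The function names and the `/page/N` URL scheme are our own assumptions for the example, not part of any particular framework:

```javascript
// Pure helper: derive the next paginated URL from the current one.
// "/blog" -> "/blog/page/2", "/blog/page/2" -> "/blog/page/3"
function nextPageUrl(current) {
  const match = current.match(/\/page\/(\d+)\/?$/);
  const page = match ? parseInt(match[1], 10) + 1 : 2;
  return current.replace(/\/page\/\d+\/?$/, "") + "/page/" + page;
}

// In the browser, each appended chunk updates the address bar, so every
// scroll position maps to a real, indexable URL a crawler could also fetch.
function appendNextPage(container) {
  const url = nextPageUrl(window.location.pathname);
  fetch(url)
    .then((res) => res.text())
    .then((html) => {
      container.insertAdjacentHTML("beforeend", html);
      history.pushState({}, "", url); // keep the URL in sync for indexing
    });
}
```

The key design point is that the paginated pages exist on the server regardless of JavaScript, so a crawler that never scrolls still sees all of the content.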

Fixed Width Navigation


Navigation will always be an important part of web design, and lately many designers have been using fixed width navigation to keep their menus in place while users move down the page. This way, you can always jump to whatever part of the site you want to find, even when you’re at the bottom of an article.

Thankfully, this design practice has very little effect on SEO. Your content will still be spread over plenty of pages, but you’ll want to make sure your navigation widget is indexable so that Google can also explore your site.

Conclusion

At the end of the day, you’ll always want to fully understand the new design trends before implementing them for your brand. Most of the time their SEO drawbacks can be mitigated with careful practice, but occasionally you will find one that just isn’t right for your site. As long as you keep user experience as the highest priority, you’ll be able to manage any of the SEO problems that pop up along the way.

As social media has grown, there has been a consistent debate over whether Google considers social signals when ranking websites. Several studies have suggested a correlation between a strong social media presence and high rankings on the search engine, but there are many reasons the two could be related. Well, Google’s head of search spam, Matt Cutts, may have finally put the question to rest with his recent Webmaster Chat video.

According to Cutts, Google doesn’t give any special treatment to websites based on social information. In fact, sites like Facebook and Twitter are treated the same as any other website. The search engine doesn’t do anything special such as indexing the number of likes or shares a page has.

Cutts explains that Google did at one point attempt to index social information. Barry Schwartz suggests Matt is referring to Google’s real time search deal expiring with Twitter. There was a lot of effort and engineering put into the deal before it was completely blocked and nothing useful came to fruition. Simply put, Google doesn’t want to invest more time and money into it only to be blocked again.

Google is also worried about crawling identity information only to have that information change long before Google is able to update it again. Social media pages can be incredibly active, and the search engine may not be able to keep up. Outdated information can be harmful to people and to user experience.

But, you shouldn’t count social media out of your SEO plan just because it isn’t directly included in ranking signals. Online marketers have known about the numerous other benefits of social media for a long time, and it is still a powerful tool you can use to boost your online presence and visibility.

A strong social media presence opens up many channels of engagement with your audience that can make or break your reputation. It can also drive huge amounts of traffic directly to your site and your content. By reaching out and interacting with your audience, you make people trust and value your brand, while also encouraging them to explore your site and the content you offer. Google notices all this traffic and activity on your site and rewards you for it as well.

You can see the video below:

Usually Matt Cutts, esteemed Google engineer and head of Webspam, uses his regular videos to answer questions which can have a huge impact on a site’s visibility. He recently answered questions about using the Link Disavow Tool if you haven’t received a manual action, and he often delves into linking practices which Google views as spammy. But earlier this week he took to YouTube to answer a simple question and give a small but unique tip webmasters might keep in mind in the future.

Specifically, Cutts addressed the need to have a unique meta tag description for every individual page on your site. In an age where blogging causes pages to be created every day, creating a meta tag description can seem like a fruitless time-waster, and according to Cutts it kind of is.

If you take the time to create a unique meta tag description for every page, you might see a slight boost in SEO over your competitors, but the difference will be negligible compared to the other aspects of your site you could spend that time improving. In fact, overall it may be better to simply leave the meta description empty than to invest your time in such a small detail. On his own blog, Cutts doesn’t bother to use meta descriptions at all.

Cutts does say that you shouldn’t try to skimp on the meta tag descriptions by using copy directly from your blog. It is better to have no meta tag description than to possibly raise issues with duplicate content, and Google automatically scans your content to create a description any time you don’t make one.
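Cutts’ two rules, write a unique description or write none at all, can be captured in a tiny templating helper. This is only an illustrative sketch (the function and its duplicate check are our own invention, not anything Google publishes beyond the advice above):

```javascript
// Emit a meta description tag only when a unique, hand-written summary
// exists; otherwise emit nothing and let the search engine generate one.
function metaDescriptionTag(summary, bodyText) {
  if (!summary) return ""; // no tag is better than a duplicated one
  // Guard against pasting the opening of the post into the tag verbatim,
  // which creates exactly the duplicate content Cutts warns about.
  if (bodyText && bodyText.startsWith(summary)) return "";
  return `<meta name="description" content="${summary}">`;
}
```

A blog template could call this per post, so pages with hand-written summaries get the tag and everything else falls back to Google’s auto-generated snippet.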

Google’s Matt Cutts

With the big crackdown on spammy link building practices over the past two years at Google, there are still many webmasters left with questions about what exactly constitutes a spammy practice. Google has previously advised against using links in forum “signatures” as a means of link building, but what about using a link in a comment when it is topically relevant and contributes to the conversation? That is exactly the question Matt Cutts answered in a Webmaster Chat video on Wednesday.

The short answer is that using links to your site in your comments is fine the majority of the time. Everyone who actually contributes to forums has a habit of linking to relevant information, and that often includes their own blogs. But, like everything, it can be abused.

Matt gave some tips to ensure your comments don’t get flagged as spammy by Google or the sites you are commenting on.

  • If you can, use your real name when commenting. Using a company name or anchor text you want to rank for gives the appearance of commenting for commercial marketing purposes, which raises the spam alarm.
  • If leaving links in blog post comments is your primary means of link building and the majority of your links come from blog comments, Google will probably flag you.

You can see the video below.


If there is one way to concisely explain the changes Google’s search algorithms have gone through in the past couple of years, it would boil down to “bigger is not always better.” Gone are the days when you could jam as many keywords as you could fit into a paragraph of text, or buy up countless thousands of links and hope to rank highly.

However, the more you do to offer quality content and information to your users while staying in line with Google’s practices, the more success you’ll see.

Those two ideas are fairly common knowledge now, but they have created their own fair share of questions. Where should the balance between quantity and quality lie? How is this content evaluated? Does quantity of content outweigh quality of content?

Google has given some insight into how content is evaluated in the past, and it is clear that you won’t get far with an excessive amount of paper-thin content. Still, the number of indexed pages your site has does indeed have an effect on your ranking. So how exactly does this work and what is the balance?

Matt Cutts, Google’s head of Webspam, addressed this type of issue head-on in his most recent Webmaster Chat video. He was asked, “Does a website get a better overall ranking if it has a large amount of indexed pages?”

Cutts explained that having more indexed pages isn’t a magic ticket to higher rankings. He said, “I wouldn’t assume that just because you have a large number of indexed pages that you automatically get a high ranking. That’s not the case.”

However, having more indexed pages does have some clear benefits. The more pages you have, the more opportunities you have to rank for different keywords. But, this is only because you should be covering a larger variety of keywords and topics across that larger group of pages.

A larger number of indexed pages is also likely to improve your overall links and PageRank, which can affect your ranking. But, the link isn’t direct. Simply having more pages won’t improve much for you. Instead, you have to use those extra pages to deliver valuable content and information to your users. If you’re just filling your site with a meaningless wealth of pages to be indexed, you won’t be seeing any improvement anytime soon.

There has been quite a bit of speculation ever since Matt Cutts publicly stated that Google wouldn’t be updating the PageRank meter in the Google Toolbar before the end of the year. PageRank has been assumed dead for a while, yet Google refuses to issue the death certificate, assuring us they currently have no plans to outright scrap the tool.

Search Engine Land reports that yesterday, speaking at Pubcon, Cutts finally explained what is going on and why there have been no updates. Google’s ability to update the toolbar is actually broken, and repairing the “pipeline” isn’t a major priority by any means. The search engine feels that too many marketers obsess over PageRank, while Google itself doesn’t see it as very important.

But, Cutts did give some insight as to why Google has been hesitant to completely kill off PageRank or the toolbar. They have consistently maintained they intend to keep the meter around because consumers actually use the tool almost as much as marketers. However, at this point that data is nearly a year out of date, so suggesting consumers are the main motive for keeping PageRank around is disingenuous.

No, it turns out Google actually uses PageRank internally for ranking pages, and the meter has been consistently updated within the company during the entire period the public has been waiting for an update. It is also entirely possible Google likes keeping the toolbar around because Google wants the data users are constantly sending back to the search engine.

While the toolbar may be useful for the company internally, PageRank has reached the point where it needs to be updated or removed. Data from a year ago isn’t reliable enough to offer anyone much value, and most browsers have done away with installable toolbars anyway. If a repair isn’t a high enough priority for Google to get around to it at all this year, it probably isn’t worth leaving the toolbar lingering around forever.

Leave it to Matt Cutts to always be there to clear the air when there is an issue causing some webmasters confusion. One webmaster, Peter, asked Matt Cutts whether geo-detection techniques are actually against Google’s policies, as it is common for websites to be designed so that users are given the information (price, USPs) most relevant to them based on geo-location.

In some understandings of Google’s policies, this may be against the rules, but it turns out all is fine, so long as you avoid one issue.

In one of his Webmaster Chat videos, Cutts explained that directing users to a regional version of a site, or delivering specific information based on location, is not spammy or against Google’s policies. It only makes sense to offer viewers information that actually applies to their lives.

What Google does consider spam is directing its crawlers, or GoogleBot, to a page of content that users cannot see. Sending GoogleBot to a different location than what visitors see is a bad idea, and is considered spam, or a form of cloaking. Instead, treat GoogleBot as you would any user: check its location information and send the crawler to the normal page reflecting that data.
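To make the distinction concrete, here is a minimal sketch of location-based routing done the acceptable way. The helper names (`lookupCountry`, the supported-region list) are assumptions for illustration; the point is that the routing decision depends only on the visitor’s IP-derived location, never on whether the visitor happens to be GoogleBot:

```javascript
// Choose a regional page from the visitor's location alone. Because the
// user agent is never consulted, GoogleBot gets routed exactly like a
// human visitor coming from the same place, so no cloaking is involved.
function regionalPath(ip, lookupCountry) {
  const country = lookupCountry(ip); // assumed GeoIP helper, e.g. returns "uk"
  const supported = new Set(["us", "uk", "de"]);
  return country && supported.has(country) ? `/${country}/` : "/us/";
}
```

The cloaking case Cutts describes would be the opposite pattern: branching on something like a “Googlebot” user-agent check to serve the crawler a page that human visitors never see.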