Newcomers to SEO can often feel intimidated by the complex, lingo-filled field of optimization. There are countless articles explaining seemingly complicated concepts, constantly changing best practices, and repeated warnings about the cost of making mistakes. It all makes SEO seem high-risk and altogether frightening.

But SEO doesn’t have to be that way. You can dive straight down the rabbit hole and try to make sense of it all yourself, but there are also many resources that break it all down in simple terms, if you know where to find them. Pan Galactic Digital is one of those resources, regularly publishing “SEO 101” articles aimed at consumers, beginners, and anyone else interested.

SEO can seem unwieldy because of the countless ranking signals Google and Bing use to rank sites, but in practice it can all be grouped into two categories: on-page and off-page SEO.

On-Page SEO

On-page SEO is exactly what it sounds like: all the areas you can optimize directly on each individual page of your site, chiefly content, code, and site architecture.

You are most likely familiar with content, because it is what you interact with most often. Specifically, content is everything you read and all the images you see directly on the page. Are they high quality? Does the text inform and engage? Is the content using keywords you would want to rank in a search engine, but not overusing them so that it feels unnatural? High quality content offers value to the viewer.

You can also optimize some of the HTML code on your page around chosen keywords. Titles, image alt attributes, and headers can all be optimized to help search engines understand what your site is and what it offers. However, just as with content, overstuffing these areas with keywords isn’t advised.
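
To make that concrete, here is a rough sketch of the HTML elements mentioned above, optimized around a single keyword (the business name, file name, and keyword are all made up for illustration):

```html
<head>
  <!-- Title tag: the keyword appears once, near the front, phrased naturally -->
  <title>Handmade Leather Wallets | Example Leather Co.</title>
</head>
<body>
  <!-- Header: a single H1 stating the page's topic -->
  <h1>Handmade Leather Wallets</h1>
  <!-- Image alt text: describes the image plainly, no keyword stuffing -->
  <img src="wallet-brown.jpg" alt="Hand-stitched brown leather bifold wallet">
</body>
```

Notice that the keyword appears once in each element; repeating it in every sentence or alt attribute is exactly the kind of stuffing search engines penalize.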

Lastly, you have your site architecture: how your site is laid out, including proper URL management and page loading speed. You want to make sure search engines can easily access your site once they’ve found you, and you will want your URLs to incorporate important keywords that are relevant to your content.

Off-Page SEO

Off-page SEO is everything that happens outside your site, which, if done properly, is actually fairly out of your direct control. The most talked-about example is linking. Links to your site from other sites have historically been highly favored by Google as a sign of a quality site; they act roughly like votes in your favor. But these signals have been partially demoted because numerous optimizers attempted linking schemes such as buying links or syndicating content on other websites in inappropriate ways. Don’t ever buy links or try to exploit a loophole to get them. They must be earned.

One of the fastest-growing off-page SEO signals is social media. It isn’t entirely clear how much social media presence directly affects your rankings versus how much it simply brings you traffic, but there’s no arguing that well-managed social media is a great tool for your website and business presence.

Obviously, you can dig deeper into these two categories and find far more ways that Google and Bing decide where to rank your webpages, but the basic factors above are by far the most important. If you can get a handle on them, you will be well suited for anything else you encounter moving forward.

How fast does your website load on mobile devices? Under five seconds? If you said yes, you are probably pretty happy with your answer. What about under one second? Probably not. But that is how fast Google says sites should load, according to its newest guidelines for mobile phones.

Before you start freaking out at the suggestion that your site is supposed to load in under a second, be clear that Google isn’t mandating an insane guideline. They don’t actually expect most websites to completely load that quickly. Instead, they are focusing on the “above the fold” content: they think users should be able to start interacting with your page quickly, while the rest loads progressively.

It is probably a wise approach, considering most mobile users say they are more likely to leave a site the longer it takes to load. On smartphones, every second really counts, and if you can get the above-the-fold content loaded within a second, most users will happily wait for the rest of the content while they start exploring.

The update reads:

“…the whole page doesn’t have to render within this budget, instead, we must deliver and render the above the fold (ATF) content in under one second, which allows the user to begin interacting with the page as soon as possible. Then, while the user is interpreting the first page of contents, the rest of the page can be delivered progressively in the background.”
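
In practice, one common way to stay inside that one-second ATF budget (a sketch, not markup Google prescribes) is to inline the small amount of CSS the above-the-fold content needs and defer everything else so it loads in the background:

```html
<head>
  <!-- Inline just the CSS required to render the above-the-fold content -->
  <style>
    .hero { font: 1.5em/1.4 sans-serif; padding: 1em; }
  </style>
  <!-- Defer non-critical scripts so they don't block the first render -->
  <script src="analytics.js" defer></script>
</head>
```

The `defer` attribute tells the browser to download the script without pausing page rendering, which is exactly the progressive delivery the guideline describes.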

To match the new guidelines, Google also updated its PageSpeed Insights tool to emphasize mobile scoring and suggestions over desktop scoring, and updated the scoring and ranking criteria to reflect the guideline changes.

Manual Actions Viewer Screenshot

Google has long alerted webmasters when it placed a manual action against a site, but last week it became even easier to know for sure whether a site’s search rankings are being penalized by a manual action. The search engine has added a new feature to Webmaster Tools called the Manual Actions viewer.

The Manual Actions viewer sits under the “Search Traffic” tab, and it acts as a complementary alert to the email notifications Google already sends to websites receiving a manual action. With the new tool, webmasters don’t have to wait for an email; they can check their site’s condition at any time.

According to Google, less than two percent of all domains within its index are manually removed for spammy practices, so most legitimate webmasters will never see anything within the tool other than a display reading “No manual webspam actions found.”

However, for those who get targeted for spammy practices, the Manual Actions viewer will show existing webspam problems under two headings titled ‘site-wide matches’ and ‘partial matches’. They will also include information on what type of problem exists from a list of roughly a dozen categories including ‘hidden text and/or keyword stuffing’, ‘thin content’, and ‘pure spam’.

For the partial matches listed in the tool, Google also gives access to a list of affected URLs for each type of spam problem. For example, if you have a notification for thin content, you will be able to see all the URLs targeted. There is a limit of 1,000 URLs per problem category, but that should be plenty for all but massive websites like YouTube.

Within the tool, there is also quick access to a new ‘Request a Review’ button that appears any time manual actions are listed. Clicking the button opens a pop-up window that lets you give Google details on how you have resolved the issues.

Recently, Google updated its link schemes web page, which gives examples of what Google considers spammy backlinks. The additions are notable: article marketing and guest posting campaigns with keyword-rich anchor text are now included, as are advertorials with paid links and links with optimized anchor text in press releases or articles.

With all the new additions, it can be hard to keep up to date with what Google is labeling spammy backlinks or backlink schemes. But, Free-SEO-News’ recent newsletter simply and efficiently lays out the 11 things that Google doesn’t like to see in backlink campaigns.

  1. Paid Links – Buying or selling links that pass PageRank has been frowned upon for a long time. This includes exchanging money for links or posts that contain links, sending ‘free’ products in exchange for favors or links, or directly exchanging services for links. It is pretty simple: buying links in any way will get you in trouble.
  2. Excessive Link Exchanges – While exchanging links with other relevant websites in your industry is absolutely normal, overusing those links or cross-linking to irrelevant topics is a big sign of unnatural linking. Simple common sense will keep you out of trouble: just don’t try to trick the system.
  3. Large-Scale Article Marketing or Guest Posting Campaigns – Similar to the last scheme, posting your articles and guest posts on other websites is perfectly normal. However, doing it in bulk or posting the same articles to numerous websites will look like blogspam to Google. Also, if you do guest posts just to get keyword-rich backlinks, you will see similar penalties. Only publish on other websites when it makes sense and offers value.
  4. Automated Programs or Services to Create Backlinks – There are tons of ads for tools and services that promise hundreds or thousands of backlinks for a low price and very little work. While they may do what they say, Google also easily spots these tools and won’t hesitate to ban a site using them.
  5. Text Ads That Pass PageRank – If you’re running a text ad on another website, you have to make sure to use the rel=nofollow attribute, otherwise it appears to be a manipulative backlink.
  6. Advertorials That Include Links That Pass PageRank – If you pay for an article or ad, always use the rel=nofollow attribute. Simply put, if you paid for an ad or article, it won’t do you any good and can bring a lot of damage if you don’t use the attribute.
  7. Links with Optimized Anchor Text in Articles or Press Releases – Stuffing articles and press releases with optimized anchor text has been a strategy for a long time, but Google has shut it down recently. If your page has a link every four to five words, you’re probably looking at some penalties.
  8. Links From Low Quality Directories or Bookmark Sites – Submitting your site to hundreds of internet directories is an utter waste of time. Most links won’t ever get you a single visitor and won’t help your rankings. Instead, only focus on directories that realistically could get you visitors.
  9. Widely Distributed Links in the Footers of Various Websites – Another older trick Google has squashed is placing tons of keyword-rich links to other websites in the footer. These links are almost always paid links and an obvious sign of link schemes.
  10. Links Embedded in Widgets – It isn’t uncommon for widget developers to offer free widgets that contain links to other sites. It also isn’t uncommon for these developers to reach out to site owners and offer to advertise through these widgets. However, Google hates these links and considers them a scheme. I’d suggest against it, but if you do advertise through these widgets, use the nofollow attribute.
  11. Forum Comments With Optimized Links in the Post – It is very easy to get a tool that automatically posts to forums and include links to websites. It is a pretty blatant form of spam which won’t get any actual visibility on the forums and the links are more likely to get you banned than draw a single visitor.
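
Several of the items above (paid text ads, advertorials, widget links) come down to the same fix: add the rel="nofollow" attribute so the link doesn’t pass PageRank. The URL and anchor text here are just placeholders:

```html
<!-- A paid text ad: nofollow tells search engines not to pass PageRank -->
<a href="https://example.com/product" rel="nofollow">Sponsor's product page</a>
```

With nofollow in place, the ad can still bring you visitors; it just stops pretending to be an editorial endorsement.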

There’s a pretty obvious underlying trend in all of these tactics that Google fights: they all attempt to create artificial links, usually in bulk. Google can tell the quality of a link, and all of these schemes are easily identifiable. Instead, focus on building legitimate, quality links, and use respected tools such as SEOprofiler. It will take longer, but your site will do much better.

SEO Magnifying Glass

Source: Flickr

Search engine optimization (SEO) isn’t the easiest thing to get into, even though it is one of the most important things you can learn when starting an online business or building a website for your company. It isn’t that SEO is too difficult for most to learn; it is simply that most people in the industry have been working in it for so long that even the basic guides often come out overly complicated.

SEO is extremely important for bringing in new customers and being found online. In basic terms, SEO means alerting search engines to the existence of your site and telling them what it’s about. This way, search engines can rank the quality of sites and decide where you belong in the results. Of course, the higher you are in the search results, the more people will come to your site.

Daily SEO Tip categorizes SEO into four basic parts: keywords, content, links, and relevance. If you understand each of these components, you are well on your way to setting up your search engine optimization.

Keywords

Keywords act as the main ingredients of your website. The number of keywords you have, their relevance, and how often you use them all play a role in how a search engine determines your site’s quality.

  • Make sure all keywords you use are directly related to your service, brand, or product. Keep them specific to what you do, not just the broad industry you work in.
  • There is a practice called keyword stuffing that can get you into a lot of trouble. Keyword stuffing is the practice of overusing keywords in order to trick search engines. But, the search engines are very smart and will quickly see that you’re using words out of context or unnecessarily.

Content

Search engines are basically rating your website, and content is the main thing they judge. The engines want to show searchers sites with valuable information. That doesn’t mean content that sells to the user; it should offer something of real value such as informative videos, up-to-date news, or helpful tutorials. That kind of content establishes you as an expert in your field and raises your site’s reputability with search engines.

Linking

Ratings are partially decided by how many inbound links a website has. They serve essentially as arrows directing the search engines to your site, and they follow the theory that if people are linking to your site, there must be something of value there. Inbound links also show that you aren’t an isolated, spammy site in the internet ether, which is why you should also include links on any social media sites (aside from simply helping visitors find your business).

Relevance

Relevance is less of a concrete component of SEO, but it factors into every facet of the work. Search engines spend the majority of their time fighting spam, and irrelevant content, keywords, or links are a huge red flag that a site may not be reputable. Search engines assume webpages deal with specific topics, be it news, jewelry, or a Buffy the Vampire Slayer fanpage. By keeping your content relevant to your topic, you show search engines that you are focused, professional, and informative.

Conclusion

If you can get a handle on these four basic ideas, you will have a solid grasp of how SEO functions and how to get your site showing up on search engines, bringing in new visitors and potential customers. SEO can be a broad, complicated topic, but the basics tend to stay the same. Follow these principles, and you’ll be able to figure out the rest.

Google is great for searching for quick and concise information, such as what you might find on Wikipedia or IMDB. If you have a simple question with an objective answer, the biggest search engine is the perfect tool. But, as anyone who has tried to do actual research for academics or work will tell you, Google is not so great with providing lots of in-depth content.

You might find news, or maybe a couple of books and articles on Google Scholar, but the main search results on Google can be limiting. You are getting results from the people with the best short answers to your questions. Now Google is trying to change that, as it estimates that roughly 10% of searchers are looking for more comprehensive information.

Google has announced that over the next few days they will be rolling out “in-depth articles” within the main search results. Now, when searching for broader topics that warrant more information such as stem cell research or abstract topics such as happiness or love, Google will feature a block of results in the middle of the page like below.

In-Depth Articles Screenshot

Source: The Official Google Search Blog

This is yet another way Google is forming results intuitively tailored to fit the type of information you are searching for. As a Google spokesperson told Search Engine Land, “Our goal is to surface the best in-depth articles from the entire web. In general, our algorithms are looking for the highest quality in-depth articles, and if that’s on a local newspaper website or a personal blog, we’d like to surface it.”

It has always been a little unclear how Google handles its international market. We know it has engineers across the world, but anyone who has tried to search from outside the US knows the results can look like what Americans saw five years ago: a few good options mixed with a lot of spam. That’s a bit of a hyperbole, but Matt Cutts says we can expect it to keep getting better moving forward.

According to Cutts’ recent Webmaster Help video, Google does fight spam globally using algorithms and manual actions taken by Google employees stationed in over 40 different regions and languages around the world. In addition, they also try to ensure all of their algorithms will work in all languages, rather than just English.

SEO Roundtable points out you could see the international attention to Google’s algorithms when Panda originally rolled out. At first it only affected English queries, but it was released for other languages quickly after. With Penguin’s release, however, all countries saw the rollout on the same day.

Matt Cutts did concede that English-language queries in Google do receive more attention, which has always been fairly obvious and understandable. There are far more searchers in English, and it is the native language of the majority of the company’s engineers.

If you’ve ever worked with PPC, you know how important “landing pages” can be. Google partially decides where a paid ad will appear and how much each click costs based on the quality of the landing page that ad leads to. Similarly, SEO professionals surely know all about “optimized pages” and how Google analyzes them for the SERPs. However, Stoney deGeyter from Search Engine Land says we should stop thinking of landing pages and optimized landing pages as different things. Now is the time for an optimized landing page.

SEO and PPC have always worked very closely, and in this case they overlap to the point where keeping them separate is doing a disservice to you and your site. Landing pages need to be optimized and optimized pages need to be respectable landing pages. Merging the two concepts into one idea simply makes sense.

So what does an optimized landing page look like? It simply shifts the intent of the optimization toward conversions. While SEO-optimized pages are intended to rank highly, they can and should serve the additional purpose of getting users to perform whatever action you desire, such as purchasing a good or service or signing up for an email list. To do this, you just need a few things.

  • Compelling, Keyword Focused Title Tag – The title tag is probably the most important 8-10 words you will write when optimizing your page. Not only does it need to be keyword focused, but it also needs to be interesting enough for searchers to choose your link over the others in the search results. Anyone could do one or the other, but achieving both at the same time is tricky.
  • Well-Written Description – Meta descriptions may not be important for rankings, but that shouldn’t diminish their importance for SEO. It displays in the search results and gets people to click to your page, so it is automatically essential for proper search engine optimization. It is also a great place for a strong call-to-action for the searcher.
  • Keyword Focused Headline – Headlines are the first thing users see when hitting your landing page from a search engine, so it is important for the keyword to be relevant, if not similar, to what was listed on the results page. It should also be wrapped in an H1 tag for proper optimization. Proper heading and sub-heading use helps search engines and visitors alike determine what type of content you are offering and decide whether they will stay on the page. Make yours compelling.
  • Topically Focused Content Concentrating on Benefits – For anyone to stay on your page, you need to keep your content on topic and interesting. Wandering off on tangents or not getting to the point will lose your visitors. Your content can be long, but it must also be trimmed of all excess. Not only that, but the value of your content should be readily available. Customers want to know what they will be getting from the content. Being positive and focusing on real tangible benefits will keep readers and consumers interested.
  • Keep Your Content Scannable – Even long content needs to be scannable so users can find what they want without hassle. Even interested visitors might not care about everything on your page. Keep your pages cleanly laid out, and clearly divide your content with sub-headlines that show the users where they want to look. White space and line spacing can be especially important to overall readability.
  • Call-to-Action – Without a call-to-action, there isn’t even a reason to have a landing page. Each page should have a goal that comes with a desired action or results that you want each visitor to take. The landing page should be a first step, not the only one. The only way to accomplish this is by clearly showing users what you want them to do. Whether you want them to share your content, sign up, or purchase, make it obvious.
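
Pulled together, a bare-bones skeleton of an optimized landing page covering the elements above might look like this (the business name, keyword, and copy are all hypothetical):

```html
<head>
  <!-- Compelling, keyword-focused title tag -->
  <title>Custom Wedding Cakes in Austin | Sweet Example Bakery</title>
  <!-- Well-written meta description with a built-in call-to-action -->
  <meta name="description"
        content="Award-winning custom wedding cakes, baked fresh in Austin. Book a free tasting today.">
</head>
<body>
  <!-- Keyword-focused headline wrapped in an H1 -->
  <h1>Custom Wedding Cakes in Austin</h1>
  <!-- Benefit-focused, scannable content under sub-headlines -->
  <h2>Why couples choose us</h2>
  <p>Fresh ingredients, your design, delivered on time.</p>
  <!-- A clear call-to-action as the page's first step -->
  <a href="/book-a-tasting">Book your free tasting</a>
</body>
```

The same page serves both masters: the title, description, and H1 carry the keyword for search engines, while the benefits copy and call-to-action do the conversion work.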

There are some other small aspects deGeyter says these pages need, but the ones listed are by far the most essential. Optimized landing pages combine the best of both worlds when it comes to SEO and PPC. They accomplish two missions while saving stress and effort. SEO and PPC have their unique focuses and functions, but sometimes they work best when working together.

By now, the hacker craze of the ’90s and early 2000s has died down quite a bit. Most people don’t worry about hackers all that much, so long as they use solid anti-virus software and keep their router protected. Big businesses may have to worry about Anonymous’ hijinks, but the common person doesn’t tend to be concerned with the issue. Hacking especially doesn’t seem like a big issue for SEO, at first.

But hackers can actually do your site real damage, and can even get it dropped entirely from the Google search index. Sites get blacklisted when hackers inject malicious code onto servers, as Google seeks to protect searchers’ computers from being compromised.

While Google doesn’t immediately drop sites from their index, being blacklisted leads to a complete drop in organic traffic and can be a crisis for SEO. Blacklisting starts as a warning to searchers that a site may be compromised, and few will continue past that alarm.

This has become a rather significant problem for Google. To help provide wide support for the increasing number of webmasters dealing with compromised servers, Google has launched the ‘Webmasters Help for Hacked Sites‘ support center. They give detailed information on how to clean and repair your server and prevent your site from getting entirely dropped from the Google index.

If you think this sort of hacking isn’t a big deal, check out the charts below. They show just how frequent this type of malicious activity has become. It isn’t just banks and large corporations dealing with it. Small businesses are just as at risk as international franchises. The most common form of attack is an automated set of processes that indiscriminately discover and exploit vulnerabilities on servers, which are often left completely unprotected.

Search Engine Journal recently explored the issue in more depth, unpacking why it is such a large concern to Google and webmasters alike. Compromised sites can destroy a search engine’s credibility just as much as your own, so the problem has to be taken very seriously.

Timer

Source: WikiCommons

Everyone working in SEO knows that Google has a multitude of factors they use to determine the order of search engine results, and the majority of these ranking factors are based on either the content of the webpage or signs of authenticity or reputability. That was the case for the longest time, but since 2010, Google has made significant shifts towards a focus on usability, and the harbinger of this change was the inclusion of website speed to ranking factors.

The problem is, website speed and other usability issues aren’t exactly objectively defined. What exactly is a slow-loading site? What is the cutoff? No one has gotten a definitive answer from Google, but in June Matt Cutts explicitly stated that slow-loading sites, especially on mobile platforms, will begin seeing search rank penalties soon.

Obviously these changes are good for searchers. Searchers want sites that load quickly, offer quality user experience, and deliver great content. And, the emphasis on speed is certainly highlighted on mobile platforms where on-the-go users are likely to go back to the results if the site takes too long for their liking. The issue we face as search optimization professionals is trying to figure out exactly what Google is measuring and how that information is being used.

Matt Peters from Moz decided to break through Google’s intentionally vague information and figure out exactly how site speed affects rankings, with the help of Zoompf. They can’t definitively prove or disprove a causal link between site speed and rankings, given the number of other algorithmic ranking factors that complicate the study. But their results did show very little to no correlation between page load time and ranking.

I wouldn’t take this information as gospel, but it does suggest that loading time isn’t a huge consideration for long-tail searches and doesn’t need to be worried about too much. If your site loads quickly enough to please the people coming to it, it will likely pass Google’s expectations as well.