Google is great for finding quick, concise information, the sort of thing you might find on Wikipedia or IMDB. If you have a simple question with an objective answer, the biggest search engine is the perfect tool. But, as anyone who has tried to do actual research for academics or work will tell you, Google is not so great at providing in-depth content.
You might find news, or maybe a couple of books and articles on Google Scholar, but the main search results on Google can be limiting. You are getting results from the pages with the best short answers to your questions. Now Google is trying to change that, as the company estimates roughly 10% of searches come from people looking for more comprehensive information.
Google has announced that over the next few days they will be rolling out "in-depth articles" within the main search results. Now, when searching for broader topics that warrant more information, such as stem cell research, or abstract topics such as happiness or love, Google will feature a block of results in the middle of the page, like the example below.
Source: The Official Google Search Blog
This is yet another way Google is forming results intuitively tailored to fit the type of information you are searching for. As a Google spokesperson told Search Engine Land, “Our goal is to surface the best in-depth articles from the entire web. In general, our algorithms are looking for the highest quality in-depth articles, and if that’s on a local newspaper website or a personal blog, we’d like to surface it.”
It has always been a little unclear how Google handles its international market. We know they have engineers across the world, but anyone who has tried to search from outside the US knows the results can seem like what Americans would have seen five years ago: a few good options mixed with a lot of spam. That's a bit of hyperbole, but Matt Cutts says we can expect the situation to keep improving going forward.
According to Cutts' recent Webmaster Help video, Google fights spam globally using algorithms and manual actions taken by Google employees covering over 40 different regions and languages around the world. They also try to ensure all of their algorithms work in all languages, rather than just English.
SEO Roundtable points out that you could see this international attention to Google's algorithms when Panda originally rolled out. At first it only affected English queries, but it was released for other languages shortly after. With Penguin's release, however, all countries saw the update on the same day.
Matt Cutts did concede that English-language queries in Google receive more attention, which has always been fairly obvious and understandable. There are far more searchers in English, and it is the native language of the majority of engineers working for the company.
TMO, 2013-08-06: Google Focuses on English Queries, But Fights Spam Globally
If you’ve ever worked with PPC, you know how important “landing pages” can be. Google partially decides where a paid ad will appear and how much each click costs based on the quality of the landing page that ad leads to. Similarly, SEO professionals surely know all about “optimized pages” and how Google analyzes them for the SERPs. However, Stoney deGeyter from Search Engine Land says we should stop thinking of landing pages and optimized landing pages as different things. Now is the time for an optimized landing page.
SEO and PPC have always worked very closely, and in this case they overlap to the point where keeping them separate is doing a disservice to you and your site. Landing pages need to be optimized and optimized pages need to be respectable landing pages. Merging the two concepts into one idea simply makes sense.
So what does an optimized landing page look like? It simply shifts the intent of the optimization toward conversions. While SEO-optimized pages are intended to rank highly, they can and should serve the additional purpose of getting users to perform whatever action you desire, such as purchasing a good or service or signing up for an email list. To do this, you just need a few things.
Compelling, Keyword Focused Title Tag – The title tag is probably the most important 8-10 words you will write when optimizing your page. Not only does it need to be keyword focused, but it also needs to be interesting enough for searchers to choose your link over the others in the search results. Anyone could do one or the other, but achieving both at the same time is tricky.
Well-Written Description – Meta descriptions may not be important for rankings, but that shouldn't diminish their importance for SEO. The description displays in the search results and gets people to click through to your page, so it is essential for proper search engine optimization. It is also a great place for a strong call-to-action for the searcher.
Keyword Focused Headline – Headlines are the first thing users see when hitting your landing page from a search engine, so the keyword should be relevant to, if not the same as, what was shown on the results page. It should also be wrapped in an H1 tag for proper optimization. Proper heading and sub-heading use helps both search engines and visitors determine what type of content you are offering and decide whether they will stay on the page. Make yours compelling.
Topically Focused Content Concentrating on Benefits – For anyone to stay on your page, you need to keep your content on topic and interesting. Wandering off on tangents or not getting to the point will lose your visitors. Your content can be long, but it must also be trimmed of all excess. Not only that, but the value of your content should be readily available. Customers want to know what they will be getting from the content. Being positive and focusing on real tangible benefits will keep readers and consumers interested.
Keep Your Content Scannable – Even long content needs to be scannable so users can find what they want without hassle. Even interested visitors might not care about everything on your page. Keep your pages cleanly laid out, and clearly divide your content with sub-headlines that show the users where they want to look. White space and line spacing can be especially important to overall readability.
Call-to-Action – Without a call-to-action, there isn’t even a reason to have a landing page. Each page should have a goal that comes with a desired action or results that you want each visitor to take. The landing page should be a first step, not the only one. The only way to accomplish this is by clearly showing users what you want them to do. Whether you want them to share your content, sign up, or purchase, make it obvious.
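Much of the checklist above can be sanity-checked automatically before a page goes live. Here is a minimal audit sketch using Python's built-in HTML parser; the `audit` helper and its length thresholds are illustrative assumptions for this article, not Google's actual rules:

```python
from html.parser import HTMLParser

class LandingPageAudit(HTMLParser):
    """Collects the on-page elements discussed above:
    the <title> tag, the meta description, and <h1> headlines."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self.h1s = []
        self._capturing = None  # tag whose text we are currently collecting

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1"):
            self._capturing = tag
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == self._capturing:
            self._capturing = None

    def handle_data(self, data):
        if self._capturing == "title":
            self.title += data
        elif self._capturing == "h1":
            self.h1s.append(data.strip())

def audit(html):
    """Return a list of issues found; an empty list means the basics pass.
    The 10-70 character title band is a rough, assumed rule of thumb."""
    page = LandingPageAudit()
    page.feed(html)
    issues = []
    if not (10 <= len(page.title) <= 70):
        issues.append("title tag missing or badly sized")
    if not page.description:
        issues.append("meta description missing")
    if len(page.h1s) != 1:
        issues.append("expected exactly one H1 headline")
    return issues
```

A script like this only checks presence and size; whether the title is compelling and the call-to-action is obvious still takes a human eye.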
There are some other small aspects deGeyter says these pages need, but the ones listed are by far the most essential. Optimized landing pages combine the best of both worlds when it comes to SEO and PPC. They accomplish two missions while saving stress and effort. SEO and PPC have their unique focuses and functions, but sometimes they work best when working together.
TMO, 2013-08-06: How to Make an Optimized Landing Page Work for PPC and SEO
By now, the hacker craze of the '90s and early 2000s has died down quite a bit. Most people don't worry about hackers all that much, so long as they use some solid anti-virus software and keep their router protected. Big businesses may have to worry about Anonymous's hijinks, but the average person doesn't tend to concern themselves with the issue. At first, hacking especially doesn't seem like a big issue for SEO.
But hackers can actually do your site some damage, and can even get it dropped entirely from the Google search index. Sites get blacklisted when hackers inject malicious code onto servers, as Google seeks to protect searchers' computers from any sort of compromise.
While Google doesn’t immediately drop sites from their index, being blacklisted leads to a complete drop in organic traffic and can be a crisis for SEO. Blacklisting starts as a warning to searchers that a site may be compromised, and few will continue past that alarm.
This has become a rather significant problem for Google. To help provide wide support for the increasing number of webmasters dealing with compromised servers, Google has launched the "Webmasters Help for Hacked Sites" support center. It gives detailed information on how to clean and repair your server and prevent your site from being dropped entirely from the Google index.
If you think this sort of hacking isn't a big deal, consider just how frequent this type of malicious activity has become. It isn't just banks and large corporations dealing with it; small businesses are just as at risk as international franchises. The most common form of attack is an automated set of processes that indiscriminately discovers and exploits vulnerabilities on servers, which are often left completely unprotected.
Search Engine Journal recently explored the issue in more depth, unpacking why it is such a large concern to Google and webmasters alike. Compromised sites can destroy a search engine's credibility just as they can your own, so the problem has to be taken very seriously.
TMO, 2013-08-05: Can Hacking Get You Blacklisted by Google?
Everyone working in SEO knows that Google has a multitude of factors they use to determine the order of search engine results, and the majority of these ranking factors are based on either the content of the webpage or signs of authenticity or reputability. That was the case for the longest time, but since 2010, Google has made significant shifts towards a focus on usability, and the harbinger of this change was the inclusion of website speed to ranking factors.
The problem is, website speed and other usability issues aren't exactly objectively defined. What exactly is a slow-loading site? What is the cutoff? No one has gotten a definitive answer from Google, but in June Matt Cutts explicitly stated that slow-loading sites, especially on mobile platforms, will begin seeing search rank penalties soon.
Obviously these changes are good for searchers. Searchers want sites that load quickly, offer quality user experience, and deliver great content. And, the emphasis on speed is certainly highlighted on mobile platforms where on-the-go users are likely to go back to the results if the site takes too long for their liking. The issue we face as search optimization professionals is trying to figure out exactly what Google is measuring and how that information is being used.
Matt Peters from Moz decided to cut through Google's intentionally vague statements to figure out exactly how site speed affects rankings, with the help of Zoompf. They can't definitively rule out a causal link between site speed and rankings, given the number of other algorithmic ranking factors that complicate the study. But their results showed very little to no correlation between page load time and ranking.
I wouldn't take this information as gospel, but it does suggest that loading time isn't a huge factor in long-tail searches and isn't something to worry about too much. If your site loads quickly enough to please the people coming to it, it will likely pass Google's expectations as well.
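Moz's exact methodology isn't spelled out here, but the core of such a study is a rank correlation between page load times and SERP positions. A minimal sketch in plain Python, with made-up sample data, of the Spearman correlation such a study would compute:

```python
def ranks(values):
    """Rank positions (1 = smallest value), averaging tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    result = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1  # extend over a run of ties
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            result[order[k]] = avg_rank
        i = j + 1
    return result

def spearman(x, y):
    """Spearman's rho: the Pearson correlation of the rank-transformed data."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical sample: page load times (seconds) vs. SERP position (1 = top).
load_times = [0.4, 0.8, 1.2, 1.6, 2.0, 2.4]
positions = [6, 1, 2, 5, 3, 4]
rho = spearman(load_times, positions)  # close to zero: little monotonic relationship
```

A rho near zero, as Moz reported, means load time and position move independently in the sample; values near +1 or -1 would indicate a strong monotonic relationship.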
TMO, 2013-08-01: How Does Site Speed Really Affect Search Rankings?
Internet security and privacy have been at the forefront of many people's minds with the recent headlines about the NSA keeping data on the public's online activity, and the issue has had subtle effects on search engines. We've seen a small group of searchers migrating to search engines with stricter privacy policies. Of course, those truly outraged by the NSA news would expect to see a pretty large shift, but so far the change has been slow. It is, however, picking up momentum.
More and more people are learning about how Google actually decides which results to show you, as an individual, and many are a little concerned. While Google sees the decision to collect data on users as an attempt to individually tailor results, a few raise their eyebrows at the idea that a search engine and huge corporation is keeping fairly detailed tabs on the internet activities of users. The internet comes with an assumption that our activity is at least fairly private, though that notion is getting chipped away at daily. But, there is still the widespread assumption that our e-mails or simple search habits are our business alone, an assumption that is also being proved wrong.
These privacy issues have a fair number of people looking for search engines that keep searches completely anonymous and don't run data collection processes. The most notable alternative people seem to be moving to is DuckDuckGo.com, a search engine whose privacy policy states it will not retain any personal information or share that information with other sites. The search engine has seen a traffic rise of close to 2 million searches per day since the NSA scandal broke.
There are numerous debates surrounding these issues. Political discourse focuses on the legality and ethical aspects of the government and large corporations working together to collect information on every citizen of the United States (other companies included in the NSA story include Yahoo, Facebook, and Microsoft). But, as SEO professionals, the bigger question is the ethical and practical reality of individually tailored results which rely entirely on data collection.
If you’ve ever taken a look at the ads on the edges of websites, you’ve probably noticed that the ads are loosely based on your personal information. The ads reflect your gender, age, location, and sometimes loose search histories. The ads you are shown are chosen based on information your computer relays to almost every site you access. Google acts the same way, but they collect this data and combine it extended data of your search history to deliver search results they believe are more relevant to you.
There is a practicality to this. We all have finely tuned personal tastes, and innately we want search engines to show us exactly what we want with the first result, every time. While poll responses say that the majority of people don't want personalized search results, our online actions belie our true desire for efficient search. The best way to deliver this is to gather data and use it to fine-tune results. On a broad scale, we don't want results for a grocery store in Los Angeles when we are physically in Oklahoma. On a smaller scale, we don't want Google showing us sites we never visit when our favorite resource for a topic is a few results down the page.
In this respect, the move towards search engines like DuckDuckGo is actually a step back. These privacy-focused search engines are essentially acting how Google used to. They use no personal information, and simply try to show the best results for a specific search. It is a trade of privacy for functionality, and this could possibly explain the slow uptake or migration to these types of search engines. But, people are moving.
The longer the NSA story stays in the news, the more searches DuckDuckGo receives, and this could have a significant effect on the search market in the future. The question is: do we want to sacrifice personal privacy and assumed online anonymity for searches that match our lives? Andrew Lazaunikas recently wrote an article on the debate for Search Engine Journal. He admits DuckDuckGo delivers excellent, unbiased results, but in the end, "when I want to know the best pizza place or car dealer in my area, the local results that Google and Bing shows are superior."
Lazaunikas isn't deterred by this, noting, "I can still get the information I need from DuckDuckGo by modifying my search." He ends by vowing to use DuckDuckGo more in the future, but the question is whether the public at large will follow. For the moment, it seems most people prefer quick, easy searches and familiarity over trying out these new search engines.
TMO, 2013-07-31: Will You Alter Your Searches After The Internet Privacy Scandal?
After two fairly explicit warnings about advertorials this year, Google has added advertorials to their webmaster guidelines, as well as other popular spammy linking techniques in the Link Schemes help document.
Google Continues To Downplay Links
The biggest change is the removal of the entire first paragraph from the help article, which addressed how incoming links influence rankings. Search Engine Journal says the removed paragraph read:
Your site’s ranking in Google search results is partly based on analysis of those sites that link to you. The quantity, quality, and relevance of links influences your ranking. The sites that link to you can provide context about the subject matter of your site, and can indicate its quality and popularity.
Links have been steadily falling out of favor throughout the past few years, and it appears we are finally reaching a tipping point for Google’s reduction of linking’s role in search algorithms. Or, as Google has been advising, high-quality sites matter much more than links of any quality.
Keyword-Rich/Optimized Anchor Text Links
Google also tackled heavily optimized anchor text in press releases that are distributed across other sites. The technique had enjoyed a quick rise in highly competitive markets, and Google appears to finally be squashing the practice. They did note that guest posting is still a popular practice, and one that can be valuable when done correctly; sites that accept guest posts have increasingly been using nofollow attributes or plain URL links to avoid issues.
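Mechanically, avoiding trouble here comes down to the rel attribute on each outbound link. A hedged sketch (the class and function names are my own, not from any Google tooling) of scanning guest-post HTML for links that would still pass PageRank:

```python
from html.parser import HTMLParser

class FollowedLinkFinder(HTMLParser):
    """Collects href values of <a> tags that do NOT carry rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.followed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        # rel can hold several space-separated tokens, e.g. "nofollow noopener"
        rel_tokens = (attrs.get("rel") or "").lower().split()
        if "nofollow" not in rel_tokens:
            self.followed.append(attrs.get("href", ""))

def links_passing_pagerank(html):
    """Return hrefs of links in the given HTML that lack rel="nofollow"."""
    finder = FollowedLinkFinder()
    finder.feed(html)
    return finder.followed
```

Running a guest post's body through a check like this before publishing makes it easy to spot the followed, keyword-rich links Google is now warning about.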
Advertorials
And of course, the final change is the addition of advertorials as an example of unnatural links that violate Google guidelines.
Advertorials or native advertising where payment is received for articles that include links that pass PageRank.
Google has been making swift changes to linking policy and practice, so it is highly likely changes like this will keep occurring. Links can still be a strong weapon in your SEO strategy, but you have to tread carefully, and they perhaps shouldn't be your highest priority when optimizing.
TMO, 2013-07-30: Google Makes Changes To Their Link Schemes Help Document
Site audits can be ugly work. Nothing can be more disastrous to a client-SEO relationship than informing someone of everything wrong with their site in too harsh a way. They've spent time and money having a site created that they think works well for their business, and then we audit the page and everything under the hood and have to break the news that their site is sick or badly put together.
The process is similar to playing "website mechanic," as Stuntdubl SEO put it. We offer diagnostic information and recommendations that are absolutely critical to keeping a site relevant and valuable in organic search, but many people don't want to hear what bad shape their car, or website, is in. Not only will it be costly to fix, but they've developed a sentimental attachment to the site they have.
To be able to break the news in the best way possible, we have to be as prepared and informed as possible, which means running extensive auditing and answering a lot of questions. This also means understanding all of the tools at your disposal so that you can get the best answers in the fastest way possible.
Todd Malicoat took 50 of the most important questions for site audits, and identified the best tool available for answering every inquiry. Not only will this speed up your data collection and auditing, it will make you more prepared to create a better site and communicate properly with the clients you are working with.
TMO, 2013-07-29: Finding The Best Tools To Answer All Your Site Audit Questions
Everybody talks about SEO as if it is a monolithic entity. At most, you might hear conversation about local SEO and every few weeks someone will chime in to remind us about international SEO, but the vast majority of the dialogue just refers to SEO as a whole.
But even setting aside its constantly changing nature, SEO is a lot harder to pin down than that. Great optimization bends and molds to match the client and the unique needs of a market. What works for a nearby plumbing company may not translate to a small tech startup or a healthcare provider. The absolute basics are the same, but all of these companies have different online needs that can't be handled with a "one-size-fits-all" mentality.
Hotels are one market with especially unique needs, and with summer winding down and many people trying to squeeze in a vacation before the kids return to school, now is as relevant a time to talk about hotel SEO as any. Aleh Barysevich broke down the topic in detail, covering how search results for hotels are chosen and displayed and what opportunities hotel clients have in PPC and SEO.
Duplicate content has always been viewed as a serious no-no for webmasters and search engines. In general, it is associated with spamming or low-quality content, and thus Google usually penalizes sites with too much duplicate content. But, what does that mean for necessary duplicate content like privacy policies, terms and conditions, and other types of legally required content that many websites must have?
This has been a reasonable point of confusion for many webmasters, and those in the legal or financial sectors especially find themselves concerned that their site could be hurt by the number of disclaimers they must display.
Well, of course, Matt Cutts is here to sweep away your concerns. He used his recent Webmaster Help video to address the issue, clarifying that unless you're actively doing something spammy like keyword stuffing within these sections of legalese, you shouldn't worry about it.
He said, "We do understand that a lot of different places across the web require various disclaimers, legal information, and terms and conditions, that sort of stuff, so it's the sort of thing where if we were to not rank that stuff well, that would hurt our overall search quality. So, I wouldn't stress out about that."
TMO, 2013-07-25: How Does Google Handle Legally Required Duplicate Content?