Thanks to its high level of adaptability, JavaScript (JS) has been in use in some shape or form for more than 20 years and remains one of the most popular programming languages used to build websites.

However, Google’s Martin Splitt, a webmaster trends analyst, recently suggested that webmasters should begin moving away from the coding language if they want their content indexed and ranked as quickly as possible.

In an SEO Mythbusting video exploring the topic of web performance and search engine optimization, Splitt and Ada Rose Cannon of Samsung found themselves talking about JavaScript.

Specifically, they discussed how using too much JS can drag down a site’s performance and potentially hurt its standing in Google’s search index.

How JavaScript Holds Content Back

One of the biggest issues with overusing JS arises when sites publish content on a daily basis.

Google uses a two-pass indexing process to help verify content before it is added to the search index. In the case of a JavaScript-heavy page, Google first indexes the non-JS elements, like HTML and CSS. Then the page gets put into a queue for more advanced crawling that renders the rest of the content as processing resources become available.

This means JavaScript-heavy pages may not be completely crawled and indexed for up to a week after being published.

For time-sensitive information, this can be the difference between being on the cutting-edge and getting left behind.

What You Can Do Instead

Splitt offers a few different techniques developers can use to ensure their site is being efficiently crawled and indexed as new content is published.

One way to get around the issue would be to use dynamic rendering, which provides Google with a static rendered version of your page – saving them the time and effort of rendering and crawling the page themselves.
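In practice, dynamic rendering usually comes down to inspecting the requesting user agent and serving a pre-rendered static snapshot to known crawlers, while human visitors get the normal JavaScript-driven page. Here is a minimal Python sketch of that routing decision (the crawler list and function name are illustrative assumptions, not part of any particular framework):

```python
import re

# Illustrative (incomplete) list of crawler user-agent fragments.
# Real deployments usually also verify crawler IPs via reverse DNS,
# since user-agent strings are trivial to spoof.
CRAWLER_PATTERN = re.compile(r"Googlebot|bingbot|DuckDuckBot", re.IGNORECASE)

def choose_response(user_agent: str) -> str:
    """Decide which version of a page to serve.

    Crawlers get a pre-rendered static snapshot (plain HTML/CSS),
    so the search engine never has to queue the page for JS rendering.
    Regular browsers get the normal client-rendered JavaScript app.
    """
    if CRAWLER_PATTERN.search(user_agent or ""):
        return "prerendered"      # static snapshot for bots
    return "client_rendered"      # full JS experience for humans

# Example:
# choose_response("Mozilla/5.0 (compatible; Googlebot/2.1)") -> "prerendered"
```

The snapshot itself would typically be generated ahead of time by a headless browser, so the content crawlers see matches what human visitors see.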

The best course of action, though, would be to simply rely primarily on HTML and CSS for time-sensitive content.

Splitt takes time to explain that JavaScript is not inherently bad for your SEO or search rankings. Once indexed, JS-heavy sites “rank just fine.” The issue is ensuring content is crawled and indexed as quickly and efficiently as possible, so you can always be on the cutting edge.

The discussion gets pretty technical, but you can watch it in its entirety in the full video below:

Google is in the process of rolling out a significant update to its broad search engine algorithm which appears to be having a big impact on search results.

The company announced the update on June 2nd, the day before it began rolling out. This raised some eyebrows at the time because Google generally doesn’t notify the public about algorithm updates beforehand, if at all.

As Danny Sullivan from Google explained recently, the only reason they decided to talk about the update is that it would be “definitely noticeable.”

While the update is seemingly still rolling out, the early indications are that the effects of this update certainly are noticeable and could have a big impact on your site’s performance.

What Does This Mean For You?

Unfortunately, Google is never too keen to go into the specifics of its algorithm updates, and it is too early to tell definitively what this update has changed.

All that is clear from reports around the web is that the algorithm update has caused a seemingly dramatic shift for sites previously affected by Google algorithm updates. Some are reporting massive recoveries and improved traffic, while others are saying their rankings have tanked over the past week.

What Does Google Say To Do?

Oddly enough, Google has provided a little bit of guidance with this latest update, though it may not be what you want to hear.

The company says to essentially do nothing because there is nothing to “fix.”

Some experts within Google have also suggested results may normalize somewhat in the coming weeks as the search engine releases further tweaks and updates.

In the meantime, the best course of action is to monitor your website analytics and watch Google Search Console for notifications or big changes.

If you do see a major shakeup, you might watch to see if it recovers within the coming days, or conduct an assessment to evaluate what your site can do better for both search engines and potential customers.

A new study suggests that although high-ranking sites may be well optimized for search engines, they are failing to make themselves accessible to a large number of actual people – specifically, those with visual impairments.

The study from Searchmetrics used Google Lighthouse to test the technical aspects of sites ranking on Google. Unsurprisingly, it showed that high-ranking websites were largely fast, updated to use the latest online technologies, and relatively secure.

However, the analysis revealed that these high-ranking websites were lagging behind when it came to accessibility for those with disabilities.

Based on scores from Google’s own tools, the average overall score for accessibility for sites appearing in the top 20 positions on the search engine was 66.6 out of 100.

That is the lowest score of the four ranking categories analyzed in the study.

Google’s Lighthouse accessibility score analyzes a number of issues that are largely invisible to many users but hugely important for those with disabilities or impairments – such as color contrast and the presence of alt text to provide context or understanding for visual elements.

As Daniel Furch, director of marketing EMEA at Searchmetrics, explains, this can be a major issue for sites that are otherwise performing very well on search engines:

“If you don’t make your site easily accessible to those with disabilities, including those with impaired vision, you cut yourself off from a large group of visitors.

Not only is it ethically a good idea to be inclusive, but also obviously you could be turning away potential customers. And some sites have even faced lawsuits for failing on this issue.”
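For context on the color-contrast issue raised in the study: Lighthouse scores contrast using the WCAG 2 formula, which compares the relative luminance of foreground and background colors. Here is a simplified Python sketch of that math (an illustration of the published formula, not Lighthouse’s actual code):

```python
def _linearize(channel: int) -> float:
    # Convert an 8-bit sRGB channel to linear light (WCAG 2.0 formula).
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple) -> float:
    # Weighted sum of the linearized R, G, B channels.
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple, bg: tuple) -> float:
    # WCAG contrast ratio: (lighter + 0.05) / (darker + 0.05).
    # Ranges from 1:1 (identical colors) to 21:1 (black on white).
    # WCAG AA requires at least 4.5:1 for normal-size text.
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background hits the maximum ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

A low-contrast palette that looks stylish to a designer can easily fall below the 4.5:1 threshold, which is exactly the kind of issue dragging down the accessibility scores in the study.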

Often, businesses think of SEO and online advertising as being entirely separate. They may feel like they need to choose one or the other. However, a new study from WordStream shows that most experts agree that SEO and advertising work best together, not apart.

The new data published in WordStream’s report on the online advertising landscape in 2019 reveals that more than three-quarters (79%) of online advertisers are also incorporating SEO within their marketing strategies.

What’s more, digital advertisers ranked SEO as the leading marketing channel, outside of advertising itself, for growing their business.

The full breakdown of responses is as follows:

Outside of digital advertising, what other marketing channels are you using to grow your business in 2019?

  • SEO – 79%
  • Email marketing – 66%
  • Content marketing – 60%
  • Word of mouth marketing – 47%
  • Direct mail – 32%
  • Event marketing – 26%
  • Guerrilla marketing – 9%
  • Affinity marketing – 6%
  • Telemarketing – 4%
  • Other – 1%

As WordStream explains, the findings show that while advertisers may prioritize paid search for bringing in immediate revenue, they also recognize the importance of fostering a long-term strategy for bringing in new potential customers:

“Like content marketing, SEO can be an extremely valuable long-term strategy when done effectively. Kudos to those surveyed for recognizing the importance of balancing short-term results with a long-term strategy for sustainable growth!”

The report includes a number of other interesting tidbits about the current state of online advertising, including the discovery that nearly half of advertisers are increasing their Google search ads budgets this year.

To read the full report, click here.

Everyone wishes there was a simple recipe to guarantee you’ll rank at the top of the search engines, but Google’s Gary Illyes says there is no such thing. In fact, there isn’t even a consistent set of top-three ranking factors that applies to all content.

Instead, Illyes explains that the top ranking factors for web pages vary depending on the query being searched. Following that logic, factors like links might be used to verify that something is newsworthy, while page speed, content quality, and keyword usage may matter more for other types of content.

John Mueller, also a big figure at Google, joined the discussion to suggest that worrying about optimizing for specific ranking factors is “short-term thinking.”

Surprisingly, Illyes takes it even further by saying that links – often viewed as one of the most important signals for a website – are often not a factor in the search results at all. Long-tail search queries, in particular, are likely to pull up content with few to no links.

While this can be discouraging to brands or businesses looking for specific ways to improve their site and rank higher, the overall message is clear. A holistic approach that prioritizes people’s needs and desires is bound to benefit you, while myopically focusing on specific factors is bound to eventually leave you behind.

As Mueller suggests – if you build something awesome, Google will come.

With Google’s extensive personalization of search results for users, it has gotten harder and harder to tell when a major shakeup happens thanks to changes to Google’s algorithms. That hasn’t stopped people from guessing a major algorithm shift has occurred when they notice significant changes to how sites are performing across the board.

This happened last week when many major authorities in SEO speculated Google unleashed a major algorithm update. Of course, Google won’t confirm that any major changes happened, but Webmaster Trends Analyst for Google, John Mueller, did take the time to remind everyone “we make changes almost every day.”

Google’s Gary Illyes took the stance even further, tweeting “we have 3 updates in a day average. I think it’s pretty safe to assume there was one recently…”

The truth is, the days of the major Google algorithms like Penguin and Panda upending the search world overnight are largely over. Instead, Google has shifted to a model of constant evolution, tweaking and changing things perpetually.

When there is a new important algorithm, such as recent mobile-friendliness algorithms, the company tends to warn businesses ahead of time. Even then, these recent algorithm updates have been benign, only affecting a small number of websites.

The best plan isn’t to be on constant watch for unannounced shifts so you can react to each one. Instead, take a proactive stance by making sure your site follows all of Google’s latest best practices and provides value to searchers. If you do that, you should make it through any changes Google throws your way.

Source: Robert Scoble / Flickr

Were you punished by Google’s Penguin algorithm? If so, there is a good chance you’ve been waiting a year or longer to recover after taking all the necessary steps to have your site reconsidered.

Thankfully, as part of the latest update to Penguin, you won’t have to wait much longer to see if you’ve bounced back. Google’s Gary Illyes confirmed, via Twitter, that Penguin recoveries have already begun rolling out and will be finished within the coming days.

This means that sites that were penalized should start to show improvements within the next week. What it doesn’t mean, however, is that you can expect to return to your same former glory in the search engines.

Removing the penalty still leaves you without the bad links that were likely driving much of your previously high ranking, so you can’t expect to be boosted back up to the top spots in the search results. On the other hand, if you’ve taken the time while you’ve been penalized to build new, better links and further optimize your site, you may come out ahead once all the recoveries are finished rolling out.

Google’s Penguin algorithm has been a core part of the search engine’s efforts to fight spam and low-quality content for years, but it has always been its own thing. The algorithm ran separately from Google’s core algorithm and was refreshed periodically. But that is all changing.

Starting today, Penguin is running in real-time as part of Google’s primary algorithm in all languages.

What Does This Mean?

In the past, the Penguin algorithm has been relatively static. When it was updated or refreshed, it would dish out penalties and remove them from sites that had successfully gone through the reconsideration process. The only problem was these updates were sporadic, at best. In fact, the last update was over 700 days ago.

By turning Penguin into a real-time part of its algorithm, Google is speeding up the entire system so penalties can be given when a site is flagged and those who have resolved their problems can lose their penalty more quickly.

According to Google, Penguin can now make changes in roughly the same period of time it takes the search engine to crawl and re-index a page.

What Else Is Changing?

While the speed of Penguin is the biggest change as it becomes part of the core algorithm, there are some other small tweaks to how it works.

Penguin is now more targeted, only penalizing the specific pages that break link guidelines. Penguin used to punish an entire site for containing pages with spammy link building practices, but now it will only devalue the individual pages.

Google is also making some changes to how it talks about Penguin in public. Or, as the company stated, “We’re not going to comment on future refreshes.”

The bad news is that half or more of your website traffic likely comes from bots. The good news is that this is actually a huge improvement over years past.

A new report from Imperva Incapsula shows that approximately 48.5% of all traffic to websites comes from bots, not actual online users. That number comes from a review of over 19 billion visits to 35,000 Incapsula client websites around the world – each with a daily traffic count of at least 10 human visitors – gathered over a 90-day period in 2015.

According to the data, 51.5% of all Web traffic comes from human users, while 29% comes from “bad bots,” which automate spam or other malicious activity, and 19.5% comes from “good bots,” which are used by search engines and other online services.

While this sounds bad, the share of human traffic is actually rising compared to past years. The report explains:

In a similar 2013 study conducted by Imperva, humans made up only 31.5% of all visits to sites, compared with 51.5% in 2015. This shift is mainly due to an increase in human traffic as more people use the Web and a decrease in good bot traffic.

The ratio of bots to humans your website receives is likely influenced by how popular your site is, as the most popular sites examined showed the smallest share of bot traffic (39.7%). In comparison, the least popular sites included in the study had the highest share of bot traffic (85.4%).

No matter what percentage of your traffic comes from bots, the best solution is to continue emphasizing marketing that directly connects with real humans, such as social media marketing and PPC.

Google has been heavy-handed in trying to woo Firefox users back to its search engine since Yahoo became the default search engine for the browser. It also appears to be working.

ComScore released the latest US search market share numbers for February, and it seems Yahoo is gradually losing the gains it made after striking a deal to become the default search engine for the browser – and Google is reaping the benefits.

Since the switchover lost Google a small portion of users, Google has been practically begging users to make the switch back. While there hasn’t been a mass exodus back to Google, Yahoo is seemingly losing a slow but steady stream of users.

According to comScore’s report, Yahoo lost approximately 10 percent of its search volume from January to February, while Google recouped a tenth of a point along with Bing. This lines up with another recent report from StatCounter which also indicated a loss by Yahoo between January and February.

From the time Yahoo became the default search engine to January, Yahoo had gained 1.2 points. Yahoo is still above its previous levels, but it has lost 0.2 percent of those gains. The question is whether the trend continues.

It is important to note comScore’s numbers don’t include data from mobile searches, where Google is even more dominant.