Google has always had a love-hate relationship with pop-ups or ‘interstitials’. 

Since 2016, the search engine has reportedly used a ranking penalty to punish sites using aggressive or intrusive pop-ups on their pages. Of course, if you’ve been to many sites recently, you know these disruptive pop-ups are still common across the web.

In a recent stream, Google’s John Mueller clarified exactly how the interstitial “penalty” works, and why so many sites get away with using disruptive pop-ups.

John Mueller on Website Pop-Ups

During a recent Google Search Central office hours stream, Mueller was asked whether it would be alright for a site to use mobile pop-ups for a short period of time.

Specifically, the individual wanted to know if their site would be devalued for using interstitials asking visitors to take a survey.

Perhaps surprisingly, Mueller didn’t see much issue with temporarily running pop-ups on their mobile site.

Going even further, he explained that even if the site was hit with a penalty for the pop-ups, it could potentially continue to rank well in search results. 

This is because the so-called “interstitials penalty” is quite a minor ranking factor in the grand scheme. While it can affect your rankings, it is unlikely to have a significant impact unless other issues are present.

Still, Mueller says if you are going to use pop-ups on your mobile site, the best course is to use them only temporarily and not to show them to every visitor coming to your site.

Here’s his full response:

“I don’t think we would penalize a website for anything like this. The web spam team has other things to do than to penalize a website for having a pop-up.

There are two aspects that could come into play. On one hand we have, on mobile, the policy of the intrusive interstitials, so that might be something to watch out for that you don’t keep it too long or show it to everyone all the time.

With that policy it’s more of a subtle ranking factor that we use to adjust the ranking slightly if we see that there’s no useful content on the page when we load it. That’s something that could come into play, but it’s more something that would be a temporary thing.

If you have this survey on your site for a week or so, then during that time we might pick up on that signal, we might respond to that signal, and then if you have removed it we can essentially move on as well. So it’s not that there’s going to be a lasting effect there.

Another aspect that you want to watch out for is if you’re showing the pop-up instead of your normal content then we will index the content of the pop-up. If you’re showing the pop-up in addition to the existing content, which sounds like the case, then we would still have the existing content to index and that would kind of be okay.”

Ultimately, the takeaway is not to fixate on being penalized specifically for using an interstitial pop-up on your site. Rather, put your attention on doing what is right for your website and what provides the best experience for visitors.
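On a practical level, Mueller’s advice to avoid showing a pop-up to every visitor all the time is easy to implement client-side. Here is a minimal sketch in TypeScript that uses localStorage to show a survey at most once per visitor; the storage key and the showSurveyPopup helper are hypothetical stand-ins:

```ts
// Minimal sketch: show a survey pop-up at most once per visitor,
// in line with Mueller's advice not to show it to everyone all the time.
// SURVEY_SEEN_KEY and showSurveyPopup() are hypothetical names.
const SURVEY_SEEN_KEY = 'surveyPopupSeen';

function maybeShowSurvey(): void {
  // Skip visitors who have already seen the pop-up on this device.
  if (localStorage.getItem(SURVEY_SEEN_KEY) !== null) return;

  localStorage.setItem(SURVEY_SEEN_KEY, new Date().toISOString());
  showSurveyPopup();
}

// Hypothetical helper: reveal a survey element that overlays the page
// without replacing the underlying content Google needs to index.
function showSurveyPopup(): void {
  document.getElementById('survey-popup')?.classList.add('visible');
}

maybeShowSurvey();
```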

If you want to hear the question and full answer for yourself, check out the video below.

Google is adding a new set of ranking signals to its search engine algorithm in the coming year, according to an announcement this week. 

The search engine says it will begin using “Core Web Vitals” as a ranking signal starting in May 2021, combining them with the already existing user experience-related ranking signals.

Google has been measuring Core Web Vitals since earlier this year, assessing the speed, responsiveness, and stability of web pages. 

These factors are what Google calls the Core Web Vitals:

  • Largest Contentful Paint (LCP): Measures loading performance. To provide a good user experience, sites should strive to have LCP occur within the first 2.5 seconds of the page starting to load.
  • First Input Delay (FID): Measures interactivity. To provide a good user experience, sites should strive to have an FID of less than 100 milliseconds.
  • Cumulative Layout Shift (CLS): Measures visual stability. To provide a good user experience, sites should strive to have a CLS score of less than 0.1.
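These metrics can be observed directly in the browser through the standard PerformanceObserver API, which Google’s web-vitals JavaScript library builds on. A minimal sketch follows; the logging is illustrative, and the thresholds in the comments mirror the targets above:

```ts
// Minimal sketch of observing the three Core Web Vitals in the browser
// with the standard PerformanceObserver API. Logging is illustrative.

// LCP: the latest 'largest-contentful-paint' entry is the current candidate.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  const lastEntry = entries[entries.length - 1];
  console.log('LCP (ms):', lastEntry.startTime); // target: within 2,500 ms
}).observe({ type: 'largest-contentful-paint', buffered: true });

// CLS: accumulate layout-shift scores, excluding shifts that follow
// recent user input (those don't count against the page).
let clsScore = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as any[]) {
    if (!entry.hadRecentInput) clsScore += entry.value;
  }
  console.log('CLS so far:', clsScore); // target: below 0.1
}).observe({ type: 'layout-shift', buffered: true });

// FID: delay between the first interaction and its handler running.
new PerformanceObserver((list) => {
  const firstInput = list.getEntries()[0] as any;
  const delay = firstInput.processingStart - firstInput.startTime;
  console.log('FID (ms):', delay); // target: below 100 ms
}).observe({ type: 'first-input', buffered: true });
```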

These signals will be joining the already announced page experience signals:

  • Mobile-friendliness
  • Safe-browsing
  • HTTPS-security
  • Intrusive interstitial guidelines

As Google explained in the announcement:

“These signals measure how users perceive the experience of interacting with a web page and contribute to our ongoing work to ensure people get the most helpful and enjoyable experiences from the web.”

Based on recent data assessments, this should concern the majority of websites out there. A study published in August suggests fewer than 15% of all websites would pass a Core Web Vitals assessment if the signals were implemented today.

The search engine has also hinted at the potential to introduce new labels in search results, highlighting pages with the best user experience. Though nothing is set in stone, this would provide even more motivation for pages trying to maintain the best place in search results. 

For more information about updating your site for Core Web Vitals, you can explore Google’s resources and tools here.

On October 24, Facebook and Instagram plan to roll out a major change which has the potential to break content across millions of sites using WordPress.

On that date, the companies will remove functionality which allows sites to embed content from the social networks. 

The change doesn’t just mean that publishers will no longer be able to embed this content on their websites going forward. It is retroactive, meaning that all content ever embedded on your site could potentially become inaccessible or broken.

There is one exception – though it will likely be impractical for many out there. 

The change removes support for unauthenticated Facebook and Instagram embeds, meaning that only those with a Facebook developer account and a registered app will still be able to embed content from Facebook or Instagram.

The Technical Changes

To get into the nitty-gritty – Facebook is deprecating the current oEmbed endpoints for embeddable Facebook content on October 24, 2020. oEmbed is a popular open format for embedding content from one site on another.

The Facebook integration of oEmbed endpoints has allowed pages to quickly and easily embed HTML or basic content from pages, posts, and videos on their own site or app. Unfortunately, that aspect of the Facebook API is being removed. 
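For context, an oEmbed consumer simply makes a GET request to the provider’s endpoint and receives JSON containing ready-made embed HTML. Below is a rough sketch of the authenticated Graph API call that remains available after the change; the endpoint version and response fields follow Facebook’s documentation at the time, but treat the specifics as illustrative:

```ts
// Rough sketch of fetching embed markup through Facebook's
// authenticated oEmbed endpoint on the Graph API, which replaces the
// unauthenticated endpoints being shut off. The API version and field
// names are based on Facebook's docs at the time; treat as illustrative.
const APP_ACCESS_TOKEN = 'APP_ID|CLIENT_TOKEN'; // from your registered app

async function getInstagramEmbedHtml(postUrl: string): Promise<string> {
  const endpoint = new URL('https://graph.facebook.com/v8.0/instagram_oembed');
  endpoint.searchParams.set('url', postUrl);
  endpoint.searchParams.set('access_token', APP_ACCESS_TOKEN);

  const response = await fetch(endpoint.toString());
  if (!response.ok) throw new Error(`oEmbed request failed: ${response.status}`);

  const data = await response.json();
  return data.html; // the snippet of markup you embed in your page
}
```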

In response, WordPress has also said it will be removing Facebook and Instagram as supported oEmbed sources.

What You Can Do

As expected, developers began work on ways to fix content or prevent losing access as soon as the announcement was made. 

So far, there are two major options for those wanting to keep support for embedded Facebook and Instagram content on their websites:

oEmbed Plus – Developer Ayesh Karunaratne has created an expanded version of the existing system for embedding content from Instagram and Facebook, which provides a workaround.

Even using the plugin, you will have to register for a Facebook developer account and “create an app”. However, you don’t have to actually develop an app, just register one with the site. 

You can see the guide for the plugin here for an idea of what the process entails.

Smash Balloon Plugins – Developer Smash Balloon has provided a potentially easier option by updating their previous plugins to provide continued support – no developer account or app required. This is possible because Smash Balloon is effectively using its own API key to provide authentication for embedded content to users. 

Across the country, governors and mayors are implementing “shelter in place” or “safer at home” orders requiring a significant number of businesses to temporarily close during the COVID-19 pandemic.

In response, business owners are making hard decisions to cut costs and tighten belts to make it through these weeks. One question on many business owners’ minds is whether to continue paying to maintain their website, or take it offline to avoid hosting and maintenance costs.

Google Says Don’t Shut Down Your Website

It may be tempting, but disabling your site for any amount of time – even just a few days – can have long-lasting effects on your search engine rankings. Not only does it completely shut down the ability for people to find out about your products and services for the time being, it essentially removes your site from Google’s index.

In this situation, Google will have to reindex your website when you come back online, putting you back at square one.

What To Do Instead

In new recommendations, Google suggests that businesses limit their site’s functionality rather than going completely offline when they need to pause operations.

The company suggested a number of steps you can take to suspend your online services while still keeping customers informed and preserving your search visibility. These steps include:

  • Keep users informed with a pop-up or banner explaining how your business has changed. Follow Google’s guidelines for banners and pop-ups to ensure that you’re not interfering with the user experience.
  • Adjust your structured data to reflect event updates, product availability, and temporary closures (see the markup sketch after this list). You can also mark your business as temporarily closed through Google My Business.
  • E-commerce sites should follow Google’s Merchant Center guidance on availability and, if necessary, disable cart functionality.
  • Inform Google of site updates by requesting a recrawl through Search Console.
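To illustrate the structured data step, here is a hypothetical product snippet marked up with schema.org’s OutOfStock availability for the duration of a closure (the product details are made up):

```html
<!-- Hypothetical product markup: the availability property is switched
     to OutOfStock while the business is paused. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD",
    "availability": "https://schema.org/OutOfStock"
  }
}
</script>
```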

If You Absolutely Must Take Down Your Site

As a last resort, Google does recommend a few things you can do to protect your search visibility if you must take your site down:

  • For a temporary takedown, use the Search Console Removals Tool.
  • If you’re taking down your site for one or two days, you can return an informational error page with a 503 Service Unavailable status code (a minimal sketch follows this list).
  • For longer site takedowns, put up an indexable homepage placeholder for searchers using the 200 HTTP status code.
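For the one-or-two-day case, here is a minimal sketch of what returning a 503 might look like on a Node server running Express; the setup is hypothetical, and the key parts are the status code and the Retry-After hint:

```ts
// Minimal sketch of a temporary-takedown response in Express.
// A 503 tells crawlers the outage is temporary, so Google retains
// your indexed pages instead of dropping them.
import express from 'express';

const app = express();

app.use((_req, res) => {
  res
    .status(503) // Service Unavailable: "come back later," not "gone"
    .set('Retry-After', '86400') // suggest retrying in one day (seconds)
    .send('<h1>We are temporarily closed. Please check back soon.</h1>');
});

app.listen(8080);
```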

Don’t Overreact, Think Ahead

It is easy to get caught up in the current situation and lose sight of the long-term picture. While the COVID-19 pandemic is a serious concern for businesses, it will eventually pass. When it does, you want to be ready to hit the ground running, not starting again from square one.

Google has announced it plans to warn users of its Chrome browser about slow sites using a method called “badging”.

The idea is to provide a sign letting users know when a site typically loads slowly before they ever click a link to that site or while the site loads. Google sees this as a way to “reward” fast sites, saying:

“We think the web can do better and want to help users understand when a site may load slowly, while rewarding sites delivering fast experiences.”

For example, Google published one concept for what a slow speed badge could look like while a site is loading.

In this case, it is likely the badge would increase abandonment rates for slow sites.

The company is also considering context menus that preview links and would include similar badges indicating a site is fast.

Another idea involves subtly changing the color of loading bars to indicate whether a site is fast.

As the company explained in its announcement:

“Our early explorations will look at a number of Chrome surfaces, including the loading screen (splash screen), loading progress bar and context-menu for links. The latter could enable insight into typical site speeds so you’re aware before you navigate.”

Google admits this idea is in the early stages and may change considerably before the team determines “which provides the most value to our users.”

Additionally, the company says it plans to expand the badges to include a number of metrics aside from speed:

“Our long-term goal is to define badging for high-quality experiences, which may include signals beyond just speed.”

Thanks to its high level of adaptability, JavaScript (JS) has been in use in some shape or form for more than 20 years and remains one of the most popular programming languages for building websites.

However, Google’s Martin Splitt, a webmaster trends analyst, recently suggested that webmasters should begin moving away from the language for content they need indexed and ranking quickly on search engines.

In an SEO Mythbusting video exploring the topic of web performance and search engine optimization, Splitt and Ada Rose Cannon of Samsung found themselves talking about JavaScript.

Specifically, they discussed how using too much JS can drag down a site’s performance and delay its content from being indexed by Google.

How JavaScript Holds Content Back

One of the biggest issues with overusing JS arises when sites publish content on a daily basis.

Google uses a two-pass indexing process before content is added to the search index. In the case of a JavaScript-heavy page, Google first crawls the non-JS elements like HTML and CSS. Then, the page is put into a queue for more advanced crawling, where the remaining JavaScript-driven content is rendered as processing resources become available.

This means JavaScript-heavy pages may not be completely crawled and indexed for up to a week after being published.

For time-sensitive information, this can be the difference between being on the cutting-edge and getting left behind.

What You Can Do Instead

Splitt offers a few different techniques developers can use to ensure their site is being efficiently crawled and indexed as new content is published.

One way to get around the issue would be to use dynamic rendering, which provides Google with a static rendered version of your page – saving them the time and effort of rendering and crawling the page themselves.
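In practice, dynamic rendering usually means detecting known crawlers by user agent and serving them a pre-rendered HTML snapshot, while regular visitors get the normal JavaScript app. Here is a minimal Express sketch, where the bot list and the snapshot helper are illustrative stand-ins (a headless-browser service such as Rendertron typically fills that role):

```ts
// Minimal sketch of dynamic rendering: crawlers receive pre-rendered
// HTML, regular visitors receive the JavaScript application.
// BOT_PATTERN and getPrerenderedHtml() are illustrative stand-ins.
import express from 'express';

const app = express();
const BOT_PATTERN = /googlebot|bingbot|duckduckbot/i; // illustrative list

app.get('*', async (req, res) => {
  const userAgent = req.get('user-agent') ?? '';

  if (BOT_PATTERN.test(userAgent)) {
    // Serve a static snapshot so nothing waits in the render queue.
    res.send(await getPrerenderedHtml(req.originalUrl));
  } else {
    // Regular visitors get the normal JS-driven page.
    res.sendFile('index.html', { root: 'dist' });
  }
});

// Hypothetical snapshot lookup; in practice this might call Rendertron
// or read from a cache of pre-rendered pages.
async function getPrerenderedHtml(path: string): Promise<string> {
  return `<html><body><h1>Pre-rendered view of ${path}</h1></body></html>`;
}

app.listen(8080);
```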

The best course of action, though, would be to simply rely primarily on HTML and CSS for time-sensitive content.

Splitt takes time to explain that JavaScript is not inherently bad for your SEO or search rankings. Once they are indexed, JS-heavy sites “rank just fine.” The issue is ensuring content is crawled and indexed as quickly and efficiently as possible, so you can always be on the cutting edge.

The discussion gets pretty technical, but you can watch the entire exchange in the full video below.

A new study suggests that while top-ranking sites may be well optimized for search engines, they are failing to make their pages accessible to a large number of actual people – specifically, those with visual impairments.

The study from Searchmetrics used Google Lighthouse to test the technical aspects of sites ranking on Google. Unsurprisingly, it showed that high-ranking websites were largely fast, used the latest web technologies, and were relatively secure.

However, the analysis revealed that these high-ranking websites were lagging behind when it came to accessibility for those with disabilities.

Based on scores from Google’s own tools, the average overall score for accessibility for sites appearing in the top 20 positions on the search engine was 66.6 out of 100.

That is the lowest score of the four ranking categories analyzed in the study.

Google’s Lighthouse accessibility score analyzes a number of issues that largely go unnoticed by many users but are hugely important for those with disabilities or impairments – such as color contrast and the presence of alt tags that provide context for visual elements.
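Many of these checks come down to small pieces of markup and styling. An illustrative snippet covering the two issues just mentioned:

```html
<!-- Illustrative fixes for two common Lighthouse accessibility flags. -->

<!-- Alt text gives screen-reader users the context an image conveys. -->
<img src="/img/storefront.jpg" alt="Customers browsing our downtown storefront">

<!-- Color contrast: dark text on a light background comfortably clears
     the common 4.5:1 ratio; light gray on white typically does not. -->
<p style="color: #333333; background: #ffffff;">Readable body text</p>
```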

As Daniel Furch, director of marketing EMEA at Searchmetrics, explains, this can be a major issue for sites that are otherwise performing very well on search engines:

“If you don’t make your site easily accessible to those with disabilities, including those with impaired vision, you cut yourself off from a large group of visitors.

Not only is it ethically a good idea to be inclusive, but also obviously you could be turning away potential customers. And some sites have even faced lawsuits for failing on this issue.”

In mid-2018, Google’s web browser Chrome made a small tweak to help users know how safe a specific site is. Specifically, it added a tag in the address bar flagging any site that had not updated to HTTPS as “not secure”.

Now, with the help of a new survey from the agency John Cabot, we are finally getting insight into how this little notification affects people’s perception of sites.

Based on responses from 1,324 people in the UK, the survey finds that nearly half of all people respond negatively to sites flagged as “not secure”, and many are less willing to give personal information to these sites.

According to the findings, 47% of respondents said they “knew roughly what the warning meant.” Similarly, 46% said they would not give their names or financial information to a site flagged as “not secure”. What’s more, 64% of that group said they would immediately leave non-secure sites.

The survey also found a few other fears and concerns when users come upon a non-secure site:

  • Their device was exposed to a virus — 14%
  • They had arrived on a fake version of the intended site — 12%
  • The content was “unreliable and not fact-checked” — 9%
  • They were being signed up for spam email — 8.4%

Notably, the survey found that a brand’s existing perception appears to play a role in determining how people respond to a non-secure site. For example, retailer John Lewis saw significantly fewer negative reactions to its site, despite being tagged as non-secure. This suggests widespread name recognition could potentially counter the warning.

Still, the findings show that a huge number of users take note any time they find a business website that has not implemented HTTPS encryption, and many even change their behavior based on the warning. If you haven’t updated your business site, these results suggest you could be losing nearly half of your potential customers over something that is easy and affordable to implement.

A new survey of US consumers has some surprising findings about what customers expect out of business websites.

The results from 1,013 respondents between the ages of 18 and 60 show that consumers have high expectations when it comes to how frequently your website is updated, what features it offers, and how you advertise your business online.

What Consumers DON’T Want in a Website

Of the respondents, more than 80% say they view a brand more negatively if their website is out of date. Additionally, 39% of consumers say they would reconsider buying a product or service if the website isn’t current.

The issue of advertising is also a prickly subject for consumers, based on the survey results.

Less than 10% approve of brands showing ads on social media based on a person’s browsing activity. Meanwhile, approximately 26% feel negatively about ads appearing on their social media feeds based on their browsing or device history – saying it is an invasion of privacy.

On the other hand, 41% of consumers say they don’t mind if websites keep personal data, but only if it is secured and used exclusively to improve the user experience.

Overall, consumers are largely conflicted. Approximately 50% of respondents say they like the convenience of brands keeping data to improve ads and the user experience, but they are concerned about how else it might be used.

What Consumers DO Want in a Website

In general, consumers say ease of use should be the top priority in making their online experience better.

Approximately 50% of the respondents said they prefer user-created content like reviews and photos to help inform their purchasing decision.

Meanwhile, 25% say their favorite website feature is receiving a reminder when they have left a product in their shopping cart.

Perhaps surprisingly, a major feature desired by users is an on-site search engine. Nearly one-third of respondents say they are put off if a site does not have a search box, while more than 40% say a search box is the most important feature on a site.


The survey includes a number of interesting findings about consumer behavior and desires online, covering a wide range of topics. You can read all the details from Blue Fountain Media here.

Google has been banging the drum for speeding up mobile websites for what seems like forever now, and they’ve released numerous tools to try to help webmasters do just that. This week, at the Mobile World Congress in Barcelona, the search engine announced two more resources to show websites how they are performing – a new “Mobile Scorecard” and Conversion Impact Calculator.

The tools present marketers and webmasters with visual-heavy depictions of how their website stacks up against the competition and what they may be missing out on by not loading pages more quickly.

Google’s Mobile Scorecard

The Mobile Scorecard uses data from the Chrome User Experience Report to compare the speed of several sites on mobile. This allows you to directly compare your site against your closest competitors in a race for the fastest website. According to Google, the Mobile Scorecard can give information on thousands of sites across 12 countries.

Even if you’re the leader of the pack, Google recommends making sure your site loads and becomes usable within five seconds on mid-range mobile devices over slower connections, and within three seconds on 4G connections.

Google Conversion Impact Calculator

Of course, the biggest thing keeping most businesses from enhancing their websites for mobile devices is money. To help sway you toward making the investment, Google is launching the new Impact Calculator, which shows how much revenue you could be missing out on because of slow loading speeds.

The calculator uses data from The State of Online Retail Performance report from April 2017. This report found that every second it takes for your web pages to load can hurt conversions by up to 20 percent.

The tool calculates your potential lost conversion revenue based on your average monthly visitors, average order value, and conversion rate.
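Google hasn’t published the calculator’s exact formula, but the inputs it lists imply straightforward arithmetic. Here is a minimal sketch, assuming each second of load time saved recovers up to the report’s 20 percent of conversions; the function name and parameters are illustrative:

```ts
// Minimal sketch of the math behind a speed-impact estimate. The 20%
// figure comes from the report cited above; everything else here is an
// illustrative assumption, not Google's actual formula.
function estimateMonthlyRevenueAtRisk(
  monthlyVisitors: number,
  conversionRate: number, // e.g. 0.02 for 2%
  averageOrderValue: number, // in dollars
  secondsSaved: number
): number {
  const currentRevenue = monthlyVisitors * conversionRate * averageOrderValue;
  // Up to a 20% conversion improvement per second of load time saved.
  const upliftFactor = Math.min(secondsSaved * 0.2, 1);
  return currentRevenue * upliftFactor;
}

// Example: 50,000 visitors/month, 2% conversion, $80 average order,
// one second faster => up to 50,000 * 0.02 * 80 * 0.2 = $16,000/month.
console.log(estimateMonthlyRevenueAtRisk(50_000, 0.02, 80, 1));
```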

Both the Mobile Scorecard and Impact Calculator are available to check out here.