The Washington Post may not be the first organization that comes to mind when you think of SEO expertise, but as a popular news organization read by millions around the world, The Post has dealt with its fair share of challenges in developing long-term strategies for web performance and SEO.

Now, the news site is sharing the fruit of that hard work by releasing its own Web Performance and SEO Best Practices and Guidelines.

These guidelines help ensure that The Washington Post remains visible in highly competitive search spaces, drives more organic traffic, and maintains a positive user experience on its website.

In the announcement, engineering lead Arturo Silva said:

“We identified a need for a Web Performance and SEO engineering team to build technical solutions that support the discovery of our journalism, as the majority of news consumers today read the news digitally. Without proper SEO and web performance, our stories aren’t as accessible to our readers. As leaders in engineering and media publishing, we’re creating guidelines that serve our audiences and by sharing those technical solutions in our open-source design system, we are providing tools for others to certify that their own site practices are optimal.”

What’s In The Washington Post’s SEO and Web Performance Guidelines?

If you’re hoping to see a surprise trick or secret tool being used by The Washington Post, you are likely to be disappointed. 

The guidelines are largely in line with practices used by most SEO experts, albeit with a focus on the publication’s own search and web performance issues.

For example, the Web Performance section covers three specific areas: loading performance, rendering performance, and responsiveness. Similarly, the SEO guidelines are split into on-page SEO, content optimization, technical SEO, and off-page SEO. 

More than anything, the guidelines highlight the need for brands to focus their SEO efforts on their unique needs and goals and develop strategies that are likely to remain useful for the foreseeable future (instead of chasing every new SEO trend). 

To read the guidelines for yourself, visit the Washington Post’s site here. 

Since it started testing a new fullscreen redesign, Instagram has come under heavy criticism from users, including high-profile figures like the Kardashians.

Now, in a recent video, Head of Instagram Adam Mosseri seems to agree that the new design is not delivering the quality experience the company had hoped for.

In the video shared on Twitter, Mosseri explained the redesign is “not yet good” and that the new layout will likely see some revisions before it becomes the default for all users. 

However, Mosseri also emphasized that the platform will not be backing away from its current direction. Recommended posts and a new emphasis on video are going to be major parts of the final redesign despite the public demand to “make Instagram Instagram again.”

More Changes Are Likely

The ongoing test has made quite a splash, but it is actually only being shown to a relatively small number of users. And while the test captures what Instagram is trying to achieve, Mosseri admits it is not yet up to the company’s standards.

“It’s a test to a few percentage of people out there, and the idea is that a more full-screen experience, not only for videos but for photos, might be a more fun and engaging experience. But I also want to be clear. It’s not yet good, and we’re going to have to get it to a good place if we’re going to ship to the rest of the Instagram community.”

Photos Aren’t Going Anywhere

Much of the anger about the new layout comes from the opinion that Instagram is becoming too much like TikTok by prioritizing video content. 

Though Mosseri emphasized that the platform will always be a photo-sharing app at its core, he said it also needs to grow and expand.

“I want to be clear — we’re going to continue to support photos. It’s part of our heritage. I love photos and I know a lot of you out there love photos too. That said, I need to be honest, I do believe that more and more of Instagram is going to become video over time. We see this even if we change nothing.

We see this even if you just look at chronological feed. If you look at what people share on Instagram that’s shifting more and more to videos over time. If you look at what people like and consume and view on Instagram, that’s also shifting more and more to video over time even when we stop changing anything. So we’re going to have to lean into that shift while continuing to support photos.”

Recommended Posts Are Staying In Your Feed

Another major complaint from users revolves around the inclusion of recommended content in the main feed. 

Recommended posts show content from accounts you don’t currently follow. The inclusion of this type of content upset many users who found it irrelevant or low-quality.

Though these recommended posts are going to be sticking around, Mosseri said it is a work in progress and offered tips on how to improve the quality of recommendations:

“Recommendations are posts in your feed from accounts that you do not follow. The idea is to help you discover new and interesting things on Instagram that you might not know even exist.

Now, if you’re seeing things in your feed that are recommendations that you’re not interested in, that means we’re doing a bad job ranking, and we need to improve. And you can X out a recommendation, you can even snooze all recommendations for up to a month or go to your ‘following’ feed.

But we’re going to continue to try to get better at recommendations because we think it’s one of the most effective and important ways to help creators reach more people. We want to do our best by creators, particularly small creators, and we see recommendations as one of the best ways to reach a new audience and grow their following.”

Google has always had a love-hate relationship with pop-ups or ‘interstitials’. 

Since 2016, the search engine has reportedly used a ranking penalty to punish sites using aggressive or intrusive pop-ups on their pages. Of course, if you’ve been to many sites recently, you know these disruptive pop-ups are still common across the web.

In a recent stream, Google’s John Mueller clarified exactly how the interstitial “penalty” works, and why so many sites get away with using disruptive pop-ups.

John Mueller on Website Pop-Ups

During a recent Google Search Central office hours stream, Mueller was asked about the possibility of using mobile pop-ups on their site for a short period of time.

Specifically, the individual wanted to know if they would be devalued for using interstitials to ask visitors to take a survey when visiting the site.

Perhaps surprisingly, Mueller didn’t see much issue with temporarily running pop-ups on their mobile site.

Going even further, he explained that even if the site was hit with a penalty for the pop-ups, it could potentially continue to rank well in search results. 

This is because the so-called “interstitials penalty” is quite a minor ranking factor in the grand scheme. While it can affect your rankings, it is unlikely to have a significant impact unless other issues are present.

Still, Mueller says if you are going to use pop-ups on your mobile site, the best course is to use them only temporarily and not show them to every visitor.

Here’s his full response:

“I don’t think we would penalize a website for anything like this. The web spam team has other things to do than to penalize a website for having a pop-up.

There are two aspects that could come into play. On one hand we have, on mobile, the policy of the intrusive interstitials, so that might be something to watch out for that you don’t keep it too long or show it to everyone all the time.

With that policy it’s more of a subtle ranking factor that we use to adjust the ranking slightly if we see that there’s no useful content on the page when we load it. That’s something that could come into play, but it’s more something that would be a temporary thing.

If you have this survey on your site for a week or so, then during that time we might pick up on that signal, we might respond to that signal, and then if you have removed it we can essentially move on as well. So it’s not that there’s going to be a lasting effect there.

Another aspect that you want to watch out for is if you’re showing the pop-up instead of your normal content then we will index the content of the pop-up. If you’re showing the pop-up in addition to the existing content, which sounds like the case, then we would still have the existing content to index and that would kind of be okay.”

Ultimately, the takeaway is not to fixate on being penalized specifically for using an interstitial pop-up on your site. Rather, focus on doing what’s right for your website and what provides the best experience for visitors.
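If you do run a temporary survey pop-up, Mueller’s advice translates into a simple pattern: limit the time window and cap how often each visitor sees it. Below is a minimal TypeScript sketch of that idea; the element ID and end date are placeholders, not anything from Mueller’s answer.

```typescript
// A minimal sketch of a frequency-capped survey pop-up. The element ID
// and end date are illustrative placeholders.
const SURVEY_ENDS = new Date("2022-09-30");
const SEEN_KEY = "surveyPopupSeen";

function maybeShowSurvey(): void {
  // Stop showing the pop-up entirely once the survey window closes.
  if (new Date() > SURVEY_ENDS) return;

  // Skip visitors who have already seen it on a previous page view.
  if (localStorage.getItem(SEEN_KEY) !== null) return;

  localStorage.setItem(SEEN_KEY, "1");
  document.getElementById("survey-popup")?.removeAttribute("hidden");
}

maybeShowSurvey();
```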

If you want to hear the question and full answer for yourself, check out the video below:

Google is adding a new set of ranking signals to its search engine algorithm in the coming year, according to an announcement this week. 

The search engine says it will begin factoring in “Core Web Vitals” as a ranking signal starting in May 2021, combining them with its existing user experience-related ranking signals.

Google has been measuring Core Web Vitals since earlier this year, assessing the speed, responsiveness, and stability of web pages. 

These factors are what Google calls the Core Web Vitals:

  • Largest Contentful Paint (LCP): Measures loading performance. To provide a good user experience, sites should strive to have LCP occur within the first 2.5 seconds of the page starting to load.
  • First Input Delay (FID): Measures interactivity. To provide a good user experience, sites should strive to have an FID of less than 100 milliseconds.
  • Cumulative Layout Shift (CLS): Measures visual stability. To provide a good user experience, sites should strive to have a CLS score of less than 0.1.
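For developers who want to see where their pages stand, all three metrics can be observed in the browser with the standard PerformanceObserver API. The sketch below is a minimal illustration of that approach; Google’s open-source web-vitals library wraps the same entry types with more thorough edge-case handling.

```typescript
// A minimal sketch of observing the three Core Web Vitals in the browser.

// Largest Contentful Paint: render time of the largest visible element.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  const lcp = entries[entries.length - 1]; // the latest candidate wins
  console.log(`LCP: ${lcp.startTime} ms (target: under 2500)`);
}).observe({ type: "largest-contentful-paint", buffered: true });

// First Input Delay: time from first interaction to its event handler.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as PerformanceEventTiming[]) {
    console.log(`FID: ${entry.processingStart - entry.startTime} ms (target: under 100)`);
  }
}).observe({ type: "first-input", buffered: true });

// Cumulative Layout Shift: running sum of unexpected layout-shift scores.
let cls = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries() as any[]) {
    if (!entry.hadRecentInput) cls += entry.value; // ignore user-caused shifts
  }
  console.log(`CLS so far: ${cls} (target: under 0.1)`);
}).observe({ type: "layout-shift", buffered: true });
```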

These signals will be joining the already announced page experience signals:

  • Mobile-friendliness
  • Safe-browsing
  • HTTPS-security
  • Intrusive interstitial guidelines

As Google explained in the announcement:

“These signals measure how users perceive the experience of interacting with a web page and contribute to our ongoing work to ensure people get the most helpful and enjoyable experiences from the web.”

Based on recent assessments, this should concern the majority of websites out there. A study published in August suggests fewer than 15% of all websites would pass a Core Web Vitals assessment if the signals were implemented today.

The search engine has also hinted at the potential to introduce new labels in search results, highlighting pages with the best user experience. Though nothing is set in stone, this would provide even more motivation for sites vying for the top spots in search results.

For more information about updating your site for Core Web Vitals, you can explore Google’s resources and tools here.

On October 24, Facebook and Instagram plan to roll out a major change which has the potential to break content across millions of sites using WordPress.

On that date, the companies will remove functionality which allows sites to embed content from the social networks. 

The change doesn’t just mean that publishers will no longer be able to embed this content on their websites going forward. It is retroactive, meaning that all content ever embedded on your site could potentially become inaccessible or broken.

There is one exception – though it will likely be impractical for many out there. 

Specifically, the change removes support for unauthenticated Facebook and Instagram embeds, meaning that those with a Facebook developer account and a registered app will still be able to embed content from the social networks.

The Technical Changes

To get into the nitty-gritty – Facebook is deprecating the current oEmbed endpoints for embeddable Facebook content on October 24, 2020. oEmbed is a popular open format for embedding content from one site on another.

Facebook’s oEmbed endpoints have allowed sites to quickly and easily embed HTML or basic content from Facebook pages, posts, and videos on their own site or app. Unfortunately, that part of the Facebook API is being removed.
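For context, here is a rough sketch of what the change means for anyone fetching embed markup directly. The endpoint URLs reflect Facebook’s developer documentation at the time, and the app credentials are placeholders for a registered developer app.

```typescript
// A rough sketch of the before/after for fetching oEmbed markup directly.
const postUrl = "https://www.instagram.com/p/EXAMPLE/";

// Before October 24, 2020: the unauthenticated endpoint worked directly.
const legacyEndpoint =
  `https://api.instagram.com/oembed?url=${encodeURIComponent(postUrl)}`;

// After: requests must carry an access token tied to a registered app.
const APP_ID = "YOUR_APP_ID"; // placeholder
const CLIENT_TOKEN = "YOUR_CLIENT_TOKEN"; // placeholder
const authedEndpoint =
  `https://graph.facebook.com/v8.0/instagram_oembed` +
  `?url=${encodeURIComponent(postUrl)}&access_token=${APP_ID}|${CLIENT_TOKEN}`;

// Both endpoints return an oEmbed JSON document with ready-to-use markup.
const response = await fetch(authedEndpoint);
const { html } = await response.json();
console.log(html); // e.g. a <blockquote> embed snippet
```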

In response, WordPress has also said it will be removing Facebook and Instagram as supported oEmbed sources.

What You Can Do

As expected, developers began work on ways to fix content or prevent losing access as soon as the announcement was made. 

So far, there are two major options for those wanting to keep support for embedded Facebook and Instagram content on their websites:

oEmbed Plus – Developer Ayesh Karunaratne has created an expanded version of the existing system for embedding content from Instagram and Facebook which provides a workaround. 

Even using the plugin, you will have to register for a Facebook developer account and “create an app”. However, you don’t have to actually develop an app, just register one with the site. 

You can see the guide for the plugin here for an idea of what the process entails.

Smash Balloon Plugins – Developer Smash Balloon has provided a potentially easier option by updating their previous plugins to provide continued support – no developer account or app required. This is possible because Smash Balloon is effectively using its own API key to provide authentication for embedded content to users. 

Across the country, governors and mayors are implementing “shelter in place” or “safer at home” orders which are requiring a significant number of businesses to temporarily close during the COVID-19 pandemic.

In response, business owners are making hard decisions to cut costs and tighten their belts to make it through these weeks. One question on many business owners’ minds is whether to continue paying to maintain their website or take it offline to avoid hosting and maintenance costs.

Google Says Don’t Shut Down Your Website

It may be tempting, but disabling your site for any amount of time – even just a few days – can have long-lasting effects on your search engine rankings. Not only does it completely shut down the ability for people to find out about your products and services for the time being, it also essentially removes your site from Google’s index.

In this situation, Google will have to reindex your website when you come back online, putting you back at square one.

What To Do Instead

In new recommendations, Google suggests that businesses limit their site’s functionality rather than going completely offline when they need to pause operations.

The company suggested a number of steps you can take to suspend your online services while still keeping customers informed and preserving your search visibility. These steps include:

  • Keep users informed with a popup or banner explaining how your business has changed. Follow Google’s guidelines for banners and popups to ensure that you’re not interfering with the user experience.
  • Adjust your structured data to reflect event updates, product availability and temporary closures (a minimal markup sketch follows this list). You can also mark your business as temporarily closed through Google My Business.
  • E-commerce sites should follow Google’s Merchant Center guidance on availability and, if necessary, disable cart functionality.
  • Inform Google of site updates by requesting a recrawl through Search Console.
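On the structured data point, below is a minimal sketch of marking a business temporarily closed with schema.org JSON-LD. The pattern of setting identical opens/closes times to signal a closure follows Google’s guidance at the time; the business name and dates are placeholders.

```typescript
// A minimal sketch of "temporarily closed" structured data via JSON-LD.
const temporaryClosure = {
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  name: "Example Coffee Shop",
  specialOpeningHoursSpecification: {
    "@type": "OpeningHoursSpecification",
    opens: "00:00", // opens === closes tells crawlers the business
    closes: "00:00", // is closed during this date range
    validFrom: "2020-03-25",
    validThrough: "2020-04-24",
  },
};

// Inject the markup as a JSON-LD script tag so crawlers can pick it up.
const script = document.createElement("script");
script.type = "application/ld+json";
script.textContent = JSON.stringify(temporaryClosure);
document.head.appendChild(script);
```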

If You Absolutely Must Take Down Your Site

As a last resort, Google does recommend a few things you can do to protect your search visibility if you must take your site down:

  • For a temporary takedown, use the Search Console Removals Tool.
  • If you’re taking down your site for one or two days, you can return an informational error page with a 503 Service Unavailable code (see the example after this list).
  • For longer site takedowns, put up an indexable homepage placeholder for searchers using the 200 HTTP status code.
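As a sketch of that second option, the minimal Node.js server below returns a 503 along with a Retry-After header, which signals that the outage is temporary so pages are not dropped from the index. The message, port, and retry window are illustrative.

```typescript
// A minimal sketch of a temporary-takedown response in Node.js.
import http from "node:http";

http.createServer((_req, res) => {
  res.writeHead(503, {
    "content-type": "text/html",
    "retry-after": "86400", // suggest retrying in one day (in seconds)
  });
  res.end("<h1>We are temporarily closed. Please check back soon.</h1>");
}).listen(8080);
```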

Don’t Overreact, Think Ahead

It is easy to get caught up in the current situation and lose sight of the long-term picture. While the COVID-19 pandemic is a serious concern for businesses, it will eventually pass. When it does, you want to be ready to hit the ground running, not starting over from square one.

Google has announced it plans to warn users of its Chrome browser about slow sites using a method called “badging”.

The idea is to provide a sign letting users know when a site typically loads slowly before they ever click a link to that site or while the site loads. Google sees this as a way to “reward” fast sites, saying:

“We think the web can do better and want to help users understand when a site may load slowly, while rewarding sites delivering fast experiences.”

For example, Google published one concept for what a slow speed badge could look like while a site is loading:

In this case, the badge would likely increase abandonment rates for slow sites.

The company is also talking about using context menus that preview links and would include similar badges indicating whether a site is fast.

Another idea includes subtly changing the color of loading bars to indicate whether a site is fast:

As the company explained in its announcement:

“Our early explorations will look at a number of Chrome surfaces, including the loading screen (splash screen), loading progress bar and context-menu for links. The latter could enable insight into typical site speeds so you’re aware before you navigate.”

The company admits this idea is in the early stages and may change considerably before it determines “which provides the most value to our users.”

Additionally, the company says it plans to expand the badges to include a number of metrics aside from speed:

“Our long-term goal is to define badging for high-quality experiences, which may include signals beyond just speed.”

Thanks to its high level of adaptability, JavaScript (JS) has been in use in some shape or form for more than 20 years and remains one of the most popular programming languages used to build websites.

However, Google’s Martin Splitt, a webmaster trends analyst, recently suggested that webmasters should begin moving away from the language for time-sensitive content if they want it crawled and ranked quickly.

In an SEO Mythbusting video exploring the topic of web performance and search engine optimization, Splitt and Ada Rose Cannon of Samsung found themselves talking about JavaScript.

Specifically, they discussed how using too much JS can drag down a site’s performance and delay its content from being indexed by Google.

How JavaScript Holds Content Back

One of the biggest issues with overusing JS arises when sites publish content on a daily basis.

Google uses a two-pass indexing process to handle content before it is added to the search index. In the case of a JavaScript-heavy page, Google first crawls and indexes the non-JS elements, namely the raw HTML and CSS. The page is then put into a queue, and the remaining JavaScript-driven content is rendered and indexed as processing resources become available.

This means JavaScript-heavy pages may not be completely crawled and indexed until up to a week after being published.

For time-sensitive information, this can be the difference between being on the cutting-edge and getting left behind.

What You Can Do Instead

Splitt offers a few different techniques developers can use to ensure their site is being efficiently crawled and indexed as new content is published.

One way to get around the issue is dynamic rendering, which provides Google with a statically rendered version of your page, saving it the time and effort of rendering the page itself.
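As a rough illustration of the idea (not Google’s or Splitt’s reference implementation), the TypeScript sketch below routes known crawlers to a hypothetical prerender service while regular visitors get the normal JavaScript-driven page.

```typescript
// A rough sketch of dynamic rendering, assuming a hypothetical prerender
// service at localhost:3001 that returns static HTML for any path.
import http from "node:http";

// Illustrative (not exhaustive) user-agent check for known crawlers.
const BOT_UA = /googlebot|bingbot|baiduspider|twitterbot|facebookexternalhit/i;

http.createServer((req, res) => {
  const path = req.url ?? "/";

  if (BOT_UA.test(req.headers["user-agent"] ?? "")) {
    // Crawlers receive pre-rendered static HTML, so no client-side JS
    // needs to execute before the content can be indexed.
    http.get(`http://localhost:3001${path}`, (prerendered) => {
      res.writeHead(prerendered.statusCode ?? 200, { "content-type": "text/html" });
      prerendered.pipe(res);
    });
  } else {
    // Regular visitors get the normal JavaScript-driven page.
    res.writeHead(200, { "content-type": "text/html" });
    res.end('<div id="app"></div><script src="/bundle.js"></script>');
  }
}).listen(3000);
```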

The best course of action, though, would be to simply rely primarily on HTML and CSS for time-sensitive content.

Splitt takes time to explain that JavaScript is not inherently bad for your SEO or search rankings. Once they are indexed, JS-heavy sites “rank just fine.” The issue is ensuring content is crawled and indexed as quickly and efficiently as possible, so you can always be on the cutting edge.

The discussion gets pretty technical, but you can view the entire discussion in the full video below:

A new study suggests that although top-ranking sites may be well optimized for search engines, they are failing to make their sites accessible to a large number of actual people – specifically, those with visual impairments.

The study from Searchmetrics used Google Lighthouse to test the technical aspects of sites ranking on Google. Unsurprisingly, it showed that high-ranking websites were largely fast, built on up-to-date web technologies, and relatively secure.

However, the analysis revealed that these high-ranking websites were lagging behind when it came to accessibility for those with disabilities.

Based on scores from Google’s own tools, the average overall score for accessibility for sites appearing in the top 20 positions on the search engine was 66.6 out of 100.

That is the lowest score of the four ranking categories analyzed in the study.

Google’s Lighthouse accessibility score analyzes a number of issues that go unnoticed by most users but are hugely important for those with disabilities or impairments – such as color contrast and the presence of alt text to provide context for visual elements.
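As one concrete example of what the audit looks for, the small sketch below flags images that lack alt text, the descriptions screen readers rely on to convey visual content.

```typescript
// A tiny sketch of one check Lighthouse performs: flagging images with
// no alt text, which screen readers need to describe visual content.
const missingAlt = [...document.querySelectorAll<HTMLImageElement>("img")]
  .filter((img) => !img.hasAttribute("alt"));

missingAlt.forEach((img) => console.warn("Image missing alt text:", img.src));
```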

As Daniel Furch, director of marketing EMEA at Searchmetrics, explains, this can be a major issue for sites that are otherwise performing very well on search engines:

“If you don’t make your site easily accessible to those with disabilities, including those with impaired vision, you cut yourself off from a large group of visitors.

Not only is it ethically a good idea to be inclusive, but also obviously you could be turning away potential customers. And some sites have even faced lawsuits for failing on this issue.”

In mid-2018, Google’s web browser Chrome made a small tweak to help users know how safe a specific site was. Specifically, it added a tag in the search bar flagging any site that had not updated to HTTPS as “not secure”.

Now, with the help of a new survey from the agency John Cabot, we are finally getting insight into how this little notification affects people’s perception of sites.

Based on responses from 1,324 people in the UK, the survey finds that nearly half of all people respond negatively to sites flagged as “not secure” and many are less willing to give personal information to these sites.

According to the findings, 47% of respondents said they “knew roughly what the warning meant.” Similarly, 46% said they would not give their names or financial information to a site flagged as “non-secure”. What’s more, 64% of that group said they would immediately leave non-secure sites.

The survey also found a few other fears and concerns when users come upon a non-secure site:

  • Their device was exposed to a virus — 14%
  • They had arrived on a fake version of the intended site — 12%
  • The content was “unreliable and not fact-checked” — 9%
  • Being signed up for spam email — 8.4%

Notably, the survey found that a brand’s existing reputation appears to play a role in determining how people respond to a non-secure site. For example, retailer John Lewis experienced significantly fewer negative reactions to its site, despite being tagged as non-secure. This suggests widespread name recognition could potentially counter the warning.

Still, the findings show that a huge number of users take note any time they find a business website that has not implemented HTTPS encryption, and many are even changing their behavior based on the warning. If you haven’t updated your business site, these results suggest you could be losing up to 50% of your potential customers to something that is easy and affordable to implement.
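For those ready to make the switch, the sketch below shows the serving side of an HTTPS migration in Node.js, assuming a certificate has already been obtained from a certificate authority such as Let’s Encrypt. File paths, content, and ports are illustrative.

```typescript
// A minimal sketch of serving over HTTPS while redirecting plain HTTP.
import http from "node:http";
import https from "node:https";
import { readFileSync } from "node:fs";

const tls = {
  key: readFileSync("/etc/ssl/private/example.com.key"),
  cert: readFileSync("/etc/ssl/certs/example.com.crt"),
};

// Serve the real site over HTTPS.
https.createServer(tls, (_req, res) => {
  res.writeHead(200, { "content-type": "text/html" });
  res.end("<h1>Served securely</h1>");
}).listen(443);

// Permanently redirect plain-HTTP traffic to its HTTPS equivalent so
// visitors (and Chrome) never see the "not secure" warning.
http.createServer((req, res) => {
  res.writeHead(301, { location: `https://${req.headers.host}${req.url}` });
  res.end();
}).listen(80);
```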