Google is sending emails to webmasters whose sites are being migrated to the search engine’s new mobile-first index. If your site is migrated, Google will use the mobile version of your site as the default for indexing and ranking, a sign that it considers your site fast enough and well optimized for mobile users.

Google previously said it would start sending notifications to websites being migrated into the mobile-first index, but the emails have only begun appearing in the wild over the past few days.

The notifications are coming a bit late, considering Google has confirmed that it began moving websites over to the mobile-first index months ago.

You can see a copy of the email as shared by The SEM Post or read the full text below:

“Mobile-first indexing enabled for <URL>

To owner of <URL>

This means that you may see more traffic in your logs from Googlebot Smartphone. You may also see that snippets in Google Search results are now generated from the mobile version of your content.

Background: Mobile-first indexing means that Googlebot will now use the mobile version of your site for indexing and ranking, to better help our (primarily mobile) users find what they’re looking for. Google’s crawling, indexing, and ranking systems have historically used the desktop version of your site’s content, which can cause issues for mobile searchers when the desktop version differs from the mobile version. Our analysis indicates that the mobile and desktop versions of your site are comparable.”

Menus aren’t just for restaurants on Google anymore. Google My Business has finally expanded its menu feature to let businesses create and share a service menu describing the services they offer and their prices.

The process is very simple: select the “Info” tab in the Google My Business dashboard and start adding your services. Each menu item can have a name, description, and price, and you can group items into different sections.
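The structure described above can be sketched as a small data model, with sections grouping items that each carry a name, description, and price. The field names and sample values below are purely illustrative and are not Google’s actual API:

```python
# Illustrative sketch of a service menu: sections group items,
# and each item has a name, description, and price.
service_menu = [
    {
        "section": "Plumbing",
        "items": [
            {"name": "Drain cleaning", "description": "Clear a blocked drain", "price": 95.00},
            {"name": "Water heater install", "description": "Standard tank unit", "price": 850.00},
        ],
    },
    {
        "section": "Inspections",
        "items": [
            {"name": "Leak inspection", "description": "Whole-home leak check", "price": 60.00},
        ],
    },
]

# Example: list every service and its price.
for section in service_menu:
    for item in section["items"]:
        print(f'{section["section"]}: {item["name"]} - ${item["price"]:.2f}')
```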

Google’s Allyson Wright announced the news yesterday in the Google My Business Help forums, saying:

“Back in January we launched a new Menu editor for the food service industry. This month, we are excited to announce that we have expanded our menu editor to now include additional services.

“Businesses in health & beauty, and service businesses, such as plumbers and florists, now have the ability to add their menu of services directly to their listing through their Google My Business account. Same as the food establishment menu editor, this feature will only be available if the listing is not currently connected to a third party provider and for listings in English speaking locals. If your listing is currently displaying an incorrect menu, please see this help center link for more information on how to correct or remove the link.”

The expanded menu feature is only available to those who do not currently have their listing connected to a third-party menu provider and businesses in English speaking locations.

Days before Facebook CEO Mark Zuckerberg is set to testify to Congress about the social network’s role in allowing Cambridge Analytica to exploit user data, Facebook is working to make it easy to see if your information was shared with the scandal-plagued analytics firm.

Facebook has published a new section within its help center called “How can I tell if my info was shared with Cambridge Analytica?” You can also quickly find the page by searching “Cambridge” or “Cambridge Analytica” in the Facebook search bar.

If you’re logged into your Facebook account, the page will automatically tell you whether your data may have been collected by the “This is your digital life” app.

Since information came to light about how Cambridge Analytica may have misused user data, the company’s relationship with Facebook has come under scrutiny. In response, the social network has taken several steps to regain the public’s trust, such as launching this latest page. It has also introduced a data abuse bounty program that lets users report app developers who may be misusing data.

Questions will likely remain long after Mark Zuckerberg’s testimony tomorrow, but at least you can now check for yourself whether your account details are safe or have been exploited.

Twitter has shut down numerous accounts accused of artificially increasing the popularity of their posts using a method called “tweetdecking.”

Tweetdecking takes its name from TweetDeck, an app that can schedule posts ahead of time. Groups of accounts coordinated to mass-retweet each other’s content in order to force it to go viral.

In this case, most of the removed accounts were using the technique to steal content (including memes and jokes) to make their accounts more prominent. These accounts would then use their artificial popularity to promote other accounts or products for financial profit.

This practice blatantly violates Twitter’s spam policy. It is also just the latest instance of users and brands gaming the system to increase their online presence.

Since the earliest days of Google, brands and “black hat” users have worked together to rig the search engine for high visibility. Usually, this took the form of buying links to appear artificially authoritative to Google’s algorithm. The search engine has since worked to eradicate the practice, but similar tricks, like buying “likes” or “retweets,” have sprung up on almost every other popular social platform.

Twitter’s latest bans are the most recent crackdown in a long-running game of whack-a-mole. Still, they are a harsh reminder that brands that try to manipulate social networks or search engines in bad faith are all but guaranteed to eventually be penalized or banned outright.

If you operate a website that frequently creates or changes pages, such as an e-commerce or publishing site, you’ve probably noticed it can take a while for Google to reflect your new content in search results.

This has led to widespread speculation about just how frequently Google indexes pages and why it seems like some types of websites get indexed more frequently than others.

In a recent Q&A video, Google’s John Mueller took the time to answer this directly. He explains how Google’s indexing bots prioritize specific types of pages that are more “important” and limit excessive stress on servers. But, in typical Google fashion, he isn’t giving away everything.

The question posed was:

“How often does Google re-index a website? It seems like it’s much less often than it used to be. We add or remove pages from our site, and it’s weeks before those changes are reflected in Google Search.”

Mueller starts by explaining that Google takes its time to crawl the entirety of a website, noting that if it were to continuously crawl entire sites in short periods of time it would lead to unnecessary strain on the server. Because of this, Googlebot actually has a limit on the number of pages it can crawl every day.

Instead, Googlebot focuses on pages that should be crawled more frequently, such as home pages or high-level category pages. These will get crawled at least every few days, while less important pages (such as individual blog posts) might take considerably longer.

You can watch Mueller’s response below or read the quoted statement underneath.

“Looking at the whole website all at once, or even within a short period of time, can cause a significant load on a website. Googlebot tries to be polite and is limited to a certain number of pages every day. This number is automatically adjusted as we better recognize the limits of a website. Looking at portions of a website means that we have to prioritize how we crawl.

So how does this work? In general, Googlebot tries to crawl important pages more frequently to make sure that the most critical pages are covered. Often this will be a website’s home page or maybe higher-level category pages. New content is often mentioned and linked from there, so it’s a great place for us to start. We’ll re-crawl these pages frequently, maybe every few days, maybe even much more frequently depending on the website.”
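The prioritization Mueller describes can be pictured as a toy scheduler: crawl under a fixed daily page budget, taking the most important page types first. This is a minimal sketch only; the budget, page types, and priority values below are invented for illustration and are not Google’s actual values:

```python
# Toy model of prioritized crawling under a daily page budget.
DAILY_CRAWL_BUDGET = 5  # max pages fetched per day for this site (illustrative)

# Lower number = higher priority; home and category pages come first.
PAGE_PRIORITY = {
    "home": 0,
    "category": 1,
    "post": 2,
}

def pick_pages_for_today(pages):
    """Return the pages to crawl today, most important first."""
    ranked = sorted(pages, key=lambda p: PAGE_PRIORITY[p["type"]])
    return ranked[:DAILY_CRAWL_BUDGET]

site = [
    {"url": "/", "type": "home"},
    {"url": "/blog/", "type": "category"},
    {"url": "/blog/post-1", "type": "post"},
    {"url": "/blog/post-2", "type": "post"},
    {"url": "/blog/post-3", "type": "post"},
    {"url": "/blog/post-4", "type": "post"},
]

today = pick_pages_for_today(site)
```

With six pages and a budget of five, the home and category pages are always fetched, while the lowest-priority blog post waits for a later day, mirroring why new or deep pages can take weeks to be re-crawled.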

Google has been encouraging webmasters to make their sites as fast as possible for years, but now it is making speed an official ranking factor.

The company announced this week that it will be launching what it is calling the “Speed Update” in July 2018, which will make page speed an official ranking signal for mobile searches.

Google recommends checking your site’s speed with its PageSpeed Insights report, as well as using tools like Lighthouse to measure page speed and improve your loading times.

As Google’s Zhiheng Wang and Doantam Phan wrote in the announcement:

“The ‘Speed Update,’ as we’re calling it, will only affect pages that deliver the slowest experience to users and will only affect a small percentage of queries. It applies the same standard to all pages, regardless of the technology used to build the page. The intent of the search query is still a very strong signal, so a slow page may still rank highly if it has great, relevant content.”

While Google says the update will only affect a “small percentage of queries,” it is impossible to tell exactly how many will be impacted. Google handles billions of queries a day, so even a small slice of that could be a substantial number of searches.

This is the first time page speed will be a ranking factor for mobile searches, but it has been a ranking factor on desktop since 2010. It makes sense to expand this to mobile, since there is a wealth of evidence showing that mobile users prioritize loading time when clicking search results. If a page doesn’t load within three to five seconds, they are likely to leave and find another relevant result.

The importance of Google reviews recently got a big boost, as it appears the number of reviews your business has on Google My Business may play a big role in determining where you appear in local search results. Thankfully, it appears you won’t have to rely solely on Google for your reviews in the future.

Google has begun integrating reviews from third-party sources like TripAdvisor and Booking.com into its Knowledge Graph cards for Google My Business listings. That means your reviews from these sites will be shown alongside your Google reviews, all in one convenient place for shoppers.

The reviews can also be filtered by source by clicking on the “All reviews” drop-down menu.

Currently, the sites being integrated are most beneficial for hotels and other similar travel-related businesses. It is unclear when or if more review services will be included in the future.

As Search Engine Land notes, this is not Google’s first foray into using third-party review sites directly within its search results. The search engine got into a lengthy legal battle with Yelp over scraping its reviews and displaying them in search results without permission. The result was that Google agreed to only use third-party reviews in its search results with explicit permission from the publisher.

Based on this, it is all but certain Google is working closely with these outside sites to integrate their reviews.

The biggest question for now is whether these third-party reviews will also factor into local search rankings. If so, businesses that have been accumulating reviews on third-party sites may see a big boost to their local rankings in the near future. Only time will tell.

It is official: after more than a year of experimenting with various types of longer tweets, Twitter is finally letting everyone tweet 280 characters at a time.

The double-sized tweets are rolling out as the default length limit for users around the world, except in Japan and Korea. The iconic 140-character limit will be phased out, although Twitter suggests the change won’t affect most tweets.

According to the blog post announcing the change, most tweets stayed below the old limit even when they had the opportunity to say more. However, “we saw when people needed to use more than 140 characters, they tweeted more easily and more often.”

Twitter also noted that “historically, 9% of tweets in English hit the character limit.” With the new extended length, that number has dropped to only 1% of tweets.

Of course, some on the platform seem outraged by the break in tradition. Most, however, including celebrities, are celebrating the longer limit with jokes and pointlessly long tweets posted just for fun.

Everyone seems to be ripping off Snapchat’s style these days, whether it’s the spread of vanishing video or “Stories.” Still, the copycats don’t seem to be hurting the platform’s popularity with its biggest demographic.

Teens still prefer Snapchat over any other platform – and it’s not even close.

The investment firm Piper Jaffray’s latest annual “Taking Stock With Teens” report surveyed over 6,100 people across 44 states. It specifically asked teens about their social media usage over the past month.

According to the results published by Adweek, almost half (47%) of all teens said Snapchat is their favorite app, up from 35% last year. The closest runner-up was Instagram, preferred by 24% of teens. Despite being the biggest social network, Facebook trailed with 9% of the vote. Lastly, Twitter and Pinterest picked up 7% and 1%, respectively.

The report also includes a number of other interesting findings about teens’ media and shopping habits, including:

  • 82% of teens say their next phone will be an iPhone
  • 23% of teens prefer to shop at specialty retailers, with 17% saying they like pure-play e-commerce retailers
  • 49% of teens say their favorite website is Amazon, while 6% choose Nike.com and 5% prefer American Eagle’s website

Everyone wishes there were a simple recipe to guarantee you’ll rank at the top of the search engines, but Google’s Gary Illyes says there is no such thing. In fact, there isn’t even a consistent set of top-three ranking factors for all content.

Instead, Illyes explains that the top ranking factors for web pages vary depending on the query being searched. By that logic, links might matter most for verifying that something is newsworthy, while page speed, content quality, and keyword usage may weigh more heavily for other types of content.

John Mueller, another prominent figure at Google, joined the discussion to suggest that worrying about optimizing for specific ranking factors is “short-term thinking.”

Surprisingly, Illyes takes it even further by saying that links – often viewed as one of the most important signals for a website – are often not a factor in the search results at all. Long-tail search queries, in particular, are likely to pull up content with few to no links.

While this can be discouraging to brands or businesses looking for specific ways to improve their site and rank higher, the overall message is clear. A holistic approach that prioritizes people’s needs and desires is bound to benefit you, while myopically focusing on individual factors will eventually leave you behind.

As Mueller suggests – if you build something awesome, Google will come.