Yelp has revealed three new features aimed at making it easier for brands to connect with customers and at improving user trust in the brands they find on the platform.

These three features – a satisfaction guarantee program, AI-powered improvements to search, and more immersive videos – make up what Yelp is calling its “most significant update in years.”

Below, we will explore each new feature and when you can expect to see it for yourself:

Yelp Guaranteed

What is it?: Yelp Guaranteed is a new customer satisfaction program that guarantees up to $2,500 back if you are not happy with home improvement work done by a verified business. Specifically, the announcement says you’ll be able to submit a claim if you have problems resolving an issue directly with the business.

Additionally, the company says verified businesses will receive a badge and may be prioritized in future search results on the site. 

When will you see it?: This program is gradually rolling out to both users and businesses in San Francisco, New York City, Chicago, Seattle, and Washington D.C., though the announcement says Yelp hopes to expand it nationwide soon.

To be a part of the Yelp Guaranteed program, your business must be a Request a Quote-enabled advertiser in these industries:

  • Movers
  • Plumbers
  • HVAC
  • Contractors
  • Landscapers
  • Electricians

Yelp says it hopes to be able to include other markets in the near future.

Enhancing Yelp Search With AI and Large Language Models

What is it?: Yelp is in the process of upgrading its search tools with a slew of new capabilities powered by AI and large language models. Through this, the company says it can better understand search intent and deliver more precise, personalized results.

The first wave of these AI-enabled features includes highlighting the parts of reviews most relevant to your needs, providing suggestions based on search intent, adding clickable category tags, and even offering a “Surprise Me” option for dining suggestions.

When will you see it?: The first wave of these AI-powered search features seems to be rolling out to users across Yelp now, though the announcement suggests you can expect to hear about more AI-led features soon.

New User Engagement Features

What is it?: Yelp is using new visual and interactive features to make its platform more engaging and to encourage users to share their own opinions and reactions. It will do this by introducing new reactions, adding new ways to share media in reviews, and providing prompts for writing more effective and informative reviews.

Specifically, the announcement highlights these three new features:

  • You can now include high-resolution videos up to 12 seconds long along with your text reviews and uploaded pictures.
  • Yelp will now suggest topics users may want to discuss when writing a review, such as “food,” “service,” or “ambiance.” Once a reviewer has covered a topic, Yelp will mark it with a green checkmark.
  • Respond to reviews with a wider range of reactions beyond just “Useful,” “Funny,” and “Cool.” Now, users can select from more options, including “Helpful,” “Thanks,” “Love This,” and “Oh no.”

For more about Yelp’s biggest update in years, read the full announcement here.

To understand and rank websites in search results, Google constantly uses tools called crawlers to find and analyze new or recently updated web pages. What may surprise you is that the search engine actually uses three different types of crawlers depending on the situation. In fact, some of these crawlers may ignore the robots.txt rules used to control how they interact with your site.

In the past week, those in the SEO world were surprised by the reveal that the search engine had begun using a new crawler called GoogleOther to relieve the strain on its main crawlers. Amid this, I noticed some asking, “Google has three different crawlers? I thought it was just Googlebot” – the most well-known crawler, which the search engine has used for over a decade.

In reality, the company uses quite a few crawlers, and it would take a while to go into exactly what each one does, as the list compiled by Search Engine Roundtable shows.

However, Google recently updated a help document called “Verifying Googlebot and other Google crawlers” that breaks all these crawlers into three specific groups. 

The Three Types of Google Web Crawlers

Googlebot: The first type of crawler is easily the most well-known and recognized. Googlebot is the crawler used to index pages for the company’s main search results, and it always observes the rules set out in robots.txt files.

Special-case Crawlers: In some cases, Google will create crawlers for very specific functions, such as AdsBot, which assesses web page quality for those running ads on the platform. Depending on the situation, these crawlers may ignore the rules dictated in a robots.txt file.

User-triggered Fetchers: When a user does something that requires the search engine to verify information (when the Google Site Verifier is triggered by the site owner, for example), Google uses special fetchers dedicated to those tasks. Because the user initiates the process, these fetchers ignore robots.txt rules entirely.
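To make the distinction more concrete, here is a minimal Python sketch using the standard library’s robots.txt parser – not Google’s actual enforcement logic – showing how the same set of rules is evaluated for different user agents. The rules, paths, and the “SomeOtherBot” agent are hypothetical placeholders.

    from urllib import robotparser

    # Hypothetical robots.txt rules, for illustration only.
    rules = [
        "User-agent: Googlebot",      # Googlebot always honors its rules
        "Disallow: /private/",
        "User-agent: AdsBot-Google",  # a special-case crawler that, per the
        "Allow: /",                   # help doc, may ignore rules anyway
        "User-agent: *",
        "Disallow: /drafts/",
    ]

    parser = robotparser.RobotFileParser()
    parser.parse(rules)

    # Evaluate the same URL as each agent would if it followed the rules.
    for agent in ("Googlebot", "AdsBot-Google", "SomeOtherBot"):
        print(agent, parser.can_fetch(agent, "https://example.com/private/page"))
    # Googlebot is blocked from /private/, while the other two agents are not.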

Why This Matters

Understanding how Google analyzes and processes the web can help you optimize your site for the best performance. Additionally, it is important to identify the crawlers used by Google and filter them out of your analytics tools, or they can appear as false visits or impressions.
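If your analytics show suspicious traffic claiming to be Googlebot, one long-standing way to verify a crawler is a reverse DNS lookup followed by a forward lookup, as Google’s verification help document describes. Below is a rough Python sketch of that check; the sample IP address is just a placeholder pulled from an imaginary server log.

    import socket

    def is_google_crawler(ip_address):
        """Rough check that a visiting IP really belongs to a Google crawler."""
        try:
            # Reverse lookup: the hostname should end in googlebot.com or google.com.
            hostname, _, _ = socket.gethostbyaddr(ip_address)
        except OSError:
            return False
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        try:
            # Forward lookup: the hostname must resolve back to the original IP.
            return ip_address in socket.gethostbyname_ex(hostname)[2]
        except OSError:
            return False

    # Placeholder IP from a log entry claiming to be Googlebot.
    print(is_google_crawler("66.249.66.1"))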

For more, read the full help article here.

Microsoft Advertising has begun alerting users that it is cutting support for Twitter across its platform starting April 25, 2023.

This means you will no longer be able to include Twitter in Multiplatform Smart Campaigns or manage your Twitter account through the Digital Marketing Center (DMC). That includes being unable to schedule, create, or manage tweets, and tweet drafts will be removed from the platform on that day.

Additionally, advertisers will be unable to view or track past tweets’ performance and engagement on the platform.

Why This Matters

This is notable for a few reasons. 

DMC is one of the leading tools used to manage multiple social media accounts from one location, including the crucial ability to respond to DMs from all major social networks without signing into multiple accounts and pages. 

This is because the tool is offered for free to all advertisers on Microsoft Ads and is integrated with Microsoft’s other social and paid ad tools for businesses. Once Twitter support is removed, many advertisers will face a significant hurdle in managing social ads and engagement efficiently.

This is also a major loss for Twitter, which has struggled to bring back advertisers since the takeover by Elon Musk. Estimates indicate that up to half of Twitter’s biggest advertisers have left since his purchase of the company.

Just this week, Musk has been making appearances at major marketing and advertising conferences in a bid to attract brands back to Twitter, but the loss of access through major social ad tools will only make Twitter a harder sell to the brands which have already left.

Meanwhile, Microsoft generated over $12 billion in digital ad revenue last year and is poised to make even bigger gains this year.

Typically when a site starts ranking worse for one keyword, the effect is also seen for several of the other keywords it ranks for. So what does it mean when a website only loses rankings for one keyword? According to Google’s Gary Illyes, there are a few reasons a site might experience this rare problem. 

In a recent Google SEO Office Hours episode, Illyes addressed the issue while answering a question from a site owner whose site had effectively disappeared from the search results for a specific keyword – despite consistently ranking at the top of results in the past.

The Most Likely Culprit

Unfortunately, the most common cause of an issue like this is simply that competitors have outranked your website, according to Illyes:

“It’s really uncommon that you would completely lose rankings for just one keyword. Usually, you just get out-ranked by someone else in search results instead if you did indeed disappear for this one particular keyword.”

Other Potential Causes

If you believe the drop in rankings for a specific keyword is the result of something other than increased competition, Illyes recommends investigating whether the issue is isolated to a specific area or is part of a larger, ongoing global problem.

“First, I would check if that’s the case globally. Ask some remote friends to search for that keyword and report back. If they do see your site, then it’s just a ‘glitch in the matrix.’”

Those without friends around the globe can effectively accomplish the same thing by using a VPN to change their search location.

On the other hand, if your site is absent from results around the globe, it may be indicative of a bigger issue – potentially the result of changes to your website:

“If they don’t [find your website], then next I would go over my past actions to see if I did anything that might have caused it.”

Lastly, Gary Illyes offers a few other potential causes of a sudden ranking drop.

Technical issues such as problems with crawling or indexing can prevent your website from appearing in search results. 

Sudden changes to your backlink profile – either through mass disavowing links or through the use of low-quality or spammy links – can also trigger issues with Google. If you are hit with a manual penalty for low-quality links, it is highly likely your site will stop ranking for at least one keyword (if not several).

To hear the full discussion, check out the video below:

If you’ve ever wanted to know the secret to getting the best response from LinkedIn ads, Vidmob’s recently released report on global advertising trends on the platform may be exactly what you’re looking for.

The report breaks down every element of LinkedIn ads to show which visual elements, text, and creative strategies performed the best on the platform for driving B2B engagement. 

The findings come from over 800 million ad impressions tracked across brands’ paid video ad campaigns on LinkedIn within North America and the EMEA region (Europe, the Middle East, and Africa).

What Elements are Most Effective?

The report says video ads containing the following elements performed better than those that did not:

  • Videos that display messaging in the first quarter of the video saw a 149% boost in views through the first 25% of the ad.
  • Ads that are 7-15 seconds long received a 54% lift in engagement rates.
  • Videos with high text contrast saw a 102% lift in views through the first 25%.
  • Videos featuring a person within the first quarter received a 175% increase in views through the opening 25%.
  • Ads that include the phrase “Get a Quote” received 33% higher click-through rates.
  • When ads show a brand logo within the first 2 seconds, they see a 17% increase in click-through rates.

Key Takeaways

From all the data and findings, Vidmob highlights five key points advertisers should be aware of:

Video: “While short and sweet is the usual go-to for video length, for awareness plays in the tech industry, audiences are engaged with mid-length content too.”

Color: “Don’t shy away from bright hues in upper funnel assets.”

Terminology: “Make use of Tech industry jargon and relevant imagery in creative assets, such as ‘Data,’ ‘Leader,’ ‘Expert,’ and ‘Demand.’”

Functional Benefits: “Focus on functional benefits and how the products or solutions can add efficiencies for the audience.”

Branding: “Make sure some reference to the brand appears upfront in the first 3 sec of the creative.”

For more interesting findings about the best elements to use at each stage of the customer awareness journey, download the full report from Vidmob here.

The rise of AI continues as Google Ads has started testing the use of artificial intelligence to help advertisers create the messaging for their ads.

The feature appears to be a very limited test that uses AI to generate suggestions for headlines and descriptions. Notably, when Google Ads Liaison Ginny Marvin confirmed that the ad platform is testing AI tools, she noted the test is “unrelated to Bard,” Google’s recently released AI system.

According to user reports, the AI tool helps create responsive search ads within Google Ads.

Responsive search ads are an existing ad format that already uses machine learning to optimize which combination of premade headlines and descriptions each person sees.

In this small beta test, users can instead let AI create headline and description suggestions based on information about your business. Specifically, the prompt asks you to “describe the product or service you’re advertising and what makes it unique in a few sentences.”

You can then select from the suggestions Google offers or decide to write your own.

It is unclear how soon you can expect to see this feature rolled out to more advertisers but it shows that Google is seriously working to utilize AI technology in every area of its platform, including Google Ads.

Following leaks, Twitter has made its content recommendation algorithm completely available to the public – laying bare how the social network works and what sort of posts are most likely to succeed. 

Along with a lot of interesting details involving which types of content are best received, how your interactions with others affect you, and how poor grammar may hurt you, the code also includes a number of details that have alarmed human rights groups.

Let’s talk about all the most notable parts below:

Likes Count Most

Likes may seem like the easiest type of interaction you can get from other users, but don’t underestimate them. The code shows that likes are easily the most important type of engagement compared to retweets or replies.

The system assigns points to each type of interaction, with each point giving a boost to a post’s visibility. In the current system, a single like gives a post 30 points. Retweets are not far behind, giving 20 points. Shockingly, replies are practically meaningless in comparison, giving just a single point for each reply. 

This means that all the conversation in the world doesn’t matter if users aren’t also liking your posts. 
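For a back-of-the-envelope sense of how lopsided those weights are, here is a toy Python sketch using the point values reported above – a simplified illustration, not Twitter’s actual ranking code.

    # Point values as reported from the released source code.
    WEIGHTS = {"like": 30, "retweet": 20, "reply": 1}

    def engagement_score(likes, retweets, replies):
        """Toy score showing how the reported weights stack up."""
        return (likes * WEIGHTS["like"]
                + retweets * WEIGHTS["retweet"]
                + replies * WEIGHTS["reply"])

    # Sixty replies are worth no more than two likes under these weights.
    print(engagement_score(likes=0, retweets=0, replies=60))  # 60
    print(engagement_score(likes=2, retweets=0, replies=0))   # 60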

Pics and Videos Are Important

Less surprisingly, the source code confirms that posts containing visual media are largely preferred over plain text. 

Linking Out Is Frowned Upon

This is another one that has been suspected for a while but has been confirmed by the source code. 

For the most part, Twitter does not want you to link users off the platform. It makes a simple type of sense. Twitter’s goal is to keep people on the app as long as possible, and each link represents a chance for users to leave the app. 

To combat this, the site largely downplays posts containing links unless they are coming from accounts that already have a lot of interaction on their posts. 

Twitter Blue Helps

Elon Musk has not been shy about his plans to make Twitter more of a pay-for-play platform through his pet project, Twitter Blue. Since its reveal, one of the touted benefits of the premium subscription has been increased visibility, and that is backed up by the source code.

This is not a guarantee you’ll suddenly get a ton of exposure if you sign up for Twitter Blue, though. Accounts are just given points toward their overall algorithm ranking if they are subscribed. 

Poor Spelling Costs You

For a site with such limited options for editing posts after they go live (available only to Twitter Blue subscribers, and only for 30 minutes after a tweet is posted), Twitter is surprisingly uptight about spelling and grammar. The source code indicates that posts with poor spelling and grammar may be demoted as a form of spam prevention.

The Controversial Stuff

Lastly, we come to the most eyebrow-raising details in the source code: how Twitter appears to be handling international conflicts and vulnerable groups.

Based on the available code, Twitter seems to be limiting the visibility of posts about the ongoing war in Ukraine by treating them as hate-based content. This is particularly problematic as many humanitarian aid groups have relied on social networks like Twitter to drive donations, awareness, and support.

Another hot-button topic that seems to be directly targeted by Twitter’s code is content about transgender individuals. Users found that several terms relating to transgender people are suppressed on the platform, particularly when sharing links to other sites containing those terms. Meanwhile, activists say the platform is not limiting pages containing hateful terms.

Musk says part of the reason for making the source code public is the hope of identifying problems the team can quickly fix to improve the recommendation algorithm. As such, the code should be seen as a work in progress. Still, it is worth taking time to familiarize yourself with everything in the recently released code if you drive sales for your business through Twitter.