
Single-page websites have taken over the internet lately. More and more businesses are choosing to streamline their sites to get straight to the point, and newer brands are opting to avoid paying to create a dozen or more pages. The question is whether single-page websites are actually good for you and your brand.

Admittedly, there are a few clear benefits to single-page websites. They tend to work well on mobile devices and load more quickly than sites with numerous pages. Since more than half of all searches now come from mobile devices, a single-page site can help ensure people on smartphones don’t have to wait to check out your stuff.

There are also a variety of free tools that can help set up a stylish one-page site, while designing a full multi-page site can cost thousands of dollars.

However, it’s not all roses and sunshine when it comes to single-page websites. Here are a few things to consider before you decide to go minimalist with a one-page website for your brand:

Lack of info

The biggest problem with single-page websites is simply the difficulty of cramming everything your potential customers want to know onto one page.

On a multiple-page website, you can publish all sorts of content and valuable information that helps your visitors become informed and excited about your products or services. When you cut all that down to one page, you lose a lot of the details that can be a deciding factor in turning someone from a visitor to a customer.

Even with a great layout that includes separate sections for different topics or types of services, it is nearly impossible to include everything your variety of visitors want to find.

SEO limitations

Since you can’t fit in as many types of content or information, it is also hard to target as many keywords or phrases as you otherwise could. Sites with lots of pages of content can cover a huge range of keywords related to your business, helping you rank on diverse search pages that might draw in different parts of your audience.

On that note, it can also be hard to keep your site looking “active” since you are only updating it for new products or when you change your business’s phone number. Rather than keeping people up-to-date, single-page websites are typically planned to be “evergreen” and need minimal updating. That may sound nice, but search engines tend to prefer sites that are regularly adding new information and resources – not stagnant sites that are only updated a few times a year at most.

Cost vs. Effect

One of the most common reasons I hear for going single-page is that it is cheaper. You don’t have to hire a web designer to customize numerous pages with unique layouts and images or have a writer fill all those pages with copy and content.

That can all be tantalizing, but as the saying goes: “you get what you pay for.” If you use a free or cheap template for your single-page website, you risk looking bland and forgettable because others are using that exact same layout.

Even if you hire someone to create a great single-page layout, it is hard to make your page effective. Strategic approaches get cut down to fit the limited mold, and your copy becomes broad in an attempt to cover as much as possible as quickly as possible.

All in all, single-page sites require a ton of work to be anywhere near as effective as a traditional website. You have to fight an uphill battle to optimize your site for search engines and hope your content is so precise that you aren’t missing any details your customers want. So, if you are choosing a one-page site for its low cost, realize it will cost you one way or another down the road.

The final verdict

As with any trend, it can be hard to resist the urge to be up-to-date and hip. But, trends are fleeting because they often aren’t fully thought through. There will always be a small number of brands who benefit from going to a single-page site, but most discover it’s not as great or easy as they thought it would be.


Facebook is changing its mind on branded content, though it isn’t ready to completely dive in. The social media giant is revising its policy on branded content, which is anything that specifically “mentions or features a third party product, brand, or sponsor.”

With the latest change, Facebook is allowing any verified page to share branded content; however, the content must be labeled as such. This is a significant turn from the company’s previous stance against branded content and ads.

To help brands with verified pages label their branded content, Facebook is also offering a new tool to assist in tagging brands mentioned in the content. The company says the tool must be used every time branded content is published.

By changing their policy, Facebook is allowing companies with existing partnerships or sponsorships to bring their relationship into the world’s largest social network.

Notably, branded content can also be pushed via sponsored posts or leveraged in paid ads. The company says the new tool will hopefully lead to greater transparency while continuing to help users find valuable information.

When a brand is tagged in a piece of branded content, they will also receive access to post insights and can share the boosted post themselves.

While this is a notable change, Facebook still has some restrictions. Here is what Facebook will still not allow:

“…our branded content guidelines prohibit overly promotional features, such as persistent watermarks and pre-roll advertisements. Additionally, cover photos and profile pictures must not feature third party products, brands, or sponsors. Branded content integrations that are allowed to be posted on Facebook include content like product placement, endcards, and marketer’s logos.”

Duplicate content has been an important topic for webmasters for years. It should be absolutely no secret by now that duplicate content is generally dangerous to a site and usually offers no value, but there are occasional reasons for duplicate content to exist.

Of course, there are very real risks with hosting a significant amount of duplicate content, but often the fear is larger than the actual risk of penalties – so long as you aren’t taking advantage and purposely posting excessive duplicate content.

Google’s John Mueller puts the risk of using duplicate content in the best context. According to Mueller, there are two real issues with duplicate content.

The first issue is that Google’s algorithms typically automatically choose one URL to show for specific content in search, and sometimes you don’t get to choose. The only way you can effectively let Google know your preference is by using redirects or canonical tags, and that isn’t foolproof.
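To make that concrete, here is a minimal sketch of those two options, using a hypothetical Flask app (the route names and URLs are invented for illustration): a permanent redirect points crawlers at the preferred URL outright, while a canonical tag keeps both pages live but declares which one should be indexed.

```python
# Minimal sketch, assuming Flask is installed; the routes and URLs are
# hypothetical examples, not taken from the article.
from flask import Flask, redirect

app = Flask(__name__)

# Option 1: a 301 (permanent) redirect tells crawlers the old URL has
# moved, so only the preferred URL gets indexed.
@app.route("/old-product-page")
def old_product_page():
    return redirect("/product-page", code=301)

# Option 2: keep both URLs live, but declare the preferred one with a
# canonical tag in the duplicate page's <head>:
#   <link rel="canonical" href="https://example.com/product-page">

if __name__ == "__main__":
    app.run()
```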

Secondly, if you are hosting a ton of duplicate content, it can make crawling overwhelming for the server, which will delay new content from being noticed as quickly as it should be.

Still, John said that in most cases, “reasonable amounts of duplication […] with a strong server” is not a huge problem, as “most users won’t notice the choice of URL and crawling can still be sufficient.”

Does Google control the internet? Of course, no one controls the entire existence of the internet, but the major search engine has a huge influence on how we browse the web. So, it is interesting to hear a Google representative entirely downplay their role in managing content online.

Barry Schwartz noticed the statement in a Google Webmaster Help forums thread about removing content from showing up in Google. It’s a fairly common question, but the response had some particularly interesting information. According to Eric Kuan from Google, the search engine doesn’t play a part in controlling content on the internet.

His statement reads:

Google doesn’t control the contents of the web, so before you submit a URL removal request, the content on the page has to be removed. There are some exceptions that pertain to personal information that could cause harm. You can find more information about those exceptions here: https://support.google.com/websearch/answer/2744324.

Now, what Kuan said is technically true. Google doesn’t have any control over what is published to the internet. But, Google is the largest gateway to all that content, and plays a role in two-thirds of searches.

This raises some notable questions for website owners and searchers alike. We rarely consider how much of an influence Google has in deciding what information we absorb, but they hold some very important keys to areas of the web we otherwise wouldn’t find.

As a publisher, you are obliged to follow Google’s guidelines in order to be made visible to the huge wealth of searchers. It is an agreement which often toes uncomfortable lines as the search engine has grown into a massive corporation encompassing many aspects of our lives and future technology.

When you begin marketing and optimizing your site online to become more visible, you should keep this agreement in mind. A lot of people think of Google as a system to take advantage of in order to reach a larger audience. While you can attempt to do that, you are breaking the agreement with the search engine and they can penalize your efforts at any time.

When the news broke of Facebook’s updates to their News Feed, advertisers everywhere scrambled to analyze the changes. Well, it appears we got it a bit wrong. One of the most reported elements of the updates aimed at “rewarding high-quality content” focused on the supposed removal of memes from user feeds, but it doesn’t appear that is actually the case.

Facebook really is revamping how it judges the quality of the content it delivers to users. In a recent interview with AllThingsD, Facebook’s News Feed Manager Lars Backman gave some insight into the changes and denied that there is an attack on memes. Instead, Backman says it is a broader effort “to provide user value” in the News Feed.

The most interesting takeaway from the interview is that, for the most part, Facebook isn’t differentiating between different forms of content. As Backman told Peter Kafka:

Are you paying attention to the source of the content? Or is it solely the type of content?

Right now, it’s mostly oriented around the source. As we refine our approaches, we’ll start distinguishing more and more between different types of content. But, for right now, when we think about how we identify “high quality,” it’s mostly at the source level.

So something that comes from publisher X, you might consider high quality, and if it comes from publisher Y, it’s low quality?

Yes.

However, while this sums up Facebook’s overall approach, Backman did say there is a specific type of content they are trying to do away with, and it isn’t memes. Instead, Facebook is targeting content that blatantly begs for likes or shares, such as “Like this if you are having a good day!”

Backman explained: “So, when the text or photo has a call to action, those posts naturally do much better. And in a traditional feed ranking, where we’re evaluating just on the number of likes, those things all did very well.”

In a way, Facebook is simply leveling the playing field, because those types of content offered very little to users aside from surface-level interaction, yet they consistently did very well on likes and shares, which made them more visible. However, if your user base responds well to the average meme, you shouldn’t be afraid to use them as part of your content.

Google recently integrated its Panda algorithm into its normal indexing process, and this has raised a whole new batch of questions from webmasters. The most common question is how site owners will know if their site has been hit by Panda. Really, it was only a matter of time before Matt Cutts, the noted Google engineer and head of webspam, addressed the issue.

And that is what he did earlier this week, when Cutts used one of his Webmaster Help videos to respond to Nandita B.’s question, “how will a webmaster come to know whether her site is hit by Panda? And, if her site is already hit, how she will know that she has recovered from Panda?”

Now that the Panda algorithm is a part of the normal search indexing process, finding out if you’ve been affected by Panda won’t be nearly as easy. You can’t just compare your analytics reports with recorded dates for Panda rollouts. But Cutts does have some suggestions if you think your site has been affected.

Cutts said, “basically, we’re looking for high quality content. So if you think you might be affected by Panda, the overriding goal is to make sure that you’ve got high quality content.”

Of course, high-quality content in this context means sites that offer real value to users. Integrating Panda appears to have been one of the last steps in a broader shift toward a focus on high-quality content. Google has been suggesting a focus on value for a long time, and now it is officially a large part of the normal search algorithm.

Content marketing is becoming more and more of a talking point for SEO services as more people realize they can’t trick search engines with pages built strictly for crawlers and shady link profiles. What many don’t realize is that this shift is also changing the standards for content.

Content has always been an important part of an SEO campaign, but its status is indisputably being raised within Google as the company tightens its guidelines. You can’t just stuff keywords into a wall of barely legible text and expect Google to think your page has value. Now your content must be informative, useful, and genuinely captivating.

The biggest question for most clients is what type of content they need. If they’ve done any research, they might come to you with a list of content types, like infographics and webinars, that they “need” according to “the internet.” More likely, you will just get asked the broad question of what type of content will be needed. Once you know their business, you can probably make some good guesses, but making a blanket statement about what type of content works is a fool’s errand.

While blog posts are always a good place to start when creating content, infographics or ebooks only help in relevant areas. A nursing home probably won’t be able to find a relevant infographic, because that format doesn’t work well for portraying the complex and focused care they will be giving loved ones. Similarly, videos don’t make much sense for a photographer, and tutorials don’t have much place on a medical website.

Most importantly, the content has to be quality, and it has to fit your company’s needs. Even if you are delivering daily blog posts and guest blogs, they won’t have any effect if they aren’t worth reading. The best way to know what type of content you need to be making is to try to think like your competitors and customers. If you can make users happy with your website, you are already well on your way to making Google happy with your content.

Speaking of your competitors, you can do competitive analysis to find out what is working for them. I don’t mean scoping out their site and seeing what they have that you don’t. Instead, you can use a number of sites and tools to see what is doing well on their site compared to yours, which will give you a good indication of what type of content you should be making. Josh McCoy collected a few of those tools so you can get the jump on your competition.

Andre Weyher worked on Google’s Search Quality/Webspam team for two years, according to his LinkedIn profile. Recently, he spoke with James Norquay, a digital/search marketer from Australia, offering insight that possibly could help search marketers and web marketers understand Google’s SEO strategies.

Since Matt McGee published his initial report on Weyher’s comments on Search Engine Land, Google has released a short statement denying Weyher worked on webspam engineering or algorithms, but Weyher stands by his statements.

According to Weyher, everyone on the search quality team covers a specific “market” and his was content quality and backlink profiles.

Speaking about the Penguin update, Weyher says, “Everyone knew that Penguin would be pointed at links, but I don’t think many people expected the impact to be as large as it turned out to be. At this stage a webmaster is out of his mind to still rely on techniques that were common practice 8 months ago.”

He emphasizes the shift to anchor text ratios, which has been a frequent piece of SEO advice following the Penguin update. His statement could confirm Google’s perspective on anchor text ratios.

If Weyher’s statements are to be believed, they could be a source of great insight into Google’s SEO strategies. However, even if you take Weyher’s words as truth, he would have been just one member of Google’s huge team, which he confirms when he says in his defense of the original interview, “No one within Google knows the entire picture apart from maybe 1 engineer, 1 level under Larry Page.”