Monday, March 9, 2020

Crawled — Currently Not Indexed: A Coverage Status Guide

Posted by cml63

Google’s Index Coverage report is absolutely fantastic because it gives SEOs clearer insights into Google’s crawling and indexing decisions. Since its roll-out, we’ve used it almost daily at Go Fish Digital to diagnose technical issues at scale for our clients.

Within the report, there are many different “statuses” that provide webmasters with information about how Google is handling their site content. While many of the statuses provide some context around Google’s crawling and indexation decisions, one remains unclear: “Crawled — currently not indexed”.

Since seeing the “Crawled — currently not indexed” status reported, we’ve heard from several site owners inquiring about its meaning. One of the benefits of working at an agency is being able to get in front of a lot of data, and because we’ve seen this message across multiple accounts, we’ve begun to pick up on trends from reported URLs.

Google’s definition

Let’s start with the official definition. According to Google’s official documentation, this status means: “The page was crawled by Google, but not indexed. It may or may not be indexed in the future; no need to resubmit this URL for crawling.”

So, essentially what we know is that:

  1. Google is able to access the page
  2. Google took time to crawl the page
  3. After crawling, Google decided not to include it in the index

The key to understanding this status is to think of reasons why Google would “consciously” decide against indexation. We know that Google isn’t having trouble finding the page, but for some reason it feels users wouldn’t benefit from finding it.

This can be quite frustrating, as you might not know why your content isn’t getting indexed. Below I’ll detail some of the most common reasons our team has seen to explain why this mysterious status might be affecting your website.

1. False positives

Priority: Low

Our first step is to always perform a few spot checks of URLs flagged in the “Crawled — currently not indexed” section for indexation. It’s not uncommon to find URLs that are getting reported as excluded but turn out to be in Google’s index after all.

For example, here’s a URL that’s getting flagged in the report for our website: https://ift.tt/2IsfU2O

However, when using a site search operator, we can see that the URL is actually included in Google’s index. You can check this yourself by adding “site:” before the URL in a Google search.

If you’re seeing URLs reported under this status, I recommend starting by using the site search operator to determine whether the URL is indexed or not. Sometimes, these turn out to be false positives.
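
If you have a long list of flagged URLs, you can speed these spot checks up a bit. Below is a minimal sketch (the list of flagged URLs is a placeholder) that simply opens a “site:” query for each URL in your default browser so you can eyeball the results:

```python
# Minimal sketch: open a "site:" query for each flagged URL in the browser.
# The URLs below are placeholders; paste in your own export from the report.
import time
import urllib.parse
import webbrowser

flagged_urls = [
    "https://www.example.com/some-flagged-page/",
    "https://www.example.com/another-flagged-page/",
]

for url in flagged_urls:
    query = urllib.parse.quote(f"site:{url}")
    webbrowser.open(f"https://www.google.com/search?q={query}")
    time.sleep(2)  # small pause so the browser isn't flooded with tabs
```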

Solution: Do nothing! You’re good.

2. RSS feed URLs

Priority: Low

This is one of the most common examples that we see. If your site utilizes an RSS feed, you might be finding URLs appearing in Google’s “Crawled — currently not indexed” report. Many times these URLs will have the “/feed/” string appended to the end. They can appear in the report like this:

Google is finding these RSS feed URLs linked from the primary page. They’ll often be linked to using a "rel=alternate" element. WordPress plugins such as Yoast can automatically generate these URLs.

Solution: Do nothing! You're good.

Google is likely selectively choosing not to index these URLs, and for good reason. If you navigate to an RSS feed URL, you’ll see an XML document like the one below:

While this XML document is useful for RSS feeds, there’s no need for Google to include it in the index. This would provide a very poor experience as the content is not meant for users.

3. Paginated URLs

Priority: Low

Another extremely common reason for the “Crawled — currently not indexed” exclusion is pagination. We will often see a good number of paginated URLs appear in this report. Here we can see some paginated URLs appearing from a very large e-commerce site:

Solution: Do nothing! You’re good.

Google will need to crawl through paginated URLs to get a complete crawl of the site. This is its pathway to content such as deeper category pages or product description pages. However, while Google uses the pagination as a pathway to access the content, it doesn’t necessarily need to index the paginated URLs themselves.

If anything, make sure that you don’t do anything to impact the crawling of the individual pagination. Ensure that all of your pagination contains a self-referential canonical tag and is free of any “nofollow” tags. This pagination acts as an avenue for Google to crawl other key pages on your site, so you’ll definitely want Google to continue crawling it.
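
If you want to audit this at scale, a quick pass over a crawl export can confirm that paginated URLs self-canonicalize and aren’t nofollowed. The sketch below is a rough illustration; the file name and column headers are assumptions based on a typical crawler export, so adjust them to whatever your tool produces:

```python
# Minimal sketch: flag paginated URLs that don't self-canonicalize or that
# carry a nofollow directive. File name and column headers are assumptions.
import pandas as pd

crawl = pd.read_csv("internal_html.csv")

# Treat URLs with a "?page=" parameter or "/page/" path segment as pagination
paginated = crawl[crawl["Address"].str.contains(r"[?&]page=|/page/", na=False)]

problems = paginated[
    (paginated["Canonical Link Element 1"] != paginated["Address"])
    | paginated["Meta Robots 1"].str.contains("nofollow", case=False, na=False)
]

print(problems[["Address", "Canonical Link Element 1", "Meta Robots 1"]].to_string(index=False))
```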

4. Expired products

Priority: Medium

When spot-checking individual pages that are listed in the report, a common problem we see across clients is URLs that contain text noting “expired” or “out of stock” products. Especially on e-commerce sites, it appears that Google checks to see the availability of a particular product. If it determines that a product is not available, it proceeds to exclude that product from the index.

This makes sense from a UX perspective as Google might not want to include content in the index that users aren’t able to purchase.

However, if these products are actually available on your site, this could result in a lot of missed SEO opportunity. By excluding the pages from the index, your content isn’t given a chance to rank at all.

In addition, Google doesn’t just check the visible content on the page. There have been instances where we’ve found no indication within the visible content that the product is not available. However, when checking the structured data, we can see that the “availability” property is set to “OutOfStock”.

It appears that Google is taking clues from both the visible content and structured data about a particular product's availability. Thus, it’s important that you check both the content and schema.

Solution: Check your inventory availability.

If you’re finding products that are actually available getting listed in this report, you’ll want to check all of your products that may be incorrectly listed as unavailable. Perform a crawl of your site and use a custom extraction tool like Screaming Frog's to scrape data from your product pages.

For instance, if you want to see at scale all of your URLs with schema set to “OutOfStock”, you can set the “Regex” to: "availability":"http://schema.org/OutOfStock"

This should automatically scrape all of the URLs with this property:

You can export this list and cross-reference with inventory data using Excel or business intelligence tools. This should quickly allow you to find discrepancies between the structured data on your site and products that are actually available. The same process can be repeated to find instances where your visible content indicates that products are expired.
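
As a rough illustration, here’s what that cross-reference step might look like in Python instead of Excel. The file names and column headers are hypothetical, so swap in whatever your crawl export and inventory feed actually use:

```python
# Minimal sketch: find pages whose structured data says "OutOfStock" while
# the inventory feed says the product is actually in stock.
# File names and column headers are hypothetical.
import pandas as pd

crawl = pd.read_csv("sf_custom_extraction.csv")   # columns: Address, Availability
inventory = pd.read_csv("inventory.csv")          # columns: URL, InStock (True/False)

merged = crawl.merge(inventory, left_on="Address", right_on="URL", how="left")

mismatches = merged[
    merged["Availability"].str.contains("OutOfStock", na=False)
    & merged["InStock"].fillna(False).astype(bool)
]

mismatches.to_csv("availability_mismatches.csv", index=False)
print(f"{len(mismatches)} pages may be incorrectly marked as out of stock")
```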

5. 301 redirects

Priority: Medium

One interesting example we’ve seen appear under this status is destination URLs of redirected pages. Often, we’ll see that Google is crawling the destination URL but not including it in the index. However, upon looking at the SERP, we find that Google is indexing a redirecting URL. Since the redirecting URL is the one indexed, the destination URL is thrown into the “Crawled — currently not indexed” report.

The issue here is that Google may not be recognizing the redirect yet. As a result, it sees the destination URL as a “duplicate” because it is still indexing the redirecting URL.

Solution: Create a temporary sitemap.xml.

If this is occurring on a large number of URLs, it is worth taking steps to send stronger consolidation signals to Google. This issue could indicate that Google isn’t recognizing your redirects in a timely manner, leading to unconsolidated content signals.

One option might be setting up a “temporary sitemap”. This is a sitemap that you can create to expedite the crawling of these redirected URLs. This is a strategy that John Mueller has previously recommended.

To create one, you will need to reverse-engineer redirects that you have created in the past:

  1. Export all of the URLs from the “Crawled — currently not indexed” report.
  2. Match them up in Excel with redirects that have been previously set up.
  3. Find all of the redirects that have a destination URL in the “Crawled — currently not indexed” bucket.
  4. Create a static sitemap.xml of these URLs with Screaming Frog (or with a quick script like the one sketched below).
  5. Upload the sitemap and monitor the “Crawled — currently not indexed” report in Search Console.

The goal here is for Google to crawl the URLs in the temporary sitemap.xml more frequently than it otherwise would have. This will lead to faster consolidation of these redirects.
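
If you’d rather script step four than build the file in Screaming Frog, here’s a minimal sketch that writes a static sitemap.xml from a plain-text list of destination URLs (the input file name is hypothetical):

```python
# Minimal sketch: build a static sitemap.xml from a list of destination URLs.
# "redirect_destinations.txt" is a hypothetical file with one URL per line.
import xml.etree.ElementTree as ET

with open("redirect_destinations.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for url in urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = url

ET.ElementTree(urlset).write("temporary-sitemap.xml", encoding="utf-8", xml_declaration=True)
print(f"Wrote {len(urls)} URLs to temporary-sitemap.xml")
```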

6. Thin content

Priority: Medium

Sometimes we see URLs included in this report that are extremely thin on content. These pages may have all of the technical elements set up correctly and may even be properly internally linked to; however, when Google runs into these URLs, it finds very little actual content on the page. Below is an example of a product category page where there is very little unique text:

This product listing page was flagged as “Crawled — Currently Not Indexed”. This may be due to very thin content on the page.


This page is likely either too thin for Google to think it’s useful or there is so little content that Google considers it to be a duplicate of another page. The result is Google removing the content from the index.

Here is another example: Google was able to crawl a testimonial component page on the Go Fish Digital site (shown above). While this content is unique to our site, Google probably doesn’t believe that the single sentence testimonial should stand alone as an indexable page.

Once again, Google has made the executive decision to exclude the page from the index due to a lack of quality.

Solution: Add more content or adjust indexation signals.

Next steps will depend on how important it is for you to index these pages.

If you believe that the page should definitely be included in the index, consider adding additional content. This will help Google see the page as providing a better experience to users. 

If indexation is unnecessary for the content you're finding, the bigger question becomes whether or not you should take the additional steps to strongly signal that this content shouldn’t be indexed. The “Crawled — currently not indexed” report is indicating that the content is eligible to appear in Google’s index, but Google is electing not to include it.

There also could be other low quality pages to which Google is not applying this logic. You can perform a general “site:” search to find indexed content that meets the same criteria as the examples above. If you’re finding that a large number of these pages are appearing in the index, you might want to consider stronger initiatives to ensure these pages are removed from the index, such as adding a “noindex” tag, serving a 404, or removing them from your internal linking structure completely.
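
To find these thin pages at scale, a simple filter on a crawl export can help. The sketch below is only a starting point; the file name, column headers, and the 200-word threshold are all assumptions to adjust for your own site:

```python
# Minimal sketch: surface pages with very little body copy from a crawl export.
# File name, column headers, and the word-count threshold are assumptions.
import pandas as pd

crawl = pd.read_csv("internal_html.csv")
thin = crawl[crawl["Word Count"] < 200].sort_values("Word Count")

print(f"{len(thin)} pages under 200 words:")
print(thin[["Address", "Word Count"]].to_string(index=False))
```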

7. Duplicate content

Priority: High

When evaluating this exclusion across a large number of clients, this is the highest priority we’ve seen. If Google sees your content as duplicate, it may crawl the content but elect not to include it in the index. This is one of the ways that Google avoids SERP duplication. By removing duplicate content from the index, Google ensures that users have a larger variety of unique pages to interact with. Sometimes the report will label these URLs with a “Duplicate” status (“Duplicate, Google chose different canonical than user”). However, this is not always the case.

This is a high priority issue, especially on a lot of e-commerce sites. Key pages such as product description pages often include the same or similar product descriptions as many other results across the Web. If Google recognizes these as too similar to other pages internally or externally, it might exclude them from the index altogether.

Solution: Add unique elements to the duplicate content.

If you think that this situation applies to your site, here’s how you test for it:

  1. Take a snippet of the potential duplicate text and paste it into Google.
  2. In the SERP URL, append the following string to the end: “&num=100”. This will show you the top 100 results (a quick sketch for building these URLs follows this list).
  3. Use your browser’s “Find” function to see if your result appears in the top 100 results. If it doesn’t, your result might be getting filtered out of the index.
  4. Go back to the SERP URL and append the following string to the end: “&filter=0”. This should show you Google’s unfiltered result (thanks, Patrick Stox, for the tip).
  5. Use the “Find” function to search for your URL. If you see your page now appearing, this is a good indication that your content is getting filtered out of the index.
  6. Repeat this process for a few URLs with potential duplicate or very similar content you’re seeing in the “Crawled — currently not indexed” report.
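
If you’re checking more than a handful of snippets, here’s a minimal sketch for assembling the filtered and unfiltered SERP URLs described above (the snippet text is a placeholder):

```python
# Minimal sketch: build the filtered and unfiltered Google SERP URLs for a
# snippet of potentially duplicated text. The snippet is a placeholder.
import urllib.parse

snippet = "a sentence of potentially duplicated product description text"
base = "https://www.google.com/search?q=" + urllib.parse.quote(f'"{snippet}"')

filtered_url = base + "&num=100"             # top 100 results, duplicates filtered
unfiltered_url = base + "&num=100&filter=0"  # same query with the duplicate filter off

print("Filtered:  ", filtered_url)
print("Unfiltered:", unfiltered_url)
```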

If you’re consistently seeing your URLs getting filtered out of the index, you’ll need to take steps to make your content more unique.

While there is no one-size-fits-all standard for achieving this, here are some options:

  1. Rewrite the content to be more unique on high-priority pages.
  2. Use dynamic properties to automatically inject unique content onto the page.
  3. Remove large amounts of unnecessary boilerplate content. Pages with more templated text than unique text might be getting read as duplicate.
  4. If your site is dependent on user-generated content, inform contributors that all provided content should be unique. This may help prevent instances where contributors use the same content across multiple pages or domains.

8. Private-facing content

Priority: High

There are some instances where Google’s crawlers gain access to content that they shouldn’t have access to. If Google is finding dev environments, it could include those URLs in this report. We’ve even seen examples of Google crawling a particular client’s subdomain that is set up for JIRA tickets. This caused an explosive crawl of the site, which focused on URLs that shouldn’t ever be considered for indexation.

The issue here is that Google’s crawl of the site isn’t focused, and it’s spending time crawling (and potentially indexing) URLs that aren’t meant for searchers. This can have massive ramifications for a site’s crawl budget.

Solution: Adjust your crawling and indexing initiatives.

This solution is going to be entirely dependent on the situation and what Google is able to access. Typically, the first thing you want to do is determine how Google is able to discover these private-facing URLs, especially if it’s via your internal linking structure.

Start a crawl from the home page of your primary subdomain and see if any undesirable subdomains are able to be accessed by Screaming Frog through a standard crawl. If so, it’s safe to say that Googlebot might be finding those exact same pathways. You’ll want to remove any internal links to this content to cut Google’s access.
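
To see exactly which hostnames a crawl is discovering, you can summarize the export. The sketch below is a rough illustration; the file name, the “Address” column, and the expected hostname are assumptions to adapt to your own crawl:

```python
# Minimal sketch: count the hostnames found in a crawl export so unexpected
# subdomains stand out. File name, column name, and hostnames are assumptions.
from collections import Counter
from urllib.parse import urlparse

import pandas as pd

crawl = pd.read_csv("internal_all.csv")
hosts = Counter(urlparse(url).netloc for url in crawl["Address"].dropna())

expected = {"www.example.com"}  # hostnames you actually want Google to reach
for host, count in hosts.most_common():
    flag = "" if host in expected else "  <-- unexpected"
    print(f"{host}: {count} URLs{flag}")
```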

The next step is to check the indexation status of the URLs that should be excluded. Is Google sufficiently keeping all of them out of the index, or were some caught in the index? If Google isn’t indexing a large amount of this content, you might consider adjusting your robots.txt file to block crawling immediately. If not, “noindex” tags, canonicals, and password-protected pages are all on the table.

Case study: duplicate user-generated content

For a real-world example, this is an instance where we diagnosed the issue on a client site. This client is similar to an e-commerce site as a lot of their content is made up of product description pages. However, these product description pages are all user-generated content.

Essentially, third parties are allowed to create listings on this site. However, the third parties were often adding very short descriptions to their pages, resulting in thin content. The issue occurring frequently was that these user-generated product description pages were getting caught in the “Crawled — currently not indexed” report. This resulted in missed SEO opportunity as pages that were capable of generating organic traffic were completely excluded from the index.

When going through the process above, we found that the client’s product description pages were quite thin in terms of unique content. The pages that were getting excluded only appeared to have a paragraph or less of unique text. In addition, the bulk of on-page content was templated text that existed across all of these page types. Since there was very little unique content on the page, the templated content might have caused Google to view these pages as duplicates. The result was that Google excluded these pages from the index, citing the “Crawled — currently not indexed” status.

To solve for these issues, we worked with the client to determine which of the templated content didn’t need to exist on each product description page. We were able to remove the unnecessary templated content from thousands of URLs. This resulted in a significant decrease in “Crawled — currently not indexed” pages as Google began to see each page as more unique.

Conclusion

Hopefully, this helps search marketers better understand the mysterious “Crawled — currently not indexed” status in the Index Coverage report. Of course, there are likely many other reasons that Google would choose to categorize URLs like this, but these are the most common instances we’ve seen with our clients to date.

Overall, the Index Coverage report is one of the most powerful tools in Search Console. I would highly encourage search marketers to get familiar with the data and reports as we routinely find suboptimal crawling and indexing behavior, especially on larger sites. If you’ve seen other examples of URLs in the “Crawled — currently not indexed” report, let me know in the comments!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!



Friday, March 6, 2020

Defense Against the Dark Arts: Why Negative SEO Matters, Even if Rankings Are Unaffected

Posted by rjonesx.

Negative SEO can hurt your website and your work in search, even when your rankings are unaffected by it. In this week's Whiteboard Friday, search expert Russ Jones dives into what negative SEO is, what it can affect beyond rankings, and tips on how to fight it.

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

All right, folks. Russ Jones here and I am so excited just to have the opportunity to do any kind of presentation with the title "Defense Against the Dark Arts." I'm not going to pretend like I'm a huge Harry Potter fan, but anyway, this is just going to be fun.

But what I want to talk about today is actually pretty bad. It's the reality that negative SEO, even if it is completely ineffective at doing its primary goal, which is to knock your website out of the rankings, will still play havoc on your website and the likelihood that you or your customers will be able to make correct decisions in the future and improve your rankings.

Today I'm going to talk about why negative SEO still matters even if your rankings are unaffected, and then I'm going to talk about a couple of techniques that you can use that will help abate some of the negative SEO techniques and also potentially make it so that whoever is attacking you gets hurt a little bit in the process, maybe. Let's talk a little bit about negative SEO.

What is negative SEO?

The most common form of negative SEO is someone who would go out and purchase tens of thousands of spammy links or hundreds of thousands even, using all sorts of different software, and point them to your site with the hope of what we used to call "Google bowling," which is to knock you out of the search results the same way you would knock down a pin with a bowling ball.

The hope is that it's sort of like a false flag campaign, that Google thinks that you went out and got all of those spammy links to try to improve your rankings, and now Google has caught you and so you're penalized. But in reality, it was someone else who acquired those links. Now to their credit, Google actually has done a pretty good job of ignoring those types of links.

It's been my experience that, in most cases, negative SEO campaigns don't really affect rankings the way they're intended to in most cases, and I give a lot of caveats there because I've seen it be effective certainly. But in the majority of cases all of those spammy links are just ignored by Google. But that's not it. That's not the complete story. 

Problem #1: Corrupt data

You see, the first problem is that if you get 100,000 links pointing to your site, what's really going on in the background is that there's this corruption of data that's important to making decisions about search results. 

Pushes you over data limits in GSC

For example, if you get 100,000 links pointing to your site, it is going to push you over the limit of the number of links that Google Search Console will give back to you in the various reports about links.

Pushes out the good links

This means that there are probably links, ones that you should know about or care about, that don't show up in the report simply because Google cuts off at 100,000 total links in the export.

Well, that's a big deal, because if you're trying to make decisions about how to improve your rankings and you can't get to the link data you need because it's been replaced with hundreds of thousands of spammy links, then you're not going to be able to make the right decision. 

Increased cost to see all your data

The other big issue here is that there are ways around it.

You can get the data for more than 100,000 links pointing to your site. You're just going to have to pay for it. You could come to Moz and use our Link Explorer tool for example. But you'll have to increase the amount of money that you're spending in order to get access to the accounts that will actually deliver all of that data.

The one big issue sitting behind all of this is that even though we know Google is ignoring most of these links, they don't label that for us in any kind of useful fashion. Even after we can get access to all of that link data, all of those hundreds of thousands of spammy links, we still can't be certain which ones matter and which ones don't.

Problem #2: Copied content

That's not the only type of negative SEO that there is out there. It's the most common by far, but there are other types. Another common type is to take the content that you have and distribute it across the web in the way that article syndication used to work. So if you're fairly new to SEO, one of the old methodologies of improving rankings was to write an article on your site, but then syndicate that article to a number of article websites and these sites would then post your article and that article would link back to you.

Now the reason why these sites would do this is because they would hope that, in some cases, they would outrank your website and in doing so they would get some traffic and maybe earn some AdSense money. But for the most part, that kind of industry has died down because it hasn't been effective in quite some time. But once again, that's not the whole picture. 

No attribution

If all of your content is being distributed to all of these other sites, even if it doesn't affect your rankings, it still means there's the possibility that somebody is getting access to your quality content without any kind of attribution whatsoever.

If they've stripped out all of the links and stripped out all of the names and all of the bylines, then your hard earned work is actually getting taken advantage of, even if Google isn't really the arbiter anymore of whether or not traffic gets to that article. 

Internal links become syndicated links

Then on the flip side of it, if they don't remove the attribution, all the various internal links that you had in that article in the first place that point to other pages on your site, those now become syndicated links, which are part of the link schemes that Google has historically gone after.

In the same sort of situation, it's not really just about the intent behind the type of negative SEO campaign. It's the impact that it has on your data, because if somebody syndicates an article of yours that has let's say eight links to other internal pages and they syndicate it to 10,000 websites, well, then you've just got 80,000 links that should have been internal links but are now external links pointing to your site.

We actually do know just a couple of years back several pretty strong brands got in trouble for syndicating their news content to other news websites. Now I'm not saying that negative SEO would necessarily trigger that same sort of penalty, but there's the possibility. Even if it doesn't trigger that penalty, chances are it's going to sully the waters in terms of your link data.

Problem #3: Nofollowed malware links & hacked content

There are a couple of other miscellaneous types of negative SEO that don't get really talked about a lot. 

Nofollowed malware links in UGC

For example, if you have any kind of user-generated content on your site, like let's say you have comments for example, even if you nofollow those comments, the links that are included in there might point to things like malware.

We know that Google will ultimately identify your site as not being safe if it finds these types of links. 

Hacked content

Unfortunately, in some cases, there are ways to make it look like there are links on your site that aren't really under your control through things like HTML injection. For example, you can actually do this to Google right now.

You can inject HTML onto the page of part of their website that makes it look like they're linking to someone else. If Google actually crawled itself, which luckily they don't in this case, if they crawled that page and found that malware link, the whole domain in the Google search results would likely start to show that this site might not be safe.

Of course, there's always the issue with hacked content, which is becoming more and more popular. 

Fear, uncertainty, and doubt

All of this really boils down to this concept of FUD — fear, uncertainty, and doubt. You see it's not so much about bowling you out of the search engines. It's about making it so that SEO just isn't workable anymore.

1. Lose access to critical data

Now it's been at least a decade since everybody started saying that they used data-driven SEO tactics, data-driven SEO strategies. Well, if your data is corrupted, if you lose access to critical data, you will not be able to make smart decisions. How will you know whether or not the reason your page has lost rankings to another has anything to do with links if you can't get to the link data that you need because it's been filled with 100,000 spammy links?

2. Impossible to discern the cause of lost rankings

This leads to number two. It's impossible to discern the cause of lost rankings. It could be duplicate content. It could be an issue with these hundreds of thousands of links. It could be something completely different. But because the waters have been muddied so much, it makes it very difficult to determine exactly what's going on, and this of course then makes SEO less certain.

3. Makes SEO uncertain

The less certain it becomes, the more other advertising channels become valuable. Paid search becomes more valuable. Social media becomes more valuable. That's a problem if you're a search engine optimization agency or a consultant, because you have the real likelihood of losing clients because you can't make smart decisions for them anymore because their data has been damaged by negative SEO.

It would be really wonderful if Google would actually show us in Google Search Console what links they're ignoring and then would allow us to export only the ones they care about. But something tells me that that's probably beyond what Google is willing to share. So do we have any kind of way to fight back? There are a couple.

How do you fight back against negative SEO?

1. Canonical burn pages

Chances are if you've seen some of my other Whiteboard Fridays, you've heard me talk about canonical burn pages. Real simply, when you have an important page on your site that you intend to rank, you should create another version of it that is identical and that has a canonical link pointing back to the original. Any kind of link building that you do, you should point to that canonical page.

The reason is simple. If somebody does negative SEO, they're going to have two choices. They're either going to do it to the page that's getting linked to, or they're going to do it to the page that's getting ranked. Normally, they'll do it to the one that's getting ranked. Well, if they do, then you can get rid of that page and just hold on to the canonical burn page because it doesn't have any of these negative links.

Or if they choose the canonical burn page, you can get rid of that one and just keep your original page. Yes, it means you sacrifice the hard earned links that you acquired in the first place, but it's better than losing the possibility in the future altogether. 

2. Embedded styled attribution

Another opportunity here, which I think is kind of sneaky and fun, is what I call embedded styled attribution.

You can imagine that my content might say "Russ Jones says so and so and so and so." Well, imagine surrounding "Russ Jones" by H1 tags and then surrounding that by a span tag with a class that makes it so that the H1 tag that's under it is the normal-sized text.

Well, chances are if they're using one of these copied content techniques, they're not copying your CSS style sheet as well. When that gets published to all of these other sites, in giant, big letters it has your name or any other phrase that you really want. Now this isn't actually going to solve your problem, other than just really frustrate the hell out of whoever is trying to screw with you.

But sometimes that's enough to get them to stop. 

3. Link Lists

The third one, the one that I really recommend is Link Lists. This is a feature inside of Moz's Link Explorer, which allows you to track the links that are pointing to your site. As you get links, real links, good links, add them to a Link List, and that way you will always have a list of links that you know are good, that you can compare against the list of links that might be sullied by a negative SEO campaign.
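
As a rough illustration of that comparison, here’s a minimal sketch that diffs a curated list of known-good linking pages against a fresh backlink export. The file names and the “Source URL” column header are hypothetical, so match them to your own exports:

```python
# Minimal sketch: compare a known-good link list against a fresh backlink
# export to surface new, possibly spammy referring pages.
# File names and the column header are hypothetical.
import csv

def load_urls(path, column):
    with open(path, newline="") as f:
        return {row[column].strip() for row in csv.DictReader(f) if row.get(column)}

known_good = load_urls("link_list_known_good.csv", "Source URL")
latest_export = load_urls("backlink_export_latest.csv", "Source URL")

suspect = latest_export - known_good
print(f"{len(suspect)} referring pages are not on the known-good list:")
for url in sorted(suspect):
    print(url)
```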

By using Link Lists, you can discern the difference between what's actually being ignored by Google, at least to some degree, and what actually matters. I hope this is helpful to some degree. But unfortunately, I've got to say, at the end of the day, a sufficiently well-run negative SEO campaign can make the difference in whether or not you use SEO in the future at all.

It might not knock you out of Google, but it might make it so that other types of marketing are just better choices. So hopefully this has been some help. I'd love to talk with you in the comments about different ways of dealing with negative SEO, like how to track down who is responsible. So just go ahead and fill those comments up with any questions or ideas.

I would love to hear them. Thanks again and I look forward to talking to you in another Whiteboard Friday.

Video transcription by Speechpad.com





Wednesday, March 4, 2020

Benchmark for Success: What Your Vertical Can Achieve With Content Marketing

Posted by Domenica

You’ve produced a piece of content you thought was going to be a huge success, but the results were underwhelming.

You double and triple checked the content for all the crucial elements: it’s newsworthy, data-driven, emotional, and even a bit controversial, but it failed to “go viral”. Your digital PR team set out to pitch it, but writers didn’t bite.

So, what's next?

Two questions you might ask yourself are:

  • Do I have unrealistic link expectations for my link-building content?
  • Is my definition of success backed by data-driven evidence?

Fractl has produced thousands of content marketing campaigns across every topic — sports, entertainment, fashion, home improvement, relationships — you name it. We also have several years’ worth of campaign performance data that we use to learn from our successes and mistakes.

In this article, I’m going to explain how businesses and agencies across seven different niches can set realistic expectations for their link-building content based on the performance of 626 content projects Fractl has produced and promoted in the last five years. I’ll also walk through some best practices for ensuring your content reaches its highest potential.

Managing expectations across verticals

You can’t compare apples to oranges. Each beat has its own unique challenges and advantages. Content for each vertical has to be produced with expert-level knowledge of how publishers within each vertical behave.

We selected the following common verticals for analysis:

  • Health and fitness
  • Travel
  • Sex and relationships
  • Finance
  • Technology
  • Sports
  • Food and drink

Across the entire sample of 626 content projects, on average, a project received 23 dofollow links and 88 press mentions in total. Some individual vertical averages didn’t deviate much from these averages, while other niches did.

Of course, you can’t necessarily expect these numbers when you just start dipping your toes in content marketing or digital PR. It’s a long-term investment, and it usually takes at least six months to a year before you get the results you’re looking for.

A “press mention” refers to any time a publisher wrote about the campaign. A press mention could involve any type of link (dofollow, nofollow, simple text attribution, etc.). We also looked at dofollow links individually, as they provide more value than a nofollow link or text attribution. For campaigns that went “viral” and performed well above the norm, we excluded them in the calculation so as not to skew the averages higher. 

Based on averages from these 626 campaigns, are your performance expectations too high or too low?

Vertical-specific content considerations

Of course, there are universal principles that you should apply to all content no matter the vertical. The data needs to be sound. The graphic assets need to be pleasing to the eye and easy to understand. The information needs to be surprising and informative.

But when it comes to vertical-specific content considerations, what should you pay attention to? What tactics or guidelines apply to one niche that you can disregard for other niches? I solicited advice from the senior team at Fractl and asked what they look out for when making content for different verticals. All have several years of experience producing and promoting content across every vertical and niche. Here’s what they said:

Sex and dating


For content relating to sex and relationships, it’s important to err on the side of caution.

“Be careful not to cross the line between ‘sexy’ content and raunchy content,” says Angela Skane, Creative Strategy. “The internet can be an exciting place, but if something is too out-there or too descriptive, publishers are going to be turned off from covering your content.”

Even magazine websites like Cosmopolitan — a publication known for its sex content — have editorial standards to make sure lines aren’t crossed. For example, when pitching a particularly risqué project exploring bedroom habits of men and women, we learned that just because a project is doing well over at Playboy or Maxim doesn’t mean it would resonate with the primarily female audience over at Cosmopolitan.

Especially be aware of anything that could be construed as misogynistic or pit women against each other. It’s likely not the message your client will want to promote, anyway.

Finance

Given that money is frequently cited as one of the topics to avoid in polite dinner conversation, there's no doubt that talking and thinking about money evokes a lot of emotion in people.

“Finance can seem dry at first glance, but mentions of money can evoke strong emotions. Tapping into financial frustrations, regrets, and mistakes makes for highly entertaining and even educational content,” says Corie Colliton, Creative Strategy. “For example, one of my best finance campaigns featured the purchases people felt their partners wasted money on. Another showed the amount people spend on holiday gifts — and the number who were in debt for a full year after the holidays as a result.”

Emotion is one of the drivers of social sharing, so use it to your advantage when producing finance-related content.

We also heard from Chris Lewis, Account Strategy: “Relate to your audience. Readers will often try to use financial content marketing campaigns as a way to benchmark their own financial well-being, so giving people lots of data about potential new norms helps readers relate to your content.”

People want to read content and be able to picture themselves within it. How do they compare to the rest of America, or their state, or their age group? Relatability is key in finance-related content.

Sports

A little healthy competition never hurt anyone, and that’s why Tyler Burchett, Promotions Strategy, thinks you should always utilize fan bases when creating sports content: “Get samples from different fan bases when possible. Writers like to pit fans against each other, and fans take pride in seeing how they rank.”

Food and drink

According to Chris Lewis, don’t forgo design when creating marketing campaigns about food: “Make sure to include good visuals. People eat with their eyes!”

If the topic for which you’re creating content typically has visual appeal, it’s best to take advantage of that to draw people into your content. Have you ever bought a recipe book that didn’t include photos of the food?

Technology

Think tech campaigns are just about tech? Think again. Matt Gillespie, Data Science, says: “Technology campaigns are always culture and human behavior campaigns. Comparing devices, social media usage, or more nuanced topics like privacy and security, can only resonate with a general audience if it ties to more common themes like connection, safety, or shared experience — tech savvy without being overly technical.”

Travel

When creating content for travel, it’s important to make sure there are actionable takeaways in the content. If there aren’t, it can be hard for publishers to justify covering it.

“Travel writers love to extract ‘tips’ from the content they're provided. If your project provides helpful information to travelers or little-known statistics on flights and amenities, you're likely to gain a lot of traction in the travel vertical,” says Delaney Kline, Brand Promotions. “Come up with these ideal statistics before creating your project and use them as a template for your work.”

Health and fitness

In the health and wellness world, it can seem like everyone is giving advice. If you’re not a doctor, however, err on the side of caution when speaking about specific topics. Try not to pit any particular standard against another. Be careful around diet culture and mental health topics, specifically.

“Try striking a balance between physical and mental well-being, particularly being careful to not glorify or objectify one standard while demeaning others,” says Matt Gillespie, Data Science. “Emphasize overall wellness as opposed to focusing on a single area. In this vertical, you need to be especially careful with whatever is trending. Do the legwork to understand the research, or lack thereof, behind the big topics of the moment.”

Improving content in any vertical

While you can certainly tailor your content production and promotion to your specific niche, there are also some guidelines you can follow to improve the chances that you’ll get more media coverage for your content overall.

Create content with a headline in mind

When you begin mapping out your content, identify what you want the outcome to look like. Before you even begin, ask yourself: what do you want people to learn from your content? What are the elements of the content you’re producing that journalists will find compelling for their audiences?

For example, we wrote a survey in which we wanted to compare the levels of cooking experience across different generations. We hypothesized that we’d see some discrepancies between boomers and millennials specifically, and given that millennials ruin everything, it was a good time to join the discussion.

As it turns out, only 64% of millennials could correctly identify a butter knife. Publishers jumped at the stats revealing millennials have a tough time in the kitchen. Having a thesis and an idea of what we wanted the project to look like in advance had a tremendous positive impact on our results.

Appeal to the emotionality of people

In past research on the emotions that make content go viral, we learned that negative content may have a better chance of going viral if it is also surprising. Nothing embodies this combination of emotional drivers better than a project we did for a travel client in which we used germ swabs to determine the dirtiest surfaces on airplanes.

This campaign did so well (and continues to earn links to this day) that it’s actually excluded from our vertical benchmarks analysis as we consider it a viral outlier.

Why did this idea work? Most people travel via plane at least once a year, and everyone wants to avoid getting sick while traveling. So, a data-backed report like this one that also yielded some click-worthy headlines is sure to exceed your outreach goals.

Evergreen content wins (sometimes)

You may have noticed from the analysis above that, of the seven topics we chose to look at, the sports vertical has the lowest average dofollow links and total press mentions of any category.

For seasoned content marketers, this is very understandable. Unlike the other verticals, the sports beat is an ever-changing and fast-paced news cycle that’s hard for content marketers to have a presence in. However, for our sports clients we achieve success by understanding this system and working with it — not trying to be louder than it.

One technique we’ve found that works for sports campaigns (as well as other sectors with fast-paced news cycles such as entertainment or politics) is to come up with content that is both timely and evergreen. By capitalizing on the current interests around major sporting events (timely) and creating an idea that would work on any given day of the year (evergreen) we can produce content that's the best of both worlds, and that will still have legs once the timeliness wears off.

In a series of campaigns for one sports client, we took a look at the evolution of sports jerseys and chose teams with loyal fan bases such as the New York Yankees, Carolina Panthers, Denver Broncos, and Chicago Bears.

The sports niche has an ongoing, fast-paced news cycle that changes every day, if not every hour. Reporters are busy covering by-the-minute breaking news, games, statistics, rankings, trades, personal player news, and injuries. This makes it one of the most challenging verticals to compete in. By capitalizing on teams of interest throughout the year, we were able to squeeze projects into tight editorial calendars and earn our client some press.

For example, timing couldn’t have been better when we pitched “Evolution of the Football Jersey”. We pitched this campaign to USA Today right before the tenacious playoffs in which the Steelers and the Redskins played. Time was of the essence — the editor wrote and published this article within 24 hours and our client enjoyed a lot of good syndication from the powerful publication. In total, the one placement resulted in 15 dofollow links and over 45 press mentions. Not bad for a few transforming GIFs!

Top it off with the best practices in pitching

If you have great content and you have a set of realistic expectations for that content, all that’s left is to distribute it and collect those links and press mentions.

Moz has previously covered some of the best outreach practices for promoting your content to top-tier publishers, but I want to note that when it comes to PR, what you do is just as important as what you don’t do.

In a survey of over 500 journalists in 2019, I asked online editors and writers what their biggest PR pitch pet peeves were. When you conduct content marketing outreach, avoid these top-listed items and you’ll be good to go:



While you might get away with sending one too many follow-ups, most of the offenses on this list are just that — totally offensive to the writer you’re trying to pitch.

Avoid mass email blasts, personalize your pitch, and triple-check that the person you're contacting is receptive to your content before you hit send.

Conclusion

While there are certainly some characteristics that all great content should have, there are ways to increase the chances your content will be engaging within a specific vertical. Research what your particular audience is interested in, and be sure to measure your results realistically based on how content generally performs in your space.





Tuesday, March 3, 2020

Heart, Ear, Eye, Mind, Mouth: Local SEO Exercises for Your Least Technical Clients

Posted by MiriamEllis

When was the last time you relaxed with a client?

As a local business consultant, I know that deeper marketing insights can be discovered when you set aside formality and share experiences: a moment, a laugh, a common bond. 

When I’m looking for ways to make life easier for a client, I sometimes reflect on ancient practices like yoga, tai chi, and mindful breathing, which are increasingly understood as beneficial to human health. For a space in time, they reduce the complex world we live in to a simpler one where being, breath, movement, and focus bring the practitioner to a more intuitive state. 

Local marketing agencies can empathize with the complex world their clients inhabit. Local business owners must manage everything from rent and employee benefits to customer service, business reviews, web content, and online listings. When you take on a new client, you expect them to onboard a ton of information about marketing their brand online. Sometimes, the most basic motivations go unaddressed and get lost in assumptions and jargon — instead of decreasing client stress for your least technical clients, you can accidentally increase it. 

Today, I’ll help you create an intuitive space by sharing five simple meditation exercises you can use with your agency’s clients. Instead of speaking in acronyms like SEO, CTR, USPs, and GMB, let’s relax with clients by relating successful local search marketing practices to experiences people at any level of technical proficiency already understand.

Heart

To show their heart is in the right place, the Vermont Country Store publishes a customer bill of rights.

For a local business owner, there is no more important quality than having their heart in the right place when it comes to their motivation for running a company.

Yes, all of us work to earn money, but it’s the dedication to serving others that is felt by customers in every interaction with them. When customers feel that a business is there for them, it establishes the loyalty and reputation that secure local search marketing success. 

Heart meditation

Close your eyes for a few seconds and think of a time in your life when you most needed help from a business. Maybe you needed a tow truck, a veterinarian, a dentist, or a plumber. You really needed them to understand your plight, deliver the right help, and treat you as an important person who is worthy of respect. Whether you received what you required or not, remember the feeling of need. 

Now, extend that recognition beyond your own heart to the heart of every customer who feels a need for something your client can offer them.

A business owner with their heart in the right place can powerfully contribute to local search marketing by:

  • Running a customer-centric business.
  • Creating customer guarantees that are fair.
  • Creating an employee culture of respect and empowerment that extends to customers.
  • Creating a location that is clean, functional, and pleasant for all.
  • Honestly representing their products, services, location, and reputation.
  • Refraining from practices that negatively impact their customers and reputation.
  • Participating positively in the life of the community they serve.

A good local search marketing agency will help the business owner translate these basics into online content that meets customer needs, local business listings that accurately and richly represent the business, and genuine reviews that serve as a healthy and vital ongoing conversation between the brand and its customers. A trustworthy agency will ensure avoidance of any tactics that pollute the Internet with spam listings, spam reviews, negative attacks on competitors, and negative impacts on the service community. An excellent agency will also assist in finding and promoting community engagement opportunities, helping to win desirable online publicity from offline efforts.

Ear

Keter Salon of Berkeley, Calif. really listens to customers and it shows in its reviews.

Local business success is so linked to the art of listening, I sometimes think Google should replace their teardrop map markers with little ears. In the local SEO world, there are few things sadder than seeing local business profiles filled with disregarded reviews, questions, and negative photos. (Someone cue “The Sound of Silence”.)

From a business perspective, the sound of branded silence is also the sound of customers and profits trickling away. Why does it work this way? Because only 4% of your unhappy customers may actually make the effort to speak up, and if a business owner is not even hearing them, they’ve lost the ability to hear consumer demand. Let’s make sure this doesn’t happen.

Ear meditation

Close your eyes for a few seconds and listen closely to every noise within the range of your hearing. Ask yourself, “Where am I?”

The sound of typing, phone calls, and co-workers chatting might place you in an office. Sliding doors, footsteps on linoleum, and floor staff speaking might mean you’re at your client's brick-and-mortar location. Maybe it’s birdsong outside and the baby in their crib that tell you you’re working from home today. Listen to every sound that tells you exactly where you are right now.

Now, commit to listening with this level of attention and intention to the signals of customer voices, telling you exactly where a local brand is right now in terms of faults and successes. 

A business owner who keeps their ears open can actively gauge how their business is really doing with its customers by:

  • Having one-on-one conversations with customers.
  • Recording and analyzing phone conversations with customers.
  • Reading reviews on platforms like Google My Business, Yelp, Facebook and sites that are specific to their industry (like Avvo for lawyers or Healthgrades for physicians).
  • Reading the Q&A questions of customers on their Google Business Profile.
  • Reading mentions of their brand on social media platforms like Twitter, Facebook, and Instagram.
  • Reading the responses to surveys they conduct.
  • Reading the emails and form submissions the company receives.

A good local search marketing agency will help their client amass, organize, and analyze all of this sentiment to discover the current reputation of the business. From this information, you and your client can chart a course for improvement. Consider that, in this study, a 1.5 star improvement in online reputation increased consumer activity by 10%-12% and generated 13,000 more leads for the brands included. The first step to a better reputation is simply listening. 

Eye

Moz’s Local Market Analytics (Beta) helps you see your market through customer location emulation.

When your clients choose their business locations, they weigh several factors. They compare how the mantra of “location, location, location” matches their budget, and whether a certain part of town is lacking something their business could provide. They also look at the local competitors to see if the competition would be hard to beat, or if they could do the job better. Success lies in truly seeing the lay of the land.

Local search mirrors the real world. The market on the Internet is made up of the physical locations of your clients’ customers at the time they search for what your client has to offer. 

Eye meditation

You already know most of the businesses on your street, and many of them in your neighborhood. Now, with eyes wide open, start searching Google for the things your listening work has told you customers need. Where appropriate, include attributes you’ve noticed them using like “best tacos near me”, “cheapest gym in North Beach”, or “shipping store downtown.”

See how your client is ranking when a person does these types of searches while at their location. Now, walk or drive a few blocks away and try again. Go to the city perimeter and try again. Where are they ranking, and who is outranking them as you move about their geographic market?

A local business keeping its eyes open never makes assumptions about who its true competitors are or how its customers search. Instead, it:

  • Regularly assesses the competition in its market, taking into account the distance from which customers are likely to come for goods and services.
  • Regularly reviews materials assembled in the listening phase to see how customers word their requests and sentiments.
  • Makes use of tools to analyze both markets and keyword searches.

A good local search marketing agency will help with the tools needed for market and search language analysis. These findings can inform everything from what a client names their business, to how they categorize it on their Google My Business listing, to what they write about to draw in customers from all geographic points in their market. Clear vision simultaneously enables you to analyze competitors who are outranking your client and assess why they’re doing so. It can empower your client to report spammers who are outranking them via forbidden tactics. An excellent agency will help their client see their competitive landscape with eyes on the prize.

Mind

When an independent Arizona appliance chain surprised three shoppers with $10,000, it made headlines.


With hearts ready for service, ears set on listening, and eyes determined to see, you and your client have now taken in useful information about their brand and the customers who make up their local market. You know now whether they’re doing a poor, moderate, or exceptional job of fulfilling needs, and are working with them to strategize next steps. But what are those next steps? 

Mind meditation

Sit back comfortably and think of a time a business completely surprised you, or a time when an owner or employee did something so unexpectedly great, it convinced you that you were in good hands. Maybe they comped your meal when it wasn’t prepared properly, or special-ordered an item just for you, or showed you how to do something you’d never thought of before.

Recall that lightbulb moment of delight. Ask yourself how your client’s brand could surprise customers in memorable ways they would love. Create a list of those ideas.

A creative local business gives full play to the awesome imaginative powers of the brain. It gives all staff permission to daydream and brainstorm questions like:

  • What is something unexpected the business could do that would come as a delightful surprise to customers?
  • What is the most impactful thing the business could do that would be felt as a positive force in the lives of its customers?
  • What risks can the business take for the sake of benevolence, social good, beauty, renown, or joy?

A good local search marketing agency will help sort through ideas that could truly differentiate their clients from the competition and bring them closer to making the kinds of impressions that turn local brands into household names. An excellent agency will bring ideas of their own. Study “surprise and delight marketing” as it’s done on the large, corporate scale, and get it going at a local level, like this small coffee roaster in Alexandria, Va. that sells ethical java while doubling as a fundraiser for LGBTQ+ organizations.

Mouth

Put your best stories everywhere, like in this social media example. Moz Local can help with publishing those stories.


“Think before you speak” is an old adage that serves well as a marketing guideline. Another way we might say it is “research before you publish.” With heart, ear, eye, and mind, you and your client have committed, collected, analyzed, and ideated, bringing their brand to a point where it’s ready to address the public from a firm foundation.

Mouth meditation

Open your favorite word processor on your computer and type a few bars of the lyrics to your favorite song. Next, type the first three brand slogans that come to your mind. Next, type a memorable line from a movie or book. Finally, type out the words of the nicest compliment or best advice someone ever gave you.

Sit back and look at your screen. Look at how those words have stuck in your mind — you remember them all! The people who wrote and spoke those words have indelibly direct-messaged you. 

How will you message the public in a way that’s unforgettable?

A well-spoken local business masters the art of face-to-face customer conversation. In-store signage and offline media require great words, too, but local search marketing will take spoken skills onto the web, where they'll be communicated via:

  • Every page of the website 
  • Every article or blog post 
  • Social media content
  • Review responses
  • Answers to questions in features like Google Business Profile Q&A
  • Business descriptions on local business listings
  • Google Posts
  • Featured snippet content
  • Live chat
  • Email
  • Press releases
  • Interviews
  • Images on the website, business listings, and third-party platforms like Google Images and Pinterest
  • Videos on the website, YouTube, and other platforms

A good local search marketing agency will help their client find the best words, images, and videos based on all the research done together. An excellent agency will help a local business move beyond simply being discovered online to being remembered as a household name each time customer needs arise. An agency should help their clients earn links, unstructured citations, and other forms of publicity from those research efforts.

Determine to help your client be the "snap, crackle, pop", "un-Cola", "last honest pizza" with everything you publish for their local market, and to build an Internet presence that speaks well of their business 24 hours a day.

Closing pose



One of the most encouraging aspects of running and marketing a local business is that it’s based on things you already have some life experience doing: caring, listening, observing, imagining, and communicating. 

I personally should be better at technical tasks like diagnosing errors in Schema, configuring Google Search Console for local purposes, or troubleshooting bulk GMB uploads. I can work at improving in those areas, but I can also work at growing my heart, ear, eye, mind, and mouth to master serving clients and customers.

Business is technical. Business is transactional. But good business is also deeply human, with real rewards for well-rounded growth.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


https://ift.tt/2IdCbRN

Monday, March 2, 2020

2020 Google Search Survey: How Much Do Users Trust Their Search Results?

Posted by LilyRayNYC

While Google’s mission has always been to surface high-quality content, over the past few years the company has worked especially hard to ensure that its search results are also consistently accurate, credible, and trustworthy.

Reducing false and misleading information has been a top priority for Google since concerns over misinformation surfaced during the 2016 US presidential election. The search giant is investing huge sums of money and brain power into organizing the ever-increasing amounts of content on the web in a way that prioritizes accuracy and credibility.

In a 30-page whitepaper published last year, Google delineates specifically how it fights against bad actors and misinformation across Google Search, News, YouTube, Ads, and other Google products.

In this whitepaper, Google explains how Knowledge Panels — a common organic search feature — are part of its initiative to give searchers “context and diversity of perspectives to form their own views.” With Knowledge Panel results, Google provides answers to queries with content displayed directly in its organic search results (often without including a link to a corresponding organic result), potentially eliminating the need for users to click through to a website to find an answer to their query. While this feature benefits users by answering their questions even more quickly, it brings with it the danger of providing quick answers that might be misleading or incorrect.

Another feature with this issue is Featured Snippets, where Google pulls website content directly into the search results. Google maintains specific policies for Featured Snippets, prohibiting the display of content that is sexually explicit, hateful, violent, dangerous, or in violation of expert consensus on civic, medical, scientific, or historical topics. However, this doesn’t mean the content included in Featured Snippets is always entirely accurate.

According to data pulled by Dr. Pete Meyers, based on a sample set of 10,000 keywords, Google has increased the frequency with which it displays Featured Snippets as part of the search results. In the beginning of 2018, Google displayed Featured Snippets in approximately 12% of search results; in early 2020, that number hovers around 16%.

Google has also rolled out several core algorithm updates in the past two years, with the stated goal of “delivering on [their] mission to present relevant and authoritative content to searchers.” What makes these recent algorithm updates particularly interesting is how much E-A-T (expertise, authoritativeness, and trustworthiness) appears to be playing a role in website performance, particularly for YMYL (your money, your life) websites.

As a result of Google’s dedication to combating misinformation and fake news, we could reasonably expect searchers to agree that Google has improved in its ability to surface credible and trusted content. But does the average searcher actually feel that way? At Path Interactive, we conducted a survey to find out how users feel about the information they encounter in Google’s organic results.

About our survey respondents and methodology

Out of 1,100 respondents, 70% live in the United States, 21% in India, and 5% in Europe. 63% of respondents are between the ages of 18 and 35, and 17% are over the age of 46. All respondent data is self-reported.

For all questions involving specific search results or types of SERP features, respondents were provided with screenshots of those features. For questions about levels of trustworthiness or the extent to which the respondent agreed with a given statement, answers were presented on a scale of 1-5.

Our findings

Trustworthiness in the medical, political, financial, and legal categories

Given how much fluctuation we’ve seen in the YMYL category of Google with recent algorithm updates, we thought it would be interesting to ask respondents how much they trust the medical, political, financial, and legal information they find on Google.

We started by asking respondents about the extent to which they have made important financial, legal, or medical decisions based on information they found in organic search. The majority (51%) of respondents indicated that they “very frequently” or “often” make important financial decisions based on Google information, while 39% make important legal decisions and 46% make important medical decisions the same way. Only 10-13% of respondents indicated that they never make these types of important life decisions based on the information they’ve found on Google.

Medical searches

As it relates to medical searches, 72% of users agree or strongly agree that Google has improved at showing accurate medical results over time.

Breaking down these responses by age, a few interesting patterns emerge:

  • The youngest searchers (ages 18-25) are 94% more likely than the oldest searchers (65+) to strongly believe that Google’s medical results have improved over time (how “more likely” comparisons like this are calculated is sketched just after this list).
  • 75% of the youngest searchers (ages 18-25) agree or strongly agree that Google has improved in showing accurate medical searches over time, whereas only 54% of the oldest searchers (65+) feel the same way.
  • Searchers ages 46-64 are the most likely to disagree that Google’s medical results are improving over time.
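
A quick note on the “X% more likely” figures used here and throughout the rest of this write-up: the post doesn’t reproduce the underlying group percentages, but a figure like “94% more likely” is presumably the relative difference between two groups’ response rates. Here’s a minimal sketch in Python using hypothetical shares (the 33% and 17% below are illustrative, not survey data):

    # Relative difference between two groups' response rates ("X% more likely").
    # The shares below are hypothetical examples, not figures from the survey.
    def pct_more_likely(share_a: float, share_b: float) -> float:
        """How much more likely group A is than group B to give a response."""
        return (share_a - share_b) / share_b * 100

    # If, say, 33% of 18-25s vs. 17% of 65+ respondents chose "strongly agree":
    print(round(pct_more_likely(0.33, 0.17)))  # prints 94, i.e. "94% more likely"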

Next, we wanted to know if Google’s emphasis on surfacing medical content from trusted medical publications — such as WebMD and the Mayo Clinic — is resonating with its users. One outcome of recent core algorithm updates is that Google’s algorithms appear to be deprioritizing content that contradicts scientific and medical consensus (consistently described as a negative quality indicator throughout their Search Quality Guidelines).

The majority (66%) of respondents agree that it is very important to them that Google surfaces content from highly trusted medical websites. However, 14% indicated they would rather not see these results, and another 14% indicated they’d rather see more diverse results, such as content from natural medicine websites. These numbers suggest that more than a quarter of respondents may be unsatisfied with Google’s current health initiatives aimed at surfacing medical content from a set of acclaimed partners who support the scientific consensus.

We asked survey respondents about Symptom Cards, in which information related to medical symptoms or specific medical conditions is surfaced directly within the search results.

Examples of Symptom Cards. Source: https://blog.google/products/search/im-feeling-yucky-searching-for-symptoms/

Our question aimed to gauge how much searchers feel the content within Symptom Cards can be trusted.

The vast majority (76%) of respondents indicated they trust or strongly trust the content within Symptom Cards.

When looking at the responses by age, younger searchers once again reveal that they are much more likely than older searchers to strongly trust the medical content found within Google. In fact, the youngest bracket of searchers (ages 18-25) are 138% more likely than the oldest searchers (65+) to strongly trust the medical content found in Symptom Cards.

News and political searches

The majority of respondents (61%) agree or strongly agree that Google has improved at showing high-quality, trustworthy news and political content over time. Only 13% disagree or strongly disagree with this statement.

Breaking the same question down by age reveals interesting trends:

  • The majority (67%) of the youngest searchers (ages 18-25) agree that the quality of Google’s news and political content has improved over time, whereas the majority (61%) of the oldest age group (65+) only somewhat agrees or disagrees.
  • The youngest searchers (ages 18-25) are 250% more likely than the oldest searchers to strongly agree that the quality of news and political content on Google is improving over time.

Misinformation

Given Google’s emphasis on combating misinformation in its search results, we also wanted to ask respondents about the extent to which they feel they still encounter dangerous or highly untrustworthy information on Google.

Interestingly, the vast majority of respondents (70%) feel that they have encountered misinformation on Google at least sometimes, although 29% indicate they rarely or never see misinformation in the results.

Segmenting the responses by age groups reveals a clear pattern that the older the searcher, the more likely they are to indicate that they have seen misinformation in Google’s search results. In fact, the oldest searchers (65+) are 138% more likely than the youngest searchers (18-25) to say they’ve encountered misinformation on Google either often or very frequently.

Throughout the responses to all questions related to YMYL topics such as health, politics, and news, a consistent pattern emerged that the youngest searchers appear to have more trust in the content Google displays for these queries, and that older searchers are more skeptical.

This aligns with our findings from a similar survey we conducted last year, which found that younger searchers were more likely to take much of the content displayed directly in the SERP at face value, whereas older searchers were more likely to browse deeper into the organic results to find answers to their queries.

This information is alarming, especially given another question we posed asking about the extent to which searchers believe the information they find on Google influences their political opinions and outlook on the world.

The question revealed some interesting trends related to the oldest searchers: according to the results, the oldest searchers (65+) are 450% more likely than the youngest searchers to strongly disagree that information they find on Google influences their worldview.

However, the oldest searchers are also most likely to agree with this statement; 11% of respondents ages 65+ strongly agree that Google information influences their worldview. On both ends of the spectrum, the oldest searchers appear to hold stronger opinions about the extent to which Google influences their political opinions and outlook than respondents from other age brackets.

Featured Snippets and the Knowledge Graph

We also wanted to understand the extent to which respondents found the content contained within Featured Snippets to be trustworthy, and to segment those responses by age brackets. As with the other scale-based questions, respondents were asked to indicate how much they trusted these features on a scale of 1-5 (Likert scale).

According to the results, the youngest searchers (ages 18-25) are 100% more likely than the oldest searchers (ages 65+) to find the content within Featured Snippets to be very trustworthy. This aligns with a similar discovery we found in our survey from last year: “The youngest searchers (13–18) are 220 percent more likely than the oldest searchers (70–100) to consider their question answered without clicking on the snippet (or any) result.”

For Knowledge Graph results, the results are less conclusive when segmented by age. 95% of respondents across all age groups find the Knowledge Panel results to be at least “trustworthy.”

Conclusion: Young users trust search results more than older users

In general, the majority of survey respondents appear to trust the information they find on Google — both in terms of the results themselves, as well as the content they find within SERP features such as the Knowledge Panel and Featured Snippets. However, there still appears to be a small subset of searchers who are dissatisfied with Google’s results. This subset consists of mostly older searchers who appear to be more skeptical about taking Google’s information at face value, especially for YMYL queries.

Across almost all survey questions, there is a clear pattern that the youngest searchers tend to trust the information they find on Google more so than the older respondents. This aligns with a similar survey we conducted last year, which indicated that younger searchers were more likely to accept the content in Featured Snippets and Knowledge Panels without needing to click on additional results on Google.

It is unclear whether younger searchers trust information from Google more because the information itself has improved, or because they are generally more trusting of information they find online. These results may also be due to older searchers not having grown up with the ability to rely on internet search engines to answer their questions. Either way, the results raise an interesting question about the future of information online: will searchers become less skeptical of online information over time?


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!


https://ift.tt/2PDR6J3

Tips for making “tongue-swallowingly” delicious fresh ice cream with a machine

Ice cream has always been one of almost everyone’s favorite treats, thanks to its cool, fragrant, sweet flavor and variety of colors ...