The Useful Content


I just had a website fall off a cliff after the useful content update. However, the content is hand-written with an expert's advice on the subject matter. Any ideas how I move forward? Cheers.


Ammon 🎓🎩
There's a reason that Google named it the 'useful' content update, rather than the 'great content', or 'quality content', or 'expert content' update – it is hard to measure quality of writing in the true sense with machine learning other than by the one essential metric – are humans finding it useful and helpful.
An SEO friend of mine from years and years back, who was a moderator at the Cre8asiteForums with the late Bill and me, had a favourite saying: "Help is defined by the recipient". No matter how helpful you intend to be, or think you are being, only the other person can say whether it is actually helping them.
Well, just like that, Google are looking partly at whether your content is actually helping people, measurably, in determining whether or not it is 'useful'.
It's a smart move by Google in a way because it is something that scales to the current trend in so-called 'Artificial Intelligence (AI)' generated content. Even if your content has all the same info as what Google already have, and is already ranking, if they got the answers elsewhere and didn't need yours then it won't get those 'useful' metrics going. It devalues duplicate or 'me too' content, by looking at whether content is actually performing in the Search Engine Result Pages (SERPs) across everything on that site or by that publisher.
If you already have a fair amount of content in the index, but the metrics from search say it isn't useful and performing compared to similar, then Google deem it to be of low value to search users, and thus of low value to them. You start to find that Google is discovering URLs but not crawling them, or crawling content but not indexing it. It seems inevitable therefore that once this limit kicks in, some of the existing indexed content won't get re-crawled and refreshed and may also drop out of the index.
We saw and noted the first part of this change in approach kick in last November, as suddenly lots of sites had trouble getting content crawled or indexed, and many resorted to extreme means to force it (rather than fix the root cause). But at that time it only affected fresh content.
Even so, I wondered then if it would eventually apply to refreshes and recrawling too. And right about the time we'd expect to find out, along came the 'Useful Content' update that had very, very similar effects and aims.
Okay, enough about what it is meant to be, let's cut to the chase on what you can do about it.
The first and most important thing is to see if the content has attracted ANY links of its own, without being forced. If not, then that's sometimes a sign that it is not all that useful. Nobody chose to use it as a citation to prove anything, to argue with anything, or to educate anyone.
Now, if you think the lack of genuine links is simply that nobody saw it, well, that in itself is an obvious problem with an obvious solution. Get it seen. At this stage that's probably about building some links to it, maybe some paid promotion, etc. Get some traffic to it and see if it attracts some links. If it has traffic but still isn't getting links, how 'useful' is it really?
For future scalability, you want to do brand building (brand search is a big signal of usefulness) and you want to enhance your ability to drive traffic to fresh content for free – subscription newsletters work wonders here, but any other method of update that users choose works great too. Again, if nobody subscribes, how useful is the content and have you been fooling yourself as to the real quality, not according to you, or your expert, but to the people who need to read it?

Wallace ✍️ » Ammon
Great reply, with lots of salient points. It for sure has built links from the community it serves, with large brands and leaders in the space linking to it from their websites organically. Cool, lots for me to think about. Much appreciated.
Ammon 🎓🎩 » Wallace
Given what you said about links from good sources, it seems to me there is a fair chance that (unless your good links are only to a tiny fraction of content, and you have a far, far higher ratio of not so useful content than useful) this may simply be coincidental timing, and nothing to do directly with the useful content update.
I'm afraid you really need to give your site a full audit, checking in case any little technical issues have crept in and been overlooked because of looking at content. It's better to rule out other causes and know.
From what you've said otherwise – expert writing, gets links – this should NOT be an issue with the content itself.
Still worth doing the brand boosting and of course the subscription methods of getting traffic as that has a huge long-term benefit regardless. But I think you're going to have to get forensic to discover the real cause, mate.
Wallace ✍️ » Ammon
Working on it now. Deep diving into the site as I type. Again, thanks for your insights, much appreciated. Robb
Steph » Ammon
my AI content is ranking awesomely
Ammon 🎓🎩 » Steph
never said it wouldn't, and if you think I even came close, read again. I specifically said that this scales with Artificial Intelligence (AI) generated content to measure usefulness. Whether your content is useful, however it is generated, matters more than how it is generated. Google's content is generated by algorithms and machine learning. Amazon's content is put together from User Generated Content (UGC), algorithms, and machine learning. YouTube's content is generated from UGC sorted by algorithms and machine learning…
Steph » Ammon
sorry bro, my ADD kicks in pretty fast lol
Ammon 🎓🎩 » Steph
No worries. People so often bash 'machine generated content' without realizing just how much of the internet is built on it, and thus Google certainly are not going to penalize it all. It is always about the quality, the utility, not the source of creation.
Steph » Ammon
absolutely true
Nathan » Ammon
amazing insights! Do you think that G looks at the searchers' time to return back to the Search Engine Result Pages (SERPs) (and if ever) in determining usefulness?
Ammon 🎓🎩 » Nathan
Google have always had to look at time as a factor of BBTS (Bounce Back To Search Engine Result Page (SERP)) or probably what they term re-querying. Too fast and it could have been a misclick, too slow and the page may have been great but someone wants to confirm it (a second opinion). It's why they never used bounces or BBTS as a ranking signal.
My feeling is that hasn't changed. They still won't use Click Through Rate (CTR) as a direct ranking signal for a specific keyword/query. From the statements made, it seems more that they are using aggregated CTR across multiple queries and pages to determine an overall 'Site Quality and Usefulness' metric instead, applied at the domain level.
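Ammon's point about timing windows can be sketched as a toy classifier. Everything below (the threshold values, the function name, the labels) is a hypothetical illustration of the reasoning, not Google's actual implementation, which is unknown.

```python
# Toy illustration of why raw bounce-back-to-SERP (BBTS) timing is ambiguous.
# Thresholds are invented for illustration; Google's real values are unknown.

MISCLICK_MAX_SECONDS = 3       # returned almost instantly: probably a misclick
QUICK_BOUNCE_MAX_SECONDS = 20  # returned fast: page likely didn't satisfy

def classify_return_to_serp(dwell_seconds):
    """Label a single click's dwell time before the user returned to the SERP."""
    if dwell_seconds < MISCLICK_MAX_SECONDS:
        return "misclick"       # too fast to have judged the page
    if dwell_seconds < QUICK_BOUNCE_MAX_SECONDS:
        return "quick_bounce"   # plausibly unsatisfied
    return "ambiguous"          # a slow return could mean the page was great
                                # but the user wanted a second opinion

print(classify_return_to_serp(1))    # misclick
print(classify_return_to_serp(10))   # quick_bounce
print(classify_return_to_serp(120))  # ambiguous
```

The "ambiguous" bucket is the crux: because a slow return can mean either satisfaction or second-opinion seeking, a single click's timing is too noisy to use as a per-query ranking signal.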
Nathan » Ammon
yes exactly my thought formed as a question… Whether they are using BBTS in addition to the other factors of this site-wide HC signal (they do seem to state HCU is a site-wide factor anyway)
Ammon 🎓🎩 » Nathan
What is the most fascinating (and the thing that confuses newer SEO users the absolute most) is that Google have a different perspective on BBTS to that which the uninitiated/uninvested often assume.
You see, Google have long been interested in measuring bounces (and they use BBTS with a very tight timing window to define a 'bounce' of this kind) in order to assess the quality of their SERP.
The thing is that most SEO users and webmasters are invested in the site side of the equation, and thus assume Google use this to increase or decrease the ranking of various pages in the results. That is absolutely wrong. You see, Google don't believe that YOU made your site rank, they believe their algorithm did. And if their algorithm put bad results in then it means they are either looking at the wrong signals, or at the least have the balance of the importance of the signals wrong. To Google, you don't penalize or reward a site for Google already selecting it – you modify your signal weighting until it consistently delivers the better results across a wide range of queries, not just that one.
High bounces, or having to skip a lot of the top results to select a low one, or having to go several pages deep, or having to refine the query, are all signs that Google may not be serving the intent correctly, and need to refine the algorithm further.
Before now, it never affected the site that had the bounces, nor the site that was selected. So this latest change, if working the way I think it is based on all they have said, is a big change. It still isn't rewarding or penalizing a site for a specific search. But it does *seem* that it is keeping track of domains that are often a poor result, either not getting clicks commensurate with impressions, or getting higher bounces, across a whole range of search terms, as a site-wide, all-query quality signal.
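Ammon's reading, that Google may track which domains are "often a poor result" across many queries, can be sketched as a simple aggregation. The data shape and the scoring formula here are assumptions made up for illustration, not Google's actual method.

```python
# Hypothetical sketch of a site-wide, all-query quality signal: aggregate
# click and quick-bounce data for one domain across many different queries.
# The formula and field names are invented for illustration only.

def site_quality_score(query_logs):
    """query_logs: list of dicts with 'impressions', 'clicks', and
    'quick_bounces' for one domain, one dict per query."""
    impressions = sum(q["impressions"] for q in query_logs)
    clicks = sum(q["clicks"] for q in query_logs)
    bounces = sum(q["quick_bounces"] for q in query_logs)
    if impressions == 0 or clicks == 0:
        return 0.0
    ctr = clicks / impressions       # clicks commensurate with impressions?
    bounce_rate = bounces / clicks   # how often users came straight back?
    return ctr * (1.0 - bounce_rate) # one aggregate, all-query signal

logs = [
    {"impressions": 1000, "clicks": 50, "quick_bounces": 10},
    {"impressions": 400,  "clicks": 30, "quick_bounces": 3},
]
print(round(site_quality_score(logs), 4))  # prints 0.0479
```

The design point matches the thread: no single query rewards or penalises the site; only the aggregate across the whole domain moves the number.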
Nathan » Ammon
Ah very helpful! Now that you say it, it makes sense that G'd use BBTS to gauge the success of their algorithm. We all do the same thing with our sites: look at user behavior to see if our sites are accomplishing their purpose.
Your perspective really helps: Google is curating the SERPs through algorithms. It's THEIR SERP – the SERP is their page not a page of UGC. So they take ownership of the success & failure.
What you say about "keeping track of domains" reminded me of this quote from them "site-wide signal… among many other signals for ranking web pages" and this one "Any content… on sites [with]… high amounts of unhelpful content overall is less likely to perform well in Search…"
Do you think this is G introducing an algorithmic site-wide penalty for the first time? Almost history-making?
Ammon 🎓🎩 » Nathan
Actually, penalties have often been domain-wide. It's why I think of the 'helpful content' update more like Panda or Penguin, a new penalty system, rather than actually rewarding the positives, which always tends to be page specific.
Phil » Ammon
Brand search is a really good point! I'm adding that to my list of usefulness signals 😉
Ammon 🎓🎩 » Phil
Brand search is a big one, and something one of the Google statements particularly mentioned as a signal of usefulness. Sometimes the expert or authority will tell you the exact same thing you just heard from a non-expert or not as trusted source – but it still has unique value as authoritative. Brand search is a big signal of who you trust or want the opinion of.
Phil » Ammon
Absolutely! I didn't read that, so appreciate you pointing it out. Thanks!


Was the content focused on ranking or usefulness?
Are the articles focused on your keywords, with the "right Search Engine Optimization (SEO) headings"?
I don't know what your niche is, but how well do you know whatever your affiliate links point to?
This wasn't the Artificial Intelligence (AI) update… it's the helpfulness update.
Even if you've focused on helpfulness, Google may believe it already has a version of what you've written. That's one other thing to consider.

Wallace ✍️ » Phil
I wrote the content with both in mind: good content first, with SEO optimisation at the heart of how it is produced, using tools like Frase and NeuronWriter as rough guides, as well as measuring it independently against what was currently in the top 5. Most affiliate links point to Amazon and other large online stores. But not every page is an affiliate page; 50% are information posts.
Phil » Wallace
I do wonder whether tools like Frase might be part of the issue these days… and I also wonder about the patterns that are left behind by a "50% informational posts" mentality.
Also .. is it good or helpful? They are very different.
Montti » Wallace » Phil
I agree with Phil about the use of tools like Frase.
I have a lot to say about this practice of studying the Search Engine Result Pages (SERPs) and using these kinds of tools, but I'll save it for later.
Phil » Montti
I'll look forward to hearing that!
My view on the subject of content tools is that Google may boil so many of these articles down to the same article with slightly different "spins". I doubt they consider that anything more than "over-SEO'ed content".
Wallace ✍️ » Phil
Yip, I use them to analyse similar content. However, my articles are all unique. 50% of the articles on the site are information based, articles that are written to answer questions, be useful etc. The other 50% are gear guides and reviews. Real world gear guides and reviews.
Phil » Wallace
I think the point I'm trying to make is that they may be unique in words, but are they unique in entities / Natural Language Processing (NLP)?
When Google looks at the key phrases in the articles that it recognises, does it follow the same pattern as others, or is THAT unique?
Have you gone your own way, or have you looked at what Frase tell you to include and written content to satisfy that?
And LOADS of affiliates will have drummed into them "you must have 50% questions answered type of content".. This is a pattern that Google can see.
Wallace ✍️ » Phil
Yip, the content is unique; the expertise and experience I have in the niche is rather rare, and this shows in my content. I use Frase etc. to get a rough idea of what others are saying, mainly to cover those bases and then to expand upon them. But it is mainly after the fact: I write the article first. The 50% is just that – I answer lots of questions with authority, to show the users and Google that it's more than just pushing people to affiliate links.
Phil » Wallace
Without taking a good look, I don't think I can help you more.
From the outside, it seems like you believe your content is bang on – and maybe it is – and maybe it isn't – but the patterns Google may be seeing in the WAY you're SEOing could be what it sees from thousands of other affiliates who all follow the same **overall** path.
I think what you're looking for here is *combinations* of signals that lead to a pattern of Search Engine Optimization (SEO) behaviour eg. "written mostly on questions that are popular" + "don't have many strong links" + "prices on affiliate links" etc…
The answer may be … to stop optimising so hard, if you are. Stop being an SEO first.
I may be completely wrong… but I may not be.
Good luck.
Phil » Montti
indeed. Tools (AI tools and content optimisation) literally can lead to articles that aren't unique.
Not saying it's happening here, but I think Google will have these things in their crosshairs in the next few years and people will have to adapt their processes.
Google *might* have just started down a path of winning a battle, particularly against affiliates who write very unhelpful content (again not saying that's happening here)

Is it affiliate? What affiliate links? Amazon? Is it obviously affiliate, with clear messaging? Did any pages stay stable? I would dive more into the data. There are obvious reasons if you look at the data.

Wallace ✍️ » Altman
It is a clear affiliate site with a clear affiliate disclosure, cookie policy and privacy policy. Amazon and a few other large online stores. Nope, the entire site fell off a cliff; some of my key pages don't even seem to be on Google. From first page to page 10 and beyond. I will keep looking, thanks for your input.
Altman » Robb
Is your content review-style pages like "best …", which is what Google is going after? They announced it. Do you have any content with no affiliate links to build authority? Like silo'd authority?
First step, I would build content with no affiliate links and silo it.
If no results, then I would personally remove the affiliate content and launch a non-affiliate silo. Then add back affiliate content pages slowly as a test. Then point the silo's main page to the affiliate content.
No idea how you built your content authority, but that might be my approach. I see a lot of posts about Amazon affiliate sites tanking.
Wallace ✍️ » Altman
Yip, they are mostly BEST-style posts. I created lots of information posts around one main niche, with each of these in some way linking to the affiliate BEST article. Each is siloed. For argument's sake, it is around 15 info pages to one affiliate product page. I think I will change all my titles.

Seeing your website would make answering your question much easier.
Law of diminishing returns says that the first bucket of water in the middle of the desert is invaluable.
The tenth is good to have.
The thousandth has little to no value.
But you can add value to that thousandth bucket by making it thermally resistant, so it keeps the water cool. You can improve it in many other ways.
So the question is, how digestible is your content? How exciting is it, no matter the subject?
Expertly written content is not helpful unless easily consumed, because people rarely have enough attention span and patience to read it.
So, among 10 expertly written content pieces, if Google has to decide which one to rank first, isn't it logical that it would rank the one people love the most?
That brings us to the vital foundations of long-term Search Engine Optimization (SEO): User Experience (UX), Conversion Rate Optimization (CRO), and copywriting. As well as appealing design, real offline authority and trustworthiness. Expertise is the easiest part of all. 🙂

