This was initially sent as an email newsletter. Hence the use of “email” throughout.
This email is pure conjecture. Normally, I don’t spend much time trying to figure out Google updates, but this one is both baffling and big. Big as in it’s creamed a lot of sites, and big as in the traffic losses are huge.
Most updates in recent years (since Medic in 2018) would reduce traffic by 20% to 25% or so. While not fun, that’s certainly not catastrophic.
This update is hammering sites. I’ve seen so many traffic charts in Ahrefs with an absolute traffic cliff starting at the end of May. 50% traffic loss overnight is not unusual. Check out three competitors in one of my niches who basically ripped off other more established sites:
Here’s a site that I’m 99.9% certain is pure AI content:
I’m going to take a stab at suggesting what I think is going on.
This is not based on me analyzing 1 million sites or using any fancy software. This is me checking out a good number of sites in a few of my niches that got creamed. I looked at each site, not exhaustively, but enough to get a sense of its content strategy.
Here goes…
This update is targeting AI content but it’s not just hitting pure AI content. It’s hitting all content the algo interprets as AI.
I’m sure there are many variables involved, but in a nutshell, content that goes after keywords already established by another site, with similar heading tags and info, is getting pummelled.
Basically, it’s impacting sites that are “rewrites” of other sites. Replicas. Sites that go after the same keywords and publish content in the same format as other established sites. There’s nothing unique about them.
I have a hunch that the content optimizers aren’t helping. It’s just a hunch, but think about it: everyone now uses the same content optimizers, which suggest the same headings and outlines as established articles.
The first to publish on a particular topic have no problems.
In fact, in a couple of my niches, established sites are faring just fine. It’s the “replica” sites that are getting smoked.
The established sites tend (though not always) to publish more unique content. They are first to cover a keyword. If they go after an established keyword, they do so in unique ways with unique titles and content formats.
In fact, I noticed this trend after finding a few sites blatantly copying my content. Okay, they rewrote it, but it was clear they basically copied it: headings, sections, etc.
These were successful sites monetized with Mediavine, probably earning well.
What does this mean?
It’s simple. Stop copying other sites. Be unique. Publish something different. Don’t go the replica route.
I’m not guilt-free here. I hire writers. I know many writers basically rewrite stuff, so I no doubt have my share of “replica” content. But I have enough unique content to not get caught in the update. In fact, one thing I’ve done over the years is find keywords other pubs don’t cover. If the keywords are covered, I approach them in a unique way.
I can tell you what I’m doing going forward: putting a huge effort into ensuring my content is unique and offers something not already published. I do this already, but I’m ramping it up big time.
I’ll still go after established keywords. It would be foolish not to. But I’m doing it my way.
If you believe there’s merit in my theory about this update, you should do the same.
If you think I’m out to lunch, you have to admit the suggestion to be unique is still sound.
Don’t just rewrite established sites and articles. Sure, if you want to test it on some throwaway domain, go for it, but I think you’re wasting your time.
What do I mean by “be unique”? What exactly does that entail?
It could mean many things. Here are some examples:
- Never-before-published keyword: This is the most obvious. If the keyword phrase is not covered anywhere, your content will be unique no matter how you write it.
- Make it better: If you find a site ranking for a clearly outdated article or an article that’s missing a pile of good info, that’s a perfect opportunity to go after it and do it better.
- Personalize it: An easy way to make your content unique is to inject personality and personal info. I’ve been doing this for years. I ask writers to do it. Include personal examples and anecdotes. Publish your own images, charts and illustrations (I do this a lot too).
- Reduce reliance on content optimizers: Instead of using the same headings and sections suggested by content optimizers, write your own. By all means use these tools, but put your own spin on them. Write unique subheadings. Add sections and info not suggested by the optimizers.
I’ve used content optimizers extensively, although not nearly as much lately. I have one writer using one because she likes it, and that’s great. She’s my best writer and her content is some of the best stuff on my sites. She uses the optimizer as a guide.
If you hire writers, this will require tweaking your instructions: require them to inject personality and to ensure the article isn’t some replica in format and outline.
But Jon, my site is unique, my content is awesome, and I still lost a lot of traffic?
Is it though? Are your titles unique? Are your subheadings unique? Do you include fresh content angles? Do you personalize the content? Is it based on primary sources?
I was reluctant to send this email because I know some folks will take offense. Some folks will strongly disagree with me. I get that. This email is conjecture at best, but I also believe it to be sound advice regardless of the main purpose of the Google update.
I would take offense too if I received this email after I had a site take a beating. I’d interpret it as saying my site isn’t unique and that I just copied other sites. It’s a tough pill to swallow. I would take a hard look though.
Be objective. If your content really is unique, you are a casualty of an often overreaching algo update. That really is unfortunate.
Or, my entire theory is wrong and it’s something else. That could be too.
Whether I’m right or wrong, it never hurts to publish unique content with a fresh perspective.
I strongly encourage you to read as much as you can about this update. I didn’t get hit but that doesn’t mean I won’t make any adjustments in response to it.
I have a ton of room for improvement with how I go about publishing content. Not all my content is unique. This update is a wake-up call for me too.
That said, I could be way off. It’s still rolling out. Much data will be collected over the next several weeks.
Read all theories and make your own decision.
One FS reader emailed me his theory, which could very well be spot on. He believes it’s an EAT update and that sites with poor trust signals got hit. Could be. In fact, in a way my theory and his dovetail. EAT includes content quality and signalling expertise and credibility. Replica content signals a lack of expertise. So the two could be going hand in hand.
Another theory is that this update targets over-optimization. Could be. Again, that falls within my theory framework. Over-optimization is basically outlining and writing articles that already exist.
Whether I’m right or wrong, I’m going to do all I can to distinguish my sites.

Jon Dykstra is a six-figure niche site creator with 10+ years of experience. His willingness to openly share his wins and losses in the email newsletter he publishes has made him a go-to source of guidance and motivation for many. His popular “Niche site profits” course has helped thousands follow in his footsteps in creating simple niche sites that earn big.
Thanks for the analysis. This was helpful. Will you update this post when the core update is fully complete?
Looks like Google is responding to AI content. AI content can now be produced at a better level than low-quality outsourced writers can manage. The implication is that the search engines will be swamped with easy-to-produce “acceptable” content at low cost. They will have to find a way to filter it, which is hard to do; you may even often need a human to do it, but they will find a way. I feel AI and outsourced content (because outsourcers will be using AI too) will need value added to it: thorough proofreading to remove the tell-tale signs of automation, plus some unique human content on top. It won’t be usable “as is”.
Good analysis. I’ve seen a couple wannabe gurus complaining about losing their featured snippets on YouTube and suggesting there’s a featured snippet ban. I’m convinced content that looks AI-generated is why.
A lot of these AI sites scrape thousands of featured snippets and People Also Ask questions then spin the existing answer. Some bloggers pretty much do the same thing with or without AI.
They use the exact question as the H2 and riff on the answer currently displayed. They think that’s how you win featured snippets and PAA blocks.
If Google is going after AI content, stop writing like a bot. Be better.