SEO Piece 10 · ~9 min read

What I learned from publishing thousands of bad SEO pages.

A public, honest reflection on the location-page strategy that powered the previous version of this site, why it failed, and what replaced it.

Until this rebuild, the previous version of sanammunshi.com ran on a strategy that's embarrassing in retrospect: several thousand programmatically generated pages, each targeting a permutation of "[service] in [suburb]" or "[service] for [industry]". The pages were thin, repetitive, and indistinguishable from each other except for a swapped place name in the headline and the meta description. It was a textbook example of the playbook I now write pieces criticising other agencies for using.

It worked, briefly, around 2018-2020. It stopped working in 2021. It became actively harmful in 2023 with the Google helpful-content systems update. By 2024, the site was being suppressed in search results, and by 2025 most of those pages had been deindexed entirely. The rebuild this site is part of started, in honest terms, with the acknowledgment that the old strategy was wrong.

Writing this in public is uncomfortable. It's also more useful than the polite alternative — the version where I quietly retire the old pages, never mention them, and hope no one notices the discrepancy between what I publish now and what I published five years ago. Operators who are honest about what they got wrong are more credible than the ones who pretend they were always right.

What I was thinking, then

The argument for programmatic SEO pages was straightforward and, at the time, evidence-supported. Google rewarded sites that had pages targeting specific long-tail queries. Long-tail queries had less competition than short-tail queries. Programmatic generation made it cheap to produce thousands of pages, each targeting a different long-tail variant. The maths, on paper, looked great: low cost per page, low competition per query, a large volume of small wins.

The argument was right about the mechanics and wrong about the trajectory. Google's quality bar for what counted as a useful page was rising fast. Programmatic content that satisfied the 2018 quality bar didn't survive the 2021 bar, and was actively penalised under the 2023 bar. The strategy was harvesting a temporary inefficiency in Google's ability to detect thin content, and the inefficiency closed.

What I missed

I missed three things, and missing any one of them would have been enough to invalidate the strategy:

1. Google's quality bar was rising fast. Pages that cleared the 2018 bar wouldn't survive the 2021 bar, let alone the 2023 one.
2. Thin content wouldn't just stop ranking. It would become actively harmful, suppressing the rest of the site along with it.
3. The whole approach depended on a temporary inefficiency in Google's ability to detect thin content, and inefficiencies like that close.

What I learned

The lesson isn't "don't do programmatic SEO". Programmatic content can still be excellent when each page genuinely earns its place — when there's real, distinct, useful information on each variant that justifies its existence. The lesson is sharper than that:

Strategies that rely on a temporary algorithmic inefficiency are not strategies. They're trades. They have to be exited before the inefficiency closes, and most operators, including me, don't exit in time.

The compounding strategies — the ones worth investing in — are the ones aligned with the direction the algorithm and the user behaviour are both moving. In 2020 that was already clear: entity authority, demonstrated expertise, original artefacts that other sources cite, distinctive points of view. I wasn't blind to it; I just didn't act on it because the programmatic strategy was still producing acceptable short-term results.

What replaced it

The rebuild this site is part of replaces the old strategy with the opposite shape: far fewer pages, each earning its place; original artefacts that other sources can cite; demonstrated expertise rather than templated copy; and a distinctive point of view instead of a swapped place name in a headline.

The meta-lesson

The strategies that scale to thousands of pages and rely on Google's inability to spot a pattern are, by their nature, fragile to Google getting better at spotting that pattern. Google has gotten better. It will continue to get better. Generative AI search engines are going to compress this dynamic further — they're far harder to game with thin content because they're optimising for whether to confidently recommend a source, not just whether to surface it in a results page.

The operators who win in the next five years are the ones who build for entity authority and substantive expertise from the start. The operators who lose are the ones still betting on the next algorithmic loophole. I was betting on the loophole. I'm not, anymore.