What I learned from publishing thousands of bad SEO pages.
A public, honest reflection on the location-page strategy that powered the previous version of this site, why it failed, and what replaced it.
Until this rebuild, the previous version of sanammunshi.com ran on a strategy that's embarrassing in retrospect: several thousand programmatically generated pages, each targeting a permutation of "[service] in [suburb]" or "[service] for [industry]". The pages were thin, repetitive, and indistinguishable from each other except for a swapped place name in the headline and the meta description. The strategy was a textbook example of the SEO playbook I now criticise other agencies for using.
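For concreteness, the generation logic was roughly this shape. This is a simplified sketch with illustrative service and suburb lists, not the actual build:

```typescript
// Simplified sketch of the old programmatic build. The service and suburb
// lists here are illustrative; the real version iterated over thousands of
// permutations and swapped little more than the place name between pages.
const services = ["SEO", "web design"];             // illustrative
const suburbs = ["Richmond", "Fitzroy", "Carlton"]; // illustrative

interface Page {
  slug: string;
  title: string;
  metaDescription: string;
}

const slugify = (s: string) => s.toLowerCase().replace(/\s+/g, "-");

const pages: Page[] = services.flatMap((service) =>
  suburbs.map((suburb) => ({
    slug: `/${slugify(service)}-in-${slugify(suburb)}`,
    title: `${service} in ${suburb}`,
    metaDescription: `Looking for ${service} in ${suburb}? Get a free quote today.`,
  }))
);
// 2 services x 3 suburbs = 6 near-identical pages; at full scale, thousands.
```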
It worked, briefly, around 2018-2020. It stopped working in 2021. It became actively harmful in 2023 with Google's helpful content system updates. By 2024, the site was being suppressed in search results, and by 2025 most of those pages had been deindexed entirely. The rebuild this site is part of started, in honest terms, with the acknowledgment that the old strategy was wrong.
Writing this in public is uncomfortable. It's also more useful than the polite alternative — the version where I quietly retire the old pages, never mention them, and hope no one notices the discrepancy between what I publish now and what I published five years ago. Operators who are honest about what they got wrong are more credible than the ones who pretend they were always right.
What I was thinking, then
The argument for programmatic SEO pages was straightforward and, at the time, evidence-supported. Google rewarded sites that had pages targeting specific long-tail queries. Long-tail queries had less competition than short-tail queries. Programmatic generation made it cheap to produce thousands of pages, each targeting a different long-tail variant. The maths, on paper, looked compelling: low cost per page, low competition per query, a large volume of small wins.
The argument was right about the mechanics and wrong about the trajectory. Google's quality bar for what counted as a useful page was rising fast. Programmatic content that satisfied the 2018 quality bar didn't survive the 2021 bar, and was actively penalised under the 2023 bar. The strategy was harvesting a temporary inefficiency in Google's ability to detect thin content, and the inefficiency closed.
What I missed
I missed three things, and missing any one of them would have been enough to invalidate the strategy:
- The user side of the equation. The pages weren't useful to anyone who landed on them. Even when they ranked, the engagement metrics were terrible. Bounce rate was over 80% on most of them. The user signal Google was capturing was unambiguously negative — and Google was learning to weight that signal more heavily every quarter.
- The brand consequence. A prospective client who ended up on one of the programmatic pages and clicked around the site was getting a clear signal: this operator publishes thin, low-effort content under their own name. That's a brand cost I didn't fully account for at the time, and it cost me business I'd never know about.
- The compounding direction of search. Google's whole trajectory through the early 2020s was toward weighting entity confidence, originality, and demonstrable expertise more heavily — not less. Programmatic content goes against all three. I was building leverage in the direction the algorithm was retreating from.
What I learned
The lesson isn't "don't do programmatic SEO". Programmatic content can still be excellent when each page genuinely earns its place — when there's real, distinct, useful information on each variant that justifies its existence. The lesson is sharper than that:
Strategies that rely on a temporary algorithmic inefficiency are not strategies. They're trades. They have to be exited before the inefficiency closes, and most operators, me included, don't exit in time.
The compounding strategies — the ones worth investing in — are the ones aligned with the direction the algorithm and the user behaviour are both moving. In 2020 that was already clear: entity authority, demonstrated expertise, original artefacts that other sources cite, distinctive points of view. I wasn't blind to it; I just didn't act on it because the programmatic strategy was still producing acceptable short-term results.
What replaced it
The rebuild this site is part of replaces the old strategy with the opposite shape:
- A small number of pages, each substantial. Eight or nine canonical pages, plus an archive of long-form pieces that grows by one to four pieces a month. Total page count an order of magnitude smaller than before.
- Original artefacts. Frameworks documented in detail. Original benchmarks. Honest writing on what's worked and what hasn't. The kind of content other operators cite — and that AI search engines surface in answers.
- Entity-first SEO. Schema markup that asserts a clean identity, sameAs corroboration across authoritative sources, knowledge-panel infrastructure built into the foundation rather than bolted on later. A sketch of what that markup looks like follows this list.
- Patience on traffic. The new shape will produce less traffic in the first year than the old shape did at peak. It will produce significantly more relevant traffic over five years, and it will be defensible against the algorithm changes that come next.
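For anyone wondering what entity-first means in concrete terms, the schema layer looks roughly like the sketch below. It's a minimal example assuming a single Person entity; the profile URLs are placeholders, not the actual corroborating sources:

```typescript
// Minimal sketch of entity-first schema: one Person entity asserted once,
// with sameAs links pointing at corroborating profiles on other sites.
// The @id and profile URLs below are placeholders, not the real sources.
const personEntity = {
  "@context": "https://schema.org",
  "@type": "Person",
  "@id": "https://sanammunshi.com/#person",
  name: "Sanam Munshi",
  url: "https://sanammunshi.com",
  sameAs: [
    "https://www.linkedin.com/in/placeholder",
    "https://github.com/placeholder",
  ],
};

// Rendered into the document head as JSON-LD so crawlers and knowledge-graph
// systems can resolve every page on the site back to the same entity.
const jsonLd = `<script type="application/ld+json">${JSON.stringify(personEntity)}</script>`;
```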
The meta-lesson
The strategies that scale to thousands of pages and rely on Google's inability to spot a pattern are, by their nature, fragile to Google getting better at spotting that pattern. Google has gotten better. It will continue to get better. Generative AI search engines are going to compress this dynamic further — they're far harder to game with thin content because they're optimising for whether to confidently recommend a source, not just whether to surface it in a results page.
The operators who win in the next five years are the ones who build for entity authority and substantive expertise from the start. The operators who lose are the ones still betting on the next algorithmic loophole. I was betting on the loophole. I'm not, anymore.