(Predictable) Trouble in the House of Google

Jeff Atwood recently blogged about the declining quality of search results, particularly with respect to Stack Overflow content.

His main observation is that content syndication websites are starting to outrank the original content source in search results. He's right to call this a problem, for everyone except the content syndicators, of course.

While I appreciate his disappointment, far from being upset by this, I'm impressed that it's taken this long. It also wouldn't surprise me if this were a short-term thing. As Jeff mentions, Google could probably "tweak a few algorithmic knobs" to make the problem go away for a while.

It's clear that Google does a lot of work to try to give you useful results for your queries. But Google is only one company. A single (albeit massive) company fighting to provide relevant results against an army of hundreds of millions (billions?) of sites trying to get to the front page of Google search results.

So why should we be surprised that it's finally becoming a problem? There are striking similarities with the history of DRM, computer viruses, and spam detection: one protagonist against millions, all looking for ways to game or cheat whatever system they're up against. In my opinion, Google has been orders of magnitude more effective at staving off these attacks than any of those three examples.

Effective DRM is really only renowned for pissing people off (those links took mere seconds to find, in Google, coincidentally), and we're constantly being subjected to new viruses and spam techniques.

I hope this trend is halted by the twiddling of a few knobs, but either way I'm impressed by how long Google has held off this threat.
