How to murder a Blog

Short of a server going down, one of the quickest and most effective ways for a Blog to die is for it to drop out of search engines. Sure, you might have a ton of repeat visitors, but no search ranking means no new visitors because, well, no one can find you. What a blinding flash of the obvious.

So what’s the best way to drop a Blog out of search engines? Add the following to robots.txt, which tells every crawler to stay away from the entire site:

    User-agent: *
    Disallow: /

And who would be stupid enough to do that?


Me.

During a server upgrade a few years ago I ran a script to automatically update the code on all of my sites. It just so happened that the script was based on my new-site install script, which included the robots.txt rule above to keep staging sites from being indexed before launch.
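For illustration, here’s a minimal Python sketch of the failure mode (the function, flag, and paths are all hypothetical; my actual script looked nothing like this). The install step is perfectly sensible for a staging site and becomes a site killer the moment the same code is reused against live sites:

    import pathlib

    def install_site(site_root, staging=True):
        # Hypothetical install step: block all crawlers on staging
        # sites so they don't get indexed before launch.
        if staging:
            robots = pathlib.Path(site_root) / "robots.txt"
            robots.write_text("User-agent: *\nDisallow: /\n")

    # Sensible for a staging install...
    install_site("/var/www/staging.example.com")

    # ...catastrophic when an upgrade script reuses it on a live
    # site without flipping the flag.
    install_site("/var/www/example.com")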

So, you guessed it, it took me just long enough to notice my FUBAR that Inert Ramblings had, over the course of two weeks, effectively disappeared from the Net. Unfortunately, recovery wasn’t as simple as restoring the original robots.txt and having the site instantly reappear; it still took months for Google and the other search engines to reindex the entire site and for traffic to pick back up.

Is there a moral to the story (other than pointing and laughing at me)? Sure. Be absolutely certain that none of your maintenance scripts make unexpected changes to robots.txt or any other config file, and verify the result after every deploy; see the sketch below. Luckily only one site was affected in the long term, but it could have been much worse had it taken longer for me to notice.
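As a guard rail, a minimal post-deploy check might look like this (a sketch only: the domain is a placeholder and Googlebot is just one example user agent). It fetches the live robots.txt and fails loudly if crawlers are blocked from the site root:

    import sys
    from urllib.robotparser import RobotFileParser

    SITE = "https://example.com"  # placeholder; use your own domain

    parser = RobotFileParser(SITE + "/robots.txt")
    parser.read()  # fetch and parse the live robots.txt

    # "User-agent: *" rules apply to Googlebot unless a more
    # specific record overrides them, so this catches the
    # blanket Disallow.
    if not parser.can_fetch("Googlebot", SITE + "/"):
        print("robots.txt is blocking crawlers from the site root!")
        sys.exit(1)

    print("robots.txt looks sane.")

Wire something like that into the end of the upgrade script and a two-week disappearing act becomes a one-second failed deploy.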