What the Great Hanoi Rat Massacre of 1902 and Modern Risk Practices Have in Common
When the French tried to solve Hanoi’s rat problem, they accidentally made it worse, and today’s cyber risk management is making the same mistake. Beneath the polished audits and colorful risk charts, a hidden system of perverse incentives is quietly breeding more problems than it solves.
Source: AI-generated using ChatGPT
In 1902, French colonial administrators in Hanoi discovered rats swarming the city's newly built sewer system. Alarmed by the public health risk, they launched a bounty program: locals would be paid for every rat tail they turned in.
At first, the numbers looked great. Thousands of tails poured in. But city officials started noticing something strange: rats were still everywhere. Then came the reports. People were breeding rats. Others clipped the tails and released the rats back into the wild, free to breed and be harvested again. Some even smuggled rats in from outside the city just to cash in.
The Hanoi rat ecosystem, 1902
The bounty created the illusion of progress while quietly making the problem worse. It was a textbook case of perverse incentives, a system where the rewards were perfectly aligned to reinforce failure.
Perverse Incentives in Cyber Risk
Maybe five years ago, I would’ve told you that cyber risk quantification was on the brink of going mainstream. Leaders were waking up to the flaws in traditional methods. Standards bodies were paying attention. Things felt like they were moving.
But now? I’m not so sure.
The deeper I go into this field, the more I see how powerful the gravitational pull is to keep things exactly the way they are. It turns out that cybersecurity is riddled with perverse incentives, not just in isolated cases, but as a feature of the system itself. Nearly 25 years ago, Ross Anderson made this point powerfully in his classic paper Why Information Security Is Hard: cybersecurity isn’t just a technology problem, it’s a microeconomics problem. The challenge isn’t just building secure systems; it’s that incentives among users, vendors, insurers, consultants, and regulators are often misaligned. When the people making security decisions aren’t the ones who bear the consequences, we all suffer.
Cyber risk management today is our own version of the Hanoi rat bounty. On paper, it looks like we’re making progress: reports filed, audits passed, standards met. But beneath the surface, it's a system designed to reward motion over progress, activity over outcomes. It perpetuates itself, rather than improving.
Let me explain.
The Risk Ecosystem Is Built on Circular Incentives
The risk services ecosystem
Companies start with good intentions. They look to frameworks like NIST CSF, ISO/IEC 27001, or COBIT to shape their security programs. These standards often include language about how risk should be managed, but stop short of prescribing any particular model. That flexibility is by design: it makes the standards widely applicable. But it also leaves just enough latitude for organizations to build the easiest, cheapest, least rigorous version of a risk management program and still check the box.
So boards and executives give the directive: “Get a SOC 2,” or “Get us ISO certified.” That becomes the mission. The mission, among many other things, includes an end-to-end cyber risk management program.
Enter the consulting firms—often the Big Four. One comes in to help build the program. Another comes in to audit it. Technically, they’re separate firms. But functionally, they’re reading from the same playbook. Their job isn’t to push for rigor; it’s to get you the report. The frameworks they implement are optimized for speed, defensibility, and auditability, not for insight, accuracy, or actual risk reduction.
And so we get the usual deliverables: heat maps, red/yellow/green scoring, high/medium/low labels. Programs built for repeatability, not understanding.
So, the heatmap has become the de facto language of risk. Because the standards don’t demand more, and the auditors don’t ask for more, nobody builds more.
Where the Loop Starts to Break
Here’s where things start to wobble: the same ecosystem that builds your program is also judging whether you’ve done it “right.” Even if it’s not the same firm doing both, the templates, language, and expectations are virtually identical.
It's like asking a student to take a test, but also letting them write the questions, choose the answers, and have their buddy grade it. What kind of test do you think they’ll make? The easiest one that still counts as a win.
That’s what’s happening here.
Source: AI-generated using ChatGPT
The programs are often built to meet the bare minimum—the lowest common denominator of what’s needed to pass the audit that everyone knows is coming. Not because the people involved are bad. But because the system is set up to reward efficiency, defensibility, and status-quo deliverables, not insight, not improvement.
The goal becomes: “Check the box.” Not: “Understand the risk. Reduce the uncertainty.”
So we get frameworks with tidy charts and generic scoring systems that fit nicely on a slide deck. But they’re not designed to help us make better decisions. They’re designed to look like we’re managing risk.
And because these programs satisfy auditors, regulators, and boards, nobody asks the hard question: “Is this actually helping us reduce real-world risk?”
The Rat Tail Economy
To be clear: NIST, ISO, and similar frameworks aren’t bad. They’re foundational. But when the same firms design the implementation and define the audit, and when the frameworks are optimized for speed rather than depth, you get a system that’s highly efficient at sustaining itself and deeply ineffective at solving the problem it was created to address.
It’s a rat tail economy. We’re counting the symbols of progress while the actual risks continue breeding in the shadows.
If You're Doing Qualitative Risk, You're Not Failing
If you’re working in a company that relies on qualitative assessments—heat maps, color scores—and you feel like you should be doing more: take a breath.
You’re not alone. And you’re not failing.
The pressure to maintain the status quo is enormous. You are surrounded by it. Most organizations are perfectly content with a system that satisfies auditors and makes execs feel covered.
But that doesn’t mean it’s working.
The result is a system that:
Measures what’s easy, not what matters
Prioritizes passing audits over reducing real risk
Funnels resources into checklists, not outcomes
Incentivizes doing just enough to comply, but never more
So How Do We Stop Being Rat Catchers?
The truth? We probably won’t fix the whole system overnight. But we can start acting differently inside it.
We can build toward better—even if we have to work within its constraints for now.
A few years ago, a friend vented to me about how his board only cared about audit results. “They don’t even read my risk reports,” he said.
I asked what the reports looked like.
“Mostly red-yellow-green charts,” he admitted.
And that’s when it clicked: the first step isn’t getting the board to care—it’s giving them something worth caring about.
So start there:
Baby steps. Meet your compliance obligations, but begin quantifying real risks in parallel. Use dollar-value estimates, likelihoods, and impact scenarios. Start small: pick just one or two meaningful areas.
Translate risk into decisions. Show how quantified information can justify spending, prioritize controls, or reduce uncertainty in a way that matters to the business.
Tell better stories. Don’t just show charts; frame your findings around real-world impact, trade-offs, and possible futures.
Push gently. When working with auditors or consultants, ask: “What would it look like if we wanted to measure risk more rigorously?” Plant the seed.
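The quantification step above can be made concrete with a few lines of code. The sketch below is a minimal Monte Carlo simulation of a single risk scenario, not a full risk model; the 10% annual likelihood and the $50k–$500k loss range are hypothetical placeholders, and the lognormal-from-a-90%-interval calibration is just one common convention:

```python
import math
import random
import statistics

def simulate_annual_loss(prob_event, loss_low, loss_high, trials=50_000, seed=42):
    """Monte Carlo estimate of annual loss for one risk scenario.

    prob_event: chance the scenario occurs in a given year (hypothetical).
    loss_low/loss_high: a 90% confidence interval for the loss if it
    occurs, used to fit a lognormal distribution.
    Returns (expected annual loss, 95th-percentile annual loss).
    """
    random.seed(seed)
    # Fit a lognormal so that [loss_low, loss_high] covers ~90% of losses
    # (1.645 is the z-score for the 5th/95th percentiles).
    mu = (math.log(loss_low) + math.log(loss_high)) / 2
    sigma = (math.log(loss_high) - math.log(loss_low)) / (2 * 1.645)
    losses = []
    for _ in range(trials):
        if random.random() < prob_event:
            losses.append(random.lognormvariate(mu, sigma))
        else:
            losses.append(0.0)  # no event this simulated year
    return statistics.mean(losses), sorted(losses)[int(trials * 0.95)]

# Hypothetical scenario: ransomware outage, 10% annual likelihood,
# 90% confident the loss would fall between $50k and $500k.
expected, p95 = simulate_annual_loss(0.10, 50_000, 500_000)
print(f"Expected annual loss: ${expected:,.0f}; 95th percentile: ${p95:,.0f}")
```

Even a toy model like this produces dollar figures a board can argue with—"our expected annual loss is roughly $20k, but one year in twenty could exceed $150k"—which is exactly the conversation a red/yellow/green chart cannot start.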
This isn’t about being perfect. It’s about shifting direction, one decision at a time.
We can’t tear down the bounty system in a day—but we don’t have to breed more rats, either. We can step outside the loop, see it for what it is, and try something different.
That’s how change begins.
The Downstream Effects of Cyberextortion
Polluted Bubbly Creek - South Fork of the South Branch of the Chicago River (1911)
The following article was published in ISACA Journal Volume 4, 2018. It was originally posted behind the member paywall, and I’m permitted to re-post it after a waiting period. That waiting period has expired, so here it is… The text is verbatim, but I’ve added a few graphics that did not make it into the printed journal.
In the mid-1800s, manufacturing was alive and well in the Chicago (Illinois, USA) area. Demand for industrial goods was growing, the population swelled faster than infrastructure and factories had to work overtime to keep up. At the same time, the Chicago River was a toxic, contaminated, lethal mess, caused by factories dumping waste and by-products and the city itself funneling sewage into it. The river, at the time, emptied into Lake Michigan, which was also the city’s freshwater drinking source. The fact that sewage and pollution were dumped directly into residents’ drinking water caused regular outbreaks of typhoid, cholera and other waterborne diseases. The situation seemed so hopeless that the city planners embarked on a bold engineering feat to reverse the flow of the river so that it no longer flowed into Lake Michigan. Their ingenuity paid off and the public drinking water was protected.(1)
What does this have to do with paying cyberextortionists? Dumping sewage and toxic waste into public waterways and paying cyberextortionists to get data back are examples of negative externalities. In the case of the Chicago River, business was booming, but people downstream suffered unintended consequences. “Negative externality” is a term used in the field of economics that describes an “uncompensated harm to others in society that is generated as a by-product of production and exchange.”(2)
Negative externalities exist everywhere in society. This condition occurs when there is a misalignment between interests of the individual and interests of society. In the case of pollution, it may be convenient or even cost-effective for an organization to dump waste into a public waterway and, while the action is harmful, the organization does not bear the full brunt of the cost. Paying extortionists to release data is also an example of how an exchange creates societal harm leading to negative externalities. The criminal/victim relationship is a business interaction and, for those victims who pay, it is an exchange. The exact number of ransomware (the most common form of cyberextortion) victims is hard to ascertain because many crimes go unreported to law enforcement;(3) however, payment amounts and rate statistics have been collected and analyzed by cybersecurity vendors, therefore, it is possible to start to understand the scope of the problem. In 2017, the average ransomware payment demand was US $522,(4) with the average payment rate at 40 percent.(5) The US Federal Bureau of Investigation (FBI) states that “[p]aying a ransom emboldens the adversary to target other victims for profit and could provide incentive for other criminals to engage in similar illicit activities for financial gain.”(6) It costs a few bitcoin to get data back, but that action directly enriches and encourages the cybercriminals, thereby creating an environment for more extortion attempts and more victims.
Ransomware is specially crafted malicious software designed to render a system and/or data files unreadable until the victim pays a ransom. The ransom is almost always paid in bitcoin or another form of cryptocurrency; the amount is typically US $400 to $1,000 for home users and tens of thousands to hundreds of thousands of US dollars for organizations. Typically, ransomware infections start with the user clicking on a malicious link from email or from a website. The link downloads the payload, which starts the nightmare for the user. If the user is connected to a corporate network, network shares may be infected, affecting many users.
The economic exchange in the ransomware ecosystem occurs when cybercriminals infect computers with malware, encrypt files and demand a ransom to unlock the files, and the victim pays the ransom and presumably receives a decryption key. Both parties are benefiting from the transaction: The cybercriminal receives money and the victim receives access to his/her files. The negative externality emerges when the cost that this transaction imposes on society is considered. Cybercriminals are enriched and bolstered. Funds can be funneled into purchasing more exploit kits or to fund other criminal activities. Just like other forms of negative externalities, if victims simply stopped supporting the producers, the problem would go away. But, it is never that easy.
Cyberextortion and Ransomware
The Interview, 2014
Cyberextortion takes many different shapes. In November 2014, hackers demanded that Sony Pictures Entertainment pull the release of the film The Interview or they would release terabytes of confidential information and intellectual property to the public.(7) In 2015, a group of hackers calling themselves The Impact Team did essentially the same to the parent company of the Ashley Madison website, Avid Life Media. The hackers demanded the company fold up shop and cease operations or be subject to a massive data breach.(8) Neither company gave in to the demands of the extortionists and the threats were carried out: Both companies suffered major data breaches after the deadlines had passed. However, there are many examples of known payments to extortionists; ProtonMail and Nitrogen Sports both paid to stop distributed denial-of-service (DDoS) attacks and it was widely publicized in 2016 and 2017 that many hospitals paid ransomware demands to regain access to critical files.(9)
There is a reason why cyberextortion, especially ransomware, is a growing problem and affects many people and companies: Enough victims pay the ransom to make it profitable for the cybercriminals and, while the victims do suffer in the form of downtime and ransom payment, they do not bear the brunt of the wider societal issues payment causes. Paying ransomware is like dumping waste into public waterways; other people pay the cost of the negative externality it creates (figure 1).
Fig. 1: The ransomware ecosystem
The Ransomware Decision Tree
There are several decisions a victim can make when faced with cyberextortion due to ransomware. The decision tree starts with a relatively easy action, restoring from backup, but if that option is not available, difficult decisions need to be made—including possibly paying the ransom. The decision to pay the ransom can not only be costly, but can also introduce negative externalities as an unfortunate by-product. The decision is usually not as simple as pay or do not pay; many factors influence the decision-making process (figure 2).
Fig. 2: Ransomware response decision tree
Understanding the most common options can help security leaders introduce solutions into the decision-making process:
Restore from backup—This is the best option for the victim. If good quality, current backups exist, the entire problem can be mitigated with minimal disruption and data loss. This typically entails reloading the operating system and restoring the data to a point in time prior to the infection.
Decrypters—Decrypter kits are the product of the good guys hacking bad-guy software. Just like any software, ransomware has flaws. Antivirus vendors and projects such as No More Ransom! (10) have developed free decrypter kits for some of the most popular ransomware strains. This enables the victim to decrypt files themselves without paying the ransom.
Engage with extortionists—This is a common choice because it is convenient and may result in access to locked files, but it should be the path of last resort. This involves engaging the extortionists, possibly negotiating the price and paying the ransom. Victims will usually get a working decryption key, but there are cases in which a key was not provided or the key did not work.
Ignore—If the files on the computer are not important, if the victim simply has no means to pay, and a decrypter kit is not available, the victim can simply ignore the extortion request and never again gain access to the locked files.
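The preference ordering of the four options can be sketched as a simple decision function. This is only an illustration of the tree’s logic; the boolean inputs and the function itself are hypothetical simplifications of figure 2, which weighs more factors than four flags:

```python
def ransomware_response(has_backup, decrypter_available, data_critical, can_pay):
    """Walk a simplified ransomware response decision tree.

    Options are checked in order of preference: each branch is only
    reached when every better option has been ruled out.
    """
    if has_backup:
        return "restore from backup"        # best case: minimal disruption
    if decrypter_available:
        return "use free decrypter kit"     # e.g., from No More Ransom!
    if data_critical and can_pay:
        return "engage with extortionists"  # last resort; a working key
                                            # is not guaranteed
    return "ignore"                         # accept permanent data loss

# A victim with current backups never has to weigh the worse options.
print(ransomware_response(True, False, True, True))  # restore from backup
```

Writing the tree down this way makes the defender’s leverage obvious: every control that flips `has_backup` or `decrypter_available` to true short-circuits the branch where ransom payment is even considered.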
It is clear that there are few good options. They are all inconvenient and, at best, include some period of time without access to data and, at worst, could result in total data loss without a chance of recovery. What is notable about ransomware and other forms of cyberextortion is that choices have ripple effects. What a victim chooses to do (or not do) affects the larger computing and cybercrime ecosystems. This is where the concept of externalities comes in—providing a construct for understanding how choices affect society and revealing clues about how to minimize negative effects.
What Can Be Done?
“Do not pay” is great advice if one is playing the long game and has a goal of improving overall computer security, but it is horrible advice to the individual or the hospital that cannot gain access to important, possibly life-saving, information and there are no other options. Advising a victim to not pay is like trying to stop one person from throwing waste into the Chicago River. Turning the tide of ransomware requires computer security professionals to start thinking of the long game—reversing the flow of the river.
English economist Arthur Pigou argued that public policies, such as “taxes and subsidies that internalize externalities,” can counteract the effects of negative externalities.(11) Many of the same concepts can be applied to computer security to keep people from falling victim in the first place or to avoid having to pay if they already have. Possible solutions include discouraging negative externalities and encouraging (or nudging parties toward) positive externalities.
On the broader subject of negative externalities, economists have proposed and implemented many ideas to deal with societal issues, with varying results. For example, carbon credits have long been a proposal for curbing greenhouse gas emissions. Taxes, fines and additional regulations have been used in an attempt to curb other kinds of pollution.(12) Ransomware is different. There is no single entity to tax or fine toward which to direct public policy or even with which to enter into negotiations.
Positive externalities are the flip side of negative—a third party, such as a group of people or society as a whole, benefits from a transaction. Public schools are an excellent example of positive externalities. A small group of people—children who attend school—directly benefit from the transaction, but society gains significantly. An educated population eventually leads to lower unemployment rates and higher wages, makes the nation more competitive, and results in lower crime rates.
Positive externalities are also observed in the ransomware life cycle. As mentioned previously, antivirus companies and other organizations have, both separately and in partnership, developed and released to the public, free of charge, decrypter kits for the most common strains of ransomware. These decrypter kits allow victims to retrieve data from affected systems without paying the ransom. This has several benefits. The victim receives access to his/her files free of charge, and the larger security ecosystem benefits as well.
Once a decrypter kit is released for a common strain, that strain of ransomware loses much of its effectiveness. There may be some people who still pay the ransom, due to their lack of awareness of the decrypter kit. However, if the majority of victims stop paying, the cost to attackers increases because they must develop or purchase new ransomware strains and absorb the sunk cost of previous investments.
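Back-of-the-envelope arithmetic shows why a falling payment rate hurts attackers. In the sketch below, the US $522 average demand and 40% payment rate come from the 2017 figures cited earlier; the victim count, the post-decrypter payment rate, and the cost of developing a new strain are hypothetical placeholders:

```python
def campaign_profit(victims, avg_ransom, payment_rate, dev_cost):
    """Attacker's profit from a single ransomware campaign."""
    return victims * avg_ransom * payment_rate - dev_cost

# 2017 figures cited above: US $522 average demand, 40% payment rate.
# The 10,000 victims and $50,000 strain cost are hypothetical.
before_kit = campaign_profit(10_000, 522, 0.40, dev_cost=50_000)

# After a free decrypter kit is released: most victims stop paying,
# and the attacker must absorb the sunk cost of the dead strain while
# funding a replacement.
after_kit = campaign_profit(10_000, 522, 0.05, dev_cost=50_000)

print(f"profit before kit: ${before_kit:,.0f}; after: ${after_kit:,.0f}")
```

Under these placeholder numbers, one decrypter kit cuts the campaign’s profit by roughly 90 percent—which is the whole point of the nudge: it changes the attacker’s economics without asking any individual victim to sacrifice.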
Decrypter kits are part of a larger strategy called “nudges” in which interested parties attempt to influence outcomes in nonintrusive, unforced ways. Behavioral economists have been researching nudge theory and have discovered that nudges are very effective at reducing negative externalities and can be more effective than direct intervention. This is an area in which both corporations and governments can invest to help with the ransomware problem and other areas of cybercrime. Some future areas of research include:
Public and private funding of more decrypter kits for more strains of ransomware
Long-term efforts to encourage software vendors to release products to the market with fewer vulnerabilities and to make it easier for consumers to keep software updated
Education and assistance to victims; basic system hygiene (e.g., backups, patching), assistance with finding decrypter kits, help negotiating ransoms
It is important for information security professionals to consider figure 2 and determine where they can disrupt or influence the decision tree. The current state of ransomware and other forms of cyberextortion is causing negative societal problems, and fixing them will take a multi-pronged, long-term effort. The solution will be a combination of reducing negative externalities and encouraging positive ones through public policy or nudging. The keys are changing consumer behavior and attitudes and encouraging a greater, concerted effort to disrupt the ransomware life cycle.
Endnotes
1 Hill, L.; The Chicago River: A Natural and Unnatural History, Southern Illinois University Press, USA, 2016
2 Hackett, S. C.; Environmental and Natural Resources Economics: Theory, Policy, and the Sustainable Society, M. E. Sharpe, USA, 2001
3 Federal Bureau of Investigation, “Ransomware Victims Urged to Report Infections to Federal Law Enforcement,” USA, 15 September 2016, https://www.ic3.gov/media/2016/160915.aspx
4 Symantec, Internet Security Threat Report, volume 23, USA, 2018
5 Baker, W.; “Measuring Ransomware, Part 1: Payment Rate,” Cyentia Institute, https://www.cyentia.com/2017/07/05/ransomware-p1-payment-rate/
6 Op cit Federal Bureau of Investigation
7 Pagliery, J.; “What Caused Sony Hack: What We Know Now,” CNNtech, 29 December 2014, http://money.cnn.com/2014/12/24/technology/security/sony-hack-facts/index.html
8 Hackett, R.; “What to Know About the Ashley Madison Hack,” Fortune, 26 August 2015, http://fortune.com/2015/08/26/ashley-madison-hack/
9 Glaser, A.; “U.S. Hospitals Have Been Hit by the Global Ransomware Attack,” Recode, 27 June 2017, https://www.recode.net/2017/6/27/15881666/global-eu-cyber-attack-us-hackers-nsa-hospitals
10 No More Ransom!, https://www.nomoreransom.org
11 Frontier Issues in Economic Thought, Human Well-Being and Economic Goals, Island Press, USA, 1997
12 McMahon, J.; “What Would Milton Friedman Do About Climate Change? Tax Carbon,” Forbes, 12 October 2014, https://www.forbes.com/sites/jeffmcmahon/2014/10/12/what-would-milton-friedman-do-about-climate-change-tax-carbon/#53a4ef046928
What do paying cyber extortionists and dumping toxic sludge into the Chicago River have in common?
Paying cyber ransoms is like dumping toxic sludge into a public river—cheap in the short term, but costly to society. In this post, I explain how ransomware payments create negative externalities, and why real solutions require changing incentives, not just victim behavior.
What do paying cyber extortionists and dumping toxic sludge into the Chicago River have in common? A lot, actually! Decipher recently interviewed me on some of the research I’ve published and talks I’ve given on ransomware, incentives, negative externalities and how we, the defenders, can influence decisions.
A negative externality, in economics, is a cost that an economic activity imposes on a third party. In the case of pollution, it may be convenient or even cost-effective for a firm to dump waste into a public waterway, and while the action is harmful, the firm does not bear the full brunt of the cost. In the case of paying cyber extortionists, it may cost a few Bitcoin to get data back, but that action directly enriches, emboldens and encourages the cybercriminals, thereby creating an environment for more extortion attempts and more victims. We see negative externalities everywhere in society. This condition occurs when there is a misalignment between the interests of the individual and the interests of society.
City planners in Chicago reversed the flow of the Chicago River to solve the pollution problem, and it worked! A similar solution is needed in the case of cyber extortion, ransomware and malware in general. Focusing on changing victims’ behavior by constantly saying “Don’t ever pay the ransom!” isn’t working. We need to move upstream – further up in the decision tree – to effect real change.
The cover image is a picture taken in 1911 of a man standing on slaughterhouse waste floating on Bubbly Creek, a fork of the Chicago River. Bubbly Creek was described in horrifying detail by Upton Sinclair in The Jungle. The drainage of many meatpacking houses flowed into Bubbly Creek, carrying sewage, hair, lard and chemicals. It periodically burst into flames, and the Chicago Fire Department had to be dispatched regularly to put it out.