In Doctorow’s Little Brother, Marcus Yallow is a young boy who is falsely accused of being a terrorist and interrogated. He decides to wage war against the DHS, the organization that kidnapped him, by creating more instances of suspicious behavior in order to make their security systems seem wildly inaccurate. He explains it by saying, “the more people [the security system] catches, the more it gets brittle. If it catches too many people, it dies”. He uses the paradox of the false positive to help him achieve this.

So, what is the paradox of the false positive? Well, let’s say 1 in every 100,000 college students commits suicide, and universities have a system that can predict these tragic events with 99% accuracy based on a student’s web behavior. At first glance this seems pretty accurate, right? Wrong. A 99% accurate system still misfires 1% of the time, so for every 100,000 students it flags about 1% of them. 1% of 100,000 is 1,000 students. That is far larger than the actual number of students at risk. So if only 1 of those 1,000 flagged students is genuinely at risk, then 999 out of 1,000 flags, or 99.9%, are wrong. This is known as the paradox of the false positive.
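The arithmetic above can be sketched in a few lines of Python. The numbers come straight from the example; treating the 99% accuracy figure as a 1% false-positive rate is the simplifying assumption the passage itself makes:

```python
# Illustrative numbers from the example above.
population = 100_000
at_risk = 1                  # students genuinely at risk (1 in 100,000)
accuracy = 0.99              # assumed to mean a 1% false-positive rate

# A 1% false-positive rate flags roughly 1% of everyone screened.
flagged = population * (1 - accuracy)       # 1,000 students flagged
false_alarms = flagged - at_risk            # 999 of them are false alarms

# Fraction of flags that are wrong: the paradox of the false positive.
error_rate = false_alarms / flagged
print(f"students flagged: {flagged:.0f}")
print(f"flags that are wrong: {error_rate:.1%}")
```

Even though the system is “99% accurate,” virtually every alarm it raises points at an innocent student, which is exactly the brittleness Yallow exploits.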

Yallow’s explanation of this paradox caught my eye. I found it very interesting because it highlights just how easily data can be framed in different ways to tell a particular story. For example, a test for XYZ disease could be 99% accurate, yet that figure alone doesn’t paint the whole picture of how reliable the product actually is. This could lead consumers who falsely tested positive for the disease not only to worry but also to pay money for medication that they don’t necessarily need. This applies to many other products and services as well, and so it has made me think twice before blindly accepting data.
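The reliability of a positive result in the disease example can be worked out with Bayes’ theorem. This is a minimal sketch; the prevalence figure (1 in 10,000) is hypothetical, chosen only to illustrate how rare conditions amplify the paradox:

```python
# Bayes' theorem for the hypothetical XYZ disease test.
prevalence = 1 / 10_000   # assumed rarity of the disease (illustrative)
sensitivity = 0.99        # P(test positive | sick)
specificity = 0.99        # P(test negative | healthy)

# Overall chance of a positive result, sick or not.
p_positive = (sensitivity * prevalence
              + (1 - specificity) * (1 - prevalence))

# Chance you are actually sick, given a positive result.
ppv = sensitivity * prevalence / p_positive
print(f"chance a positive result is real: {ppv:.2%}")
```

With these assumed numbers, fewer than 1 in 100 positive results reflect actual disease, which is why “99% accurate” on its own tells a consumer very little.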