# Cryptography

#### Tag: paradox of a false positive

While I had previously worked with false positives in various statistic problems, I never considered the implications behind it. Cory Doctorow addresses this "false positive paradox" in his book Little Brother.

In the story, the narrator, Marcus, describes a "99 percent accurate" test for "Super-AIDS" (Doctorow 47). A 99 percent accurate test is still one percent inaccurate, and one percent of one million people is 10,000. If the disease afflicts only one person in a million, then of the roughly 10,000 people who test positive, only one actually has it. Thus, the "99 percent accurate test will perform with 99.99 percent inaccuracy" (Doctorow 47).
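Doctorow's arithmetic can be checked directly. This is a minimal sketch using the passage's numbers, with one added assumption (that the test catches every true case):

```python
# Numbers from the Little Brother passage: a disease afflicting 1 in a
# million, and a "99% accurate" test (i.e., a 1% false positive rate).
# Assumption not in the book: the test never misses a real case.
population = 1_000_000
prevalence = 1 / population
false_positive_rate = 0.01
sensitivity = 1.0  # assumed: every true case is detected

true_positives = population * prevalence * sensitivity           # 1 person
false_positives = population * (1 - prevalence) * false_positive_rate  # ~10,000

# Of everyone who tests positive, what fraction is actually sick?
precision = true_positives / (true_positives + false_positives)
print(f"People flagged positive: {true_positives + false_positives:,.0f}")
print(f"Chance a positive result is real: {precision:.4%}")
```

Running this shows a positive result is right only about 0.01% of the time, which is exactly the "99.99 percent inaccuracy" Marcus describes.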

This was extremely interesting, as I had never considered how inaccurate these tests really are. The statistics and numbers had always seemed solid; 99% sounded like an extremely high percentage. But Doctorow makes an intriguing analogy: pointing at a single pixel on your screen with a sharp pencil works fine, but pointing at a single atom with that same pencil would be hopelessly imprecise. The analogy highlights the flaw in any test that can produce false positives. Whether detecting diseases or terrorism, these tests can waste enormous amounts of time and money.

This could also be connected to the process of cracking ciphers. For example, when searching for cribs in an Enigma-enciphered message, a crib may make sense in a particular context but turn out to be an incorrect decipherment for the message as a whole. Even in general cryptanalysis, you could seem to make progress on a message, only to realize after hours of work that your original guess was a mistake and your "progress" had simply been a false positive. Clearly, false positives can be dangerous and misleading. The false positive paradox magnifies the effect these readings can have on important tests or examinations, and the consequences can be devastating. Imagine administering a drug, meant to combat a certain disease, to a patient who never actually had it. The patient was perfectly fine, yet the drug resulted in the patient's children having birth defects. A simple false positive could cause tragic repercussions.

As I was reading Little Brother, there was a passage that made me stop, reread, and think for a while. That really never happens to me, so I figured it was something I should take note of. It was the passage about the paradox of the false positive: if you are testing for something very rare in a population, like terrorists or people who have contracted Super-AIDS, as the book says, then the test's accuracy must match the rarity of whatever you are looking for.

This reminded me of when we looked at the idea of data mining students to search for signs of suicide risk or mass shootings. At the time, we only discussed the ethics of looking into a student's private data, but efficiency and accuracy are a huge part of this too. Many college students will at times feel depressed or overwhelmed, and may search for things that could potentially be red flags for suicide risk. Similarly, I think a lot of people, myself included, are sometimes just curious about strange things that could look like red flags for mass violence, like how to make a bomb. If schools took the time to investigate every student who raised a red flag, they would waste enormous amounts of time, and they could miss real risks because of their focus on these non-risky individuals.
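The same base-rate arithmetic applies to student screening. Here is a sketch with entirely hypothetical numbers (a made-up campus size, risk count, and flag rate; none come from the book or the class discussion) showing how rare real cases drown in false flags:

```python
# Hypothetical illustration only: all numbers below are invented.
students = 30_000
truly_at_risk = 3            # assumed: genuine risk is extremely rare
flag_rate_innocent = 0.05    # assumed: 5% of ordinary students trip a
                             # flag through curiosity, stress, or research

false_flags = (students - truly_at_risk) * flag_rate_innocent  # ~1,500
total_flags = false_flags + truly_at_risk  # assume every real case is flagged

print(f"Students flagged for review: {total_flags:,.0f}")
print(f"Share of flags that are real risks: {truly_at_risk / total_flags:.2%}")
```

Under these assumptions, well over 99% of flagged students are false positives, which is the time-wasting problem the paradox predicts.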