Cryptography

The History and Mathematics of Codes and Code Breaking

Month: October 2015

Justified Paranoia

When reading Little Brother, the passage that really stuck out to me came after Marcus’s kidnapping, when he first realized he was being bugged. The combination of paranoia, fear, and anger surrounding Marcus’s every thought became evident as he reacted emotionally to the Department of Homeland Security watching his every move. He cautiously registered the seriousness of his situation: “There were eyes out there, eyes and ears, and they were watching me. Surveilling me” (Doctorow 86).

I was drawn to this passage because I had a similar reaction when the Edward Snowden leak occurred. Although my circumstances are obviously different from Marcus’s and I have nothing to hide whatsoever, I still felt as if my privacy had been wrongfully invaded. Knowing that the government was capable of surveilling my daily activities through my browser history, phone calls, text messages, and more left me uncertain whether anything I did was truly private.

And as an ordinary, completely harmless citizen, I mostly viewed this process as unnecessary. The United States government has nothing to gain by monitoring my online activity, as all they will discover is the unhealthy amount of time I spend on Facebook, my ability to watch countless Grey’s Anatomy episodes on Netflix, and my slight obsession with Taylor Swift music videos.

Still, I view my browser history as mine, as it is a reflection of my day-to-day thoughts. Googling whatever comes to mind has become second nature to me, and a look at what I’ve searched will quickly reveal my favorite TV shows, my favorite places to shop online, and which classes I’m taking. At times, particularly when ads targeted at my past searches show up, I think the Internet knows me better than I know myself. And do I want the United States government to know me on this same, personal level? Definitely not. My online activity is arguably one of the closest things to a diary that I have, and while I understand the goal of finding potential terrorists through data mining, I can’t help but feel the same paranoid, taken-aback emotions Marcus did when he was bugged.

The Right of a Citizen

The passage in Little Brother that caught my attention was the debate about halfway through concerning the moral implications of breaching citizens’ privacy in the name of security. Marcus ends the debate by quoting the Declaration of Independence: it is “the right of the people to alter or abolish” any government that is no longer “deriving just power from the consent of the governed” (Doctorow 180). This novel was published in 2008, before the Edward Snowden NSA leaks, and in some aspects of the story this is very obvious. The NSA is skimmed over the few times it is mentioned, and it seems to be an impenetrable fortress of hidden information – not so today, after its secrets were published for the world to see.

Though it was to a far lesser extent than in the novel, when the news broke that the NSA had been collecting phone data from millions of Americans, people were outraged. Some privacy has to be given up in order to strengthen security, but such a blatant breach of privacy left the public incredibly incensed. To be American citizens and have the security afforded by government organizations like the NSA, CIA, and FBI is one thing, but to be secretly spied on by one’s own government is another matter entirely.

Due to the public’s outrage, the NSA was forced to start changing some of its policies, which is a living example of the people’s right to change the government if it is not benefiting them.

Progression is Activism

Although at first I was a little miffed about the idea of reading an entire novel over break, it was actually a pretty relaxing read, and some of the points the author made were really thought-provoking. Sometimes I felt he was trying to be too hip – I suppose this is a common occurrence in a lot of teen fiction – every time I read “total horn-dog” I was thinking, “what?” But that’s neither here nor there. There were a number of quotes I really thought about, like Marcus’s argument for the absolute protection of the Bill of Rights and the total lack of professionalism some of the authority figures in the book seemed to exude, but Marcus also pointed out something very important: “I can’t go underground for a year, ten years, my whole life, waiting for freedom to be handed to me. Freedom is something you have to take for yourself.” Inspiring, isn’t it?

Truly, nothing will be accomplished by passivity. Constant activism and problem-solving are what propel movements forward – awareness will get something started, but steps must be taken beyond that. Cryptography is similarly a constantly evolving subject, requiring analysis that always considers different options and perspectives. It couldn’t progress so efficiently if cryptanalysts were always waiting for other cryptanalysts to decipher messages for them – and in many cases, that’s exactly what they don’t want to happen.

The False Positive Paradox

While I had previously worked with false positives in various statistics problems, I had never considered the implications behind them. Cory Doctorow addresses this “false positive paradox” in his book Little Brother.

In the story, the narrator, Marcus, describes a “99 percent accurate” test for “Super-AIDS,” a disease that only one person in a million actually has (Doctorow 47). A 99 percent accurate test is still one percent inaccurate, and one percent of one million is 10,000. So if a million people are tested, roughly 10,000 of them will be flagged even though only one of them actually has the disease, and thus the “99 percent accurate test will perform with 99.99 percent inaccuracy” (Doctorow 47).
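
To make the arithmetic concrete, here is a minimal sketch (my own back-of-the-envelope check, using the book’s figures of one true case per million people and a one percent error rate) of how few of the positive results are actually correct:

```python
# False positive paradox, assuming (as in the book's example):
#   - 1 person in 1,000,000 actually has the disease
#   - the test wrongly flags 1% of healthy people

population = 1_000_000
true_cases = 1                      # people who really have the disease
false_positive_rate = 0.01         # fraction of healthy people flagged by mistake

false_positives = (population - true_cases) * false_positive_rate  # ~10,000
total_positives = true_cases + false_positives                     # ~10,001

# Fraction of positive results that are actually correct
precision = true_cases / total_positives
print(f"Only {precision:.4%} of positive tests are true positives")
# -> about 0.01%, i.e. the test is wrong 99.99% of the time it flags someone
```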

This was extremely interesting, as I had never considered how inaccurate these tests really are. The statistics and numbers had always seemed solid; it made sense, and 99% seemed like an extremely high percentage. But Doctorow makes an intriguing analogy: pointing at a single pixel on your screen with a sharpened pencil works fine, but pointing at a single atom with that same pencil would be hopelessly imprecise. This highlights the flaw in any test that can produce false positives. Whether detecting diseases or terrorism, such tests can result in large wastes of resources, such as time and money.

This could also be connected to the process of cracking ciphers. For example, when searching for cribs in an intercepted Enigma message, a crib may make sense in a particular context but could turn out to be an incorrect decipherment for the whole message. Even in general cryptanalysis, you could appear to make progress on a message, only to realize after hours of work that an early guess was wrong and your “progress” had simply been a false positive. Clearly, false positives can be dangerous and misleading. The false positive paradox magnifies the effect these readings can have on important tests or examinations, and the consequences could be devastating. Imagine administering a drug thought to combat a certain disease to a patient who didn’t actually have the disease; because the patient was perfectly healthy, the drug instead resulted in the patient’s children having birth defects. A simple false positive could cause tragic repercussions.

I Don’t Care.

The best security measures in the world, even the most seemingly flawless and foolproof of systems, can be beaten by the simplest of things: a failure to enforce them.

In Little Brother by Cory Doctorow, there is a point at which Marcus receives an email from some followers who were caught jamming by the police. However, after waiting to be interrogated in a prison truck, they were released once a new shift of cops decided that they had “better things to do than bother [them] with more questions.”

Isn’t this so telling of our relationship with technology? It’s the whole “technology is only as good as the person using it” argument all over again. If someone somewhere along the chain of command decides that he or she doesn’t care, the whole system goes down the drain.

This is evident in other historical cryptographic events as well. The Enigma machine was cracked partially because its operators got lazy: they used predictable language and sometimes even predictable keys (e.g., the initials of loved ones). Without these lapses, Enigma could have remained “uncrackable” for much longer.

This story from the book also interests me because it touches on human motivation in an almost real-world setting. Antoine de Saint-Exupéry once said, “If you want to build a ship, don’t drum up people to collect wood and don’t assign them tasks and work, but rather teach them to long for the endless immensity of the sea.” The DHS did not properly motivate its employees to follow through with its commands, which led to failure in this situation. People will only go so far for so long when following orders before they give up, and that can happen for any number of reasons. In one scene, when Marcus invites Ange over to use the Xbox and chill for the evening, Ange points out one of these reasons, noting that Marcus’s “only weapon is [his] ability to make [the DHS] look like morons.” She points out that in order to derail the DHS, Marcus needs to undermine its motivation to continue what it is doing.

 

When Government Surveillance Goes Too Far…

Many segments of Little Brother brought up interesting points about the frightening consequences of complete government surveillance and its relation to security. One passage that caught my eye was in the beginning of Chapter 6 when the Turkish coffee shop owner told Marcus about the newly implemented Patriot Act II. In this fictional world, Congress passed Patriot Act II, which let the government monitor just about everything, including every time someone used his or her debit card, as a means to increase security following the terrorist attack.

I do not think that the government should be allowed to monitor its citizens so closely. I reacted to this passage because of its applicability to real life. After massive acts of terrorism, security is usually heightened. For example, after the horrific 9/11 attacks, Congress passed the Patriot Act, which vastly expanded the government’s authority to spy on American citizens. In Little Brother, the fabricated Patriot Act II represents a whole new level of increased governmental surveillance that goes too far. Will the US government ever try to implement such a law? This passage made me reflect on the future of our nation and what would happen if our society eventually mirrored the one portrayed in the novel.

The topic of my first paper was data mining. Data mining is a way of increasing surveillance and reducing one’s privacy, and can be related to Patriot Act II because of this. Will data mining progress into an uncontrollable amount of surveillance? Should data mining be allowed? Is it ethical? Every time you relinquish some amount of your privacy to the government, no matter how small or seemingly insignificant, the government’s power increases. This novel shows the destructive nature of extreme governmental authority, which is something we all need to be aware of and watch out for in our current society.

It’s What We Make of It

For this blog post I would like to introduce a rather grim prospect I’ve been pondering about the human knowledge threshold. I was reminded of it after reading Marcus’s discussion of how we can wield computers and other tools of such sophistication and power with just a few “lines of code” (Doctorow 119).

The idea starts with the outlook that humankind will continue to accumulate more and more knowledge as long as we exist, whether from discoveries in the natural world or from learning the complex information systems that others create (e.g., designing and keeping up with the expansion of the internet).

Particularly in the scientific fields, as we continue to acquire more information, we will eventually reach a point of information overload. What I mean is that, perhaps a few centuries from now, even someone who started learning and specializing in one specific field from the youngest age possible would need more than a lifetime to learn the information already known in that subject.

Needless to say, further expansion of knowledge within that field may then be impossible to achieve by studying, which leaves the scholar with the awful prospect that all the work physically possible in a lifetime has already been done, and with a sense of futility in pursuing further scholarship in that field. However, one can argue that computers able to process all that information and sustain a method of inquiry may be able to replace us and surpass whatever knowledge threshold we have.

After all, the components of such “complicated machines” have been “microminiaturized” so that “billions” of parts can now fit within the “machines,” making them more efficient overall (Doctorow 119). So who’s to say we can’t further miniaturize and compact existing computers so that they are more efficient, resulting in the possibility of synthetically attaining knowledge above a human’s threshold?

We can imagine endless possibilities as arguments against this prospect of a human knowledge threshold, and counterarguments against those arguments, but the more we try to solve this information overload, the more complex our understanding of the world becomes – and the greater our fall will be when, let’s say, a terrorist group wipes out all the electronic data in the world. How do we advance our quest for knowledge then?

Maybe Darwinism will have an answer for our intellectual threshold.

 

An additional note: the above prospect really applies to fields that are more scientific in nature and that demand expansion of already existing knowledge. Thus, we will never see a “knowledge threshold” in the arts, for creativity is boundless.

 

Illegal Math: Fact not Fiction

I chose the beginning of chapter 17, when Marcus and Ange go to the journalist, Barbara Stratford, to expose the rampant abuses of power occurring in San Francisco. In the process, they discover that Barbara herself had covered the original ‘crypto wars’ of the ’90s. Barbara describes how the government had classified cryptography as a munition and made it illegal to use or export, all in the name of national security. While I thought this was really interesting, the next sentence blew my mind. This means that we had ILLEGAL math. MATH, made illegal.

Can you imagine a time when certain equations and formulae were considered illegal? This interests me most because, less than two decades after this math was illegal, we are taking a class specifically about it. We’ve seen in class how cryptography has been used throughout history, and it always has been, and probably always will be, a part of life in government. However, it used to be accessible only to the wealthy and those in government; no one else could afford the knowledge required, so ordinary people couldn’t keep secrets from the government. With the rapid spread of computers and advances in technology, average citizens could suddenly afford to encrypt their messages, and it is very interesting to me that the government felt so threatened by this that it tried to ban the knowledge.

Of course, it is also my opinion that, like Prohibition, this only proliferated the use of cryptography, but with even less government control. My favorite part of class so far has been our discussions about the intersection of cryptography, government, and privacy, which is why Little Brother, and especially this chapter, holds my interest so well. With cryptography and cryptanalysis becoming ever more advanced, it will be exciting to see how the government handles all of this as well.

There’s Always Someone Smarter

A passage from Cory Doctorow’s Little Brother that caught my attention was “[t]he problem had been that Turing was smarter than the guy who thought up Enigma. Any time you had a cipher, you were vulnerable to someone smarter than you coming up with a way of breaking it” (Doctorow 99). It first caught my attention because we recently discussed World War II and Bletchley Park, where Alan Turing broke Enigma. Second, when Marcus, the main character of the novel, said “you were vulnerable to someone smarter than you coming up with a way of breaking it,” I was reminded of the several in-class discussions about always assuming your cipher is breakable.

Throughout history, there have been several instances of cipher makers being overconfident in the security of their ciphers. For example, in the 16th century, Mary Queen of Scots employed a substitution cipher using numbers and symbols. Unfortunately for her, she placed too much confidence in both her cipher and her contacts; through frequency analysis, her cipher was broken, and her plot to escape imprisonment and murder Queen Elizabeth I was revealed. In 1587, Mary Queen of Scots was executed.
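
To give a sense of how little machinery that attack requires, here is a minimal sketch of the idea behind frequency analysis (my own illustration, not a reconstruction of Mary’s actual cipher, which also used symbols and code words): count the letters in the ciphertext and pair the most common ones with the most common letters in English.

```python
from collections import Counter

# English letters from most to least common (approximate ordering).
ENGLISH_ORDER = "ETAOINSHRDLCUMWFGYPBVKJXQZ"

def frequency_guess(ciphertext: str) -> dict:
    """First-pass guess at a monoalphabetic substitution key: pair the
    most frequent ciphertext letters with the most frequent English
    letters. A real attack refines this using word patterns and context."""
    letters = [c for c in ciphertext.upper() if c.isalpha()]
    ranked = [letter for letter, _ in Counter(letters).most_common()]
    return dict(zip(ranked, ENGLISH_ORDER))

# Hypothetical ciphertext, just to show the mechanics; longer texts
# give much more reliable frequency counts.
print(frequency_guess("XLMW MW SRPC E WLSVX IBEQTPI QIWWEKI"))
```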

Another example of cipher overconfidence came from the Germans during WWII. Their Enigma machine was incredibly secure, so naturally the Germans assumed it was unbreakable. However, the British had a team of highly intelligent mathematicians on their side, including Alan Turing, who discovered flaws in Enigma and was therefore able to break it. He created a machine called the “bombe” that could recover the daily German Enigma key. The Germans were unknowingly sending messages that were being intercepted and deciphered, and their overconfidence in their cipher contributed to their ultimate downfall.

Overconfidence in one’s security plays a large role in Little Brother. With the Department of Homeland Security monitoring nearly everything from transportation routes to Internet usage, Marcus and his friends are in serious need of ciphers and codes to protect their privacy. Marcus learns early on that for every bright idea, there exists another that is better. Most likely, there is always someone who’s smarter.

Timelines

Some timeline links you’ll need for class today:

And the syllabus description of your upcoming second paper assignment:

  • In this paper, you’ll identify one or more lessons about keeping secrets drawn from historical examples of codes and ciphers—examples we’ve read and discussed, as well as ones we haven’t. This paper will give you the chance to practice your descriptive writing, while using examples and stories to support a central argument.
  • Here’s where that timeline will come in handy, by providing leads for examples you could use in your paper. Also potentially useful: the essays on historical codes and ciphers written by students in the 2012 and 2014 offerings of this course, available on the course blog. You’ll need to do some original research, too.
  • Your paper should be between 1,000 and 1,250 words in length. It will be graded primarily on the quality of your examples and how well you connect your examples to your thesis.

