Cryptography

The History and Mathematics of Codes and Code Breaking

Author: Maria Sellers

Necessity is the Mother of Public Use

Necessity is the mother of invention. It wasn’t until the late 19th century that cryptography began to amass a large public following. According to Singh, it was the invention of the telegraph that made the use of ciphers common among the general public. Since telegraph operators had to read a message in order to send it, those who wished to send more sensitive or private messages had to find ways to maintain their privacy. As Singh puts it, “The telegraph operators had access to every message, and hence the risk that one might bribe an operator in order to gain access to a rival’s communications” (Singh, 61). To protect their messages, many people began using simple monoalphabetic ciphers to encrypt their correspondence before sending it. This was more expensive and more time-consuming, but the messages were unintelligible to the average nosy telegraph operator.
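The kind of monoalphabetic cipher a telegraph customer might have used is easy to sketch in modern terms. Here is a minimal Python illustration; the scrambled alphabet is an arbitrary example key of my own, not a historical one:

```python
import string

# A fixed scrambled alphabet stands in for whatever substitution the
# sender and recipient agreed on in advance (this key is illustrative).
PLAIN = string.ascii_uppercase
CIPHER = "QWERTYUIOPASDFGHJKLZXCVBNM"

ENC = str.maketrans(PLAIN, CIPHER)
DEC = str.maketrans(CIPHER, PLAIN)

def encrypt(message: str) -> str:
    """Substitute each letter; spaces and punctuation pass through."""
    return message.upper().translate(ENC)

def decrypt(ciphertext: str) -> str:
    """Reverse the substitution using the same agreed-upon key."""
    return ciphertext.translate(DEC)

ct = encrypt("MEET AT NOON")
print(ct)           # DTTZ QZ FGGF -- gibberish to a nosy operator
print(decrypt(ct))  # MEET AT NOON
```

With 26 factorial possible substitution alphabets, such a cipher looked unbreakable to its Victorian users, even though, as Singh shows, frequency analysis makes short work of it.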

The public only became interested in ciphers once they had a reason to be: they needed to keep their information private. It is much easier to trust that a letter in a sealed envelope will reach its intended recipient unread than a message sent through another person, although, as seen with Mary Queen of Scots, even that is not always the case. Once ciphers became known to the general public, however, they quickly gained popularity. They were not only useful but also a fun diversion. Victorian lovers used ciphers to send each other notes in the newspaper, The Times was tricked into printing an unflattering encrypted comment about itself, and Edgar Allan Poe wrote a short story centered on cryptography, “The Gold-Bug.” Ciphers, albeit fairly simplistic ones, were suddenly everywhere. This is why today even schoolchildren will come up with monoalphabetic ciphers like those that had once stumped the cryptanalysts of the world. Ciphers have become a deeply ingrained part of our culture.

That being said, there is less interest in ciphers among the general public today. While we still romanticize ciphers and codes in movies, books, and other media, we don’t have the practical cryptographic skills that we once did. Phones and email have removed the middleman, the operator, from the equation; it appears that there is no need to encrypt our messages anymore. While there is still an interest in cryptography, few people ever go beyond the simple monoalphabetic or shift ciphers of their schoolyard days.
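The shift cipher of those schoolyard days is simpler still: every letter slides the same number of places down the alphabet. A Caesar-style shift can be sketched as follows (the three-place shift is just the classic textbook choice):

```python
import string

ALPHABET = string.ascii_uppercase

def shift(message: str, key: int) -> str:
    """Rotate each letter `key` places; negative keys undo the shift."""
    rotated = ALPHABET[key % 26:] + ALPHABET[:key % 26]
    return message.upper().translate(str.maketrans(ALPHABET, rotated))

ct = shift("ATTACK AT DAWN", 3)
print(ct)            # DWWDFN DW GDZQ
print(shift(ct, -3)) # ATTACK AT DAWN
```

Since there are only 25 possible keys, anyone can break such a cipher by trying them all, which is precisely why it survives as a diversion rather than a serious tool.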

Getting Our Priorities in Order

“The role of government is to secure for citizens the rights of life, liberty and the pursuit of happiness. In that order. It’s like a filter. If the government wants to do something that makes us a little unhappy, or takes away some of our liberty, it’s okay, providing they’re doing it to save our lives” (Doctorow, 209). This is an excerpt from Little Brother, a novel by Cory Doctorow. At this point in the story, a DHS-approved social studies teacher, Mrs. Andersen, has replaced Ms. Galvez, the regular teacher. Mrs. Andersen is explaining the Bill of Rights to the students from the DHS perspective.

Protecting life, liberty, and the pursuit of happiness “in that order” is not only useless but dangerous. If your first concern is always to protect your life above all else, then you shouldn’t get out of bed in the morning. Think of all the terrible accidents that could happen just by getting out of bed. Not that staying in bed is much safer; what if a tree falls on your house? When you sacrifice liberty and the pursuit of happiness out of fear, you are no longer living at all. If the government has the right to take away your freedoms at will, as long as it can claim to be “protecting life,” then there is no government action that is unjustifiable.

This is an obvious example of how the trade-off between privacy and security becomes a slippery slope. If the Bill of Rights is treated as a set of guidelines, then what is to prevent ideas like these? The framers of our Constitution did not intend for the Bill of Rights to be flexible; these rights are absolute. When they become less than that, we open ourselves up to the very tyranny that those rights are meant to protect us from. It’s time that we recognize this, time that we truly get our priorities in order, before it’s too late. If the government has unrestricted access to all personal data, then we have failed to live up to the ideals that make this country what it is. Even in the name of remaining safe, of preventing terror, we will be causing it. “Innocent until proven guilty” could quickly become “guilty until proven innocent.” If we prioritize security over privacy, it may not be long before we are living in a world like that of Little Brother.

Mining Student Data Poses More Threats Than it Resolves

In the article “Mining Student Data Could Save Lives,” Michael Morris makes an interesting point about data mining on college campuses. According to Morris, since college students are already using accounts and internet access provided by the school, there is no reason that colleges should not be able to monitor student data for early warning signs of mental instability. Morris writes, “…the truth is that society has been systematically forfeiting its rights to online privacy over the past several years through the continued and increased use of services on the Internet” (Morris). That much is true. Between social media, Google searches, and smartphones, most of our lives are now largely digital. That does not mean, however, that I agree with Morris’s sentiments regarding colleges mining their students’ data.

It all comes down to a basic question of security versus privacy. How much of our privacy are we willing to give up in the interest of staying safe? The better question might be: how much of our privacy can we give up while still staying safe? Who is to say that the school officials monitoring the data would be completely aboveboard? I realize that college staff are usually very trustworthy, but there are always exceptions to the rule. Imagine what one corrupt school official could do with access to all of that data. Additionally, once those back channels are established, what is to prevent an accomplished hacker from abusing them? Data mining may be intended to keep us “safe,” but it actually opens the door to a whole new set of problems that colleges may not be equipped to deal with.

It is also important to consider the consequences of false threats. If a school decides that a student’s activity is suspicious, it would intervene. But what if the school was wrong? For example, I have had some strange Google search histories in the past. I have always wanted to write a murder mystery, and I have researched various poisons to see whether they would work in my plot. It is likely that, should my college be monitoring my activity, those searches could be flagged as dangerous. Even if my search histories were an exception to the rule, how would schools avoid adopting a “guilty until proven innocent” mentality in the interest of keeping everyone “safe”? Morris’s idea has good intentions, but it ultimately creates more problems and potential security threats than it solves.

Morris, Michael. “Mining Student Data Could Save Lives.” The Chronicle of Higher Education, 2 Oct. 2011, www.chronicle.com/article/Mining-Student-Data-Could-Save/129231/.

Finding the Balance of Confidence and Cryptography

The bigger they are, the harder they fall. In chapter one of The Code Book, Simon Singh states that “…a weak encryption can be worse than no encryption at all” (Singh, 41). When it comes to cryptography, this could not be more true.

A successfully encrypted message should be decipherable only by the intended recipient; otherwise it fails to accomplish its purpose. As a result, those responsible for encrypting the message must be certain that, without the proper key, their message is indecipherable. This, however, is a dangerous assumption. Overconfidence can lull cryptographers and their intended recipients into a false sense of security, causing them to let their guard down. In the case of the Babington plot, for example, both Mary Queen of Scots and Anthony Babington assumed that their cipher was unbreakable and spoke quite openly about their plans in their correspondence. As a result, when Thomas Phelippes managed to crack their cipher, he effectively signed their death warrants. Had Mary Queen of Scots and Babington been less assured of the strength of their cipher, they would never have written out their plans as plainly as they did.

Additionally, much depends on the abilities of the cryptanalysts of the time. For example, the Spanish cryptographers that Singh refers to on pages 28 and 29 of his book believed their code to be indecipherable. When they discovered that their codes were, in fact, quite transparent to a French cryptanalyst, Philibert Babou, they could not accept it. They had been so confident in their ciphers that they went so far as to suggest that Babou was in league with the devil. Such overconfidence is a constant danger to cryptographers.

Confidence is one of the most basic conundrums of cryptography. On the one hand, if cryptographers are overly confident in their ciphers, they risk exposure should those ciphers be broken. On the other, if a cryptographer is not confident enough in their cipher, then there is no sensible reason to risk using it for secret correspondence. The answer must lie somewhere in the middle. Cryptographers must have enough faith in their own work to use their ciphers, yet they must be wary enough to watch what they say.

Singh, Simon. The Code Book: The Science of Secrecy from Ancient Egypt to Quantum Cryptography. Anchor Books, 2000.
