Cryptography

The History and Mathematics of Codes and Code Breaking


Riemann Sum (feat. Technology)

Technology has quite literally transformed our lives. We live in an age of undeniable prosperity and freedom, where even our poorest live a better life than ancient kings. But in recent years, the very technologies we use for pleasure have been turned against us by governments and bad-faith actors. Of course we don't live in an era of absolute freedom; we agree to cede some of our rights for safety and security. For example, we as a society agree on the use of surveillance cameras as a means of deterrence and protection, but are we ready to make the leap to facial ID? We agree that police should use DNA testing to solve crimes, but what about an artificial-intelligence reconstruction of a suspect that may be flawed?

One of the most striking passages in Little Brother comes on page 42, where Cory Doctorow discusses how, despite advancements that allow gait-recognition software to identify individuals from their movements, the software's success rate is reduced by any number of external factors, including floor material, ankle angle, and your energy level. This variability can lead to errors in the system, which can have devastating consequences, especially when people's lives and security hang in the balance. The title, I believe, accurately reflects our society's desire to perfect our creations: we input more data points, update more software, and create new tools in a never-ending journey to build the perfect AI. But at what point do the ethical complications of such a tool cause sufficient harm that an objective cost-benefit analysis would overturn its progress? No matter how many data points we inject, a piece of technology will never perfectly emulate the human mind. Every error caused by the inaccuracy of technology threatens our stability, and is only magnified as the scope of the instrument expands. The NSA offers one particular example: what would be the fallout of an inaccurate terror watch list compiled using the latest data points? Although this question is astronomical, it is important that we examine this issue with the utmost scrutiny.

Little Brother: Innocence and Regret

Little Brother by Cory Doctorow, as it follows its protagonist, Marcus, juxtaposes complex technological and political conflicts with intimate moments of childhood and adolescence. The passage that resonated with me the most, and by far my favorite, is the one in which Marcus recounts his memories of LARPing, or Live Action Role Playing. He delves deep into his memories of how much he used to love playing these games, and how one specific game led him to an extremely embarrassing moment. This passage was particularly important to me because it touched upon two prevalent themes in my life: the longing for childhood innocence and the difficulty of moving on from a moment you intensely regret.

In this passage, Marcus begins to recount his fondest memories of childhood. He talks about "scout camps," which were weekend-long role-playing games. He remembers how in the first one he was a wizard, wholly invested in the character, feeling that his only goal was to seek out the one person who was his designated target. The important thing is that Marcus called these games his "favorite thing in the world." This is something that keeps me up at night. I can't think of anything in the world that could be more fun than being a kid, having the ability to totally forget the world around you and create a new one. I can't think of anything more fun than truly believing that you are a wizard and that all your best friends are other magical beings. I often worry that I will never be able to achieve that level of release from real-world problems. I often worry that I will never be able to achieve the level of innocence and bliss I felt playing as a kid. I often worry that my best days are behind me.

As Marcus nears the end of his story, he gets to the point. He used to play another LARPing game which involved pretending to be a vampire and running around a hotel. At one point, a reporter who was staying at the hotel where the game was being played asked Marcus what he was up to. Marcus responded with a funny lie about how his tribe was on a search for someone in their royal bloodline after they had lost their prince. The journalist, however, published this as a story. In the end, Marcus's joke belittled the LARP game and, more importantly, made him the subject of great teasing and humiliation. Marcus begins to describe how it feels to think back on this memory of embarrassment, and there are few literary passages I've related to more than his description. I know how it feels to have done or said the wrong thing and to think back on it. It is almost a physical pain. The intensity of regret mixed with the hopeless desire to change the past momentarily overcomes you. This is how Marcus felt about what he did, and it is how I have felt about many of my words and actions throughout life. That is why this passage touched me.

Good Bad Secrets

After San Francisco's security overhaul, one of the latent consequences was all the "not-terrorists" who were caught as a result of the increased surveillance measures. Marcus specifically mentions husbands and wives caught cheating, kids caught sneaking out, and one teenager whose parents discovered he had been visiting the clinic for AIDS medication. These people certainly aren't terrorists - in fact, they're not even drug dealers, thieves, or criminals to any extent. They aren't guilty people, just "people with secrets" (121).

I believe the ability to keep secrets, to some extent, is a completely necessary aspect of any society. I'm not saying that sneaking out is right or wrong, and I'm certainly not saying everyone should cheat on their spouses, but these are things that should be discovered (or not) and dealt with by the family, not the government. The government has a duty to ensure the safety of its citizens, but only after obtaining their consent. And in this case, citizens did not consent to having details of their private, personal lives exposed. Take, for example, a sexually active gay teen growing up in an extremely religious and conservative family. He may need to visit Planned Parenthood to obtain information and medication to stay safe; however, he may not have come out to his parents yet and may not want them knowing this information for a multitude of reasons. Though this case is nuanced, it represents a broader category of secrets that are kept for the benefit of both the individual and the family. There will always be secrets that need to be kept and actions that need to be hidden, and it is not the government's duty to interfere.

Privacy vs. Security

For many years, the debate about encryption and hiding messages has come down to one trade-off: personal privacy vs. communal security. In his article "Mining Student Data Could Save Lives," Michael Morris takes a strong stance on this debate. His argument pertains explicitly to universities: he claims that universities could prevent tragedies if they looked into students' data more. He believes that a technique called "data mining" could be used to prevent events like stalking, suicide, and mass shootings on campuses.

Morris begins his article with an analogy between a school shooter and a crystal ball. He portrays a vivid image of a student holding a Glock and then states, "If only there had been a way to look into a crystal ball and see that this horrific confrontation was about to occur, it could have been prevented." This sets up his main argument that schools could prevent serious tragedies if only they had a closer look into the lives of their students. Morris then explains that this "crystal ball" is, in a way, possible through data mining. Data mining would involve a process similar to, as Morris explains, credit card tracking. When a credit card company sees that you have an irregular pattern of spending, it will shut off your card because of the possibility that it has been stolen. Similarly, certain patterns of behavior online can indicate to a university potential real-life actions. An online history of looking at automatic weapons might alert the university to a potential shooter threat. Knowledge of a Google Drive draft of a suicide note might let the university know of a potential victim of suicide. With the right data, the university may be able to save lives. However, people have begun to value their data privacy so much that they have a problem with universities tracking these sorts of data. Still, Morris argues that the benefit is worth losing some privacy.
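Morris's credit-card analogy boils down to baseline-and-deviation flagging: learn a user's normal pattern, then flag observations that stray far from it. The sketch below is a toy illustration of that idea only; the data, z-score rule, and threshold are hypothetical and are not any company's actual fraud-detection system.

```python
# Toy baseline-and-deviation flagging: an observation is "anomalous" if it
# lies far outside the spread of the user's established history.
from statistics import mean, stdev

def is_anomalous(history, new_value, threshold=3.0):
    """Return True if new_value is more than `threshold` standard
    deviations away from the mean of past behavior."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return new_value != mu
    return abs(new_value - mu) / sigma > threshold

# Hypothetical daily spending (in dollars) for one cardholder.
baseline = [42, 35, 51, 38, 47, 40, 44]
print(is_anomalous(baseline, 45))   # ordinary purchase -> False
print(is_anomalous(baseline, 900))  # sudden spike -> True
```

The same skeleton could in principle score any behavioral signal, which is exactly why the privacy question matters: the detector does not know whether the "spike" is a stolen card, a gift purchase, or something private.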

I completely agree with Morris's argument. First, this system wouldn't even involve a major sacrifice of privacy. It wouldn't monitor students talking about drinking or parties or anything of that sort. It would only monitor for behaviors that could pose a serious threat to students. Second, I believe that most people fear systems like the one Morris describes not because they value privacy so much, but because of how the government's similar systems have not worked out. In the post-9/11 world, the US government has become notorious for non-consensually taking citizens' data and doing nothing good with it. People fear that it will be the same with universities. The difference is that a university can do far less to hurt a person than the government can, and universities would be operating smaller systems with a much more specific task. The potential for data abuse is much smaller. For those reasons, I believe that universities should be doing whatever they can to prevent these tragedies.

Mining Mystery: Should we mine Student Data for more Protection?

Morris’s central argument revolves around the incorporation of student data mining in order to counter possible future threats. He calls this “the next natural step” in using private information to prevent external threats. Morris goes on to detail how administrators could track social media usage, shopping patterns, and further online activity in order to make assessments on whether a credible threat exists. 


The central issue in this debate lies between privacy and security. Are students' rights to privacy outweighed by administrators' need to provide safety and security for their students? This question isn't limited to college campuses, but can be applied to society as a whole. Discussing the role of authority, particularly governments, in our daily lives is of the utmost importance and a daily ideological struggle. I both agree and disagree with Morris's argument. It's important for administrators to do whatever is necessary to protect their students, but violating students' privacy is not the path to take. Aside from the obvious moral dilemma, such an act could give more power to authority and reduce self-accountability. Allowing the administration to monitor what students do online would lead to mistrust; dangerous, secretive behaviors; and a need for students to "hide" what they are doing online. A common-sense solution would combine certain aspects of Morris's argument with the other side. Allowing the student population to decide which aspects of their online lives they want monitored would lend more credibility to the administration's efforts to increase safety, as well as provide increased trust in and accountability of authority.


How much power we are willing to give authority is a central tenet of modern society, and no discrete answer exists. The best possible solution takes into account both sides’ arguments and will help administrators provide better security while also protecting student privacy.


A Slippery Slope

Shootings, suicides, and other similar acts of violence, especially on campuses, have become more prevalent in the last decade than ever before. The free internet (though not one of the larger reasons for this increase, in my opinion) has expanded the accessibility of the resources needed to commit such acts. And in most cases, in the "aftermath of every large-scale act of campus violence," officials and investigators discover warning signs that, had they been found beforehand, could have given authorities reason to intervene.

In his essay "Mining Student Data Could Save Lives," Michael Morris argues for the use of data mining on campuses to prevent incidents of campus violence. Since the university essentially controls both the wired and wireless internet networks, campus administration has the tools to use algorithms to identify at-risk student behavior. Morris believes that universities should take advantage of this ability to maximize campus safety. Since we already give up so much of our privacy and personal information through social media, what does it matter if we lose a little more?

I agree with this argument, but only to a slight extent. Giving universities the freedom to survey and monitor student activity on their networks could be an extremely slippery slope if not handled seriously and carefully. While I do agree that the benefits of preventing large-scale acts of violence outweigh the need for complete privacy, universities should be limited in how much access they have to student information and in how they use it. FERPA could possibly be modified to give universities more freedom when it comes to monitoring online student activity, but in a limited and controlled way. Contrary to how Morris makes it seem, there is a substantial amount of information on our computers that we haven't given up through social media. Though our lives are largely public, I do value personal privacy to some extent. Despite the continually growing need for surveillance and intervention to prevent violence, I believe giving universities too much power could open a can of worms that may be difficult to close.

Drawing a Fine Line between Safety and Privacy

Ever since America was hit in the face with the realities of international and domestic terrorism, starting with the tragic morning in September of 2001, or even as far back as Columbine in 1999, our country has had a skeptical outlook on the privacy and safety of our citizens and our country as a whole. Although everyone can agree that safety is one of our utmost priorities, many individuals become defensive when personal benefits and freedoms are at stake. Michael Morris is confident that, while we continue to tug back and forth at exactly where the line should be drawn, college campuses should take full advantage of what they have in hand to keep their students safe.

College campuses have the ability to use student data from their systems to track potential threats, particularly on-campus violence and threats of it. Morris calls this the "crystal ball" that colleges can use to work toward campus safety in general. Morris goes on to discuss various points that require attention, primarily the distinction between the intent of safety and intrusion, along with all the sub-points that fall under that umbrella. In past years, the Department of Education, in cooperation with several universities, has clarified policies such as FERPA to give universities more leverage when they feel they need to act on situations that raise concern or pose a threat.

Personally, I fully agree with Morris's argument, primarily because, as a college student in an age when society has become numb to constant breaking news of shootings and acts of domestic terrorism, I believe some action should be taken even if there is controversy and conflict about it. To ensure that our culture does not crumble into pieces, there should and must be an immediate action plan that allows campuses to do what is in their power to provide safety for all their students. From there, we have the ability to build a new culture that works toward safety for all our citizens.

Can mining students' data work?

The central argument of Morris's essay is that although mining students' data cannot perfectly predict campus violence, and may raise issues of collecting students' private information without permission, the method is still helpful in reducing the likelihood of school violence. I agree with it. By surveilling students' data on the internet, the school can protect both the student and the campus. First, if the school uses data mining to track students' search frequency and their comments on social media, it can analyze the data and find out what is responsible for any worrying behavior. After this, the school can send faculty to address the mental-health problems and prevent the situation from becoming more serious. What's more, if the school resolves students' mental-health issues, it will create a healthier environment for studying. This improvement can create a loop: a healthier environment leads to fewer mental-health issues, and fewer mental-health issues lead to less campus violence. Thus, mining students' data creates more benefits compared to doing nothing. However, the school should still keep the data private rather than telling the whole faculty about a student's situation in order to prevent violence. The school should use proper methods to address students' mental-health issues in order to build a safer campus life.

Weak or Bust: Why the Strength of an Encryption Matters

There are two reasons why a weak encryption can be worse than no encryption at all: the first is a sense of overconfidence that can prove fatal if the encryption is broken, and the second is tipping off codebreakers, who often become more cautious and scrutinize your messages further. As explicitly outlined in the book, the overconfidence of Mary, Queen of Scots clearly demonstrates the dangers of not creating a strong cipher. By disregarding caution and placing misguided faith in a weak cipher, she inadvertently revealed more information than she would have had she exercised caution. It's also necessary to note that Mary believed she had a strong cipher, providing one more reason why caution must always be exercised even if you believe your code to be unbreakable. This train of thought can be applied to a multitude of situations: when engaging in a secretive activity, or one you would prefer others not know of, it's better to err on the side of caution.

But perhaps more importantly, a bad cipher may alert the enemy that a hidden message exists. A seemingly legible message that holds a deeper meaning may be scrutinized more closely if the decryptor suspects a cipher at play. This can be even more dangerous, as a heightened sense of awareness and caution could lead to both direct and indirect long-term consequences for sender and recipient. Thus, no encryption can often be more effective than a poorly made one.
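As an illustration of how little effort a weak cipher demands of an attacker, consider a simple Caesar shift: there are only 26 possible keys, so an interceptor can try them all in milliseconds. This is a generic sketch of exhaustive key search, not a reconstruction of any historical cipher, and the message is hypothetical.

```python
# Brute-forcing a Caesar cipher: with only 26 keys, trying every one is trivial.
def caesar_decrypt(ciphertext, shift):
    """Shift each letter back by `shift`, leaving other characters alone."""
    result = []
    for ch in ciphertext:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            result.append(chr((ord(ch) - base - shift) % 26 + base))
        else:
            result.append(ch)
    return ''.join(result)

def brute_force(ciphertext, crib):
    """Try all 26 shifts; return the first candidate containing a crib word."""
    for shift in range(26):
        candidate = caesar_decrypt(ciphertext, shift)
        if crib in candidate.lower():
            return shift, candidate
    return None

# Encrypt by shifting forward 3 (a negative "decrypt" shift), then crack it.
ciphertext = caesar_decrypt("attack at dawn", -3)
print(ciphertext)                        # dwwdfn dw gdzq
print(brute_force(ciphertext, "attack"))  # -> (3, 'attack at dawn')
```

The crib word stands in for the statistical tests (such as letter-frequency analysis) a real codebreaker would use; either way, the weak cipher both fails to protect the message and confirms that its sender had something to hide.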

In an increasingly complex and secretive world, it's necessary that people realize whatever codes they create can be broken by online tools accessible to billions. It is both important and necessary to exercise restraint and caution when sending hidden messages - failure to do so may result in harsher penalties than if you had not attempted to encode your message at all.

To Code or Not to Code

What Singh is implying to coders is that cryptographic messages of high importance should be done well. For instance, if the contents of the message are a correspondence about a politician having an affair, then it would make sense for the code to be very strong so the politician's job isn't jeopardized.

On the other hand, if it is a playful message sent between friends with no real significance, then the consequences of it being cracked are not drastic; it doesn't make a difference how strong the code is. But if the stakes of being caught are high, then make a strongly coded message that nobody will figure out.

Singh means that if a message had been sent in plain text instead of being poorly coded, it would simply be a matter of "Yeah, this is hard evidence against you and there's no denying it." It would be used, and that would be that. But because the message was supposed to be hidden, the cipher adds an extra layer of distrust to the case and provides further justification for conviction. In the case of Mary, Queen of Scots, she was committing the most incriminating offense possible against the Crown, and the fact that she made a code to hide her plans hurts her legacy in the end.

