Cryptography

The History and Mathematics of Codes and Code Breaking

Author: konacr

Riemann Sum (feat. Technology)

Technology has quite literally transformed our lives. We live in an age of undeniable prosperity and freedom, where even our poorest live better lives than ancient kings. But in recent years, the very technologies we use for pleasure have been turned against us by governments and bad-faith actors. Of course, we don’t live in an era of absolute freedom; we agree to cede some of our rights for safety and security. For example, we as a society agree on the use of surveillance cameras as a means of deterrence and protection, but are we ready to make the leap to facial ID? We agree that police should use DNA testing to solve crimes, but what about an artificial intelligence reconstruction of a suspect that may be flawed?

One of the most striking passages from Little Brother comes on page 42, where Cory Doctorow discusses how, even though advances in gait recognition software allow individuals to be identified by the way they move, the software’s success rate is reduced by any number of external factors, including floor material, ankle angle, and the subject’s energy level. This variability can lead to errors in the system, which can have devastating consequences, especially when people’s lives and security hang in the balance. The title, I believe, accurately reflects our society’s desire to perfect our creations: we input more data points, update more software, and create new tools in a never-ending journey toward the perfect AI tool. But at what point do the ethical complications of such a tool cause enough harm that an objective cost-benefit analysis would weigh against its progress? No matter how many data points we inject, a piece of technology will never perfectly emulate the human mind. Every error caused by the inaccuracy of technology threatens our stability, and is only magnified as the scope of the instrument grows. The NSA offers one particular example: what would be the fallout of an inaccurate terror watch list compiled from the latest data points? Although this question is enormous in scope, it is important that we examine it with the utmost scrutiny.
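To make the title’s metaphor a little more concrete, here is a small sketch of my own (the function and numbers are made up for illustration, not taken from Doctorow’s book): a left Riemann sum approximates the area under a curve better and better as we add more rectangles, but any finite number of data points still leaves an error, just as no amount of data makes a surveillance tool perfect.

```python
# A small illustration of the title's metaphor (my own example):
# a left Riemann sum approximates the area under f(x) = x^2 on [0, 1],
# whose true value is 1/3. More rectangles shrink the error,
# but any finite number of data points still leaves a gap.

def left_riemann_sum(f, a, b, n):
    """Approximate the integral of f on [a, b] with n left-endpoint rectangles."""
    width = (b - a) / n
    return sum(f(a + i * width) for i in range(n)) * width

f = lambda x: x ** 2
for n in (10, 100, 1000, 10000):
    approx = left_riemann_sum(f, 0.0, 1.0, n)
    print(f"n={n:6d}  approximation={approx:.6f}  error={abs(approx - 1/3):.6f}")
```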

Mining Mystery: Should We Mine Student Data for More Protection?

Morris’s central argument is that schools should incorporate student data mining in order to counter possible future threats. He calls this “the next natural step” in using private information to prevent external threats. Morris goes on to detail how administrators could track social media usage, shopping patterns, and other online activity in order to assess whether a credible threat exists.


The central issue in this debate lies between privacy and security. Are students’ rights to privacy outweighed by administrators’ need to provide safety and security for their students? This question isn’t limited to college campuses; it can be applied to society as a whole. Discussing the role of authority, particularly governments, in our daily lives is of the utmost importance and a daily ideological struggle. I both agree and disagree with Morris’s argument. It’s important for administrators to do whatever is necessary to protect their students, but violating students’ privacy is not the right path. Aside from the obvious moral quandary, such an act could give more power to authority and reduce self-accountability. Allowing the administration to monitor what students do online would lead to mistrust; dangerous, secretive behaviors; and a need for students to “hide” what they are doing online. A common-sense solution would combine certain aspects of Morris’s argument with those of the opposing side. Allowing the student population to decide which aspects of their online lives they want monitored would lend more credibility to the administration’s efforts to increase safety, as well as build trust in, and accountability of, authority.


How much power we are willing to give authority is a central question of modern society, and no clear-cut answer exists. The best possible solution takes both sides’ arguments into account and helps administrators provide better security while also protecting student privacy.


Weak or Bust: Why the Strength of an Encryption Matters

There are two reasons why weak encryption can be worse than no encryption at all: first, it breeds a sense of overconfidence that can prove fatal if the encryption is broken; second, it can tip off codebreakers, who often become more cautious and scrutinize your messages further. As explicitly outlined in the book, the overconfidence of Mary, Queen of Scots clearly demonstrates the dangers of not creating a strong cipher. By disregarding caution and placing misguided faith in a weak cipher, she inadvertently revealed more information than she would have had she exercised caution. It’s also worth noting that Mary believed she had a strong cipher, which provides one more reason why caution must always be exercised even if you believe your code to be unbreakable. This line of thought can be applied to a multitude of situations: when engaging in a secretive activity, or one you would prefer others not to know of, it’s better to err on the side of caution.

But perhaps more importantly, a bad cipher may alert the enemy that a code is in use. A seemingly innocuous message that holds a deeper meaning may be scrutinized more closely if the codebreaker suspects a cipher is at play. This can be even more dangerous, as a heightened sense of awareness and caution could lead to both direct and indirect long-term consequences for sender and recipient. Thus, no encryption at all can often be more effective than a poorly made one.

In an increasingly complex and secretive world, people need to realize that whatever codes they create can often be broken by online tools accessible to billions. It is both important and necessary to exercise restraint and caution when sending hidden messages – failure to do so may result in harsher penalties than if you had not attempted to encode your message at all.
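To see how quickly a weak code falls, here is a minimal sketch of my own (the ciphertext is invented for illustration, not taken from any of the readings): a few lines of Python that brute-force a Caesar shift cipher, the kind of substitution any free online tool can crack instantly.

```python
# A minimal sketch: brute-forcing a Caesar shift cipher.
# The ciphertext below is a made-up example.

def caesar_decrypt(ciphertext, shift):
    """Shift every letter back by `shift` positions, leaving other characters alone."""
    result = []
    for ch in ciphertext:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            result.append(chr((ord(ch) - base - shift) % 26 + base))
        else:
            result.append(ch)
    return ''.join(result)

ciphertext = "Phhw dw gdzq"  # "Meet at dawn" shifted by 3

# Trying all 25 possible shifts takes a fraction of a second;
# the genuine plaintext is easy to spot by eye.
for shift in range(1, 26):
    print(shift, caesar_decrypt(ciphertext, shift))
```

With only 25 possible shifts, the real message stands out immediately, which is a useful reminder that a cipher’s safety cannot rest on the hope that no one will bother to try.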
