On page 99 of Little Brother by Cory Doctorow, Marcus lays out the flaws of secret cryptography and how cracking the Enigma ultimately helped the Allies defeat the Nazis in WWII. One of the flaws was secrecy: once Alan Turing cracked the Enigma, any Nazi message could be deciphered, because “Turing was smarter than the guy who thought up Enigma” (99). This leads to the realization that any security system is “vulnerable to someone smarter than you coming up with a way of breaking it” (99). Bruce Schneier makes a similar point about flawed security systems in his Afterword, explaining that designing a system entirely by yourself is of little use because you have no way to spot the flaws in your own creation. You are limited by what you know. Outsiders who think differently can help by attacking the system from angles its designer never considered.
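
Just to make that idea concrete for myself (this little sketch is my own, not from the book): imagine a home-rolled cipher whose designer feels safe because the shift value is kept secret. An outsider who only knows the general approach can still break it by trying every possibility. The cipher, the key, and the crib word below are invented purely for illustration.

# A toy "home-rolled" cipher: shift every letter by a secret amount.
# The designer might feel safe because the shift is a secret.
def encrypt(plaintext, shift):
    return "".join(
        chr((ord(c) - ord("a") + shift) % 26 + ord("a")) if c.isalpha() else c
        for c in plaintext.lower()
    )

# An outsider who never learned the "secret" can still break it:
# there are only 26 possible shifts, so they simply try them all
# and look for a word they expect to appear (a crib).
def break_by_brute_force(ciphertext, crib="attack"):
    for shift in range(26):
        guess = encrypt(ciphertext, -shift)
        if crib in guess:
            return shift, guess
    return None

ciphertext = encrypt("attack at dawn", 7)
print(break_by_brute_force(ciphertext))  # recovers both the shift and the message

The point isn't the code itself; it's that the designer's secret did nothing once someone else looked at the problem from the attacker's side.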

I think this concept is interesting: we are limited by what we know, and everyone around us knows something that we don't. I recently read a piece in Harvard Business Review on how companies and organizations should invite people from different fields to evaluate an idea, because outsiders don't think the same way people inside a particular company do; a mathematician thinks differently than a historian does, and the distance between their ways of thinking can bolster ideas, catch flaws, and suggest possibilities no one has thought of yet. Could this be the way to strengthen our current security systems? What kinds of people do we need to evaluate them? How many people do we need (before the security measure becomes so widely known that it ironically grows more vulnerable)?

I believe this is one of the fundamental questions of cryptology and of all security measures: how do we know a system is safe to use? The truth is, we never really know, but we can come closer by cross-referencing and by learning from past experience, letting security get better and better with each step of the way.