Cryptography

The History and Mathematics of Codes and Code Breaking

Author: dininoed

The Pursuit of Randomness

The section of Cryptonomicon that really caught my attention was the one between pages 422 and 427, which describes the British interception of German messages from U-553. These messages differ from the previously intercepted Enigma traffic: they are encrypted using Baudot code, a code built on thirty-two characters. Because thirty-two is a power of two, each character has a unique binary representation made of five binary digits. As we learned in class, these digits are either 1 or 0.
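The five-bit idea can be sketched in a few lines of Python. The alphabet ordering below is simplified for illustration, not the historical Baudot/ITA2 table:

```python
# A 32-symbol alphabet where each character gets a unique 5-bit
# pattern, as in Baudot code (2**5 = 32 combinations).
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ .,?!/"   # 32 symbols

def to_baudot_like(text):
    """Encode each character as its 5-digit binary index."""
    return [format(ALPHABET.index(c), "05b") for c in text.upper()]

print(to_baudot_like("HI"))   # ['00111', '01000']
```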

My blog post for our last essay dealt with the Lorenz teleprinter cipher and the Lorenz machine. The new messages Waterhouse has discovered are in fact encrypted with the Lorenz cipher. The idea behind the cipher was that if the paper tape used in communication was pre-punched with a completely random set of obscuring characters, the cipher would be unbreakable. However, both sender and receiver would need copies of that tape, which is impractical in wartime. In Cryptonomicon, Waterhouse figures this out, and he and Alan conclude that the obscuring characters in the ciphertext could only be pseudo-random. This lack of true randomness, combined with German operator error, led to the British cracking the Lorenz cipher without ever seeing a Lorenz machine.
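The difference between truly random and pseudo-random obscuring characters can be sketched as follows. The seeded generator here is only a stand-in for the Lorenz machine's wheels; the point is that its output is fully determined by its starting state, so anyone who recovers that state can reproduce the keystream:

```python
import random

def keystream(seed, n):
    # Pseudo-random obscuring characters: completely determined by
    # the seed, unlike the one-time pad's truly random characters.
    rng = random.Random(seed)
    return [rng.randrange(32) for _ in range(n)]

def vernam(symbols, key):
    # XOR each 5-bit symbol with the keystream; applying the same
    # keystream twice recovers the plaintext (encryption == decryption).
    return [s ^ k for s, k in zip(symbols, key)]

plain = [7, 8, 0, 19]              # a message as 5-bit values
key = keystream(42, len(plain))
cipher = vernam(plain, key)
assert vernam(cipher, keystream(42, len(plain))) == plain
```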

This section also discusses the building of Colossus, widely regarded as the first programmable electronic computer. Colossus is ultimately used to decrypt many intercepted German messages, crack the Lorenz cipher, and contribute to many Allied victories. The story of the Lorenz cipher reinforces our class lesson on binary numbers and raises the question of whether a one-time pad's key is ever truly random and unbreakable. This example in Cryptonomicon helped me understand how difficult pure randomness is to achieve, especially in a wartime situation.

Image: “Binary Blanket,” by quimby, Flickr (CC)

The Invisible Hand of the NSA

In the 1970s, the Internet was still a new technology and cryptography was not even considered a legitimate field of mathematics. Cryptography was seen as a pen-and-paper tactic for wartime security, and the general public was not equipped to apply any sort of cryptography to computer technology. In the United States, cryptography was researched and discussed almost exclusively by the National Security Agency (NSA).

In this regard, the NSA wielded considerable knowledge and power. The National Bureau of Standards issued a public request for an encryption algorithm that would be made freely available as a national standard. IBM answered this request with the cipher that became the Data Encryption Standard (DES). Of course, the candidate algorithm needed to be reviewed by an outside body, and the NSA was uniquely qualified and equipped to do so. When presented with the design, the NSA used its position to alter the algorithm slightly and shrink the key, reducing the 128 bits of IBM's original Lucifer design to 56 bits and making the standard far more susceptible to brute-force attack.

People were outraged by the NSA’s ability to exert an “invisible hand” over public security systems. The strength of any given cipher depends directly on the length of its key and the quality of the underlying algorithm. Thus, by shrinking the key, the NSA intentionally weakened DES. The agency did the public a huge disservice by not presenting the most secure algorithm available. It clearly overstepped its boundaries by weakening the algorithm when its task was to analyze and improve it. Rather than improving the cipher, the NSA selfishly left it weak enough that the agency itself could break it.
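The relationship between key length and brute-force effort can be made concrete with a short sketch; the 128-bit and 56-bit figures follow the commonly reported Lucifer-to-DES history:

```python
def keyspace(bits):
    """Number of keys a brute-force attacker must try in the worst case."""
    return 2 ** bits

# Every bit removed from the key halves the attacker's work, so
# trimming 128 bits down to 56 divides the keyspace by 2**72.
lucifer, des = keyspace(128), keyspace(56)
print(des)             # 72057594037927936 possible 56-bit keys
print(lucifer // des)  # 2**72 -- the factor the key reduction removed
```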

The NSA’s actions were unjustified and did not have the public’s best interest in mind. The NSA purposefully limited technological advancement and allowed the public to send confidential information utilizing an algorithm lacking optimal security.

http://news.cnet.com/Saluting-the-data-encryption-legacy/2010-1029_3-5381232.html

Image: “National Security Agency Seal” by DonkeyHotey, Flickr (CC)

A Tricky Web of Trust

The passage in Little Brother that really intrigued me was the one about a “web of trust,” found on pages 153 and 154. The preceding passage discussed public versus private keys and the risks associated with them. It is difficult to distribute a public key widely and verifiably, and a man in the middle can easily confuse two people trying to communicate by secretly intercepting, reading, and altering their messages. The only way to be certain communication is secure is to meet in person and swap keys, creating a web of trust limited by the sheer number of people you can meet face to face. However, if people keep vouching for the keys of others they trust, the ring grows to encompass a larger group within which secure communication is possible.
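The growth of such a web can be modeled as reachability in a graph. This is a minimal sketch with invented names; an edge means two people swapped keys in person or were vouched for by someone who did:

```python
# Toy "web of trust": the web around a person is everyone reachable
# through chains of trusted key exchanges.
trust = {
    "alice": {"bob"},
    "bob": {"alice", "carol"},
    "carol": {"bob"},
    "dave": set(),            # never swapped keys with anyone
}

def web_of(person):
    """Collect everyone reachable via chains of trust (graph search)."""
    seen, stack = set(), [person]
    while stack:
        p = stack.pop()
        if p not in seen:
            seen.add(p)
            stack.extend(trust[p])
    return seen

print(web_of("alice"))   # {'alice', 'bob', 'carol'} -- dave is outside
```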

I think this is incredibly interesting, since it seems that any terrorist or criminal group could use this method to communicate. Most partners in crime meet in person and could devise such a plan to evade any man in the middle trying to intercept their communications. The passage seems to say that if you trust someone enough to meet him or her in person, you can absolutely ensure safe communication with that person. This ties into our discussions of whether the cryptographers or the codebreakers are winning and whether such strong crypto should even be allowed. In this case, the passage seems to claim that cryptographers will always win if they employ this strategy. That raises the question of whether these encrypted messages truly protect innocent people or instead mask and hide criminals and terrorists. The argument could be made that cryptography which is unbreakable unless trust is broken is too strong and can too easily be used for harm. While this method may protect individuals’ privacy, I assume it would also enable dangerous communication and activity.

Image: “Oh What a Tangled Web” by Jenny Downing, Flickr (CC)

The Lure of The Beale Ciphers

The Beale ciphers have stumped some of the greatest cryptographers for over a hundred years. The cipher is most likely based on a piece of literature that serves as the key. That book or text may no longer exist, and some believe Beale himself may have written it. Despite this daunting history, people continue to try to decipher the letters in hopes of finding the treasure described in the second Beale letter. These people are strongly motivated by wealth and fame. The idea of cracking a cipher that so many others have failed at is enticing, as is the promised reward of items worth millions of dollars. Curious treasure hunters and amateur and professional cryptographers alike can’t resist the urge to attempt the frustrating cipher.
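The book-cipher scheme works roughly as follows. This is a toy sketch with an invented key text; the solved second letter famously used the Declaration of Independence as its key:

```python
# Book cipher: each number in the ciphertext points to a word in a
# shared key text, and the first letter of that word is the
# plaintext letter. The key text below is invented for illustration.
key_text = "when in the course of human events it becomes necessary".split()

def decode(numbers):
    """Numbers are 1-indexed positions into the key text."""
    return "".join(key_text[n - 1][0] for n in numbers)

print(decode([3, 6, 5]))   # 't' + 'h' + 'o' -> "tho"
```

Without the right key text, the numbers are meaningless, which is why the unsolved letters resist every attack: the hunt is really for the book, not the algorithm.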

The Great Cipher: 200 Years of Security

Louis XIV used the Great Cipher, invented by Antoine and Bonaventure Rossignol, throughout the seventeenth century. After the Rossignols died, the Great Cipher remained an unsolved mystery until the nineteenth century, when newly discovered texts encrypted with it were passed to the French cryptographer Étienne Bazeries. Bazeries struggled with the cipher for years but eventually deciphered several key historical messages and cracked the Great Cipher.

The Great Cipher used 587 different numbers and, as Bazeries found after many failed attempts, was not a homophonic cipher. He then explored the idea that it was built on digraphs, or pairs of letters. Although this idea was wrong, it ultimately led him to the discovery that the Great Cipher paired numbers with syllables. The cipher proved even more complicated: some numbers stood for single letters while others stood for syllables. There were also traps embedded in the cipher; certain numbers, for example, meant that the preceding number should be deleted.
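A toy model makes this structure clearer. The number-to-syllable assignments below are invented for illustration; the real cipher's table spanned all 587 numbers:

```python
# Toy model of the Great Cipher's structure: numbers map to syllables
# or single letters, and a special "null" number deletes the
# previously decoded group -- a trap for cryptanalysts.
TABLE = {124: "les", 22: "en", 125: "ne", 46: "mi", 345: "s"}
NULL = 999

def decipher(numbers):
    out = []
    for n in numbers:
        if n == NULL:
            if out:
                out.pop()     # the trap: erase the previous group
        else:
            out.append(TABLE[n])
    return "".join(out)

print(decipher([124, 22, 125, 46, 345]))   # les-en-ne-mi-s -> "lesennemis"
```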

The Great Cipher remained secure for 200 years thanks to its complexity and its ingenuity for the period. Manipulating syllables instead of letters was revolutionary in the cryptography world, and the added layers of single letters and nulls made Bazeries’s task even more difficult. It was a remarkably secure cipher that stumped the finest cryptographers for two centuries.

Frequency Analysis Utilized by Amateurs

When Al-Kindi first developed cryptanalysis, it was groundbreaking and highly advanced for his society. Today, our society is highly educated and focuses on building problem-solving abilities in young children. Armed with the knowledge of frequency analysis, many amateur cryptanalysts can easily employ the method to decrypt text.

Society today is exposed to these skills early in development. Using logic to solve puzzles, riddles, and other games is common from a very young age. What counted as a significant level of scholarship when cryptanalysis was first invented now passes for common sense.

Early in the history of cryptography, people were not accustomed to the idea of codes and secret messages. The concept was so foreign that they could not begin to understand how to decrypt them. No one paid attention to which letters appeared most frequently in their language. Today, through basic reading and common experiences such as watching word-guessing game shows, people subconsciously take note of which letters most often appear in words. When amateur cryptanalysts approach an encrypted message, they instinctively try substituting in "e", "t", or "a", for example, knowing these letters form many common words. In the past this was not considered basic knowledge, and people were unaware of this crucial information. Armed with the knowledge of frequency analysis and the logical thinking our society breeds, amateur cryptanalysts can easily use frequency analysis without prior training.
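A few lines of Python show how little machinery an amateur actually needs. The sample ciphertext here is a Caesar shift (by three) of an English pangram:

```python
from collections import Counter

# Minimal frequency analysis: count the letters in a ciphertext, then
# compare the most common ones against typical English frequencies
# ("e", "t", "a", ...) to guess substitutions.
def letter_frequencies(ciphertext):
    letters = [c for c in ciphertext.lower() if c.isalpha()]
    return Counter(letters).most_common()

sample = "wkh txlfn eurzq ira mxpsv ryhu wkh odcb grj"
print(letter_frequencies(sample)[:3])   # 'r' tops the count here
```

On a text this short the top letter ('r', standing for 'o') does not match English's most common letter, which is exactly why frequency analysis works best on long ciphertexts.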
