The History and Mathematics of Codes and Code Breaking

Tag: DES

How Darwin’s Theory of Evolution Applies to Cryptography

Public key cryptography was invented by the academic researchers Diffie, Hellman, Merkle, Rivest, Shamir, and Adleman. They’re the ones who came up with the idea, and they’re the ones who created functions that could work with it. Here’s the issue: British GCHQ researchers Ellis, Cocks, and Williamson did all of those things too. The only difference between the two groups is that the GCHQ researchers couldn’t publish their work because it was classified.

The phenomenon that occurred here also happens in another science: biology. There, it's known as convergent evolution, the independent evolution of the same biological feature by two different species. For example, echolocation evolved in dolphins and whales, and also, independently, in bats. Similarly, birds, bats, pterosaurs, and insects are not closely related to each other, but they all have wings. They don't share some great winged ancestor; they each evolved to fly because flight is a useful ability. The inability to fly was a problem common to all of these animals, and each solved it independently with the development of wings.

Similarly, the American academic researchers and the GCHQ researchers were each facing the problem of key distribution. Cryptography had advanced to the point where making a secure cipher was less challenging than arranging to share the key with the recipient of the cipher. Leading-edge cryptographers had arrived at the same obstacle at around the same time, and they each found the same (or similar) solution to it. That solution came to be associated with the American researchers because the Brits were under oath. They couldn’t even share their findings with their families, much less file a patent. The fact that one group came up with public-key cryptography doesn’t mean that the other didn’t. The two groups independently made convergent solutions.

For The Greater Good

The National Security Agency has been criticized for decades due to the very nature of its purpose; no one likes the idea that someone can read their emails, listen to their phone calls, or act as an observant third-party on any private two-way communication. But, at the end of the day, so long as the government in and of itself is not a bad actor, the NSA’s sole purpose is to facilitate the protection of the citizenry.

Enter the Data Encryption Standard, a cipher for the computer age designed as a joint venture between IBM and the NSA, which put each block of text through sixteen rounds of enciphering. While simple enough on the surface, the technique created billions upon billions of possible permutations, so many that even the most state-of-the-art computers of the time would have trouble cracking it. So what's the problem? Wouldn't it be a good thing that, after so many years, civilians finally had access to perfect privacy? Well, not if it's the height of the Cold War; not if Russian agents could use that very same ultra-secure network to plot attacks or demonstrations to undermine western democracy.
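Those sixteen rounds follow Horst Feistel's round structure: the block is split into halves that are swapped and mixed, each round using a subkey derived from the main key. The toy sketch below shows only that structure; the round function and key schedule here are placeholders invented for illustration and bear no resemblance to real DES's expansions, S-boxes, and permutations. Its nice property is that running the same code with the subkeys reversed decrypts.

```python
# Toy Feistel network sketch (NOT real DES: round function and key
# schedule are illustrative placeholders).
def round_fn(half: int, subkey: int) -> int:
    # Placeholder mixing step; real DES uses expansion, S-boxes, permutation.
    return ((half * 31) ^ subkey) & 0xFFFFFFFF

def feistel_encrypt(block: int, subkeys: list) -> int:
    left, right = block >> 32, block & 0xFFFFFFFF
    for k in subkeys:                    # DES runs 16 such rounds
        left, right = right, left ^ round_fn(right, k)
    return (right << 32) | left          # final swap, as in DES

def feistel_decrypt(block: int, subkeys: list) -> int:
    # Decryption is the identical structure with the subkeys reversed.
    return feistel_encrypt(block, subkeys[::-1])

# An invented 16-entry key schedule, one subkey per round.
subkeys = [(0x0F1E2D3C * (i + 1)) & 0xFFFFFFFF for i in range(16)]
msg = 0x0123456789ABCDEF
ct = feistel_encrypt(msg, subkeys)
assert feistel_decrypt(ct, subkeys) == msg
```

Because each round only XORs one half with a function of the other, every round is reversible regardless of how scrambling the round function is, which is why the same skeleton serves for both directions.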

The NSA, vigilant as ever, took notice of this inherent risk and handicapped the DES, leaving it susceptible to brute-force attack from their machines but relatively impervious to commercially available computers. This way, the NSA could still intercept messages sent over private networks, monitoring their content while still allowing a degree of security from unwanted prying eyes. In this sense, the NSA's decision to handicap the DES was justified, as their reasoning was in line with their cardinal purpose: facilitating the safety and security of the citizenry. In allowing the DES to remain too complicated for commercial computers to crack, the NSA even enhanced civilian privacy without contradicting that purpose. To this end, the NSA was justified in their actions: building in a weakness was not meant to destroy the concept of digital privacy, but rather to better enable their ability to intercept and act on potentially malicious communications. Their decision was ultimately for the greater good.

The NSA is okay … Technically.

The NSA seeks to act in its best interests, so the release of the DES should come as no surprise to anyone. Though technically created by IBM, the NSA was heavily involved in the creation process. At the center of the encryption are the substitution tables, or S-boxes, the part where the NSA had the most involvement. Naturally this created suspicion that the NSA had put a backdoor into the tables with which they could decode every message in seconds. However, the NSA also intended the algorithm to be used for its own classified documents. Motivated by historical examples of supposedly perfectly secure ciphers, the NSA knew that if it put a logical loophole into the algorithm, eventually it would be found. Therefore the only logical idea was to make it so that ONLY the NSA could break the cipher. One thing the NSA had above every genius individual or organization was resources. It therefore made the DES solvable only by brute-force attack, hoping that for the foreseeable future only the NSA would have the technology necessary to conduct such an attack. Though potentially a moral grey area, the NSA did not do anything wrong technically, as a Senate committee that investigated the project found. Making DES a government standard did not force any one business to use it. Interestingly, it seems that the NSA did not learn its lesson from all the backlash it received: with the later Capstone project it created the SHA-1 hash function to use as a standard. Though it has yet to be determined whether the NSA created a backdoor, SHA-1 is no longer considered secure and, just as with the DES, it has since been replaced through a public competition.


The National Security Agency has one main priority: the monitoring and analysis of communications, both domestic and foreign, that pose a threat to the United States of America. The NSA would be unable to do its job if it weren't able to tap into communications that could lead to a legitimate threat to the US. In order to do that job most effectively, and not waste manpower developing new ways to break codes, the NSA made the Data Encryption Standard (DES) weak enough to be broken using means that were already within its grasp. In deliberately weakening the DES, the NSA left businesses and personal messages with a standard that wasn't as strong as it could possibly be, though it was strong enough to keep their secrets relatively private. The senders of messages that used the DES were generally angry that they couldn't have the more secure encryption that had already been created, but the NSA was justified in keeping the security of the DES at a lower level than was possible.

In doing this, the NSA made it more difficult for threats against the US to develop within the US, which is the biggest danger to its security. While foreign attacks on the US are a more likely possibility, it is home-grown attacks that prove the most dangerous, because security within the US is relatively weak compared to the security of getting into the US. Home-grown attacks are also more difficult to detect, because a larger number of people could be in on a plot, and the members of a plot might be more diverse and harder to track. The solution to home-grown attacks would be either to make attackers easier to identify or to make attackers' jobs more difficult by increasing security. Increasing security would, first of all, be a logistical nightmare because of the size of the US, and secondly, it would cause mass protests among a US population that already despises the relatively simple measures of airport security. Because of that, the NSA had to go with the first option and make attackers easier to identify by keeping their communications open to the NSA should they ever become suspicious, while allowing the NSA to focus more time on investigating foreign communications.

Image: “Elderly Armenian Woman Guards Home” by United Nations Photos, Flickr (CC)

Safe Enough?

When Horst Feistel developed the Lucifer system for encrypting information on computers, it allowed so many possible enciphering keys that it was effectively beyond the code-breaking abilities of the National Security Agency (NSA). So when the NSA decided to adopt Feistel's system as the Data Encryption Standard (DES), they wanted to make sure they limited the possible number of keys, so that they could still break the encrypted data by brute force with their supercomputers while civilians could not. They decided to limit the number of keys to roughly 100,000,000,000,000,000. This number of keys would provide privacy and security within the civilian community, but would still allow the NSA to break into messages if they needed to.
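That 100,000,000,000,000,000 figure is the rounded value of 2^56, the size of a 56-bit key space. A quick back-of-the-envelope calculation shows how the same key space can be hopeless for one machine and tractable for another; the two search rates below are invented for illustration, not historical measurements.

```python
# Sanity-check the "roughly 100,000,000,000,000,000 keys" figure:
# a 56-bit key gives exactly 2**56 possibilities.
keys = 2 ** 56
print(f"{keys:,}")                      # 72,057,594,037,927,936

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

# Hypothetical search rates, chosen only to make the contrast vivid:
civilian_rate = 10 ** 6                 # keys/second, modest civilian machine
agency_rate = 10 ** 11                  # keys/second, supercomputer farm

print(keys / civilian_rate / SECONDS_PER_YEAR)   # ≈ 2,285 years
print(keys / agency_rate / (60 * 60 * 24))       # ≈ 8.3 days
```

The gap between "thousands of years" and "about a week" is exactly the wedge the NSA was relying on: a key space sized to sit below their capabilities but above everyone else's.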

I personally believe that the NSA was justified in limiting the number of keys in the Lucifer cipher. I think it is vital for the NSA to be able to read certain messages if they really need to. If they didn't limit the number of keys, every message would be completely private and secure. This might sound great in theory, but it would actually mean relinquishing our security as a nation. Anything could be sent by anyone to anyone, and no one would ever know about it, even if the government were suspicious. So if two known terrorists were communicating, we wouldn't be able to read what they were saying. However, when the keys are limited to a certain number, our messages are completely secure and private within the civilian community, but the government is able to read them if it wants to. I think that this is a reasonable violation of our privacy. Also, the government is not going to do anything with the information it reads if it is innocent, so most people have nothing to worry about. It will only care about things that involve national security. So I think it was justified for the NSA to limit the number of possible keys to a number high enough that correspondence within the civilian community would be secure, but low enough that only the NSA could break into a message if they really needed to.
Image: “Castello di Sermoneta” by Andrea Marutti, Flickr (CC)

The Cost of Safety

Though almost every American instinctively cringes at the mention of government limiting freedoms and invading privacy, I believe that often this invasion of privacy is a necessary evil to ensure safety. By limiting the DES, or Data Encryption Standard, to 56 bits or less for civilian business use, the NSA ensured that they would be able to crack an encryption through brute force if needed. Though this meant that businesses would be less secure, it also meant that the NSA would be able to investigate any dubious behavior by cracking the encryption. This is only a small example of the greater debate of privacy vs security. Unfortunately, it is almost impossible for a government to ensure both privacy and security; one must be greater than the other.

“Privacy” by Alan Cleaver

Though the business encryption standard of 56 bits is less secure than it could be, Singh states that a 56-bit key would be almost impossible for any civilian computer to break by brute force (250). Some might argue that civilian computing power has since increased enough to break 56-bit encryption and that the NSA has left businesses vulnerable, but this is not true. Within the U.S., there is no restriction on the level of cryptography that one can use; the only restrictions lie on exporting cryptography (Johnson 2002). This is because the NSA needs to be able to break encryption from possible terrorists or other groups that might want to harm the U.S. The government has since recognized the weakness of DES and has encouraged a new encryption system, the Advanced Encryption Standard (AES), that can use keys of up to 256 bits instead of 56 (Institute 2001). By increasing the standard encryption level, the NSA has shown that they are working to promote security for civilians, not intentionally limiting security to put people in danger.
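The jump from 56 to 256 bits is easy to understate: each additional key bit doubles the search space, so the difference is a factor of 2^200, not a factor of 256/56. A two-line check makes the point:

```python
# Key length grows the brute-force search space exponentially.
des_keys = 2 ** 56      # DES key space
aes_keys = 2 ** 256     # AES key space (the standard that replaced DES)

ratio = aes_keys // des_keys
print(ratio == 2 ** 200)    # True: 200 extra bits = 2**200 times more keys
```

Put differently, a machine that could exhaust the DES key space in one second would still need unimaginably longer than the age of the universe for a 256-bit key.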

A small amount of limiting of security, though it may put companies at risk, is a small price to pay to allow the NSA to, if necessary, break the encryption of data that would help protect the U.S. from a disaster that would cost lives. Though a break in security at a large company might cost them millions of dollars, the cost of lives lost from not being able to decrypt data is priceless.

Johnson, M. (2002, October 14). Where to Get PGP. Retrieved November 5, 2012.

National Institute of Standards and Technology. (2001, November 26). Federal Information Processing Standards Publication 197. Retrieved November 5, 2012.

The Invisible Hand of the NSA

In the 1970s, the Internet was still a new technology and cryptography was not even considered a legitimate field of mathematics. Cryptography was thought of as a pen-and-paper tactic for wartime security, and the general public was not equipped to apply any sort of cryptography to computer technology. In the United States, cryptography was researched and discussed solely by the National Security Agency (NSA).

In this regard, the NSA wielded a considerable amount of knowledge and power. The National Bureau of Standards issued a request to the public for an encryption algorithm that would be made available as a free encryption standard. IBM's labs answered this request by producing the Data Encryption Standard (DES). Of course, the DES needed to be reviewed by an outside organization, and the NSA was uniquely qualified and highly equipped to respond. When presented with the DES, the NSA decided to abuse its power, altering the algorithm slightly and shrinking the key size to half its original length, thus making the algorithm more susceptible to brute-force decryption.

People were outraged by the NSA's ability to have an “invisible hand” in public security systems. The strength of any given cipher is directly related to the length of its key and the quality of its algorithm, so by shrinking the key length the NSA intentionally weakened the DES. The NSA did the public a huge disservice by not presenting the most secure algorithm available. They clearly overstepped their boundaries by tampering with the strength of the algorithm when their task was to analyze and improve it. Instead of improving it, the NSA selfishly left the algorithm weak enough that they could break it.

The NSA’s actions were unjustified and did not have the public’s best interest in mind. The NSA purposefully limited technological advancement and allowed the public to send confidential information utilizing an algorithm lacking optimal security.

Image: “National Security Agency Seal” by DonkeyHotey, Flickr (CC)

Sufficiently Safe

Although it is fair to say that businesses were forced to rely on security that was less than optimal, the security they were using was more than sufficient. The Data Encryption Standard (DES) has around 100,000,000,000,000,000 possible keys. This is referred to as 56-bit security because, written in binary, the number of keys has 56 digits. Although there is a cap on the number of keys that can be used, the number is large enough that no civilian would have a computer powerful enough to determine which key was used. The NSA, which has the most powerful computing abilities in the world, is able to determine which key was used.

I believe that the NSA is justified in doing this because I believe that the NSA has the country's interests in mind. The DES is secure enough to prevent anyone with malicious intentions from deciphering a message; therefore it is effective. The NSA should have the ability to decipher something if it is a matter of national security.

It is comforting to know that in the most dire circumstances, high ranked officials in our nation’s government, who vow to protect all of us, have the ability and access to great resources to do whatever it takes to do so.

Limiting Lucifer

I believe that the NSA was justified in limiting the strength of the Data Encryption Standard (DES) so that they would be able to decipher any message that was sent using Lucifer. Lucifer was a complicated encryption system that relied on a numerical key. The number of possible keys and the length of time it takes to crack the ciphertext are positively correlated. Therefore, when the NSA limited the number of keys to 100,000,000,000,000,000, they made it so that “…no civilian organization had a computer powerful enough to check every possible key within a reasonable amount of time” (250). It only makes sense that the leading security agency of a country should be able to decipher any message sent or received within its territory. This is for the good of the country and provides protection from possible attacks or illegal operations.

I think that as long as a secure standard is in use, there should be someone overseeing it, even though I am not at all in favor of a “Big Brother” type of government. Some may argue that the limit the NSA implemented also limits the advancements that can happen in cryptography, but the present advances in cryptography are all the proof needed against this claim.

Simon Singh, The Code Book
