Cryptography's Role in Securing the Information Society (1996)
Computer Science and Telecommunications Board (CSTB)
E A Brief History of Cryptography Policy

In the United States, cryptography policy and information about cryptography were largely the province of the National Security Agency (NSA) until the 1970s. Although a small market existed for unclassified commercial cryptography, the most advanced cryptographic techniques were classified and were limited largely to military, diplomatic, and intelligence use.[1]
E.1 EXPORT CONTROLS

One policy mechanism for controlling the diffusion of cryptography is control of exports. The earliest U.S. use of export controls was in the Trading with the Enemy Act, passed in 1917 during World War I, which empowered the President to restrict economic activities with enemy countries.[2] U.S. peacetime export control activities grew to a significant degree following World War II. The Export Control Act of 1949 gave the executive branch broad authority to determine what products or technical data are subject to export licensing, to run the licensing system, and to penalize violations. It also largely exempted the rule-making process, including determination of what items should appear on the controlled list, from public comment and judicial review.[3]
[1] Office of Technology Assessment, Information Security and Privacy in Network Environments, OTA-TCT-606, U.S. Government Printing Office, Washington, D.C., 1994, p. 115.
[2] Mitchell B. Wallerstein and William B. Snyder, Jr., "The Evolution of U.S. Export Control Policy: 1949-1989," in National Research Council, Finding Common Ground, National Academy Press, Washington, D.C., 1991, p. 308.
The Export Administration Act of 1969 changed the name of the legislation and introduced the first attempt by Congress to balance control of technology for national security reasons with the goal of expanding U.S. exports. For example, Congress recommended for the first time that foreign availability of controlled items be taken into account in the licensing process. Under the Export Administration Act, the Department of Commerce is responsible for administering the Export Administration Regulations (EAR), including maintaining the Commerce Control List.
Cryptography is covered on this list. However, cryptographic products and data are also subject to licensing on the U.S. Munitions List, along with other items that are "inherently military in character." The U.S. Munitions List is administered by the Department of State under the Arms Export Control Act, which provides the basis for the International Traffic in Arms Regulations (ITAR). There is significant overlap between the ITAR and the EAR with respect to cryptography. At present, however, most software and hardware for cryptographic systems (such as those with key lengths of more than 40 bits) remain on the Munitions List unless the State Department grants jurisdiction to the Commerce Department. As discussed in Chapter 4, the National Security Agency plays a strong advisory role to the Departments of State and Commerce in deciding issues of licensing cryptographic products for export.
E.2 ACADEMIC RESEARCH AND THE CONTROL OF INFORMATION ABOUT CRYPTOGRAPHY

By the 1970s, interest in cryptography was growing not only in commercial but also in academic circles. This growth created conflicts with government controls on the dissemination of information about cryptography, including at open scientific meetings. The legal basis for government control of scientific information exists in several sources. One of the first pieces of legislation addressing cryptography was a law, passed in the 1920s and still in effect, that prohibits publication of information about diplomatic codes and ciphers. This was a prior restraint on free speech that was considered justified on national security grounds.[4]
The Atomic Energy Act of 1946 created a category of information known as Restricted Data, which encompassed data on the manufacture or use of atomic weapons or special nuclear material. Restricted Data is essentially "born classified," subject to secrecy from its creation even if created by a private person, such as an academic scientist not involved in any federal research program. Applying these rules, a court issued a preliminary injunction against The Progressive's publishing an article on the working of hydrogen bombs, even though it was based on information from publicly available sources.[5] (The injunction was later lifted when a newspaper published similar information.)
[3] Wallerstein and Snyder, "The Evolution of U.S. Export Control Policy," 1991, p. 310.
[4] James Bamford, The Puzzle Palace: A Report on America's Most Secret Agency, Houghton Mifflin, Boston, 1982.
The EAR and the ITAR prohibit not only the export of listed items without a license but also the distribution of technical data about items that are subject to export controls. The technical data provisions have been used to restrict dissemination of academic research, for example, at open scientific meetings within the United States, because the accessibility of such data to foreign persons implies the possibility of "export" of the data.[6]
Prepublication review clauses in contracts and grants for government-sponsored university research, restrictions on contact between cryptographers and foreign visitors, and NSA review of material to be presented at open meetings have all provoked conflict between the academic and government cryptography communities. One result of such conflicts (not only in the area of cryptography) was a National Academy of Sciences review of scientific communication and national security, which concluded that policies of "security through secrecy" would chill scientific activity and ultimately weaken U.S. technological capabilities to the point of adversely affecting U.S. security.[7] (The report, published in 1982, recommended limits on the use of contract clauses to control scientific information.)
In the late 1970s, academic research in cryptography achieved several major advances, prompting responses from NSA. For example, an NSA employee unofficially informed the Institute of Electrical and Electronics Engineers that a conference presentation by Stanford University researchers (including Martin Hellman) of work related to public-key cryptography could violate export control laws. After consultation with university counsel, the presentation went forward.[8] NSA also imposed a secrecy order on a patent application filed by University of Wisconsin professor George Davida; the order was later lifted. However, at NSA's request, the American Council on Education formed a study group that recommended a 2-year experiment in which cryptography research would be submitted to NSA for review, on a voluntary basis, before publication. This procedure began in 1980 and remains in effect. Over this time, NSA has made only a few requests for changes, and there appear to have been no long-term chilling effects on academic research.[9]
[5] Office of Technology Assessment (OTA), Defending Secrets, Sharing Data, U.S. Government Printing Office, Washington, D.C., 1987, pp. 141-142.
[6] OTA, Defending Secrets, 1987, p. 142.
[7] National Academy of Sciences, Scientific Communication and National Security: A Report, National Academy Press, Washington, D.C., 1982.
[8] Susan Landau et al., Codes, Keys, and Conflicts: Issues in U.S. Crypto Policy, Association for Computing Machinery Inc., New York, June 1994, pp. 37-38; and Martin Hellman, communication with Computer Science and Telecommunications Board staff, December 1995.
Funding of academic cryptography has also been influenced by secrecy concerns. In 1980, Leonard Adleman (one of the RSA algorithm's authors) submitted a grant proposal for research, including work on cryptography, to the National Science Foundation (NSF). NSA offered to assume all responsibility for funding unclassified cryptographic research, in place of NSF; this would give NSA the opportunity to subject all research proposals to secrecy review. Interpretations vary about the extent to which this proposal reflected a power struggle between NSA and NSF; ultimately, a decision at the White House level determined that both agencies would continue to fund cryptographic research.[10]
E.3 COMMERCIAL CRYPTOGRAPHY

Growing interest and technical capabilities in cryptography within commercial communities brought cryptography policy into public debate in the 1970s.[11] The spark that began much of this debate was the National Bureau of Standards (NBS) 1975 proposal for a new cryptographic technology standard, required for government use (and recommended for commercial use) outside classified (military and intelligence) applications. This was the Data Encryption Standard (DES).
NBS proposed the DES under its authority, in the Brooks Act of 1965, to recommend uniform data processing standards for federal government purchasing.[12] The proposed DES was based on an IBM-developed technology. NSA's role in recommending changes to IBM's original algorithm raised questions of whether the agency had weakened the standard. The reduction in key length from 128 bits in IBM's original version to 56 bits clearly weakened the algorithm considerably, all else being equal.[13] Public debate also addressed whether the revised algorithm contained a trapdoor or other vulnerabilities. A review led by Representative Jack Brooks, however, concluded that changes had been made freely by IBM. Apart from the key length reduction, some changes that NSA suggested appear to have strengthened the algorithm against a form of attack, differential cryptanalysis, that was not widely known at the time.[14]
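To put that reduction in perspective, consider the arithmetic of exhaustive key search (a back-of-the-envelope illustration added here, not a figure from the Brooks review):

\[
\frac{2^{128}}{2^{56}} = 2^{72} \approx 4.7 \times 10^{21}
\]

That is, a 56-bit key presents an exhaustive attacker with about 4.7 sextillion times fewer candidate keys than a 128-bit key. At a hypothetical rate of 10^9 trial keys per second, all 2^56 (about 7.2 x 10^16) keys could be tried in roughly 2.3 years, whereas searching all 2^128 keys at the same rate would take on the order of 10^22 years.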
[9] Landau et al., Codes, Keys, and Conflicts, 1994, p. 38.
[10] Landau et al., Codes, Keys, and Conflicts, 1994, p. 38; and OTA, Defending Secrets, 1987, pp. 144-145.
[11] Landau et al., Codes, Keys, and Conflicts, 1994, pp. 37-38.
[12] OTA, Information Security and Privacy in Network Environments, 1994, pp. 134-136.
[13] Horst Feistel, "Cryptography and Computer Privacy," Scientific American, Volume 228(5), May 1973, pp. 15-23.
In 1977, the DES was issued as a Federal Information Processing Standard (FIPS). Its promulgation as a stable, certified technology stimulated its widespread use in commercial applications. It has been reviewed every 5 years for continued suitability in the face of advances in computing power and techniques available to attackers. NSA subsequently has played an important role in testing and certifying products for conformity to the DES. By 1986, NSA had certified more than 400 voice, data, and file encryption products using the DES.
In the mid-1980s, however, NSA announced it would stop endorsing DES products after 1988, instead focusing on a set of classified, hardware-based standards for modular products that were incompatible with the DES. (This approach is reflected, for example, in the Fortezza card-based systems that NSA is now promoting.) These plans raised immediate concern about the cost of switching over to new equipment in industries such as banking that relied heavily on products incorporating the DES.
This controversy was one factor that motivated passage of the 1987 Computer Security Act, which placed responsibility for standards development and product evaluation for nonclassified applications in the National Institute of Standards and Technology (NIST), the renamed NBS. As an agency of the Department of Commerce, NIST has a mandate to support U.S. commercial interests. In cryptography policy making, therefore, NIST could be expected to take commercial factors into account more wholeheartedly than NSA would. NIST recertified the DES in 1988 and became responsible for assessing product conformity to the standard. (The DES was most recently recertified in 1993 and, according to NIST, may or may not be recertified in 1998.[15]) NIST also developed other cryptographic FIPSs, including standards for algorithms (such as the Digital Signature Standard) and for implementation of cryptographic systems.
Another factor leading to the Computer Security Act was the need to resolve conflicts in agency responsibilities among the Brooks Act, various Office of Management and Budget directives, and the 1984 National Security Decision Directive 145 (NSDD-145), which created a new process for setting standards for federal systems to protect "sensitive but not classified" national security information. NSDD-145 also made the director of NSA responsible for evaluating vulnerabilities and reviewing and approving security standards and systems for government information and telecommunication systems.[16]
[14] OTA, Information Security and Privacy in Network Environments, 1994, p. 123.
[15] The announcement of the most recent recertification of the DES states, "At the next review (1998), the algorithm specified in this standard will be over twenty years old. NIST will consider alternatives which offer a higher level of security. One of these alternatives may be proposed as a replacement standard at the 1998 review." See NIST, Announcing the Data Encryption Standard, FIPS Publication 46-2, December 30, 1993; available on-line at http://csrc.ncsl.nist.gov/fips.
NIST and NSA signed a memorandum of understanding (MOU) in 1989 delineating the agencies' roles under the Computer Security Act with respect to cryptography and other issues. Under the MOU, NIST is responsible for, among other activities, developing standards and procedures for the protection of sensitive (but not classified) information in federal computer systems, drawing on the computer security guidelines of NSA where appropriate, and for coordinating with NSA and other agencies to ensure that these standards are consistent with those for the protection of classified information. NSA provides NIST and other agencies with technical assistance related to cryptographic algorithms and techniques and endorses products for application to secure systems. The two agencies also agreed to establish a technical working group to review issues of mutual interest related to protecting unclassified information.[17]
E.4 RECENT DEVELOPMENTS

NSA played a strong role in the development of the Escrowed Encryption Standard (EES), through the process outlined in the MOU.[18] The standard was in part an effort to forestall a mass market for telephone encryption devices that would obstruct authorized wiretaps. In 1992, AT&T announced plans to produce the first encrypted telephone backed by the marketing strength of a major corporation, the Model 3600 Telephone Security Device, which used the DES for encryption.[19] On April 16, 1993, the White House announced an effort to develop a new standard for encryption of digitized voice communications that would allow law enforcement access by recovering an "escrowed" decryption key. The standard would be based on a classified algorithm made available by NSA, called Skipjack, implemented in a hardware device, the Clipper chip. (See Chapter 5 for technical details of Clipper, the Skipjack algorithm, and key escrow.)
[16] OTA, Information Security and Privacy in Network Environments, 1994, p. 143.
[17] Memorandum of Understanding Between the Director of the National Institute of Standards and Technology and the Director of the National Security Agency Concerning the Implementation of Public Law 100-235; reprinted in OTA, Information Security and Privacy in Network Environments, 1994, p. 197; reprinted also in Appendix N.
[18] It has been a matter of debate whether NSA's influence over NIST in the development of the EES was so great as to exceed NSA's advisory role authorized in the Computer Security Act. OTA concluded that "interagency discussions and negotiations by agency staffs under the MOU can result in delay, modification, or abandonment of proposed NIST standards activities, without notice or the benefit of oversight that is required by law." OTA also noted that NIST and NSA officials disagreed with this conclusion. See OTA, Information Security and Privacy in Network Environments, 1994, p. 168.
[19] Landau et al., Codes, Keys, and Conflicts, 1994, p. 45.
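The escrow idea underlying the EES can be sketched in a few lines of code. The following Python fragment is a simplified, hypothetical illustration of splitting a device's unit key between two escrow agents by exclusive-or, so that neither share alone reveals anything about the key; it is not the actual Clipper/Skipjack protocol, whose details are given in Chapter 5.

import secrets

KEY_BYTES = 10  # Skipjack uses an 80-bit key; 80 bits = 10 bytes

def split_key(unit_key: bytes) -> tuple[bytes, bytes]:
    # One share is uniformly random; the other is the XOR of the key
    # with that share. Each share alone is statistically independent
    # of the key; XORing both shares together recovers it.
    share_a = secrets.token_bytes(len(unit_key))
    share_b = bytes(k ^ a for k, a in zip(unit_key, share_a))
    return share_a, share_b

def recover_key(share_a: bytes, share_b: bytes) -> bytes:
    # Recombination, e.g., when both escrow agents release their
    # shares under lawful authorization.
    return bytes(a ^ b for a, b in zip(share_a, share_b))

unit_key = secrets.token_bytes(KEY_BYTES)
a, b = split_key(unit_key)  # deposit a and b with separate escrow agents
assert recover_key(a, b) == unit_key

The point of the two-share design is that compromise of a single escrow agent exposes nothing; both agents must cooperate before any key can be recovered.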
In February 1994, following a formal comment period in which virtually all written comments received by NIST were opposed to the proposed standard, NIST announced the adoption of FIPS 185, the EES.[20] As a voluntary standard, the EES is available for federal agencies (and private firms that so desire) to cite in procurement specifications for encrypted voice products, in lieu of the DES. AT&T incorporated Clipper into its encrypted voice product, now called the Surity Telephone Device 3600. A second initiative led to standards for data encryption devices using a smart-card design called Fortezza. The Fortezza card includes a Capstone chip, which uses Skipjack for confidentiality and several other algorithms for integrity and key exchange. In 1995, Fortezza was specified in a large procurement (750,000 units) of data encryption products for the Defense Messaging System.[21]
Recent federal initiatives have sought to promote broader use of escrowed encryption technologies. On September 6-7, 1995, NIST sponsored a workshop to discuss draft criteria under which software products with escrow features for authorized third-party access to keys could receive expedited export licensing review on the Commerce Control List, as opposed to the U.S. Munitions List. One criterion allows export of escrowed key systems with key lengths up to 64 bits. On September 15, 1995, another NIST workshop sought comments from private industry on the development of a new FIPS that would allow for both hardware and software implementations of escrowed key cryptosystems. In both of these areas, additional workshops and discussions are expected to continue.[22]
[20] Landau et al., Codes, Keys, and Conflicts, 1994, p. 48; NIST, Escrowed Encryption Standard, FIPS Publication 185, February 9, 1994; available from NIST via the Internet at http://csrc.ncsl.nist.gov/fips.
[21] Kevin Power, Government Computer News, July 31, 1995, p. 1.
[22] NIST, "Media Advisory: U.S. Government Seeks Public Comment on Draft Export Criteria for Key Escrow Encryption," November 6, 1995, available on-line at http://csrc.ncsl.nist.gov/keyescrow; and committee and staff attendance at workshops.