1997 ASIS Mid-Year Meeting
by Steve Hardin
Janlori Goldman, deputy director of the Washington-based Center for Democracy and Technology, discussed some of the issues surrounding privacy in an electronic world at the ASIS Mid-Year Meeting in Scottsdale, Arizona. Goldman, who is also a visiting scholar at Georgetown University Law Center, talked about the challenges presented by the Web, medical information, encryption, regulation and other issues during her keynote address.
Goldman commented that many civil libertarians once saw computers and communications media primarily as a threat to privacy. It is easy to see the Orwellian aspects of the Internet and the Web. In fact, she said, it often is the case that these developments threaten privacy. But the other side of the coin -- that technology must be developed for the good of the individual and ultimately the good of the community and society -- was not apparent for many years. We should not be pitting the individual against new technologies, she said. Rather, we should look at ways we can harness the new technologies to more effectively communicate and access information.
The traditional approach taken by regulatory agencies needs to expand as well to address issues posed by interactive media, she said. Regulatory agencies follow their traditional models; they hold hearings, get comments and issue rulings. Many representatives of Internet-based industries are saying it is better not to regulate this nascent environment; however, there is a role for national policy makers. But novel, informed approaches are required.
Goldman stated there is no way to talk about the "Internet industry" as a whole. Self-regulation, which she said works best for trade associations, has been offered as a way of staving off legislative solutions. People fear that Congress will get regulation wrong. But self-regulation has pitfalls, too. It is not legally enforceable and tends only to bind the "good guys." People unconcerned about doing the right thing are unaffected by self-regulation.
Change has been very dramatic in the Internet environment, Goldman noted. Two years ago, there were very few people doing anything on the Web. But now, ads, movies and many other commercial ventures have URLs. The assumption is that most people are going to use the Web to get more information. She related that when she was laid up with a recent foot injury, she did all her grocery shopping on the Web. Electronic commerce will not work if people's privacy cannot be guaranteed. This is true even though people will often negotiate away their privacy in exchange for a good bargain. The consequence, both for society and for commerce, is that people will change their behaviors to adjust to an environment that grabs up and reuses information about them without their knowledge or consent.
As far as legislation is concerned, things work best when there is an intersection of interests. For instance, when the Wiretap Act of 1986 was passed, there was a powerful coalition of civil liberties, business and privacy groups. When Judge Bork's video rental records were made public in 1988, the American Civil Liberties Union worked with the Direct Marketing Association and others to get a video privacy bill passed within six months. Again, there was a strong convergence of interests. However, the American Library Association was unable to get library records included in the bill because of the FBI's vigorous opposition.
Goldman said that the inclusions and exclusions in existing privacy laws create a "goofy patchwork" of privacy protection. There is no cohesive body of law for privacy. On the Internet, some real concerns about privacy have already arisen. LEXIS/NEXIS put together a lookup service that allowed subscribers to get phone numbers, mothers' maiden names and other personal information. Many people, such as those who pay to keep their phone numbers out of the telephone directory, were angered. LEXIS/NEXIS took a lot of heat and bad press for that, she said. And consider the Social Security Administration, which recently got in trouble over its online system that allowed people to obtain their Social Security information. No actual breach had been reported, but the perception of insecurity was there.
"Isn't there a better way to do this?" Goldman asked. She suggested a feature that would allow users to state what information they were willing to give to a Web site. Users could refuse to give out any information about their identities if they chose -- although, of course, the Web sites could also refuse to let them log-on.
Then there is the issue of filtering Web sites. Software exists, of course, to let you control which sites people who use your system can see. Some people use the PICS -- Platform for Internet Content Selection -- rating system. You can choose various filters, such as those that filter out violence but not sex. The Federal Trade Commission is very interested in this approach, she said. It fits in with the concept of self-regulation and it is enforceable. The Internet Privacy Working Group (IPWG), launched in November, has been looking at privacy-enhancing technologies. A lot of progress has been made, she said.
Goldman agrees with those who say privacy is essential for the development of the self. If people are afraid that information will be shared, then they will be afraid to fully participate in society. People will be far less likely to take risks. People with diseases may not get treated because they don't want their health information to be made available to others.
We need to fight terrorism and other crimes, she acknowledged. But do we need to do it by throwing out everyone's privacy? She noted that we don't approach this issue by considering the harm done to society when privacy is not protected, but there is a real social cost.
Janlori Goldman can be reached by e-mail at email@example.com or by phone at Georgetown University Law Center, 202/662-9432, or at the Center for Democracy and Technology, 202/637-9800.
Someone may be watching you online!
Check out CDT's Privacy Demonstration www.cdt.org/privacy/
At the second plenary session of the 1997 ASIS Mid-Year Meeting, Herb Lin, a senior staff officer with the Computer Science and Telecommunications Board (CSTB) of the National Research Council, discussed threats to information security and their possible solutions. He outlined the results of two recent board studies: Cryptography's Role in Securing the Information Society (CRISIS), published in 1996, and For the Record: Protecting Electronic Health Information, published in 1997.
Lin presented several of his points in the form of threats and responses. First, he looked at Category 1 threats to information security, which he identified as unambiguous threats, including eavesdroppers on fax lines or corrupt or careless medical clerks. Hackers may sniff for passwords or medical records on the Internet. Foreign governments can intercept trade secrets. All of these are recognized as problems.
Then there are Category 2 information security threats. They may include insurance companies or employers who obtain medical records and use them to deny coverage or employment. Consumers may see this as a problem, but insurance companies can say they are making prudent financial decisions. Local police may conduct electronic surveillance of civil rights groups, but perhaps some of those groups are a cover for bombers. How do you know which is which? All of these Category 2 threats demonstrate conflicting objectives.
The CRISIS report notes that information assets are at risk. The threats, Lin said, are largely covert. If you see missiles in a hostile country, you know there is a threat. But if the hostile country assembles a team of hackers, you can't tell. You can't even tell you have been under attack if they do it correctly. When you have to reboot Windows 95, have you done something wrong or do you have a virus or have you been hacked? It doesn't take much money to develop an information threat. All you need is a PC, a modem and a graduate student. The threat is continuous in intensity from low (hacking one computer system) to high (hacking a national network). In addition, there is little necessary correlation between the amount spent to create the threat and the potential damage caused by it.
Electronic threats are most significant, Lin told his audience. The National Counterintelligence Center says that specialized technical operations account for the largest portion of economic and industrial information lost by U.S. corporations. CRISIS concluded that the information society is vulnerable to information threats. Cryptography is a good tool for defending against those threats, but, Lin noted, cryptography is available to both the good guys and the bad guys.
Health care, too, is making greater use of information technology. There are integrated delivery systems and linkages between organizations. In addition, applications of information technology are expanding to include electronic medical records, corporate intranets and Internet applications.
Lin said that use of information technology raises new questions about privacy and security that must be answered if patients are to share sensitive health information with providers. Solutions must be found that protect patient privacy while ensuring that providers have legitimate access to information for purposes of care. Electronic medical information is coming, Lin said; resisting it is not an option.
The For the Record report identified Category 1 threats as inappropriate releases of information from individual organizations (corrupt clerks, nosy neighbors) and unauthorized users breaking into systems to retrieve or alter information (hackers). The same report labeled Category 2 threats as systemic flows of information among organizations in health care and related industries for socially sanctioned reasons. For example, health insurance companies want to get information on you to see if you actually had the operation for which they are paying. Life insurance companies want health status information to make sound actuarial decisions. Are these threats? What about epidemiological researchers doing longitudinal studies? Health care organizations care about outcomes management. They want to know what works and what doesn't. Pharmacy benefit managers check for drug interactions. These situations all represent risks to privacy, but potential benefits, such as lower cost and more effective care, must be considered.
So, Lin asked, how do you deal with these threats? For Category 1, he said, the response is unambiguous. You can use technology and invoke policy and procedures to promote the use of technology to curb the threat. For Category 2 -- ambiguous threats -- you must define the threat with dialogue. There is no such thing as a technological fix for this threat, Lin said.
The Role of Cryptography in Information Security
Lin next turned his attention to information security and cryptography. He said that information security has four primary functions: confidentiality (typically achieved through encryption), which ensures that information is meaningful only to the proper parties; authentication, which verifies that the asserted sender/author/recipient of information is the true sender/author/recipient; integrity, which is a way of checking to make sure information has not been secretly altered; and non-repudiation, which is actually a combination of authentication and integrity to make sure the transaction cannot be denied in the future. He treated the audience to his 10 Minute Primer on Cryptography.
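Two of these functions, integrity and authentication, can be illustrated with a short modern sketch (not part of Lin's talk, and using the SHA-256 algorithm, which postdates it): a plain hash detects alteration, while a keyed hash (HMAC) additionally proves the message came from a holder of the shared key.

```python
import hashlib
import hmac

message = b"Patient record: blood type O+"
secret_key = b"shared-secret"  # hypothetical key known to both parties

# Integrity: the digest changes if even one byte of the message changes.
digest = hashlib.sha256(message).hexdigest()

# Authentication + integrity: only holders of the shared key can
# produce (or verify) this tag.
tag = hmac.new(secret_key, message, hashlib.sha256).hexdigest()

# The recipient recomputes the tag and compares in constant time.
expected = hmac.new(secret_key, message, hashlib.sha256).hexdigest()
assert hmac.compare_digest(tag, expected)

# Tampering is detected: an altered record yields a different digest.
altered = hashlib.sha256(b"Patient record: blood type A+").hexdigest()
assert altered != digest
```

Non-repudiation, as Lin noted, layers on top of these: a digital signature binds the tag to one party so the transaction cannot later be denied.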
Lin stated that responses to Category 1 threats can include technology such as encryption, firewalls and locks, but technology is useless if it is not used! Policy and procedures are needed to encourage the proper use of technology. People implement, fix and upgrade technology. Good policy and procedures are at least as important as good technology, Lin asserted.
U.S. Government Policy on Cryptography
The current U.S. national policy on cryptography is based on two pillars, Lin said. The first includes export controls to deny cryptography to other countries (for national security). The second pillar involves promotion of key recovery encryption, in which a copy of the encryption key is stored in a place to which the government can gain authorized access.
CRISIS concludes that current national policy discourages the use of cryptography. The report says that export controls were overly restrictive, and confusion existed in the marketplace about what was permitted. CRISIS said government policy should encourage the use of cryptography. The benefits of the widespread use of cryptography include better protection for corporate information, higher security, increased privacy for U.S. citizens and continued U.S. leadership in information technology. The costs include making law enforcement surveillance and security intelligence more difficult.
An important role for law enforcement is crime prevention. Cryptography provides a good tool for preventing certain types of crime. So, according to CRISIS, the traditional view that law enforcement and cryptography are always opposing interests is not correct. CRISIS recommends there be no ban on the sale or use of any kind of cryptography domestically. The authors of CRISIS believe the government's current promotion of key recovery is overly aggressive. It is appropriate to relax, but not eliminate, export controls. In addition, the authors want the government to promote non-confidential applications and better security.
Security for Medical Records
The For the Record report, which included information on medical records, found a variety of practices that can improve protection and be implemented with reasonable cost and effort. Technical mechanisms must be accompanied by organizational mechanisms for developing access and release policies, training workers and penalizing violations of policy. One of the technical practices suggested by For the Record is individual authentication for each person accessing the system. (One site the team visited had all its doctors using the same password and ID.) Other technical practices include access controls, audit trails, encryption of external communications, physical security (including locks on doors), disaster recovery (files should be backed up, and the backups should be encrypted), protection of remote access points, software discipline (making sure no viruses are introduced) and system assessment.
For the Record also discussed organizational practices. The report states security and confidentiality policies should be developed before a disaster. Too many hospitals write their policies on the fly based on newspaper headlines. Security and confidentiality committees representing various stakeholders are needed. Information security officers should be hired, too. Education and training programs are also a must. There should be sanctions for violators; you have to be willing to fire doctors as well as clerks for security violations. Authorization forms should be improved. Patients should have access to audit logs to see who has had access to their records.
As far as the Category 2 threats -- the more ambiguous kind -- are concerned, For the Record says the only solution is to develop a consensus on tradeoffs; there are no technical solutions. Planners need to consider the value of privacy vis-à-vis other interests regarding medical information. There should be fair information practices for health organizations. Consumer and practitioner awareness should be promoted. The report went on to say that any method used to identify patients or link patient records should be accompanied by a policy framework that defines violations, specifies sanctions and facilitates the identification of parties who link records. It should also allow unidirectional linking of information, making it easy to go from identity to the linking of a record, but difficult to determine identity from records or to identify the scheme itself. However, other issues such as cost and speed are at stake, too.
Threats Not Static
CRISIS says that Category 2 responses should include consideration of the tradeoffs between law enforcement and national security benefits and costs (better crime prevention versus better protection) as well as the tradeoffs between establishing a key repository now versus later or never. Issues include the loss of confidentiality versus the loss of access, as well as harm to the market versus benefits to law enforcement and national security.
CRISIS notes that threats are not static. The report recommends periodically increasing the key sizes approved for easy export. For the Record also reminds us that threats are not static; future technologies will always be a consideration. For every move you make, there is a counter move. It will never end. Data should be collected on threats, there should be an ongoing process for evolving standards, and testbeds for experimentation should be funded. Information on Cryptography's Role in Securing the Information Society and For the Record: Protecting Electronic Health Information is available on the CSTB Web page at http://www2.nas.edu/cstbweb. The reports may be ordered by calling CSTB at 800/624-6242.
The original, unencrypted message is called plain text. The encrypted message is cipher text. Cipher text depends on the original message and the key. Decryption is going the other way, assuming you know the encryption key. If you do not know the key, there are only two ways of attacking the problem. First, you can analyze the cipher text, or second, you can do a "brute force" search of the key space.
Analysis of cipher text is done using structural regularities. For example, English has known letter frequencies: roughly 13% of all letters used are e. This approach is easier if you have lots of cipher text and known plain text.
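Frequency analysis can be demonstrated with a small sketch (not from Lin's primer; the cipher text below is an invented example). A Caesar cipher shifts every letter by a fixed amount, but it does not hide letter frequencies, so assuming the most common cipher letter stands for e recovers the shift:

```python
from collections import Counter

# Hypothetical cipher text: an English sentence with every letter
# shifted forward by 3 (a Caesar cipher). Frequencies survive the shift.
cipher = "wkh hqhpb uhwuhdwv dw gdzq vhqg uhlqirufhphqwv wr wkh hdvwhuq ghihqvhv"

counts = Counter(c for c in cipher if c.isalpha())
most_common = counts.most_common(1)[0][0]  # likely the image of 'e'

# Guess the shift by assuming the most frequent letter stands for 'e'.
shift = (ord(most_common) - ord("e")) % 26
plain = "".join(
    chr((ord(c) - ord("a") - shift) % 26 + ord("a")) if c.isalpha() else c
    for c in cipher
)
print(plain)  # the enemy retreats at dawn send reinforcements to the eastern defenses
```

Real ciphers are designed precisely so that no such regularities leak into the cipher text, which is the point Lin makes below about a good algorithm.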
The brute force method tries to decrypt the cipher text with each possible key; you are done when something sensible appears. If the key has 10 bits, you may need to try as many as 2^10, or 1,024, keys before you find the correct one. Each additional bit doubles the number of keys that must be tested.
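A brute-force search can be sketched with a toy cipher (invented for illustration, and far weaker than anything real): XOR every byte of the message with a single 8-bit key, giving a key space of only 2^8 = 256, small enough to exhaust instantly. "Sensible" is judged here by whether the result decodes to lowercase English.

```python
def xor_encrypt(message: bytes, key: int) -> bytes:
    """Toy cipher for illustration only: XOR every byte with one 8-bit key."""
    return bytes(b ^ key for b in message)

cipher_text = xor_encrypt(b"attack at dawn", key=0x5a)
allowed = set(b"abcdefghijklmnopqrstuvwxyz ")

# Brute force: try all 256 possible 8-bit keys and stop when
# something "sensible" (lowercase English text) appears.
for key in range(2 ** 8):
    candidate = xor_encrypt(cipher_text, key)
    if all(b in allowed for b in candidate):
        break

print(key, candidate)  # 90 b'attack at dawn'
```

With a 10-bit key the loop would run up to 1,024 times, and each added bit doubles the worst-case work, which is exactly why key length matters.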
A good encryption scheme has a good algorithm with nothing too explicit in cipher text to aid structural analysis and a long enough key to preclude practical brute-force searches. It is possible to choose a key so long that decrypting by brute force would be effectively impossible. For example, a 1000-bit key would take more than the age of the universe to search even if every atom on Earth were a computer.
Editor's note: Readers may want to review the recent brute-force decryption of the DES or Data Encryption Standard, a 56-bit key scheme, by a group of programmers and researchers using computers networked through the Internet. The answer was found after approximately 25% of the 72 quadrillion possibilities had been tested. See, for instance, "Group Cracks Financial-Data Encryption Code," Wall Street Journal, June 19, 1997, p. A3, col. 1.
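The figures in the editor's note can be checked with a line or two of arithmetic (a modern convenience, not part of the original article):

```python
# DES uses a 56-bit key, so its key space is 2**56 keys --
# the "72 quadrillion possibilities" in the editor's note.
key_space = 2 ** 56
print(key_space)  # 72057594037927936, i.e., about 72 quadrillion

# The distributed search found the key after testing roughly 25%
# of the key space:
keys_tested = key_space // 4
print(keys_tested)  # about 18 quadrillion keys

# Each extra key bit doubles the worst-case search.
assert 2 ** 57 == 2 * key_space
```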
The final plenary session at the recent Mid-Year Meeting of the American Society for Information Science featured a dialogue between Bellcore's Michael Lesk and Microsoft's Robert Frankston. Lesk authored many UNIX utilities and has worked on information systems; Frankston is a co-creator of VisiCalc, the first personal computer spreadsheet, and has contributed to the design of numerous advanced technology projects. The session was co-sponsored by SIG/Library Automation and Networks (LAN). Moderator Ellen Sleeter posed several questions to which Lesk and Frankston responded. Some of their remarks follow.
Lesk stated that one of the big tradeoffs in the information world is that of efficiency versus privacy. Record-keeping permits targeted marketing and personalized payment, packaged products and quantity discounts, site licenses and shared costs. It makes things easier to find, but, by the same token, harder to hide.
Lesk listed the "four horsemen of the Internet" as drug dealers, terrorists, foreign espionage agents and child pornographers. If there is too much privacy, will Congress over-regulate the Net in response? Will that shut the whole thing down? Do we really want money laundering, anonymous libels, plagiarism, blackmail and electronic vandalism? Lesk believes we should discourage anonymity on the Web. He noted that telephone caller ID, while it can reduce privacy, has reduced the number of obscene phone calls in New Jersey. He is more frightened, he said, of the drug dealers than of the government taxing him inappropriately because it discovered how he spent his money.
Frankston took a different tack. He said he remembers the "post-McCarthy days," when the Post Office would take down the name of anyone writing to a Communist country. He asserted privacy is extremely important. People want to store their files and know they will not be accessed improperly. So he thinks we need to accept a certain amount of chaos. However, we must do it correctly. People will find ways to abuse any medium. You can choose how much information you want to give up. There is no simple answer; this will be an ongoing battle, Frankston said. He fears we will use the current bogeymen -- terrorists (he remembers when it was Communists) -- to give up our privacy.
Frankston observed that the basic problem with digital information is that people can access all your little transactions, add them up and do something with the information. You don't know the interpretation people will put on the data in the future. An innocent act years ago can be reinterpreted years later. He is less concerned about credit cards; we give out the numbers easily. Even before the Internet, there was a problem with people stealing credit cards when the cards were sent out through the mail.
Lesk noted that supermarket affinity cards tell what you have purchased. Supermarkets can figure out your diet; they can provide health information to insurance companies. That doesn't worry him, he said. We are willing to surrender some privacy in return for "a dollar off for a pound of fish."
More worrisome, Lesk said, is employers reading their employees' e-mail. As yet, there is no model on whether e-mail activities at work should be private. Policies requiring that e-mail should be used only for business can hurt commerce; a little friendly chitchat greases the wheels and makes business happen.
The big security issue on the Net is downstream copyright. Of the two terabytes of information on the Net, he said, none of it consists of copyrighted books. Publishers don't see how to protect their work.
What can help? Lesk outlined a technique of putting things on screens so that they can be read but not copied: the browser switches rapidly back and forth between two images, which fools the eye but makes the information hard to copy. There are also encryption chips for printers; a person can give publishers the key and be permitted to print their copyrighted material. There is also the problem of people either stealing others' work or damaging it to discredit the author. There are digital signature techniques to help with these problems, Lesk said.
Frankston countered that there are many technological solutions, but he doubts printer or screen encryption will work. Photocopiers used to be the big copyright problem.
The discussion continued for some time, with several members of the audience joining in as well. No one point of view seemed to predominate. However, a lot of thought-provoking information was exchanged, which was the idea behind the session in the first place!