June 8, 1998

Robert Biggerstaff
POB 614
Mt. Pleasant, SC 29465

Jane Coffin
Department of Commerce, Office of International Affairs
National Telecommunications and Information Administration (NTIA)
Room 4898
14th St. and Constitution Ave., NW
Washington, DC 20230
(202) 482-1890
privacy@ntia.doc.gov

Dear Ms. Coffin:

In the interest of truth in advertising, I would like to preface my remarks with some information about myself and my involvement in privacy issues. I was one of the panelists at the Federal Trade Commission's privacy workshop last June. I am a degreed engineer, and I have spent my entire professional career designing, developing, and working with computer networks and database systems, both in the private sector and on systems used by the United States government. While not a zealot or fanatic, I would describe myself as someone with a heightened awareness of privacy issues. This is a direct result not of any personal experience as the victim of a crime or invasion of privacy, but rather of my "inside knowledge" of the Internet, computer databases, and their uses -- and misuses. I also run an Internet website devoted to educating the public about the misuses of information and some steps they can take to minimize their risks. I am also the President of the National Association Mandating Equitable Databases (the NAMED, Inc.), a nonprofit consumer organization chartered to help educate the public about unsafe data industry practices and to assist consumers in "opting out" and otherwise asserting their rights over the use of personal and private information.

INTRODUCTION

No business sets out to intentionally injure consumers' privacy. What I see, as a professional, is "collateral damage" - misuse (intentional and unintentional) of data gathered for legitimate purposes. Profiling users, with their permission, to see how your Internet site is being used is a legitimate use.
Using that data to build a profile of people to sell and use for other purposes is not a legitimate use. Using it to market to children is not a legitimate use (see, e.g., Levi Strauss Mailing Targets Teen Girls, DM News, June 5, 1998. http://www.dmnews.com/articles/1998-06-01/1040.html). Collecting information necessary to process a subscription or billing is legitimate. Using that information to populate third-party lookup services is not. Using a credit report to decide whether or not to make a loan to a consumer is a legitimate use. Using it to commit identity theft is not.

As a counter to some of these invasive data practices, an unintended result is intentionally falsified information. If you can't assure me that you will use this information only for legitimate purposes and that you'll never release it to anyone else, I won't give it to you. And if you require me to provide things like a name and address, I'll simply provide false information. I've done it, as has nearly every other Internet user at one point or another. I have personally seen instances where over half of the names and e-mail addresses provided by users were fake. The end result is unreliable data that is nearly worthless. It is legitimate businesses that are hurt by this intentional obfuscation. A few horror stories of people hurt by illicit use of data cause scores of people to falsify data, even when it is given to reputable companies. I recently filled a prescription at a drug store. The pharmacist asked for my social security number ("SSN"), address, phone, and birthdate. I provided false information because there was no explanation of the need for or use of the information, and no guarantee of confidentiality. Even if such a guarantee had been given, I would still have given false information, since there is no real method to assure compliance or to redress a violation of that promise.
The lesson to be learned from this is that in the long run, legitimate businesses that depend on information will be enhanced by laws protecting privacy and the use of personal information. By eliminating illicit use, consumers will be less apprehensive about providing accurate data to legitimate firms. The one instance of a national pharmacy chain compromising customers' privacy (CVS) has caused me and thousands of others to now distrust all pharmacies.

RAW MATERIAL OF THE INFORMATION ECONOMY

Sensitive personal and private information is the hazardous material of the information industry. And like a hazardous chemical, it demands rigorous controls and handling precautions from those who choose to use it. If a leak occurs, it can remain undetected for years, only to surface far away from the source of the leak and cause damage decades later. I have never been one to foist unnecessary regulation onto business, but the status quo is unacceptable. We should not have to wait for the "Love Canal" of the information age before we take action.

INFORMATION IS MONEY

Now, more than ever before, information is money. Some companies, recognizing the future value of information, are collecting vast quantities of data on consumers even though they have no way to actually use the information.... at least not yet. Huge data 'vacuums' are hoovering up every piece of consumer data they can find.... partially for speculative or anticipatory use, but also as a hedge against future regulations on the collection or dissemination of such information. Data sellers are also pushing to obtain as much sensitive information as they can before laws and regulations stop their activities. I have seen numerous advertisements from data brokers, hawking their wares with phrases such as "get it now while you still can" and "buy your copy before new privacy laws close these records."
This is not a new phenomenon, but it has been enabled by 1) the increase in the power and capabilities of personal computers, 2) the explosion of digital information on the Internet, and 3) an inexpensive way to archive, duplicate, and send massive amounts of data around the country - the CD-ROM. For example, at least one vendor is selling financial data that is over 20 years old, since that data was grandfathered by privacy laws passed later. Another example is the state of South Carolina. In 1995, that state sold to direct marketers the driver's license information (name, address, birthdate, height, weight, etc.) of all 3.6 million South Carolina drivers. Even though the practice has since been made illegal and the database was sold only once, marketers continue to this day to sell a CD-ROM with the 1995 data on it. This data continues to be used by other marketers and database vendors to populate and expand other databases and lookup services. For any effort at control to be effective, it must apply to all data regardless of when or how it was obtained.

LOSS CONTROL PARADIGMS

Security of sensitive information presents unique challenges in the information industry. Traditional loss-control paradigms don't apply. For example, a hardware store takes steps to physically secure a saw from being stolen, since if the saw is stolen, the store has lost a valuable piece of tangible property. The saw, once lost, cannot be sold, and the store has to pay for another saw to replace it. With information as an asset, the rules change. If someone illegally accesses a computer with sensitive personal information on it, they don't actually take the information; they copy it. The original is still in its place. In general, the owner has not incurred a tangible loss (other than the loss of a potential sale to the person who stole the data rather than paying for it).
The traditional incentive - prevent theft to reduce tangible loss - is practically nonexistent with digital data. In fact, there is a disincentive to implement loss prevention where the cost of the protection exceeds the value of the potential loss. However, the loss to the consumer does not enter into the vendor's calculations. For example, consider a manufacturer who has a tank where a chemical is stored. The chemical is inexpensive, and even if it all leaked out and had to be replaced, the cost to the manufacturer would be only a few hundred dollars to replace the lost material. Why would he spend thousands of dollars on a new, leak-proof tank in order to prevent a few-hundred-dollar loss? The millions of dollars in damage to the environment and his neighbors' ground water is not part of the manufacturer's equation. Similarly, the "leaking" of personal and private information is not a tangible "loss" to the vendor who sells it, but it can be a serious loss to the consumer. Where is the incentive for a vendor to implement encrypted transfers, verify the identity of the recipient, keep an audit log, or audit for compliance? These costs do not improve their product or create more sales.... but they are absolutely necessary to protect his neighbors and society. Also, since a copy of information is as good as the original, once sensitive data is compromised, it can be mass-duplicated and redistributed at will. Personal information truly is a genie that cannot be coaxed back into the proverbial bottle.

INDUSTRY EVASION

Consider also the example of PublicData.com. In 1997, this Internet web site purchased the complete rolls of Texas drivers' license records and auto tags, and placed this data on the Internet - for free. Any person could visit this site and get the home address, birthdate, height, weight, and other information on anyone in Texas, including former President Bush and his wife Barbara.
Anyone could find the name, home address, and other information with just a Texas license tag number. In response, the state of Texas made it illegal to provide such information on the Internet. The result? To the chagrin of Texas officials and the distress of Texas residents, the PublicData.com web site simply moved offshore, escaping the reach of Texas law, and is now a commercial information broker, charging a fee for access to its databases. Anyone with an interest in the regulation of access to personal and private information should consider the example of PublicData.com very carefully. The Internet provides the perfect medium for doing two things - providing information and evading laws on providing information. Once information is released, it cannot be controlled. The only protection is to stop the release at its source.

AN ANALOGY TO CABLE TV

In considering regulations and other issues of the Internet, consider an analogy of the Internet to cable TV. You enter a monthly service agreement with a service provider, whereby you receive access to a number of channels (sites), and you have the ability to obtain other channels (sites) if you join/pay/register for them. There are also pay-per-view channels (sites). Your viewing habits can be easily tracked and recorded. There are channels (sites) for all types of special interests... including some patently for adults only, and some offering information on controversial topics such as abortion, birth control, AIDS, drugs... the list is endless. Service providers, with the assistance of the cable operators, have the ability to identify individual viewers of each channel (site). A complete dossier of what you view (visit) can be compiled and used for innumerable purposes - some innocuous, and some malicious. Public disclosure or commercial use of information about your viewing habits (cable or Internet) has serious impacts on privacy, on your ability to exercise First Amendment rights, and on other constitutional freedoms.
We are all aware of intrusive marketing ploys that collect all manner of personal information via a web site and then use it to further populate massive databases. There have already been examples of "front" sites (such as the Cult Awareness Network) purporting to be "for" or "against" a particular cause (such as abortion), when in reality they were being run by the "other side" as a scam to obtain names and addresses of "opponents" in order to target them for harassment or ridicule, or just to keep "on file" in case the subject ever ran for office or otherwise stepped into the public spotlight. This situation is not unique to the Internet. Many companies have set up toll-free "1-800" numbers purporting to offer free information about some medical condition, the weather, or another service. However, the real purpose of these systems is to capture the name and telephone number of callers in order to build a massive database of people suffering from the disease or interested in the issue served by the 1-800 service. Computers never forget. Will a web site visit resulting from a one-time curiosity about hemp production or a college term paper on abortion turn into a weapon to be brandished by an enemy 30 years hence? I believe it should not. Should visiting a site about a militia group in the news put you on a dozen mailing lists and identify you as a supporter of neo-Nazi groups? Absolutely not.... but this result is the state of technology and information practices today. Congress recognized in the Cable Communications Policy Act of 1984 (47 U.S.C. § 521 et seq.) ("the Cable Act") that viewing habits and other records associated with a consumer's cable TV account deserve extra protection, partially because of the high regard for personal liberty and privacy, and because citizens should be able to seek entertainment and information on cable TV without fear that records of their viewing habits would be subject to misuse or disclosure.
There are also similar federal laws covering other consumer data, such as movie rental and telephone records. There is no logical reason to limit these controls on the release of personal information to cable TV records. Look at the recent headlines. Special Prosecutor Kenneth Starr is attempting to force a bookseller to disclose what books were purchased by a consumer. Such records... if available for the asking... would certainly chill a consumer's desire and ability to freely choose the books they wish to read. The same holds true for any publishing medium... including the Internet. The irony is that if he were a direct marketer, the special prosecutor could likely buy almost any information he wanted on a consumer's purchases.

CONCLUSION

I believe that the limits placed on cable television records would be well suited as a model for the regulation of web site records. Indeed, with the massive computerization of all aspects of retail sales and information collection (such as supermarket "shopper" cards, personal dossier databases such as Acxiom, and the explosion of irresponsible Internet-based information brokers such as "Dig Dirt" and "Sherlock"), I believe the Cable Act model may need to extend to all manner of commercial entities that collect, store, and release consumers' personal information not already covered by federal law (such as credit reporting agencies). I leave the reader with the following hypothetical. If Judge Bork were nominated to the Supreme Court today, and instead of his videotape rentals, his Internet site visits were revealed, disclosing an occasional visit to controversial sites, what would be the reaction? What about his book purchases at Amazon.com? His chat-room discussions on America Online? What about his purchases at an on-line shopping site? What about records, attributed to Judge Bork, of a prankster pretending to be Judge Bork while visiting racist Internet sites?
I believe the response to such disclosures would be the same as we saw after the release of Judge Bork's video rental records.... swift action by Congress. I hope we don't have to wait for damage to be done before reasonable action is taken.

1. The discussion paper sets out nine specific characteristics of effective self regulation for privacy: awareness, choice, data security, data integrity, consumer access, accountability, consumer recourse, verification and consequences. Which of the individual elements set out in the draft discussion paper do you believe are necessary for self regulation to protect privacy? To what extent is each element necessary for effective self regulation? What are the impediments and costs involved in fulfilling each element of a self regulatory scheme? What are the competing interests in providing each element? How would the inclusion of each element affect larger, medium sized, and smaller companies? What advantages or disadvantages does each element hold for consumers? What are the challenges faced by companies in providing each element? How do these challenges depend upon the size and nature of the business?

A choice is valid only if the information and premises the choice was made on are valid. All the elements are part of the information foundation necessary to enable an informed choice. Most of these elements are already in place to varying degrees in different industries. However, they are implemented to the degree that benefits the seller of the data, not the consumer. For example, one of the credit bureaus implements a password-protected computer system for subscribers to dial in and pull credit reports. However, the system's design is more in keeping with verifying billing information and ease of use than with preventing illicit access. For example, the passwords are only two digits long and are never changed. In most cases, user IDs and passwords are assigned and remain the same for years.
As a computer security professional, I find such practices woefully inadequate to protect such information. The same company publishes and distributes false and misleading information about its security practices. One can only assume that such misinformation is aimed at falsely reassuring the public and staving off regulatory action. When considering application to businesses of varying sizes, regulation of the information industry is different from that of other industries. Where a small contractor or manufacturer individually has only a minimal impact on pollution or employment practices, those regulations often have legitimate exemptions for the small business. In the information industry, however, things are different. A small information broker can set up shop in just a few days for less than the price of a new Yugo. Unlike a small retail shop that is limited in the number of customers it can serve by the location and size of the store, an Internet "store" can serve millions of customers a day. While one small store front selling credit reports over the counter can cause only a small leak, a similar broker set up on the Internet can cause a flood, selling thousands of credit reports in one day. This is why even a small number of companies that fail to comply with "voluntary" standards will render any privacy protections useless. Anyone who wants illicit data will simply use one of the brokers who don't comply with the voluntary guidelines. The explosion of fly-by-night information brokers on the Internet also causes me great concern. Many of these companies are not concerned with anything other than whether or not your check clears the bank. Anyone can sign up and get telephone records, credit reports, medical data, unlisted phone numbers... the list is endless. This explosion of information brokers is catalyzed by the growth in the power of personal computers and the ease with which the Internet allows information to be collected and distributed.

2.
The draft discussion paper notes that individual industry sectors will need to develop their own methods of providing the necessary requirements of self regulation. How might companies and/or industry sectors implement each of the elements for self regulation?

Nothing has been proposed that is not either already in place or a simple adaptation of existing business practices. Collecting one more piece of customer information (may this data be re-sold?) is simple. This isn't rocket science. Don't collect information you don't need. Don't collect or use data without first telling the consumer what you are doing. Give the consumer the right to opt out and to see all information you hold on them. Protect all information you hold from misuse. If you decide to sell information, make positive identification of the recipient and keep accurate logs.

3. Please submit examples of existing privacy policies. In what ways do they effectively address concerns about privacy in the information to which they apply? In what ways do they fail?

I have no individual examples of such policies to submit, but I would like to make three observations about those that I have seen. 1. They rarely exist. 2. Where they do exist, they are hidden away and difficult to find - the equivalent of the 'small print' in a contract. 3. They are grossly in favor of the maker and against the consumer. One example is Microsoft, which says you can opt out of solicitation from third parties, but you have no ability to opt out of allowing Microsoft to sell to other database companies and marketers any data you provide to Microsoft or anything else Microsoft learns about you.

4. Are elements or enforcement mechanisms other than those identified in the draft discussion paper necessary for effective self regulation for privacy protection? If so, what are they? How might they be implemented?
In addition to the fair information practices and enforcement mechanisms stated in the discussion draft, are there other privacy protections or rights essential to privacy protection?

Three things come to mind. First are access logs and a prohibition on anonymous access. If release of information without permission is to be regulated, how can illicit access be proven? How can it be audited? Just like inquiries on a credit report, you must keep an accurate log of all people you have provided my data to. Without it, there is no audit trail. Anonymous access must not be allowed. Second, recipients of data must be able to identify the source of the data they obtain. For example, let's say I find my name and personal information being sold without my permission in a database of people who like jazz music. I suspect that data was sold by a jazz music website that I had told NOT to release my personal information. How can I prove it? Right now, I slightly alter the spelling of my name or address when registering products or on websites, in order to trace the source of any leak. This should not be necessary. I should be able to contact the database and require them to tell me from whom they obtained my data. Without this critical audit trail, any other regulation - voluntary or mandatory - is eviscerated. Third is the creation of new data products from multiple sources. Congressman Frank Horton said in 1966 that: "One of the most practical of our present safeguards of privacy is the fragmented nature of personal information. It is scattered in little bits across the geography and years of our life. Retrieval is impractical and often impossible. A central data bank removes completely this safeguard." A single personal computer now has more power than all the computers owned by the entire US government in Congressman Horton's time. A single PC can be outfitted with off-the-shelf products for less than $10,000 that can hold the name, SSN, address, and birthdate of every adult in the country.
By merging databases with seemingly innocuous data, new databases can be created that are a much larger threat than any of their subparts. These are in essence the "data banks" that Congressman Horton warned about. For example, a user visits a web site about allergy treatments, which collects the user's e-mail address and sells this information to a direct marketer. All this database says is that these e-mail addresses are interested in allergy remedies. By using a different database of global e-mail addresses, the marketer can determine the name and address of each e-mail address holder. With a name and address, they can obtain the phone number from any number of resources. With a name and address, they can also obtain the user's SSN, and demographics such as income and race. Now they can re-publish a database of name, address, telephone, SSN, birthdate, age, sex, race, income, etc., of allergy sufferers. This compilation of information from multiple sources creates new comprehensive databases that are much more invasive than the sum of their parts. Combining otherwise innocuous information in a central database can transform innocent tidbits of information that are not independently sensitive into a highly sensitive dossier.

5. Should consumer limitations on how a company uses data be imposed on any other company to which the consumer's information is transferred or sold? How should such limitations be imposed and enforced?

Traditionally, marketing databases are "seeded" with names and addresses designed to catch people violating terms of use. If you purchase a mailing list to send out a one-time mailing, but you re-use the mailing list in violation of the contract, these "seed" addresses will expose you. But if a marketer buys a database and transfers it to a third party who uses it for a lookup service, there is no way for the original vendor to know of this use.
Individual consumers do not have the ability to "seed" their personal information to track where it goes. They have no way of knowing when their personal information is added to a lookup service. When they do discover it, they have no way to determine where the information was obtained. This is why leaks of information can never be traced - there is no trail of bread crumbs back to the source. I believe there is a need for a confidential brokerage service. If I have a database, and I have promised privacy to my customers in exchange for their giving me their information, I must not release it to third parties. Suppose a marketer wants to send free coupons to my customers, but I can't give him their addresses. He could give me the coupons and I could mail them, but then what proof does he have that I mailed them and didn't just throw them away? A confidential broker could take the marketer's coupons and my mailing list, and confidentially use my mailing list to address his coupons to my customers. My customers' privacy is not compromised. However, a more sinister aspect of re-use beyond intended purposes exists. As more business is based on information and data, the value of information as an asset increases. A common phenomenon is a business whose major asset is not the building it owns or its manufacturing facility - it is the data it has compiled. A business may be able to compile a very comprehensive database of very personal and private information because of rigorous assurances to the subjects that their information will be used only inside the company and will not be released for any other purpose whatsoever. That database could be an irresistible fruit to a direct marketer who could buy the company (or enough stock to control it) just to obtain the database. Take, for example, Catalina Marketing in St. Petersburg, Florida (http://www.catmktg.com/news.htm).
Catalina Marketing contracts with over 11,000 supermarkets that, using "frequent shopper" or similar cards, collect and record every single item purchased by each consumer. What would happen if, in 10 years, an insurance company bought Catalina Marketing to get its database? Using this database, the insurance company could deny insurance to someone because they had bought too much fatty food in the last 10 years. They could sell the data as a lookup service to employers doing background checks to see if an applicant buys alcohol or cigarettes. They could even identify single women applicants who buy birth control for the "discriminating employer who prefers morally wholesome employees." And every bit of this is legal.

6. Please comment specifically on the elements set out in the draft discussion paper that deal with enforcement (verification, recourse, and consequences) and suggest ways in which companies and industry sectors might implement these. What existing systems and/or organizations might serve as models for consumer recourse mechanisms, and explain why they might or might not be effective? Would a combination of elements from existing systems and/or organizations be effective? How might verification be accomplished? What would constitute adequate verification, i.e., in what instances would third-party verification or auditing be necessary, and in what cases would something such as self certification or assertions that one is "audit-ready" suffice? What criteria should be considered to determine the kind of verification that would be appropriate for a company or sector? What constitutes "reasonable access?" What are the costs/impediments involved in providing access? What criteria should be considered to determine "reasonable access" to information for a company or sector?

First, industry has often cried wolf over the perceived injury of new regulations. Second, when regulation is the right thing to do, it needs to be done while the cost is still reasonable.
The information industry has had free rein to pillage personal information at will, and now it is time for it to accept more responsibility in return for its membership in the society of man. The "wild west" of unfettered access to and exploitation of personal and private information must end. In fact, the longer the industry remains unregulated, the more Draconian eventual regulation will be perceived to be. Had reasonable regulation been imposed from the beginning, we would not even be asking these questions today. Since comprehensive privacy protections - either voluntary or mandatory - have not been in place in this country, it is difficult to predict with any specificity the details of compliance verification systems. Speculating, I believe an adaptive combination of both self-verification and third-party verification would be reasonable. For example, self-verification and minimal third-party verification, along with the compiling of complaints, would form a first tier of "grading." Companies with high grades for compliance can continue self-verification and minimal third-party verification. Those with lower grades will be hoist by their own petard... and subjected to more rigorous third-party verification until such time as they can demonstrate the leadership to obtain higher grades. As a company's level of compliance changes, its level of verification will adapt to that level of compliance. An adaptive system that rewards compliance with lower costs of verification creates the incentive for companies to comply voluntarily. One excellent method of verification is empowering consumers as the primary enforcement mechanism. So-called "private attorney general" statutes, such as the Telephone Consumer Protection Act (47 U.S.C. § 227), work very well in this regard, as they effectively deputize 200 million Americans to enforce the law.
Statutory damages such as those under the TCPA or the Cable Act provide an appropriate level of compensation both for the injury to the victim and for the time and effort spent enforcing the law.

7. In the section on consequences, the draft discussion paper states that "sanctions should be stiff enough to be meaningful and swift enough to assure consumers that their concerns are addressed in a timely fashion." Identify appropriate consequences for companies that do not comply with fair information practices that meet this goal, and explain why they would be effective.

One problem is that traditional market sanctions don't come into play. Consumers cannot "vote with their feet" by taking their business elsewhere unless 1) the consumer is informed and 2) the consumer actually does business with the company. Consider credit bureaus. The consumer does not "pick" which credit bureau to do business with, or which credit bureau has the franchise to sell that consumer's data. If a local bank decided to sell the name, address, SSN, birthdate, and other credit "header" data on its customers to anyone who paid $2.00 for it, that bank would soon go out of business, since consumers would 1) know about it and 2) take their business elsewhere. However, that is exactly what the credit bureaus do, because they have no allegiance to the consumer, and the consumer has no ability to take their business elsewhere. Another aspect of traditional penalties associated with regulation of information access is demonstrated by the Fair Credit Reporting Act ("FCRA"). A consumer can generally take action only against someone who obtains information in violation of the FCRA, not the company who provides it. In addition, a consumer must usually demonstrate actual damages. These hurdles combine to form a nearly insurmountable burden for anyone seeking redress against illicit use of their information.
So few cases can surmount these burdens that companies can simply pay the losses from the rare successful complaint rather than put more effective access controls in place.

a. Penalties for unauthorized release, not just unauthorized access. While a person who accesses information for illicit purposes is certainly committing a wrongful act, a company that does not perform due diligence in determining the identity of the recipient and the recipient's right to access the data is failing in its duty to protect the data from unauthorized use. We do not allow dynamite or guns to be sold without a license check and positive identification of the recipient. Data such as credit headers can just as easily be used to commit crimes. Similarly, records of web site visits, Internet purchases, and other Internet activities can be used to embarrass, extort, and otherwise injure consumers.

b. Liquidated damages. The failure to follow safe information practices leads to many injuries to consumers, but in most cases those injuries are not quantifiable. It is difficult to put a price on the feelings of fear and dread that sweep over you when you find out someone has obtained your credit report or DMV records without your knowledge or permission. The time spent contacting banks and credit card companies to respond to potential security threats is time the victim should not have to spend. A $1,000 civil penalty for the release of information without verification of the recipient's identity and permissible purpose should be available, independent of any necessity to prove actual damages. Such "private attorney general" statutes are well known and work well. As an example, the Telephone Consumer Protection Act (47 U.S.C. 227) provides for this type of enforcement. It has worked extremely well in practically eliminating junk faxes and automated telephone solicitation calls. The Cable Communications Policy Act of 1984 (47 U.S.C.  
521 et seq.) is another good model for private enforcement of privacy regulations.

8. What is required to make privacy self regulation effective? Self-regulatory systems usually entail specific requirements, e.g., professional/business registries, consumer help resources, seals of accreditation from professional societies, auditing requirements. What other elements/enforcement mechanisms might be useful to make privacy self regulation effective? How have these enhanced or failed to enhance a self-regulation regime?

As explained earlier, seals of accreditation, registries, etc., only come into play when the consumer can choose whether or not to do business with a company. In the case of lookup services, information brokers, and credit bureaus, such choice is nonexistent. It would be wonderful if enhanced privacy could be treated as a value-added service and traditional market forces could operate, but that is not the case today. On the other hand, where licenses are required for a specific field (attorneys, private investigators, etc.), making licenses contingent upon compliance with fair information standards can work, given three things:

a. Swift and sure enforcement. There must be a high likelihood of getting caught.

b. Meaningful punishment. A slap on the wrist is insufficient. A violation must be treated as a significant transgression that, if repeated, would threaten the professional license.

c. Regulatory necessity. The license that is subject to revocation or suspension must be legally necessary for doing business in that field.

9. Self regulation has been used by the business community in other contexts. Please provide examples and comment on instances in which self regulation is used in an industry, profession or business activity that you believe would be relevant to enhance privacy protection. In what ways does self regulation work in these instances? In what ways does it fail? How could existing self-regulatory regimes be adapted or improved to better protect privacy? 
The measures in the draft are all premised on the consumer being able to make a choice. But that choice, in order to be a real choice, has to be an informed one. I am unaware of any self-regulation of privacy-related issues that has been even moderately successful. The guidelines proposed by the DMA last summer are woefully inadequate.

With regard to voluntary guidelines in the marketing industry, one example is worth noting. The Direct Marketing Association ("DMA") encourages its members to use the Telephone Preference Service ("TPS") in their telemarketing campaigns. This is a list of consumers who have taken an affirmative step to tell telemarketers that they do NOT want to receive telemarketing calls. The result? In 1991, the DMA testified before Congress that of over 3,500 DMA members, fewer than 90 actually used the TPS. Many DMA members actively tell consumers that to reduce unwanted telemarketing calls, they should contact the DMA and register with the TPS. Yet many of the same members who promote the TPS to consumers as a way to reduce the number of telemarketing calls do not use the TPS themselves!

10. Please comment on the extent to which you believe self regulation can successfully protect privacy online. Are there certain areas of online activity in which self regulation may be more appropriate than in others? Why?

At this time, I am unaware of any self-regulation of online privacy-related issues that has been, or could be, even moderately successful. This is dictated by several things. First, consumers are for the most part unaware of the perils of misuse of personal and private information. It took Love Canal to wake the country up to chemical pollutants. It will unfortunately take a similar catastrophe to demonstrate to the American people the dangers of information misuse. Second, as explained earlier, a single noncompliant information outlet can do very serious damage in a very short time. 
Self-regulation, by its voluntary nature, will always be underinclusive and will fail to bring many companies into compliance. Third, the financial incentives are all wrong. An information broker can sell a header report on the Internet for $50.00, but the report costs the broker only $2.00. The entire transaction can be automated; no human hands have to be involved... the end user inputs all the data, and the web server does the rest. The results can be returned instantly. Why should anyone engaged in this business comply with "voluntary" standards when the result is a reduction in sales and an increase in costs? With financial incentives like these, there will always be illicit providers who will flout voluntary guidelines.

11. Please comment on the costs business would incur in implementing a self-regulatory regime to protect privacy. How do these costs compare to the costs incurred to comply with legislation or regulation?

With regard to websites, the vast majority of information collected is gathered automatically or entered by the users themselves. Giving users the ability to delete their information, or a field to indicate that the information is not to be re-released or sold, would be inexpensive to implement. Indeed, in most cases, web sites have gone to extra expense just to obtain and store information that is totally unnecessary to the operation of the web site... information obtained solely for marketing purposes.

A greater cost is the lost profit from sales of personal data about people who previously had no option to "opt-out." If 50% of the people opt out, that reduces the income from the subsequent sale of the consumers' information. In some instances, the sale of consumer information is a significant source of income. As an example, some specialty magazines and other subscription services derive more income from the sale of subscriber information than they do from the subscriptions themselves. 
I do not believe, however, that there is any right to continued unjust enrichment from the continued sale of information that should not have been sold, or even collected, in the first place.

12. What issues does the online environment raise for self regulation that are not raised in traditional business environments? What characteristics of a self-regulatory system in a traditional business environment may be difficult to duplicate online? Does the online environment present special requirements for self regulation that are not present in a traditional business environment? Does the traditional business environment have special requirements that are not presented in the online environment? What are these requirements?

The power of personal computers and the connectivity of the Internet are major differences between Internet paradigms and those of traditional businesses. The key with respect to privacy, however, is the nature of the medium: digital data. Digital data can be massaged, manipulated, indexed, sorted, and compiled at the speed of light. Traditionally, names, addresses, and other information were collected on paper application or membership forms. The cost of actually entering all that data into a computer was prohibitive. The increase in the value of the data, and reductions in the cost of obtaining it in electronic form, have changed the equation significantly. By having users enter the information themselves, this hurdle is removed. The digital medium is the single most important element that has enabled the assault on privacy. More data can be collected at more points than ever before.

Another concern is the ability of an information broker to set up shop on the Internet with no physical place of business, no business license, and no other controls. While many reputable businesses operate this way, so do many disreputable ones. It is easy to find dozens of information brokers on the Internet... 
many having come into existence in the last few months. The anonymity of the Internet also presents certain problems. Without controls requiring information providers to verify the identity of the recipients of the information they are selling, anyone can obtain practically anything anonymously. I have personally obtained information under a false name and had it sent to an anonymous e-mail address. It is a stalker's paradise. The Internet enables this type of surreptitious activity.

13. What experiences have you encountered online in which privacy has been at issue? In what instances has privacy appeared to be at risk? In what instances is it well protected? In what ways have businesses or organizations been responsive to privacy concerns? How difficult have you found it to protect your privacy online? What circumstances give rise to good privacy protection in a traditional business setting or online?

Web sites frequently ask for personal information. Because of my professional knowledge of computers and databases, I personally refuse or provide false information. Most users, however, do not, and thereby expose themselves. I have refused to use America Online and some other services that I know, or suspect, to engage in rampant privacy abuses. While I, as an expert, find these precautions a mere inconvenience, most users do not share my level of knowledge or expertise and would find such precautions burdensome if not impossible to implement. They are generally unaware of the risks such disclosures may entail, and they are not able to take the precautions that I take. Placing the burden of "protection" on the unknowing user is unacceptable.

14. The Administration's A Framework for Global Electronic Commerce cites the need to strike a balance between freedom of information values and individual privacy concerns. Please comment on the appropriate point at which that balance might be struck. 
What is the responsibility of businesses, organizations or webpages to protect individual privacy? To what extent do these parties have a right to collect and use information to further their commercial interests? To what extent is it the individual's responsibility to protect his or her privacy?

Freedom of information does not mean that information must be made available anonymously and in bulk. It does not mean unjust enrichment. It does not mean invasion of privacy. It does not mean false light. Freedom of information means that government is not conducted behind closed doors. It means that the government's papers are open to inspection by the governed. It does not mean that a website has carte blanche to record and report on everything I do without my permission. It does not mean that a business can make the release of personal and private information, beyond that legitimately needed for business purposes, a condition of sale.

The doctrine of sic utere tuo ut alienum non laedas is still the law in this country. You must not use your property or business to injure me. It also means that you must not knowingly enable others to hurt me with your products. The digital age has removed the protections Congressman Horton spoke of. Data is no longer scattered in little bits here and there. Identity theft, invasion of privacy, and threats to personal security are all real injuries enabled by lax data-handling practices. The goal must be to protect consumers from the misuse of their information... regardless of who actually misuses it.