UNITED STATES OF AMERICA
DEPARTMENT OF COMMERCE
AND
THE INTERNET EDUCATION FOUNDATION

- - -

ONLINE PRIVACY TECHNOLOGIES
WORKSHOP AND TECHNOLOGY FAIR

- - -

Auditorium
Department of Commerce Building
1401 Constitution Ave., N.W.
Washington, D.C.

Tuesday, September 19, 2000

The workshop was convened, pursuant to notice, at 9:18 a.m.

P R O C E E D I N G S

(9:18 a.m.)

INTRODUCTION - GREGORY L. ROHDE, ASSISTANT SECRETARY OF COMMERCE FOR COMMUNICATIONS AND INFORMATION, ADMINISTRATOR, NTIA

MR. ROHDE: Good morning and welcome. My name is Greg Rohde. I'm the Assistant Secretary of Commerce for Communications and Information, as well as the Administrator of the National Telecommunications and Information Administration. I am very pleased to welcome all of you here this morning to this Online Privacy Technology Workshop and Technology Fair.

Central to the mission of NTIA is promoting electronic commerce, as well as promoting the development of new technologies. I am very pleased that you could all join us here today as we carry out that mission and look at how we can promote electronic commerce and develop new technologies.

The purpose of today's workshop is to focus on one particular area of electronic commerce that is vitally important to consumers, and that is privacy. You don't need national polls to tell you that consumers care deeply about protecting their private information online. You can simply look in the comic strips. When an issue reaches a comic strip such as "Cathy," with her talk of "the information superhighway," you know we've got an issue that is really close to the pulse of the American people.

So I'm very pleased that we can be here today to discuss this issue. We have a number of panel discussions throughout the day that are going to explore various aspects of this issue. We are also very fortunate to have a Technology Fair, which is occurring out in the foyer of the building and has been co-sponsored by the Internet Education Foundation. We're very pleased at their participation in this event.

Without any further delay, I would like to introduce the new Secretary of Commerce. We're very pleased that the new Secretary has chosen to join us here for a few minutes and is going to give a keynote address. As many of you know, this summer President Clinton nominated Mr. Norman Mineta to succeed William Daley as the Secretary of Commerce. Mr. Mineta comes to us from a long career on Capitol Hill, and before that he began his career as the Mayor of San Jose. I can tell you that since coming on board at the Commerce Department he has certainly energized and inspired many here, and we're extremely honored and pleased that Secretary Mineta is here with us today.

So without any further ado, I give you Secretary Norm Mineta.

KEYNOTE ADDRESS - HON. NORMAN Y. MINETA, SECRETARY OF COMMERCE

SECRETARY MINETA: Greg, thank you very, very much, and I want to thank you for the terrific job that you are doing as our head of Communications and Information, as well as in your other role as Administrator of the NTIA.

First of all, I want to thank all of you for taking time from your own busy schedules to be here with us today.
This is a very, very important conference, and I'm really impressed with all of you being here.

Two years ago, when we held a privacy summit, a poll was released that said that more than 80 percent of Americans were concerned about threats to their privacy when they were online. Unfortunately, according to recent reports, the numbers have not changed, and a Pew study shows that over 86 percent of Internet users are still concerned about businesses or people they don't know getting access to their personal information.

What has changed, however, is the way that industry is responding to these concerns and the new technologies that are available to protect online privacy. That's why we are all here today, to explore what those might be.

In 1997, when a fraction of today's web population was online, President Clinton and Vice President Gore issued their policy framework on e-commerce. They understood that if e-commerce were to succeed, it would be under private sector leadership. It was the government's role to foster an environment to promote e-commerce and to work with the private sector to ensure that the Internet would grow worldwide.

Since then the Internet has without doubt flourished, and technology has dramatically lowered costs. More and more people and businesses are online, and e-commerce is projected to climb into the trillions of dollars over the next several years. Now, all of us, both in the private sector and on the government side, want this to prosper.

The Internet has been a central part of the longest peacetime economic expansion in our history. But we don't think this growth will continue unless both consumers and businesses are confident about their experiences on the net. So we see privacy as a make or break issue.

The administration is committed to continuing to protect privacy, and we have supported legislation to protect children, sensitive financial information, and medical and genetic information as well. And we have worked closely with the private sector to implement meaningful self-regulatory privacy regimes.

Now, there has been progress. The FTC found that 88 percent of Internet sites now have some form of disclosure. We now have several third party seal organizations that certify that a web site is complying with its privacy policy, and a number of companies, working with my predecessor, Secretary Bill Daley, have prompted other companies to adopt and disclose good privacy policies.

These companies use their market leverage by withholding advertising from web sites that fail this test. But it is a long road that we are all traveling, and it seems to me much work remains to be done. In the Federal Trade Commission survey, only 20 percent of Internet sites had policies that tracked fair information practices.

Now, clearly we have a challenge, which I think can best be met if we all work together -- government, industry, and consumers. One, we need to be sure that all content providers satisfy the fair information principles. Among other things, they must provide Internet users with information about who is collecting their personal data, how it will be used, and how its disclosure will be limited.

Secondly, we need to empower consumers so that they can make informed choices about data collection.
Internet users should have a choice about whether to share information and with whom they want to share it.

Three, consumers need to have reasonable access to information collected about them and an effective means of recourse.

Finally, we are here today to let consumers know what tools are available to protect their data.

Now, industry is doing a good job in developing privacy-enhancing technologies, but the word hasn't gotten out to the consumer. Reports show that only one in 20 Internet users has used software that hides their computer identity from web sites. Only 10 percent of all users have set their browsers to reject cookies.

Now, there are a number of privacy-enhancing technologies on the market or in the final stages of development. Many incorporate the Platform for Privacy Preferences, which allows users to determine if a web site meets their privacy preferences.

Let me indicate that all of us here at the Department of Commerce are committed to making our web site compliant with these standards. Online privacy is a critical issue for Internet users, one that we all have to address if the extraordinary potential of this technology is to be fully realized in the twenty-first century. So we have invited all of you here today to put a spotlight on privacy-enhancing technologies and what their capabilities are, and to learn from you what might stand in the way of their future deployment and use.

Tomorrow on Capitol Hill, Senator Hatch and the Internet Caucus will be hosting similar demonstrations to further highlight this issue.

Let me close by saying thank you very, very much to all of you for taking the time to be here and to participate, and I look forward to your comments and, more importantly, your recommendations on how we can enhance privacy online and make sure that the Internet is a secure place to visit.

Thank you very, very much.

(Applause.)

MR. ROHDE: Mr. Secretary, thank you very, very much. We know that the Secretary's schedule is very, very tight, and we are very pleased that he was able to be here.

Before we begin the panels, we wanted to provide a general overview of privacy on the Internet. To do so we have asked Dr. Lorrie Faith Cranor of AT&T Labs to give a general presentation that will provide a backdrop for the discussion we will have throughout the day. Dr. Cranor is a senior technical staff member at AT&T and has done a great deal of research focused on areas where technology and privacy policy intersect, including online privacy, electronic voting, and spam.

She is currently the Chair of the Platform for Privacy Preferences Project Specification Working Group and the Co-Chair of the P3P Interest Group at the World Wide Web Consortium.

So I would like to introduce Dr. Cranor.

(Applause.)

PRESENTATION: OVERVIEW OF ONLINE PRIVACY-ENHANCING TECHNOLOGIES - LORRIE FAITH CRANOR, AT&T RESEARCH

DR. CRANOR: Good morning. I'm going to try to take you through a whirlwind tour of the different types of privacy technologies which are currently available, show you some samples of what they look like, and give you a brief overview of how they work.

As has already been mentioned, we know that online privacy is important because it appears in the comics.
In this particular "Cathy" strip, part of a roughly two-week series on online privacy, Cathy is finding out just how much can be learned about her online. In this conversation she learns that Irving has found out the following. He says: "You love old movies where men are heroes, love is forever, and women get to wear little hats." And she says: "You know that about me?"

He says: "Yes. You dream of snuggling up under a vintage quilt, sipping cocoa from Southwestern mugs, listening to the Three Tenors on CD." She says: "Irving, you, you created an online profile of me."

He says: "Yes, and also your stress-relief tablets are back ordered."

So Cathy wasn't too pleased about this, and in the next few strips we learn how Irving was able to find all this information out about her. We find out that, first of all, he snooped her e-mail and he looked at the files on her computer, because her employer had actually hired Irving. He was able to observe the chatter sent by her browser, all the extra information that her browser sends, and he was also able to set cookies through banner ads and to use web bugs that allowed him to track her activities across web sites.

(Screen.)

So let's take a minute to look at what browser chatter is. Browser chatter is basically all the things that your browser is saying when you make requests on the web. This will include your IP address and your domain name. It includes the referring page, that is, the web page that you visited before the page you're currently at.

It also has information about your computer, including your operating system and what kind of browser you're using. It also includes information about what requests you're actually making: the URL, and if you're performing a search, the search terms that you're using. And it also includes cookies.

Now, who gets to hear this chatter? Well, obviously the web site you're visiting gets to hear this, but there are other people who might hear it as well. This includes your local system administrators and your ISP. There may be other third parties involved, such as advertising networks. And any of this information that ends up in log files at any of these places might be subpoenaed later.

(Screen.)

So here is an example of a typical request, and this is an actual request that I made. I shortened it a little bit so it would fit on the screen. But I went to buy.com and I wanted to buy some beer, so I did a search for beer, and this is what my browser transmitted.

You can see here that it includes my actual request for beer; it includes the fact that I am communicating in English and that I'm from the United States; what web site I was at before I did the search; as well as a cookie, which right now indicates that I have an empty shopping basket. But there are all sorts of fields where they could put more information in there.

Since I performed this search on my employer's network, they know that I was shopping for beer at work.
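(Illustration: the request Dr. Cranor describes can be sketched in a few lines of Python. The host name aside, every header value below is invented for illustration; a browser of the era would send similar fields automatically.)

    import http.client

    # A sketch of "browser chatter": the extra headers a browser volunteers
    # alongside a simple search request. All values are illustrative.
    conn = http.client.HTTPConnection("www.buy.com")
    conn.request(
        "GET",
        "/search?query=beer",                       # the request itself
        headers={
            "Accept-Language": "en-us",             # language and country
            "Referer": "http://www.example.com/",   # page visited just before
            "User-Agent": "Mozilla/4.0 (Windows 98)",  # browser and OS
            "Cookie": "basket=empty",               # state from an earlier visit
        },
    )
    print(conn.getresponse().status)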
(Screen.)

Okay, what about cookies? We've heard a lot about cookies in the past, and cookies are actually a very useful thing. I like to think of them kind of like staples. If, in the physical world, you're filling out a form that runs to multiple pages, you often have multiple sheets of paper that are stapled together. And if it's a well-designed form, you only have to write your name on the front page. You don't have to write it on every page, because the form is stapled together and they know that all the information on pages 2, 3, and 4 applies to the same person who wrote their name on page 1.

Well, cookies can perform that sort of function in the online world, basically tying together information that is submitted on multiple pages of a form.

Another use for cookies is to identify you when you return to a web site so you don't have to remember a user name and password. They can also be useful for web sites that want a better understanding of how people use the site: when you go to a newspaper site, does everybody read the sports page first? They'd like to know those sorts of things.

So there are some very useful uses of cookies. But as we've heard, there are also some ways in which cookies can be harmful. In particular, people are reacting to cookies being used to profile users without their knowledge. Cookies can be used to monitor users across multiple web sites in the background, so that users don't even know they're being tracked.

(Screen.)

Here's an example of how this can work. Imagine that you go to a search engine and do a search for, say, some medical information, and the search engine has an ad, and that ad is going to set a cookie on your computer. When it sets a cookie on your computer, basically it's sending this little bit of data to your computer, and whenever you go back to the web site that set the cookie, your computer is going to send that little bit of data back.

In this case, it's not the search engine that set the cookie, but the ad company. So after I've done my search, now I'm going to go to a book store and buy a book. It turns out that this book store uses the same ad company as the search engine, and so when it goes to display the ad, my computer is going to send the cookie back to the ad company. So now the ad company knows that I went to the search engine and then I went to the book store. It has tracked me across multiple sites.

If the ad company has the cooperation of the search engine and the book store, it may actually be able to get some more information, because while the cookie itself doesn't reveal who I am, when I bought that book I had to provide my name and my address, so the book store knows who I am. If the book store cooperates with the ad company, now they have an identified profile of me.

Another thing that you may have heard about in the media is web bugs. Basically, a web bug is very similar to what I just showed you, where you have an ad that's setting a cookie, except that a web bug is invisible. So there's some invisible little dot on your screen that has a cookie associated with it and may be tracking you, but not only do you not know that's happening, you can't even see that the dot is there.

(Screen.)
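(Illustration: a minimal sketch of the cross-site tracking just described. The ad company's cookie comes back from every page that carries its ads, letting it join separate visits into one profile; all identifiers and site names here are invented.)

    # The ad network's server-side view: one cookie value, many sites.
    profiles = {}   # ad-network cookie value -> list of observed visits

    def log_ad_request(cookie_id, site, page):
        profiles.setdefault(cookie_id, []).append((site, page))

    log_ad_request("u42", "search.example", "/search?q=medical+condition")
    log_ad_request("u42", "books.example", "/checkout")

    # If books.example shares the buyer's name, cookie "u42" becomes an
    # identified profile spanning both sites.
    print(profiles["u42"])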
Other privacy problems have come up due to the referer header. As I mentioned, the referer is the address of the last web site you visited before the current one. Often when you go to web sites that have forms, after you fill out the form, if you look carefully at the URL bar, you'll notice that it has changed and that some of the information you typed into the form is now actually in that URL. So there's an example at the bottom of the screen where an actual person's name and address become part of that URL.

Now, when I go to the next web site, that information is going to be transmitted to that web site. So this is a big problem. There are a lot of companies that, once they became aware of the problem, changed the way their web forms work so that this doesn't happen. But there are still a number of sites where this is a problem.

(Screen.)
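(Illustration: a sketch of the referer leak just described. A form submitted with the GET method places the typed answers in the URL itself, and the browser then repeats that URL to the next site in the Referer header; all names and values here are made up.)

    from urllib.parse import urlencode

    # The form handler puts the submitted fields into the result URL.
    form_data = {"name": "Jane Doe", "address": "12 Main St"}
    result_url = "http://forms.example/submit?" + urlencode(form_data)

    # The next request -- to a completely different site -- carries:
    next_request_headers = {"Referer": result_url}
    print(next_request_headers["Referer"])   # Jane's details, sent elsewhere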
Okay, so what can you do about this? Well, this is a slide that I borrowed from one of my colleagues. He says, well, you know, you can go to cyber cafes: only use the Internet while you're drinking coffee at different cyber cafes, and of course never go back to the same one. And you can use free e-mail services instead of an ISP and keep changing them, or set up a prepaid cash account with your ISP and be sure to give all phony information. And you can forge e-mail, and of course never go shopping online where you have to actually give out personally identifiable information. If you do all that, then you should be pretty safe.

Fortunately, there are some other solutions. There are a number of software tools, many of which you are going to see demonstrated today in the foyer, that can help people address some of these concerns.

The first set of tools I'm going to talk about are anonymity and pseudonymity tools, which basically help people surf the web without any of their actions being linked directly to them. Another set is encryption tools, and I'm going to touch on them only briefly. That's not really the focus of the session today; we could do a whole other session just on encryption tools. Then another set of tools are filters. Basically, these are tools that make sure your computer is not sending out certain kinds of information, so they may be blocking cookies, or making sure that your child is not sending out their name or their phone number over the Internet, things like that.

(Screen.)

There are also some tools that are helpful for information and transparency. These basically inform users as to what's going on, what information is collected about them, and what is going to happen to it. Then there are a whole bunch of other tools that don't fall neatly into these categories, and I'm going to talk about a few of those.

(Screen.)

Just to give you a holistic picture of how this all works, imagine that you have your user, you have various services and web sites, and there is the Internet, this big cloud in the middle. The user lives somewhere where there's a regulatory and self-regulatory framework, and the services they visit may be following the same laws and the same guidelines, or they may be following different ones.

Then the user has the ability to send all their messages on the Internet through a number of different tools. These might include an anonymizing agent, a cookie cutter, or a P3P tool. So the user can basically get extra privacy features by using these tools.

(Screen.)

The other thing is that it's very important that any data the user does decide to send to a web site be visible only to that web site. So the user needs to have a secure channel, through encryption tools, to make sure that this data is not picked up by anybody else who happens to be listening.

(Screen.)

So let's start with anonymizing proxies. With this kind of tool, a user can set up their web browser so that all the requests they make to the Internet go through a proxy server. Basically, the proxy server takes the request, strips off identifying information, and forwards it to wherever the user wants it to go.

Then when the web site responds, it sends the response back to the proxy. The proxy knows which user requested it and can send it back to the user. As a result, the end servers have no idea who the original user is. However, the proxy can see everything, and so we have to trust that the proxy has a good policy and is not using that data in ways that the user doesn't want. There are a number of different proxy-based services. Some are free, some are subscription, and some are supported by advertising.
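(Illustration: the core of the anonymizing-proxy idea in a few lines. A real service does much more -- rewriting links, managing sessions, filtering cookies -- but the essential step is stripping identifying headers before forwarding; everything here is a simplified sketch.)

    import http.client

    STRIP = {"cookie", "referer", "user-agent", "from"}   # identifying headers

    def forward(host, path, headers):
        # Drop anything that identifies the user, then re-issue the request
        # from the proxy, so the end server sees only the proxy.
        clean = {k: v for k, v in headers.items() if k.lower() not in STRIP}
        conn = http.client.HTTPConnection(host)
        conn.request("GET", path, headers=clean)
        return conn.getresponse().read()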
(Screen.)

One of the best known is called the Anonymizer, at anonymizer.com. For example, if you are visiting Yahoo using the Anonymizer, it adds a little blue bar at the top of your screen to indicate that the page was loaded by the Anonymizer. It includes a control panel that lets people adjust some of the settings, for example whether or not you want the Anonymizer to filter out cookies or let them through. There are a few other settings that the user can control.

(Screen.)

There are also a number of tools that are related to anonymity tools but have an extra capability. I call them pseudonymity tools. They have the ability to automatically generate user names, passwords, e-mail addresses, or other information -- basically pseudonyms for the user -- and keep track of them. A lot of users say that when they go to web sites and are asked to provide information, they lie and make up names. But you actually have to keep track of what you made up if you want to maintain a relationship with the site.

These tools will keep track of that automatically for the user. One example is iPrivacy, and I believe they're going to be doing a demonstration here. Basically, when I go to a web site and I want to order something and have it shipped to me, I can type in my information, and iPrivacy will translate that information into an essentially encrypted private identity; then it will fill out the form for me with this translated information, and the web site will get the encrypted information.

They will have enough information to give to the credit card company and to the shipping company so that the transaction can be processed, but the web site itself doesn't know what my credit card number is or what my home address is, and in the end the shipping subsystem can basically print an address label which allows the product to be delivered to me without the company actually knowing who I am or where I live.

(Screen.)

This is the screen. Users have to download some software onto the computer to make this work.

(Screen.)

There's another company called Incogno which has a similar tool, except it doesn't require that users download anything onto their computer. They work with merchants, and if the user visits a web site that is equipped with Incogno, they can use a similar tool where Incogno may have access to some of this personal information but it is not actually released to the vendors themselves.

(Pause.)

Now, this is where technology breaks down. Let's see if I can revive this. Otherwise I'll wing it.

(Screen.)

Now we're in trouble.

Okay, while my computer is rebooting: some of the other tools I'd like to show you pictures of, but you'll just have to imagine. Besides the anonymity and pseudonymity tools, the next tools I was going to talk about are encryption tools. There are encryption tools which are useful for encrypting data in a variety of ways. Some are focused on encrypting data as it's being transmitted from your computer to the web site. There are also tools focused on encrypting e-mail messages. And there are some focused on encrypting data while it's on your computer or in a database out there.

I'm not really going to talk about those in detail, other than to say that they're really important. If you have all of these other privacy tools and your data is just floating out there unencrypted, that's not going to offer a whole lot in the way of privacy protection. So it's important not to leave out those things.

Fortunately, some of these tools are increasingly being built into web browsers and into database systems, and that's going to be very helpful.

The next set of tools I wanted to talk about are the filtering tools, and the first of these are the cookie cutters. These are tools which can be configured to block all cookies or to selectively block cookies. Often users will want to allow cookies from some web sites, where they see themselves as actually getting some value from having the cookie, while at other web sites they'd rather not have the cookie.

Some of these tools also allow users to go ahead and accept the cookies and then review them later and decide which ones they want to keep and which ones they want to discard.

Besides being able to filter cookies, a lot of these tools also have the ability to filter out the referer header and some of the other browser chatter that we talked about.
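(Illustration: a sketch of a cookie cutter's decision rule -- keep cookies from sites the user values, block third-party cookies such as an ad network's, and hold the rest for later review. The site names and the rule itself are invented for illustration.)

    ALLOW = {"my-newspaper.example"}      # sites whose cookies the user keeps
    BLOCK_THIRD_PARTY = True

    def handle_cookie(setting_host, page_host, cookie):
        if setting_host in ALLOW:
            return "accept"
        if BLOCK_THIRD_PARTY and setting_host != page_host:
            return "block"                # e.g. a banner ad's cookie
        return "hold for review"          # user can keep or discard it later

    print(handle_cookie("adnet.example", "news.example", "id=u42"))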
Hang on a second. We'll be back with the presentation.

(Pause.)

(Screen.)

Okay. The other kind of filter, besides the cookie cutter, is child protection software. Most of this software was developed primarily to allow parents to filter out material that they felt was inappropriate for their children. But a lot of this software also has a feature where parents can block the child from sending privacy-sensitive information on the Internet. So, for example, they can set it up so that the child's name or phone number won't be sent on the Internet. They can also limit whom a child can e-mail or chat with.

(Screen.)

Another type of tool that we talk about is identity management tools. Sometimes these are referred to as infomediaries. Some of these companies refer to themselves as infomediaries and some of them tell me that they absolutely aren't infomediaries, so I'll just call them identity management tools.

Basically, the idea is that these are tools that help people manage their online identities. They generally offer some sort of an electronic wallet or electronic archive where I can type my information into some sort of secure storage and have my computer automatically send this information when I authorize it. So they'll automatically fill out forms or automatically send demographic information to web sites, but only when I have authorized it.

Some of them are essentially an opt-in to targeted advertising. Some of them will pay consumers for data. Some of them actually go and check for privacy policies at web sites and give consumers some indication of what kind of privacy policy the site has before they will release the data. There are a number of different examples, and they all have slightly different models of how this actually works.

One example is Persona. In Persona, the consumer fills out a profile with personal information in advance, and they indicate for each field when they want to allow it to be shared. Then Persona Valet has a tool bar that allows users some control over cookies and over when to provide data, and they are also planning on building in P3P so that they can give users alerts about web sites' privacy practices before data is released.

(Screen.)

There's another company called Privacy Bank that has an interesting system. Here, if I go to a web site that's equipped with Privacy Bank -- in this case, the Starbucks Coffee site -- and I want to order some coffee beans, I click on the Privacy Bank bookmark which I set up when I subscribed to Privacy Bank, and it pops up a window which provides a snapshot of what the Starbucks privacy policy is. There are little symbols that you probably can't see, but there are five symbols, and depending on which ones appear I have some idea of what their privacy policy is.

Then if I'd like to fill out the form, I can click on "My information" and drag it onto the form, and at this point it gives me an alert, because when I registered for Privacy Bank I indicated what my personal preferences are. There's a conflict here, and so it says: "This site does not meet your privacy preferences. Would you still like to fill out the form?"

I can say yes or no. I can also click on the button for policy details and get a much more detailed indication of what kind of data they collect and what exactly they do with it, and then I can make an informed decision.

(Screen.)

The next thing I'd like to talk about is the Platform for Privacy Preferences, or P3P. I'm just going to give you a brief overview here; it will be discussed more on the next panel. Basically, P3P is designed to give web sites an easy way to take their privacy policies and convert them into a standard machine-readable format. Once this is done, we can build web browsers and plug-ins and other tools that can automatically go fetch these machine-readable policies and read them for the user, compare the policies with the user's preferences, and alert the user when there are conflicts.

(Screen.)
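(Illustration: a schematic of what such a machine-readable policy might look like. The element names here are simplified, not the exact P3P 1.0 schema; the fields loosely mirror the vocabulary described next.)

    import xml.etree.ElementTree as ET

    # A simplified, P3P-like policy: who collects, for what purposes, who
    # receives it, what access is offered, and where the human-readable
    # version lives.
    policy_xml = """
    <POLICY discuri="http://www.example.com/privacy.html">
      <ENTITY name="Example, Inc."/>
      <ACCESS kind="contact-and-other"/>
      <DISPUTES service="http://seal.example/"/>
      <STATEMENT purpose="current admin" recipient="ours delivery"
                 data="user.home-info.postal"/>
    </POLICY>
    """
    policy = ET.fromstring(policy_xml)
    print(policy.find("ACCESS").get("kind"))   # a user agent reads it this way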
For a web site that wants to use P3P, it's fairly straightforward. First, they have to have a privacy policy. Then they need to translate it into P3P format, and there are a number of tools available to assist with the translation process. Then they take that translated policy and put it on their web site, and they create another file which indicates what parts of the web site the policy applies to.

So a web site can say: we have one P3P policy for the entire site; or they might have different policies for different parts of the site, and they can indicate which policy applies where.

(Screen.)

The P3P vocabulary basically has a number of different fields which capture what we felt were the important parts of a privacy policy that people would be interested in knowing: who is collecting data; what data is collected; for what purposes it will be used; whether there is an ability to opt in or opt out; who the data recipients are; what kind of access is provided; what kind of data retention policy there is; how disputes about the policy will be resolved, and that includes third party seals, laws, anything along those lines; and, finally, where the full human-readable policy is if I want to get more information.

(Screen.)

One of the advantages of P3P is that not only can I easily find out about a web site's privacy policy, but I can also find out if the web page has any other objects in it that might have different privacy policies. For example, here you see a page on the AT&T web site, and as a user I look at it and it looks like one whole page. But in fact it has an ad in it, and this ad is served from another company, so the privacy policy associated with that ad is the policy of that other company.

Using P3P, my user agent should be able to automatically discover this and check both privacy policies to make sure they match my preferences.

(Screen.)

I'm going to show you a prototype P3P user agent which was designed at AT&T in conjunction with Microsoft. This is a plug-in designed to work with Internet Explorer, and it allows users to configure their preferences. This is the preference configuration screen, which I don't expect you to actually read here, but you can see it's a fairly small number of questions. Users can use it to specify what their personal preferences are.

After they have set their preferences, whenever they go to a web site they can click on the privacy button and it will check the results. So here are examples at two different web sites. At the top web site, a warning comes up because there was a mismatch with the user's preferences, and it says: "This site does not allow you to find out what data they have about you." Basically, the site provides no access and I have said that I need access.

On the bottom screen, we're visiting a site where there are no warnings. We can see here that this site has a seal from TRUSTe. That's one of the things that comes up as part of a policy.

Now, if a user changes their privacy preferences and goes back to that same site, all of a sudden they have a warning that this site may collect data that identifies them for profiling. So now the user has said: hey, I actually don't want profiling, and that warning will come up.

(Screen.)
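(Illustration: the comparison step behind those warnings, sketched in a few lines. Field names follow the P3P vocabulary loosely; the preference flags and warning strings are modeled on the prototype's described behavior, not taken from its code.)

    policy = {
        "access": "none",                     # site offers no access
        "purpose": ["current", "profiling"],  # uses include profiling
        "disputes": ["seal"],
    }
    preferences = {"access_required": True, "no_profiling": True}

    warnings = []
    if preferences["access_required"] and policy["access"] == "none":
        warnings.append("This site does not allow you to find out "
                        "what data they have about you.")
    if preferences["no_profiling"] and "profiling" in policy["purpose"]:
        warnings.append("This site may collect data that identifies "
                        "you for profiling.")
    print(warnings or "No warnings -- the policy matches your preferences.")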
Another P3P user agent is by IDcide. It's called the Privacy Companion, and it's a plug-in that works with both Internet Explorer and Netscape and provides cookie management capabilities. It actually started out as a cookie management tool, and by adding P3P they've been able to give people much more fine-grained control over when to allow cookies.

It adds little symbols to the top of the browser bar: red when there's a P3P policy that's not acceptable, green when it is acceptable, and grey when there's no policy at all. Then, when I visit a site that's trying to set a cookie, if the privacy policy doesn't match, I can get information as to exactly why, and that pops up here.

Another tool is called YOUpowered Orby Privacy Plus. This is a toolbar that sits on the user's desktop rather than in the browser window, and it has similar features. It also has a trust meter. You can see it in the upper right-hand corner of the screen; it gives you a little visual indication of how close the policy is to matching your preferences, and you can click on it to get specific positive and negative flags to see where the web site may have problems or where it's doing good things. You can also get it to prompt you when web sites want to set a cookie and don't necessarily match your preferences.

(Screen.)

IBM has a tool that they're demonstrating here that lets web sites create their P3P policies. This is currently available from the IBM web site, and it's a drag-and-drop interface: somebody can pick up little elements on the screen which represent different kinds of data and drag them onto the other page to indicate "we collect this kind of data," and then fill out various information about each type of data and what is done with it.

IBM also has a number of templates, so rather than starting from scratch you can find --

(Screen.)

-- a template that follows similar practices to your web site and then just edit that to customize it for your web site.

(Screen.)

Another tool which allows web sites to create P3P policies is called PrivacyBot. It's actually a web-based interface: I go to their web site as a webmaster, fill out their lengthy questionnaire about my web site, pay -- I think it's $30 -- with my credit card, and create both a P3P policy and a human-readable privacy policy for my web site.

(Screen.)

Finally, YOUpowered, which I showed you a minute ago, also has a tool for webmasters to create their privacy policies.

(Screen.)

Now I want to mention a few other tools. As I said, these are sort of the miscellaneous ones. There are some privacy-friendly search engines; one example is TopClick. These are basically search engines which are committed to not using cookies and not tracking users, trying to be as privacy-friendly as possible.

Another type of tool is the computer cleaner. When you're surfing the web, all sorts of little files are created on your computer to keep track of things while you're surfing, and many of these files are no longer really needed once you're done. So there's a tool called WindowWasher which goes through and removes all of these files, thereby removing the traces of what web sites you visited while you were online. People say that it actually makes your computer run faster, too.
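(Illustration: what a computer cleaner does, reduced to its core. The directory layout here is invented; a real product such as the one described knows each browser's actual cache, history, and cookie locations.)

    import shutil
    from pathlib import Path

    browser_dir = Path.home() / ".example-browser"    # hypothetical layout

    # Sweep the files a browsing session leaves behind.
    for leftover in ("cache", "history.dat", "cookies.txt"):
        target = browser_dir / leftover
        if target.is_dir():
            shutil.rmtree(target)         # e.g. the page cache
        elif target.exists():
            target.unlink()               # e.g. history and cookie files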
(Screen.)

Another type of tool is one that facilitates access. We're going to hear today from a company called Privacy Right, which is focused specifically on access. So, for example, they have this tool: I can go to a web site that is equipped with it and indicate what kinds of data uses are acceptable, basically opting in and out of various things; then I can actually go and view what data they have collected on me and who it has been disclosed to and when, and I can view that online. I can also request that they send me a complete audit trail of where my data has gone.
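(Illustration: the access and audit-trail idea as a data structure. Recording every disclosure is what makes "who got my data, and when" answerable later; the schema and names here are invented.)

    from datetime import datetime

    disclosures = []    # one entry per release of a user's data

    def disclose(user, field, recipient, purpose):
        disclosures.append({
            "user": user, "field": field, "recipient": recipient,
            "purpose": purpose, "when": datetime.now().isoformat(),
        })

    disclose("jane@example.com", "postal address", "shipping partner", "delivery")

    # The audit trail the speaker mentions is this log, filtered per user.
    print([d for d in disclosures if d["user"] == "jane@example.com"])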
So that's pretty much the end of our whirlwind tour. I think I even managed to stay mostly within time. The final thing I want to leave you with is that we've seen a variety of different tools, but there's no one tool which is going to solve all the problems. Really, what we need is tools that can work together. Each tool has its strengths and has something it can contribute.

P3P tools are useful to help users understand privacy policies, but they don't deal with, for example, enforcing privacy policies. Seal programs can help with enforcement, and regulations can also help with enforcement. Anonymity tools and filtering tools can reduce the amount of information that I actually reveal, which is great when I don't need to reveal information; but if I want to actually go purchase something, there may be information that I do need to reveal, and then some of the other tools are going to be useful.

Encryption tools are of course useful for securing data both in transit and in storage. Finally, laws and codes of practice are going to do what tools can't, which is provide a baseline level of acceptable policies.

Thank you. And if you would like a copy of any of these slides, they are all available on my web site.

(Applause.)

MS. LEVY: Thank you so much, Dr. Cranor, for that extremely informative overview.

We're now ready for our first panel of the morning, P3P Implementation. I ask all the panelists to come up.

PANEL DISCUSSION: P3P IMPLEMENTATION

MS. LEVY: Our moderator for this panel is Elliot Maxwell, the Special Adviser to the Secretary of Commerce for the Digital Economy. Among other things, Elliot advises the Secretary on, and helps coordinate, Department of Commerce activities regarding electronic commerce and the Internet. I'm going to ask that Mr. Maxwell introduce his own panelists.

MR. MAXWELL: We are under time pressure and we are just going to hustle. Lorrie just did a wonderful job -- it's like having a whole panel to herself -- giving us an enormous amount of information about privacy in general. We're going to focus right now, in this panel, on P3P.

I will give a blast-fax introduction of the panelists, and then they will each have three and a half minutes. You will see a clock counting down; for the last 30 seconds it counts down like a NASA countdown, so they'll know that they don't have any more time.

We're going to try to have questions at the end, so we'll try to go very quickly. If we are able to do that, there are microphones in the aisles; if you will use those, we can have a recording of it.

I will give the introductions in alphabetical order, though that will not match the order of appearance, which I'll give quickly at the end. But let me just briefly introduce the panel: Marc Berejka is a Senior Federal Affairs Manager and Senior Corporate Attorney at Microsoft's corporate office in Washington, responsible for developing and advocating the company's positions on telecommunications issues and, in particular in this regard, in the area of privacy.

He chairs the Information Technology Industry Council's working group on telecommunications policy and the Information Technology Association of America's working group on information policy.

Karen Coyle is a digital library specialist at the University of California. She speaks and writes often on issues relating to cyberspace and is active with Computer Professionals for Social Responsibility, a public interest group based in Palo Alto. She has written critically about P3P, which she refers to as "pretty poor privacy," so she brings a particular perspective to this.

Deirdre Mulligan is Staff Counsel at the Center for Democracy and Technology and works with groups around the world on fair information practices and on strengthening individuals' control over their personal information.

Dan Jaye is CTO and Co-Founder of Engage, part of the CMGI Group, where he is involved in delivering interactive database marketing and information service products. He's also involved with a group facilitating communications about technology and privacy.

Ed Mierzwinski has been a consumer advocate with the Public Interest Research Group, U.S. PIRG, since January of 1989. He's a frequent participant in public policy forums, has testified often at both the state and federal levels, and is a member of the steering committee of the Trans-Atlantic Consumer Dialogue, which just recently met in Brussels.

Ron Perry is the CEO of IDcide. IDcide is a company that provides a bridge between privacy and profiling services to Internet users and tries to address the privacy needs of both e-businesses and consumers. They just released the Privacy Companion, a product that reports cookie-based tracking, automatically blocks third party cookies, and eliminates data spillage problems.

Mel Peterson of Procter and Gamble has served in a number of different positions with the company. In 1998 he joined the Interactive Team at P and G -- it's now called I-Ventures -- and he helped implement P and G systems to manage online privacy and develop P and G's global privacy guidelines. Effective this year, just about a month ago, Mel became P and G's global privacy manager, with responsibility for building and enhancing P and G's privacy management capabilities globally.

Martin Pressler of IBM is an advisory programmer at IBM's facility in Research Triangle Park. He has been deeply involved in P3P and is a co-author of the P3P specification.

Danny Weitzner, on his right, is Director of the World Wide Web Consortium's Technology and Society Activities, responsible for the development of technology standards that enable the web to address social, political, and public policy concerns; he, again, has been heavily involved in P3P. Before joining W3C he was at the Center for Democracy and Technology and, before that, at EFF.

So I think I got my three and a half minutes.
You won't hear much from me anymore. So, Danny, why don't you take it away.

MR. WEITZNER: Thanks very much, Elliot.

Because we have a big issue and a big panel, I'm going to speak in a sort of cryptic, skeletal way. Following Lorrie's presentation, which I think gave us all a good background on what P3P is and how it works, I really just want to make three very quick points about why we need P3P on the web, and three points about why I think it's actually going to happen.

First of all, I think there's a tremendous need for what Lorrie referred to as machine-readable privacy policies. Anyone who looks out on the web today and looks at privacy policies knows that they're relatively long and written, even for lawyers, in sometimes somewhat confusing language, and I think some tend to feel that this is maybe done to confuse or deceive users.

I actually think that to a large extent it reflects the fact that privacy practices, the practices of handling personal information on the web, are becoming increasingly complex. Therefore, I think we desperately need to put privacy policies in machine-readable format, in the same way that everything else on the web is in machine-readable format, so that the tools we use to access the web, the browsers and all the other user agents out there, can help us decipher these policies and make intelligent decisions about them.

Secondly, I think we badly need P3P because we need to harness the entrepreneurial energy that has made the web itself work to help users make intelligent decisions. To me, one of the most extraordinary things that's happened in the development of P3P is that, now that the standard is more or less complete, we're seeing a whole range of companies come out with a variety of products that give users a variety of different options about how to make choices about their personal information.

I think it's safe to say that the designers of P3P didn't even think about a lot of these products or these possibilities. But we need to provide that kind of flexibility to users, which in practice in the web arena is provided when we let software developers go out and do innovative things.

Third, I think that, for the reasons I've said, P3P really, in the words of Alexander Dix, who's a data protection commissioner from Germany, is necessary but not sufficient to guarantee online privacy. As Lorrie indicated, P3P is part of a broad array of tools, services, self-regulatory practices, and regulation that all have to come together to make privacy work. But P3P is really necessary for that.

I think there are three reasons, following those, why web sites will adopt P3P, why software developers will build P3P products, and why we'll really see this kind of enhanced user control over privacy on the web. Number one, web sites, particularly those trying to have a commercial relationship with users and sell them something, very badly want to ensure that users have a seamless browsing experience. They don't want users to be distracted by going off to find the privacy policies and taking however many minutes or hours it takes to understand them. Seamless browsing is critical.
Second, I believe that as the major browser vendors begin to integrate P3P into their products -- and two of the major browser vendors, Microsoft and Netscape, have announced plans to integrate P3P -- users will come to expect to see a P3P policy. I doubt very much that more than about a tenth of one percent of users will ever say, "I want my P3P," but they will see, as you saw on Lorrie's slides, an icon that's in grey instead of in green, and they'll wonder what's happening.

Third, I think that P3P will become a part of the web, and needs to become a part of the web, because web services are fundamentally becoming more complicated and more integrated into our lives in a variety of ways. The fact of the matter is that as we live our lives, whether online or off, we share and trade quite a lot of personal information, and users need a way to control that.

Certainly anonymity has its place. It's critical for protecting basic human rights and civil liberties. But in the world that the web is increasingly becoming part of, there is lots of personal information moving around, and users need control of it.

MR. MAXWELL: Danny, we're going to move on to the next person, because autocracy is the rule here.

Marc Berejka is next, from Microsoft, talking about this from the viewpoint of the browser manufacturer.

MR. BEREJKA: Thanks, and I will try to keep this on schedule --

VOICE: Microphone.

MR. BEREJKA: Can you hear me now?

If we get network connectivity, we'll have live connections to the tools that I'll otherwise show you graphically here.

MR. MAXWELL: Can we get that mike, please -- Marc Berejka's mike, please.

(Screen.)

MR. BEREJKA: So, as Lorrie and Danny have said, there's a basic chicken-and-egg issue here in delivering P3P. Microsoft looks forward to helping deliver, in the near term, both a chicken and an egg. We've been committed to P3P for some time. Our first tangible manifestation of that commitment was the development of a privacy statement generator. Again, we have that out in a mockup in the lobby, but if we have network connectivity it'll be live.

(Screen.)

If we could go to the next slide, you'll see that what's been the longstanding beta of this privacy statement generator -- a beta that's been running for 18 months now -- has gotten about 20,000 companies to use it and to walk through a basic questionnaire as to what the web site's basic information practices are.

What we do with the generator is try to guide web site operators to follow the fair information practices. This slide indicates that we're actually asking web site operators to consider how they provide access. At the end of the process, you press the "Done" button and it generates a sample privacy policy with some indicators as to where the web site operator might want to seek further guidance or where we believe information might be lacking.

Again, this longstanding beta has gotten 20,000-plus companies to use it. The game plan is to update this privacy statement generator so that it's compatible with the current version of P3P; that updated generator would be released in the December-to-January time frame.
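(Illustration: the generator's essential move -- turning questionnaire answers into draft policy text and flagging what is missing. The questions and wording here are invented; the actual questionnaire is far more detailed.)

    answers = {
        "collects_email": True,
        "shares_with_third_parties": False,
        "provides_access": None,      # the webmaster skipped this question
    }

    draft = []
    if answers["collects_email"]:
        draft.append("We collect your e-mail address.")
    if not answers["shares_with_third_parties"]:
        draft.append("We do not share your information with third parties.")
    if answers["provides_access"] is None:
        draft.append("[Needs attention: describe how users can access "
                     "the data you hold about them.]")
    print("\n".join(draft))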
The version that's up now is compatible with an April '99 or earlier version of P3P, so it's not quite up to snuff, but it is nonetheless useful for basic purposes.

In terms of the client experience, if we could go to the next slide --

(Screen.)

-- we also want to make that as simple as possible. Right now we have, as Lorrie showed, developed a basic tool -- you can see this better out on the screens in the lobby -- but what we're struggling with is how to simplify the P3P experience for the end user. In this mockup we ask individuals whether they only want to go to sites that collect data necessary for the processing of a specific request, whether they only want to go to sites that do not reveal their identity, and whether they only want to go to sites that do not identify them for profiling.

So we're trying to make these choices simple for the consumer, so that the process of enabling the browser to read XML statements is not difficult.

If we could go to the next slide --

(Screen.)

The next step, as Danny and Lorrie pointed out, is for the P3P-enabled browser to talk to the XML-enabled privacy statement. If we could go to the next slide --

(Screen.)

-- this is a sample, similar to what Lorrie showed, of what gets spit back. We ran this test live against Microsoft.com, and what came back was that Microsoft.com is a bearer of the TRUSTe seal and that there's one-click access to the Microsoft.com privacy policy. But you know what? Microsoft.com does use cookies. It's disclosed in the privacy policy that they do use persistent cookies to do some level of tracking.

What we're really looking for over the course of the next nine months or so, because this process is complicated, is input from interested people who want to help us in this boiling-up process to make the consumer experience of implementing P3P easy. The game plan is to release the privacy statement manager -- excuse me, the privacy manager -- for the consumer, integrated into the next version of the operating system, which is codenamed Whistler and which is due out some time next year.

MR. MAXWELL: Thanks very much, Marc. This is a world record: 11 slides in under 4 minutes. We appreciate it.

Next, Ron Perry, from the standpoint of those developing applications that try to serve individuals and increase their control over privacy.

(Screen.)

MR. PERRY: Thank you, Elliot. In compliance with the government regulations you set out here regarding timing, I'll rush through my three slides.

I wanted to give you our perspective on how we think P3P can help applications treat privacy better, give you a little overview of what we've done, and describe the possibilities that we think would be opened up by P3P. First of all, what we've done with P3P is to use it to simplify decisionmaking for the user. Instead of having the user read through a complex privacy policy, understand how the site uses the information it collects, and then decide whether cookies should be enabled for the site or not, we are trying to automate this decision. So the Privacy Companion with P3P reads the privacy policy for a site, analyzes it, tries to find out whether it matches the user's preferences or not, and decides whether to allow cookies from the site or not.
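(Illustration: the automated decision Mr. Perry describes, combining the policy-versus-preferences check sketched earlier with the cookie decision, so the user is never prompted. All names and logic here are illustrative.)

    def policy_acceptable(policy, prefs):
        # A stand-in for reading and analyzing the site's P3P policy.
        return not (prefs.get("no_profiling")
                    and "profiling" in policy["purpose"])

    def cookie_decision(policy, prefs):
        return "allow" if policy_acceptable(policy, prefs) else "block"

    site_policy = {"purpose": ["current", "profiling"]}
    print(cookie_decision(site_policy, {"no_profiling": True}))   # "block"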
We also show the user -- as Lorrie showed in her slides earlier -- a visual indication of whether the policy is acceptable or not according to that user's preferences.

(Screen.)

We see the major advantage of P3P in its opening up of a wide range of possibilities for application developers. The first type of application that we can see using P3P is privacy-enhancing technologies such as ours. But we also see many possibilities for other tools, like search engines that can take the privacy policy of a site into consideration.

The fact that a computer application can do something with a privacy policy really opens up privacy as a competitive advantage. So the major challenge we see in the development of P3P is enhancing its vocabulary to highlight the good practices used by various companies and really give them a competitive advantage -- turning privacy into a competitive advantage, something that is not possible today without such technologies.

Thank you.

MR. MAXWELL: Thank you, Ron.

Next will be Mel Peterson. Procter and Gamble is sort of the mother church of brand management, and so someone who takes on the job of worldwide privacy policies for a firm like that has a big challenge. Mel.

MR. PETERSON: We were sort of a guinea pig for P3P a few months ago. We participated in the June interop demos and created a P3P policy statement based on the privacy policy that we display on the pg.com web site. Neither I nor the developer who worked on this had even read the P3P spec ahead of time, so we were a pretty good test of what it takes for a company to actually do this. And we didn't cheat and use one of the nice generators that are out there, either. We wrote the code by hand.

The net result was that it was very easy to do and very inexpensive. Over the course of a couple of weeks, we spent maybe three elapsed days, or three days of effort, to create the P3P statement, learn what it took to do that, debug it, and put it up on the site. So it's a very straightforward thing for content providers and web sites to do.

(Screen.)

I was asked to speak to how other content providers and web sites are going to react to P3P. I would make four points. The first is simply that my suspicion is that most web sites and content providers haven't spent a lot of time learning about P3P yet, and we need to get the word out that this is not an expensive, time-consuming thing to do, particularly if you create a single policy statement in P3P.

Secondly, while it was easy to create and write the code, companies do need better guidance on how to implement P3P. For example, we heard earlier that it does make sense in some cases to have multiple P3P statements for your web site. That's counterintuitive to most companies; you think you want to have one statement and keep it simple. And, by the way, that's more supportable as well. Companies need guidance on how to implement this in a supportable way, and also in a way that's most helpful for consumers. I think we're still learning that.

As far as what else will spur companies to move forward, certainly seeing a critical mass of consumers out there with the capability to use P3P, and evidence that they're using it, will spur them on. What gets measured gets done.
25 Similar to the kinds of measures that were publicized 0048 1 about privacy statements in the past, we need to do the 2 same thing with P3P. 3 Secondly, organizations that are interested in 4 moving this forward need to create sort of their own 5 project management around this. How many of the Fortune 6 500 companies, how many of the top 100 web sites have done 7 this? If they haven't, call them up and find out why not. 8 Thank you. 9 MR. MAXWELL: Great. 10 Ed, from a consumer standpoint do you want to 11 comment about what you've heard? 12 MR. MIERZWINSKI: Thank you, Elliot. The Public 13 Interest Research Group has views that represent those of 14 consumers, but more and more consumer protection has come 15 to take into account issues such as privacy and the 16 development of fair information practices. Quite simply, 17 from our perspective we don't feel that the notice and 18 choice provisions enacted by the code of the machine are a 19 substitute for the full panoply of fair information 20 practices, which includes so much more than notice and 21 choice. Those practices were originally embodied in the early 1970's 22 by the HEW task force, which led to the inclusion, in most of 23 the U.S. laws governing U.S. government uses of 24 information, of a set of practices that says: collect the 25 least information possible, collect it suitable for a 0049 1 specific purpose, give consumers control over their 2 information, not choice but consent-type control over 3 their information, ensure that the information is 4 protected by security standards, give the consumer the 5 right to correct the information and to know all about the 6 information in the database about him or her. 7 I quite frankly don't see how P3P meets the fair 8 information practices test. The notion that notice is a 9 privacy policy is patently absurd, and the notion that 10 notice and choice, what I call FIPS Lite, which is all 11 most industry groups want to give us, are adequate for 12 what the American public are clamoring for -- and this is 13 the American public that doesn't only include the members 14 of the so-called liberal groups such as the ACLU or the 15 Nader groups -- we're not really a Nader group, but 16 everybody thinks we are -- such as my group, but also 17 includes organizations such as Phyllis Schlafly's 18 organization, the Eagle Forum, very conservative 19 organizations. Senator Shelby is aligned with Congressman 20 Markey supporting bills that represent really strong fair 21 information practices. So it's a broad section of the 22 public that wants privacy protection.
You can't surf when you've got 12 those cookie pop-up windows coming up all the time, and so 13 people are going to give up on high privacy protection 14 under P3P and they're going to end up with low privacy 15 protection, and that's what the industry wants and that's 16 disappointing. 17 I think its supporters are going to say P3P INP, 18 P3P is no panacea, but it's something we can do now. It's 19 not good enough and it's going to prevent the development 20 of better privacy protection policies that enhance 21 anonymity, and that's why we don't like it. 22 Thanks. 23 MR. MAXWELL: Karen. 24 MS. COYLE: Now we need this one on. 25 Notice is an important feature of any privacy 0051 1 program and it is notice that is addressed by P3P. 2 However, as you just heard, notice does not by itself 3 provide any amount of privacy. With P3P it's like we now 4 have the axle, but we are still lacking the wheels, the 5 cart, and the horse. We do not have a privacy solution. 6 Proponents of P3P claim that the notice provided 7 by web site privacy policies gives Internet users a 8 choice. This presumes that there will be comparable 9 services that differ significantly only in their privacy 10 statements, and I see no indication that this will be the 11 case. 12 The invasion of privacy is deeply entwined 13 with the reliance on advertising for revenue. In a highly 14 competitive environment like the Internet today, the 15 winners are all under the same pressure to play the 16 customer profiling game. In a world of information, what 17 is, after all, a comparable product? If I want to read 18 the New York Times on line, but I do not want to give out 19 information about myself to do so, reading another 20 newspaper that doesn't require me to sign up, say the San 21 Francisco Chronicle, definitely does not give me the same 22 content. 23 Unlike other products, information resources 24 tend to be unique. As a matter of fact, that uniqueness 25 is encouraged by our copyright laws. Where will the 0052 1 reader turn for a choice? 2 But even worse in this approach is that it 3 places the burden on the Internet users to 4 essentially shop for their own privacy. I believe that 5 privacy should be a right, not a bargain hunt. I'm 6 dismayed when P3P is touted as a solution. It's only when 7 we create the rest of the vehicle that we will actually 8 enhance privacy on the Internet. There are a number of 9 commercial products that now address this issue. 10 But my hope is that we'll turn our attention to 11 the root of the problem and implement a baseline of 12 privacy that is the default for all users and in all 13 situations. It's only then that privacy will be a right 14 and not a privilege enjoyed by the technologically elite 15 few. 16 Thank you. 17 MR. MAXWELL: Deirdre. 18 MS. MULLIGAN: I hope you all aren't getting the 19 double echo that we are getting up here, because I've heard Karen 20 and Marc's presentation in like stereo from three 21 different directions. 22 I am in a position which is not as unusual as it 23 may seem from the panelists. I'm a privacy advocate and I 24 fully support P3P. I completely agree with people on both 25 sides of this issue if you could identify two of them, but 0053 1 there actually aren't. P3P is clearly not the silver 2 bullet, but I don't think you've heard anybody up here 3 suggest that it is. 4 P3P is a very positive step in the direction of 5 what Karen has called notice, Danny has called machine- 6 readable access to information, and what I call 7 transparency.
I can tell you, if I'm expected to shop 8 with my feet and identify good privacy choices for myself 9 and be an engaged consumer, certainly the ability to 10 access information in an easy fashion that doesn't burden 11 me with having to read the fine print -- I'm sure you've 12 all tried to look at the back of your Fair Credit 13 Reporting Act notice on your credit card and that's really 14 easy to do -- steps that promote transparency and that 15 enable consumers to diminish some of the costs of 16 protecting their privacy I think is a very positive step. 17 We've seen numerous surveys that say consumers 18 care, not a tiny little bit, but feverishly about their 19 privacy. But it's very, very difficult for them to gain 20 access to the information needed to take steps to protect 21 it. One of my favorite stories is Elliot Spitzer, who is 22 the attorney general in New York State, talked about three 23 of his senior staff attorneys, one of whom is a good 24 friend of mine, spending an awful lot of time trying to 25 decipher what a privacy statement meant at a major portal 0054 1 site. 2 I think if three very smart attorneys with a 3 background in privacy can't understand a privacy policy, 4 which is supposed to be at least the initial step in 5 figuring out whether or not they want to exchange 6 information with a business, that we have a real problem. 7 Does P3P undermine privacy? I personally don't 8 think so. I think we've seen more privacy activity at the 9 state and federal level in the past two years than we have 10 in a very long time. Most of our privacy laws are dated 11 from the 1970's. We've seen increased pressure on the 12 private sector to develop standards. There is certainly 13 an awful lot of effort needed to ensure those standards 14 are actually a race to the top, not a race to the bottom. 15 But I think that when we think about a little 16 sunlight as both disinfectant and also as motivation, that 17 transparency that P3P can bring is part of that sunlight. 18 So I think that we have to continue to say that this is a 19 step forward. It's clearly not the horse and the cart. 20 But to suggest that it's a rock in the road and not an 21 axle I think really undermines all of our efforts to move 22 forward. 23 MR. MAXWELL: Dan. 24 MR. JAYE: As I'm here to speak somewhat from an 25 industry perspective, first of all I totally concur with 0055 1 Deirdre that we have to look at progress towards a 2 solution as not being inherently evil because it 3 undermines the cause that we need to make bigger efforts 4 to get to a final solution. You never get to the final 5 destination if you don't start making steps forward, and I 6 think that, once again, everyone is in agreement that P3P 7 is a mechanism for making it easier to understand and 8 process notice and to enable tools that actually will make 9 it easier to support the other fair information practices. 10 So it's a step forward, it's not a complete 11 solution. But I think one of the most important aspects 12 of P3P is just the discipline it imposes upon the industry 13 and policymakers to codify privacy practices. The P3P 14 vocabulary is enormously important because the process of 15 going through, creating a P3P policy statement, refines a 16 company's understanding of exactly what it is committing 17 to. 18 It's also turning out to be enablers of many 19 other things. So for example there's another standards 20 effort going on called CP Exchange. 
Actually, I have high 21 hopes that CP Exchange will be a mechanism for addressing 22 another major concern in privacy, which is onward 23 transfer. A site may make many representations to a 24 consumer, but once that site transfers data to a third 25 party, even with the best intentions and contracts, the 0056 1 horse is out of the barn. 2 I think the ability to have additional 3 standards that leverage the initial work of P3P to, for 4 example, bind consumer data, encrypt it, and tie it with 5 strong semantic information about what can and cannot be 6 done with that information, such as you can use this 7 information to ship a product, but you can't use it for 8 anything else, will be very useful, but once again not 9 complete solutions. 10 Once again, the final area that makes P3P very 11 important is that, as a company that has tried to find the 12 balance between consumer privacy and the marketer's need 13 to be effective, to have an advertising-supported Internet 14 economy, we've tried very hard to follow solutions 15 that basically don't require us to need to know who an 16 individual is. 17 You can market effectively without knowing who 18 an individual is, but just understanding certain 19 preferences. You have to minimize data, you have to 20 follow specific practices. Without a solution like P3P, 21 there's no ability for a company like ours to distinguish 22 ourselves from other companies who have very intrusive 23 data collection approaches. 24 MR. PRESLER-MARSHALL: Thank you. Is this one 25 on? 0057 1 I'd like to point out that the World Wide Web is 2 part of the real world and users have an expectation when 3 they are dealing with an organization that they want to 4 have trust in that organization. That organization may be 5 a corporation, that organization may be a government, that 6 organization may be a nonprofit organization. But in 7 any case, individuals want to have trust in an 8 organization that they're working with. 9 As part of that process, they should expect to 10 know what an organization is going to do with their 11 information. So if I come to a web site I want to know, 12 okay, what is this organization going to do with my 13 information. For a web site, you've got to be part of 14 establishing that trust regime. 15 In any case, whatever kind of site it is that 16 you're running, you do need to be able to establish that 17 trust relationship so that you can see that, so that you 18 can see that person come back to you again and again. 19 P3P comes into this picture because P3P gives 20 users useful, actionable, understandable privacy 21 statements. It lets people actually quickly and easily 22 understand what a site is going to do, what information 23 the site is going to make use of, and what choices the 24 user has in interacting with that site. 25 P3P is also very useful for the web site as 0058 1 well. It is useful in that, as Dan pointed out, by 2 codifying your site's privacy statements you understand 3 what it is that you're living up to. In many 4 organizations that can be complicated. A large company 5 has very many different branches or arms or portions of 6 the company that may be interacting with users in 7 different ways. If a company is going to make a unified 8 statement, P3P can help them understand, this is our 9 statement, and then the applications that are making use 10 of information can be written to back up that statement.
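As a rough illustration of that last point -- the names here are hypothetical and the structure is simplified, not part of the P3P specification -- once the unified statement is codified as data, the uses an application makes of information can be audited against it automatically:

    # The company's codified statement: which purposes each category
    # of data may serve (P3P-style names, simplified).
    company_statement = {
        "user.home-info.email": {"current", "contact"},
        "dynamic.clickstream":  {"admin", "pseudo-analysis"},
    }

    def covered_by_statement(data_ref, purpose):
        """Return True if the published statement covers this use."""
        return purpose in company_statement.get(data_ref, set())

    # Uses logged by one of the company's internal applications.
    logged_uses = [("user.home-info.email", "contact"),
                   ("dynamic.clickstream", "individual-analysis")]
    for data_ref, purpose in logged_uses:
        verdict = "ok" if covered_by_statement(data_ref, purpose) else "NOT IN STATEMENT"
        print(data_ref, purpose, verdict)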
11 When you put statements in machine-readable 12 format, you can process them automatically at a web site 13 as well or at a corporation as well, so that corporations 14 can really enact what they say they're interacting -- 15 they're enacting, excuse me. 16 P3P is also useful in that it is implementable 17 and deployable at web sites at a reasonable cost. As Mel 18 pointed out, it doesn't require that you reprogram your 19 web site, it doesn't require that you replace large 20 amounts of infrastructure at your web site. P3P is 21 realizable and that's very important for a lot of web 22 sites. I've heard tell that one web site managed to 23 deploy P3P in ten minutes based on an existing privacy 24 policy. I was very impressed by that number. 25 Last of all, I want to point out that this is 0059 1 P3P Version 1 and web protocols, networking protocols and 2 software all will evolve based on the needs of their 3 users. In this case, those users are end users, those 4 users are web sites and corporations. This will move 5 forward as it is needed, and I look forward to seeing 6 what's going to happen with it in the future. 7 MR. MAXWELL: Thanks very much to all of you. 8 It's really heroic to try in this period of time just to 9 try and sort through some of the basic issues. I'd like 10 to open it up to members of the panel if they have 11 comments that they would like to make. 12 I have one question. If we sort of stipulate 13 for the moment that it's not a solution and if I stipulate 14 for the moment that it's a tool and if we stipulate for 15 the moment -- and we don't have to believe this, but -- 16 that everybody here is interested in increasing 17 individuals' control over their own information, one of 18 the issues -- I look at this audience and say, how many of 19 you have made adjustments to the settings of your browser? 20 How many? 21 (A show of hands.) 22 So call it in this case about two-thirds of an 23 audience which is interested enough to come to the 24 Department of Commerce for a session on privacy 25 technology. Now let's assume for the moment that we have 0060 1 a much broader audience, 100 million people in the United 2 States, 300 million people around the world. What can we 3 do to give people sufficient encouragement, tools, 4 simplicity, ease of whatever, to make this tool useful, to 5 make sure that people feel comfortable with it, that if 6 they choose to approach it this way that they will say, I 7 can do it and I can do it in a nanosecond and it's easy 8 and it's just like buttering the toast in the morning? 9 How are we going to make it possible for people to really 10 make use of this tool if it does give people a greater 11 sense of empowerment? 12 MS. MULLIGAN: I direct people to something 13 that's called the P3P Guiding Principles. I think Elliot 14 is highlighting the importance, A, of educating consumers 15 about the existence of tools and the fact that they can 16 use them. There are some efforts under way kind of in the 17 broader technology community, but also efforts like this, 18 to educate the public. 19 But it also really emphasizes the focus on how 20 does the product come out of the box, what does it look 21 like, what are the defaults, how configurable is it, how 22 obvious is it to the consumer? And because this is what 23 we'd like to call, I think, a social protocol, it's not 24 just about technology, it's about a pressing social issue 25 -- privacy. 
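One way to picture the out-of-the-box question raised here is as the settings a P3P-based product might ship with. Everything below is an illustrative assumption -- the field names come from this discussion, not from the P3P specification or any shipping product:

    # Privacy-protective factory defaults: nothing leaves the machine
    # without an affirmative user action.
    factory_defaults = {
        "block_sites_without_policy": True,    # no readable policy, no cookie
        "accept_third_party_cookies": False,
        "auto_release_wallet_data":   False,
        "confirm_before_transfer":    True,    # require an explicit extra step
    }

    def release_data(settings, user_confirmed):
        """Release stored data only after an explicit confirmation."""
        if settings["confirm_before_transfer"] and not user_confirmed:
            return "withheld: waiting for explicit consent"
        return "released"

    print(release_data(factory_defaults, user_confirmed=False))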
0061 1 There was a lot of thought within the P3P 2 working groups about how to give guidance to web sites, to 3 implementers, about when they were designing products how 4 they should think about designing them, make sure data 5 isn't transferred unless a consumer explicitly wants it to be. 6 Now, P3P doesn't transfer data, but P3P might be built 7 into a product that does. It could be in a product that 8 provides for anonymization, a product that reads P3P 9 statements, and a product that provides a wallet. 10 Well, the guidance there is make sure that, even 11 though the P3P policy has been read, that the tool doesn't 12 automatically blast consumers' data away from them without 13 actually requiring some affirmative steps. It's not a 14 one-click, it's a two-click. If you look at the guiding 15 principles, there's a lot of direction. Some of it's 16 shoulds, some of it's musts, some of it's "we'd really 17 like you to." But it's a lot of forward thinking about how 18 the product should come out of the box. 19 I think that there's going to need to be a lot 20 of vigilance from consumers and people who care about 21 privacy in evaluating products and making sure that they 22 actually meet the goals of advancing privacy. But I think 23 it's a process and it's a very iterative one and we're 24 just in the beginning of that. 25 MR. MAXWELL: Marc, then Danny, then Karen. 0062 1 MR. BEREJKA: I would just like to second 2 Deirdre's point about the importance of defaults and also 3 reiterate my invitation to interested parties to work with 4 Microsoft on our P3P implementation for the operating 5 system. We know that there are a lot of conflicting 6 tensions, but one of the things that Microsoft does care 7 about enormously is the user experience. We bring people 8 in off the street and we have them sit down and we have 9 them hack around and we have them -- we get real consumer 10 feedback. 11 We also run public betas and try to get a lot of 12 feedback that way. My ultimate point is that this is not 13 going to be an easy balancing act, but at least from our 14 perspective it's a balancing act that we look to 15 accomplish within the very real near-term. 16 MR. WEITZNER: So you didn't ask people how many 17 buttered their toast in the morning. That should have 18 been the question. 19 But I think that the real question you should have 20 asked, or rather the question that you should ask to the 21 audience of web users out there, is how many of you set a 22 preference in AOL or whatever Internet service provider 23 that you use? Granted, most people don't go and set their 24 browser preferences. However, people develop expectations 25 about their browsers. 0063 1 People, I think through a whole lot of public 2 information effort and just common sense, are reluctant to 3 enter credit card numbers when they don't see that little 4 lock icon closed. That's not because they know what 5 SSL is or because they went and changed some settings or 6 because they downloaded more security. It's because, 7 through a pretty complex process, which we obviously all 8 have to come to understand better, their expectations 9 changed as a result of technology tools that were 10 available to protect them. 11 That's the kind of dynamic I think we're looking 12 for, and I think based on the experience with tools like 13 SSL we can expect to be successful. I think it takes a 14 lot of effort, no question, and I don't think we should 15 just assume that people are going to become technological 16 geniuses.
But I think there is evidence that people 17 gradually do use new technology when it actually offers 18 them something. 19 MR. MAXWELL: Karen. 20 MS. COYLE: Well, I think there's even a prior 21 question that we need to ask here. The Pew study which 22 Secretary Mineta alluded to basically says that a large 23 number of people are concerned about their privacy, but a 24 huge number of people, although a majority have heard 25 about cookies, they have no idea what they are. 0064 1 I think the first question is who is going to 2 educate the Internet users as to how it actually is that 3 their privacy is being invaded, because it's only with 4 that information that they will have the knowledge to turn 5 to something like P3P. 6 MR. MAXWELL: Martin. 7 MR. PRESLER-MARSHALL: Elliot, I'll give you a 8 direct answer to one of the questions you asked, which is 9 how can we help see this get rolled out and then be usable 10 to individuals. There are a wide variety of opinions 11 among the people in the world as a whole about what 12 constitutes an acceptable level of privacy. There are a 13 large variety of opinions within my own household as to 14 what constitutes an acceptable level of privacy. 15 Part of this discussion is to give individuals 16 that flexibility. There's a need for organizations to be 17 able to, for public interest organizations to be able to 18 express not necessarily defaults, but settings: This is 19 what we believe is an acceptable level of privacy. So if 20 you happen to be in the Ralph Nader camp, you may go to an 21 organization, a similar organization, and see, these are a 22 reasonable set of settings. Or if you happen to be more 23 interested in getting highly personalized content on the 24 Internet, then maybe you can find different settings. 25 So this really needs to be a broad reach, with 0065 1 many people offering opinions, because there are a wide 2 variety of opinions within the United States and around 3 the world. Tim Berners-Lee, inventor of the World Wide 4 Web, is fond of reminding people that the first "W" in 5 "WWW" stands for "World," and there's a lot of opinions 6 out there and we need to make sure that those are all 7 supported. 8 MR. MAXWELL: Thank you all very much. I'd like 9 to be able to take questions from the audience if you have 10 them. The only thing that you have to do is to respond to 11 the question about buttering. 12 If you'd identify yourself, please. 13 MS. WOODARD: Yes, my name is Gwendolyn Woodard 14 and I would like to thank each of you for the information 15 on the tools that consumers have to work with. 16 However, I would like for you to talk about how these 17 tools will work on a voice browser for individuals who 18 have physical challenges, and just could you talk about 19 that issue. 20 MR. WEITZNER: Let's see. It's an excellent 21 question and I'm not going to be able to give you a 22 complete answer. You're probably familiar with the World 23 Wide Web Consortium's web accessibility initiative. One 24 of the benefits of having machine-readable privacy 25 policies that I didn't mention is of course that once a 0066 1 browser or a user agent has overall abilities to 2 accommodate people's different disabilities, that browser 3 can then present the information in the way that the user 4 is comfortable with once it's encoded in machine-readable 5 format.
6 So the fact that, as Dan said, this is 7 semantically structured information, the browser that 8 knows to send that information through a braille reader or 9 not to present it in image format but to do it in some 10 other format will be able to make those accommodations 11 much more effectively than a policy that's just written 12 out in English. 13 I'd also point out that the fact that these are 14 machine-readable policies means that they can be presented 15 in any number of natural languages that the user happens 16 to be comfortable with. 17 MR. WEITZEL: David Weitzel from Mitretek. 18 We're a systems engineering nonprofit here in the suburbs. 19 My question relates to a report or reports that came out 20 of GAO last week about the United States Government and 21 its interaction with the citizens. Is there a role here 22 for the government to step up quickly to be an early 23 adopter and to lead the way as a good Internet citizen in 24 its interaction with the American populace? 25 MR. MAXWELL: Rather than shunting that question 0067 1 to any of the other people who might be willing on the 2 panel to answer it: Yes. The Commerce Department is 3 committed to having its pages P3P-compliant. It's already 4 well on its way to do that. 5 That report from the GAO I think was very 6 troublesome, not because of what it concluded, but because 7 of sort of what it asked and what it ignored. What it 8 ignored was something that Ed mentioned, which essentially 9 is since the early seventies the government has been under 10 the restraints of the Privacy Act and we're all very 11 pleased that it does. So the web sites that were queried, 12 were queried on the basis of a set of principles that were 13 not appropriate for the question. 14 So we feel, I think, in the administration quite 15 proud of the steps that have been taken to increase the 16 privacy protections of the citizen and to be able to work 17 with consumer groups, with the industry, to think about 18 how to increase individual control over information. That 19 was I think a sort of fairly bum rap. But sort of the 20 easy answer to the question is we are committed to P3P 21 implementation here and we're on our way. 22 Over here. 23 MR. CLARK: Drew Clark with National Journal's 24 Technology Daily. 25 Mr. Jaye mentioned that P3P can help you compete 0068 1 on privacy, on strong privacy protections. But Mr. 2 Mierzwinski and Ms. Coyle seem to say that there really 3 won't be competition on privacy within certain industry 4 spheres. I'd like to understand exactly how privacy -- 5 how P3P will help you compete on privacy and to get the 6 viewpoints of some of the other panelists on whether P3P 7 will facilitate market competition on better privacy 8 policies. 9 MR. JAYE: Thank you. First of all, the ability 10 to conduct advertising on the Internet requires certain 11 mechanisms like cookies to be able to, for example, when 12 somebody clicks on an ad send them to the appropriate web 13 site, so we can basically staple together the resulting 14 page and the ad together, in Lorrie's analogy. In 15 addition, advertisers want to know how many visitors saw 16 an ad. So there are statistical purposes that are not 17 privacy invasive, that are not used to make decisions 18 about individuals, for which this information is used. 
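A minimal sketch of that kind of non-identifying measurement (illustrative only, not the speaker's actual ad-serving system): a random cookie token lets repeat visitors be counted without carrying any identity.

    import secrets

    impressions = {}   # random token -> number of ad views

    def serve_ad(token=None):
        """Count an ad view. First-time visitors get a freshly minted
        random token; nothing ties the token to a person."""
        if token is None:
            token = secrets.token_hex(8)
        impressions[token] = impressions.get(token, 0) + 1
        return token

    t = serve_ad()     # first visit mints a pseudonymous token
    serve_ad(t)        # a repeat visit increments the same counter
    print(len(impressions), "unique visitors,", impressions[t], "views by this one")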
19 To be able to express that that's what we're 20 doing in certain cases and to be able to do that, because 21 we've been able to express it and distinguish ourselves 22 from unknown policies, is an enabling capability for an 23 advertising-supported model. 24 But specifically with regard to privacy and 25 competitive advantage, major brands respond to consumers. 0069 1 Consumers care about privacy. Major brands and 2 advertisers don't want to be associated with bad actors. 3 Being able to maintain a position that is strong with 4 regard to privacy has been a competitive advantage. It's 5 a position we've taken for five years, and it's our story 6 and we're going to stick to it because it's working very 7 well for us. 8 MR. MAXWELL: Because of time, let's just turn 9 to the last question. 10 MR. STAMPLEY: Dave Stampley from the New York 11 AG's Office. 12 For anybody who is of the mind that there should 13 be certain defaults and that the best thing to do for 14 consumers would be to recognize those defaults and hand 15 them to consumers, I guess my question is: is there such a 16 thing as an identifiable default privacy value that in 17 fact is not a choice or value judgment itself that might 18 usurp some other consumer's power? And might it be more 19 important instead to focus on where is a good baseline and 20 should all persons collecting information be obligated to 21 provide abilities to then vary from that baseline or let 22 consumers set their own preferences? 23 I'm just curious if there is a sense that there 24 is a way we could say, this is privacy and we know what it 25 looks like and we'll set it here at this point in time. 0070 1 MS. MULLIGAN: I think it's a great question and 2 I think it's a place to distinguish between substantive 3 defaults and process defaults. What I mean is, my 4 decisions about privacy and what would be an appropriate 5 disclosure of information are going to vary depending upon 6 the situation. 7 For example, if I'm trying to get a driver's 8 license on line, A, they're covered by the Privacy Act. 9 B, I have no choice. They're not going to give me the 10 driver's license unless I give them two forms of ID. For 11 me in that situation, that's going to be acceptable 12 privacy. 13 Now, if CVS said that they're not going to let 14 me buy Rolos without two forms of ID, that's not going to 15 be acceptable. That's a substantive privacy decision. 16 I think some of the process privacy decisions, 17 for example we were talking about does the information get 18 transferred without the consumer's affirmative action, 19 that same process I think you could apply in both 20 situations. So even if I'm applying for a license, I 21 should have to actively hand over my two forms of ID. 22 They shouldn't automatically be sucked out of my computer. 23 So I think there may be some areas, we may have 24 substantive defaults that we think apply in commercial 25 transactions, we may have substantive defaults that we 0071 1 think apply in government transactions. I think it would 2 be very, very difficult and it's a whole other process to 3 create those. 4 I think Martin made an excellent point: 5 Consumers are going to feel differently because they have 6 different concerns and they have had different 7 experiences, and it's not going to be big broad cuts, you 8 know, commerce, government. It's going to be certain 9 companies that I already do business with, or you're going 10 to have lots of variety.
11 But I do think, on the process defaults, that 12 there is some progress to be made in thinking 13 about how to implement some of those in a broad way across 14 different kinds of implementations. 15 MR. PETERSON: P3P is available, the client 16 tools at least in beta are available, to make consumer 17 research very possible. So starting with what we know or 18 what we think we know and then putting it in the hands of 19 consumers worldwide, not just in the U.S., and seeing how 20 they react and whether it's useful or not is something we 21 ought to definitely be doing to really decide where these 22 things ought to net out. 23 MR. MAXWELL: I'd like to thank the audience for 24 their questions. I'd like to thank the panel. It's 25 really a quite extraordinary group of people who I think 0072 1 have worked very hard on this issue. 2 This privacy technology is only one piece of the 3 puzzle. It's clearly only one part, and when people sort 4 of, I think, look at this and see it's sort of this or 5 that, it's really not the kind of sophisticated analysis 6 that you'd expect. It's about law, it's about self- 7 regulation, because self-regulation is just another word 8 for what the companies will be doing for themselves and 9 how they will think about it. It's about consumer 10 awareness and education. It's about how the technology 11 provides tools. 12 I think when we put all of these together, while 13 there will be differences, I think we all are committed to 14 exploring the issue of how can we give people more control 15 over information about themselves so that we can in fact 16 harvest the incredible technology that is available to us 17 now. I think everybody working here makes a huge 18 contribution to that effort. 19 So thanks again for your time and effort and for 20 your attention. 21 (Applause.) 22 MS. LEVY: Thank you for your insights on P3P. 23 Thanks for the panel. 24 I'd like to now invite everyone to go to a 15- 25 minute break. I invite you out to the lobby to see the 0073 1 exhibits and to enjoy some refreshments being hosted by 2 the Internet Education Foundation. Our next panel will 3 begin at 11:15. 4 (Recess from 11:01 a.m. to 11:28 a.m.) 5 MS. LEVY: Good morning. We're going to get 6 started with our second panel this morning. This panel is 7 on the role of privacy -- 8 VOICES: Your mike's not on. 9 MS. LEVY: It's not on? Can I try to get a mike 10 here? 11 (Pause.) 12 PANEL DISCUSSION: IMPLICATIONS FOR 13 FAIR INFORMATION PRACTICE PRINCIPLES 14 Hello, can you hear me now? Is that working? 15 We just want to start the second panel this 16 morning. It's going to be on the role of privacy- 17 enhancing technologies and the fair information practice 18 principles. We're pleased to have as our moderator for 19 this panel Dr. Lorrie Faith Cranor, who spoke this 20 morning and was introduced by Assistant Secretary Rohde. 21 So I'm going to let Dr. Cranor go forward. Thank you. 22 DR. CRANOR: Thanks. Let me start by 23 introducing the panelists and then we are going to -- I 24 think it's on. We are going to go through a series of 25 questions and answers, rather than having the panelists 0074 1 each give a presentation. So let me go through the 2 panelists. They're all here now. 3 First, we have Brian Adkins, who is Director of 4 Government Relations for the Information Technology 5 Industry Council, where he handles privacy, intellectual 6 property, and other e-commerce issues.
He is also Co- 7 Chairman of the Privacy Leadership Initiative's Technology 8 Working Group. 9 Next we have Scott Beechuk. He is Co-Founder 10 and CEO of Privacy Right, Inc. His previous technical 11 career in embedded systems design, object-oriented 12 programming, and engineering management served as the 13 basis for his deep interest in and understanding of 14 Internet privacy and security technology. 15 Next we have Glee Cady, who is Vice President of 16 Global Public Policy for Privada, Inc. She brings over 20 17 years of technology and Internet experience as a respected 18 author, educator, technology executive, and policy 19 adviser. 20 To my left is Caitlin Halligan, who is Chief of 21 the New York Attorney General Elliot Spitzer's Internet 22 Bureau. The Bureau coordinates statewide law enforcement 23 -- ooh, now I can really be heard -- statewide law 24 enforcement efforts regarding online consumer fraud, 25 privacy, securities trading, and other Internet-related 0075 1 issues. 2 Now I have an echo of myself in both ears. 3 To my left we have Lance Hoffman, who is filling 4 in for Joel Reidenberg, who had a family emergency. Lance 5 is Professor of Computer Science at the George Washington 6 University. He is in charge of the computer security 7 graduate program in computer science. He is the author or 8 editor of five books and numerous articles on computer 9 security and privacy, and he founded the School of 10 Engineering Cyberspace Policy Institute. Lance and I both 11 served on the Advisory Committee on Online Access and 12 Security at the FTC. 13 Next we have Gary Laden. Gary joined BBBOnline 14 on October 1st, 1998, as Director of the BBBOnline Privacy 15 Program. From 1994 to September 1998 Gary served in the 16 Federal Communications Commission's Cable Services Bureau, 17 first as Chief of its Policy and Rules Division and most 18 recently as Chief of the Consumer Protection and 19 Competition Division. 20 Prior to his service at the FCC, Gary was at the 21 FTC for 21 years as an attorney and Assistant Director of 22 the Marketing Practices Division. 23 Next we have Stephanie Perrin, who is the Chief 24 Privacy Officer of ZeroKnowledge, formerly the Director of 25 Privacy Policy for Industry Canada's Electronic Commerce 0076 1 Task Force. Stephanie Perrin manages ZeroKnowledge's 2 public affairs activities and acts as the company's 3 primary liaison to government and nongovernmental 4 organizations. An internationally recognized expert in 5 freedom of information and privacy issues, Stephanie was 6 instrumental in developing Canada's privacy and 7 cryptography policies over the past 15 years. 8 Finally, we have Ari Schwartz, who is a policy 9 analyst at the Center for Democracy and Technology. Ari's 10 work focuses on protecting and building privacy 11 protections in the digital age by advocating for increased 12 individual control over personal information. He also 13 works on expanding access to government information via 14 the Internet and online advocacy in civil society. 15 Ari is a leading expert on the issue of privacy 16 on government web sites and has testified before Congress 17 and Executive Branch agencies on the issue. 18 We're going to start today with Ari, who is 19 going to give us an overview of the fair information 20 practice principles so we're all on the same page and know 21 what we're talking about. 22 MR. SCHWARTZ: I'm going to stand mostly because 23 I feel as though I'm facing that way and I'm speaking to 24 you.
Although I only have one slide, I could have 70, but 25 after Lorrie's experience I think I'm better off narrowing 0077 1 it down to one. 2 (Screen.) 3 The Fair Information Practice standards are the 4 basic standards by which we measure data privacy. Many 5 companies come to us and ask, what are the basic standards 6 that we should address. We hand them a list of Fair 7 Information Practices. The discussion doesn't end there, 8 mostly because the standards -- the list of different 9 kinds of standards cover different issues. There are many 10 different areas, many different kinds of these standards, 11 and they're often portrayed in different lights. 12 Sometimes we've heard recently that a lot of 13 these Fair Information Practice standards, the sets that 14 we've been seeing, have been portrayed as international, 15 not American focused. But in reality, the Fair 16 Information Practices are an American idea. As was 17 mentioned on the last panel, in 1973 the Department of Health, 18 Education and Welfare put together the first set of Fair 19 Information Practices. Those were a set of four 20 practices, more like statements than actual bulleted ideas 21 as we see before us today. 22 Since that time we have seen these practices 23 grow and shrink in different formulations, usually ranging 24 between 4 and 12 different practices. The most popular 25 that any self-respecting CPO, chief privacy officer, or 0078 1 privacy analyst or policy analyst working on privacy 2 should know by heart are: 3 The Organization for Economic Cooperation and 4 Development standards. They have eight standards. Those 5 were made in 1980, agreed upon by all the OECD countries. 6 The FTC standards. That's the Federal Trade 7 Commission standards, which have come out more recently. 8 They have a set of five standards, and sometimes these 9 standards overlap, sometimes they don't. 10 These, the ones you see here, these are the 11 Department of Commerce's standards that they put forward. 12 We see that they have chosen six. The list that the 13 Center for Democracy and Technology covers chooses seven. 14 If you want to see those, go to cdt.org. That's my 15 commercial for the day. 16 But I've been asked to show the Department of 17 Commerce standards. They're somewhat similar to the CDT 18 standards, and we will also hear from other people where 19 these are lacking, I'm sure, later in the panel. But I 20 will try and do as good a job of covering them as I can. 21 Can people in the back read the full slide? 22 VOICES: No. 23 MR. SCHWARTZ: Then I will reread what it says 24 on the slide. I'm sure that you can read some of the 25 bold. The first practice listed is: "Awareness. 0079 1 Companies should raise consumer awareness and should post 2 privacy policies that articulate the manner in which it 3 complies with other fair information practices." 4 I've also noted here that this term is also 5 called "notice" and many people on the panel will refer to 6 it as notice, rather than awareness. That's the way it's 7 listed in the FTC principles, for example. 8 That's the most basic of Fair Information 9 Practices, just the idea that individuals should know 10 what's going on with personal information. 11 Second is what's called choice here: "Companies 12 must give the opportunity to exercise choice with respect 13 to whether and how their personal information is used." 14 In other words, that individuals just are given some kind 15 of choices. Consent, opt-in, opt-out, are often terms 16 used when referring to this practice.
17 Consent and opt-in are often used synonymously. 18 Opt-in is the idea that individuals should be given the 19 ability to affirmatively consent to uses beyond the 20 transaction at hand. So if the information is being used 21 for another purpose, the individual should be able to 22 consent for those uses. 23 Opt-out is when individuals -- when the 24 information is by default used for other purposes and the 25 individual is given the opportunity to get off of the 0080 1 list. There are many ways to frame opt-ins and 2 opt-outs. 3 Most of the debate that goes on today about 4 choice is about whether it should be opt-in or opt-out. 5 I'd like to also put out the idea that there's a third way 6 here, which is not just opt-in or opt-out, but give the 7 individual two choices -- I want my information shared, I 8 don't want my information shared -- right up front, 9 getting beyond this kind of idea that there is some kind 10 of default that needs to be set. 11 Third on this list is security: "Companies must 12 take reasonable precautions to protect data from loss, 13 misuse, alteration, or destruction." This is a pretty 14 straightforward principle that's in almost every set of 15 fair information practices. 16 Data integrity: "Companies should only collect 17 and keep personal data relevant for the purposes for which 18 it has been gathered. The data should be accurate, 19 complete, and current." This is covering two or three 20 different types of practices: one, the first set, which 21 some consider to be collection limitation, and there are 22 different ways that that can be framed, the basic idea 23 that you should only be collecting information relevant 24 for the purpose; and that the data should be accurate and 25 complete, is also in most fair information practices and 0081 1 often called data integrity. 2 Access: "Companies should offer consumers 3 reasonable access to information about them and a means to 4 correct or amend inaccurate information." This is an often 5 hotly debated subject. As Lorrie mentioned, 6 this was covered recently in an Access and Security 7 Working Group that the Federal Trade Commission held. 8 They put out a very good report on this subject and on 9 security, which both Lorrie and Lance were on, I think. I 10 don't know if anyone else on this panel was, but some of 11 the other panelists have been as well. 12 That really goes through each of those issues 13 about where people lie on this. It ranges from people 14 should just know some basic information held about them, 15 what has been collected in the past, to that individuals 16 should have access and be able to correct anything that is 17 held about them, and there's different places in between 18 there, which I'm sure will be addressed. 19 Accountability: "Companies must be accountable 20 for complying with their privacy policies." This one is 21 also known as enforcement and it is also somewhat basic 22 and straightforward. 23 DR. CRANOR: Thank you. 24 Stephanie, I was wondering if you would care to 25 comment on some of the other principles that are not part 0082 1 of this set that you think we should be thinking of as 2 well. 3 MS. PERRIN: Thanks, Lorrie. I guess, should I 4 speak here or go up to the podium? 5 DR. CRANOR: From the seat. 6 MS. PERRIN: From the seat, okay. I think 7 Lorrie's afraid I'll get up there and talk for half an 8 hour.
9 In Canada, about ten years ago, recognizing the 10 difficulties that we were facing in this area and frankly 11 the lack of compliance with the OECD guidelines, we formed a 12 committee under the Canadian Standards Association, which 13 is a recognized standards development body, and created 14 basically a management standard for privacy based on the 15 OECD guidelines with a view to making what are good 16 statements of principle auditable. 17 We came up with a list of ten. Just briefly, we 18 pulled some of them forward and gave them the kind of 19 depth that is impossible to get in an international body 20 such as the OECD when you're working on these things, and 21 we put what we thought was the first one forward, and that 22 is accountability: Every organization shall be 23 accountable for the information under its care and shall 24 put in place procedures and practices to give effect to 25 that accountability, shall educate, shall make people 0083 1 aware, shall name someone within the organization 2 accountable. It's a pretty full principle. 3 I should just add that Canada went ahead once we 4 had agreed on this voluntary standard, which took 5 basically about four years -- we had an industry, a 6 consumer rep, and government committee, 47 members. Some 7 of you have some idea what that would be like. And we 8 pounded this out. Once it became a standard, we moved 9 ahead and legislated. 10 So if you're interested in looking at it, it is 11 Schedule 1 of the recently passed Personal Information 12 Protection and Electronic Documents Act, which cleared 13 Parliament in April and will come into force in Canada for 14 all organizations engaged in commercial activity on 15 January 1. 16 So we pulled accountability forward. We 17 gave a lot more emphasis to the pieces that are necessary 18 in management systems to make it auditable and make people 19 accountable. 20 The next one was purpose specification. You had 21 to state your purpose. There were huge fights over 22 whether you could -- whether we would get into this 23 standard the concept of whether a purpose was legitimate. 24 Obviously, businesses weren't keen on that, consumers 25 were. So we at least said that you had to state it. I 0084 1 think that complies partly with your awareness, only it is 2 fuller in this standard. 3 Consent. One of my problems with the opt-in, 4 opt-out is that it's total: you opt in for 5 something, you're there for the ride. Really, with the 6 complexity of the personal information flows in the data 7 age, we've got to be able to articulate that consent on a 8 data element basis. 9 There are certain provisions under this consent 10 clause. One that I would pull to your attention is you 11 are not required -- and this is now law, of course -- to 12 give more information than is required for the delivery of 13 a particular product or service, and if the company denies 14 you the product or service based on a failure to give 15 information not necessary for it, you've got a justified 16 complaint. 17 That kind of thing doesn't come across in a 18 simple opt-in, opt-out. So a little more articulation on 19 the consent. 20 There would be limits to collection. We see 21 different thresholds here. Instead of putting collection, 22 use, and disclosure all in one bailiwick, there's 23 collection tied to purpose and then use and disclosure 24 tied to purpose. So they're in two separate principles.
25 You limit the collection first and then once you've got 0085 1 that data you limit the use, the disclosure, and the 2 retention of that data, because as long as data's hanging 3 around of course the temptation to use it for another 4 purpose arises. 5 Accuracy, a little different. This insists on 6 accuracy, this data integrity principle. In our debates 7 we discovered that, frankly, it wasn't in a consumer's 8 interest to insist on accurate information because it gave 9 companies an excuse for a fishing trip to go back and get 10 recent, more accurate data. If you don't need it, you 11 don't need to make it more accurate. In other words, if 12 you've got a loan ten years old, don't go back and look 13 for my new income if I'm making my payments. 14 So that's a little different. We have a 15 safeguards principle that I think is a little fuller, 16 although that's a good security principle there. 17 Basically, industry standards is what we're looking for, 18 and that's of course where our companies come in, is to 19 make this real. 20 Openness, similar to your first principle, but 21 basically a company has to make policies, procedures, down 22 to the detailed collection instrument level available to 23 people on request. That doesn't mean you have to have it 24 all there in your office, but if an individual inquires 25 that stuff has to be made available. That's a little 0086 1 fuller, I think, than just an education sort of imperative 2 that you see in that first principle. Again, under the 3 law, of course, people can complain on any of these 4 things. 5 Individual access, hotly debated, of course, in 6 Canada, as it is everywhere. But it's a fundamental right 7 of privacy. If you don't have the right to get your 8 records and to change them and to have whatever your view 9 of the story is travel with the records wherever they go, 10 then you don't really have a human rights-based approach 11 to this in our view. So that's pretty strong under this 12 bill. 13 Challenge. Many data protection statutes and 14 indeed the OECD guidelines gave people a right to 15 challenge the accuracy of their own information and some 16 of them the use that's being put to it. We broadened that 17 because we recognize that a lot of the problems in privacy 18 arose over security issues, over the way the company 19 handles the data. 20 So under this standard and now piece of 21 legislation, you have the right to challenge any of the 22 practices, and of course in the regime we have you can 23 take that to a privacy commissioner and have it 24 investigated. 25 Now, this standard has been put forward for 0087 1 several years to ISO as the basis for a management 2 standard. It's had, I think, three or four unanimous 3 resolutions of the COPOCO committee -- that's the Consumer 4 Policy Committee of ISO -- endorsing it, but it keeps 5 getting blocked. So the ad hoc advisory group that was 6 looking at this for a potential for an international 7 standard has now been disbanded. There's work going on in 8 Europe looking at standards, but I think that's where it 9 ends. 10 DR. CRANOR: Thanks. 11 Now that we have the overview of what the 12 principles are, let's focus on technology, which is the 13 focus of this panel. I'd like to turn to Glee and ask you 14 to describe how the technology that your company offers 15 can support some of these principles. 16 MS. CADY: Good morning. Is this on? Ah, good. 17 I'm now getting the echo that Lorrie was talking about 18 earlier. 
19 I find it very difficult to answer this question 20 because we don't see our technology as other than parallel 21 to the Fair Information Practices. What we're trying to 22 do is to build an infrastructure where you don't have to 23 count on the other party being good. Do we support 24 philosophically all of these things that Ms. Perrin so 25 eloquently described? The answer to that is yes. 0088 1 But we also support basically this idea: 2 because in an Internet environment what we have is a 3 bottom-up structure, making it relatively easy for new 4 companies to come online all the time and making it 5 relatively difficult for us -- us 6 collectively, as those people interested in the 7 privacy community -- to get the word out about what is responsible behavior, 8 what we're afraid of is that the trust that 9 enables good things to happen, whether you're sharing 10 information or you're buying products, won't be based on 11 the necessary experience that we find in the real world. 12 So what we're trying to do at Privada is to 13 build privacy into the infrastructure. That's somewhat 14 hubristic at this point, all right. The current product 15 sets that we offer and are offered by other anonymizing 16 and pseudonymizing companies don't provide a total 17 package. We're all working toward that, but again this is 18 a bottom-up process where we're working in conjunction 19 with top-down standards like P3P in order to provide the 20 accurate notice, so that you could tell what someone's 21 privacy policies are if you visit their site, but also 22 that you don't have to count on it because you don't have 23 to tell them who you are in order to achieve what you need 24 from them. 25 So our motto is "Privacy Under Your Control." 0089 1 We encourage responsible behavior on the part of 2 individuals certainly and on the part of other companies, 3 but we're trying to make it so the consumer doesn't have 4 to know the four principles of so-and-so and the five 5 principles of someone else or the six principles of the 6 Department of Commerce in order to be both protected and 7 effective. 8 DR. CRANOR: Thanks. 9 Caitlin, I'd like you to talk about what you see 10 as the limitations of technology in addressing these 11 principles. 12 MS. HALLIGAN: I think there are a couple of 13 different ways -- the echo is disturbing, right. There 14 are a couple of different ways of getting at -- no echo. 15 There are a couple of different ways of getting 16 at that question. I think all of the technologies we've 17 been looking at today only address discrete components of 18 the Fair Information Practices, and I don't think any of 19 them purport to address all of them. So if we take P3P, 20 for example, I think it does take some important steps 21 towards improving notice and improving consumer ability to 22 understand the information practices that a site might 23 have. 24 By doing that, it also facilitates choice. It 25 doesn't, I think, do as much with respect to enhancing 0090 1 access and security. I think that's okay. I think, 2 secondly, if you look at how these technologies function 3 to protect consumers with respect to particular aspects of 4 Fair Information Practices that they try to get at, we 5 could all identify limitations and I think some of those 6 are a function of where the technology is today and where 7 it might be tomorrow.
8 For example, again if you look at P3P, one of 9 the pieces that doesn't seem to be on the table right now 10 in this current iteration is the ability for consumers to 11 negotiate on a real-time basis with a site with which they 12 might want to interact. So if a site's privacy policies 13 don't match their preferences, there's not an opportunity 14 to offer to engage in some kind of trade. But again 15 that's a function, one would hope, of where the technology 16 is today. 17 I think the third way of looking at this 18 question and maybe the most important is whether 19 technology, whatever it's able to do for us, is sufficient 20 to fulfil Fair Information Practices. I think the answer 21 to that is probably not. I think that we do need some 22 sort of statutory guideline that puts rules in place and 23 creates incentives for the technologies to develop in a 24 way that promotes those principles. I think that's true 25 for a couple of reasons. 0091 1 First of all, these technologies, as wonderful 2 as they might be, are not self-adopting. They're not 3 self-adopting for businesses and they're not self-adopting 4 for consumers, either. For consumers to take advantage, 5 for example, of Privada they have to understand that there 6 are issues out there about information practices that 7 might trouble them and that there is an option out there 8 for them to protect their privacy. 9 Also, I think that these technologies don't 10 address and don't again purport to address a second very 11 important component here, which is Stephanie's first 12 principle, and that's accountability. There aren't a lot 13 of ways in which the enforcement is really enhanced by 14 these technologies. There aren't ways in which there are 15 audits readily done by these technologies which allow for 16 -- whether it's a regulatory entity or watchdog 17 organization, to easily monitor whether in fact sites are 18 in compliance with the practices that they set forth to 19 the public. 20 So I think that there is a bigger picture 21 question out there that's worth thinking about. 22 DR. CRANOR: Before we go on, does anyone want 23 to disagree with any of those limitations or point out any 24 technologies that we haven't maybe thought of? 25 (No response.) 0092 1 DR. CRANOR: Okay. Then what I want to do now 2 is -- oh, go ahead. 3 MS. PERRIN: I hate to grab the mike again, 4 Lorrie, because I did have quite a bit of time there. But 5 I think we have to be clear. I agree with our last 6 speaker, a technology is not a replacement for a law. It 7 is a heck of a good way to implement the requirements of a 8 law and, frankly, that's why I came to ZeroKnowledge, 9 because the next challenge is to build the principles of 10 law, the principles of what we agree on here, into the 11 infrastructure. 12 I think that many of these things become web 13 enforceable when you empower consumers through really good 14 technology. But let's not expect each discrete technology 15 to cover all ten principles. 16 DR. CRANOR: I think that's a theme that we're 17 going to hear over and over again, is that there is no one 18 silver bullet, there is no one technology that's going to 19 address everything. 20 I'd like to go through some of the individual 21 principles and talk about how the various technologies 22 support them. So let's start with notice and awareness. 23 Brian, I was wondering if you could give us some ideas of 24 how this is supported in technology, notice and awareness. 25 MR. 
DR. CRANOR: I think that's a theme that we're going to hear over and over again: there is no one silver bullet, there is no one technology that's going to address everything. I'd like to go through some of the individual principles and talk about how the various technologies support them. So let's start with notice and awareness. Brian, I was wondering if you could give us some ideas of how this is supported in technology, notice and awareness.

MR. ADKINS: I think we spent most of our morning talking about probably the best practice for notice through technology, which is P3P. The limitation on P3P in providing notice is that it does require the participation of a critical mass of web sites if we want to have effective notice for consumers. Once there is a critical mass of web sites who are willing to implement P3P or some version of it, then there will be market pressure on other web sites to also implement. They'll see that they'll be the one left out, the one that's not getting the "Good" rating. So P3P is probably the best prospect for that.

At the Privacy Leadership Initiative we tried to compile a poster, which is actually available out front, that shows you what some of the technologies do and don't do -- broken down not into the Fair Information Practices, but into things like surfing anonymously, purchasing without revealing your identity to the merchant, managing cookies, encrypting e-mail, and managing your identity. Then there's one section on the Platform for Privacy Preferences. Some of these go to notice as well. I'm thinking about managing cookies and managing your identity: letting the consumer know when someone is trying to track them, and actually giving them the ability to frustrate some of the methods used to track people's activities on line, also goes to notice.

DR. CRANOR: Gary, can you speak to the role that privacy seals play in providing notice?

MR. LADEN: Is this on? Yes. Well, we started BBBOnline with the premise that there's enough work to go around for all of us -- the technology sector, private sector, government, and the seal programs -- and that the problem was complex enough for all of us. We are, as a seal program, completely supportive of technological solutions, although historically the Better Business Bureau system does not endorse particular products or services; P3P, though, is more of a spec, and we are certainly fully supportive of that.

It seems to me that there is an opportunity here for technology and the seal programs to work together to do different but complementary things. For example, we require sealholders to disclose whether other organizations are operating on their sites. Well, technology can help us do this and verify this. So we see that technology can help us with the implementation of some of our information practice requirements.

But as Lorrie has said, we'll hear this over and over again: whether it's this set of principles or the FTC principles or the OECD principles, there are things that technology will not be able to do, and there are things that seal programs can do which technology thus far has not done effectively. Caitlin pointed out some of these limitations -- dispute resolution, for example, and diagnosing systemic problems with an organization's information practices. There might have been a data spill, but technology may or may not be able to tell us how it happened. It might have happened as a result of technology, or there might have been an errant employee. If it was an errant employee, were there proper training procedures in place? One of our seal requirements is that our sealholders train their employees in appropriate information practices.
So a seal program would be able to investigate and determine why the problem occurred, help devise a solution, and prevent future occurrences. So I guess the bottom line is that we can provide some accountability and some enforcement tools that technology can't, but technology can help us do some things that we can otherwise only approach. So I view this as a great opportunity for us to work together to get our arms around the problem.

DR. CRANOR: Great. Now let's turn to choice, and I'd like to ask Ari to talk about this a little bit. I think most of the technologies we've talked about have some aspect of assisting with choice, allowing consumers to make decisions, and I was wondering if you can comment on exactly what kind of a role they play and whether there is more to it than that which can't be covered by the technology.

MR. SCHWARTZ: I knew you were going to give me the hard one. Let's start with P3P. P3P covers choice in several different ways. First of all, as we said, notice itself gives consumers some choices -- basically, to leave with their feet. If there is a market for privacy out there, that is a useful choice. If there is not a market for privacy -- and we don't have a market for privacy today -- that is not a useful choice.

But what P3P can also do is allow users to set their preferences and work with other tools to give users choices. We have the cookie control tools again that give users choices about which cookies to delete individually, what to set for third party cookies, etcetera -- that kind of granularity. If those tools are well designed, the choices are easy. If the tools are not well designed, the choices are hard.

That's the same with some of the simple opt-outs that we see today. Oftentimes there will be a checkbox where you can just check something saying "I don't want to receive e-mail about these services." In some cases it's much more difficult than that. You have to write in to the company, you have to e-mail in. In some cases you have to write in by postal mail, even though the information's been collected online.

So really there's just such a wide range of choices, and the technologies themselves have to be designed in order to give the users control. Without some kind of standard about what kind of choice we're talking about, the technologies themselves can only go part of the way.
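
The cookie controls Mr. Schwartz describes reduce, at bottom, to a per-cookie accept-or-reject decision. A sketch of that decision follows, assuming a simplified notion of "site" (real browsers compare registrable domains rather than bare hostnames):

    # Sketch of a third-party cookie filter. A cookie is "third party"
    # when the domain setting it differs from the page the user visited.
    # Real browsers compare registrable domains (eTLD+1); this sketch
    # compares hostnames directly for brevity.

    def is_third_party(page_host: str, cookie_host: str) -> bool:
        return cookie_host != page_host

    def should_accept(page_host, cookie_host, prefs):
        """prefs: {'first_party': bool, 'third_party': bool}"""
        if is_third_party(page_host, cookie_host):
            return prefs["third_party"]
        return prefs["first_party"]

    prefs = {"first_party": True, "third_party": False}
    print(should_accept("news.example.com", "news.example.com", prefs))  # True
    print(should_accept("news.example.com", "ads.tracker.net", prefs))   # False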
DR. CRANOR: Now I want to move on to access, and I'd like to call on Scott to tell us about the technology that his company offers and how it helps assist in providing access.

MR. BEECHUK: Thank you. Before I answer the question directly, I'd like to give just the framework for my answer. That is that we all know, everybody in this room understands, that on the Internet information is king. It's really what's driving the whole economy. Buyers ideally would have the best intimate knowledge of the sellers, and sellers of the buyers. It was painted in the book "Net Worth" about a year and a half ago, and they called it the infomediary.

But in reality what this is causing is a conflict over the ownership of information. If personal information, personally identifying information, is indeed regarded as a property right, is it the individual's or is it the business's? The question is who actually owns the property right to that information.

To quote Simson Garfinkel -- he's the author of "Database Nation"; I spoke with him the other day -- he said: "In our economy today, it's not a question of property rights or who owns the data. It's just a fact that businesses own your data and there's nothing you can do about it."

That's a fairly aggressive opinion, but it does illustrate the fact that there is a pretty nasty conflict going on. It's quickly becoming one of the most costly and consuming issues for online businesses to deal with. So at Privacy Right, the company that I founded back in December of '98, we focus on looking at the issue not as an us-versus-them problem, where consumers are trying to hide themselves or their information from the businesses, but as a chance to create a classic win-win situation.

The way to do that is to extract the inefficiency out of what we call the value chain of personal information. It's a concept that we use internally to describe the flow of personal information from the time that it's originally created by the individual to the time that it's passed on to the business, and the business then repackages it and either resells it or redistributes it.

Now, getting to the question, the question was about access and how Privacy Right addresses the access issue of the Fair Information Principles. Well, Privacy Right was initially founded based on these principles. We set out to create a business that would actually give consumers access, control, all the things that we really want. But as we worked through this we realized that there are two very, very difficult points on that list. One is access: How do you give users true access to the information that's been collected about them?

You know where that data sits? It sits on thousands of databases across the Internet. If you're talking about Amazon.com, you're really looking at numerous, numerous databases: one logical database controlled by Vignette, personalization databases by Net Perceptions, and registration, transaction. It gets very complex.

So that's the reason why businesses don't just give you access to your information -- it wouldn't cost them anything if it were easy, right? Well, it's not easy, and that's one of the problems we've set out to solve: how to create a common platform for all businesses to use so that they can give their users access to their information and very granular control.

I forget who was on the panel -- it might have been Stephanie -- that said, when it comes to control it's not just an on or an off; it's what level of granularity users have over their information.

So where P3P comes in and really gives us choice of whether or not to go to that web site, wouldn't it be nice for users to be able to say, I want to go to this web site, but not under the conditions that the web site initially set forth? That's what Privacy Right does: we've created what we call the Unified Customer Permissions Platform. It's a platform for preferences that businesses can get from Privacy Right for free. Now, on top of this platform we've developed applications, one of which is actually called the Audit Server. The Audit Server is what gives users access to their information. It works in conjunction with the other application, called the Trust Filter, and these monitor the exchange of personal information on business sites, recording and logging those transactions, and presenting users with a sort of credit report, much like an Equifax or a TRW, for every time that their personal information was accessed.

You may not even be aware of the types of information that a web site holds -- things like browsing data: e-commerce sites use your page views to calculate personalization parameters and call you by name depending on where you visit in the web site. These sorts of things are equally important, and they will become more important as data mining technology continues to evolve and get faster and more real time.

So Privacy Right's Unified Customer Permissions Platform allows businesses to resolve these conflicts over personal information property rights by assigning control over the movement or the exchange of data and beginning to optimize the personal information value chain.
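
Privacy Right's Audit Server is proprietary, so the following is only a guess at the general shape of what Mr. Beechuk describes: log every access to a user's personal information, then let the user pull a report of who touched what, when, and why. All names here are hypothetical.

    # Hypothetical sketch of an access-audit log of the kind described:
    # record each access to personal data, then let the data subject
    # pull a report, roughly like a credit report of accesses.
    import datetime
    from collections import defaultdict

    class AuditLog:
        def __init__(self):
            self._entries = defaultdict(list)  # user_id -> list of entries

        def record(self, user_id, accessor, field, purpose):
            self._entries[user_id].append({
                "when": datetime.datetime.utcnow().isoformat(),
                "accessor": accessor,
                "field": field,
                "purpose": purpose,
            })

        def report(self, user_id):
            """Everything logged about this user, oldest first."""
            return list(self._entries[user_id])

    log = AuditLog()
    log.record("u42", "recommendations-engine", "page_views", "personalization")
    log.record("u42", "email-service", "email", "order confirmation")
    for entry in log.report("u42"):
        print(entry)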
DR. CRANOR: Thanks. Lance, you and I served on the FTC Advisory Committee, which talked about access quite a bit, and we discussed a number of tradeoffs in providing access in terms of the cost to the company; but also a very big issue that we talked about was the security issues raised in providing access. I was wondering if you could fill us in on what some of these issues were.

MR. HOFFMAN: Well, the security issues are in some sense the same issues that we've seen before in security of computer systems, but they're now applied in essence to e-commerce, and therefore they're writ large, because more and more people are confronted with issues of access, issues of identity, issues of secure communications, encryption, and that sort of thing. As people have said here on the panel, there is not just one solution; there are many solutions that have to be dealt with. I do commend reading the FTC report on that.

In general, I think if we look a year ahead or two years ahead or certainly five years ahead, we're going to see several key themes. One is building controls in. Built-in is critical. That's why P3P is such a crucial step, though it's not the only step. It's a crucial first step. Getting security and privacy controls -- because they are so interrelated -- as a default option is even more critical. That's why I agree with a number of the panelists who have said opt-in versus opt-out is not good enough any more; it doesn't make any sense any more. We will be getting to controls at the data element level.

Some of these tools demonstrated out in the demonstration area will be so good and so valuable to consumers and to businesses that they'll be incorporated into good operating systems down the road -- that is, in the next year, probably in the next two to three to four years. The question is what operating systems, and what is an operating system, as we see the operating system go into telephones and that sort of thing. But these controls will be there. They may not look like anything we have today. They may be voice controlled. There may be a whole bunch of other things, but the good ones, the best of the breed, will be incorporated into future operating systems.
One thing I was taken by in the discussion: Gary talked about data spills and then Scott talked about the flow of personal information. We're going to see more and more logs built in. We haven't seen many of those yet -- audit logs and transaction logs. We're going to see more lawyer-friendly privacy logs and programs to use them. They're going to be coming because they're going to be necessary to resolve disputes.

We'll also see more security controls -- we've already seen some, like thumbprint scanners and the like. They'll become more commonplace, but in fact what really happens is that ease of use always beats out everything else. Convenience, I must tell you, beats out security and beats out privacy for most users. So things have got to be extremely easy to use, built in, invisible when possible. We're seeing the first steps toward some of these solutions incarnated as products here today in the hall outside. We're going to see them built in -- just like a hundred years ago cars did not come with seatbelts, and now they come with airbags. Same thing.

DR. CRANOR: Thanks.

MR. SCHWARTZ: Can I just add to that?

DR. CRANOR: Yes, sure.

MR. SCHWARTZ: I think that Lance's discussion about logs really does move us toward where technology can help. Technology can't solve all enforcement questions; you still need to have the baseline laws and standards. But if we have these audit logs, and if we have P3P, which puts a common vocabulary on top of data collection, then we can have real statements that are lawyer-friendly, that are database-system-administrator-friendly, and that can really help the companies know that they're complying with their own practices.

DR. CRANOR: Thank you.
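
Mr. Schwartz's point -- that a shared vocabulary plus audit logs makes compliance mechanically checkable -- can be sketched as a diff between declared practice and logged practice. The log and policy formats below are assumptions for illustration; P3P itself defines neither.

    # Sketch of the compliance check described above: if collection
    # events are logged in the same vocabulary the site's declared policy
    # uses, a regulator or auditor can diff practice against promise.

    declared = {            # what the posted policy says is collected, and why
        "email": {"contact"},
        "zip":   {"site-administration"},
    }

    collection_log = [      # what the systems actually did
        {"field": "email", "purpose": "contact"},
        {"field": "email", "purpose": "marketing"},     # not declared!
        {"field": "browsing", "purpose": "profiling"},  # not declared!
    ]

    def violations(declared, log):
        out = []
        for event in log:
            allowed = declared.get(event["field"], set())
            if event["purpose"] not in allowed:
                out.append(event)
        return out

    for v in violations(declared, collection_log):
        print("undeclared practice:", v)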
MS. PERRIN: Thanks very much. I'd just like to follow up on one thing that Lance brought up there, and that is that ease of use will trump privacy and just about everything else every time. One of the things I think is key here is consumer burden. Frankly, I think a lot of these systems are going to get crushed under their own weight. They may be very good, and we can get there. The question is, will the average consumer that's using the net be able to understand the impact of their choices, or read the notification statements, or click over to the web site policy?

Years in government taught me -- we aimed at a grade six reading level. I don't want to, even as a mom, contemplate how low that's dropping now with the influence of the Internet. But I just don't think we're getting informed choice. The intelligent geeks that are working on this -- and I don't mean just the people on this panel and the people in this room -- the elite are going to understand this. What about the rank and file? If they don't, then you're violating a lot of things that are encompassed in law. For instance, P3P, by asking for a consent there for information not necessary for a transaction -- which a web site could do -- would be violating the Canadian law. Most of the European data protection laws have fair and deceptive practices provisions. Once you start figuring out what "fair" means, you get back to your grade six reading level, and there you are trying to justify whether people can meaningfully understand that their transactional data is being captured for life in some of these situations. Who knows that? I guarantee you 99.99 percent of people don't. So you're going to flunk that fairness test.

I think that -- here's the plug for ZeroKnowledge -- the decision you reach, if you want to get out of this whole mess, is surf pseudonymously. You've made the decision, you've opted out. You can get all the products and services and you can get materials sent to you. You can enjoy the experience, but you're not leaving a data trail. That's a simple decision that somebody at a grade six reading level can make.

MR. SCHWARTZ: Actually, I really want to comment on something she said about P3P. She said that P3P allows sites to say that they are collecting information in ways that violate Canadian law, that violate various laws. Well, that's the case today in written privacy policies on web sites. What P3P can do, though, is help those countries test for compliance. That's something that's way too difficult to do under the current situation that we have. So at least we know, and it allows the countries also to set a set of preferences that can become defaults for consumers -- something else we don't have today.

MS. HALLIGAN: I think Stephanie put her finger on what is really a very critical question, and that is, do consumers really understand any of this? Danny Weitzner said something this morning which set off a lot of bells for me. My office, in addition to doing enforcement, actually does consumer outreach work. So we go out into communities and try to help people understand what's going on when they go online. As you can imagine, that is a daunting task, especially when you're talking to anyone over the age of 20. I'll tell you one thing that's really easy, though. Explaining security is made much simpler by the fact that we can put up a slide with either the lock or the key symbol and show it open and show it closed, and people get it. I'm sure they probably don't get SSL, but they understand the choice that they are presented with.

I think that until consumers understand not just the choice that they're presented with in the immediate moment -- in other words, do I want to interact with this site -- or even, at the level of generality, what their privacy preferences should be, but also, in a meaningful way, what the consequences of those decisions are, the technology really can't do the job of facilitating meaningful choice. In other words, if I decide today that my privacy preference is X, what does that mean six months down the road in terms of the amount of data, whether it's personal data or click stream data, that's associated with either my name or some other identifier? How much has been accumulated about me? What can people do with that? Until folks understand that, the technology can't facilitate meaningful choice.

DR. CRANOR: Scott.

MR. BEECHUK: Ms. Halligan brings up a really good point, and that is that there's a significant difference between security and privacy. While security may be an on-off type operation, where either I'm secure or I'm not, I don't think that privacy has the equivalent. I don't think there is an on light for privacy and an off light for privacy. I think even to the most basic third grade audience, you still can communicate different levels of privacy, because think about it.
There's a big difference between a cookie being tracked so that when you come back to the web site they know that you've been there before, and a medical web site that collects all of your family's medical history and your medical history and attempts to offer you products and services via e-mail and via the web that correspond to your background. Now, those are two totally different situations, and I think in some cases the anonymization technologies do work quite well in that kind of on-or-off, I-just-don't-want-to-participate-at-all case. But in other cases you do want to participate, you do want to have your information conveyed, and I think in that case it becomes a user interface issue.

DR. CRANOR: Brian.

MR. ADKINS: I agree with what was said before, that ease of use is going to be king. It'll be interesting to see, in the panel that comes after this, whether it's possible for a law -- federal, state, or what have you -- to suck out the complexity for the consumers and make it easy for them as well.

I think there are some developments coming down the road that may actually go toward some of the ease of use and some of the complexity and meaningful choice issues. Most consumers are not going to want to answer all the questions that you could possibly answer under P3P, and all the questions that go beyond what P3P covers. But there are trusted parties -- there are parties that consumers do trust that could answer those questions for you. If you believe that the ACLU has your interests at heart when they answer the P3P questions, the ACLU could come out with a template. I haven't spoken to them; I don't know if they're doing that. I know that my mother would probably use the template of the U.S. Conference of Catholic Bishops if they made one. There are parties out there that people do trust who could address the complexity themselves and kind of be a trusted intermediary.

DR. CRANOR: Glee.

MS. CADY: Thank you. I find myself in the interesting position of agreeing with that and trying to tell you that I have a technology that actually provides the pseudonymity we were talking about earlier and yet allows you to have the personalization. So I'm going to wander off on two seconds of commercial.

What we're trying to do in the Privada sense is to allow you to achieve the benefits of personalization by trapping all of that at the Privada network level with your Privada ID, which we do not have the ability to connect to your real world ID. A perfectly reasonable accusation to make of any of us who are providing anonymity or pseudonymity services is that therefore bad people could use your service and do evil things, and we also like to think that we have an answer to that, which we'll be happy to cover at some later time when we want to talk about that.
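
Privada's design is not public, so the sketch below shows only a generic construction for the kind of pseudonymity Ms. Cady describes: an intermediary derives a stable per-site ID from a secret it alone holds, so a site can personalize against that ID but cannot recover the real identity or link the same user across sites. The keyed-hash approach is a standard technique, not a description of Privada's system.

    # Sketch of per-site pseudonyms: the intermediary derives a stable
    # pseudonym from (secret key, real user, site), so each site sees a
    # consistent ID it can personalize against, but cannot recover the
    # real identity or correlate the user across sites without the key.
    # Generic construction only; not Privada's actual design.
    import hmac, hashlib

    INTERMEDIARY_KEY = b"long-random-secret-held-only-by-the-intermediary"

    def pseudonym(real_user_id: str, site: str) -> str:
        msg = f"{real_user_id}|{site}".encode()
        return hmac.new(INTERMEDIARY_KEY, msg, hashlib.sha256).hexdigest()[:16]

    print(pseudonym("alice@example.com", "shop.example"))   # stable per site
    print(pseudonym("alice@example.com", "news.example"))   # unlinkable to above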
MS. CADY: Still, I think that there is a wide range of technology here. Our problem, as technologists and as an industry, is working together in such a way as to present that wide range at the level that Ms. Halligan talked about, so that people can make reasonable choices about which one fits them, because privacy is such an intensely individual thing. Martin Presler-Marshall this morning talked about the differences in his home and, since I happen to know he has young children -- I hope he doesn't think that's a terrible privacy disclosure on my part -- a lot of that is probably about behavior, about opening doors at appropriate times to do things.

We all like to be able to control the door that we're going to open and to be able to say who can come in and who can go out. I think a lot of us in technology are stumbling toward that end of all of us being able to control that door simply -- not necessarily in a smooth path, because we're making jumps here depending on what's available at any given time. Thank you.

DR. CRANOR: I want to give Lance an opportunity to say something else. But before I do that, if anybody has questions in the audience, if you'd line up at the microphones.

MR. HOFFMAN: Very quickly -- can you hear me in the back? I wanted to take issue with something that was said. I don't think security is on or off, and I'm not sure it was said, but that impression may have been left. There's no such thing as perfect security. All we do in the computer security business is provide tools which work with various degrees of efficiency and various degrees of effectiveness. Richard Smith, who is here, has demonstrated that on numerous occasions by looking at products and programs that were supposed to do something and in fact didn't do it, or did more than they were supposed to do.

If you haven't read Lawrence Lessig's book "Code and Other Laws of Cyberspace," the central model there is terrific: balancing architecture -- in this case computer architecture -- laws, norms, and the marketplace. They all have an effect, and I think they all push us toward things.

To throw a fourth thing in, we may even be going toward more of an economic model where the various agents -- the Catholic Bishops' agent or the ACLU agent or whatever -- really have roles to play. And lo and behold, there is some work done in economics on this: principal-agent theory. We're just starting to look at applying some of this in our research at George Washington University. Of course, the first application it got applied to is not really privacy, but rather intellectual property, in terms of Napster and who are the agents and who are the players. But there are some exciting opportunities that are going to open up.

MR. MAXWELL: Okay, a microphone over there.

QUESTION: The greatest keepers of information on a consumer are federal, state, and local governments. When you have 50 bodies of state government making laws saying, yes, we will sell information or, no, we will not sell information on a consumer, how do we get these governments, including the federal government, involved in this issue of selling information on the consumer? Most of the information that a lot of the companies get comes from their databases.

MS. HALLIGAN: I think you're right that government practices with respect to data collection and privacy are critical, particularly because governments are a focal intake point for a lot of information. I think Elliot Maxwell addressed that briefly this morning.
I think that they have to be part of any ongoing dialogue about what privacy protections are important. I also think that if you look at the roster of laws on the books now that govern the information practices of government versus the private sector, there are already a number of statutes on the books that do govern what government entities can do when they take information, whether it's for driver's licenses or other purposes. But I think government should absolutely be part of that going forward.

MR. SCHWARTZ: The public records laws are a big issue in this debate, and technologies are not going to be able to solve those issues. There have been several different groups that are now looking into those issues -- what is public now, and what should be public. Right now there's a comment period that the Department of the Treasury has on bankruptcy laws. If you go bankrupt now, your bank account information becomes part of the public record. I think most of us would agree that that shouldn't be the case, but these laws were written when it wasn't thought that this information could be posted online. So we really need to go back and revisit some of these laws and make decisions on them one at a time. It's important to keep the idea of government accountability while doing that, and that balancing test provides a unique challenge that technology is not going to be able to answer.

MS. PERRIN: I'd like to make two points. The first one is that point about consumer fatigue again. For a democracy to work, it's got to be functional so that people can actually use it. If you're going to rely on sectoral regulation across 50 states and the federal government at multiple levels, only the most intrepid, dare I say, whackos are going to try and assert their rights. It's very difficult. The second is that in Canada, when we legislated for the private sector -- stay tuned for a debate to take place in Parliament at the tabling of the regs this fall -- we legislated the guys who buy the databases from the public registers, and they can't buy them without the consent of the consumer now. The governments aren't getting the consent of the consumer to sell that data, so either the governments change their practices or the companies can't get the data. So there's a little bit of market pressure being brought to bear there. We'll see what happens.

DR. CRANOR: On the rear microphone.

MR. NATHAN: My name is Craig Nathan. My question is on linkability and pseudonymity. There's been a discussion, or sort of a common theme, that if I adopt a pseudonym for myself online I can participate in a variety of the personalization technologies and have one-to-one access. My question stems from stories that I've read and heard at previous CFP's and elsewhere that, with just a few pieces or a few dimensions of personal information about me, you can basically triangulate who I actually am in real life.

So my question is, one, can people speak to how many dimensions it requires to do that triangulation? Two, how do you educate a consumer at a sixth grade reading level or lower on the notions of linkability, dimensions of information, and pseudonymity, so that they can make informed decisions? Then the third is, what happens if that information gets out and somebody has done that linking?
What are my technological remedies or, in the context of this discussion, my rights? What access should be given to me so that I can remove that from the databases?

MR. ADKINS: Very briefly -- my sound again. Very briefly, I would say that any site that says "we don't collect personally identifiable information about you" or "we don't use or share personally identifiable information" should, legally, also be treated as making a non-triangulation pledge, to begin with.

DR. CRANOR: Glee.

MS. CADY: Well, we now know it's on, don't we. I'm going to speak directly to the system that we do and what we're trying to do, and not for anyone else in this instance, although generally earlier I was trying to speak for us as an industry. While I am a database expert, I have not ever worked in advertising online.

One of the easiest things to do is to triangulate on your zip code, in particular with always-on services. So a good thing to do is to not release it unless you absolutely have to. It is possible to do lots of things online without ever telling anybody what your zip code is or what your city of residence is or, if you're in Canada like Ms. Perrin, your postal code. I'm trying very hard not to be U.S.-centric.

The linkage -- the danger point in any system is, can you trust the person that's providing you the service? One of the things that you should do is to look at what our CEO refers to as the "follow the money" aspect: how is it they're making their money? What you want to be able to do is to say, is this a way of doing business that meets my own personal values, that processes information in a way that I respect and find acceptable -- and to remember at all times, similar to Professor Hoffman referring to the intellectual property debate, that personal information, once it's gotten out, isn't recoverable.

So you want to be able to guard it. That's not any different than teaching an elementary-school-age child not to talk to strangers unless they're introduced by your school teacher, not to walk into streets, not to accept candy from strange people -- all things that we do as responsible adults whether or not we're parents. So sharing how to be a good citizen, a responsible citizen, when we're talking to other business people is really important, and teaching our children how to be safe is important, too. It's really a big education initiative.
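
Mr. Nathan's triangulation question has a concrete shape: each additional quasi-identifier shrinks the set of people a record could belong to, often down to one. (Research by Latanya Sweeney indicates that ZIP code, birth date, and sex alone identify a large majority of Americans.) A toy illustration, with fabricated records:

    # Toy illustration of triangulation: each quasi-identifier narrows
    # the candidate set. The records are fabricated for the example.

    population = [
        {"name": "A", "zip": "20230", "birth_year": 1962, "sex": "F"},
        {"name": "B", "zip": "20230", "birth_year": 1962, "sex": "M"},
        {"name": "C", "zip": "20230", "birth_year": 1974, "sex": "M"},
        {"name": "D", "zip": "94301", "birth_year": 1962, "sex": "M"},
    ]

    def candidates(pop, **known):
        return [p for p in pop if all(p[k] == v for k, v in known.items())]

    print(len(candidates(population, zip="20230")))                            # 3
    print(len(candidates(population, zip="20230", birth_year=1962)))           # 2
    print(len(candidates(population, zip="20230", birth_year=1962, sex="M")))  # 1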
DR. CRANOR: Thank you. I think we're out of time now, so we'll wrap it up.

MS. LEVY: I'd like to thank Dr. Cranor and all the panelists for an excellent panel. At this time we will break for an hour-and-a-half lunch. There's a list of restaurants in your packet, and we will all reconvene at 2:00 o'clock.

(Whereupon, at 12:35 p.m., the workshop was recessed, to reconvene at 2:00 p.m. the same day.)

AFTERNOON SESSION

(2:08 p.m.)

MS. LEVY: Welcome back. We're going to start the afternoon session now. It's my pleasure to introduce the moderator of our panel, which is going to discuss the implications of self-regulatory and regulatory environments. Our moderator this afternoon is Peter Swire. Peter is the United States Government's Chief Counselor for Privacy in the Office of Management and Budget. As such he is responsible for much of the privacy policy at the Federal Government level, and we are extremely grateful that he can join us today. Peter will introduce the panelists.

PANEL DISCUSSION: IMPLICATIONS FOR SELF-REGULATORY AND REGULATORY ENVIRONMENTS

MR. SWIRE: We're just going to do brief introductions. The full and impressive biographies are in your materials. I'll introduce our speakers in the order that they'll present today. Speaking first will be Toby Levin from the Federal Trade Commission, who's experienced in privacy issues for the FTC. Second will be Bill Guidera from Microsoft, who will be discussing some Microsoft technologies that are especially relevant in the children's area. Third, Richard Smith will be speaking on the role of computer professionals in providing better privacy protection. Then we'll have something of a point-counterpoint between Andrew Shen of EPIC, the Electronic Privacy Information Center, and Christine Varney of the Online Privacy Alliance and Hogan and Hartson. And wrapping up our presentation part will be Professor Dick Pierce from the George Washington University Law School here in town.

Part of what we're doing in today's panel is a familiar discussion of the role of law versus the role of self-regulation or the market in privacy. But we hope to put a twist on that by focusing very much on what technology does and does not accomplish in this area -- when are technological solutions most likely to be the preferred way of helping with privacy issues. Thank you all for being here, and, as promised to the panel, I'm going to keep the presentations short; that way we'll have time for more back and forth and, we hope, some audience questions. So, Toby, please start.

MS. LEVIN: Thank you very much. COPPA is the first federal online privacy protection act. It was passed in October of '98 and went into effect this past April. The FTC enforces the new rule, which is not to be confused with COPA, C-O-P-A, which regulates the dissemination of material harmful to minors. COPA is under litigation. COPPA is not. It is the law.

I urge you to visit our wonderful web site at www.ftc.gov/kidzprivacy, which is designed to help parents, children, and web site operators understand the new regulation and has a lot of informative materials.

First and foremost, COPPA is about giving parents the tools to protect their children's privacy, and in that regard COPPA recognizes the important role of industry and its self-regulatory efforts, as well as technology. We are in the process now of reviewing a number of applications for safe harbor programs, more commonly known as seal programs -- programs that are going to assist in the implementation and enforcement of COPPA. Those applications are under review at this time. This evidences a very important role for industry to play to make sure that the new regulations are effective.

In addition, we are encouraging developers of technology, such as the ones we have here today, to develop services that will give parents the ability to convey their consent to a variety of activities that involve information collection and at the same time be easy to use and affordable to web site operators.

COPPA's requirements are very much rooted in the fair information practices that we heard about earlier today, with one particularly notable addition, which is that the rule limits the personal information that can be collected from a child to that which is necessary for the activity.
So companies whose revenue models are based on personal information collection admittedly have a challenge here. The new rule is the law, but we are still seeing sites at the early stages of the learning curve. A number of sites are doing a great job in providing very exciting, informative content with very minimal information collection, but some sites are still trying to understand how the rule affects them. For that reason we're engaged in educational efforts to help sites understand the new rule, and we are in the process of law enforcement as well; we will be bringing a group of cases later this year to evidence the FTC's vigorous enforcement.

MR. SWIRE: Thank you, Toby. Bill Guidera, Microsoft.

MR. GUIDERA: Thank you, Peter. At Microsoft we're very, very serious about protecting consumers' privacy. It matters tremendously to us as a matter of public policy and as part of our business model as well.

Relative to the COPPA regulations, we created a tool called Kids Passport, which serves one element of compliance with the regulation, and that's parental consent. It's a tool made available to web site developers, as well as to parents and children, that allows parents to consent, via credit card authorization, to web sites which collect personally identifiable information from a child. It's one example of a technological solution to protect children's privacy and comply with the law.

I want to compliment Toby on the workshop she hosted a month or so ago which showed many, many other tools. Microsoft's tool is one among many, and I think, looking out in the auditorium at what we saw earlier today and what we're seeing out there presently, there are some fantastic tools. There are some supremely innovative companies out there creating neat technologies that we couldn't anticipate just a few months ago, let alone a year ago. Toby did a nice job of hosting those companies in the workshop on COPPA compliance.

I also want to compliment her on the provision within the regulation for reviewing technological developments starting in October 2001. We're seeing so much innovation, so much growth, so much new product development, that having a dynamic regulation which permits the growth of technology in ways that lawyers like myself and other folks couldn't possibly anticipate is a super-neat way to address these issues and do it dynamically. Thank you.

MR. SWIRE: We're staying very much on the brief side. So next, Richard Smith. I neglected to mention that Richard recently took a new role at the Privacy Foundation out in Denver to help keep track of the activities of companies and governments around the web.

MR. SMITH: Thank you, Peter. I'm the one programmer on the panel here. So, on the issue of technology solutions for privacy, I'm probably one of the largest proponents. But one thing that we need to keep in mind is that when companies develop technology, that technology is being built by people. One of the issues with self-regulation, a problem that we have, is that the men and women in the trenches building products, in many, many cases in my experience, don't have a lot of experience with privacy protections.
So we have an education issue here: if we're going to say we're going to have self-regulation, the folks who are building the products are going to have to have some awareness of this issue. In this room we probably have a very high degree of awareness of privacy problems. We all know about cookies and building technologies to deal with them, but it doesn't do a lot of good if the folks who are building web browsers are not aware of those issues.

Just as an example, we have Microsoft recently bringing out a security patch for IE 5.5 addressing the issue of third party cookies, and I think that's a good example of using technology to address a privacy issue. At the same time, on a security list a couple of weeks ago, somebody discovered sort of another form of cookies in the Internet Explorer browser. It doesn't really have a good name; it's called "Persistence," and I can't really put a better name on it. Basically, it provides the same functionality as cookies, but it's done a different way. The person implementing that feature didn't think to put in any kind of control to turn it off. So we have cookie controls that allow us to protect privacy, but nothing for this other feature that works just like cookies. The problem there, I believe, is that the person who implemented that feature is not really privacy aware. That, I think, illustrates very well the problems that we're dealing with here.

Now, in terms of where we're going to get privacy protections from technology, it's going to be in a number of different sectors. We have obviously Microsoft being the leader, the leading provider of web browsers today with close to, say, 80 percent market share. We're going to have to look to them on the client side, in the browsers, for protections. We also are going to look to the ISP's, the Internet service providers: when I have data on my computer going out to the Internet, that's the first place it's going to go. So ISP's are going to be a place we look to. Already today AOL, for example, is providing a level of privacy proxying for users of its service. Then we get into the third party solutions, like we're seeing here today at the show, which address particular areas of privacy not dealt with by the client side or the server. Thank you very much.

MR. SWIRE: Andy.

MR. SHEN: Thanks, Peter. I think we should -- and I think most of you would probably agree with me -- encourage the development of online privacy technologies. These technologies would hopefully give consumers greater control over what happens to their personal data. However, I believe that the widespread adoption of online privacy technologies will depend on the existence of laws establishing a baseline standard for privacy protection. When we establish a guaranteed level of privacy protection, I think you'll see a growing number of privacy technologies that go up and meet or exceed that standard.

Just a couple of comments about what exactly we're talking about -- what is a good privacy technology, what is a privacy enhancing technology. Good privacy technologies would give consumers greater ability to exercise their rights as provided through fair information practices.
I also believe that good privacy technologies would allow consumers to minimize or eliminate the need for personal data in many of the online transactions they may partake in. These technologies would obviously give us the benefit of allowing consumers to avoid some of the more burdensome and ineffectual procedures we have right now: a growing dependence on privacy policies, which are often very long and confusing, and also a growing reliance on a greater number of preferences, different settings, more jargon -- which, when we're talking about consumers, the average Internet user, is definitely something that we have to get away from.

Also, I think privacy enhancing technologies clearly should not facilitate the collection or distribution of more personal information than we have now. Obviously, we're in a time where the amount of information collected from an individual has exceeded anything we have seen before. Privacy technology should help us curtail that growth.

I think one of the more promising groups of online privacy technologies that we have right now -- and there are many companies presenting products to this end -- are ones that allow for anonymous interaction: anonymous communication, anonymous surfing, anonymous payment. Anonymity is simply, I think, the best way to protect privacy. If you do not distribute any personal information, you don't have to worry where it ends up, and I think you will also avoid many of the possible regulatory requirements if you are in an information-intensive business.

As a complementary option, we can also think of pseudonymous alternatives. Pseudonymous virtual personas, virtual identities, will allow individuals to disclose as much as they wish about themselves. This is already ubiquitous on the Internet -- user names in chat rooms and on message boards -- and I think it's something that Internet users can easily be comfortable with.

I'd like to end on what I think, gathering from what has been said already today, is a very noncontroversial point. We should obviously take advantage of all the legal and technical means we have at our disposal to protect privacy. The privacy debate has been going on for several years. Several people in this room and on this panel have been heavily involved, and to that end I think we need to use whatever we have. The most consumer-friendly environment for privacy protection will allow privacy enhancing technologies and legally enforceable fair information practices to complement and reinforce each other, rather than choosing one over the other.

MR. SWIRE: Christine.

MS. VARNEY: Thanks, Peter. It seems to me that this workshop could not have come at a better time, because I think we're at a fairly critical juncture as we examine technology and legislation, regulation, and privacy. This is one of the reasons that many of us who argued for a legislative moratorium were concerned: legislation can tend to be very technology-specific. I mean, can you imagine a law that said cookies are illegal? It's not that we mind cookies; it's that we mind the abuse of cookies. Can you imagine a law that said GIF's are illegal, or web bugs are illegal?
Again, it's not necessarily that we mind those technological devices; what we do mind is the abuse of those technologies, however each of us individually defines that. So, Andrew, I agree with part of what you were saying, which is that any undertaking has to be policy focused. It can't be technology focused, because most of the technology has appropriate uses as well as what some would consider inappropriate uses. So what do we do? As is evidenced today, there is a huge amount of marketplace activity. We're bringing more and more technologies to the forefront that can empower individuals.

Obviously, you probably talked today about P3P a lot, and I won't spend an enormous amount of time on it, but it seems to me that when we get P3P fully deployed, the best thing that could happen is that we would have a competitor to P3P, because competition is good, and that would demonstrate to me that we have got a viable marketplace. What I don't want to see -- and I'm not quite sure how we avoid it -- is legislation that locks in technical standards, that says these are the standards that will protect privacy. Thanks.

MR. SWIRE: We're pleased to have Richard Pierce on the panel, who's studied many other regulatory systems and is speaking about privacy, if not for the first time, for one of the first times, but with a perspective from many other systems.

MR. PIERCE: Thanks, Peter. I basically want to address one issue that I think has been somewhat neglected in the debate, or discounted a bit, and that is the existence of what I believe to be extreme consumer heterogeneity in terms of the relevant preferences in this area. I know there are a lot of people who feel the way that Andrew does, for instance -- they put a very high value on their privacy rights, they want to protect them at virtually any price. But there are an awful lot of people who are more like I am. Basically, I'm willing to trade off my privacy rights for a nickel or a dime most any time, and I do on a regular basis, and I don't want to be deprived of the opportunity to do so.

To give you an illustration from a non-Internet context, a lot of supermarkets now provide the option of getting a little card that you use at the checkout counter that enables them to keep track of your pattern of purchases, use that data for internal purposes, and sell it to anybody they want to. I joined up in a flash, saving a buck a week on my groceries in return for providing that information, particularly when I think the information is going to help me indirectly anyway by allowing them to be in a better position to serve me and cater to my preferences in a superior manner.

What I want to avoid is any form of government regulation that keeps -- well, I look forward to government regulation that allows people like Andy and people like me to further our preferences. Judging by what I see at the checkout counter, I see a lot of people like me who have swapped their privacy in that context for 50 cents a week and seem to be quite happy about it. I want to go to sites that say: We guarantee you we're going to collect every bit of data we can about you, use it for every possible internal purpose, and sell it to anybody who will buy it -- because I think I'm going to get more value out of those kinds of sites than others.
But I want Andrew to be able to go to the sites that he prefers. So I think the only thing that can do that is a market, a competitive market. That's how we provide a wide range of services satisfying the heterogeneous preferences of consumers in every other context.

I have been convinced by Peter, however, that the market needs two forms of help to be effective in this context. One is just a simple statement that says: Everybody that runs a web site, ISP, etcetera, has to have a privacy policy that is displayed prominently. Here technology can help a lot. You can set it up so that you can't get access to the site -- at least to the portion of the site where any data is going to be collected -- without first seeing the screen that has the privacy policy and clicking your assent to it. Second, some agency -- the FTC seems like a good candidate to me -- should be assigned responsibility to detect and punish any violations of those posted policies. Anything else I think is going to do more harm than good. Thank you.
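
The gate Professor Pierce describes -- no data-collecting page served until the visitor has seen the policy and clicked assent -- is straightforward to sketch. The session mechanism and names below are illustrative only.

    # Sketch of the consent gate described above: no page that collects
    # data is served until the visitor has seen the privacy policy and
    # clicked assent. Names and the session mechanism are illustrative.

    PRIVACY_POLICY = "We collect X for Y. Click 'I agree' to continue."

    def serve(page, session, collects_data=True):
        if collects_data and not session.get("assented_to_policy"):
            return ("policy-screen", PRIVACY_POLICY)   # shown before any data flows
        return ("page", page)

    def click_assent(session):
        session["assented_to_policy"] = True

    session = {}
    print(serve("/checkout", session))   # -> policy screen first
    click_assent(session)
    print(serve("/checkout", session))   # -> the page itself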
MR. SWIRE: Richard, you prepared a short paper for this, and you also had comments about requiring bargaining as part of the technology. I wonder if you could share those with us.

MR. PIERCE: Sure, I'd be glad to, Peter. Now, I understand that this is part of some proposals, that there be mandatory bargaining about a privacy policy. That strikes me as incredibly burdensome and totally unnecessary. The analogy that comes to mind: I go to a grocery store and I'm looking at the toothbrush rack, and I see one company provides only a toothbrush with soft bristles. And I say: I want a toothbrush with hard bristles, damn it, so I'm going to insist on my right to negotiate with that company to see if they're willing to make a hard bristle brush. Well, the response to my demand ought to be: Buy a hard bristle brush made by another company, for God's sakes. If you look out there, the marketplace is capable of providing, and will if allowed provide, a wide range of privacy policies. It'll provide web sites that operate in accordance with Andrew's preferences, in accordance with my preferences, in accordance with the preferences of a whole lot of other people. We don't need the government to mandate negotiation over this any more than we need it to mandate negotiation over the stiffness of toothbrush bristles.

MS. VARNEY: Peter, may I add to that?

MR. SWIRE: Sure.

MS. VARNEY: If Peter had persuaded you that a level of government action may be necessary -- that is, for web sites to provide disclosure -- why would you limit that obligation to online merchants? Why wouldn't you make it a universal obligation?

MR. PIERCE: I don't know that I necessarily would if pressed on the point. But I certainly do buy what I've heard argued many, many times forcefully, that the Internet opens up opportunities for invasions of privacy on a level, a magnitude, unsurpassed in history, and so it's worthy of special attention. So I'd start with the Internet context. I don't know that I'd necessarily limit it to that context if pressed.

MR. SWIRE: I want to ask Andrew, who was privileged to be mentioned by name the most in the last three minutes --

MS. VARNEY: Because we all want to agree with Andrew.

MR. SWIRE: -- to answer this point of varying preferences or, as economists say, heterogeneous preferences. What about a world where Richard wants to show everything and other people don't? How do we handle that in the view you would have?

MR. SHEN: Well, I think fair information practices to a great extent address that very issue that Professor Pierce brought up. Fair information practices do not simply say your information will never be disclosed. Fair information practices do not say your information always will be disclosed. What they do is give you certain rights -- such as the ability to access your record, such as the ability to make sure that your information is used for a specific purpose -- that allow us to make a really informed decision about whether or not to transfer personal data.

I'd also like to address what I think Professor Pierce is putting forth as his answer to the policy problem in front of us, and that is that it looks a lot, at least to me, like what we have right now. You seem to be advocating greater use of privacy policies, privacy notices, and backup enforcement by the Federal Trade Commission under the FTC Act's prohibition of unfair and deceptive business practices.

I think to some extent we already addressed why that may not be the best answer. A lot of people simply do not understand what things like cookies and web bugs are, and so I don't think they'll be able to understand all the full implications of a lot of these privacy policies. Even getting away from the technical issue, privacy policies are very difficult for a lot of people to understand. If you go to the most popular web sites -- we're not talking about obscure ones -- these privacy policies run for pages and often require a great deal of legal knowledge, or at least a very good command of the English language. I don't think that all consumers will be equipped to deal with that.

But I think this is the best answer to why that might not be the best answer: it's simply not getting it done. What we have right now is a great reliance on privacy policies and the Federal Trade Commission as backup, but I don't see fewer people being concerned about privacy in the past few years. I think the level of privacy concern is as high as it's ever been. We can just tell a lot of consumers that they're crybabies, that they're unreasonable, but I don't think that's the best approach to take. I think we should address those concerns and deal with them in a responsible manner.

MR. SWIRE: What I'm tempted to do next is to go around the panel. Many people in the audience and on the panel have been in discussions about the role of law versus the role of self-regulation. Today is Technology Day here at Commerce, and I've asked people to think about one example of an area where technology will help the most, where really technology is the answer, and then one area where perhaps technology is least likely to provide the answer for privacy. Maybe we'll go in the order we started in, unless there's anyone who wants to jump in. Toby?

MS. LEVIN: Well, first of all, let me highlight just a couple of applications of technology in terms of the children's online privacy protection rule.
First of 6 all, technology can help reduce the collection of personal 7 information and it can do that, as we saw in our recent 8 workshop, by a number of web sites using anonymous 9 cookies. If a web site collects a zip code, child's 10 preferences, their hobbies, they can collect basically any 11 information using a cookie and a user ID and enable the 12 child to interact with the site to provide customized 13 information and recognize the child as "Crazy Mary" every 14 time she returns to the site. So that's one application, 15 a very positive application of technology. 16 That same technology can be used to help prevent 17 a child from falsifying their age. General audience web 18 sites that have a mixed audience, when they ask age, 19 they're obligated under the rules to provide the COPPA 20 protections to those children that are under 13. Well, if 21 they're designed so that if a child enters honestly their 22 birth year and a screen pops up and says something to the 23 effect that, I'm sorry, you can't participate at this site 24 because you're under 13, then the child simply clicks on 25 the Back button, re-enters their year, their birth year, 0138 1 and they're allowed entry, then technology's failed in 2 that way. 3 That same technology can drop an anonymous 4 cookie, a session cookie, and direct the child really to 5 content on the site that they can engage in. We've seen 6 examples -- and this is where we think there's been some 7 shortsighted action taken by companies to block children 8 from their web site rather than see how they can provide 9 content to children who are the next generation of 10 purchasers. So we were encouraging companies to think 11 about using those very easy, basic technology tools. 12 Then the other is tools that will help companies 13 deal with parental consent mechanisms. In that regard, we 14 can see the role of infomediaries, data management 15 companies, that will provide easy methods for parents to 16 obtain passwords or PIN numbers, ways in which central 17 registries could be established for companies that sign on 18 with a particular site, where a consumer can come, learn 19 about the privacy policies and practices of companies, and 20 in one easy, automatic way sign their child onto a variety 21 of interactive activities. 22 So we think there's some promise there, but we 23 need for web sites to buy into the technologies. 24 MR. GUIDERA: Some areas where technology works, 25 empowering and educating users. Empowering, talk about 0139 1 the P3P standard which is in the works. Several companies 2 outside are showing off their P3P tools. It's a way to 3 lower the transaction costs issue of the legalistic 4 privacy statement. I'm an attorney. I read those things 5 out of curiosity on occasion and they put me to sleep 6 faster than just about anything else in the world. 7 But the P3P technology, IBM's showing a tool out 8 there that puts it into natural language and allows you to 9 cite your preferences in a way that's understandable by 10 attorneys and non-attorneys alike. I think that's a 11 fantastic way to empower people, lower the costs of 12 understanding privacy policies and finding privacy 13 statements. 14 Education. The cookie tool that we are putting 15 into the browser, integrating into the browser, is an 16 educational tool as well as an empowerment tool over 17 cookie technology. It shows you the great benefits of 18 cookies. I like to follow the Minnesota Twins. 
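A minimal sketch of the session-cookie age gate Ms. Levin describes, in Python; the cookie name and handler here are hypothetical, an illustration of the approach rather than any particular site's code:

    # Sketch: once a visitor honestly enters an under-13 birth year, a
    # session cookie is dropped, so re-entering a false year via the
    # Back button still routes the child to age-appropriate content.
    from http import cookies

    CURRENT_YEAR = 2000  # the year of this workshop

    def handle_age_entry(birth_year, cookie_header):
        """Return (destination, Set-Cookie header or '')."""
        jar = cookies.SimpleCookie(cookie_header)
        if "age_gate" in jar and jar["age_gate"].value == "under13":
            # An earlier attempt in this session was under 13; ignore
            # the newly claimed birth year.
            return ("kids-area", "")
        if CURRENT_YEAR - birth_year < 13:
            out = cookies.SimpleCookie()
            out["age_gate"] = "under13"  # no Expires set: a session cookie
            return ("kids-area", out.output(header="Set-Cookie:"))
        return ("general-site", "")

    # An honest first attempt drops the cookie...
    dest, set_cookie = handle_age_entry(1990, "")
    # ...so a falsified retry in the same session still lands in the
    # kids' area instead of granting entry.
    dest2, _ = handle_age_entry(1970, "age_gate=under13")
    assert dest == dest2 == "kids-area"

The falsified second attempt no longer defeats the gate, which is exactly the failure mode Ms. Levin describes in the naive design.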
Every 19 time I turn on my browser I get last night's Twins score, 20 and that's due to a cookie. That's a darn cool thing. If 21 I were to wipe the cookies on my hard drive, I would no 22 longer be able to get that personalized experience right 23 away. I think that goes for the Redskins as well and 24 other teams, not just the Twins. 25 Perhaps a way that technology works is in 0140 1 enforcement. I follow a lot of law enforcement security 2 issues and I think there are maybe some opportunities for 3 government to use technology as a means to enforce 4 different laws and acts as we have them today, and I'm not 5 going to go anywhere near Carnivore. 6 Where is it perhaps not working? I think this 7 is one of the hardest issues we all face, and that's 8 getting the end user or the parent or whomever it may be 9 to activate the technology. We can market our tools, we 10 can run advertisements on television, we can place icons 11 in certain places, and even though consumers and parents, 12 users and others are expressing great concern about 13 privacy issues, there's still a level of inertia required 14 on their part to activate these technologies. 15 I'm not saying that they don't want to do that. 16 I'm saying that perhaps we may need to make that easier. 17 But it's very difficult for technology to get a person to 18 activate the empowering tools that are at their disposal. 19 It still requires them to take the action. 20 MR. SMITH: On this issue of where technology 21 works the best, my experience in situations that I have 22 looked at is, if we take a look at the Internet, we have 23 the clients, which are people's PC's or in the future 24 wireless phones or whatever kind of device, and then we 25 have the Internet itself. Where privacy protection 0141 1 technologies work really well is on people's PC's. 2 Fundamentally, the game that goes on in privacy protection 3 is to send out less data. The less data you send out from 4 a PC about yourself, the less you're going to have a 5 problem with misuse of that data. 6 So I think the technologies on the client are a 7 good solution. I heard about one product today, which 8 unfortunately I do not remember the name of, that allows 9 you to provide preferences about what your interests are 10 and that data is stored on your computer as opposed to 11 some server on the Internet, and then marketers can match 12 up with what's being done on your PC. So I think that's a 13 good example of a client side technology that preserves 14 privacy. 15 The problem, where I think technology 16 is going to fall down, is once you get the data off your 17 computer and it gets out into the wild, if you will. In 18 that sense -- although there might be some folks here that 19 would argue with this -- if you send out data in plain text, that 20 can be very easily copied. Computers are very, very good 21 at sending data around. 22 So I think that that's a section where 23 technology isn't going to work, and it has to be companies 24 making promises about what they do with the data and not 25 giving it away, or potentially legislation. But I don't see 0142 1 the technology fix working so well in that area. 2 I can see, on a certain side of things, something like an 3 online service that can do a job here, a good job of 4 hiding your identity when you're out there on the 5 Internet, and again that has to be a promise, though. The 6 problem with any kind of anonymizing system is you have to 7 trust the anonymizing company.
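The client-side matching Mr. Smith describes can be sketched in a few lines. A minimal illustration in Python, with hypothetical interest categories and campaign IDs; the point is that the preference data never leaves the machine, and at most an opaque selection goes back out:

    # Sketch: interests live only on the user's PC. The marketer ships
    # its campaign targeting down to the client, and the match is
    # computed locally; only an opaque campaign ID ever leaves.
    LOCAL_PROFILE = {"kayaking", "baseball", "travel"}  # client-side only

    def choose_offer(campaigns):
        """Pick the downloaded campaign that best overlaps the local
        profile. `campaigns` maps an opaque campaign ID to its target
        categories; it travels marketer-to-client, never the reverse."""
        best_id, best_score = None, 0
        for campaign_id, targets in campaigns.items():
            score = len(LOCAL_PROFILE & targets)
            if score > best_score:
                best_id, best_score = campaign_id, score
        return best_id

    offers = {"offer-17": {"baseball", "golf"}, "offer-42": {"cooking"}}
    print(choose_offer(offers))  # "offer-17"; the profile itself stays put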
But that's fine, you can 8 trust companies as long as they make representations about 9 what they do. 10 Thank you. 11 MR. SHEN: I'd like to echo a few of the points 12 that have already been made in terms of what is the best 13 place for technology to go or the best place for online 14 privacy technology to go. I think it's towards anonymity 15 and the minimization or the elimination of the use of 16 personal data. 17 I think we definitely have the capability right 18 now, but we're not yet seeing it. I think that some 19 companies are addressing the issue, but I think it 20 obviously needs to be more widely adopted. 21 I think the failings of technology, sort of the 22 limits of where technology can take us today, are obviously 23 -- a couple of people have already touched on this -- 24 enforcement. That doesn't mean there's no enforcement 25 right now. Obviously there are government agencies, but 0143 1 they have that enforcement investigative authority because 2 of their laws, the laws that back them up and give them 3 those authorities. 4 I also think that one place technology will fail 5 is actually creating a standard for privacy protection. I 6 talk to a lot of people in the public and the most common 7 question you probably get from people is: What is a good 8 company to go to, what is a good web site to go to? Who's 9 going to take care of my privacy? And I usually have to 10 throw up my hands and say: You're on your own. Some of 11 the big companies are great at protecting privacy, some of 12 the big companies are horrible at it, and the same goes 13 for smaller companies. 14 I think what consumers really want and I think 15 we should try to provide is a guaranteed level of privacy, 16 so I can tell them to go on the Internet and not worry 17 about it and just go buy products and surf the web and 18 enjoy what's out there. 19 MS. VARNEY: I think I'm next. 20 MR. SWIRE: Oh, I apologize. 21 MS. VARNEY: That's okay. 22 I think, to disclose first of all, I am on the 23 board of a company called NSERC, which may have been the 24 company that Richard saw that does client side database 25 development. So without speaking to the deployments, the 0144 1 business deployments of technological solutions, for 2 several years I have been a huge fan of client side 3 databases. I think they can ultimately solve a lot of 4 problems. 5 The other technological solution -- and I firmly 6 believe this and continue to believe this -- is P3P. What 7 technology won't do, it won't make people care, it won't 8 make people read privacy policies, and -- well, COPPA I 9 think works spectacularly well for web sites that are 10 targeted at kids. I have come to conclude that it doesn't 11 and can't work for general audience sites that kids go to, 12 because there isn't any 13 year old, 12 year old, 10 year 13 old, 11 year old that I know of who doesn't know how to 14 erase cookies. 15 My kids have taken me to at least two dozen 16 sites and showed me how when you enter a birthday and they 17 don't let you in, how you can go back, not just through 18 clicking back and changing the birthday, but how you can 19 circumvent what are server side technologies as opposed to 20 client side technologies. 21 So for protecting kids I think, outside of the 22 context of web sites that are targeted to kids, it's a very 23 vexing question other than parental-installed technology, 24 which could help the problems there.
But I've got to tell 25 you, I have not seen these 10 to 13 year old kids that 0145 1 can't beat a solution that some of our biggest, most 2 prestigious, most important, and most committed companies 3 have tried to design. 4 I think that's a real problem, because we've set 5 up a standard of liability that these companies are trying 6 very hard to meet, but you've got a group of 10 to 13 year 7 olds that can beat anything they put up, or at least so 8 far they can. 9 MR. SWIRE: It might not be a surprise if 10 Christine's kids are talented at this. Maybe some of the 11 rest of our kids have a little harder time. 12 (Laughter.) 13 MR. PIERCE: The two areas in which I think 14 technology could be most valuable are these: the first is 15 communication. I think that's its most valuable 16 potential. This really responds nicely to Andrew's 17 concern about adequacy of communication, clarity of 18 expression, ability to understand, etcetera. If you don't 19 know what a cookie is, well, you can go to the FTC privacy 20 site and you'll see that they refer to cookies and then 21 they say: Want to know what a cookie is? Click here. 22 You click here and they give you a description of a cookie. 23 I didn't know what a cookie was until I went there, but 24 even I could figure out what a cookie was once I read that 25 little one-sentence explanation. 0146 1 Well, you can have -- what I would do is mandate 2 the privacy policy, nicely displayed, or you can have it 3 mandated that it's got to be annotated, so if you want it 4 in Spanish it's in Spanish, if you want it in whatever you 5 want, kid talk. 6 I don't know whether Christine is right that 7 they're smarter than we are, but you can put it any way 8 you want. You can get extreme clarity and extreme details 9 in communication at relatively low cost using this medium. 10 The other area is enforcement. As I understand 11 it, Richard has access to and uses a whole bunch of tools 12 that enable him to figure out whether somebody is actually 13 living by the policies that they announce, and I'm sure 14 there is even more that can be done in that area and 15 that's very valuable. 16 What I don't think is going to be at all useful 17 is actual decisionmaking. Peter and I were discussing the 18 state of artificial intelligence development today and it 19 ain't there yet. When it comes to real decisionmaking, 20 human beings have to be involved in that, and I think that 21 will be true for a long time. 22 One final point. I'm just a little concerned 23 about Andrew's urging this use of anonymity or pseudonyms 24 for all your Internet business. I'm on the board of an 25 organization that runs a web site. It's a kayaking 0147 1 organization. That's mainly what I do. I'd like to do it 2 full-time if I can figure out how to support myself. 3 But in any event, we have, after many painful 4 experiences, adopted an absolute prohibition on any 5 anonymous or pseudonym access to the site. You must log 6 in through a procedure. We've got a technological fix 7 here. You can't get on the site without going through a 8 log-in procedure that we have designed to make it 9 impossible, because we've had real bad experiences with 10 irresponsible/dishonest users of the site on an anonymous 11 or pseudonym basis. 12 So you got to be ready to say who you are before 13 you get on our site, and I think a lot of people will 14 adopt that policy if they haven't already. 15 MR. SWIRE: Do you want to comment? 16 MR. SHEN: Sure.
Just in case you were 17 wondering, I've actually never met Professor Pierce before 18 today. 19 MR. PIERCE: It's true. 20 MR. SHEN: A lot of this probably could have 21 been dealt with beforehand, but I'm sorry to put you all 22 through this. 23 I guess I'll start with the point about your 24 kayaking association, the group, and I guess some of your 25 problems with anonymous or pseudonymous registration. I'm 0148 1 not -- obviously, your organization made its own 2 determination based on its own experience, and for your 3 organization in kayaking you need to know their names or 4 their mailing addresses so you can tell them about events 5 or news or just keeping account. 6 MR. PIERCE: No, to keep them from saying lies 7 about other people all the time. 8 MR. SHEN: But I think you can also see on the 9 Internet an incredible growth of places that do allow 10 pseudonymous interaction for very good reasons. We can 11 think about searching for health information. If you have 12 a certain health condition, if you have certain issues 13 about your sexuality, whether you're heterosexual or 14 homosexual, for many reasons, you may want to keep all 15 communication regarding those matters pseudonymous. I 16 think pseudonymous communication and the availability of 17 that allows many people to partake, through the Internet, in 18 many topics that are controversial, and I think that if 19 they didn't have that option they would lose out ultimately. 20 MR. SWIRE: Let me make one point about 21 technology from my own experience and thoughts where it 22 works particularly well and less well. For data in 23 transit, getting from me to the other party, if it's 24 wrapped up in encryption or if it's wrapped up in SSL, 25 then that's a way to stop hackers from getting it and 0149 1 seeing it, and technology can do a very good job of 2 stopping the data from being intercepted between me and 3 the recipient. 4 But as somebody made the point a moment ago, once 5 it gets to the other end and somebody can open it up 6 legitimately, technology isn't going to tell you what 7 they're going to do with that data. It's much harder to 8 control it at that point. 9 I'm inclined to ask one question and, after 10 giving the panel a chance to respond, open up to the 11 floor. My question concerns client side technology, which 12 a couple of people just mentioned. In the Federal 13 Government, I've also worked on some of the government's 14 computer security. In the security world, some of you 15 know about CERT, which is an organization at Carnegie 16 Mellon that comes up with this day's and this week's 17 security alert, the latest holes, the latest patches. 18 There's Federal Government groups that work with them. 19 But to really keep your big organization's security good, 20 you have to be at CERT every day or so to see the latest 21 problems and what you're going to do about it. 22 My question is are we asking too much of the 23 client side if it's an every day or every week update 24 situation? Is that something that's just going to 25 overwhelm what we can expect the client side software or 0150 1 the ordinary home users to do? If so, does that show a 2 limit to client side and suggest we have to have other 3 approaches? 4 Anybody who wants to answer that. 5 MS. VARNEY: I'm not sure, Peter, if you're 6 asking about what I was talking about, which is client 7 side databases and the potential for data leakage or data 8 theft out of client side databases. 9 MR.
SWIRE: Well, presumably there's a new -- I 10 think that one thing about client side is whether -- I 11 think Bill suggested this -- is whether people decide to 12 exercise it and change the default settings, like you can 13 change defaults on cookies. So one question is the level of 14 interest and whether people will understand enough about the 15 product to spend the money or the time to set it up. 16 My brother, who's a wonderful guy, recently came 17 over to help me with my home computer and we lost our 18 Internet access as a result. Sorry, Andy. This is on 19 tape. I think a lot of people can be cautious about 20 tinkering on the client side, not knowing quite what it 21 will do. 22 A related question is, whatever you do, there 23 might be countermeasures by which people can overcome it and, 24 with the level of change and complexity, is it an 25 achievable solution to say do it on the client side and 0151 1 expect consumers to handle it that way. 2 MS. VARNEY: I think it's certainly a solution 3 for some consumers. For example, the client side 4 databases that we're talking about allow you as a consumer 5 to be marketed to based on your preferences, whether it's 6 content or advertisement or editorial, without disclosing 7 to others what your preferences are, sort of a reverse 8 marketing technology. 9 Could those kinds of client side databases have 10 holes? Of course they could. Are they impenetrable? 11 Probably not. Probably somebody smart could figure out 12 how to hack into a university client side database. Does 13 that mean it is not a solution? I don't think so. I 14 think that you've got to allow the solutions to evolve. 15 It's completely conceivable, obviously, and I 16 defer to Richard, but if you had a pretty broadly deployed 17 client side database and there was a hole in it you could 18 probably distribute a patch via a server pretty easily to 19 anyone that had it. On the kind of broader question of 20 client side solutions and how do you get people to 21 download them, set them, all of that, yes, yes, you have 22 to have a minimal amount of interest in order to do it, 23 but there's not any legislation I know of that's going to 24 compel people to be interested. 25 MR. SMITH: On the issue of client side 0152 1 security, there is a real problem there in the sense that 2 -- and not to pick on Microsoft at the end of the table 3 here, but most people do run Microsoft operating systems. 4 If you just look at it statistically, there are approximately 5 four to six security bulletins a month related to client 6 side security. That's a lot to keep up with, especially 7 if you're not a computer person who really understands all 8 of this. 9 Now, there are some nice mechanisms that 10 Microsoft has provided for getting patches through the 11 Windows Update site, so there are some ways to do it. But 12 it's a lot to keep up with and I think it's really 13 burdensome. 14 I'd love to see software ship without any 15 security holes, but that's probably going to be a while 16 away. However, if we take a look at how products are 17 really attacked, the vast majority of security problems we 18 see come through e-mail. So what you want to really take 19 a look at is not so much maybe patching every hole, but 20 patching the way the attacks come in. Microsoft's kind of 21 done that with the security patch for Outlook this summer, 22 and I think that's a really good piece of software. 23 Now, ironically, Outlook Express users don't 24 have that same patch. So I would encourage Microsoft to 25 provide that.
0153 1 In addition, I think there's a role for the 2 ISP's. They can certainly offer scanning of e-mail to 3 eliminate problems, as well as -- it's a little more techie -- 4 prevent packets from coming in that have not been 5 initiated by the PC itself. 6 So that's the way I'd solve that problem. I 7 think this keeping up with security patches, I find it 8 difficult. It's too much to ask, and instead we just look 9 at the way that most attacks occur and solve those. 10 MR. GUIDERA: Peter, I'd like to thank Mr. Smith 11 for complimenting our patch on Outlook. 12 MR. SWIRE: You have a future in this town, 13 Bill. 14 MR. GUIDERA: That was very kind. I appreciate 15 it. 16 Let me address this in a couple of ways. First 17 off, the software is tremendously complex. I think we all 18 can acknowledge that. Creating safe, secure, privacy- 19 enhanced software is no easy task for any company, whether 20 you're the market leader like Oracle or a closely 21 following second like Microsoft in the database area. 22 It's tremendously complex stuff. 23 I work very closely with our chief security 24 officer and our product security people and they're great 25 folks who care passionately about these issues and they're 0154 1 steeped in law enforcement experience. 2 On the server side, one of the things that we're 3 seeing is a business model change in the way software 4 services are provided as we go towards an application- 5 subscription model. Where you may now see four 6 or five patches in a given time period, in a subscription 7 model this will become a natural part of your utilization 8 of the software service. So as you take your updates to 9 your operating system, your browser, and your other tools, 10 enhancements, developments, new features, and new security 11 protocols will be built in, and you'll receive them as 12 part of your utilization of the service. 13 I think that's a great, great development. I 14 mean, it minimizes the transaction cost. Again, you don't 15 have to go to the web site to find the patch, download it, 16 install it, do it correctly. It's something that the 17 provider gives to you as part of your subscription. 18 That's another way in which end costs, transaction costs 19 for the end user, are reduced by providing a superior 20 business model, and that goes to a different point. 21 We are seeing companies every single day use 22 privacy protections and security protections as a 23 competitive differentiator in the marketplace. I've got 24 to tell you, I watch those American Express ads and I just 25 think that is so cool. What a neat tool they've got, and 0155 1 they're using that as a way to differentiate themselves 2 from their competitors, and they're saying to consumers: 3 We care about your privacy and we're providing a 4 technological solution to it. 5 The market for these tools is out there. It's 6 vibrant, and we're seeing it out in the hall, we're seeing 7 it on television, and I think every company feels that 8 pressure to provide strong privacy tools as well as 9 security tools. 10 MS. LEVIN: I wanted to just make one comment 11 about how do we get consumers and parents to use these 12 tools. There is some evidence on this. When parents simply 13 buy filters, for example, or parental 14 controls at CompUSA or other retailers, the incidence of 15 using those kinds of filters has been ranging around 25 16 percent or so. But ISP's that offer parental controls are 17 finding a very large use by parents.
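The ISP-side filtering Mr. Smith suggests -- dropping packets the subscriber's PC did not initiate -- is, in essence, stateful filtering. A toy sketch of the idea in Python (a minimal state table, not a real firewall):

    # Remember which connections the subscriber's PC opened, and allow
    # inbound traffic only for those; everything unsolicited is dropped.
    class StatefulFilter:
        def __init__(self):
            self._initiated = set()  # (remote_host, remote_port) pairs

        def outbound(self, remote_host, remote_port):
            """The PC opened a connection; remember it."""
            self._initiated.add((remote_host, remote_port))

        def allow_inbound(self, remote_host, remote_port):
            """Permit inbound packets only on connections the PC started."""
            return (remote_host, remote_port) in self._initiated

    fw = StatefulFilter()
    fw.outbound("www.example.com", 80)
    assert fw.allow_inbound("www.example.com", 80)         # reply: allowed
    assert not fw.allow_inbound("attacker.example", 1337)  # unsolicited: dropped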
I think upwards 18 towards like 80 percent of households with kids are using 19 parental tools when they're made available through the 20 ISP's. I think there's a lesson there that we should 21 think about. 22 MR. SWIRE: Are there any questions from the 23 audience? There's a lot of knowledgeable people out 24 there. Any technology, what it does, what it doesn't do 25 questions, or other things that have been raised? 0156 1 Otherwise we'll continue here. Anybody? 2 Somebody's waving their hands in back. We're 3 blinded by the lights up here. Please. 4 MR. FENG: Hi, my name is Patrick Feng from -- a 5 little echo -- RPI. 6 I just wanted to ask a question based on the 7 observation that there is a tension, most evident between 8 Professor Pierce and Andrew, but one that in general seems to 9 pervade the privacy debates, and the tension seems to be 10 this: that we recognize that law can be too slow or too 11 inflexible to catch up with changes in technology. So 12 Christine said we can't make laws that are too technology 13 specific. 14 So then we talked about technical solutions and 15 in many ways these seem good because the technology is 16 flexible. But then Andrew says, but it can't address 17 baselines, like if we want fair information practices or 18 just a baseline of privacy. 19 So my question is are technical solutions only 20 good when we think of privacy as a commodity to be traded 21 against, or can technology also help if we think of 22 privacy as a basic human right, where there's a baseline 23 for everyone to enjoy? 24 MR. SHEN: Thank you for the question, and I 25 think that's a very apt observation. Obviously we're all 0157 1 in this room and a lot of people are outside in the lobby 2 talking about privacy technologies. But I don't think 3 that gets away from the central point on this privacy 4 debate, whether we should establish a standard or not. In 5 many ways, going back to Peter's original question, will 6 it be reasonable to expect many Internet users on their 7 own end to implement many of these technology solutions? 8 Well, I think part of what can inform our answer to that 9 is just the question, how many people right now read 10 privacy policies? I imagine a very, very small minority 11 take the time out to do that. 12 I expect that the very same answer is what 13 you'll find with technological solutions. A very, very 14 small minority of people are going to use a lot of these 15 tools that are available. The central question we have to 16 go back to time and time again is where we want the burden 17 to lie and whether we think we should establish a privacy 18 guarantee for Internet consumers. 19 MS. VARNEY: Peter, I think that it was a very 20 apt observation and question, and whether or not we like 21 it, I think in the United States privacy is clearly viewed 22 as a commodity and not as a human right. If we want to 23 change that standard, that's a much bigger question than 24 what kind of technology can you use to protect privacy. 25 What always strikes me about these debates is 0158 1 that I think the most comprehensive source of information 2 about any of us who consume health care is in something 3 called the Medical Information Bureau, and I'll bet there 4 are not three people in this room who know what that is. 5 But every time you go to the doctor, you submit an 6 insurance claim, you get a lab test, you go to the 7 dentist, that information gets sent to the Medical 8 Information Bureau, and it is processed and used in a number 9 of ways.
10 For me, everybody has their own privacy 11 thresholds, but my medical information is my most 12 important and sensitive information. But I can't 13 get the results of a blood test and not allow those 14 results to go to the Medical Information Bureau, and I 15 frankly think that's appalling. But that's the way it is 16 in the United States today. 17 MR. NATHAN: Craig Nathan. 18 I used to agree with Professor Pierce on the 19 notion of the Safeway Club Card, for example, where I should 20 be able to make as much money or get as much value for my 21 personal information as possible and use my card with 22 vigor, until a friend of mine noticed or pointed out that 23 Safeway could find a very high value in selling the fact 24 that I drink ten liters of beer a week to my health 25 insurance agency, who now could immediately either revoke 0159 1 or deny continued coverage for a variety of health 2 insurance reasons. 3 Which brings me to my question. At 4 that point I went: Oh my God, they can know all of these 5 things about me, and what if they decide red meat's a bad 6 thing or if they decide that I've been eating too much 7 bacon or what have you. Now that this information is out 8 and it's in this public database and they're selling it, 9 my question is what are the technology solutions, or are 10 there technology solutions -- or I guess a better way is, 11 are there any nongovernment, nonregulation solutions for 12 pulling it back, for letting me say, wait, I didn't know 13 that this was of possible use, now I want to control it 14 again? 15 MR. SWIRE: Is the question to Dick how do you 16 pull back the data once it's out there? 17 MR. NATHAN: Is it possible? 18 MR. SWIRE: I think it's hard. 19 MR. PIERCE: I don't know whether it's possible, 20 with or without the government, to pull the data back once 21 it's out there. I mean, once somebody knows -- I guess we 22 all now know about your ten liters a week and that could 23 cause you a lot of inconvenience in a lot of contexts. 24 All I'm interested in is empowering individuals 25 to make these decisions. If you buy ten liters a week of 0160 1 booze at the grocery store, I would strongly advise you 2 not to sign up for the program I signed up for, because 3 that information could get out and really hurt you, 4 because I sure as hell ain't going to insure you or employ 5 you if that's what you're doing. 6 All I wanted to do is create the environment 7 in which people have that choice and they can say, okay, I'm 8 a ten liter of beer a week person and so I want to do my 9 transacting on a site that really protects that or I'm 10 not, so I don't care and I'm perfectly happy to give all 11 my information away for a buck a week. 12 MR. SWIRE: One more question and then we'll go 13 around for a last remark from everybody. 14 MR. JAYE: Thank you. Daniel Jaye, Engage. 15 One criticism of seal programs has been perhaps 16 a lack of teeth, perhaps because today they're voluntary 17 and laws have been proposed as maybe one way to address 18 some of the deficiencies. But I was particularly 19 interested in Bill, Toby and Peter perhaps commenting on 20 the ability of technology perhaps to encourage 21 participation in seal programs.
That is, if you have 22 technology solutions like P3P, if we see consumers or 23 default preferences rewarding companies who have some sort 24 of seal program attached to their policies, would that not 25 encourage participation in seal programs and then give 0161 1 them perhaps more latitude to have tougher enforcement 2 guidelines in contracts? 3 Is there a way that technology and seal programs 4 could work more closely together as one way of remedying 5 the deficiencies of a technology-only solution? 6 MS. LEVIN: Well, I think at a minimum it seems 7 like there should be some efficiencies that result from 8 the application of technology working with seal programs, 9 and certainly with seal programs taking on the duty of 10 making sure practices are what they say they are, again, I 11 think technology can play a very important role in helping 12 through audit trails and the ability to monitor practices. 13 So I think there is definitely a very important 14 role there to help the seal program. 15 MR. SWIRE: A couple points. One is that some 16 seal programs already use what's called seeding, where 17 they put some fake addresses in. My middle 18 initial's not "Z," but if I get back a letter to Peter Z. 19 Swire then I know somebody used a list who wasn't supposed 20 to use it. So that's one technological tool to detect 21 cheating by people who get address lists. 22 More generally, how do we get belief out there 23 that people are following their policies in the future? 24 The FTC's been working really hard with the resources they 25 have, but it's a great big country. There's lots and lots 0162 1 of web sites, there's lots of offline companies doing lots 2 of deceptive things. 3 If you try to imagine scaling an economy of 260 4 million people with any kind of monitoring enforcement, 5 it's hard to imagine we're going to staff up the FTC to 6 the level where they can be looking at each web site in 7 any sort of regular way. So if we're not going to go that 8 way, even if we wanted to, then what are the institutions 9 for having monitoring, for having verifiability? 10 There was an article in The New York Times 11 earlier this week about some companies hiring third 12 parties to do some of this verifiability, and going 13 forward I suspect there will be more ways that folks check 14 on each other. Being in the government, we have the 15 privilege of having the GAO check on us on a regular basis 16 and that helps discipline us. 17 In the private sector, whoever does it, your 18 auditors or other people, seal programs, I suspect we'll 19 see more of that because this stuff's important going 20 forward. 21 Why don't I invite, again in the order that we 22 started in, a closing comment or two on anything we've had 23 -- I'm sorry, go ahead. 24 QUESTION: I had one quick comment. I enjoyed 25 the point-counterpoint between Andrew and Professor 0163 1 Pierce. 2 MR. SWIRE: I think we all did. 3 QUESTION: Christine Varney said it very well: 4 technology is important, and it's neither good nor bad, 5 but there are appropriate and inappropriate uses. I think 6 I'd say the same thing about anonymity. I don't think 7 anonymity in and of itself is a goal. I think anonymity 8 can be used appropriately and inappropriately, and that 9 hopefully will be part of the focus as we go on, because 10 this is a very difficult problem. 11 MR. SWIRE: Anything you wanted to say? 12 MS. LEVIN: Well, just to wrap up and encourage, 13 re-encourage, self-regulatory programs.
We encourage 14 technologists and, most importantly, we encourage web site 15 operators to look at COPPA, understand what its 16 requirements are, and really look to the FTC for 17 assistance in making it work. 18 MR. PIERCE: I just want to announce with great 19 pleasure that Jim Lehrer has agreed to moderate a series 20 of three debates between Andrew and I in prime time. 21 (Laughter.) 22 MR. SWIRE: That's your closing comment. 23 Bill. 24 MR. GUIDERA: A couple of things. I wanted to 25 return to the gentleman who's buying all those gallons of 0164 1 beer a week and first off paraphrase Winston Churchill, 2 who would have said: So much to do, so little time. And 3 I envy him. 4 I think that question is really a neat kind of 5 question, because it's a very unique one to today. I 6 suspect that it's an ex post analysis of what we've seen 7 in privacy handling thus far, asking how can I reclaim my 8 data because I didn't know that they might sell it. I 9 think tomorrow that question is arguably obsolete, because 10 we are learning through press interaction, through 11 competitive forces, through government regulators, that 12 privacy is a major issue and that your card service is 13 using privacy as a commodity and that there are tradeoffs 14 right there. 15 So your ex post question is becoming an ex ante 16 perspective as to what privacy means from here forward, 17 and it'll impact our incentives and what we do with those 18 cards, and we won't have to talk about reclaiming our 19 data. We'll talk instead about what data we provide to 20 the public or to a commercial operator like a grocery 21 store. 22 The second point. I want to return quickly to 23 what Richard Smith said about educating developers. 24 Software developers are certainly not for the most part 25 lawyers or politicians or public policy analysts. Yet 0165 1 their world is being infused by our concerns. They're 2 technologists and they are learning that the tools they 3 make and how they impact privacy and security have 4 tremendous impact to the end user and to their employer. 5 Our company is like many, many others. We now 6 have a director of privacy for the entire corporation. 7 Richard Purcell is his name. He's a super, super fellow, 8 and he works across the corporation implementing privacy 9 considerations into all of our products, all our services, 10 and that stuff matters a lot and people listen to him very 11 closely and he's playing that role of educating developers 12 as well as executives within the corporation. 13 Another point, back to the prior comment. This 14 is such a dynamic area, things are changing so quickly. 15 Companies are learning a ton. Technological developers 16 realize that what they do matters to presidential debates, 17 to public policy formats, to lawyers, to litigation. I 18 think we're seeing tremendous, tremendous change, much of 19 it being driven by market incentives. 20 MR. SMITH: I just want to follow that really 21 quick, that, yes, lawyers and programmers are now getting 22 to talk a lot more than they used to, and that's basically 23 because the Internet does communications and that's where 24 we get into security and privacy concerns. 25 Real quick, in terms of privacy, I'm not sure I 0166 1 want to use the word "solutions," but I see them coming 2 from a lot of different directions. They're all 3 important, technology being one, legislative being 4 another, social pressure and the competitive nature of the 5 market. I think they're all important and all play a role 6 here. 
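The list "seeding" Mr. Swire described a few minutes ago is simple to illustrate. A minimal sketch in Python, with hypothetical addresses; each partner who receives the mailing list gets a unique decoy planted in it, so mail arriving at a decoy identifies who misused the list:

    # Each partner's copy of the mailing list carries a unique decoy
    # address; mail arriving at a decoy names the leaking partner.
    SEEDS = {
        "peter.z.swire@example.com": "partner-a",  # the fake middle initial
        "peter.q.swire@example.com": "partner-b",
    }

    def list_for(partner, real_list):
        """Hand a partner the real list plus that partner's own seed."""
        seed = next(addr for addr, p in SEEDS.items() if p == partner)
        return real_list + [seed]

    def trace_leak(received_at):
        """If mail shows up at a seed address, name the partner who leaked."""
        return SEEDS.get(received_at)

    shared = list_for("partner-a", ["alice@example.com"])
    print(trace_leak("peter.z.swire@example.com"))  # "partner-a"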
7 MR. SHEN: I'd like to return to a more general 8 point that's been brought up a couple of times already: 9 What exactly is privacy? Christine, Professor Pierce, 10 people in the audience, and I'm sure many other people 11 share the idea that privacy is a commodity, that our 12 personal data is something that's put up for sale. In 13 contrast, there are a lot of groups like mine that believe 14 that privacy is a human right. 15 But rather than going into a long exegesis about 16 what is a right, I think the best way to answer that is 17 privacy is just plain common sense. When you talk to a 18 consumer -- trying to get away from the 19 technical terminology about opt-in or opt-out that pervades 20 this field -- and you ask, would you like me to 21 ask permission before I use your personal data, they're 22 all going to say yes. When you go up to a consumer and 23 say, would you like to be able to access the records that 24 we have on you and your behavior, they're going to say 25 yes. 0167 1 To a lot of people that sounds completely 2 reasonable and common sense, and I think a good way to 3 serve consumers. 4 MS. VARNEY: At the end of the day, technology 5 can provide for the unlimited exploitation of data without 6 anybody's notice and consent. Technology can also provide 7 for the complete protection of data and individuals' own 8 data. So while we can continue to have the dialogue about 9 what is the right societal value and what is the right 10 role of government, I think we have to acknowledge that 11 the technologists are going to beat us to the punch line. 12 To the extent that our dialogues about societal 13 values and government action can help inform the 14 technologists, that's useful. But to the extent that they 15 inhibit them, that's not useful. 16 MR. SWIRE: To wrap up, the panel today has 17 tried to answer some questions about what technology will 18 achieve in the privacy area, and the demonstrations out in 19 the hall and many of you in the audience are trying to 20 figure out what can be delivered by technology. Going 21 forward, it's clear that society is demanding privacy 22 protections as part of the Internet future. It's also 23 clear that we're demanding a strong e-commerce, which is 24 the source of so much growth and other developments of our 25 society. 0168 1 So I thank you for participating in the event 2 today and I thank our panel for helping us continue to 3 explore how technology comes in here and how we meet all 4 these goals. So thanks to the panel. 5 MS. LEVY: Thank you, Peter. 6 (Applause.) 7 MS. LEVY: For our final session today we're 8 going to try something different. We're going to actually 9 bring up each of our technology demonstrators onto 10 the stage now. So we're going to ask all 19 of them to 11 come up and we're going to have a question and answer 12 session that will be led by Tim Lordan, who is Staff 13 Counsel at the Internet Education Foundation. So I'm 14 going to ask everyone to come up to the stage now. 15 (Pause.) 16 Q AND A WITH TECHNOLOGY FAIR DEMONSTRATORS 17 MR. LORDAN: Thank you. My name is Tim Lordan 18 and I work for the Internet Education Foundation. We 19 helped coordinate the 19 demonstrators you see on the 20 panel table here, and I want to thank all of them for 21 going through a lot of effort to get here to show their 22 technologies.
Often we talk about policy in a vacuum and 23 I think it's very important that folks got together and 24 actually showed a lot of the technologies that we often 25 talk about. 0169 1 But first I wanted to thank the Secretary of 2 Commerce and the Assistant Secretary Rohde for showing 3 leadership and vision in actually allowing us to pull all 4 these demonstrators into this technology event. As far as 5 I can tell, it really hadn't been done in any other forum 6 before, kind of a comprehensive exposition of this type. 7 So I think it's truly visionary, and I want to thank the 8 Assistant Secretary and the folks in the Department of 9 Commerce, NTIA -- Wendy, Sally Ann, Sandra, and everybody 10 else. 11 I'd like to also thank the folks that are on the 12 podium. If I can just quickly introduce folks: from 13 American Express we have Peggy Haney; Anonymizer.com, we 14 have Adams Douglas; Andrew Weinstein from America Online; 15 Jonathan Zuck from Association for Competitive Technology; 16 Dave Marvitz from Disappearing Inc.; and from Encirq we 17 have Sherry Serokin. 18 I believe from IBM we have Martin Presler- 19 Marshall; Mike Schwartz from ID-Vault. From ILumin we 20 have Brant Anderson; Incogno, Joe MacIsaac; Ruben Cohen 21 from IPrivacy; and of course we have Bill Guidera back 22 from Microsoft; Dr. Steve Lukers, Privacy Persona; Scott 23 Beechuk from PrivacyRight; Glee Cady, of course, from 24 Privada; and David Zimmerman from YOUPowered, following 25 out with Stephanie Perrin from Zero Knowledge Systems. 0170 1 I'm actually hoping that I won't really have to 2 say anything more. I was hoping that panelists, 3 attendees, could ask some questions. I hope you have some 4 good questions fired up. If you don't, I'm happy to ask a 5 few questions. But even if you've asked questions for the 6 folks already in the lobby, I may ask you to reiterate 7 them just to share with people who may have not been part 8 of that conversation you might have had with the 9 technologists. 10 So anybody, any questions I have from the 11 audience? Please, don't make me work too hard. 12 (Pause.) 13 MS. WOODARD: My name is Gwendolyn Woodard. 14 I would like to know how can we as a society 15 develop policy and procedure that would span the seven 16 continents to allow the Internet to be a self-regulated 17 entity which would allow the entire world to prosper? 18 MR. LORDAN: That's a pretty wide-open question. 19 Anybody want to field that one from the panel? Peggy? 20 MS. HANEY: Hi. Peggy Haney. 21 I'm not sure I have a very good answer to that. 22 I think that's one thing we're all looking at and 23 struggling with. But I think some of the technology 24 examples that are being demonstrated out in the lobby are 25 a way for consumers to be more empowered in terms of 0171 1 controlling the amount of information that is provided 2 online. 3 I also think that by encouraging companies to 4 have comprehensive privacy policies which they are by law 5 required to enforce because it would be fraud if they 6 didn't, I think that's a huge step in making that happen. 7 MR. LORDAN: Stephanie. 8 MS. PERRIN: I think it's a great question. I 9 think it's one of the critical questions facing us in the 10 next century. One of the problems with solutions that are 11 developed in North America is we have a democracy. We 12 complain that it doesn't function perfectly, but it does 13 function here. 
So we can afford to have choices and 14 preferences, and we can depend on an educated population 15 to a certain extent to make those choices and function 16 under the rule of law. 17 The reason -- one of the reasons we at Zero 18 Knowledge believe that the default's got to be set on 19 privacy and strong privacy is that in many, many nations 20 the people don't have those rights, and if we build 21 solutions here that only work in a liberal democracy we 22 are enslaving the rest of the world with the results of 23 the architecture we build. 24 MR. LORDAN: Let me just ask one question of the 25 panel. It's interesting that we actually got a chance to 0172 1 see some of the names and faces associated with the 2 technologies. An interesting point was raised on the last 3 panel about trust, server side technologies versus client 4 side technologies, and now that we actually have seen you 5 and you're all at the panel table, at least for those 6 folks that are doing server side technologies, why should 7 we trust you? Why should a consumer trust you with their 8 data? 9 VOICE: I don't know that they should. I think 10 that we are, with a couple of exceptions -- clearly IBM, 11 American Express, Microsoft up here -- there's a lot of 12 startups. There's a lot of new companies who've just been 13 in existence for a short period of time. As we've all 14 seen from even the Toysmart situation, good privacy 15 policies, strong parentage, good sponsorship, does not 16 guarantee success, and when that success is out of the 17 grasp then you have people going and being required to 18 sell that information by law. 19 So it is not clear to me that there should be 20 the immediate assessment that, because somebody puts the 21 word "privacy" or "private" within the name of their 22 company, that all of a sudden they should be deserving a 23 level of trust. 24 For that reason, the way that we've tried to 25 build our service is that you don't want to continue the 0173 1 proliferation of information going out onto the web, 2 whether it's with a third party or whether it's with a new 3 company that you're trying to entrust, that you should be 4 able to deal with people that you already know and that 5 you already have relationships with, and you should not 6 have to go and extend trust in order to try to buy 7 privacy. 8 MR. LORDAN: Sherry from Encirq, the panel 9 talked a little bit about your technology being client 10 side. Do you see that being a differentiator in this 11 area? 12 MS. SEROKIN: Well, I think client side helps 13 from the technology architecture side to protect against 14 some of the potential for data leakage or hacking or 15 people getting access to data that's kept on the server 16 side. But I think trust is as much an issue for client 17 side technology as it is for server side technology, 18 because how does the average consumer know that the data 19 never leaves their own computer? 20 Ultimately all business relationships come down 21 to a level of trust, so all of us have to be engaged in 22 explaining what we do, what we don't do, and keeping our 23 word at that, and there probably does need to be more 24 legislative baseline on sort of what's acceptable and what 25 you can have enforced. Not all companies are going to be 0174 1 good guys. Not everybody's going to do the right thing, 2 and therefore there does have to be some kind of 3 enforcement side, no matter whether you're client side or 4 server side technology. 5 MR. LORDAN: Thank you. 
6 Many of the technologies that are on display 7 here are really consumer-focused and they're for 8 consumers, which is a very good thing as far as I'm 9 concerned. One person noted to me today that there was a 10 concern that there weren't enough B2B technologies being 11 displayed here today and that perhaps B2B technologies, at 12 least when it comes to privacy, will probably be leading 13 the way and that's where the development is really going 14 to occur. 15 Does anybody have any comments on that 16 particular aspect? 17 MR. SCHWARTZ: Hi, I'm Mike Schwartz from ID- 18 Vault. 19 I'd like to say that I think privacy is a big 20 area and when we first looked at security five years ago 21 we discovered that security was more than just firewalls, 22 that there's intrusion detection, there's firewalls, 23 there's VPN's, there's PKI. I think privacy is in the 24 same state right now, where we like to simplify the 25 market, but in reality there's a lot of tough issues and 0175 1 tough challenges to deal with. 2 In the B2B space, I think that we need to 3 realize that there are legitimate and necessary 4 disclosures of personal information. For example, your 5 bank, they know who you are and in certain circumstances 6 they need to disclose that information, or your health 7 care information; your health plan needs to get paid or 8 your health care provider needs to get paid by your health 9 plan. 10 So there's a lot of areas in the B2B space where 11 disclosures are necessary and better tools are needed to 12 manage the transfer of information, so that the transfer 13 can happen but people's privacy rights can be respected in 14 the process. 15 MR. LORDAN: Martin from IBM. 16 MR. PRESLER-MARSHALL: Would you like a general 17 answer or would you like a product advertisement? 18 MR. LORDAN: I think I'd like a general. 19 MR. PRESLER-MARSHALL: The product advertisement 20 is: Stop by my booth and I'll talk to you about it. 21 The general answer is that this is definitely a 22 space where new products are being developed and we will 23 be seeing them in the next several years. If a web site is 24 going to make a representation, these are the uses we will 25 make of your data and these are the organizations we will 0176 1 share it with, then the web site's got to be able to live 2 up to that promise. 3 It's one thing to try to say let's do that 4 manually. It's far more reasonable for an organization to 5 want to say: Okay, let's have some tools to help enforce 6 that. I think you'll be seeing those coming out in the 7 next couple of years, because it's just not realistic to 8 do it without them. 9 MR. LORDAN: Yes. 10 MR. ZIMMERMAN: David Zimmerman from YOUPowered. 11 I'd just like to mention a plug for an emerging 12 standard. The CP Exchange Standard, which will be coming 13 out later this year, is a standard for the exchange of 14 information on the B2B side that I think has the potential 15 to have the same sort of effect that P3P has had on the 16 B2C space in terms of laying out a protocol that really 17 enables people to develop B2B type privacy technologies. 18 I think that'll be real progress on that end. 19 MR. LORDAN: One more. Scott Beechuk. 20 MR. BEECHUK: I would agree with that comment. 21 CP Exchange is actually built upon XML. We at 22 PrivacyRight have one particular client in a very B2B 23 space. It's actually a purchasing system where millions 24 of pounds of raw steel are purchased in an auction system 25 by multiple businesses.
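The machine-readable policies that Mr. Presler-Marshall and Mr. Zimmerman point to can be sketched briefly. A minimal illustration in Python; the XML shape here is invented for the example and is not the actual P3P or CP Exchange schema:

    # Check a proposed data transfer against a posted, machine-readable
    # policy: the purpose of use and the receiving organization must
    # both be covered. The schema is illustrative only.
    import xml.etree.ElementTree as ET

    POLICY_XML = """
    <policy>
      <purpose>order-fulfillment</purpose>
      <purpose>customer-service</purpose>
      <recipient>shipping-partner</recipient>
    </policy>
    """

    def transfer_allowed(policy_xml, purpose, recipient):
        root = ET.fromstring(policy_xml)
        purposes = {p.text for p in root.findall("purpose")}
        recipients = {r.text for r in root.findall("recipient")}
        return purpose in purposes and recipient in recipients

    assert transfer_allowed(POLICY_XML, "order-fulfillment", "shipping-partner")
    assert not transfer_allowed(POLICY_XML, "marketing", "ad-network")

Tools of the kind he describes would sit behind an organization's data flows and refuse transfers the posted policy does not cover.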
So the question becomes what is 0177 1 the significant data that we're trying to protect? 2 Is it just personal information, such as health records 3 and insurance records? Or is it actually the fact that 4 those particular businesses that are buying those raw 5 materials may not want their competitors or other market 6 analysts to know that they're actually in that business? 7 Maybe they're developing a prototype. 8 That's one of the reasons why we came into play 9 as consultants to that B2B organization. So I think it is 10 very relevant even if you're not talking about personally 11 identifiable information. 12 MR. LORDAN: From ILumin? 13 MR. ANDERSON: I'm Brant Anderson. We offer a 14 digital handshake, which is a way to do documents on line 15 as though you were passing them back and forth, to have 16 signatures that are binding but can be done over the web. 17 We have an interesting application that you brought to 18 mind, where we're working with the Air Force, which is 19 repairing airplanes that warring factions have bought from 20 the United States; they both send them back to us to 21 repair. They'd rather not have each other know when their 22 aircraft are down, so we provide ways that information can 23 flow cleanly back and forth between people who need to 24 know it, while others who don't need to know it, and don't 25 have the authorization, don't have access to it. 0178 1 We use digital signatures with PKI to do that in 2 much less dramatic settings, but it has the same effect, 3 that an individual or an individual company can, with 4 trust, put information out, receive information back, and 5 even do it at a very granular level, so that data within 6 transactions that ought to go to some people is not 7 available to others. 8 MR. LORDAN: I'll interpose a question that the 9 last panel really addressed, but in this context, with 10 regard to standards. I think a lot of folks have been 11 grappling on this panel with how do we set the defaults 12 for some of these technologies and where is the baseline, 13 what are the standards for -- well, are they fair 14 information practices? Where do we set the defaults on 15 the cookie browsers and the cookie managers? 16 I'd like to ask Jonathan and Bill Guidera from 17 Microsoft and then go to Dr. Lukers on, when there isn't 18 legislation or regulation per se on the books to help 19 implement the technologies, how do we make those 20 decisions? 21 MR. ZUCK: It is very difficult to define what 22 the default should be or some sort of minimum standard, 23 because so much of this is a negotiation between vendors 24 and consumers about information they're willing to share 25 in return for services or discounts and things like that 0179 1 they're trying to receive. 2 So I think it's important not to make a 3 presumption that everyone wants to be warned every time 4 there's a cookie, for example. But instead, I think that 5 our goal has got to be to better educate people about the 6 choices that they make and turn them into proactive actors 7 in that choice. 8 One of the things that came up in the last panel 9 that I found very frustrating is, well, we can see that 10 technology is not working because of the people that don't 11 read privacy policies. Well, if they don't read privacy 12 policies then it's not as important to them as we all like 13 to think that it is. If it is, they will take proactive 14 steps.
15 Even if we talk about privacy as a right, every 16 right that we have in the United States is something that 17 we take individual responsibility for enforcing and 18 practicing. So I think that we need at some level to 19 empower and educate consumers instead of spending so much 20 time trying to protect them from themselves. 21 MR. GUIDERA: I don't think we have an easy 22 answer for that. Our cookie tool, integrated into the 23 browser, has the default set to prompt when a third 24 party cookie appears on the screen. But that's hardly a 25 permanent default setting. Once the screen appears and 0180 1 informs you who's placing the cookie, it gives you an idea 2 as to what its functionality is. But it immediately gives 3 you an opportunity to change the default setting. 4 So if you are experiencing these windows that 5 appear repeatedly saying that you've got a third party 6 cookie appearing -- for instance, on some of MSN's sites 7 you can see three, four, five of these screens show up at 8 once -- you have an opportunity, if you're tired of seeing 9 that, to change the default interactively right there in 10 your browser from that point forward. 11 MR. LORDAN: Dr. Lukers, then Joe, and back to 12 Lorrie. 13 DR. LUKERS: I think when we talk about defaults, 14 people look at it from two perspectives. If we're 15 talking about the defaults that exist in terms of a 16 baseline that was discussed in the last panel, baseline 17 legislation, I think that there probably is a need for 18 some default legislation that deals with simply the issue 19 of requiring fair information practices to be posted by 20 companies. 21 If we're talking about defaults that are set by 22 the consumer in terms of what permissions they want to 23 assign to their data, I agree with what everyone else has 24 said, and that is that this is individual, based on what 25 consumers feel, the level of confidence they have in the 0181 1 site. 2 I think that we have a tendency to look at 3 privacy sometimes as one-dimensional. I think it's at 4 least two-dimensional. It's based, I think from the 5 consumer perspective, on the degree of confidence they 6 have in a site respecting their privacy practices, as well 7 as the site adhering to the consumer's privacy preferences 8 and the degree of value that the consumer feels they're 9 going to get in return for going to that site. 10 There was a site out there a little while ago 11 called FreePC. You all remember that, where over a 12 million consumers went to that site and gave out the most 13 intimate details about their lives for a chance at winning 14 a PC valued at under $500. I think that tells us that 15 consumers are interested in exchanging their information 16 for value. 17 I think that recent studies have shown that 18 consumers are willing to give out a lot of information for 19 the value of personalization. So the way I look at it is 20 the default should always be in essence off until the 21 consumer decides how comfortable they are in going to a 22 site and providing information. But, having said that, I 23 think that the only really valid privacy policy out there 24 is really one that's created by the consumer based on 25 their experience and based on their comfort in going and 0182 1 exchanging information with the site. 2 MR. LORDAN: Joe MacIsaac. 3 MR. MacISAAC: Hi. Thank you. 4 I think that policies are kind of limited in 5 what they can achieve for the big picture.
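Mr. Guidera's third-party-cookie prompt reduces to a domain comparison plus a user-changeable default. A minimal sketch in Python; the domain matching is simplified (no public-suffix handling) and the setting names are hypothetical:

    # A cookie is "third party" when it is set for a domain other than
    # the page being viewed; the default action is to prompt, and the
    # user can change that default at any time.
    settings = {"third_party_cookie_action": "prompt"}

    def is_third_party(page_host, cookie_domain):
        d = cookie_domain.lstrip(".")
        return not (page_host == d or page_host.endswith("." + d))

    def on_set_cookie(page_host, cookie_domain):
        if not is_third_party(page_host, cookie_domain):
            return "accept"
        return settings["third_party_cookie_action"]  # prompt/accept/block

    print(on_set_cookie("www.msn.com", ".msn.com"))         # first party
    print(on_set_cookie("www.msn.com", "ads.example.com"))  # prompt
    settings["third_party_cookie_action"] = "block"         # user opts out
    print(on_set_cookie("www.msn.com", "ads.example.com"))  # now blocked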
The big 6 problem that everyone understands is that in order to buy 7 online you have to give the merchant your credit card, 8 your address, and your name in order to make a purchase. 9 This requires trust, and it's not the kind of trust you 10 need at retail, where you can just walk up: Here's my 11 money, here's the product; okay, see you later. 12 So technological advances are going to bring 13 about that fundamental shift in the need for trust or the 14 removal of the need for trust, because when people 15 understand that technologically their disclosure is 16 limited to the parties that they want it to be limited to, 17 they will walk away saying: I understand that this is 18 acceptable and I don't need trust; I can buy as with cash. 19 So I just want to -- I think that privacy 20 policies are not going to prove ultimately to be the only 21 answer. They will always exist, but that technology has 22 got to come about. 23 For plug purposes, what we do is we recognize 24 the problem as this need to disclose your personal 25 information to the merchant because of the way the order 0183 1 processing system is written, and if you decouple that you 2 get the opportunity with more than one computer system to 3 control disclosure in the manner that we all know with 4 encryption technology for the appropriate participants. 5 This will give consumers and buyers in the B2B space what 6 they need to feel like the technology supports their 7 concerns. 8 MR. LORDAN: Thank you. 9 I'd like to go to Lorrie and then a couple 10 questions from the audience. 11 DR. CRANOR: On the question of defaults, in the 12 work that we've done with P3P there are sort of two 13 answers that we've been looking at. One is that we need 14 to find ways to make the configuration of these tools 15 very, very easy. When you look at all the different 16 parameters, it's a really difficult problem. At AT&T 17 we've taken a first pass at trying to break that down into 18 basically a one-page questionnaire. That's what you see 19 out in the tool that we're demo-ing. 20 We know we need to do more work. We know we 21 need to do user testing on it. But that's the beginning 22 and we hope to get this down to something that will be 23 really easy for users to approach. 24 The second part of it is that we think that 25 there should be easy ways for consumers to go to people 0184 1 they trust and get a set of settings and put them in, so I 2 don't have to say, well, I bought this from this software 3 vendor and so I have to use their defaults. There should 4 be a way that I could go to some of the organizations we 5 talked about before, either the Better Business Bureau or 6 the ACLU or the bishops or whoever, and get this set of 7 settings and plug it in. 8 As part of P3P we have a language called APPEL, 9 for A P3P Preference Exchange Language, which is designed 10 to allow different organizations to create these settings 11 in a standard format that should be able to be plugged 12 into any P3P product that supports APPEL, and so we're 13 really encouraging the implementers to support APPEL so 14 that we can allow consumers to do this. 15 MR. LORDAN: Thank you. 16 I'd like to go to the audience for just a couple 17 questions and then wrap it up. 
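(To make the preference-exchange idea Dr. Cranor describes concrete, here is a minimal sketch, in Python, of a user agent importing a trusted organization's ruleset and checking a site's policy against it. The dict-based rule format is an invented stand-in -- actual APPEL rules are XML expressions over the P3P vocabulary -- though the "request," "limited," and "block" behaviors and the first-match evaluation follow the spirit of the APPEL draft.)

    # Illustrative sketch only. Real APPEL rulesets are XML documents that
    # match against P3P policy elements; plain dicts stand in for both here.

    # A ruleset a consumer might download from an organization they trust.
    # Each rule names the behavior to take when all of its conditions
    # appear in the site's declared policy.
    TRUSTED_RULESET = [
        {"behavior": "block",   "match": {"purpose": "telemarketing"}},
        {"behavior": "limited", "match": {"recipient": "unrelated-parties"}},
        {"behavior": "request", "match": {}},  # empty match: fall-through
    ]

    def evaluate(site_policy, ruleset):
        """Return the behavior of the first rule whose conditions all hold
        in the site's policy (first-match semantics)."""
        for rule in ruleset:
            if all(site_policy.get(k) == v for k, v in rule["match"].items()):
                return rule["behavior"]
        return "block"  # nothing matched: fail closed

    # A site's P3P policy, reduced to key/value form for the sketch.
    policy = {"purpose": "telemarketing", "recipient": "ours"}
    print(evaluate(policy, TRUSTED_RULESET))  # -> "block"

The point of the standard format is the swap: replace TRUSTED_RULESET with a file published by the Better Business Bureau, the ACLU, or whomever the consumer trusts, and the same user agent enforces a different set of preferences.)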
QUESTION: As a panel of technologists and technology companies who have all very recently jumped into the emerging privacy marketplace, I'm wondering what you think the impact of potential legislation, or the legislation you see being drafted at the federal or state level, will be on the development of privacy products generally, and on some of your individual companies.

VOICE: There are laws against stealing, and yet people still lock their doors. I don't believe that baseline legislation could do anything but help raise the awareness and the level of discourse around what is going on, while still providing consumers with the flexibility to make the informed choices they want to make about how much information to allow out. So I don't look at it as a threat. I look at it as in fact an opportunity for all of us.

MR. LORDAN: Stephanie.

MS. PERRIN: I couldn't agree more. When the legislation is passed, companies are going to have to figure out how they're going to be held accountable and how they're going to implement it, often through very large organizations and multiple interfaces.

One point I'd like to throw in here about the B2B space, besides the usual commercial to come and visit our booth, is that most security attacks have come from within -- 50 to 80 percent, depending upon who you read. As long as you've got personal information held within the organization, even if that's consistent with law and policy, you're only one disgruntled employee away from a major privacy Chernobyl, given the way information can hit the Internet nowadays.

So all privacy legislation pays some attention to safeguards, and you're required to keep things confidential. The technologies of all the members here try in some way to meet that obligation, and I think the more we think about it, and the more we realize the liability that's coming, the more we're going to be heading toward less personal information, particularly when we can do personalization based on pseudonyms.

MR. LORDAN: Peggy Haney from American Express.

MS. HANEY: I have to come down on the side of putting as much power and control as you can into the hands of the consumer. I haven't made my plug for Private Payments; I notice some of you have come by. But certainly, with Private Payments, and with the private browsing that we'll be introducing later this year with Privada, we really believe that the Internet is not the wild, wild West it has been described to be -- at least not as much as it used to be -- because of technology solutions like these that give the consumer the control and power to protect their privacy.

Private Payments provides a randomized, unique number linked to your American Express card, so your card number does not go on line. It's one step, alongside the other companies sitting here, toward putting power in the hands of the consumer. So I'm not sure legislation will ever protect consumers the way those who are asking for it believe it will in the end.

MR. LORDAN: I need to go to the audience for questions, but I'm hoping Adams or Ron or Glee can answer it.
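(Ms. Haney's description of Private Payments -- a randomized number that stands in for the real card number -- is essentially a single-use surrogate number scheme. The sketch below shows the idea from a hypothetical issuer's side; the class name and the sample account number are invented for illustration, and a real service would add expiry, amount limits, and proper card-number check digits.)

    import secrets

    # Hypothetical issuer-side sketch of a single-use surrogate card number.
    # The merchant only ever sees the surrogate; the issuer maps it back to
    # the real account at settlement and then retires it.
    class SurrogateNumberService:
        def __init__(self):
            self._issued = {}  # surrogate number -> real account number

        def issue(self, real_account):
            """Generate a random one-time number linked to the account."""
            surrogate = "".join(str(secrets.randbelow(10)) for _ in range(15))
            self._issued[surrogate] = real_account
            return surrogate

        def settle(self, surrogate):
            """Resolve the surrogate once; an unknown or reused number
            raises KeyError, so a leaked merchant database is worthless."""
            return self._issued.pop(surrogate)

    svc = SurrogateNumberService()
    one_time = svc.issue("371449635398431")  # fake test-style account number
    print(one_time)                          # what the merchant sees on line
    svc.settle(one_time)                     # issuer charges the real account

This is also the shape of the Amazon example Mr. Marvitz raises below: the merchant keeps only a number it cannot reuse, not the durable secret itself.)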
QUESTION: I don't know if any of those people can answer it, but I just want to solidify or restate the question that I asked the previous panel, because this panel might be a better one to answer it. What happens when I give information, whether it's to an infomediary or some third party, and I approve that information to go out, and then I want to choose otherwise -- because I didn't understand, or because the company violated the agreement, or because there's some new use of that information, as in the Safeway example where I bought a gazillion beers? A, what technologies exist? And B, should I have the right to pull back my information from these people?

MR. LORDAN: Right, that question was asked of the last panel, and it is probably more appropriate for this panel. So are there any technologies that can pull the data back?

MR. DOUGLAS: Adams Douglas, Anonymizer.com. There are technologies to hold that data back; one of them is ours. There's the plug. I'd also like to comment, relative to the previous set of answers, that there is a limit to how much can be kept under control once the data is out there. Any technology is vulnerable to attack or to failure. Any person's trust can be violated or fail.

At some point there has to be a point where the information doesn't get out there, and that can be a point of decision by the consumer. But some consumers, besides being enthusiastic about the Internet and also a little naive, can make the mistake of giving out too much information. And once that happens, the genie can't be put back in the bottle.

Once your information is out there, it won't ever go back, no matter what policies or technology are applied. That's one of the keys to protecting privacy, I feel: making sure certain information never gets out at all.

MR. LORDAN: I think we're about to wrap up with some closing comments. Dave.

MR. MARVITZ: I'm Dave Marvitz from Disappearing Ink, and the question you bring up is exactly on point for what we're trying to address, or at least close. What we do is make e-mail transient: you send someone an e-mail message and it expires some time later, at a time I can set. So I can say this e-mail is only good for a week, and then all copies go away.

That still assumes cooperating parties. The question of noncooperating parties -- I want to give you some data and retain the right to pull it back -- is a classic content control issue that, for example, the record companies are struggling with. They want to be able to do that; Divx was an attempt, and it was an utter and total disaster.

So I think you'll see solutions like ours and others coming out in the near future that allow cooperating parties to pull data back. One example might be a slightly different way of addressing the problem the Amex folks are addressing. Amazon doesn't really want my credit card number on file -- not really -- because if they get hacked it gets exposed, like that CD company that had all those problems. I'd like to give them my credit card number, let them do their thing with it, and then pull it back so it's not always sitting on their site; just a pointer, or something that I can control, stays on their site. Everybody is happier because everybody is safer. The data is not really out there in any permanent kind of way.

So anyhow, just to summarize, you're going to see baby steps in that direction that involve cooperating parties. Then over time I think you'll get tighter and tighter control on the data. You can sort of let the leash out further and further.

MR. LORDAN: Thank you very much. I'm afraid that's all we have time for right now. If the panelists could just stay seated for some closing comments: one moment for the Assistant Secretary of Commerce, Greg Rohde.

CLOSING REMARKS - GREGORY L. ROHDE

MR. ROHDE: Thank you very much, Tim.

I want to again express my deep gratitude to the Internet Education Foundation for co-sponsoring this event with us. I also want to say to all of you hardy souls who have endured this entire day: congratulations. The good news is that there is food and drink waiting for you as soon as I'm done speaking, which will only take a few minutes. So I really want to thank all of you for your participation.

Also, for the neophytes in the audience who may not fully appreciate what an impressive assembly we have here in our moderators and panelists: please join me in thanking this impressive group of folks, who have traveled from around the country to join us today.

(Applause.)

So thank all of you for being here.

I'm a civil servant, and I'm also a political civil servant: I am appointed by the President, confirmed by the U.S. Senate, and I work for the public here in this building. So as a political civil servant, I expect that somebody is reading my e-mail and monitoring everything that goes on in my computer. On a regular basis the Inspector General, the General Accounting Office, or some Congressional oversight committee will subpoena everything that goes on on my computer and look at it. Whether it's a fishing expedition or something else they're looking into, I don't know. But I expect this to happen in my job.

But most people, most private citizens, don't expect that to occur on their computers, and they shouldn't have to expect it. That's what this whole issue is about: how are we going to protect privacy?

I come from a relatively small town, Bismarck, North Dakota. For those of you who are foggy on your high school geography, North Dakota is a state, one of the 50 states. And Bismarck is not just a battleship or a German chancellor; it's actually a state capital. But it's a relatively small town.

When I grew up, my mother was a nurse at a hospital and my father was a pharmacist, and between the two of them they carried a great deal of medical information in their heads. All the people who would bring my father a prescription to fill, or go to the hospital and see my mother -- I knew that the two of them knew an awful lot about our community. It's a very small town.

But the fact is that the people in our community had a lot of confidence in my parents. They knew that when they got a prescription from my father, it didn't go any further than that, and they were pretty confident of that.

We live in a very different world now.
It used to be the case that our medical information, our financial information, or whatever data existed about us was stored in a file somewhere -- locked in a file cabinet, put in a secure building -- and we knew who had access to that information. We had a lot of confidence in that as a society, and we felt pretty good about it.

We now live in a very different world, because that information is put on a computer network and is accessible through the Internet. It's much more vulnerable than it used to be, because we've become an electronic society. That's the challenge we're faced with: exactly how do we keep up with this changing technology?

Now, over 30 million people worldwide are accessing the Internet, and that number is growing exponentially. Just short of half of all Internet users are in the United States, and we are becoming more and more dependent on electronic commerce. It's exciting, it's wonderful, it's doing tremendous things for how we conduct business, how we learn, and how we get our health care. It's terrific.

But as exciting as it is, it's equally challenging for us to address these issues. That's what this conference was all about today. We heard a great deal throughout the day from the panelists, and from the information that was shared, about how technologies are going to address people's various interests in protecting their own privacy. We heard a lot of questions pursued about how we can educate consumers about these various technologies.

I tell you, I've been very impressed. I spent some time out in the foyer with many of your companies and saw some of the demonstrations, and the level of sophistication in these technologies really impressed me, as did the flexibility many of them offer to help consumers personalize the protections they want.

We've also asked a lot of questions about the appropriate role of government. How do these new technologies fit into whatever legislative or regulatory environment may or may not exist? What's the role of third parties in verifying all of these protections we've been talking about?

So we've pursued a lot of very interesting questions here today. I think one thing that is true is that nobody here would claim that any particular technology, or these technologies taken together, are the silver bullet. Technology is not, by itself, going to be the solution. It is indeed part of the solution, and a very important part.

That's why NTIA felt it was important to hold a workshop that looked specifically at the role of technologies. We have a lot of debates about computer privacy and online privacy, and they're going to continue. This is going to continue to be a hot issue in government and in the private sector, and we're going to look at it from a lot of different angles. But looking at the role of technology is very, very important.

We're fortunate that today's workshop has been webcast in RealAudio, so people who were not able to participate here have been able to access this workshop through NTIA's web site. We hope that today is not the end of anything. We hope it's the beginning of a greater dialogue. NTIA very much wants to continue this dialogue.

We're very grateful for the expertise and the input all of you have brought today. We hope that you will be in touch with us and continue to provide us comments.

Anybody can keep track of what we're doing at NTIA, and give us feedback, on our web site, which is www.ntia.doc.gov. Just so you know, for those of you who access the web site, we provide no personally identifying information to the Republican National Committee for fundraising purposes. Now, not every web site will give you that guarantee, but we will.

(Laughter.)

Sorry, it's a political joke. I probably shouldn't say that.

Before we close, I need to take the time to thank some people who have worked extremely hard on this conference. First of all, I need to thank the staff at NTIA, who really do a terrific job. One of the great things about being Assistant Secretary is that you are basically put in a job where you take credit for the work everybody else does. That's one of the things I've learned at NTIA: the bulk of the work is actually done by the hard-working civil servants in the agency.

In particular, Kelly Levy, Becky Burr, Wendy Lader, Sally Ann Fortunato, Sandra Laousis, Judy Kilpatrick, Dan Davis, and Hershel Gelman have just done a terrific job for us. We also had a couple of interns who worked on this forum, John Carls and Gilian Crawford, who were with us throughout the summer and participated in this.

Without any further ado, I know that if you've been here all day you're probably hungry, and the last thing in the world I want to do is stand between people and food. So I would really encourage all of you to stay here for a while, enjoy the hospitality in the front foyer, and see some of the demonstrations. Once again, I want to thank the Internet Education Foundation for co-sponsoring this event. We're very grateful. And thank all of you for coming. I really appreciate it. Thanks a lot.

(Applause and, at 3:56 p.m., end of workshop.)