Student Projects:Privacy Internet:Rough Draft


ONLINE PROFILING AND THE RIGHT TO PRIVACY
Class Term Paper, CSE P590TU
December 3, 2004
Jim Jantos, Ryan Kaneshiro, John Peterson, Ted Zuvich

Introduction

A large segment of the U.S. population is genuinely convinced that computers, by their very nature, invade their personal privacy. Clearly, computers dramatically increase one's ability to gather and process data about virtually everything relating to others. Today, we live in an era of unprecedented reliance upon information and analysis provided by computers. As computers, software, and data manipulation methodologies grow ever more sophisticated and powerful, data compilation and the subsequent analysis of that data have predictably led to profiling of individual citizens on a dangerously large scale. It is this very ability to collect, combine, and analyze data from different databases that worries American citizens. Profiling of individuals is a real, immediate, and serious threat to the privacy rights and civil liberties of all of us.

Nearly a billion people now use the Internet as a personal or institutional system of communication, and the World Wide Web user base doubles every twelve to eighteen months. This same system that gives humans the capacity to communicate instantly on a planet-wide scale is, at the same time, developing into a nefarious tool for the collection of information about average people and their communications. Some users of the Internet want to shield their identities while participating in frank discussions on sensitive topics, while others fulfill harmless fantasies by role-playing in chatrooms. Still others are concerned about unauthorized hacking into computer systems, unauthorized search and seizure, unsolicited e-mail, defamation, and the secret creation of databases of individual personal information. The nature of the Internet poses a potpourri of challenges to our traditional top-down approach to controlling citizen behavior and implementing public policies.
It also magnifies the competing interests of commercial business, government, and individuals concerned about overreach by both government and the private sector. Continued Internet usage will only exacerbate the privacy issue, particularly since no formal body of law exists in cyberspace. This paper is a cooperative effort between computer scientists and lawyers to detail privacy concerns related to computer data collection and derivative profiling. The paper surveys laws and approaches relevant to privacy, and presents a critical review of present legislative and technological attempts to address many of these concerns. The paper concludes with several recommendations for privacy protection.

Privacy - What Is It?

There is a lack of consensus as to what may be considered subject to privacy rules. The Merriam-Webster Online Collegiate Dictionary defines privacy as: a) the quality or state of being apart from company or observation: seclusion; b) freedom from unauthorized intrusion (one's right to privacy). Thus, in a sense, privacy is freedom from unauthorized intrusion. Privacy may be defined so that, in a cultural context, it applies to those aspects of personal life where one has reasonable expectations of privacy. There are many areas of life that differing cultures choose to consider private. Historically, the people of the United States have associated privacy protection with personal information. We view "snooping" and similar behavior as intrusive and violative of our privacy rights, irrespective of whether any personal data has actually been obtained by those who engage in such activities. In the U.S., the argument for the right to privacy derives from the Fourth Amendment to the U.S.
Constitution, which states: "The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized." While the word "privacy" is not found in the Fourth Amendment, it plainly relates to one's freedom from unauthorized intrusion by the government (as we defined privacy, above). Privacy doctrine has subsequently been interpreted by the U.S. Supreme Court to be the essence of the Bill of Rights, and hence a Constitutional guarantee. Privacy is referred to as a "penumbra right" that grows from and is protected by several of the Constitutional Amendments. Although neither explicitly protected by the Constitution nor specifically spelled out therein, privacy is generally considered a "core value" by most Americans. State constitutions, federal and state statutes, and tort law judicial decisions also provide authority in support of the individual's right to privacy.

At the time the Constitution and the Bill of Rights were written, the Founding Fathers addressed what they believed were the most pressing privacy fears of their day. These fears can be summarized as follows:
1. that the government would search one's home whenever it so desired, confiscating whatever information it deemed desirable for its purposes;
2. that the government would quarter its troops in citizens' homes without their consent, effectively placing government spies among the people;
3. that a unified religious majority (read: Christian) would impose its doctrines upon the citizenry via effective peer monitoring.
The framers of the Constitution successfully addressed these concerns.
Disappointingly, however, they were not able to effectively address the impact of future changes in technology and the concomitant privacy concerns that have arisen as new technologies entrenched themselves in American life. As a result of this unfortunate but glaring failure, we must ask ourselves whether the laws protecting the individual's right to privacy are sufficient to protect us from evolving computer-related technology. Traditional privacy appears to consist of two principles: a) the freedom from unreasonable surveillance; and b) the right of the individual to control the access and dissemination of information about himself. Decisions of the U.S. Supreme Court have broadly defined privacy in precisely these ways. The Court has recognized "associational privacy" (NAACP v. Alabama, 1958); "political privacy" (Watkins v. United States, 1957); the "right to anonymity in public" (Talley v. California, 1960); a "reasonable and legitimate expectation of communications privacy" (Katz v. United States, 1967); privacy in personal decisions about sex, marriage, and reproduction (Griswold v. Connecticut, 1965); the "individual interest in avoiding disclosure of personal matters, or informational privacy" and the "interest in independence in making certain kinds of important decisions" (Whalen v. Roe, 1977); the right to be free from unwanted medical attention (Cruzan v. Missouri Dept. of Health, 1990); freedom from unwarranted wiretapping (Olmstead v. U.S., 1928); and "freedom from bodily restraint, the right of the individual to contract, to engage in any of the common occupations of life, to acquire useful knowledge, to worship according to the dictates of [one's] own conscience, to marry, establish a home, bring up children, and generally to enjoy those privileges long recognized at common law as essential to the orderly pursuit of happiness" (Meyer v. Nebraska, 1923).
A person has privacy in his home because it is possible to close the world out. No one can see or hear you, freeing you to do things that, if viewed in public, would be considered socially unacceptable. Public nudity, for example, is generally unacceptable, but it is common in one's daily household life. Similarly, in most places you can walk down the public streets without worrying that someone is recording your every movement.

Surveillance is nothing more than intentionally collecting information that happens to be about other people. Neither the purpose of the data collection nor the intentions of the data collector ultimately determines what will be done with the collected data in the future. For example, telephone records collected for business accounting purposes are frequently used in police criminal investigations. Thus, the purpose and intent of data collection should not be considered in determining a protectable zone of privacy. What fundamentally matters to the citizen is that surveillance has occurred and information has been collected. Who has access to the information collected about us? What pieces of information are we talking about? A citizen reasonably expects that his medical data, work records, financial records, educational records, military records, shopping habits, and social life are not publicly available. Most of us would consider our privacy violated where information about our lives is shared with others whom we have not expressly authorized to have it. Governments have the ability to force information sharing because of their positions of power; private enterprise does not. While the effects of corporate and governmental privacy invasion are the same, the countermeasures necessary to protect one's privacy are rather dissimilar. In a nutshell, the more others know about you, the more power they have over you.
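The danger of combining separately collected records can be made concrete with a short sketch. The data, field names, and helper below are entirely hypothetical; the point is only that two databases assembled for unrelated purposes, once they share a common identifier, merge into a single profile:

```python
# Hypothetical illustration: records collected for unrelated purposes
# (billing, marketing) joined on a shared identifier into one dossier.

phone_records = [  # collected for telephone billing
    {"ssn": "123-45-6789", "calls_to": ["555-0101", "555-0199"]},
]
purchase_records = [  # collected for retail marketing
    {"ssn": "123-45-6789", "items": ["prenatal vitamins", "baby monitor"]},
]

def build_profile(ssn, *databases):
    """Merge every record matching `ssn` across databases into one profile."""
    profile = {"ssn": ssn}
    for db in databases:
        for record in db:
            if record["ssn"] == ssn:
                # Copy every field except the join key into the dossier.
                profile.update({k: v for k, v in record.items() if k != "ssn"})
    return profile

profile = build_profile("123-45-6789", phone_records, purchase_records)
```

Neither record set is especially revealing on its own; the merged profile, linking calling patterns to purchase history, is what the paper identifies as the threat.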
Explanation of the Constitutional Framework

The makers of our Constitution understood the need to secure conditions favorable to the pursuit of happiness, and the protections they guaranteed are much broader in scope: they include the right to life and to an inviolate personality - the right to be let alone - the most comprehensive of rights and the right most valued by civilized men. The principle underlying the Fourth and Fifth Amendments is protection against invasions of the sanctities of a man's home and the privacies of life. This is a recognition of the significance of man's spiritual nature, his feelings, and his intellect. Every violation of the right to privacy must be deemed a violation of the Fourth Amendment. As time passes, subtler and more far-reaching means of invading privacy will become available to the government. The progress of science in furnishing the government with the means of espionage is not likely to stop with wiretapping. Advances in the psychic and related sciences may bring means of exploring beliefs, thoughts, and emotions. It does not matter if the target of government intrusion is a confirmed criminal: if the government becomes a lawbreaker, it breeds contempt for law. It is also immaterial where the physical connection of the wiretap takes place. No federal official is authorized to commit a crime on behalf of the government. (Paraphrasing Justices Holmes, Stone, and Brandeis, dissenting in Olmstead v. U.S., 277 U.S. 438, 1928.)

Privacy as protected by the federal Constitution differs from tort law privacy protection in two important ways. First, the acts constituting privacy intrusion are somewhat dissimilar; second, the methods of protection afforded the citizen are different. Constitutional privacy protects the individual against the intrusive actions of the federal government, whereas the common law of torts protects the citizen from the actions of other private citizens.
Most suits against the federal government, its agents, employees, or contractors ("state action" or actions under "color of law") include claims based on the Fourth, Fifth, Sixth, or Ninth Amendments. Twenty-four of the U.S. states also have constitutional provisions or statutes that protect the citizen's right to privacy, and some have been construed by the courts to include authority for civil claims. The restrictions imposed by the Fourth Amendment apply to the federal government; the Fourteenth Amendment imposes those restrictions on the fifty states and their local governments. Private entities are restricted by common tort law and by state and federal statutes.

In Griswold v. Connecticut, 381 U.S. 479 (1965), the U.S. Supreme Court announced the penumbra theory of the right to privacy. Under this theory, "...specific guarantees in the Bill of Rights have penumbras, formed by the emanations from those [other Bill of Rights] guarantees that give them substance. Various guarantees create zones of privacy, such as the First Amendment right of association, the Third Amendment prohibition against quartering soldiers in a home, the Fourth Amendment right to be secure in one's person, house, papers, and effects, the Fifth Amendment right not to surrender anything to one's detriment, and the Ninth Amendment right not to deny or disparage any right retained by the people. These cases press for recognition of the penumbral rights of privacy and repose." The best that can be said for this approach is that it relies heavily on a liberal interpretation of the Ninth Amendment.

In Katz v. United States, 389 U.S. 347 (1967), the Court shifted its definition of privacy from place-based to person-based. The Court tried to balance the government's interest in protecting society from criminals with the interest in protecting individuals from government intrusion, enunciating a two-part "reasonable expectation" test.
The first part of the test asks whether the individual exhibited a personal expectation of being left alone from the claimed government intrusion. The second part asks whether this personal expectation is of the kind that our society is prepared to recognize as reasonable. Hence, the rules against unreasonable search and seizure. There is no expectation of privacy for items in plain view, in open fields, abandoned, or in public places.

Federal Statutory Protections

In the U.S., privacy rights have developed in a piecemeal fashion. A patchwork of issue-specific and industry-related statutes has prevailed over any coherent right to privacy. Included within this hodgepodge of legislation designed to protect citizens are:
1. the Privacy Act of 1974 (which safeguards the privacy of government collections), 5 U.S.C. Section 552a;
2. the Right to Financial Privacy Act of 1978 (which curbs the government's ability to access financial records maintained in financial institutions), 12 U.S.C. Sections 3401 et seq.;
3. the Fair Credit Reporting Act of 1970 (which safeguards the privacy of financial information);
4. the Electronic Communications Privacy Act of 1986 (which safeguards the privacy of communications and is intended to prevent unauthorized surveillance of electronic communications; it was enacted because the federal wiretap statute failed to protect us from modern computer transmission technologies), 18 U.S.C. Sections 2510-2521, 2701-2710, 3117, 3121-3126;
5. the Telephone Consumer Protection Act of 1991 (which protects telephone privacy);
6. the Health Insurance Portability and Accountability Act of 1996 (which protects the privacy of one's medical records), Public Law 104-191;
7. the Video Privacy Protection Act of 1988 (which safeguards the privacy of other personal records);
8. the Privacy for Consumers and Workers Act (which requires that one who is being monitored or recorded be given notice);
9. the Children's Online Privacy Protection Act of 1998 (which ensures the protection of children's personal information from commercial website misuse), 15 U.S.C. Sections 6501 et seq.;
10. the Computer Fraud and Abuse Act of 1994 (which is supposed to contain computer technology abuse in government and banks), 18 U.S.C. Section 1030;
11. the Gramm-Leach-Bliley Financial Services Modernization Act of 1999 (which requires financial institutions to respect customer privacy, provide security therefor, maintain the confidentiality of customer data, and disclose their privacy policies), 15 U.S.C. Sections 6801 et seq.;
12. the USA Patriot Act of 2001 (which requires any business that holds customer data to cooperate in giving such data to the government and law enforcement authorities in order to assist in anti-terrorist activities), Pub. L. No. 107-56, 115 Stat. 272;
13. Privacy of Mail (which proscribes access to mail by anyone other than the addressee), 39 U.S.C. Section 3623 (1994);
14. the Wiretap Statutes (which prevent unauthorized interception of electronic communications), 18 U.S.C. Sections 2510 et seq., 47 U.S.C. Section 605, and the Telecommunications Act of 1996 (which sets rules for providers of telecommunications services to protect customers' personal information), Pub. Law 104, section 222, 110 Stat. 56 (1996);
15. the Computer Matching and Privacy Protection Act of 1988 (which regulates the exchange of computerized records among governmental agencies), Pub. Law No. 100-503.

Executive Branch agencies also regulate privacy matters. Over the past thirty years, the federal government has engaged in a wide range of privacy initiatives. The Federal Trade Commission has been promulgating privacy regulations for the private sector, as have the White House Office of Management and Budget and the U.S. Department of Commerce. In November 1999, the FTC examined online "profiling."
Profiling is the practice of compiling information about consumers' preferences and interests, primarily by collecting data that tracks consumers' online activities. The resulting profiles are used for a variety of commercial purposes. At present, the FTC supports both self-regulation and further legislation.

State & Tort Protection of Privacy

Today, the right to privacy is recognized in practically all fifty states by common case law, state constitutions, or statute. The federal courts have said that the various states may enact greater privacy protection than that required under federal statutes. Some states have passed laws that appear to protect privacy in such a manner as to clearly include e-mail. Many state court privacy decisions, however, have traditionally favored employers. It may be that the common law of torts will become the basic battleground for private-sector privacy protection decision-making. Of note is the Restatement (Second) of Torts, section 652A, which states that "one who invades the right to privacy of another is subject to liability for the resulting harm to the interests of the other. The right to privacy is invaded by the unreasonable intrusion upon the seclusion of another." Should not the Restatement encompass all invasive activities? There are four common torts that can be cited in the violation of privacy: a) intrusion upon the plaintiff's seclusion or solitude, or his private affairs; b) public disclosure of embarrassing private facts about the plaintiff; c) publicity which places the plaintiff in a false light in the public eye; and d) appropriation, for the defendant's advantage, of the plaintiff's name or likeness. A serious question remains, however, as to the extent to which common tort remedies for invasion of privacy can truly protect us in the digital information age. Defamation, too, is generally prohibited, no matter what form it takes.
Defamation and/or disparagement essentially consist of the publication of false and unprivileged statements about someone that are relied upon and bring harm, economic loss, or social ill-repute to the one who is the object of the statements.

Who Needs or Cares About Privacy?

Some people think, "If I am doing nothing wrong, then I do not need to trouble myself with privacy concerns." This is a very naive view of the value of privacy. Hiding illegal activity is only a small portion of the entire privacy issue. When one's privacy is violated without at least mutually agreed-upon compensation, one is literally being stolen from. Stolen identity records, for example, are typically sold for hundreds of dollars per document. Privacy-sensitive information is frequently leaked. How often do we have to deal with telemarketers who obtained information about us via unprotected private data? At the core of this issue is a power struggle: are we going to maintain the right to control information about our private lives? Everybody who is interested in not being forced into subservient relationships, including criminal ones, by any person or entity that happens to have the power to collect harmful information about him needs a full-blown right to privacy.

Federal information collection systems of many different types raise concerns about the citizen's real privacy rights, especially since the advent of the USA Patriot Act. Privacy in federal systems is an important component of protecting against human rights threats. Federal agencies and employees have used citizen information stored in federal systems to carry out political and personal vendettas. Past abuses include using census data to identify people for internment camps and for spying during W.W.II, snooping through I.R.S. tax records, and, of course, Presidential administrations illegally obtaining FBI files on political opponents. Identity theft is a problem in federal information databases.
Identity theft occurs because the database often holds the wrong kind of information and uses it improperly. Why create more federal databases? Our Constitution created a government of narrowly defined and limited enumerated powers. Such a limited-government model is the best defense against threats to privacy and other human rights. Unfortunately, this is a model of government that we have largely abandoned since the 1930s. The more ambitious the regulatory programs and agendas the federal government adopts, the more personal information its agencies demand from the citizenry. The higher taxes go, the greater the IRS's demands for personal and business records. Government's desire for tax money is somewhat like a growing grizzly bear: it always wants more food, and everyone is afraid to get near it or to take that food away. While a return to the limited-government model might be the best defense against dangers to privacy, it appears unlikely to occur in our lifetimes. The fundamental threat to civil liberties comes from the growth of governmental power, not the growth of databases. As long as we assume that the federal authorities should be responsible for regulating more and more of our lives, we will not be able to resist their demands for more privacy-related data. Governments that do more need more tax money, and it is illogical to expect that their taxing agencies will not want to keep closer track of us. As long as government power grows, so will the government databases on the citizenry. The point is this: the answer to the threat to citizen privacy posed by powerful government is not the imposition of trifling restrictions on the use of collected data (from which the government will exempt itself), but rather the elimination of the government's power to violate our privacy.

Maintaining a Real Right To Privacy - Solutions

Differing solutions to the issue of privacy have been proposed. They can be summarized as follows: 1.
Technical Solutions; 2. the European Union Model of Private Sector Regulation; 3. the American Model of Private Sector Self-Regulation; 4. a Further Legislative Patchwork Approach; 5. a Proposed Constitutional Privacy Amendment / Reliance Upon the Ninth Amendment.

Technologies designed to meet the information requirements of business and government have effectively deprived private citizens of the power to control their personal information and profiling. Communication technologies, in addition to facilitating the gathering of detailed personal data, have enabled collectors and others to share data among themselves for unlimited purposes, without the knowledge or consent of unwitting online users. Cookies, the Hypertext Transfer Protocol, browsers, search engines, and electronic commerce are all used to collect data on citizens. Elsewhere in this paper, the authors have discussed technical initiatives such as P3P, proxies, firewalls, anonymizers, system cleaners, cryptographic limitations, and the OECD guidelines (collection limitation, data quality, purpose specification, use limitation, security safeguards, openness, individual participation, and accountability principles). The European Data Protection Directive has been discussed at length and determined to be flawed because it does not limit government data collection. Self-regulation is inadequate against the U.S. government, owing to both a lack of enforcement and the absence of legal redress for harmed citizens, but it has not been a colossal failure in the private commercial sector. American business is fundamentally interested in making money, not building databases on private citizens. Industry tends to favor self-regulation, arguing that it results in workable, market-based solutions. The Uniform Commercial Code, like many industry-wide code systems, does work in the U.S.
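As a concrete illustration of the cookie-based collection mentioned above, the sketch below (hypothetical names, Python standard library only) shows how a server that assigns each new browser a random identifier via Set-Cookie can correlate that browser's later requests into a single activity log:

```python
# Illustrative sketch of cookie-based tracking; `tracker_id` and the
# in-memory log are hypothetical names, not any real tracker's API.
from http import cookies
import uuid

visit_log = {}  # tracker ID -> list of pages visited (the server-side profile)

def handle_request(cookie_header, page):
    """Simulate one HTTP request; returns the Set-Cookie string sent back."""
    jar = cookies.SimpleCookie(cookie_header or "")
    if "tracker_id" in jar:
        tid = jar["tracker_id"].value      # returning visitor: reuse the ID
    else:
        tid = uuid.uuid4().hex             # first visit: mint a persistent ID
    visit_log.setdefault(tid, []).append(page)
    jar["tracker_id"] = tid
    return jar["tracker_id"].OutputString()

# Two requests from the same browser are linked by the echoed cookie:
set_cookie = handle_request(None, "/sports")
handle_request(set_cookie, "/health/diabetes")
```

The user never volunteers any information; simply honoring the cookie protocol is enough for the server to accumulate a browsing profile, which is why the paper treats cookies as a primary collection mechanism.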
Recent online regulation has come from the Better Business Bureau Online Privacy Program, the Online Privacy Alliance, online seal programs, the Network Advertising Initiative, and the Liberty Alliance. Industry's primary success has been with online seal programs and the disclosure of privacy policies. Businesses that adopt such programs tend to stick with them. Since the U.S. private sector remains comparatively free of regulation, it is motivated to make self-regulatory systems work. The private sector, however, is not armed with the unique powers to control the police, the courts, and armies, as the government is. Consequently, the commercial private sector does not worry us the way the government does.

Further patchwork legislation should prove no more effective than the present legislation. People must own their privacy-sensitive information. Current legislation dealing with pieces of the privacy puzzle suffers from the haphazard and chaotic way it tries to deal with each situation as it arises. It is nothing more than a list of special cases that becomes obsolete before it is even put into effect. This is clearly inadequate for the larger task of protecting people's privacy as a whole. What is needed is a comprehensive and cohesive theory of privacy. We need clear guidelines on what privacy is. We need to grant the power to control private information to the citizen himself. We need to focus on people, not technology. We need to rein in the already too powerful federal government. There is precedent for controlling the dissemination of information in both trade secret law and duties of confidentiality. There is precedent for setting value, or refusing distribution entirely, under copyright law principles.

A Call For a Right To Privacy Amendment

The author of this part of the paper proposes the following Amendment to the U.S.
Constitution: "The right of every person to a personal zone of privacy is recognized, and may not be infringed without a reasonable showing of a compelling state interest that may not be achieved in any less intrusive and reasonable manner. Such zone of privacy shall be held inviolate and shall include the right to control his body, property, all personal information...(not yet complete)..."

Proposing a constitutional privacy Amendment may seem like an admission that such a right does not exist in the Ninth Amendment. This argument goes back to the days of the Federalist Papers and the dialogue among Thomas Jefferson, Alexander Hamilton, and James Madison. Protection of the unenumerated rights in the Constitution rests solely on the Ninth Amendment. In the 215 years since the Constitution was ratified, the interpretation of Congress' enumerated powers has grown considerably, and an enumerated-powers argument in support of the right to privacy is not going anywhere these days. Rights need to be enumerated to protect them from judicial and legislative infringement; the Ninth Amendment does not have the clout it was originally intended to have. Adoption of such a privacy Amendment would mean that it applies to the states via the 14th Amendment, and to all persons acting as contractors or operating under color of law for the government. It would put the power to control privacy back with the citizen and form a powerful tool in dealing with a powerful government.

EU Data Protection Directive

In contrast to the patchwork of U.S. privacy laws and the uncertain application of U.S. privacy rights to Internet data collection, members of the European Union ("EU") are bound by certain rules governing data protection promulgated under Directive 95/46/EC (the "EU Data Protection Directive"). The EU Data Protection Directive was adopted in 1995 and became effective for all EU members on October 25, 1998.
The EU Data Protection Directive recognizes privacy as a fundamental right and is designed to uphold individual rights pertaining to the collection and processing of personal data. The directive applies broadly to both traditional paper and electronic personal data, and therefore implicates data gathering and profiling conducted through the Internet. Although many EU member states had pre-existing data protection laws, the directive was specifically drafted to provide a unitary approach among EU members. The omnibus EU Data Protection Directive is a far-reaching approach to privacy and data protection. Given the interests of the U.S. in the global economy, and the limitations the directive places on data transfers outside the EU, the EU omnibus approach raises the question whether the U.S. should follow the EU's lead and enact broad legislation along the lines of the directive. As discussed in further detail below, the EU Data Protection Directive suffers from many flaws and is not an appropriate approach to addressing privacy concerns related to online data collection and profiling in the U.S.

EU Data Protection Directive – Basic Framework

The EU Data Protection Directive recognizes privacy as a fundamental human right, and the rules set forth by the directive are justified by that right. The directive applies to the collection, transmission, and processing of "personal data" within and from the EU.
Personal data is defined broadly as "any information relating to an identified or identifiable natural person ('data subject'); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity." The EU Data Protection Directive generally takes a top-down approach: privacy rights prohibit certain acts, although the prohibited acts may be subject to one or more exceptions. Presumably recognizing the existence of legitimate uses of personal data, the directive begins from the position that the processing of personal data is lawful subject to the terms and conditions (i.e., limitations) of the directive. The directive requires EU member states to adhere to certain principles regarding (i) the collection and treatment of personal data; (ii) the processing of personal data; (iii) access and objection by data subjects to personal data; and (iv) the exportation of personal data outside of the EU. Member states must implement legislation consistent with these principles, and the directive mandates enforcement mechanisms for violations of the privacy principles it sets forth. The directive is nonetheless limited in scope, leaving many important uses of personal data beyond its reach.
With respect to the handling and treatment of personal data, EU member states are required to implement the following policies: (i) personal data must be processed fairly and lawfully; (ii) the data must be collected for specified, explicit, and legitimate purposes and not used inconsistently with those purposes; (iii) the data must be adequate, relevant, and not excessive in relation to the purposes for which it is collected and processed; (iv) the data must be accurate and up-to-date, with reasonable steps taken to erase or rectify inaccurate or incomplete data; and (v) the data must not be kept in a form permitting identification any longer than is necessary for the purposes for which the data were collected or processed.

The EU Data Protection Directive sets forth certain requirements which must be met before personal data may be "processed." Processing of data is defined broadly as "any operation or set of operations which is performed upon personal data, whether or not by automatic means, such as collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction." In order to process personal data, one of the following requirements must be met: (i) the data subject must unambiguously give consent; (ii) the processing is necessary for the performance of a contract to which the data subject is a party, or is a step taken at the request of the data subject prior to entering into the contract; (iii) the processing is necessary for compliance with a legal obligation; (iv) the processing is necessary in order to protect the vital interests of the data subject; or (v) the "processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller or in a third party to whom the data are disclosed except where such
interests are overridden by the interests for fundamental rights and freedoms of the data subject which require protection.” The EU Data Protection Directive prohibits the processing of “special” data, which includes “personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, and the processing of data concerning health or sex life.” However, the prohibition is subject to many exceptions. Special data may be processed if the data subject has given explicit consent to processing, except in the case where applicable law prohibits such consent. Processing of special data is also appropriate when it is necessary for the purposes of carrying out the obligations of the controller related to employment law; to protect the vital interests of the data subject or of another person where the data subject is physically or legally incapable of giving consent; to carry out, in the course of legitimate activities and with appropriate guarantees, the aims of a non-profit-seeking body with a political, philosophical, religious or trade-union aim; or, when the processing relates to personal data made public by the data subject or is necessary for the establishment, exercise or defense of a legal claim. Furthermore, special personal data may be processed by health care professionals obligated to secrecy for medical purposes, for purposes related to “substantial public interest,” and for criminal and national security purposes under EU member national law provisions which provide “suitable safeguards.” Data subjects under the EU Data Protection Directive are given the rights to access and object to personal data collected and processed.
With respect to access, data subjects have the right to confirm whether or not personal data concerning them is being processed; to receive a communication of the data undergoing processing, the processing method (if automated), and information as to the data’s source; and to rectify, erase, or block the noncompliant processing of data, including notification to third parties to whom noncompliant data has been disclosed. Additionally, data subjects may object to the processing of personal data related to the subject on “compelling legitimate grounds.” Furthermore, data subjects may object to certain processing of personal data used for direct marketing purposes. Of particular importance to countries that are not members of the EU, the EU Data Protection Directive prohibits the export of personal data to a third country unless such country ensures an “adequate level of protection.” In effect, the directive attempts to ensure that third countries concur with the fundamental rights recognized by the EU. The prohibition on the transfer of personal data outside of the EU is of particular importance to multinational U.S. companies. In response to the directive, the U.S. Department of Commerce negotiated a “safe harbor” for U.S. companies that was approved by the EU in 2000. Under the safe harbor, a U.S.
organization may voluntarily join by agreeing to abide by the following seven safe harbor principles: (1) notice (organizations must notify individuals about the purposes for which they collect and use information); (2) choice (organizations must give individuals the opportunity to opt-out for personal information and opt-in for sensitive information which will be disclosed to a third party or used for a purpose incompatible with the purpose for which it was originally collected or subsequently authorized by the individual); (3) onward transfers (to disclose information to a third party, organizations must apply the notice and choice principles); (4) access (individuals must have access to personal information about them that an organization holds and be able to correct, amend, or delete that information where it is inaccurate); (5) security (organizations must take reasonable precautions to protect personal information); (6) data integrity (personal information must be relevant for the intended use, accurate, complete, and current); and (7) enforcement (independent recourse mechanisms and procedures must be available to address individual complaints, with the provision for damages and meaningful sanctions).

 If a third country does not provide an “adequate level of protection,” the directive provides several additional exceptions to the prohibition of personal data transfer outside of the EU, including:  (i) the data subject has given his consent unambiguously to the proposed transfer; (ii) the transfer is necessary for the performance of a contract between the data subject and the controller or the implementation of precontractual measures taken in response to the data subject's request; (iii) the transfer is necessary for the conclusion or performance of a contract concluded in the interest of the data subject between the controller and a third party; (iv) the transfer is necessary or legally required on public interest grounds, or for legal claims; (v) the transfer is necessary to protect the vital interests of the data subject; or, (vi) the transfer is made from a register which according to laws or regulations is intended to provide information to the public and which is open to consultation either by the public in general or by any person who can demonstrate legitimate interest.   Furthermore, an EU member may authorize the transfer of personal data to a non-compliant third country if it can demonstrate the existence and applicability of safeguards to protect the privacy and fundamental rights of individuals, with specific recognition of contractual clauses that protect rights. 

In addition to those previously noted, the EU Data Protection Directive is riddled with exceptions. For example, the processing of personal data for historical, statistical or scientific purposes is not prohibited in the presence of appropriate safeguards. Furthermore, historical, statistical, or scientific data may be kept for longer periods with appropriate safeguards. To minimize disruptions in everyday life, personal and household use of data is beyond the scope of the directive. Under certain conditions, churches, trade unions, and other non-profits are permitted to keep sensitive information about members. The processing of personal data for journalistic, artistic, and literary purposes is also generally exempt from the directive. Most importantly, the scope of the directive does not include the processing of personal data concerning security, defense, and criminal law. EU governments are given the power to adopt legislative measures to restrict the directive to safeguard national security, defense, public security, criminal law, and taxation matters. In essence, an EU member has the option to limit the EU Data Protection Directive exclusively to private interests while excluding the government from the main thrust of the directive. As applied to data collection and profiling on the Internet, the basic framework of the EU Data Protection Directive provides that, as between private parties, Internet user data may only be collected for legitimate purposes; such data may only be processed if the Internet user unambiguously consents; and, the collected data must be current, relevant, accurate, and kept no longer than necessary. Data collected on the Internet within the EU may not be transferred to parties outside of the EU without satisfying certain exceptions. Last, EU member states must provide an enforcement mechanism to uphold the principles of the directive.

Criticism of the EU Data Protection Directive The EU Data Protection Directive has been criticized from many angles. Some of the leading areas of criticism are as follows: Bureaucracy and Complexity Although the EU Data Protection Directive recognizes the importance of data collection, the directive is based on a top-down approach; all data collection and processing is prohibited unless certain conditions are met. Compliance with the top-down approach imposes heavy costs and inconveniences on EU companies and third country companies attempting to comply. Inconsistent With Other Fundamental Rights The EU Data Protection Directive has raised concerns as to whether application of the directive will impinge on other fundamental rights. With respect to free speech rights, a literal application of the directive would prohibit the posting of any information on the Internet that identifies an individual. As such, the directive is overbroad in a manner that implicates freedom of speech. In a recent case, the Swedish Supreme Court reversed the conviction of an individual who posted on the Internet severe criticism of Swedish banks and related bank officials. The conviction was based on Swedish law conforming to the requirements of the EU Data Protection Directive. The court recognized the contradictory requirements of the directive and freedom of speech and broadly interpreted the directive’s exception for journalists and authors for free speech purposes. Trade Barrier Concerns Under the EU Data Protection Directive, EU companies from EU member states with conforming legislation benefit from the unobstructed flow of personal data within the EU. However, non-EU companies are subject to the general prohibition on data transfer between EU member states and third countries. For example, U.S. multinational companies not participating in the safe harbor are disadvantaged as compared to EU companies and may be subject to substantial penalties for noncompliance.
Such disadvantage may be viewed as a significant non-tariff trade barrier. Unknown Scope of Application In addition to creating a potential trade barrier, the principles of the directive conceivably apply to a broad range of situations implicating non-EU interests, creating potential third-party liability under the directive. For example, the directive may conceivably apply to an EU consumer surfing the website of a U.S. company and the related data collection, even if the U.S. company’s server is located within the borders of the U.S. Furthermore, the directive conceivably applies to all email communications initiated from the EU, since email messages implicate a “data subject” and presumably contain “personal data.” Government Exception The most glaring criticism of the EU Data Protection Directive involves the powers reserved to government. As stated above, the governments of the EU are free to collect and process personal data for purposes of national security, criminal matters, and taxation matters. Viewed broadly, the government exemptions touch upon every conceivable use of personal data. As a result, the EU Data Protection Directive merely applies to the collection and processing of personal data in the private sector, although the principles of the directive are intended to be implemented and enforced by government. By excluding government, the EU Data Protection Directive fails to recognize the most avaricious collector and user of personal data. Due to a long history of information abuse, governments are generally viewed with suspicion regarding the collection and processing of personal information. The directive recognizes privacy as a fundamental right enforceable against the private sector, but data rights are not enforceable against government, the party charged with enforcement of the directive.
The enhanced ability of EU governments to collect, process, and monitor personal data under the directive is contrary to the recognition of privacy as a fundamental right.

Should the U.S. Follow the EU Lead? Although the EU recognition of privacy as a fundamental right is highly commendable, the EU Data Protection Directive suffers from many problems. Application of the directive may contravene other fundamental rights, such as freedom of speech. Additionally, the top-down framework of the directive matches or exceeds the uncertainties and complexities in applying the patchwork of U.S. privacy laws to data collection and processing. Last, the directive falls short by failing to recognize the fundamental right of privacy as applied to government. As such, the directive does not prohibit use of personal data collected on the Internet and related data profiling by the government. The EU Data Protection Directive is not an appropriate framework for the recognition of privacy rights in the U.S.

Recent Federal Legislative Efforts Addressing Privacy Although privacy has historically played a role in many U.S. legislative efforts, there has been a flurry of activity in recent years with respect to the recognition of privacy rights pertaining to online data collection and profiling. The heightened activity is partly attributed to suggestions from various Federal Trade Commission (“FTC”) reports submitted to Congress. The May 2000 FTC report recommended legislation that would require all consumer-oriented commercial websites that collect personally identifiable information to comply with four widely-accepted fair information collection practices: (1) notice - whereby websites would be required to provide consumers with conspicuous notice of information practices, including what information is collected, how it is collected, how it is used, whether information is disclosed to other entities, and whether other entities are collecting information through the site; (2) choice - whereby websites would be required to offer consumers choices as to how their personal identifying information is used beyond the use for which the information was provided; (3) access - whereby websites would be required to offer consumers reasonable access to the information collected about them, including a reasonable opportunity to review the information and to correct inaccuracies or delete information; and (4) security - whereby websites would be required to take reasonable steps to protect the security of information collected. The FTC reiterated that enforcement through “the use of a reliable mechanism to impose sanctions for noncompliance” remained a "critical ingredient in any governmental or self-regulatory program to ensure privacy online.” In July 2000, the FTC issued a report recommending legislation to address online profiling.
106th Congress (1999-2000) During the 106th Session of Congress spanning the period of 1999-2000, more than thirty bills introduced in the House and Senate directly addressed Internet privacy rights in some manner. Although significant efforts were spent addressing the topic of Internet privacy during the 106th Congress, none of the major Internet privacy bills were passed and signed into law. Notable Internet privacy legislation during the 106th Congress included: S. 2928 Consumer Internet Privacy Enhancement Act (would require commercial websites to provide specific notice of practices with respect to personally identifiable information and opt-out provisions); S. 2606 Telecommunications and Electronic Commerce Privacy Act (would require opt-in provisions for the collection and disclosure of personally identifiable information with FTC enforcement authority); HR 3321 Electronic Privacy Bill of Rights Act (would require website privacy disclosures, consumer consent, and access to own personal data); and, HR 2644 Personal Data Privacy Act (would prohibit government from transferring, selling, or disclosing personal information without consent). Overall, much of the legislation introduced in the 106th Congress tangentially followed the basic structure of the EU Data Protection Directive and closely followed the guidelines set forth by the FTC regarding notice, consent, access, and security. However, none of the legislative efforts directly addressed the collection and use of online data by government. 107th Congress (2001-2002) Following the path of the 106th Congress, the 107th Session of Congress considered many Internet privacy bills which addressed both the government’s access to and use of information as well as the practices of commercial website operators. Consistent with the results of the 106th Congress, none of the major Internet privacy bills that addressed the practices of commercial website operators were passed and signed into law. 
Notable Internet privacy legislation during the 107th Congress that addressed the practices of commercial website operators included: HR 2135 Consumer Privacy Protection Act (would require notice, opt-out provisions for personally identifiable information, opt-in provisions for sensitive personal information, and limits on disclosure by recipients); S. 1055 Privacy Act of 2001 (would limit the sale and marketing of personally identifiable information); S. 2201 Online Personal Privacy Act (comprehensive legislation with provisions regarding notice, consent, access, security, and enforcement for personally identifiable information); and, HR 4678 Consumer Privacy Protection Act of 2002 (provisions for notice, choice, and access for personally identifiable information). The 107th Congress did enact legislation that broadened the powers of the federal government with respect to information privacy. In the wake of September 11, 2001, the following four privacy-related laws were enacted by the 107th Congress: The 21st Century Department of Justice Appropriations Authorization Act (P.L. 107-273); the USA PATRIOT Act (P.L. 107-56); The Homeland Security Act (P.L. 107-296); and, the E-Government Act (P.L. 107-347). The USA PATRIOT Act and The Homeland Security Act both broadened the powers of the federal government to monitor Internet activities based on law enforcement and national security justifications. Most notably, the USA PATRIOT Act significantly broadens law enforcement’s power to monitor Internet activity. The USA PATRIOT Act expands the scope of subpoenas for Internet data, allows ISPs to divulge information under certain conditions, and expands the scope of legal devices and methods used by government to monitor Internet data. However, the USA PATRIOT Act lacks judicial oversight for the use of its procedures.
In furtherance of the USA PATRIOT Act, The Homeland Security Act lowers the threshold (a “good faith” belief of an emergency involving danger of death or physical injury) at which ISPs may voluntarily disclose information to a government entity. On the other hand, The 21st Century Department of Justice Appropriations Authorization Act and E-Government Act represent a recognition of privacy rights for individuals, albeit minimal. The 21st Century Department of Justice Appropriations Authorization Act requires the Department of Justice to report to Congress regarding the use of Internet monitoring systems (Carnivore, DCS 1000, etc.). The E-Government Act places certain restrictions on government privacy practices by providing a set of requirements addressing the privacy of personally identifiable information with respect to government agencies and establishes policies for federal government websites. However, neither The 21st Century Department of Justice Appropriations Authorization Act nor the E-Government Act places limitations on the collection of personally identifiable information by the federal government. 108th Congress (2003-2004) Following tradition, several bills addressing Internet privacy have been introduced in the 108th Congress, although none have been passed and signed into law. HR 69 would require the FTC to prescribe regulations to protect the privacy of personal information collected from and about individuals on the Internet. HR 1636 Consumer Privacy Protection Act of 2003 is similar in form to HR 4678 introduced in the 107th Congress. HR 1636 provides notice, choice, and security provisions for personally identifiable information. Furthermore, HR 1636 provides a self-regulatory “safe harbor” and provides an enforcement mechanism through the FTC. However, HR 1636 covers only commercial entities and specifically excludes the government. Last, S.
745 Privacy Act of 2003 requires commercial entities to provide notice and choice regarding the collection and disclosure of personally identifiable information. In addition, there are four pending “spyware” bills before the 108th Congress: HR 2929 Safeguard Against Privacy Invasions; HR 4661 the I-SPY Prevention Act; HR 4255 the Computer Software Privacy and Control Act; and S. 2145 the SPY Block Act. Internet Data Gathering Technology This section details the technological means used to gather data about an Internet user. This will include a discussion of the methods used to gather data for online profiling, which are primarily cookies and web beacons. Later sections discuss the practice and methods of online profiling. Cookies A cookie is a small text file placed on a user’s computer by a web server when the user accesses a particular website. Its primary purpose is to store small amounts of data relevant to the website. The cookie can also transmit information back to the server that placed it (and usually only the server that placed it), allowing the server to collect information about the person using the website (whose computer hosts the cookie). There are different types of cookies: persistent or permanent cookies remain on a user’s computer for varying lengths of time, ranging from hours to years. Session cookies expire when the user exits the browser; they are often used as a convenience, such as for implementing a shopping cart or counting the number of unique visitors to a site. Cookies can also be used to simplify some tasks, such as storing logon information so that a user does not have to re-enter a user id and password each time they visit a particular site. Cookies can be placed on a computer without a user’s knowledge, such as when a banner advertisement served by a network advertiser appears on a website. The “Online Profiling” section discusses this in greater detail.
To place a cookie on a given computer, the advertiser’s server just has to include a simple piece of “script” in the HTML documents used to define a web page. See the sidebar for a simple example of JScript code that could be used to place a cookie on a user’s machine. Users have the ability to accept or decline cookies. Most browsers automatically accept cookies, but users can modify their browser settings to decline cookies, or to issue a warning whenever a website attempts to place a cookie. In many cases, however, the functionality of a web site depends on the use of cookies. A user who declines all cookies may not be able to fully experience the website. A user who wants a warning before accepting a cookie might be interrupted by a barrage of popup warnings about cookie placement. Either of these events would seriously disrupt the web-browsing experience. [SIDEBAR] The samples below show how to create a cookie and how to retrieve a value from it once it is placed. They are adapted from the Microsoft Visual Studio help documentation. <SCRIPT> // Create a cookie with the specified name and value. function SetCookie(sName, sValue) {

 // Expire the cookie one year from now.
 var date = new Date();
 date.setFullYear(date.getFullYear() + 1);
 document.cookie = sName + "=" + escape(sValue) + "; expires=" + date.toGMTString();

} </SCRIPT> This example retrieves the value of the portion of the cookie specified by the sName parameter. <SCRIPT> // Retrieve the value of the cookie with the specified name. function GetCookie(sName) {

 // Cookies are separated by semicolons.
 var aCookie = document.cookie.split("; ");
 for (var i = 0; i < aCookie.length; i++) {
   // A name/value pair (a crumb) is separated by an equal sign.
   var aCrumb = aCookie[i].split("=");
   if (sName == aCrumb[0])
     return unescape(aCrumb[1]);
 }
 // A cookie with the requested name does not exist.
 return null;

} </SCRIPT> Cookies store their data in name-value pairs called “crumbs”. The cookie also has additional parameters that control when it expires, which servers can access it, and whether the cookie is secure (accessible only from a secure environment). [END SIDEBAR] Web Beacons “Web beacons” are also known as “web bugs”, “single-pixel gifs,” “clear GIFs” or “1-by-1 GIFs.” Web bugs are tiny graphic image files embedded in a web page. They are generally either the same color as the background on which they are displayed or translucent, so that they are invisible to the naked eye. A web bug is placed in a web page with an HTML tag. When the browser fetches the image, the request sends information back to the image’s home server, which can belong to the host site, a network advertiser or some other third party. This information can include:
· the IP (Internet Protocol) address of the computer that downloaded the page on which the bug appears
· the URL (Uniform Resource Locator) of the page on which the web bug appears
· the URL of the web bug image
· the time the page containing the web bug was viewed
· the type of browser that fetched the web bug
· the identification number of any cookie on the consumer’s computer previously placed by that server
Companies use this technique to learn more about how visitors use their sites. The information may be used to target ads to those visitors on other sites. The clickstream activity may be used to determine future advertising downloaded to your browser. It is worth noting that cookies and web beacons can also be used in emails. In most cases, a user must opt in to receive marketing emails from third parties, but there is no guarantee that this is always the case. When you affirm your selection by clicking a hyperlink in an email or checking a box on a website, your email address gets added to the client’s email database. As part of the commentary included in “the 2000 report,” Richard M.
Smith outlined a viable method whereby, in some circumstances, web bugs can also be used to place a cookie on a computer or to synchronize a particular email address with a cookie identification number, making an otherwise anonymous profile personally identifiable. Web bugs are difficult to block, since they are very similar in coding and appearance to legitimate transparent images used to space text and lay out web pages. Web bugs can only be reliably detected by closely examining the source code of a web page and searching in the code for 1-by-1 IMG tags that load images from a server different than the rest of the web page. The only way to disable web bugs is to use a browser (and email system) that allows the user to block third-party images. Not all browsers can do this, although recent changes to Microsoft’s Outlook (email) and Internet Explorer can perform this action. Other browsers may also have this capability. Online Profiling A large portion of online advertising is in the form of “banner ads” placed on web pages. In many cases, web sites do not supply their own banner ads, but instead rely on third-party network advertisers such as DoubleClick or Engage. These network advertising companies can manage and supply advertising for numerous unrelated websites. In the year 2000, DoubleClick (one of the largest Internet advertising networks) served an average of 1.5 billion ads per day to websites. In 2003, they served an average of 1.8 billion ads per day. Advertising networks do not merely supply banner ads; they also gather data about the consumers who view their ads. The primary technologies used to enable this are cookies and web bugs, as discussed above. The ad networks can compile the following types of information about any activity that takes place on the computer, including:

· pages viewed
· links clicked and other actions taken
· query terms entered into search engines
· purchases
· “click-through” responses to advertisements
· standard information that the browser sends to every website visited, including IP address, browser type and language, access times, and referring Web site addresses

All of this information can be obtained without the user having to click on even a single ad.
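The list above can be made concrete with a short sketch. The code below is a hypothetical illustration — the field names and the `track.gif` endpoint are invented for this paper, not any advertiser's actual protocol — of how the facts a browser routinely supplies could be assembled into the URL of a tracking request:

```javascript
// Hypothetical sketch: assemble a tracking-pixel URL from the
// standard information a browser supplies with every request.
// All parameter names and the endpoint are invented for illustration.
function buildBeaconUrl(server, info) {
  var params = [
    "page=" + encodeURIComponent(info.pageUrl),   // page being viewed
    "ref=" + encodeURIComponent(info.referrer),   // referring page
    "ua=" + encodeURIComponent(info.browser),     // browser type
    "lang=" + encodeURIComponent(info.language),  // browser language
    "t=" + encodeURIComponent(info.time),         // access time
    "id=" + encodeURIComponent(info.cookieId)     // advertiser's cookie ID
  ];
  return server + "/track.gif?" + params.join("&");
}
```

Every field here is something the browser volunteers on an ordinary page load; the cookie ID is what ties the request to a running profile.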

The information gathered in this fashion is usually anonymous. In most cases, the profile is linked to an identification number in a persistent cookie left by the network advertiser on the user’s computer, as opposed to being linked to the name of a specific person. This is non-PII, or “non-personally identifiable information.” There are ways to link the profiles derived from tracking web activities to personally identifiable information, however. The main methods whereby an advertising network can link non-PII to PII are as follows: First, the website to which the user provides personal information (through a form or application filled out by the user) may, in turn, provide that information to the network advertiser. Second, depending upon how the personal information is retrieved and processed by the website, the personally identifying information may be incorporated into a URL string that is automatically transmitted to the network advertiser through its cookie. This includes the method of using web beacons to link PII to a profile, as discussed in the comments by Richard M. Smith. A previously anonymous profile can also be linked to personally identifiable information in other ways. For example, a network advertising company could operate its own Web site at which consumers are asked to provide personal information. When consumers do so, their personal information could be linked to the identification number of the cookie placed on their computer by that company, thereby making all of the data collected through that cookie personally identifiable. As a specific example of this type of linkage, the DoubleClick privacy policy points out that DoubleClick may use voluntarily supplied personal information in order to facilitate the delivery of goods, services, or information, and that DoubleClick may use this PII for “aggregate analysis.” It is unclear whether DoubleClick will link PII with previously collected non-PII while performing this analysis.
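The second linkage method — PII leaking through a URL string — is easy to illustrate. Suppose a registration form is submitted via an HTTP GET request, so its fields end up in the page URL; if that URL then reaches the network advertiser (for example, as the referring address sent with a banner or beacon request), the advertiser can parse the personal information straight out of the query string. The sketch below uses invented parameter names:

```javascript
// Hypothetical sketch: parse name/value pairs out of a URL's query
// string, the way an ad server could recover form fields that were
// submitted via GET and leaked through a referring URL.
function extractQueryParams(url) {
  var query = url.split("?")[1] || "";
  var params = {};
  var pairs = query.split("&");
  for (var i = 0; i < pairs.length; i++) {
    var pair = pairs[i].split("=");
    if (pair[0]) params[pair[0]] = decodeURIComponent(pair[1] || "");
  }
  return params;
}
```

For example, calling extractQueryParams on a hypothetical URL such as "http://www.example.com/register?name=Jane%20Smith&email=jane%40example.com" recovers the name and email address, which can then be attached to the cookie ID that accompanied the same request.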
Data Mining and Analysis Once collected, the network advertiser (or another party) can analyze the profile data and may combine it with data from third-party sources, data on the consumer’s offline purchases, or information collected directly from the consumer via surveys and registration forms. All of this data will be stored in a large database, which allows the advertising network to use data mining techniques to make inferences and conclusions about the consumer’s preferences and interests. Data mining was originally a term referring to overusing data to draw invalid inferences, but today refers to the process of executing complex queries on large, sometimes seemingly unrelated databases to draw useful summaries of data. In this case, the data miners are interested in producing profiles of people, analyzing activity, and deducing patterns in the information. The result of the data gathering and analysis is an extremely detailed profile that can be used to predict the individual consumer’s tastes, needs, and purchasing habits. Because the network advertiser can track a consumer on any web site served by the company, they can collect data across unrelated sites on the web. The tracking can also occur over extended periods of time, thanks to persistent cookies. The advertising companies’ computers can then use sophisticated algorithms to analyze this profile and decide how to deliver ads directly targeted to the consumer’s specific interests, or even to products the consumer might become interested in. This is similar to the practice (found on most bookselling websites) of telling you that “people who bought (the book you just purchased) also bought Jane Smith’s Poem Collection.” The potential impact of this extensive and sustained profiling is staggering.
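The bookseller-style suggestion mentioned above rests on a simple idea: count how often items appear together in purchase histories. The sketch below illustrates that co-occurrence counting only — it is not any vendor's actual recommendation algorithm:

```javascript
// Illustrative sketch of "people who bought X also bought Y":
// tally how often each other item co-occurs with a target item
// across a list of purchase baskets, most frequent first.
function alsoBought(baskets, targetItem) {
  var counts = {};
  for (var i = 0; i < baskets.length; i++) {
    var basket = baskets[i];
    if (basket.indexOf(targetItem) === -1) continue;
    for (var j = 0; j < basket.length; j++) {
      var item = basket[j];
      if (item === targetItem) continue;
      counts[item] = (counts[item] || 0) + 1;
    }
  }
  // Rank co-purchased items by how often they appeared.
  return Object.keys(counts).sort(function (a, b) {
    return counts[b] - counts[a];
  });
}
```

A profiling network can apply the same counting to page visits or search terms instead of purchases; the data structure is identical.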
Given the current political climate and the behavior of the media, imagine the frenzy in 20-30 years when a profile analysis reveals that a candidate in a tight Senatorial election web-surfed to a controversial website a few times when he was a young adult. Or suppose that detailed profile analysis by the government discovers that 10% of the people who have web-surfed looking for information on terrorism, tax evasion, bomb technology, embedded programming, pornography, and blue cheese pasta recipes are in fact terrorists. There is also the potential for spurious data entering a profile. How Online Profiling Works “Online Profiling: A Report to Congress” provides an excellent anecdotal illustration of how online profiling works. In slightly more technical terms, the process is as follows: 1. When the user first enters a site, the browser automatically sends some information to the server so that the site can communicate with the user’s computer: the browser type and version, hardware version, operating system, and the language used by the computer, as well as the computer’s IP address. 2. The server responds by sending back appropriate HTML code for the requested page. A user may get a different layout when requesting a web page from a wireless PDA versus a desktop PC, for example. 3. Embedded in the HTML code that the user receives is an invisible link to the online profiling site. The browser is automatically triggered to send another HTTP request, which identifies the browser type and operating system; the language(s) accepted by the browser; and the address of the referring Web page. 4. Based on this information, the online profiler places a banner ad in the space at the top of the page. The ad will appear as an integral part of the page. 5. The online profiler can now place a cookie with a unique ID number on the user’s computer, if there isn’t one there already. 6.
As the user moves around between web sites serviced by the online profiler (network advertiser), the network advertiser can build a profile of the user. Each time the user visits a new site or clicks a link serviced by the particular advertiser, more information gets transmitted, which helps to build the detailed profile. In addition, the online profiler can associate any search terms that the user enters on linked sites, and add those terms to the developing profile. 7. The network advertiser analyzes the collected profile information and makes some decisions about what ads to serve to the user the next time they surf the Web. As an example, if a user searches for golf clubs on a sporting goods site and Scotland on a travel site, they might get an ad for a golfing vacation package in Scotland the next time they surf the web. Profile Access Users have a limited ability to edit their online profiles. For example, a user of MSN may edit the information in their Microsoft Passport, change billing information, or edit information in their MSN public profile [12]. Note, however, that a user cannot edit their profile to remove the fact that someone using their computer visited the subversive and controversial website. Visibility In most cases, online profiling activity is invisible to the consumer. The presence and identity of a network advertiser on a particular site, the placement of a cookie on the consumer’s computer, the tracking of the consumer’s movements, and the targeting of ads are simply invisible in most cases. There are essentially only two viable ways to discover that online profiling is taking place: the user can either set the browser to warn about cookies, or review the privacy policy of every website visited. Unfortunately, very few people have even heard of (Internet) cookies, and even fewer have a basic understanding of what one does . Turning cookies off is difficult , and most people would not know that they could do so in any case. 
In many cases even reviewing a website's privacy policy will not help, as a significant number of websites do not disclose the fact that they use or allow cookies. The 2000 Report discusses these statistics in detail, but the basic finding is that most of the sites surveyed allowed third-party cookies, and not all of them disclosed this fact. Reviewing the privacy policies of every web site visited is impossible in practical terms. In a typical browsing session, a user might visit dozens of apparently unrelated sites. Web sites do not typically give their privacy policies prominent placement, as discussed in the findings of the 2000 Report; the link may be in very tiny type at the bottom of an obscure sub-page, instead of featured prominently on the site's home page. Such documents can also easily amount to 32 pages of single-spaced, tortured legalese, which no one can be expected to read, digest, and understand in a limited amount of time. So the typical response is "I just need to get on with my surfing." In many cases, the user agrees to a privacy policy that he or she has not read (and does not have time to read), which could almost literally contain anything. The Network Advertising Initiative self-regulatory principles state that a network advertiser's customers should post a privacy policy that clearly and conspicuously discusses the use of profiling data, but this rarely occurs.

Invasive Profiling

In addition to "passive" online profiling through clickstream analysis and the gathering of data input into ordering forms and the like, there exists the possibility of more invasive profiling using spyware and adware. Spyware is software that collects information about the use of the computer and periodically relays that information back to a collection center. The term also refers to software that can record a person's keystrokes and make them available to another party.
It is certainly possible to make spyware that can silently deploy onto a target computer via email. Adware is advertising-supported software. The software can usually be downloaded free from the web, but it contains banner advertisements that create revenue for the company. Adware will usually install components on the computer that send marketing information whenever the user is online, and it usually contains a disclosure telling you that your information will be used. A recently reported case illustrates some of the problems with adware. In this case, the website advertised a "free" mouse cursor available for download, with the plea to "Show your support for our troops by downloading our free cursors!" The catch was that by downloading the cursor software, users also agreed to install a number of other programs made by the company, including a product called "KeenValue." This product allowed eUniverse (the parent company) to collect information such as:

· Websites/pages viewed
· The amount of time spent on some websites
· Response to advertisements displayed
· Standard web log information (IP address, system settings, software installed on your computer, first and last name, country, five-digit zip code)
· Usage characteristics and preferences

This amounts to a very detailed, very specific online profile. Notice the similarity? The difference in this case (as opposed to the data gathering conducted by ad networks) is that the KeenValue software allowed eUniverse to track every website viewed on the computer, not just the ones linked to their ad network. Despite the apparently invasive gathering of information, Anthony Porter agreed with the assertion that KeenValue was not (quite) spyware: it was merely very invasive adware. Other adware programs such as Gator and eZula operate in a similar fashion.
eUniverse spokesman Todd Smith said that linking the adware program to the plea for web users to "support our troops" was a common practice in Internet advertising. The most common Internet model today provides free content, most often subsidized by advertising.

Privacy Through Self-Regulation

Whereas browsing through the shelves at a public library can be performed anonymously, browsing the World Wide Web leaves a surprising amount of information about the user behind. The privacy threats created by this information trail are compounded by the fact that data can be kept in electronic storage for extended periods of time and retrieved at a moment's notice. Is industry self-regulation a viable option for maintaining privacy? This chapter will examine the limitations of purely technical tools designed to allow users to control how their personal information is disseminated, and critique organizational approaches with respect to how well they disclose privacy policies to the consumer.

Web Anonymizers

Web site data collection can roughly be classified as passive or active. Passive data collection is transparent to the user: the data is sent automatically by the web browser when navigating through a web site. Each time a link is clicked, the client sends a request to the remote server for the desired resource. By looking at the header information in a web request, a remote web server can retrieve the user's IP address, browser type, and the page the user was referred from. Cookies, discussed earlier, are another form of passive data collection that can be used to uniquely identify visitors and track their movements between web sites. In active data collection, the user explicitly provides data to the web site. Examples include filling out registration forms to obtain access to restricted content, entering shipping information to complete an order, or submitting personal preferences to customize the browsing experience.
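The passive data described above can be illustrated with a hypothetical server-side log entry built entirely from information the browser volunteers with every request. The field names follow standard HTTP headers; all values are invented.

```python
# One request's worth of passive data, as a server might log it.
# The user typed nothing; the browser sent all of this automatically.
request_log_entry = {
    "ip_address": "192.0.2.17",                     # network address of the client
    "User-Agent": "Mozilla/4.0 (Windows NT 5.1)",   # browser type and operating system
    "Accept-Language": "en-us",                     # language preference
    "Referer": "http://www.example.com/golf/",      # the page the user came from
    "Cookie": "uid=8f3a9c",                         # unique ID set on a prior visit
}

def is_returning_visitor(entry):
    """A server can recognize a repeat visitor from the cookie alone."""
    return "Cookie" in entry

print(is_returning_visitor(request_log_entry))
```

Linking entries like this one by their cookie value across many requests, and many sites, is exactly the mechanism the profiling steps earlier in the paper describe.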
A web anonymizer is an intermediary that sits between a client and a remote site and intercepts the web traffic passing between the two. Instead of sending requests directly to a remote site, clients first send them to the anonymizer, which repackages the requests and then forwards them on to the remote site. From the remote site's point of view, it is communicating with the anonymizer and not the user. The web page returned by the remote site will contain hyperlinks to other servers. Before passing the page back to the client, the anonymizer automatically 'scrubs' the links so that they refer to the anonymizer rather than the original source. When a user clicks on a hyperlink in the scrubbed page, the request is first sent to the anonymizer. There are several variations on the theme that increase privacy. Secure communication schemes such as SSL can be added between the client and the intermediary to prevent an eavesdropper from intercepting data sent between the user and the anonymizer. A chain of anonymizers can be used to forward requests along; with this technique, the original requester cannot be determined unless all anonymizers along the path are compromised. Web browsers are complicated pieces of software, and as a result there are several ways that an anonymizer can be confounded. The pages a browser renders contain not only static hyperlinks but also pieces of JavaScript code that must be interpreted by the browser. JavaScript can be used to dynamically insert links to a remote site into a web page. It is difficult for an anonymizer's page scrubber to perform even a shallow syntactic analysis of JavaScript code and remove potential hazards in a timely fashion. Ad-hoc rule-based approaches that attempt to recognize potentially malicious JavaScript statements are akin to plugging leaks in a crumbling dam with one's fingers. More systematic techniques require a deeper analysis of the code but also degrade the page rendering speed.
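A toy scrubber makes the JavaScript problem concrete: a simple pattern-based rewriter catches literal links but misses a URL assembled at runtime. The anonymizer address is hypothetical, and real scrubbers are far more elaborate than this sketch.

```python
import re

# Hypothetical anonymizer front-end; all literal links should be
# rewritten to route through it.
ANON = "https://anonymizer.example/fetch?url="

def scrub(html):
    """Naive rule-based scrubber: rewrite every literal href attribute."""
    return re.sub(r'href="(http[^"]*)"',
                  lambda m: 'href="' + ANON + m.group(1) + '"', html)

page = '''
<a href="http://remote.example/page">static link</a>
<script>
  // The same kind of URL, assembled at runtime, slips past the scrubber:
  location.href = "http://rem" + "ote.example/track?uid=123";
</script>
'''

scrubbed = scrub(page)
print(ANON in scrubbed)              # the static link was rewritten
print('"http://rem" +' in scrubbed)  # the JavaScript-built URL was untouched
```

The script fragment will still contact the remote site directly when the browser executes it, which is why rule-based scrubbing amounts to plugging leaks one at a time.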
Because of this, users are forced to choose between anonymizers that support JavaScript but are slower to render pages, and anonymizers that require disabling JavaScript and may produce incorrectly rendered or broken web pages. Another way to bypass the protection afforded by an anonymizer is through third-party viewer applications. When certain types of media are opened, the browser transfers control to the viewer. An anonymizer only has control over the HTML that is sent to the browser and can do nothing about viewers that wish to transfer personal information. For example, clicking on a PDF link will display the document in a browser window, but control has actually been transferred to Adobe Acrobat. This can result in a situation where a user is under the mistaken impression that he or she is protected when in fact a viewer is transferring personal information from under the anonymizer's nose. Anonymizers are useful for hiding passive information, but a user who provides active information by filling out web forms circumvents the privacy protection. In certain situations, providing active information is unavoidable in order to retrieve content. Many news web sites, such as the New York Times, require registration in order to view articles. One shared-login service provides a solution to this problem with a publicly accessible database containing names and passwords for sites that require free registration. A user can register with a web site and then submit the account information to the service so that others can use the login in lieu of registering. Since many users log in with the same account, usage information gathered by the web site is lost in the crowd. However, the ability to blend into a crowd is a hindrance when trying to establish a persistent profile. Building a user profile is crucial for crafting personalized pages and for maintaining a presence in virtual communities such as online message boards.
The Lucent Personal Web Assistant (LPWA) tackles the problem of maintaining a profile while remaining anonymous by providing an alias email address that the user can give to a remote site during registration. The remote site only has knowledge of the user's alias account information, and LPWA automatically forwards mail on to the user's real address. LPWA can be set up to use a different alias for each web site a user registers with, to confound any attempts at data fusion across sites. Despite the best efforts of web anonymizers to maintain privacy, they are not sufficient in and of themselves when one considers the active information required to complete an e-commerce transaction. To complete a transaction for a physical good, a user needs to provide payment information and an address at which to receive the product. Concerned users can avoid electronic payment by using money orders or cashier's checks, at the cost of delayed shipping. Anonymizing a delivery address is a bigger problem: PO boxes can be used, but shipping companies such as UPS and FedEx do not deliver to them. Purchasing plane tickets online without providing a real name and contact information is nearly impossible given post-9/11 security protocols. Completely anonymizing an e-commerce transaction is a difficult task at best, and it is extremely unlikely that users will resort to such measures in order to protect their personal information. Once this personally identifiable information has been submitted to a web site, the user has relinquished control over it. In order to conduct transactions over the web, some level of trust must be established between the user and the company. The first and most important step in establishing rapport is providing sufficient notice of privacy practices to users.
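The per-site aliasing idea behind LPWA can be sketched with a keyed hash: a stable, unlinkable mailbox name is derived from a secret and the site's domain. This is an illustration of the general approach, not LPWA's actual algorithm, and the forwarding domain is hypothetical.

```python
import hashlib
import hmac

SECRET = b"users-master-passphrase"   # known only to the user and the forwarder
FORWARDER = "lpwa.example"            # hypothetical mail-forwarding domain

def alias_for(site):
    """Derive a deterministic, per-site alias email address."""
    tag = hmac.new(SECRET, site.encode(), hashlib.sha256).hexdigest()[:10]
    return "u" + tag + "@" + FORWARDER

# Each site sees a different address, so profiles cannot be joined
# across sites, yet the forwarder can recompute the alias and route
# incoming mail back to the real mailbox.
print(alias_for("news.example.com"))
print(alias_for("shop.example.com"))
```

Because the alias is a pure function of the secret and the domain, the user need not store a table of aliases; the same input always reproduces the same address.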
Fair Information Principles

The FTC established the core tenets of privacy protection in a 1998 report by looking for common threads in data collection practices in the United States, Canada, and Europe. These widely accepted principles are the closest thing to a best-practices guideline that the industry currently has. The primary principle, and the focus of the following sections, is that of notice: individuals should be made aware of what personal information is being collected, who is collecting it, how it is being used, and how it is shared with third parties. Once individuals have been informed of the data gathering practices, there should be a mechanism to provide consent. Most online companies use some combination of opt-in and opt-out schemes to provide consumers with the ability to specify how their personal information can be used. In addition to notice and consent, an individual should have access to stored information, some way of submitting corrections, and the means of filing a complaint against a company. Companies should also make a reasonable effort to secure collected personal information against intruders. It is important to note that the principles are not independent of each other: consent, access, and redress are only possible if individuals are provided proper notice. While self-regulation systems are often criticized on the basis of enforcement and redress, it is perhaps more important to first examine how well they communicate privacy practices to the end user.

Privacy Seals

Privacy seals are awarded to companies whose privacy notices meet minimum standards set by independent auditing agencies such as TRUSTe and BBBOnline. Sites that meet the requirements are allowed to display a privacy seal. Seal providers conduct random audits of member sites and also offer arbitration services to help settle privacy disputes. Companies pay seal providers based on a percentage of yearly revenue.
Seal providers hope to increase a consumer's trust level by vouching for a company's privacy practices. TRUSTe's guidelines for drafting a privacy policy indicate that it should follow the Fair Information Principles for informing users of the company's data collection practices. It is important to note that the seal requirements do not dictate what information may be collected or how users should indicate consent. Sites are free to collect and use as much information as they want so long as they explain what they are collecting in the privacy statement and provide users with opt-in or opt-out choices, the ability to access collected information, and information about how to submit complaints. The burden is on the user to read the statement and decide if the terms are acceptable. A recent survey indicates that only 3% of web surfers carefully read the privacy policies of the web sites they visit; 64% of users spend no time reading them or only occasionally glance at them. Privacy seals thus fail to deliver on the principle of notice, because people are not reading the underlying policies. In addition, privacy seal growth appears to be stagnant. A 2001 survey by the Progress and Freedom Foundation reported that only 12% of randomly sampled web sites were displaying seals. At the time, TRUSTe claimed almost 2,000 members and BBBOnline had 760. At the time of writing, there are 1,458 participating TRUSTe websites and 763 BBBOnline members. One of the great benefits of the World Wide Web is the ability to connect any arbitrary set of pages together in a way that is both easy to author and easy to navigate. Indeed, a Notre Dame study found that an average of only 19 clicks was needed to connect any two randomly selected web sites. The ease with which different pages on different sites can link together actually hinders the effectiveness of privacy seals. After clicking on a link, it is not always apparent to the user which site is serving the information.
Pages that open within frames compound this problem by not displaying the server name in the web browser's address bar. Most privacy statements contain disclaimers stating that the company is not responsible for the privacy practices of third-party sites linked from its web pages. Even if users were to start reading privacy policies carefully, they may not realize when they have left the site where the statement applies and entered another where a completely different policy may be in place. Clearly, a more automated approach to providing notice to users is needed.

P3P

The Platform for Privacy Preferences Project (P3P) is a standard for automatically communicating privacy policies to end users. P3P is built on top of the existing HTTP standard and does not require the deployment of new web servers. Web site operators can use the P3P standard to specify privacy practices in a machine-readable file that software tools on the client side can automatically download and interpret. The client software compares a web site's P3P policy file to a predefined user profile and warns the user if the site does not meet his or her minimum privacy standards. A P3P policy file is essentially a distillation of a web site's privacy practices into a series of answers to multiple-choice questions. The policy files are an attempt to create a standard way for web sites to disclose how information is used, how users indicate consent, how users can access personal information, and what redress options exist. For example, the P3P standard dictates that policy files must disclose how each piece of personally identifiable information will be used: the description of each piece of collected data is annotated with one or more of 12 purposes, indicating, for instance, whether the information will be disclosed to third parties, kept for historical purposes, or aggregated with data from other users for later analysis.
Each purpose is further tagged as being opt-out, opt-in, or always collected. P3P requires support both from the web site, which must provide a policy file, and from the client, which must run a user agent to parse the policy and compare it to a user profile. Microsoft's Internet Explorer 6 controls the placement of cookies using a stripped-down version of P3P known as compact policy files. The AT&T Privacy Bird is a more complete solution that is capable of parsing full P3P policy files and integrates into Internet Explorer. Users can select a predefined low, medium, or high privacy profile as well as tweak individual profile elements. If the user navigates to a site that conforms to his or her profile, a green bird appears in the browser toolbar. A red bird appears when opening sites that conflict with the user's profile, and a yellow bird is displayed if a site does not contain a P3P policy file at all. At first glance, P3P appears to solve the notice problems that arise because users fail to read privacy statements and do not know when new policies are in effect. Unfortunately, like privacy seals, P3P suffers from a low industry adoption rate. Although the number of P3P-compliant web sites has been slowly increasing, a May 2004 Ernst & Young survey shows that only 24% of the top 500 web sites have deployed P3P policy files. Critics argue that creating policy files is a time-consuming and difficult task and may be infeasible for smaller companies with fewer resources. Furthermore, it is problematic to try to shoehorn the expressiveness of a full privacy statement into a small set of discrete options. Companies also fear the legal ramifications that may result from the loss of fidelity that occurs when translating a privacy statement into a P3P file. An oft-repeated charge levied against P3P by privacy organizations such as EPIC is that the standard does nothing to enforce a minimum set of privacy standards.
A web site can create a policy file stating that personal information will be collected, aggregated with other information to create a profile, and sold to third parties indiscriminately, and still be deemed P3P-compliant. It is problematic, however, to mandate a one-size-fits-all privacy standard that applies to all users. An individual may be comfortable providing information because it creates a more personalized browsing experience, yet not want information to be shared in a different context. For example, a user might enjoy seeing recommendations for similar CDs or books based on past purchases but not want browsing data to be captured when searching for material about an embarrassing medical condition. The burden of maintaining an individual's privacy standards should be placed on the P3P client tools run by the individual and not on the web site itself. The responsibility of P3P client tools should be to provide adequate notice to a user so he or she can make an informed decision about privacy. There are two main problems with trying to shift the responsibility of providing notice to client tools. First, there is the difficulty of informing users about the existence of P3P client tools in the first place. Although mainstream browsers such as Internet Explorer contain limited P3P implementations, more complete tools such as the AT&T Privacy Bird must be downloaded and installed separately. The other, and perhaps more fundamental, problem is that privacy often takes a backseat to functionality. In order to ensure that users are properly notified of privacy practices, P3P client tools need to default to more restrictive privacy settings and let users relax the constraints as needed. Unfortunately, many web sites will not operate correctly without the placement of cookies and do not provide ways for users to turn off information collection.
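The core job of a P3P user agent such as the AT&T Privacy Bird can be sketched as a comparison of a site's declared purposes against the user's profile, yielding the green/red/yellow signal described earlier. The purpose names below are illustrative rather than the exact P3P vocabulary.

```python
# Purposes this hypothetical user refuses to permit.
USER_REFUSES = {"telemarketing", "third-party-sharing"}

def evaluate(policy_purposes):
    """Return 'green' if the site's declared purposes are acceptable,
    'red' if any declared purpose conflicts with the user's profile,
    and 'yellow' if no policy file was found at all."""
    if policy_purposes is None:
        return "yellow"
    if USER_REFUSES & set(policy_purposes):
        return "red"
    return "green"

print(evaluate(["current-transaction", "site-analytics"]))      # green
print(evaluate(["current-transaction", "third-party-sharing"])) # red
print(evaluate(None))                                           # yellow
```

Note that the tool enforces nothing: a site declaring every invasive purpose is still "P3P-compliant"; the agent merely surfaces the conflict so the user can decide, which is exactly the notice-versus-enforcement distinction at issue.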
A survey of P3P-enabled web sites found that 82% failed to meet the high privacy profile settings defined in the AT&T Privacy Bird tool. P3P client tools often default to less restrictive settings in order to maintain a seamless browsing experience. In the same way that users do not take the time to read privacy notices, they are also unwilling to tinker with the default settings of P3P client tools.

Improving Industry Adoption

Both seal programs and P3P suffer from a low industry adoption rate. Companies have little incentive to spend the time and money required to apply for a privacy seal or create a P3P policy file if consumers are currently willing to share personal information or are in the dark about a company's data collection practices. Why should a company expend resources to provide better notice to users when it may create an aversion to sharing personal information at all? A recent study demonstrated that users may in fact be more willing to share personal information if the relevant portions of the privacy policy, along with the benefits of sharing information, are displayed in context. In the study, a concise description of the relevant privacy practices and the benefits of sharing was displayed next to the data entry fields in the web page itself. Providing users with better notice increases their ability to weigh the costs and benefits of information sharing. By incorporating contextualized privacy notices, companies may be able to increase their perceived commitment to privacy as well as obtain more data from better-informed users. The results of this study suggest that privacy seals are inadequate and that P3P client tools may not be operating at the correct level of granularity. Both Internet Explorer and the AT&T Privacy Bird make yes-or-no decisions for an entire web site. Users are forced to decide up front whether to continue browsing, without understanding how data sharing might be beneficial to them.
A better solution would be to link the user interface controls used in web pages, such as text boxes and radio buttons, with references to the section in the P3P policy file that describes what data the control is collecting. The P3P specification currently supports annotating data collection disclosures with a human-readable sentence explaining why providing the information might be valuable to the user, but policy writers are not required to provide this. Armed with this additional information, P3P client tools would be able to display the relevant costs and benefits next to the data entry widget itself. Increasing consumer awareness of data collection practices may drive companies to accept higher default privacy settings in P3P client tools. It may be instructive to compare industry's treatment of privacy with its treatment of security. In the past, expanding a program's feature list would often override any security concerns. The proliferation of worms, viruses, and other vulnerabilities, along with the ensuing negative media coverage, changed the priorities of software makers. The recently released Service Pack 2 for Windows XP increased the default security settings of the operating system at the expense of maintaining compatibility with legacy applications. If Microsoft were to add full P3P support to Internet Explorer, set the default privacy settings to a higher level, and let users re-adjust as needed, companies would be forced to adopt P3P in order to maintain compatibility, given IE's dominant market share. Despite industry protest to the contrary, the most expedient way to improve adoption might be to pass legislation requiring it. Lawmakers have a crucial role to play in the enforcement of the Fair Information Practices. It is of critical importance, however, that any proposed legislation keep the principle of notice in mind.