Sunday, March 30, 2008

U.S. Surveillance Laws Hurt U.S. Businesses

A recurring theme on this blog is that foreign data privacy protections can have effects in the U.S. Generally, the posts that deal with this have been about the positive effects of foreign privacy protections that might spill over onto U.S. consumers (see, e.g., here). However, this article from the Globe and Mail illustrates another effect that foreign privacy protections can have in the U.S.: they create new markets in which U.S. companies are at a competitive disadvantage. The specific situation described is as follows: a Canadian university was having IT problems and needed a solution. Google was looking for customers for its online collaboration products. It sounded like a marriage made in heaven. Except that Google is subject to the USA Patriot Act, which the U.S. federal government can use to snoop on people's private communications. That, of course, is incompatible with Canadian privacy law. As a result, there's a kink in what would otherwise be the perfect marriage between a university and Google: those using the university's spiffy new IT tools were told not to use them to transmit any private data, including grades.

On one hand, I'm tempted to laugh at a story like this. Honestly, to me there's something humorous about a university getting a bunch of cutting-edge IT tools and then not being able to use them for even the most mundane tasks, like reporting student marks. On the other hand, incidents like this could represent a serious problem for American businesses. Because (unlike the U.S.) many foreign countries actually have great respect for individual privacy, their consumer populations can form an entirely new market: the market made up of people who care about privacy. As shown by the article in the Globe and Mail, because U.S. law is actively hostile to individual privacy, U.S. companies can only compete for this new market at a severe disadvantage relative to foreign companies. Now, for a monster organization with a huge and well-developed stable of products like Google, competition might still be possible. However, for the most part, my guess is that U.S. companies will not be able to overcome the structural disadvantages imposed by this country's disregard for privacy.

Thursday, March 27, 2008


Compliance by U.S. multinational companies with the data protection and e-discovery laws, rules and regulations of the U.S. and other jurisdictions can pose significant challenges. While the laws do not impose conflicting requirements, the differences in approach to data privacy protection between U.S. laws and those of the EU and its member states, and the complexities of their requirements, demand a comprehensive team approach to compliance.
U.S. federal and state laws take a patchwork approach to personal data protection, with a myriad of data privacy requirements varying by industry. There is, however, no comprehensive data privacy protection law. Nor are there special requirements for the transfer of personal data, cross-border or otherwise, as long as the “sharing” has been disclosed to the consumer or falls within the exceptions provided for by applicable law.
More than 35 states have enacted data breach notification and security freeze laws, with many variations among them in the method and timing of notification. To date, the U.S. Congress has not been able to agree upon a uniform approach to data breach notification and security freeze rights. Amendments to the Federal Rules of Civil Procedure that became effective in December 2006 have underscored the importance of electronic discovery, so corporate counsel must be concerned with the risks and potential sanctions that could result from non-compliance with a discovery order.
Companies that collect and process their own employee or customer data in the U.S. when that data resides in a European Union member country are presented with even greater challenges. The EU Data Protection Directive, as well as the data protection laws of the country of the data subjects’ residence, impose broader data privacy requirements on companies. In addition to the U.S. laws, such companies must be cognizant of the laws of the jurisdiction where the data to be processed resides. The aim of the Directive is to ensure that each member state imposes a similar level of protection of data, so that data can be transferred freely within the EU subject to the same security standards in the country of receipt as it is in the country of transfer.
The EU member states that were under the control of fascist regimes during World War II are particularly keen on avoiding the abuses of individual privacy rights that occurred during that period. Thus, the Data Protection Authorities of France, Spain, Germany and Italy are very active in their enforcement efforts, conducting costly investigations and levying monetary sanctions and fines against violators of their laws. On April 12, 2007, the French DPA, CNIL, announced the imposition of a €30,000 fine against Tyco Healthcare France Corporation for non-cooperation and for providing CNIL with erroneous information. To date, CNIL has imposed 16 monetary sanctions, ranging from €300 to €60,000, and issued 170 summons, 11 orders to cease or amend processing practices, and 15 warnings. This represents a 200% increase in activity over 2006. In July 2007, Spain’s Supreme Court confirmed that country’s DPA’s largest-ever fine, €1,081,822, against Zeppelin Television, S.A. Additionally, for the first time, Spain’s DPA has conducted a data privacy audit outside of Spain, in Colombia, where Spanish citizens’ personal data is being processed. The EU’s Data Privacy Commission has also been active in enforcing the requirements of the Privacy Directive on member countries.
Earlier this year, an independent EU panel launched an investigation into whether U.S.-based Google Inc.'s Internet search engine abides by European Union privacy rules. The panel persuaded Google to strip its stored user data of information that could be used to identify the user once that data is 18 months old. Google accurately noted, however, that governments and businesses are obliged to retain information, and that it is difficult to operate a global Internet service according to different privacy standards in different countries.
The same observation can be made as to many other types of businesses. The complexities and risks associated with privacy laws have never been greater, and they require vigilant monitoring by counsel and data security officers. One of the EU Directive’s principles requires that personal data be transferred cross-border only if the country of receipt provides “adequate protection.” Various options are available to companies to address the requirements of the EU Privacy Directive. Model contractual clauses have been approved by the EU for inclusion in contracts between companies and their service providers. For companies with employees or customers in multiple European jurisdictions, adoption of Binding Corporate Rules that address all of the EU Directive requirements has also been deemed acceptable by the EU, provided that the BCR have been approved by the EU Data Privacy Commission and the applicable country’s DPA. The EU DPAs are also working on uniform BCRs, so that it would be unnecessary to obtain approval from each EU member state. Finally, certification under the U.S.-EU Safe Harbor Framework provides protection against challenges of non-compliance with the EU Directive. More information on Safe Harbor certification, including a list of the more than 1,300 U.S. companies that have joined the Framework, can be found online. Adopting one of these methods to meet the “adequate protection” principle will permit multinational companies with European operations to truly operate without borders with respect to the personal data of their employees and customers.
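By way of illustration only (this is a hypothetical sketch, not Google's actual process, and every field name and retention detail here is invented), an 18-month log-anonymization policy of the kind described above might look something like this:

```python
from datetime import datetime, timedelta

# Roughly 18 months; an assumed figure for this sketch.
RETENTION = timedelta(days=18 * 30)

def anonymize_old_entries(log_entries, now=None):
    """Strip identifying fields from log entries older than the retention window.

    Each entry is assumed to be a dict with 'timestamp', 'ip', 'cookie_id',
    and 'query' keys. Purely illustrative, not any company's real schema.
    """
    now = now or datetime.utcnow()
    cutoff = now - RETENTION
    for entry in log_entries:
        if entry["timestamp"] < cutoff:
            # Remove fields that could identify the user; keep the query
            # itself so aggregate analysis remains possible.
            entry["ip"] = None
            entry["cookie_id"] = None
    return log_entries
```

The tension Google pointed to is visible even in this toy version: the cutoff would have to vary by jurisdiction if different countries mandated different retention or deletion periods.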

Thursday, March 20, 2008

The Problem of Compensation

In my last post, I addressed the proper measure of damages for exposure of a person's private information. In response, the Dunning Letter put up a post providing some interesting statistics about the cost ($5,720 per victim) and prevalence (the top complaint in the FTC's identity theft and consumer fraud survey) of identity theft. At that time, I considered preparing a responsive post, essentially playing devil's advocate and pointing out that providing compensation via lawsuits is really a poor way to combat the problem of information exposure because, even if you could get the proper measure of damages, most people wouldn't take the trouble to file a lawsuit. Further, even if people did file lawsuits, the transaction costs associated with litigation (i.e., attorneys' fees) mean that, even if you did provide incentives to avoid data exposure, a ton of effort would be wasted along the way.
However, this post brings the problem into even sharper focus by describing the situation of a young woman who states that she reported an identity theft, which took between 20 minutes and an hour, and that she got NOTHING in return. If she doesn't think it's worth an hour of her time to report an ID theft (and she's undoubtedly not alone in that), then you can bet there will be very few consumers willing to spend the time (years) and money (thousands of dollars) necessary to go to court. The bottom line: while providing compensation is worthwhile, it isn't enough.
So what is enough? My proposal is regulation, clearly written and consistently enforced. Part of the problem is that businesses simply don't know what they need to do to avoid security breaches. Also, businesses know that, even if there is a breach, there isn't much chance that individual plaintiffs will be able to successfully sue for damages. Clear, consistently enforced regulation could solve both problems, by providing clear guidance for businesses and by providing a strong incentive (the threat of government penalties) for following that standard. At the moment, though, the U.S. model seems to be notification followed by individual litigation, which is, as set forth above, a highly suboptimal solution.

Sunday, March 16, 2008

Problems with U.S. Courts' Treatment of Security Breach Damages

This article from Computer World asks the question "When Does a Privacy Breach Cause Harm?" and then proceeds to take U.S. courts to task for failing to recognize damages from security breaches beyond verifiable damages from identity theft or account fraud. While I agree that U.S. courts have done an atrocious job with respect to protecting privacy (see, e.g., here, describing the 7th Circuit's statement that plaintiffs could not proceed on an action based on a data security breach, despite circumstances showing that the breach was caused by identity thieves), I have to take issue with the analysis offered in the article. The article states that the problem with what courts have done is that they have overlooked "[t]he assault to personality and feelings [that] is the quintessential privacy injury." That rationale just doesn't work for me. Human feelings are notoriously hard to quantify, which means that damages based on assaults to personality and feelings would likely swing wildly from case to case and judge to judge, even when the underlying facts in particular cases are similar. Moreover, basing damages on feelings of loss and assault to personality runs the significant risk that juries will simply decide that those losses are too small to justify compensating, since studies (e.g., here) have shown that most people place little to no value on the privacy of their personal information.
A better option, and one I happen to agree with, is that businesses which suffer a security breach through their own fault (e.g., negligence) should be held responsible for the quantifiable damages caused by that breach, even if there is no subsequent identity theft. For example, the time spent by customers replacing credit cards with stolen numbers, or the cost of various identity theft protection services, is easily determined and would serve as a measure of damages that courts could readily compute and assess. Indeed, since limiting damages to those directly caused by identity theft or account fraud gives consumers an incentive not to prevent identity theft, making companies responsible for quantifiable costs would improve the status quo by increasing the level of protection courts give to privacy, while avoiding the difficulties of trying to quantify injuries to personality. To me, that's a far superior alternative to relying on damages to personal integrity, which are both hard to quantify and easy to undermine.
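As a toy illustration of how mechanical such a computation could be (every figure below is invented for the example; real numbers would come from evidence in a particular case):

```python
# Toy illustration of quantifiable breach damages. All constants are
# hypothetical assumptions, not figures from any actual case or study.
HOURLY_VALUE_OF_TIME = 20.00       # assumed value of a customer's time, $/hour
HOURS_PER_CARD = 1.5               # assumed time to replace one compromised card
MONITORING_COST_PER_YEAR = 120.00  # assumed annual cost of credit monitoring

def per_customer_damages(cards_replaced, years_of_monitoring):
    """Sum the easily verified, objectively measurable costs for one customer."""
    time_cost = cards_replaced * HOURS_PER_CARD * HOURLY_VALUE_OF_TIME
    monitoring_cost = years_of_monitoring * MONITORING_COST_PER_YEAR
    return time_cost + monitoring_cost
```

The point is not the particular numbers; it's that each input is objectively verifiable, so a court could apply the same formula consistently from case to case, which is exactly what "assault to feelings" damages cannot offer.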

via The Dunning Letter.

PostScript: I am well aware that laws vary tremendously from state to state. My statements regarding the state of current privacy laws reflect the holding in Pisciotta v. Old National Bancorp, in which the 7th Circuit addressed the issue of damages for a data security breach in the absence of subsequent identity theft.

Tuesday, March 11, 2008

Limits to Depending on Europe

In this post I posited that U.S. consumers might benefit from Europe's generally more protective attitude towards individual privacy. However, a court case from last year dealing with obligations to turn over material in litigation demonstrated that there are limits to relying on European privacy laws. The case was Columbia Pictures v. Bunnell, and it was part of the ongoing battle that content owners have been fighting against peer-to-peer networks. In this case, Bunnell decided to put his server in the Netherlands, hoping that that country's relatively strong privacy laws would benefit his customers if the MPAA ever came calling. It turns out that that strategy doesn't work when you're sued in the U.S. As the court emphasized in ordering Bunnell to produce records of his customers' requests, "it is well settled that foreign blocking statutes do not deprive an American court of the power to order a party subject to its jurisdiction to produce (let alone preserve) evidence even though the act of production may violate that statute" (emphasis added). The bottom line: if you care about privacy, you need to fight for it at home (wherever that may be), because depending on foreign standards to protect you won't work if you're before a local judge who sees privacy as simply an obstacle to the proper function of the law.

Sunday, March 9, 2008

Conversations I will Never Have

Lucky: You got something to write with?

Paris: Ummm, no, 'cause I'm lying on a table.

The above is an actual partial transcript of a conversation between Paris Hilton and a white hat phone phreak who later went on to give Paris a lesson in voicemail security.

via Wired.

All I can add is, if I'm ever in a situation where I might be lying on a table, and I have the poor judgment to answer my cell phone, I hope I have the presence of mind to lie about what I'm doing so that what would be (for me) an embarrassing moment isn't broadcast over the internet.

Thursday, March 6, 2008

Bilski: Not a big deal

While I generally write here on information security and data privacy law, in my legal career I also do a good deal of work as a patent attorney. Right now, the patent law community (and a fair portion of the mainstream media) is up in arms over a case which supposedly threatens to end the era of software and business method patents. Frankly, I think that the case, commonly referred to as Bilski, isn't going to be that big a deal. I explain why in this guest post over at Patent Baristas.

Sunday, March 2, 2008

Code Does Not Trump Law

According to this article from C|NET, an Australian judge has propounded an original and counterintuitive theory: that technology, represented by computer code, is more powerful than the law. In his own words, "We are moving to a point in the world where more and more law will be expressed in its effective way, not in terms of statutes solidly enacted by the parliament...but in the technology itself--code."

Of course, I use the words "original" and "counterintuitive" in their ironic sense, to mean "not original" and "not counterintuitive." The basic thesis is at least as old as Lawrence Lessig's 1999 book Code and Other Laws of Cyberspace. However, the point of this post is not to point out that the judge's ideas are unoriginal; it's to set forth why I think they're wrong. Basically, I see three problems with the judge's argument that code trumps privacy legislation.

First, the judge fails to recognize that there are many types of privacy concerns. The article asserts that search engines like Yahoo! and Google have rendered the concept of limited usage for personal information obsolete. Frankly, I'm not sure how anyone could support such a silly notion. Yahoo! and Google make it easier to find information about a person, but they do nothing to release information which isn't already publicly available. Thus, you can use Google to find out that I was (at one point) a competitive chess player, but you can't use Google to find out how much I paid the last time I went to the hospital. The fact that the judge does not realize that more serious concerns would be raised by my hospital records being publicly accessible through Google than by my blog being accessible indicates that his ultimate conclusion that code trumps law shouldn't be taken seriously.

Second, the judge overlooks the fact that, historically, the force of the law has a relatively good record of trumping the capabilities of code. For example, at one point there was a file-sharing service called Napster which was (supposedly) going to tear down the antiquated structure of intellectual property laws (for an example of the heady predictions related to Napster, see here). Of course, that didn't happen. What did happen is that Napster was sued out of existence based on violations of the intellectual property laws. For an even better example, consider the Digital Millennium Copyright Act, which has eliminated entire categories of consumer products (see this article for examples) that, if "code" actually trumped law, would be freely available. Now, in using these examples, I am not trying to make a normative argument that law should trump code, or that the world is better off because Napster (in its original incarnation) is gone and the DMCA is the law of the land. However, I think these examples make very clear that "code trumps law" is a conclusion which is simply not supported by the history of conflicts between the two.

Finally, the judge seems oblivious to the fact that "code" (or technology more generally) doesn't have the power to enforce social norms without the support of the law. For example, a search engine which is written in a manner that collects no personal information could only enhance privacy if: 1) people could trust its claims of gathering no personal information, which requires truth in advertising laws; and, 2) people could be sure it wouldn't unilaterally change its policies after they had relied on its guarantees of privacy, which requires the law to enforce its agreements. To me, the failure to understand that it is law which allows code to act as an alternate enforcer of social norms further undermines the judge's credibility, and his ultimate point that code trumps law.

Actually, the last of the three points is what bothers me most about the "code trumps law" argument. If you really think that "code" (or technology) trumps law, and that "code" can ultimately be used to enforce social norms, there's no real need to fight for appropriate legal change. The result, of course, is that there won't be any legal change, and "code" will eventually trump law simply because the "law" side of the equation is ignored. As someone who cares about privacy protections, that seems to me an outcome which should be avoided, but it can only be avoided if people realize that the law is vital to protecting individual rights.