You’re walking down the street and receive a mobile-phone text message that offers a digital coupon for a frappuccino at the Starbucks (SBUX) you’re approaching. This brand of communication is known as “one-to-one marketing” or “behavioral advertising,” and it’s likely on its way to a wireless handset near you.
Pharmacies and grocery stores have long targeted offers, but there are many new ways marketers can use personal information to tailor advertising messages. They’re able to gather information about personal interests by tracking Internet use and digital media viewing habits, among other things, and then tailor messages accordingly. Consumers benefit from the customization: they receive ads relevant to them rather than mass-market messages that may have no utility for them at all.
The marketing landscape is being transformed through the availability of new technologies. Social networks Facebook and News Corp.’s (NWS) MySpace recently joined the ranks of behavioral marketers (BusinessWeek.com, 11/7/07). Telecommunications giant Verizon Communications (VZ) has unveiled a plan to mine data from its wireless and wire-line customers. And data powerhouse Acxiom (ACXM) recently announced a new service geared toward personalized marketing.
How personal data will be used to tailor communications with consumers in the future is not exactly known, as new technologies rapidly emerge. For marketers and their targets, though, the marketing world will change. Discussions about how to best protect privacy amid this transformation are well under way, with some calling for an overhaul in regulation. But what’s really needed is the better application of existing guidelines, rather than the creation of a new set of rules.
Too Much Protection?
The theme at the recently concluded meeting of the International Data Protection & Privacy Commissioners in Montreal was “Terra Incognita,” a reference to the unknown future ways that technology will collect and use personal data. While much of the attention was on the new ways that governments can collect and use data, some attendees concluded that existing privacy laws governing the collection and use of personal information are outdated and increasingly irrelevant, and that greater restrictions are needed.
In that vein, a coalition of U.S. privacy organizations recently demanded that the Federal Trade Commission (FTC) set up a “do not track” list (BusinessWeek.com, 11/5/07) that would let consumers surf the Web not just anonymously but also shielded from targeted marketing that uses anonymous data to tailor online advertising.
This would take privacy law to a new level, where protection is given not only to private data (names, addresses, account numbers, etc.) but also to anonymous data (e.g., data collected through cookie technology), which would be legally regulated. The complexity and enforcement problems with a “do not track” law are enormous. Advocates liken it to the “do not call” rules that pertain to telemarketers, but only the names are similar. Compiling and applying a list of those who do not want tailored advertising will be a technological nightmare. Compliance, to the extent it can occur at all, will be costly. Ultimately, consumers will suffer through increased costs passed on to them, and opportunities for more useful consumer information will be diminished.
Fair Information Practices
Proponents of such a new Web of regulations are overlooking the existing privacy toolbox, principally the practices that have developed under the umbrella of Fair Information Practices. Informed consumers can, using the tools available right now on their computers and choices companies provide them, control the extent to which they are subject to behavioral marketing.
The FTC explains how Fair Information Practices underlie current privacy laws this way:
“Over the past quarter century, government agencies in the United States, Canada, and Europe have studied the manner in which entities collect and use personal information—their “information practices”—and the safeguards required to assure those practices are fair and provide adequate privacy protection. The result has been a series of reports, guidelines, and model codes that represent widely accepted principles concerning fair information practices.”
While the national frameworks for implementation of the Fair Information Practices differ, with the European Union countries being more prescriptive and the U.S. more self-regulatory, the bedrock element of each is the principle that consumers should be informed of how their personal data may be used so they can make educated choices. In short, transparency is the key. And in the U.S., if companies say one thing in their privacy policies but do another in the collection and use of personal data, the FTC will step in to enforce. Practice is measured against promises.
The five major search engines—Google (GOOG), Yahoo (YHOO), Microsoft (MSFT), Ask.com (IACI), and Time Warner’s (TWX) AOL—were recently lauded by the Center for Democracy & Technology (CDT) for changes in their privacy practices, specifically with respect to how search data (search terms, cookies, and IP addresses) will be handled.
The CDT said the changes show that competition works — competition made possible by the visibility of the companies’ privacy policies, which provide the notice and choices to consumers. Others disagree with the CDT over how far the Googles of the world have actually gone to protect data. In any event, adherence to Fair Information Practices has enabled the debate by providing notice to consumers and to the world about how data is being handled (and what the choices are for consumers before they use a search engine).
User-Friendly Privacy Practices
An overhaul of the existing privacy framework, including the addition of “do not track” regulations, is not necessary. Fair Information Practices are expected to remain the foundation of privacy law for some time to come. So, with the advent of new technologies to collect personal data and tailor marketing messages, the fundamental issue is how the information about data use (and the attendant choices available) is communicated, not whether technology using personal data to engage in behavioral marketing should be regulated. In short, how clear and useful are privacy policies?
The FTC and the financial services community have been engaged in an exercise this year to standardize and streamline the privacy notices sent to consumers under the Gramm-Leach-Bliley (GLB) Act. I agree with critics who say the notices sent to consumers in the past—often printed in tiny typeface on flimsy paper, and ignored—need a makeover. But the strictures proposed in terms of content, format, and presentation have generated criticism for being inflexible and for stifling innovation in communications with consumers. Whether a standard form emerges remains to be seen.
Outside of the GLB realm, no such proposals for standard forms have been made. So there is a real opportunity for companies collecting consumer data for tailored marketing to communicate in new, clearer, and more consumer-friendly ways in order to provide the notice and choice that are the bedrock Fair Information Practices principles. Privacy policies need to be much more user-friendly. Of course, as a legal matter, the fine print needs to be there. But there is no reason the policies cannot be summarized with headlines in plain English, in a graphically attractive way. Just as dense management reports often contain executive summaries, companies should employ consumer summaries that highlight the privacy provisions. Even video can be used to describe the privacy options available to consumers. Verizon recently experimented with such video information.
Matter of Control
A special opportunity exists for companies that provide wired and wireless voice, video, and data services. The breadth and depth of the data such companies handle makes it incumbent upon them to clearly state their collection, storage, security, and sharing practices, and what the consumer options are with respect to how data are used. Likewise, more and clearer information needs to be provided to consumers regarding how they can use tools on their own computers to control the collection of data at its source.
In the new technological era, marketers will be able to provide more relevant (and more useful) information to consumers based on personal information, but that will only work if people have control over what information they are sharing. Privacy policies therefore will take on an increasingly important role, and companies will (and should) be rewarded for innovations in how such policies are communicated. A new “do not track” bureaucracy is not what is needed.
Christopher Wolf is a litigation partner in the Washington (D.C.) office of Proskauer Rose and chairs the firm’s Privacy & Data Security Practice Group. He is the editor and lead author of the Practising Law Institute treatise Proskauer on Privacy: A Guide to Privacy and Data Security Law in the Information Age.
Compliments of BusinessWeek