ADOTAS – The Federal Trade Commission’s online privacy framework — outlined by Chair Jon Leibowitz at a press conference last week — seems to walk the fine line of compromise between the desires of online behavioral advertising companies and privacy advocates. However, it’s said that the best compromise is the one that leaves both sides unhappy, which could easily be the case here.
But a better question may be: can anything satisfy data privacy advocates?
On the day of the press conference, which was publicizing a new study by the Stanford Law School’s Center for Internet and Society, privacy report author Jonathan Mayer tweeted, “A lobbyist for an online advertising company asked me today why I won’t give self-regulation a ‘fair chance.’ Isn’t a decade ample time?”
But, as we covered in great detail last week, Mayer's report has little to do with online behavioral advertising; it's more about sloppy user data management on the part of publishers sending data to third parties.
While 61% of the sites studied were sending "personal information," which Mayer defines as "information that with moderate probability and moderate effort can be used to identify a user," 39% weren't. Seventy-two of the 185 publishers Mayer examined were using better data security methods, mainly anonymizing URLs. (Facebook started doing this after The Wall Street Journal discovered it was sending unique IDs to third parties advertising in social games played on the platform.)
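Anonymizing URLs, in this context, means stripping identifying query parameters before a page address can leak to third parties via the Referer header. A minimal sketch of the idea, in which the list of parameter names is purely hypothetical:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical set of query parameters that could identify a user.
IDENTIFYING_PARAMS = {"user_id", "email", "username"}

def anonymize_url(url):
    """Drop identifying query parameters so the URL is safer to expose
    in a Referer header sent to third parties."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query, keep_blank_values=True)
            if k.lower() not in IDENTIFYING_PARAMS]
    return urlunsplit((scheme, netloc, path, urlencode(kept), fragment))
```

Facebook's fix worked along these lines: keep the parts of the URL a third party legitimately needs, drop anything that maps back to an individual account.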
However, Mayer interestingly absolved publishers of this with the line: "identifying information leakage is a fact of life on the web, and that identifying information may be shared with third parties."
I was kind of surprised by how many industry people seemed to shrug off the report given how it was advertised as proving “Yes, They Really Know It’s You.” At the same time, Mayer’s report didn’t really reveal anything new — a recent study with very similar methodology found very similar results.
And Mayer's research really only supplies half the story; even he admitted his research could not determine what happens to the data after it's collected. Even the title of the report, "Tracking the Trackers," is a misnomer, as there's no research into what happens after the data is sent. ("What the Trackers Receive" would have been more appropriate, though not as eyeball-grabbing.)
There's only an implication of some grand conspiracy in which publishers share personally identifiable information about their visitors with third parties as part of a silent and illicit agreement. Those companies then build ridiculously detailed profiles for targeted advertising (for now!).
But evidence? None. A commenter on the CIS site read my mind: reputable third-party data companies don't scrape URLs for possible user info. Why? Basic good business practice: if they were doing that clandestinely and it came out in the press, they'd lose all their publishers (i.e., data sources) in a flash. No publisher in its right mind would want to be associated with that toxic waste dump. (Take a look at how many major publishers suspended their KISSmetrics service after it came out the service was using ETags as hard-to-delete tracking identifiers.)
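For context on why the KISSmetrics story stung: ETags are HTTP cache validators, and a browser echoes a cached resource's ETag back in the If-None-Match header on every revisit. Hand each new visitor a unique ETag and you have an identifier that survives cookie deletion. A minimal sketch of that mechanism (the server logic here is illustrative, not KISSmetrics' actual code):

```python
import uuid

def handle_request(request_headers, store):
    """Sketch of ETag-based re-identification. `store` stands in for a
    server-side visit counter keyed by the ETag handed to each visitor."""
    etag = request_headers.get("If-None-Match")
    if etag and etag in store:
        store[etag] += 1  # returning visitor re-identified via cache validator
        return {"status": 304, "ETag": etag}
    # New visitor: mint a unique ETag that the browser will cache and echo back.
    etag = '"%s"' % uuid.uuid4().hex
    store[etag] = 1
    return {"status": 200, "ETag": etag}
```

The point is that no cookie is involved at all, which is exactly why "clear cookies" offered users no escape.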
On the privacy side, there’s a lot of clamor about what data is being sent via what methods, but little talk about how it is actually being used — what goes toward ad functionality (e.g., frequency capping) or internal publisher metrics vs. audience profiling. How is that data processed? Is anonymization a myth?
The FTC's framework would provide an answer to that missing side of the equation. But it feels like no amount of industry transparency will be enough for online privacy advocates. Do Not Track (DNT) fever is a hard one to break.
Many in the industry are consoling themselves that the widespread incorporation of browser DNT capabilities (and the grudging acceptance by industry associations and companies) will result in only a limited number of users flipping the switch. However, that could be just enough.
As we've explained many times before, including in an article covering other controversial claims by Mayer, tracking cookies are not used solely for behavioral advertising purposes. They also produce data for ad-serving functionality such as frequency capping, as well as for site performance and audience insights.
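Frequency capping is a good illustration of a cookie use that has nothing to do with profiling: the cookie carries nothing but a counter of how many times a user has seen a given campaign's ad. A minimal sketch, assuming a hypothetical cookie naming scheme:

```python
def should_serve_ad(cookies, campaign_id, cap=3):
    """Cookie-based frequency cap: serve a campaign's ad at most `cap`
    times per browser, tracked by a counter stored in the cookie jar.
    Returns (serve?, updated cookies)."""
    key = "freq_%s" % campaign_id  # hypothetical cookie name per campaign
    count = int(cookies.get(key, 0))
    if count >= cap:
        return False, cookies      # cap reached: skip this ad
    updated = dict(cookies)
    updated[key] = str(count + 1)  # increment and persist the view count
    return True, updated
```

Block that cookie and the counter resets on every visit, which is why a blanket "no cookies" response to DNT would break mundane ad-serving plumbing along with behavioral targeting.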
But consider this: if enough users did turn on DNT, it could force publishers to introduce an opt-in system for site monetization, in which the "unspoken transaction" of data for content shifts front and center: sites demanding users pay with either data or cash to view certain online material.
And as we've suggested before, device fingerprinting (via technology like BlueCava's) would be an intuitive way to establish this new system. Want to get a privacy advocate really worked up? Mention device fingerprinting. Though it should be noted that AdTruth, the device fingerprinting service from online fraud prevention and detection firm 41st Parameter, announced at its launch that it follows the DNT protocol of Mozilla Firefox.
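Following the DNT protocol, on the receiving end, amounts to checking a single request header: under Mozilla's convention, "DNT: 1" signals that the user has opted out of tracking. A minimal sketch of honoring it server-side (the function name is my own):

```python
def tracking_allowed(request_headers):
    """Honor the Do Not Track header as sent by Firefox: a value of "1"
    means the user opts out, so profiling and tracking should be skipped.
    Any other value, or no header at all, expresses no preference."""
    return request_headers.get("DNT") != "1"
```

The hard part, of course, isn't reading the header; it's what an ad-tech company agrees to stop doing once it sees it.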