Data Privacy: What People in the US Really Think


Market researchers face a conundrum: to create more effective products and services, they need a lot of data about people and their behavior. But trust in both marketers and market researchers has been seriously damaged by the ongoing revelations around misuses of data involving social media and digital targeting. With the advent of new data protection laws in Europe (GDPR) and now the US (CCPA), it is no longer enough to simply add another line onto an already extensive privacy policy or have yet another pop-up tell us that we must opt in to access content.

At the ARF (Advertising Research Foundation), we believe that the people whose data are being monetized should have more voice in how their data are used. To that end, we decided to get a contemporary read on what people understand about data usage and privacy policies. We presented our findings during our annual AUDIENCExSCIENCE event.

In early June — just after the implementation of GDPR in Europe that saw us all wading through updated privacy policies and opt-in screens — we surveyed 1,223 adults in the US, balanced by age, gender, and region. We wanted to know how well consumers understood common privacy terms, what personal data they were willing to share, and how their attitudes changed when offered the benefit of greater customization of their online experience. The survey also delved into the value consumers placed on their data, and which institutions consumers trust and to what degree.

The mixed news? Consumers are willing to share a variety of data regarding who they are, but are less willing to provide information that can be used to personally identify them or locate them in the physical world.

While the clear majority of adults are willing to share gender (95%), race or ethnicity (91%), and even sexual orientation (82%), they are less willing to share their home address (43%), work address (33%), any form of phone number (34% will share a home landline and 35% a mobile number), financial information (22%), or medical information (29%). Here we see that the concept of protecting PII (Personally Identifiable Information) is well entrenched.

Perhaps the most telling finding from the study was that offering to customize the media experience using personal data does not significantly change a person’s willingness to share or not share the data. For too long, the data marketing world has thought that this value exchange is enough for consumers, but clearly that is not a view shared by respondents to our survey.

For the data they will share, consumers don’t expect to be paid very much — and the values assigned are about the same for each data element. Approximately a third say that they do not need to be compensated at all for their data. However, for those PII variables that they are NOT willing to share, there is no price that changes their mind: their privacy is priceless.

We live in an era of public debate about truth and trustworthiness. Regarding trust in media and institutions, people are polarized: 49% trust television news, while 51% do not; 45% trust the social media sites they visit regularly, while 55% do not. People are pulling inwards: 86% agree that “people like me” are the most trusted source. Notably, trusting a media source does not affect people’s willingness to let that source share their data with others.

An important part of the survey was gauging how much of the language used to communicate about privacy people actually understand. Here, we clearly could be doing better. We offered a “composite” example of language from the privacy policies that were popping up everywhere in the days following GDPR’s enforcement. Some of the key terminology used in privacy policies is not well understood, such as “first- and third-party data,” “pixel tags,” “application data caches,” and “server logs.” But the fact is that few people actually read that language, since it gets in the way of whatever they are trying to do. This raises questions about whether such privacy disclosures and permissions will be effective at improving upon the status quo.

The survey underscores the need to take a step back and understand and care about how consumers feel about the retention and use of their personal and behavioral data. We need to begin by addressing people in the US not just as “consumers,” but as individuals whose data are a privilege for us to access, not a right.

This survey is a clear call to action for all companies dealing with personal data — which these days is pretty much all companies. We must respect the right to privacy and not fool ourselves that we have communicated clearly about how data is being collected and used. A first step is to test our privacy policies for comprehension: let’s try to be extremely transparent, simple, and direct in how we communicate. If we do not, we risk losing the valuable respect and trust of the US population and ensuring that costlier and more business-restrictive privacy legislation is passed.

At the ARF, we are in the process of drafting a 2019 code of research ethics for our members. The ARF Code of Ethical Research Conduct includes principles of honesty, integrity, transparency, and a chain of trust, applied to research participants, clients, the profession, and the public. The draft of this code started with a review of 40 different codes of ethics from associations and others, and is informed by the findings of our survey. We’ve taken an important first step in respecting the rights of people and their data, and we encourage you to join us.

