Communication Research Takes on the Myths of Privacy Compliance

Last week, the New York Times published an article on low privacy literacy rates among consumers. The article is based on a recent study from the Annenberg School for Communication at the University of Pennsylvania that looked at Americans’ opinions about and understanding of privacy, surveillance, and technology.

For a comprehensive overview of the report’s findings, I highly recommend reading the New York Times article. In this post, I will be focusing on the research itself and why I’m so excited to see communication researchers tackle important questions in privacy. 

Communication Research Helps Identify Holes in Legal Outcomes

Not all laws are effective: some are not written well enough to ensure adequate compliance or enforcement, some further entrench social inequalities, and some simply miss the mark in achieving their objectives. I’ve observed privacy laws in the United States struggling with all three. I am not a lawyer, so I won’t argue on statute or precedent; I will argue on the basis of what I’m formally trained and experienced to evaluate: whether certain efforts in human communication result in intended or unanticipated behavior, to what extent, and with what impact.

The research from the Annenberg School for Communication highlights at least two key communication failures in current privacy laws that rely on consent:

  1. The definition of “consent” appears to be up for debate even in our laws – and that’s just silly to me as a communications professional, because ill-defined terms essentially block mutual understanding. It’s impossible to consent to something you don’t understand, yet it’s possible for companies to prove compliance with consent-based laws because they’re not expected to measure whether people actually understand their data practices and what they can do about it (see my previous post on outputs vs outcomes). A privacy policy or public statement is merely a communication output, not an outcome (e.g. genuine consent).

  2. Americans lack “knowledge of commercial data-extraction practices as well as a belief they can do something about them.” I’ve seen this happen when organizations know consumers would make different choices if they had all available information. The FTC isn’t ramping up investigations of dark patterns for no reason. But it can also happen when organizations fail to research what consumers truly understand. Regulators fall short here too: there are plenty of legal provisions that require organizations to use incredibly ineffective communication tactics (more on privacy policies later). 🫠

Privacy counsel extraordinaire Brandi Bennett told the attendees at last month’s Enigma Conference that “consent is the fiction at the center of our profession.” I believe these communication failures are a big reason why consent has failed to provide effective privacy protection for consumers. Compliance with the legal requirements doesn't guarantee effective protections – something we learned the hard way with infosec – and it’s disappointing that so many privacy professionals still view the regulatory checklists as sufficient. Narrator: they’re not. 

Communication Research Helps Improve Consumer Understanding of Laws

The COVID-19 pandemic catalyzed consumer interest in health data privacy – a right we still don’t have in the U.S. beyond the limited restrictions that apply to healthcare providers subject to the Health Insurance Portability and Accountability Act (HIPAA). Just search for HIPAA and COVID for thousands of examples of communication failures in helping consumers understand their rights. Meanwhile, 92% of Americans believe health care data privacy is a right we do have, according to recent research from the American Medical Association. The same research shows that people are unclear about the rules that protect their privacy and are concerned about who has access to their health data. Of course they are! Everyone from Congress to businesses to activists has made the distinction between the rights we should have and the rights we actually have clear as mud. The correct answer is to bring reality closer to consumers’ expectations for privacy; helping people understand their rights and how to exercise them doesn’t involve fine print or disingenuous uses of consent.

OK — now, a word on privacy policies. They’re terrible. We know. Many of us would get rid of them completely if we could, because they are piss-poor communication attempts. But we can’t, because they’re legally required in many jurisdictions – and note that true “transparency” isn’t necessarily the requirement; the long legal document is, in fact, what the law mandates. So we’re stuck with them for now. But don’t get comfortable – they remain inaccessible to most consumers, which makes them a terrible communication tool. Kashmir Hill has been covering their terribleness for more than a decade! Apparently, to a lot of legal teams, “clarifying” their data practices means breaking up the same drab paragraphs into shorter ones with (gasp!) bullet points. So far, our legally mandated attempts at informing consumers about how their data is used and what they can do about it have been colossal failures, because the law only requires organizations to create a communication output (published words) rather than an outcome (genuine understanding) – a standard that would require organizations to measure the effectiveness of their communications.

Communication Research Helps Organizations Build Trust with Consumers

What are we to do, then, when legal compliance fails to create meaningful communication outcomes? First, comply with the law anyway to the best of your ability. This is the moment where I remind all in-house privacy counsel that simply being a legal requirement won’t force most businesses to comply. There are plenty of laws that companies knowingly violate due to conflicts with other laws, a desire to force changes in the law, or even willful defiance (oh hey, Elon). If you want your organization to take you seriously and view you as a trusted business advisor, you need a better way to encourage, persuade, and drive change than playing the role of Chicken Little.

And this, my friends, is why we added a panel asking “what else can we do?” at the recent USENIX Enigma 2023 conference in Santa Clara, California. Officially titled “Privacy Policies, by Lawyers, for Lawyers. What About Everyone Else,” the panel brought together legal and product experience experts to talk about other ways organizations can engage and educate customers in meaningful ways. There was discussion about how to make privacy policies more accessible (a truly superfluous task in my opinion, since no one ever reads them), but there were also plenty of examples of product and UX choices designed to give individuals information in context instead of pushing them to a wall of text or a laundry list of toggles isolated from the product experience. I encourage you to watch the full discussion here.

We can’t build trust without understanding, and if people don’t understand how their data is used, the rights they have, and how to exercise them, then all our efforts to build trust in our privacy programs are for naught. To get there, we need to measure the effectiveness of our communications – including the channels, language, and visuals we use – so that we can adjust as needed and ensure the people whose data we’re using never feel duped into sharing it.

- - -

Discernible’s privacy communications experts help organizations of all sizes to communicate privacy commitments and competencies — and measure the effectiveness of those communications to ensure privacy programs drive outcomes, not just outputs. Send us a note!
