The communicative power of privacy policies

by Aisling Dawson (Trilateral Research)

Privacy policies can be an effective communicative tool, building up consumer trust in an organisation by communicating that organisation’s privacy practices. However, many privacy policies are long and complex, obfuscated by legal jargon and vague clauses. How can we ensure that privacy policies retain their communicative power and harness it for good? By developing a framework for analysing organisations’ privacy policies through Deep Learning techniques, the TRUST aWARE project aims to enhance consumer trust when faced with confusing privacy policies, allowing consumers to better distinguish the trustworthy from the untrustworthy.

The proliferation of privacy policies online

With online organisations collecting an increasing amount of personal data from their users, organisations within the EU and those transacting with EU citizens are legally obliged to communicate what data they are collecting, how they are using it, and if they are sharing it with third parties [1].

Privacy policies are documents “regulating the relationship between the user and the website” [2] with regard to data use. Yet, from an ethical point of view, privacy policies should not only be considered a tick-box legal requirement. Rather, privacy policies should have a normative, communicative power that extends beyond imparting an organisation’s privacy practices: they should communicate to what extent an organisation can be trusted to promote the consumer’s best interest. In this way, privacy policies act as an assurance to the consumer that fosters trust.

Contemporary privacy policies and miscommunication

Yet, the length and complexity of most privacy policies undermine their communicative power. Whilst 38% of people claim to sometimes read privacy policies, only 8% could understand the contents [3]. Moreover, where consumers do not understand privacy policies, they are also unlikely to perceive them as trustworthy. By damaging consumers’ perception of their privacy protection, confusing privacy policies can affect how consumers interact with a given organisation, leading to increased withholding of useful data and negatively affecting trade.

Additionally, with the wide propagation of privacy policies online, as many as 36% of people admit that they do not read organisations’ privacy policies before agreeing to them [4]. There are two potential reasons for this. First, reading every privacy policy they are faced with is, in practice, impossible for consumers. Research found that even skim-reading every privacy policy an individual is presented with would take 19 days a year [3]. This is compounded by the fatigue consumers feel towards privacy policies, which they confront daily. Second, the communicative strength of privacy policies’ mere existence can be exploited by organisations. Through their transparency and requirement for user consent, privacy policies reduce the consumer’s “perceptions of risk” whilst enhancing their “perceptions of control” [3]. Research finds that simply by providing a privacy policy on a website, consumers are more likely to feel that their privacy is protected, even where they have neither read nor understood the policy in question [5]. Whilst in both of these scenarios the consumer does not withhold their data and there is no loss in trade, consumers’ blind acceptance of policies remains problematic from an ethical standpoint.

First, where consumers do not read privacy policies but accept them without necessarily trusting the vendor, this distorts the fundamental values underpinning data collection. The GDPR (General Data Protection Regulation) data collection and privacy protection framework is premised on informed consent, notice, and trust between vendor and consumer [3]. Thus, where privacy policies are failing to communicate a company’s privacy practices and, on a deeper, normative level, are failing to communicate that the company can be trusted, this degrades the communicative function and value of such policies.

Alternatively, a vendor may be trusted simply because it has presented the consumer with a privacy policy; yet where such policies are not read, they fail at the first hurdle: communicating the organisation’s privacy practices to consumers and gaining informed consent. Further, by creating a semblance of user control with regard to their data, such privacy policies engender trust whilst not necessarily being trustworthy. At this juncture, the distinction between trust and trustworthiness should be noted. Trust refers to the “willingness to depend” [6] on the organisation using or collecting an individual’s personal data. Conversely, trustworthiness refers to the “attributes of the trustee” which make them deserving of such trust [6]. Consequently, privacy policies could induce a consumer’s willingness to depend on an organisation, without that organisation providing a trustworthy privacy policy or effectively communicating its privacy practices.

How can TRUST aWARE enhance privacy policies’ communicative capacity?

To tackle these communicative issues and harness the communicative strength of privacy policies for good, the TRUST aWARE project has developed a framework for analysing third-party privacy policies through Natural Language Processing (NLP) and Deep Learning techniques. Using this tool, consumers can input any given privacy policy into the framework. The models then analyse and annotate the policy, returning an interactive visualisation to the consumer. This visualisation displays the most relevant aspects of the policy’s natural text and provides tabs for consumer categories of interest, for example first-party data use or the collection of a particular type of personal data. Once a tab is selected, the consumer is presented with the segments of the policy that specifically pertain to that category of interest.
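To illustrate what this kind of segment-to-category annotation might look like, the minimal Python sketch below splits a policy into segments and assigns each segment to consumer categories of interest using an off-the-shelf zero-shot classifier. The category labels, the model choice (facebook/bart-large-mnli via the Hugging Face transformers library), and the confidence threshold are all illustrative assumptions; they are not the TRUST aWARE project’s actual models, taxonomy, or pipeline.

```python
from transformers import pipeline

# Consumer categories of interest (illustrative labels only; not the
# project's actual annotation taxonomy).
CATEGORIES = [
    "first-party data collection and use",
    "sharing of data with third parties",
    "data retention",
    "user choice and control",
    "data security",
]

# Off-the-shelf zero-shot classifier, used here purely for illustration.
classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")


def annotate_policy(policy_text: str, threshold: float = 0.5) -> dict:
    """Map each consumer category to the policy segments that discuss it."""
    segments = [s.strip() for s in policy_text.split("\n\n") if s.strip()]
    annotations = {label: [] for label in CATEGORIES}
    for segment in segments:
        result = classifier(segment, candidate_labels=CATEGORIES,
                            multi_label=True)
        for label, score in zip(result["labels"], result["scores"]):
            if score >= threshold:
                annotations[label].append(segment)
    return annotations


if __name__ == "__main__":
    sample_policy = (
        "We collect your email address and usage data to provide the service.\n\n"
        "We may share aggregated usage statistics with advertising partners."
    )
    for category, segments in annotate_policy(sample_policy).items():
        print(f"{category}: {len(segments)} segment(s)")
```

The project’s actual tool returns an interactive visualisation with a tab per category rather than a simple dictionary, but the mapping from policy segments to categories of interest sketched here is the underlying structure such a tab-based view relies on.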

This framework enables consumers to quickly access the key elements of an organisation’s privacy practices, improving consumer comprehension of privacy policies and maximising the policies’ communicative capacity. Further, by highlighting the most pertinent aspects of each policy, this tool enables consumers to better distinguish between trustworthy and potentially untrustworthy policies, promoting digital privacy and security.


References

[1] European Union General Data Protection Regulation (GDPR) 2016/679, Article 5 (principles relating to the processing of personal data), Article 6 (lawfulness of processing), Article 7 (conditions for consent), Article 13 (information to be provided where personal data are collected from the data subject) and Article 14 (information to be provided where personal data have not been obtained from the data subject).

[2] Nili Steinfeld, ‘“I agree to the terms and conditions”: (How) do users read privacy policies online? An eye-tracking experiment’ (2016) 55 Computers in Human Behavior 992, doi: https://doi.org/10.1016/j.chb.2015.09.038, p.994.

[3] Brooke Auxier, Lee Rainie, Monica Anderson, Andrew Perrin, Madhu Kumar, and Erica Turner, ‘Americans and privacy: concerned, confused and feeling lack of control over their personal information’ (Pew Research Center, 2019), available at: https://www.pewresearch.org/internet/2019/11/15/americans-and-privacy-concerned-confused-and-feeling-lack-of-control-over-their-personal-information/ (accessed January 2022).

[4] Grace Fox, Theo Lynn, Pierangelo Rosati, ‘Enhancing consumer perceptions of privacy and trust: a GDPR label perspective’ (2022) 35(8) Information Technology & People 181, p.181.

[5] Tatiana Ermakova, Benjamin Fabian, Annika Baumann, Hanna Krasnova, ‘Privacy Policies and Users’ Trust: Does Readability Matter?’ (Twentieth Americas Conference on Information Systems, August 2014).

[6] David Gefen, Izak Benbasat, and Paul Pavlou, ‘A research agenda for trust in online environments’ (2008) 24(4) Journal of Management Information Systems 275, p.276.
