Making GDPR Usable: A Model to Support Usability Evaluations of Privacy

A long version of this paper is available as [theTR]. We would like to thank the anonymous reviewers for helping improve the paper.

Johanna Johansen (ORCID 0000-0003-4908-9045)
Department of Informatics, University of Oslo
johanna@johansenresearch.info

Simone Fischer-Hübner
Department of Mathematics and Computer Science, Karlstad University
simone.fischer-huebner@kau.se

The first author was partially supported by the project IoTSec – Security in IoT for Smart Grids, nr. 248113. Thanks go also to Josef Noll for introducing me to the topic of privacy evaluations and labeling.

Abstract

We introduce a new model for evaluating privacy that builds on the criteria proposed by the EuroPriSe certification scheme by adding usability criteria. Our model is visually represented through a cube, called Usable Privacy Cube (or UP Cube), where each of its three axes of variability captures, respectively: rights of the data subjects, privacy principles, and usable privacy criteria. We slightly reorganize the criteria of EuroPriSe to fit with the UP Cube model, i.e., we show how EuroPriSe can be viewed as a combination of only rights and principles, forming the two axes at the basis of our UP Cube. In this way we also want to bring out two perspectives on privacy: that of the data subjects and, respectively, that of the controllers/processors. We define usable privacy criteria based on usability goals that we have extracted from the whole text of the General Data Protection Regulation. The criteria are designed to produce measurements of the level of usability with which the goals are reached. Precisely, we measure effectiveness, efficiency, and satisfaction, considering both the objective and the perceived usability outcomes, producing measures of accuracy and completeness, of resource utilization (e.g., time, effort, financial), and measures resulting from satisfaction scales. In the long run, the UP Cube is meant to be the model behind a new certification methodology capable of evaluating the usability of privacy, to the benefit of common users. For industries, considering also the usability of privacy would allow for greater business differentiation, beyond GDPR compliance.

Keywords:
usable privacy · Human-Computer Interaction · usability goals · usable privacy criteria · privacy certification · GDPR

1 Introduction


1.1 Motivations


Privacy is a human right and an essential prerequisite for protecting fundamental human values, such as dignity and autonomy. However, the goal of protecting the privacy of individuals and of the public has proved challenging to achieve. Currently, people have difficulty understanding how their privacy is affected by current practices in the technological world and are often unaware of its value and of the implications of its loss. With our work we intend to build on recent developments in data protection law and to support business actors in adopting privacy-protecting measures. We offer a way to quantify the level of data protection, and of its usability, in technological products and services. Displaying the achieved level of privacy protection can be used by businesses to compete and differentiate themselves on the market. In addition, the measurements we produce can be translated into visual labels that inform users, at their respective levels of understanding and interest, about how well a given product respects their privacy. In this section we give an overview of the current state of privacy protection in technological products and in society at large, to argue how our work could benefit society and why the present context is propitious for such work.

With the boom of electronic commerce and of pervasive IoT (Internet of Things) and web services, people are producing enormous amounts of electronic data that are collected by various actors under very imbalanced privacy agreements (i.e., signing Terms of Service on a “take-it-or-leave-it” basis). The reasons are multiple: for one, it is difficult (if not impossible) for a normal person to know exactly what data her online behavior (or IoT device) produces. Then, much of this data is of a private nature, but it is difficult to understand which data, and in what situation. Lastly, there are numerous and powerful algorithms (for search, machine learning, etc.) that can make new, sometimes unthinkable, inferences out of seemingly non-private pieces of data; the danger is even greater when multiple data sources are combined.


Digital privacy has eluded people to the point that many have given up hope of having both privacy and access to digital services; this is recognized even in the General Data Protection Regulation (GDPR), Recital 9, as a “…widespread public perception that there are significant risks to the protection of natural persons, in particular with regard to online privacy”. One of the reasons is that privacy is a complex concept. It has different personal and contextual connotations and larger social, ethical, legal and political implications that are difficult to grasp by laypeople [6]. In the presence of technology, digital privacy becomes even more complicated to understand. For example, there is a constant tension between how much privacy a person can have (i.e., how much of her activity she can keep only to herself) and how much authorities are allowed to see in order to ensure safety in society (e.g., how many public surveillance cameras should there be, and where should they be placed?). Another uncertainty surrounding privacy comes from the difficulty modern people have in separating their public and private lives (e.g., using social media both with coworkers and friends; should data from browsing the Internet be shared with Internet companies or search engines, and when, or for what functionality in return?).

When designing for usability, we also need to consider that the main goal of a person using a piece of technology or a technological system is to fulfill the task or need that the product was intended for, e.g., to buy train tickets from a ticket machine. The primary goal of the train ticket buyer is not to check how well the machine protects her privacy, but to reach a certain destination. The buyer might even be in a hurry, so that she catches the train that leaves in 10 minutes. There will be no time left to read the privacy terms. And even if the buyer has time to read them and does not agree with the terms, she still has to take the train to reach her destination. There might be a bus alternative, but the privacy terms of the bus company might be even more invasive [21].

The complexity of the privacy concept as such, and of digital data and technology, makes it difficult for one to evaluate the privacy properties of a specific piece of technology (e.g., a web service, an Internet of Things (IoT) product, or a communication device). The difficulty is not only for average people, but also for regulators checking compliance, and for developers trying to provide privacy-aware digital services/products/systems. Indeed, there are multiple concepts involved in digital privacy, like data sharing (which for normal business practices nowadays can form a highly intricate network of relationships), ownership and control of data, accountability, or transparency (both towards the regulators and the users). Many privacy concepts are a challenge even by themselves when it comes to their evaluation, since they are difficult to measure or to present/explain.

To untangle the intricacies of privacy around the use of personal data in technologies, collaboration between specialists from different fields – such as computer science, interaction design, human ergonomics, law, and cognitive science – is needed. Once a common understanding has been reached, it should be translated into a form that can be grasped by regular people with limited time and limited expertise in fields such as technology or law. The work of [24] is an example of such an initiative meant to bridge the computer science and legal approaches to privacy. (See more such examples in Section 2.)

People who have difficulty evaluating the privacy implications of their choices and behavior can be taken advantage of by companies that have business interests in harvesting their data. Moreover, powerful and influential business actors, using media channels, seek to further misinform people about the value of privacy and encourage them to give up their rights related to privacy and personal data protection.

One type of misconception about privacy that businesses try to spread is that “only the wrongdoers have something to hide and that the people showing different faces in different situations lack integrity” [29, Chapter 10. Privacy]. Some renowned comments that have shaped public opinion in this negative way are:

  • “You already have zero-privacy anyway, get over it.” (former Sun Microsystems CEO Scott McNealy, 1999) (as quoted in [23])

  • “If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.” (Google’s CEO Eric Schmidt, 2009)

  • “Privacy is no longer a social norm.” (as quoted in [29])

  • “You have one identity. The days of you having a different image for your work friends or co-workers and for the other people you know are probably coming to an end pretty quickly. Having two identities for yourself is an example of a lack of integrity.” (Facebook’s Mark Zuckerberg, 2010) (as quoted in [29])


Companies might be interested in subverting people’s privacy so that they can sell them more of their services and products. The information people produce is used mostly for advertising [29, Chapter 4. The Business of Surveillance]. Since most of our modern life is conducted online, we do not have much choice: we cannot simply stop using mobile phones, reading or shopping online, or using e-mail. The solution is to motivate businesses to compete on being ethical and on protecting their users’ privacy. Some ways for businesses to achieve this could be: displaying privacy policies shortly and clearly in product descriptions, adopting labels of the energy-consumption type, adopting privacy enhancing technologies (PETs), and/or becoming certified by privacy certification schemes [25].

One thing consumer communities can do is to include privacy features as a main evaluation criterion in technology reviews (e.g., reviews of smartphones), at the same level as, e.g., battery lifetime. However, people need to be educated to ask for “privacy-as-a-feature”. The task of educating the public cannot be left in the hands of media, which might be controlled by businesses, nor can users be left to learn/study privacy by themselves – “if people need to go to great length to protect their privacy, they won’t” [23] – because privacy is too complex a matter for a layperson to handle alone. People need support and guidance in this respect. This can be done by making privacy-related information usable for the general public and by motivating businesses to adopt such an approach to data privacy.


1.2 Making privacy usable

For explaining the intricacies of privacy, besides research articles and books [16], there are several legislative texts adopted in different jurisdictions. GDPR in Europe makes a good effort at clarifying many aspects of data privacy, providing the legislative support to enforce better data protection practices on anyone (within its jurisdiction) collecting and processing personal data. However, these regulations only specify the requirements on the data controllers (i.e., the organizations providing the service) in the form of basic principles, and the rights of the data subjects, but do not make any strict claims about the extent to which a controller (or processor) should go about implementing these requirements so that they are beneficial for the user, and to what degree.

As such, one motivation for usability evaluations of privacy is the fact that the usability goals of GDPR, s.a. “… any information … and communication … relating to processing [to be provided] to the data subject in a concise, transparent, intelligible and easily accessible form, using clear and plain language, …” (Article 12 (1) of GDPR), are left open to the subjective interpretation of both evaluators and controllers. The provisions of GDPR regarding usability are too general and high-level to be suitable for a certification process [20]. To remedy this, we propose a set of criteria designed to produce measurable evaluations of the usability with which the privacy goals of data protection are reached.

For evaluating privacy we take as a starting point the methodology developed by EuroPriSe [4], whose purpose is to evaluate compliance with GDPR. We are guided by the EuroPriSe criteria when eliciting what we call principles and rights, which form the two variability axes at the basis of our model, i.e., which principles are followed and which rights are respected. However, EuroPriSe does not consider usability, which is the main focus of our work here. As such, one contribution of this paper is to show how to add usability aspects to the existing evaluation criteria of EuroPriSe.

Certification schemes (s.a. EuroPriSe) provide a seal showing compliance with data protection regulations and industry standards. In addition to such a certification, our evaluation measures on a scale how well data protection obligations are respected and how easy it is for a user to understand that. These measurements can be presented to the user in different ways, e.g., using “traffic light” scales, showing which level of usability has been reached by the privacy of a certain technological product. A “traffic light” presentation of privacy is recommended by [3, Chapter 6(235)] as a way to “foster competition” and “show good practice on privacy policies”.
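As a purely illustrative sketch (not part of EuroPriSe or of the UP Cube specification), the mapping from an aggregated usability-of-privacy score to a traffic-light label could look as follows; the score range, thresholds, and label semantics are hypothetical and would have to be calibrated by the certification body.

```python
def traffic_light(score: float) -> str:
    """Map an aggregated usable-privacy score in [0, 1] to a traffic-light label.

    The 0.5 and 0.8 thresholds are illustrative placeholders, not values
    prescribed by EuroPriSe, GDPR, or the UP Cube.
    """
    if not 0.0 <= score <= 1.0:
        raise ValueError("score must be in [0, 1]")
    if score < 0.5:
        return "red"      # obligations may be met, but users can hardly benefit from them
    if score < 0.8:
        return "yellow"   # moderately usable privacy
    return "green"        # highly usable privacy


print(traffic_light(0.85))  # -> green
```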


The marketing motivation for providers to adopt such a methodology is that users tend to choose the product that best answers their specific and real needs for privacy. Only the lack of alternatives in the market today explains why data subjects still accept detrimental privacy conditions that rather fit the interests and attitudes of other types of stakeholders (controllers and processors) [29]. The metaphor of the “dancing bear” of Alan Cooper [11] illustrates this situation well.

Creating alternative products or features is a way for businesses to differentiate themselves. A new way of differentiation is now the level of privacy protection offered, beyond the minimum required by GDPR, as well as how usable this protection is. Certification schemes such as EuroPriSe will give a product the seal of GDPR compliance [10], but since GDPR compliance is mandatory, all businesses will seek to conform. Beyond this, our methodology would facilitate differentiation between services by considering usability aspects when implementing measures for protecting privacy. Usability is known as a market differentiator; e.g., the ISO 9241-11:2018 standard [5] asserts that designing for usability helps with the marketing of a product and with offering the user better customized choices.

Traditionally, usability is a quality related to the use of a product. In our case, we are not interested in the usability of a product per se, but only in those aspects of a product that concern privacy. Our conceptualization of usable privacy is based on the definition of usability as presented in ISO 9241-11:2018 [5], which we adapt to include privacy as follows:

Usable privacy refers to the extent to which a product or a service protects the privacy of the users in an efficient, effective and satisfactory way by taking into consideration the particular characteristics of the users, goals, tasks, resources, and the technical, physical, social, cultural, and organizational environments in which the product/service is used.

Our long term goal is to create a methodology to support service providers in making the privacy of their products more usable. The Usable Privacy Cube (UP Cube) described in Section 3 and the usable privacy criteria introduced in Section 7 are the first building blocks of the methodology we are aiming for. They are meant as tools, for both usability engineering experts and certification bodies, to evaluate whether a product was designed to respect and protect the privacy of its users in a usable way. Once privacy measures and privacy enhancing technologies are integrated into the design of a product, it still remains to find out whether (and to what extent) those measures empower and respect the rights of their particular users as intended. In Human-Computer Interaction (HCI) this is determined based on user testing and usability evaluations. The criteria we propose presume the use of such established HCI methods for usability evaluations (e.g., [14]).


Furthermore, the usable privacy criteria that we propose for evaluating how efficient or effective a product is in protecting privacy require the technology providers to take into consideration the context of use, including the characteristics and needs of different types of users. We adopt the definition of context of use proposed by ISO 9241-11:2018:

“[context of use] comprises a combination of users, goals, tasks, resources, and the technical, physical and social, cultural and organizational environments in which a system, product or service is used.”

The legislation does not directly refer to usability goals and context of use as known in ergonomics/human factors or human-centered design. However, requirements such as the one in Recital (39) of GDPR, asking for the information addressed to the data subject to be “easily accessible and easy to understand”, are categorized in this paper as usability goals, for which we create usable privacy criteria meant to measure effectiveness, efficiency and satisfaction – as usability outcomes – with regard to privacy aspects (we henceforth call these Usable Privacy criteria, abbreviated as UP criteria).
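To make concrete how a single UP criterion could yield both objective and perceived measurements, the following sketch (our own illustration; the field names and values are hypothetical, not a prescribed reporting format) records the three usability outcomes for one criterion in one context of use.

```python
from dataclasses import dataclass

@dataclass
class UPCriterionMeasurement:
    """One usability-of-privacy measurement in a given context of use (illustrative only)."""
    criterion: str             # e.g., "privacy information is easy to understand" (Recital 39)
    user_group: str            # the user category targeted by the context of use
    effectiveness: float       # objective outcome: share of needed information found and understood, in [0, 1]
    efficiency_seconds: float  # objective outcome: time (a resource) spent reaching the goal
    satisfaction: float        # perceived outcome: satisfaction-scale result normalized to [0, 1]

m = UPCriterionMeasurement(
    criterion="privacy information is easy to understand",
    user_group="first-time users of a ticket machine",
    effectiveness=0.7,
    efficiency_seconds=95.0,
    satisfaction=0.6,
)
print(m)
```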

After a short digression into related work in Section 2, we introduce in Section 3 the UP Cube model, which is the main contribution of this work. We then continue to detail the UP Cube in the rest of the paper. Section 5 presents EuroPriSe in the new light of the UP Cube, forming the two axes of criteria at its basis. The third, vertical axis of the UP Cube, a genuine contribution of this paper, is formed of the UP criteria detailed in Section 7. To the best of our knowledge, there is no other work that extends privacy certification schemes with usability criteria. Section 6 presents the usable privacy goals that the criteria are meant to measure. The UP Cube naturally captures interactions between all the axes, which we discuss in Section 8. We conclude in Section 11, presenting also some avenues for further work.

2 Putting the work into context

Usable privacy and security. The present work can be placed in the research field called usable privacy and security, with seminal works s.a. [7, 17, 33, 12] and conference series s.a. the Symposium On Usable Privacy and Security (SOUPS). We consider that research on privacy requires, even more than security, an interdisciplinary approach (encompassing the expertise coming from research fields such as Psychology, Law or Human-Computer Interaction). As [6] points out, privacy has its meaning rooted in larger cultural and social practices and has political, ethical as well as personal connotations.


There have been considerable efforts towards including specialists from different areas of research on issues related to privacy. Examples of such efforts are the constitution of The Privacy & Us Innovative Training Network (ITN) and the organization of the IFIP Summer School on Privacy and Identity Management.

Other examples of cross-disciplinary research efforts come from the automation of privacy agreements (or Terms of Services – ToS), where the goal is to present ToS in an accessible way to the general user. Notable contributions in this regard are the endeavors of the LeDA network, The Usable Privacy Policy Project, and the CLAUDETTE project.

Regarding the relation between security and privacy, in this paper we consider security as one integral aspect of privacy, where privacy implies security but not the other way around. We consider such a clarification necessary, as we have seen a tendency in the general public to conflate the meanings of the two terms, in favor of security. In computer science, privacy research has been closely intertwined with security research, reflected e.g. in the contents and the structure of the book [13]. However, in this paper we favor the term “usable privacy”, as it includes security by default, which is in accordance with the data protection legislation, where security (integrity and confidentiality) is specified as one of the several principles to abide by in order to assure the privacy of users’ data.

Human-Computer Interaction. Having the goal to evaluate the usability of privacy in technological systems and products makes our work part of the larger HCI research on privacy [6, 22, 21, 27]. Following the classifications made by Iachello and Hong in their review [19], we approach privacy from a “data protection” perspective by extracting usability-related goals from the GDPR. A similar approach is taken in [26, 27], which translate legislative clauses of the Directive 95/46/EC (now replaced by GDPR) into interaction implications and interface specifications. Similarly, [23] develops principles for guiding system design based on fair information practices found in the US Privacy Act of 1974 and the EU Directive 95/46/EC. The model we propose integrates well with a user-centered design where HCI methods are applied to elicit requirements based on understanding the users, their needs and the context of use.

For evaluating how well a product meets privacy requirements, context of use variables s.a. user capabilities, tasks, and the field where the technology is going to be deployed (e.g., healthcare, industrial facilities) should be defined; we detail the context of use and how to elicit it in Section 4.


We also make a distinction between user experience goals and usability goals, focusing in this paper on the latter [28]. User experience goals are concerned with how users, as individuals, perceive a product. As the nature of our work is to find criteria that can be generalized to groups or types of people, that are measurable, and that can be part of an evaluation, usability goals are more appropriate for such a function. This difference is clearly stressed by the ISO 9241-11:2018 standard, which states that usability typically deals with goals shared by a user group, while user experience puts more emphasis on individual goals.

3 The Usable Privacy Cube model

Figure 1: A generic version of the cube with the three axes of variability: data protection principles, the rights of the data subjects, and usable privacy criteria.

We devise a model for organizing the criteria to use in privacy evaluations and measurements, and represent it as a cube with three axes of variability (see Fig. 1), which we call the Usable Privacy Cube (UP Cube). The two axes found at the base of the UP Cube are composed of the existing EuroPriSe criteria, which we slightly reorganize in Section 5 to fit into one of two categories: data protection principles or rights of the data subjects.

We want to emphasize two perspectives on privacy that the UP Cube represents (hence our restructuring of the EuroPriSe criteria): the perspective of the controllers and of the data subjects. The controllers are thus given an overview of the principles that they are obliged to follow, whereas the data subjects are offered an overview of their rights.

The UP Cube allows us to visualize interactions between the axes, made easier by our separation of the criteria into the three categories. Each such intersection has its specifics and could be studied in itself; we identify a few exemplary points of intersection between the axes in Section 8.

Example 1

The intersection between the transparency principle and the right to be informed is identified in Article 12 of GDPR. The controllers are obliged to provide the data subject with information that is concise, transparent, intelligible, and in an easily accessible form, using clear and plain language.
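One way to picture the intersections computationally (purely illustrative; the paper defines the UP Cube conceptually, not as a data structure) is to keep the two basis axes as ordered lists and attach the relevant GDPR material to pairs of axis values, as for the Article 12 intersection above. The axis entries shown are a small sample of GDPR principles and rights, not the full catalogs.

```python
# Hypothetical, minimal encoding of the two basis axes and one intersection.
principles = ["lawfulness", "transparency", "data minimization"]   # ordered per context of use
rights = ["to be informed", "of access", "to erasure"]             # ordered per context of use

# Intersection from Example 1: transparency principle x right to be informed.
intersections = {
    ("transparency", "to be informed"):
        "Article 12 GDPR: concise, transparent, intelligible, easily accessible information, "
        "in clear and plain language",
}

print(intersections[("transparency", "to be informed")])
```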

The third, vertical axis of the cube is composed of our UP criteria, presented in Section 7. The UP criteria are determined based on usable privacy goals and are evaluated considering the context of use, by following the guidelines in the ISO 9241-11:2018 standard. Each of these criteria has several subcriteria intended for measuring the usability level by using different methods and respective tools from HCI, depending on what the criterion asks to be measured [28]. Interactions exist also with this third axis.

Example 2

For the case presented in Example 1, in order to establish how easily accessible or clear the information is, we must measure the levels of efficiency, effectiveness and satisfaction in a specific context of use. Efficiency implies measuring the time and effort spent by a specific user to find the information needed and to understand it. Effectiveness measures the completeness with which a goal was achieved; in this case we would like to know how much of the needed information the specific user was able to access and understand. At the same time, what a certain type of user perceives as intelligible information might be perceived by another as difficult to comprehend. Establishing the perceived characteristics of information is an activity categorized under the satisfaction usability outcome.
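A small worked sketch of how the objective and perceived outcomes in this example could be quantified from one usability test session; the task structure, the numbers, and the particular operationalization of efficiency (goal completion per unit of time) are our own illustrative assumptions, not values prescribed by the UP criteria.

```python
# Hypothetical session: the participant must find and correctly answer 5 questions
# about how her data is processed; time is logged; satisfaction comes from a post-task scale.
needed_items = 5          # pieces of information the user needs to access and understand
items_understood = 4      # comprehension questions answered correctly
time_spent_s = 180        # seconds spent locating and reading the information
satisfaction_score = 72   # post-task satisfaction on a 0-100 scale

effectiveness = items_understood / needed_items       # completeness of the goal: 0.8
efficiency = effectiveness / (time_spent_s / 60.0)    # completeness achieved per minute of effort
satisfaction = satisfaction_score / 100.0             # perceived outcome, normalized

print(f"effectiveness={effectiveness:.2f}, efficiency={efficiency:.2f}/min, satisfaction={satisfaction:.2f}")
```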

The UP Cube also brings the idea of orderings on each axis, hence the arrows. Such orderings are important for several reasons; e.g., UP criteria can be ordered based on “how little effort is required to evaluate it compared to how much overall evaluation outcome it entails” or on “covering most technologies”. It is usual for certification methods to use a decision-tree order to capture the impact of each criterion (e.g., choosing the most discriminating first), and thus to decide which criteria to prioritize in the evaluation.

Judging from practice, one is inclined to think that an ordering is not always possible to find, as some principles are equally important; therefore the orders are not necessarily strict. Moreover, one can see one principle as more important than another only in some industry or context, whereas in a different industry the same two principles would be ordered the other way around; therefore one may think that the orders are only partial (i.e., not total). However, in a specific cube (i.e., used in a specific methodology by a specific authority for privacy usability evaluations in a specific industry and context) there must always be an ordering in which the criteria are applied. One can always generate a strict and total order from a partial order by taking an arbitrary decision on the ordering of two criteria when no reasonable order exists. For example, one can always pick as the default order the one arising from the textual placement of the criteria in the data protection legislation texts (maybe considering content from articles as more general than content from recitals), or in the EuroPriSe (or the regulator/company) catalogs. What is certain is that each use case or industry has its specific requirements from which a meaningful ordering would be created.
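The observation that a strict total order can always be derived from a partial one by breaking the remaining ties arbitrarily corresponds to computing a linear extension, i.e., a topological sort of the “evaluate-before” relation. A minimal sketch follows, with hypothetical criteria names and constraints (it uses Python 3.9+ graphlib; nothing here is part of the UP Cube specification).

```python
from graphlib import TopologicalSorter

# Hypothetical partial order: "A before B" means criterion A is evaluated before B.
before = {
    "lawfulness": {"transparency"},          # lawfulness evaluated before transparency
    "transparency": {"purpose limitation"},  # transparency evaluated before purpose limitation
    "data minimization": set(),              # incomparable to the others in this context
}

# graphlib expects, for each node, the set of its predecessors, so invert the relation.
predecessors = {node: set() for node in before}
for earlier, laters in before.items():
    for later in laters:
        predecessors.setdefault(later, set()).add(earlier)

# static_order() yields one strict total order consistent with the partial order;
# incomparable criteria end up in an arbitrary (but deterministic) position.
print(list(TopologicalSorter(predecessors).static_order()))
```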

Forming a specific UP Cube, i.e., deciding on the precise details of each criterion on the three axes and on the orderings, is to some degree dependent on the specific context of use for the respective product to be evaluated. Therefore, one can think of infinitely many cubes, one for each different context. The criteria will not differ between the cubes, but their scope, depth, and evaluation might, depending on the context.


The context of use as such is not mentioned in GDPR, but the context of processing is brought up often [2, e.g., the recitals 43, 47, 71, 74, 76]. The context of processing, as defined in the legislation, overlaps and has a similar purpose with the notion of context of use defined in the ISO standard. However, unlike in usability engineering/HCI, where the context is a general concern, the data protection law requires the context to be considered only in certain special situations, e.g., when evaluating the risks to the rights and freedoms of natural persons. We go beyond this and consider for each proposed criterion the context and the group of users that the evaluation aims at. Our methodology thus contains questions useful for eliciting knowledge about the users and the context, presented in Section 4.


4 Context of Use for Privacy

For evaluating how well a product meets requirements, variables such as the user capabilities, tasks, the field where the technology is going to be deployed (e.g., healthcare, industrial facilities), and the context of use should be defined. Preferably these definitions should be established in the requirements phase of a product’s lifecycle, but definitely these would be defined and considered when running the privacy evaluation based on the UP criteria that we propose here. We thus adopt the ergonomic approach [5] where usability is always considered in a specified context of use, since the usability to be applied to a certain technology can be significantly different for different combinations of users, goals, tasks and their respective contexts.

Example 3

ISO/IEC29100:2011 gives a good example of how the context of use is decisive for establishing if a certain type of information can be used to identify a natural person [1, 4.4.2 Other distinguishing characteristics, p.7]:

“The last name of a person is insufficient to identify a person at a global scale, but might be enough to identify that person at a company level.”

In EuroPriSe, only some aspects of the context of use (as understood by the ergonomics and HCI communities) are considered: the technical, organizational and legal framework within which the Target of Evaluation (ToE) is used [18]. However, the GDPR text requires considering the context of processing, which has several overlapping areas with the context of use. Through the data protection by design principle, the context of processing (i.e., nature, scope, context and purposes of processing), as well as the risks for the rights of natural persons, are required to be considered at the time of determining the means of processing and at the time of processing itself [Article 25. Data protection by design and by default of GDPR].

The context of processing is brought up in GDPR in several other cases; the ones below are representative.

Context example C.1 (for separate consent)


“Consent is presumed not to be freely given if it does not allow separate consent to be given to different personal data processing operations despite it being appropriate in the individual case, …”  [Recital (43) of GDPR]

The “individual case” relates to the specific needs that a certain data subject might have with respect to using a technology or service. When these are used only partially, the privacy agreement should allow consent to be given to the processing of data strictly related to the parts used, and not to the service as a whole. ∎

Context example C.2 (for legitimate interest of the controller)


“At any rate the existence of a legitimate interest would need careful assessment including whether a data subject can reasonably expect at the time and in the context of the collection of the personal data that processing for that purpose may take place.”  [Recital (47) of GDPR]

One example from healthcare would be an emergency situation where doctors need access to the health records of a person who is at the moment unable to consent due to serious injuries and needs immediate intervention to be rescued. ∎

Context example C.3 (for legitimate interest of data subject)


“The interests and fundamental rights of the data subject could in particular override the interest of the data controller where personal data are processed in circumstances where data subjects do not reasonably expect further processing.”  [Recital (47) of GDPR]

Context example C.4 (for further processing)


“In order to ascertain whether a purpose of further processing is compatible with the purpose for which the personal data are initially collected, the controller, …, should take into account, … the context in which the personal data have been collected, in particular the reasonable expectations of data subjects based on their relationship with the controller as to their further use;”  [Article 6(4)(b) and Recital (50) of GDPR]

The relationship mentioned here belongs to the category of social or organizational context. ∎

Context example C.5 (for risk evaluation of automated decision-making and profiling)


“In order to ensure fair and transparent processing in respect of the data subject, taking into account the specific circumstances and context in which the personal data are processed, …”  [Recital (71) of GDPR]

The specific circumstances and context are considered here for minimizing the risks of automated decision-making and profiling. ∎

Context example C.6 (for measures required to be implemented by the controllers in order to promote and safeguard data protection)


“Those measures should take into account the nature, scope, context and purposes of the processing and the risk to the rights and freedoms of natural persons.”  [Recital (74) of GDPR]

The recital refers to the accountability principle, according to which controllers and processors are required to implement measures to promote and safeguard data protection in their processing activities, and also to demonstrate compliance with the data protection provisions. ∎

Context example C.7 (for risk to the rights and freedoms of the data subject)


“The likelihood and severity of the risk to the rights and freedoms of the data subject should be determined by reference to the nature, scope, context and purposes of the processing.”  [Recital (76) of GDPR]

4.1 Context eliciting criteria

The suggestion of [?] to consider the average cases first and then move to the special cases, as the number of exceptions is endless, applies to the use of our questions too. The questions we propose to elicit knowledge about the context of use are the following.

CQ.1  What data is the product/system processing?

Information about the type, volatility, size/amount, persistence, accuracy, and value of the data needs to be captured.

Based on this criterion it can be established whether the data is personal or not. If the data gathered cannot be used to identify a natural person, a privacy evaluation will not be needed at all. If the data is of a sensitive nature, an explicit consent needs to be obtained from the data subject. If the data is of a volatile nature, then control mechanisms need to be established to periodically check the accuracy and quality of the collected and stored data.

The following questions from EuroPriSe can also be considered for eliciting knowledge about the data processed: (i) questions from section “1.1.2 Processed Personal Data” (pp. 16-17), which are meant to establish if any personal data is processed or if these data constitute special kinds of data or data related to criminal convictions and offenses; and (ii) questions from section “C. Target of Evaluation (ToE)” (p. 13), such as “What types of personal data are processed when the product or service is used? Which of these data are primary, which are secondary data?”. ∎
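The decision logic sketched in CQ.1 can be summarized in code; this is our own simplification with hypothetical flags and action names, not part of the EuroPriSe catalog or of the UP criteria.

```python
def cq1_follow_up(identifies_person: bool, sensitive: bool, volatile: bool) -> list[str]:
    """Illustrative follow-up actions derived from the nature of the processed data (CQ.1)."""
    if not identifies_person:
        return ["data is not personal: no privacy evaluation needed"]
    actions = ["data is personal: run the privacy evaluation"]
    if sensitive:
        actions.append("obtain explicit consent from the data subject")
    if volatile:
        actions.append("establish periodic checks of the accuracy and quality of stored data")
    return actions

print(cq1_follow_up(identifies_person=True, sensitive=True, volatile=False))
```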

CQ.2  Who are the privacy stakeholders of the product/solution?

We use the definition of privacy stakeholders from ISO/IEC 29100:2011 [1]:

“[Privacy stakeholders] natural or legal person, public authority, agency or any other body that can affect, be affected by, or perceive themselves to be affected by a decision or activity related to” data processing.

Considering the ISO/IEC 29100:2011 definition of stakeholders and translating it into GDPR terminology, we can reformulate the question above as follows:

Who are the data subjects, controllers, processors, recipients, third parties of the product/solution?

From EuroPriSe, the following questions are relevant to identify stakeholders: (i) from section “C. Target of Evaluation (ToE)” (pp. 13-14) “Which groups of data subjects are concerned when the product or service is used (e.g., consumers, employees of corporate customers, employees of the service provider)?”; and (ii) from section “1.1.3 Controller” (pp. 17-18) the questions meant to identify the controller of each processing operation. ∎

CQ.3  Which are the specific characteristics/attitudes/expectations/abilities/behaviors/capabilities/needs of the data subjects that need to be considered when designing for privacy?

Further classifications of users are recommended, e.g., based on their attitudes towards privacy and privacy preferences, such as the classification of [?] or the sub-scales of [?]. Assigning users to categories is an integral part of the requirements analysis phase of the usability engineering lifecycle. Personas are created to represent the key characteristics, skill sets, goals, top tasks, responsibilities, tools, or pain points of a category of users [?, Chapter 41: User Experience Requirements Analysis within the Usability Engineering Lifecycle]. Some common classifications are made based on the user’s abilities and skills, educational background or the frequency of use of the respective technology [28, Chapter 10: Establishing Requirements].

ISO/IEC29100:2011 [1, 4.5.4 Other factors, p.7] also acknowledges the importance of understanding the expectations and preferences of a natural person in respect to privacy. The capacity of a natural person to evaluate privacy risks depends on how well that person understands the technology s/he uses, her/his past experience or background, and other socio-psychological factors. ∎

CQ.4  Which are the characteristics of the data subject’s work environment and social context that could influence the privacy related requirements?

CQ.5  Which are the specific characteristics/needs/restrictions of the considered industry or organization in respect to privacy?

Examples from the medical field: For assuring a balance between protecting patient information and allowing access by the clinicians in emergency medical situations, the processing should be done only under the responsibility of a professional who is subject to the obligation of professional secrecy.

One should identify whether privacy standards already exist for a specific industry (e.g., the health care industry has its own privacy standards for disclosure of patient identifiable information or training of healthcare workforce [?, Chapter 30: Human-Computer Interaction in Health Care]).

The question in the section “C. Target of Evaluation (ToE)” of EuroPriSe (p. 14) – What is the area of application of the product or service (e.g., is the product to be used in the medical or in the advertising sector)? – is relevant as a generic question in this category. ∎

CQ.6  What are the functional requirements for the system/product that might have privacy implications?

The functional requirements refer to what a product should do. In the example of an Electronic Health Records (EHR) system, this is meant to gather electronically health-related information on an individual that can be created, managed, and consulted by authorized clinicians and staff across several care organizations [?, Chapter 30: Human-Computer Interaction in Health Care]. From this functional description it becomes clear that the system is meant to deal with sensitive personal data.

For the context of use, [28, Chapter 10: Establishing Requirements] identifies four aspects of the environment that must be considered: the physical, social, organizational, and technical environment. ∎

CQ.7  Which aspects of the physical context could influence the privacy of a product/solution?

If the environment is very crowded, an IoT device could unintentionally gather data on other people who happen to be in that area. ISO/IEC29100:2011 [1, 4.4.6 Unsolicited PII, p.9] says that collecting unsolicited data can be reduced by considering privacy safeguarding measures at the time of the design of the system. An example of a safeguarding measure, for the case where the employees of an organization work in an open office landscape to which people outside the organization might get access, is to lock the computer containing sensitive data before leaving for the day or when being out of the office for longer periods. ∎

CQ.8  Which aspects of the social context could influence the privacy of a product/solution?

This criterion is concerned with establishing a good overview of who is collaborating with whom and how their work is coordinated, how they communicate and on which issues; e.g., data might need to be pseudonymized before being shared, for further processing, with other members of the organization who are not authorized to have access to that data. ∎

CQ.9  Which aspects of the organizational context could influence the privacy of a product/solution?

Privacy controls such as legal contracts, management practices, procedures, guidelines, organizational structures need to be put in place to safeguard the data and be properly communicated to all parties involved in the data processing. ∎

CQ.10  Which aspects of the technical environment could influence the privacy of a product/solution?

The type of technology that is used to gather data, the existing technological limitations, and compatibility problems between applications and technologies need to be known. Langheinrich [23] and Hong et al. [?] give an overview of the privacy aspects concerning ubiquitous technologies (e.g., sensors embedded in our environments to gather temperature, light, and noise information can be subject to secondary use and be analyzed to deduce information on the living patterns of the respective individual).

The questions in the section “C. Target of Evaluation (ToE)” of EuroPriSe meant to define the Target of Evaluation (p. 13) – which components of the product, which interfaces, which hardware and software components – are relevant for this category. ∎

5 EuroPriSe

EuroPriSe originated from the Schleswig-Holstein Data Protection Seal, which was led by the Schleswig-Holstein Data Protection Authority (DPA) from ca. 2001 until the end of 2013, when it was transferred to the EuroPriSe GmbH company. The scheme has a history of eighteen years [18] and is one of the oldest privacy and data protection seals based on a law, i.e., the State Data Protection Act of the German federal state Schleswig-Holstein. The role of the seal is to help vendors of IT products and services to comply with the data protection requirements derived from the applicable law in Europe [8, 10, 25]. EuroPriSe, in collaboration with Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein (ULD), received support from the EU to establish a trans-European privacy seal. EuroPriSe is now intended to provide EU-wide privacy certifications that assure compliance with European data protection law. In addition, the EuroPriSe criteria have already been updated to consider the fairly new GDPR.

We have chosen EuroPriSe as the basis for our UP Cube because of its long history, its continuous improvement, its strong list of well-developed criteria, its being led in the past by a DPA, and its being based on the European data protection legislation. EuroPriSe also integrates with widely acknowledged IT security certification methods s.a. ISO 27000 and the Standard Data Protection Model.


The way the criteria are formulated, as questions, also fits with the form of our usable privacy evaluation criteria. In addition, the existing EuroPriSe evaluation, which is at the basis of our model, assures that the GDPR legal grounds are covered, including data protection principles and duties and data subject rights. The UP criteria evaluations come on top, fine-graining the EuroPriSe evaluation with usability measurements, showing how well the legislation is respected.

Another feature that is relevant for our user-centered approach is that the EuroPriSe criteria catalog has been updated to include the data protection by default paradigm, promoting built-in data protection and privacy-friendly default settings. Moreover, EuroPriSe takes into account the technical, organizational and legal framework within which the product or service is operated and asks for considering the requirements of all the parties involved in the system, aiming at strengthening the position of the data subjects. Our work shares with EuroPriSe its high-level goal of making transparent for the general public how companies are managing data protection in their products and services.


5.1 The EuroPriSe criteria in the UP Cube structure

EuroPriSe Criteria: We list the names of (sub)sections as they appear in the EuroPriSe document [4], which has two parts, the second being subdivided into four sets of criteria, whereas the first contains preliminary issues, of which only section C is relevant for us.

Principles

Rights

Context

C. Target of Evaluation (ToE)
1.1.1 Processing Operations; Purpose(s)
1.1.2 Processed Personal Data
1.1.3 Controller
1.1.4 Transnational Operations
1.2.1 Data Protection by Design and by Default
1.2.2 Transparency
2.1 Legal Basis for the Processing of Personal Data
2.2 General Requirements
2.3.1 Data Collection (Information Duties)
2.3.2 Internal Data Disclosure
2.3.3 Disclosure of Data to Third Parties
2.3.4 Erasure of Data after Cessation of Requirement
2.4.1 Processing of Data by Joint Controllers
2.4.2 Processing of Data by a Processor
2.4.3 Transfer to the Third Countries
2.4.4 Automated Individual Decisions
2.4.5 Processing of Personal Data Relating to Children
2.5 Compliance with General Data Protection Principles
Set 3: Technical-Organisational Measures
Set 4: Data Subjects’ Rights
Table 2: Overview of the EuroPriSe criteria categorized to fit into our UP Cube model, i.e., as the two axes with Principles and Rights, as well as Context of use.

In order to build on EuroPriSe, we first look into how its methodology fits with our UP Cube model. We show how the EuroPriSe criteria can be redistributed into one of the two axes at the basis, i.e., as either rights of the data subjects or as privacy principles, or otherwise as a context of use criterion. Table 2 gives an overview of this redistribution. The distinction between principles and rights is inspired by the structure of [16], where principles and rights represent the core of the handbook. One purpose of the principles, mentioned in [16], is to serve as the starting point when interpreting the more detailed provisions in the subsequent articles of data protection law. The law also requires that these principles correspond to the rights presented in Articles 12 to 22. This correspondence can be visualized through the intersection between the respective rights and principles axes of the UP Cube.

In the following subsections we detail Table 2.


Data protection principles.

EuroPriSe has a dedicated subset of criteria dealing with data protection principles, i.e., “2.5 Compliance with General Data Protection Principles”. In addition, several other criteria sets from EuroPriSe can be related to data protection principles, as detailed below.

  • The second part of the section “C. Target of Evaluation (ToE)”, called “Regulatory Analysis”, as well as the “Purpose(s)” part of the Subset 1.1.1, refer to the principle of purpose limitation. This principle requires that the purpose of data processing must be defined before processing is started.

  • The “1.2.1 Data Protection by Design and by Default” refers to the data minimization principle. This is pointed out in a note introducing the subset.

  • The criterion “How long are the data retained? Is this no longer than necessary for the purposes concerned?” of the “1.2.1.1 Data protection by Design” refers to the storage limitation principle.

  • “1.2.2 Transparency” relates to the transparency of processing principle.

  • “2.1 Legal Basis for the Processing of Personal Data” refers to the principle of lawfulness. Consent of the data subject, or another legitimate ground provided in the data protection legislation, is required as the legal basis for processing personal data. This subset also expounds on aspects that we categorize as belonging to the context of use (see the context-eliciting questions in Section 4.1).

  • The following sets of criteria are related to the accountability principle:

    • The subset “2.2.1 Record of Processing Activities” details how controllers can facilitate compliance with the accountability requirement through recording processing activities and making them available to the supervisory authority upon request (this is also explained in [16, 3.7. The accountability principle]).

    • The subset “2.2.2 Designation of a Data Protection Officer” details how controllers can facilitate compliance with the accountability requirement through designating a data protection officer who is involved in all issues relating to personal data protection (this is also explained in [16, 3.7. The accountability principle]).

    • The subset “2.2.4 Data Protection Impact Assessment” details how controllers can facilitate compliance with the accountability requirement through undertaking data protection impact assessments for types of processing likely to result in high risks to the rights and freedoms of natural persons (this is also explained in [16, 3.7. The accountability principle]).

    • The subset “2.2.5 Prior consultation” details how compliance is promoted through prior consultation of the relevant supervisory authority if the impact assessment indicates that processing presents risks that cannot be mitigated (this is also explained in [16, 4.3. Rules on accountability and promoting compliance]).

  • The following sets are related to the data security principle:

    • The subset “2.2.6 Notification of a personal Data Breach” details how the controller is required to notify the competent supervisory authority without undue delay, in cases where a personal data breach (with risks for rights and freedoms of individuals) takes place. The data subject being concerned needs to be informed as well (this is also explained in [16, 3.6. The data security principle]).

    • The set “Set 3: Technical-Organisational Measures: Accompanying Measures for Protection of the Data Subject” follows for the most part [2, Article 32: Security of processing].

The rights of the data subjects.

EuroPriSe has a dedicated subset of criteria dealing with data subjects’ rights: “Set 4: Data Subjects’ Rights”. In addition, several other criteria sets from EuroPriSe can be related to rights of the data subjects, as detailed below.

  • “2.3.1 Data Collection (Information Duties)” refers to the right to be informed, following [2, Articles 12, 13 and 14].

  • “2.3.4 Erasure of Data after Cessation of Requirement” refers to the right to erasure, following [2, Article 17].

  • “2.4.4 Automated Individual Decisions” refers to the rights related to automated individual decision-making, following [2, Article 22].

Mixed and context of use

  • “2.3.2 Internal Data Disclosure” and “2.3.3 Disclosure of Data to Third Parties” refer to a mixture of rights and principles. These subsets have pointers to the GDPR articles they are based on, which indicate which rights and principles the sets can be linked to. Article 5(b) refers to the principle of purpose limitation, (c) to the principle of data minimization, and (f) to the data security principle; Article 6 refers to the lawfulness of processing principle, while Articles 13 and 14 refer to the right to be informed.

  • Section “C. Target of Evaluation” and other subsections from 1.1 and from 2.4 are seen as relevant for defining the context of use.

  • Criteria relevant for defining stakeholder groups (see CQ.2 in Section 4.1) are found in “2.2.7 Processing under the Authority of the Controller or Processor”, “2.4.1 Processing of data by Joint Controllers”, and “2.4.2 Processing of Data by a Processor”, and regard obligations of the controllers, processors, joint controllers, and third parties. These groups are also mentioned in [16, 2.3. Users of personal data] as users of the data. The criteria set “2.2.3 Designation of the Representative in the EU” defines representatives within the EU; these are also mentioned in [16, Data protection terminology], in the context of defining the controllers group. When the controller is established outside the EU it needs to appoint a representative within the EU territory.

    However, some of the criteria included in these sets could also be related to principles or rights, e.g., the criterion “Does the processor adhere to an approved code of conduct or an approved certification mechanism?” (“2.4.2 Processing of data by a Processor”) belongs to the accountability principle, which states that the controllers must be able to demonstrate compliance with data protection provisions.

    • Section “C. Target of Evaluation (ToE)/ ToE Analysis” helps with defining the object of the evaluation, the types of data, the groups of data subjects, and the area of application. As shown in Section 4, such criteria are used to define the context of use (CQ.1, CQ.2, and CQ.5 in Section 4.1).

    • The questions in section “1.1.1 Processing operations” are meant to define the operations associated with the use of a product or service, which in our case relate to the context of use in Section 4.1.

    • “1.1.2.2 Special Categories of Personal Data” and “1.1.2.3 Personal Data Relating to Criminal Convictions and Offenses” relate to Section 4.1.

    • “1.1.3 Controller” relates to Section 4.1.

    • “1.1.4 Transnational Operations” and “2.4.3 Transfer to the Third Countries” point to the physical context (Section 4.1), which encompasses in this case a larger territorial view, extending beyond the physical space of a building (office building or private house), to countries or transnational areas where the product might be used or data might be processed.

    • “2.4.5 Processing of Personal Data Relating to Children” refers to Section 4.1.

6 Usable Privacy Goals

We identify usable privacy goals (henceforth called Usable Privacy goals, and abbreviated as UP goals) that appear in the GDPR text. These guide the work in Section 7, where we present the UP criteria meant to measure to what extent these goals are being achieved. \onlyShortPaperWe give here only some examples of goals, numbered as in the long version [theTR], where the full list of 30 UP goals can be found. The goals are listed in the order they appear in the legislation. The words emphasized in each goal relate to usability. The chosen words are those that can be interpreted differently based on the context they are used in, and can result in objective and perceived measurements when evaluated in usability tests. These words also capture goals that can be achieved up to certain degrees, and thus can be translated into a level on an evaluation scale. \longPaper The Recitals and Articles of GDPR from which the goals were extracted can be found in full in Annex A. In addition to the GDPR, there are more specific data protection laws, such as the proposed ePrivacy Regulation, that have implications for usability, from which one could eventually extract additional usability goals.

\longPaper

UPG.1 Ensuring a high level of protection of personal data.   [Recital (6) of GDPR]

UPG.2 Natural persons should have control of their own personal data.   [Recital (7) of GDPR]

\onlyShortPaper

UPG.3 Consent should be given by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject’s agreement to the processing of personal data relating to him or her.   [Recital (32) of GDPR]

\longPaper

UPG.4 If the data subject’s consent is to be given following a request by electronic means, the request must be clear, concise and not unnecessarily disruptive to the use of the service for which it is provided.   [Recital (32) of GDPR]

\longPaper

UPG.5 Any information and communication related to the processing of personal data should be easily accessible and easy to understand.   [Recital (39) of GDPR]

\longPaper

UPG.6 Any information and communication related to the processing of personal data should use clear and plain language.   [Recital (39) of GDPR]

UPG.7 Make the natural persons aware of risks, rules, safeguards and rights in relation to the processing of personal data.   [Recital (39) of GDPR]

\onlyShortPaper

UPG.8 Make the natural persons aware of how to exercise their rights in relation to processing of personal data.   [Recital (39) of GDPR]

\longPaper

UPG.9 The specific purposes for which personal data are processed should be explicit.   [Recital (39) of GDPR]

UPG.10 The personal data should be adequate, relevant and limited to what is necessary for the purposes for which they are processed.   [Recital (39) of GDPR]

UPG.11 Personal data should be processed only if the purpose of the processing could not reasonably be fulfilled by other means.   [Recital (39) of GDPR]

\longPaper

UPG.12 In the context of a written declaration on another matter, safeguards should ensure that the data subject is aware of the fact that and the extent to which consent is given.   [Recital (42) of GDPR]

\longPaper

UPG.13 A declaration of consent pre-formulated by the controller should be provided in an intelligible and easily accessible form, using clear and plain language and it should not contain unfair terms.   [Recital (42) of GDPR]

\longPaper

UPG.14 The data subject should have genuine and free choice in giving the consent.  [Recital (42) of GDPR]

\longPaper

UPG.15 The data subject should be able to refuse or withdraw consent without detriment.   [Recital (42) of GDPR]

\longPaper

UPG.16 Carefully assess the existence of a legitimate interest of a controller taking into consideration the reasonable expectations of data subjects based on their relationship with the controller.   [Recital (47) of GDPR]

UPG.17 Assess if the interests and fundamental rights of the data subject could override the interest of the controller where personal data are processed in circumstances where data subjects do not reasonably expect further processing.   [Recital (47) of GDPR]

\onlyShortPaper

UPG.18 Any information addressed to the public or to the data subject should be concise, easily accessible and easy to understand.   [Article 12 (1) and Recital (58) of GDPR]

UPG.19 Any information addressed to the public or to the data subject should use clear and plain language.   [Article 12 (1) and Recital (58) of GDPR]

\longPaper

UPG.20 Any information addressed to the public or to the data subject should use, when appropriate, visualization.   [Recital (58) of GDPR]

\onlyShortPaper

UPG.21 Provide information of the intended processing in an easily visible, intelligible and clearly legible manner.   [Article 12 (7) and Recital (60) of GDPR]

\longPaper

UPG.22 Provide a meaningful overview of the intended processing.   [Article 12 (7) and Recital (60) of GDPR]

\longPaper

UPG.23 A data subject should have the right of access to personal data which have been collected concerning him or her, and should exercise that right easily and at reasonable intervals, in order to be aware of, and verify, the lawfulness of the processing.   [Recital (63) of GDPR]

\onlyShortPaper

UPG.24 Allow the data subjects to quickly assess the level of data protection of relevant products and services.  [Recital (100) linking to Article 42 of GDPR]

\longPaper

UPG.25 The request for consent should be presented in a manner which is clearly distinguishable from the other matters, in an intelligible and easily accessible form, using clear and plain language.   [Article 7 (2) of GDPR]

\longPaper

UPG.26 It should be as easy to withdraw as to give consent.   [Article 7 (3) of GDPR]

\longPaper

UPG.27 Facilitate the exercise of the data subjects’ rights under Articles 15 to 22 – right of access, right to rectification, right to erasure, right to restriction of processing, right to data portability, right to object, and rights related to automated individual decision-making.   [Article 12 (2) of GDPR]

\longPaper

UPG.28 The data subject should obtain from the controller meaningful information about the logic involved, as well as the significance and the envisaged consequences of automated decision-making, including profiling, to which s/he is subject.   [Article 15 (1) (h) of GDPR]

\longPaper

UPG.29 The right to object should be explicitly brought to the attention of the data subject and should be presented clearly and separately from any other information, at the latest at the time of the first communication with the data subject.   [Article 21 (4) of GDPR]

\longPaper

UPG.30 The data subject should have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.   [Article 22 (1) of GDPR]

7 Usable Privacy Criteria

\longPaper

The criteria presented in this section can be used in an evaluation process for establishing the level of effectiveness, efficiency, and satisfaction with which the goals from Section 6 are reached, wrt. a specific context of use.

The proposed criteria are always measurable, which makes the results of a privacy evaluation easier to present visually through the use of a privacy labeling scheme. The use of privacy labels will then fulfill the goal UPG.24. This goal has a special significance from a usability point of view, as it considerably reduces the effort spent by the data subject on evaluating privacy, which for most users is not the primary task [6] and gets in the way of buying or using a product or service.

\longPaper

Evaluating privacy and compliance with GDPR is done by certification bodies, providing seals and marks with the purpose of enhancing consumer trust and promoting transparency and compliance with the data protection regulations. Prior to GDPR, the lack of legal constraints, disconnection from official regulatory oversight, and lack of effective enforcement had resulted in inaccurate, false, or outdated privacy certificates. This made the existing certification schemes lose their trustworthiness with users (e.g., see in [15] the criticisms of TRUSTe – now known as TrustArc). Though certification is still voluntary, GDPR endorses and facilitates a certification mechanism as a means to demonstrate compliance with data protection provisions. In addition, the existence of a certificate makes the process of choosing processors easier for the controllers too, especially since GDPR establishes responsibility and liability for any processing carried out on the controller’s behalf [20].

\longPaper

7.1 General considerations for UP Criteria

\longPaper

Some of the goals have a more general purpose, and their achieved levels can be decided based on measurements coming from more specific criteria. In the case of UPG.2, in order to establish if the data subjects have reached a high level of “control of their own personal data”, the scores from evaluations of, e.g., UPG.14 should also be high.

Another example of a general goal is UPG.11, which relates to the data minimization principle. This goal is of special importance for our model since, when there is no processing, we automatically give the highest score on the evaluation scale. The criterion in the section “1.1.2.1 Personal data” of EuroPriSe – “Are any personal data processed when the product or service is used?” – can be used to establish if personal data is being processed. This can be complemented by another EuroPriSe criterion, found in the section “1.2.1.1 Data Protection by design” (p. 18): “Is it possible to carry out the processing without the use of identifiable data all together?”. This criterion is meant to encourage the companies – if possible – to not process identifiable data at all.

The goal UPG.1 is another general goal, dealing with the protection of personal data in general. We use this goal to exemplify how a criterion should be formulated to consider usability:

What is the level of the usability of the personal data protection / privacy that the product or service ensures?

To be able to establish how usable the privacy protection is, the evaluation needs to produce measurable outcomes. The structure that we follow is the one proposed in ISO 9241-11:2018, where the measures consider both the objective and the perceived outcomes of usability (the UP criteria are labeled accordingly). The measurements will produce counts or frequencies (e.g., how many errors the user makes when probed to do certain privacy-related tasks) and continuous data (e.g., how much time the user spends on completing a privacy-related task). The evaluation based on the UP criteria proposed below will produce three main categories of measures (a minimal illustrative sketch of such measurement records is given after the list below):

  1. measures of accuracy and completeness,

  2. resource utilization (time, effort, financial, and material resources), and

  3. measures resulting from satisfaction scales.
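To make the three categories more concrete, the following is a minimal illustrative sketch (in Python) of how such measurement records could be represented. It is not part of the UP Cube specification; all names and numbers are hypothetical.

from dataclasses import dataclass, field
from typing import List

@dataclass
class EffectivenessMeasure:
    # Counts from privacy-related test tasks, turned into accuracy/completeness ratios.
    tasks_attempted: int
    tasks_completed_correctly: int   # basis for accuracy
    items_to_find: int
    items_found: int                 # basis for completeness

    @property
    def accuracy(self) -> float:
        return self.tasks_completed_correctly / self.tasks_attempted

    @property
    def completeness(self) -> float:
        return self.items_found / self.items_to_find

@dataclass
class EfficiencyMeasure:
    # Resource utilization: the time, effort, financial, and material resources discussed below.
    time_seconds: float
    effort_steps: int        # e.g., number of clicks or screens visited
    financial_cost: float = 0.0
    material_cost: float = 0.0

@dataclass
class SatisfactionMeasure:
    # Responses on a satisfaction (e.g., 1-5 Likert) scale.
    scale_responses: List[int] = field(default_factory=list)

    @property
    def mean_score(self) -> float:
        return sum(self.scale_responses) / len(self.scale_responses)

# One hypothetical observation of a user performing privacy-related tasks:
obs = EffectivenessMeasure(tasks_attempted=5, tasks_completed_correctly=4,
                           items_to_find=10, items_found=7)
print(obs.accuracy, obs.completeness)  # 0.8 0.7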

In our structuring of the UP criteria we first give a high-level criterion, numbered e.g. UPC.1, which is closely related – as can also be seen from the wording – to one of the goals that we identified in Section 6. The score for a main UP criterion is established based on evaluations of more specific UP criteria, called subcriteria. The resources used to achieve a criterion, i.e., time, effort, financial, and material (which we abbreviate TEFM), should be measured in order to determine the efficiency with which a specific criterion was reached. In addition, the results from the evaluations should show the level of perception that the data subjects have about their data being protected. The data subjects need to be highly satisfied with the offered privacy protection. The “high satisfaction” level is defined based on the user satisfaction evaluation of the respective subcriteria.
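One possible way to combine subcriteria evaluations into a score for the main criterion is sketched below. This is only an illustration under our own assumptions (subcriterion scores already normalized to [0,1], equal weights per category, and a hypothetical 0.8 threshold for the “high satisfaction” level), not a prescribed aggregation.

from statistics import mean

def criterion_score(subcriteria):
    # Each entry: {"score": float in [0,1],
    #              "kind": "effectiveness" | "efficiency" | "satisfaction",
    #              "measure": "objective" | "perceived"}
    by_kind = {"effectiveness": [], "efficiency": [], "satisfaction": []}
    for sub in subcriteria.values():
        by_kind[sub["kind"]].append(sub["score"])

    per_kind = {kind: mean(scores) for kind, scores in by_kind.items() if scores}
    perceived = [s["score"] for s in subcriteria.values() if s["measure"] == "perceived"]
    return {
        "overall": mean(per_kind.values()),
        "per_kind": per_kind,
        # hypothetical threshold for the "high satisfaction" level discussed above
        "high_satisfaction": bool(perceived) and mean(perceived) >= 0.8,
    }

# Hypothetical normalized scores for three of the subcriteria of UPC.1:
print(criterion_score({
    "UPC.1.1": {"score": 0.75, "kind": "effectiveness", "measure": "objective"},
    "UPC.1.3": {"score": 0.60, "kind": "efficiency",    "measure": "objective"},
    "UPC.1.6": {"score": 0.85, "kind": "satisfaction",  "measure": "perceived"},
}))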

\longPaper

Notation and organization principles.

We use here as example the first UP criterion, UPC.1, to explain the organization of the UP criteria and the various notations that we use in the rest of this section.

We categorize the UP criteria based on their area of application from the GDPR text. Figure 2 gives an overview of the number of criteria in each category. \longPaper

  1. Consent (lawful grounds for processing data principle).

  2. Information and communication addressed to the public or to the data subject (transparency principle).

  3. Rights of the data subjects (rights in general).

  4. Purpose of processing.

  5. Legitimate interest of either the processor or the data subject (lawful grounds for processing data principle).

Figure 2: An overview of the distribution of usable privacy criteria in each category.
\onlyShortPaper
\onlyShortPaper

A high-level UP criterion, like UPC.1, is labeled with the goal that it is related to, e.g.:

[Based on goal UPG.2].

Each UP criterion is also categorized using a label, e.g.:

[Type of criteria: generic].

\longPaper

For subsequent UP criteria we use a short version of these labels, which should be self-explanatory, e.g., we label UPC.2 with just the goal that it aims at: [UPG.18].

For each high-level UP criterion we identify several specific UP subcriteria, numbered accordingly, e.g., UPC.1.1. \longPaper

Considerations regarding the classification of UP subcriteria. We then classify each UP subcriterion (e.g., from UPC.1.1 to UPC.1.6) into either effectiveness, efficiency, or satisfaction, and label it accordingly, e.g.:

a short version of the label [Effectiveness] would be [Es];

similarly, the short version of [Efficiency] is [Ey] and of [Satisfaction] is [S]. \longPaper

For the first UP criterion, UPC.1, we give both objective and perceived subcriteria. This is the structure we recommend to be used in a real evaluation. However, with the intention of reducing the complexity of the present paper, for the subsequent UP criteria we only label the various subcriteria with the respective labels and sublabels of effectiveness, efficiency, and satisfaction, e.g.:

UPC.1.1 is labeled with [Effectiveness] and [Measure:Objective].

We try to be exhaustive in our UP subcriteria and to give enough questions to cover all major aspects that need to be measured to achieve the respective goal that the high-level UP criterion relates to. The UP subcriteria are labeled with sublabels representing various specific measures of usability for the above three general categories, e.g.: \onlyShortPaper[Effectiveness:Completeness]. \longPaper

[Effectiveness:Completeness] or

[Satisfaction:Cognitive responses].

\longPaper

Considerations regarding the context of use. The specific context of use needs to be considered for each of our questions. To avoid repetition, we only give one example of how the questions should be formulated so that they relate to the context. This formulation applies to all the questions we propose in this section. For the example of UPC.1, one would read it:

  • Without context
    What is the level of control the data subjects have over their data?

  • With context
    What is the level of control the specified type of data subjects have over their data in the specified context of use?

7.2 List of Usable Privacy criteria

\onlyShortPaper

We give a few examples of the usable privacy criteria, while the full list of all 24 UP criteria can be found in the long version [theTR]. Since our criteria are modular (i.e., each high-level criterion is conceived independently of the others) and can be ordered based on their importance for different application cases, they could be introduced gradually and selectively. Certification bodies (like EuroPriSe) could start to include our UP criteria in their future test catalogs on an article basis; e.g., a good candidate is Article 12 of GDPR (referring to rights that intersect with the transparency principle), as it contains five UP goals.

\longPaper

UPC.1  What is the level of control that the data subjects have over their data? [Based on goal UPG.2][Type of criteria: generic]

  • UPC.1.1  To what degree are the data subjects in control of the personal data? [Effectiveness][Measure:Objective]

  • UPC.1.2  What is the data subjects’ perceived level of control? [Effectiveness][Measure:Perceived]

  • UPC.1.3  How much [Time / Effort / Financial / Material resources] do the data subjects need to invest in order to have control over the processed data? [Efficiency:Time used, Human effort expended, Financial resources expended, Materials expended][Measure:Objective]

  • UPC.1.4  How much [Time / Effort / Financial / Material resources] do the data subjects perceive that they need to invest in order to have control over the processed data? [Efficiency:Time used, Human effort expended, Financial resources expended, Materials expended][Measure:Perceived]

  • UPC.1.5  How frequently do the data subjects use the data controlling tools put at their disposal? [Satisfaction][Measure:Objective]

  • UPC.1.6  What is the level of satisfaction of the data subjects with the achieved level of control? [Satisfaction][Measure:Perceived]

UP criteria related to information and communication

\onlyShortPaper

UPC.2  Is any information and communication addressed to the public or to the data subjects related to the processing of personal data concise, easily accessible and easy to understand? [UPG.18][Type of criteria: Information and communication addressed to the public or to the data subjects]

  • How much TEFM do the data subjects need to invest in order to [ UPC.2.1  access, UPC.2.2  read through, UPC.2.3  understand] the information? [Efficiency:TEFM][O]

  • How much of the information were the data subjects able to [ UPC.2.4  access, UPC.2.5  understand, UPC.2.6  read through]? [O][Es:Completeness]

  • UPC.2.7  To what degree do the data subjects perceive the information as concise? [Satisfaction:Cognitive responses] [P]

  • To what degree do the data subjects perceive the information as easy to [ UPC.2.8  access, UPC.2.9  understand]? [Satisfaction:Cognitive responses][P]

Remark 1

The subcriteria in UPC.2 refer to cognition and understanding, while the subcriteria in UPC.3 refer to visual aspects of the information presented.

Remark 2

In different HCI works one can find different formulations that could seem related to how we formulate the subcriteria, e.g.: “Can the data subjects make sense of the information at all?”; “What is the extent to which the data subjects make sense of the information?”. However, we intend to measure the proportion of the information that is made sense of. Therefore we use formulations that give a statistically measurable outcome, such as “How much?”, “What is the percentage?”, “What is the degree?”.
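As a small illustration of the kind of statistically measurable outcome these formulations yield, the sketch below computes one such percentage from hypothetical per-participant test results; the data and variable names are made up for illustration.

# True = the participant could correctly explain the privacy notice when probed.
outcomes = [True, True, False, True, False, True, True, True]

# e.g., "What is the percentage of the data subjects that understand the language?"
percentage_understood = 100 * sum(outcomes) / len(outcomes)
print(f"{percentage_understood:.0f}% of participants understood the notice")  # 75%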

UPC.3  Is the information about the intended processing provided in an easily visible, intelligible and clearly legible manner? [UPG.21][Type: Info]

  • How much TEFM do the data subjects need to invest in order to [ UPC.3.1 see/locate, and UPC.3.2 distinguish] the information? [Ey:TEFM]

  • How well were the data subjects able to [ UPC.3.3 visually locate and UPC.3.4 distinguish] the information? [Es:Accuracy]

  • How much of the information were the data subjects able to [ UPC.3.5 visually locate and UPC.3.6 distinguish]? [Es:Completeness]

  • To what degree do the data subjects perceive the information as [ UPC.3.7 easily visible, UPC.3.8 intelligible, and UPC.3.9 clearly legible]? [S:Cognitive responses]

Remark 3

Poor visibility can affect the perception of trust, as information that has low visibility can appear to be hidden on purpose. Poor legibility can reflect sloppiness in the way the content is produced, which in turn can give an impression of lack of professionalism. Poor visibility and legibility affect the satisfaction of the data subjects and can cause physical discomfort (e.g., to the eyes, by having to read a text written in a very small font).

UPC.4  Is any information and communication addressed to the public or to the data subjects related to the processing of personal data using clear and plain language? [UPG.19][Type: Info]

  • What is the level of [ UPC.4.1 clearness and UPC.4.2 plainness] of the language? [Es:Accuracy]

  • UPC.4.3 What is the percentage of the data subjects that understand the language? [Es:Completeness]

  • What is the portion of the language considered [ UPC.4.4 plain and UPC.4.5 clear]? [Es:Completeness]

  • How [ UPC.4.6 clear and UPC.4.7 plain] do the data subjects perceive the language to be? [S:Cognitive responses]

\longPaper

UPC.5 Is the information and communication addressed to the public or to the data subjects using, when appropriate, visualization? [UPG.20][Type: Info]

  • UPC.5.1 How much of the data subjects’ expended TEFM is reduced by the use of visualization? [Ey:TEFM]

  • UPC.5.2 How well is the information understood when visualization is used, in comparison to when only text is used? [Es:Accuracy]

  • UPC.5.3 How many data subjects consider the use of visualization in the evaluated context of use as appropriate? [Es:Accuracy][P]

  • UPC.5.4 What is the percentage of data subjects that understand the information better, when visualization is used? [Es:Completeness]

  • UPC.5.5 To what degree is the understanding of the information improved by the use of visualization? [Es:Completeness]

  • UPC.5.6 What is the level of satisfaction of the data subjects when visualization is used? [S:Cognitive responses]

  • UPC.5.7 How appropriate do the data subjects perceive the use of visualization to be for the evaluated context of use? [S:Cognitive responses]

Remark 4

Some of the subcriteria in UPC.5 mention the “understanding of information” in relation to visualization. However, measurements of other aspects besides cognitive effort, such as how visualization improves the rate of finding and accessing information, should be evaluated here as well.

\longPaper

UPC.6  Are the data subjects provided a meaningful overview of the intended processing? [UPG.22][Type: Info]

  • UPC.6.1 How much of the data subjects’ expended TEFM is reduced by using the provided overview? [Ey:TEFM]

  • What is the percentage of the data subjects that [ UPC.6.2 use and UPC.6.3 understand the content better due to] the provided overview? [Es:Accuracy]

  • UPC.6.4 What is the degree of improvement that the overview brings to the understanding of the content by data subjects? [Es:Completeness]

  • UPC.6.5 What is the percentage of data subjects able to express the correct and intended meaning of the provided overview, when probed?
    [Es:Completeness]

  • UPC.6.6 How meaningful do the data subjects perceive the provided overview? [S:Cognitive responses]

\longPaper

UPC.7  Have the data subjects obtained from the controller meaningful information about the logic involved, as well as the significance and the envisaged consequences of automated decision-making, including profiling, to which they are subject? [UPG.28][Type: Info]

Remark 5

To avoid repetition, in the subordinated subcriteria we write the above text in the short form LOGIC.

  • UPC.7.1 How much TEFM do the data subjects need to invest in order to understand the information about LOGIC? [Ey:TEFM]

  • UPC.7.2 What is the percentage of data subjects being able to express the correct and intended meaning of the information provided – in respect to LOGIC – when probed? [Es:Accuracy]

  • To what degree does the provided information [ UPC.7.3 affect the choices and actions and UPC.7.4 improve the understanding] of the data subjects in respect to LOGIC? [Es:Accuracy] resp. [Es:Completeness]

  • UPC.7.5 How much of the provided information – in respect to LOGIC – is understood by the data subjects? [Es:Completeness]

  • UPC.7.6 How meaningful do the data subjects perceive the provided information in respect to LOGIC? [S:Cognitive responses]

\longPaper

UP criteria related to consent

Several UP goals are found in the consent related provisions. These provisions are evaluated in detail in the EuroPriSe sections 2.1.1.1 Processing on the Basis of Consent and 2.1.1.2 Processing on the Basis of a Contract. The criteria we generate here are meant to complement the ones in EuroPriSe by bringing in usability concerns. Marc Langheinrich presents several of the problems with how consent can be misused [23]. One of these is the “take it or leave it” dualism, where the person does not have a real choice and thus obtaining consent comes very close to blackmail. This problem has been ameliorated in the GDPR by asking the controllers to allow for separate consent for different data processing operations. A usability evaluation could help further by revealing how the data subjects perceive the consenting act, whether the data subjects consider consent a real choice, and whether the options to consent to only some of the processing operations are satisfactory.

\onlyShortPaper

UPC.8 Is consent given by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subjects’ agreement to the processing of personal data relating to them? [UPG.3][Type: Consent]

  • UPC.8.1 How much of the consent text do the data subjects understand? [Es:Completeness]

  • UPC.8.2 How much of the implications of consenting do the data subjects understand? [Es:Completeness]

  • To what degree do the data subjects perceive the agreement to be [ UPC.8.3 freely given, UPC.8.4 informed, and UPC.8.5 unambiguous]? [S:Cognitive responses]

\longPaper

UPC.9 Is the data subjects’ consent given following a request by electronic means? If yes, is the request clear, concise and not unnecessarily disruptive to the use of the service for which it is provided? [UPG.4][Type: Consent]

  • UPC.9.1 How much TEFM do the data subjects need to invest in order to understand the request? [Ey:TEFM]

  • UPC.9.2 How much of the request do the data subjects understand?
    [Es:Completeness]

  • UPC.9.3 How much of the TEFM needed to fulfill the tasks that the data subjects are currently doing is wasted by attending to the request? [Ey:TEFM]

  • UPC.9.4 To what degree do the data subjects perceive the request to be unnecessarily disruptive? [S:Cognitive responses]

\longPaper

UPC.10 In the context of a written declaration on another matter, are safeguards ensured so that the data subjects are aware of the fact that, and the extent to which, consent is given? [UPG.12][Type: Consent]

  • UPC.10.1 What is the percentage of the data subjects being able to show that they are aware of the fact that, and the extent to which, consent is given, when probed? [Es:Accuracy]

  • UPC.10.2 Does the level of awareness shown match the intended degree of awareness? [Es:Completeness]

  • UPC.10.3 How sufficient are the safeguards taken to ensure that the data subjects are being aware of the fact that, and the extent to which, consent is given? [Es:Completeness]

  • UPC.10.4 How much TEFM do the data subjects need to invest in order to become aware of the fact that, and the extent to which, consent is given? [Ey:TEFM]

  • To what degree do the data subjects perceive [ UPC.10.5 themselves as being aware, and UPC.10.6 that enough safeguards have been taken to help them become aware] of the fact that, and the extent to which, consent is given? [S:Cognitive responses]

\longPaper

UPC.11 Is the declaration of consent, pre-formulated by the controller, provided in an intelligible and easily accessible form, using clear and plain language, and not containing unfair terms? [UPG.13][Type: Consent]

  • How much TEFM do the data subjects need to invest in order to [ UPC.11.1  access, UPC.11.2  read, and UPC.11.3  understand] the declaration of consent? [O][Ey:TEFM]

  • How does the TEFM spent compare to the TEFM expected by [ UPC.11.4 the controllers, or UPC.11.5 the data subjects]? Are the differences reasonable?

  • To what degree do the data subjects perceive [ UPC.11.6 the terms as unfair, UPC.11.7 the language of the declaration of consent as clear and plain, and UPC.11.8 the declaration of consent as being intelligible and having an easily accessible form]? [P][S:Cognitive responses]

Remark 6

The criterion UPC.11 is similar to the criteria UPC.2 and UPC.3, only that it refers to the declaration of consent (or terms of service), and thus we expect that besides the above subcriteria one would also employ subcriteria analogous to those in UPC.2.x and UPC.3.x.

\longPaper

UPC.12 Is the request for consent presented in a manner clearly distinguishable from the other matters? [UPG.25][Type: Consent]

  • What is the percentage of the data subjects able to [ UPC.12.1 understand that their consent is requested, and UPC.12.2 clearly distinguish the request for consent from the other matters] when probed? [P][Es:Accuracy]

  • UPC.12.3 How much TEFM do the data subjects need to invest in order to distinguish the request for consent from the other matters? [Ey:TEFM]

  • UPC.12.4 To what degree do the data subjects perceive the request for consent as clearly distinguishable from other matters? [P][S:Cognitive responses]

Remark 7

The criterion UPC.12 is to some extent similar to the criterion UPC.3, only that it talks about the distinguishability of the declaration of consent (or terms of service), and thus one can expect more subcriteria similar to those from UPC.3 to be useful.

\longPaper

UPC.13 Do the data subjects have free and genuine choice in giving consent? [UPG.14][Type: Consent]

  • UPC.13.1 To what degree do the data subjects perceive the choice of consenting as free and genuine? [P][S:Cognitive responses]

  • UPC.13.2 Are the data subjects offered any alternatives in case they are unable or unwilling to consent? [O][Es:Completeness]

\longPaper

UPC.14 Are the data subjects able to refuse or withdraw consent without detriment? [UPG.15][Type: Consent]

  • UPC.14.1 How much TEFM do the data subjects lose in relation to withdrawing consent? [Ey:TEFM]

  • UPC.14.2 When evaluating the overall consequences of withdrawing consent, what is the degree of impact on the data subjects? [O][Es:Accuracy]

  • UPC.14.3 To what degree do the data subjects perceive that it is detrimental for them to refuse or withdraw consent? [P][S:Cognitive responses]

\longPaper

UPC.15 Is it as easy to withdraw consent as it is to give consent? [UPG.26][Type: Consent]

  • UPC.15.1 How much TEFM do the data subjects spend to withdraw consent? Compare this to the TEFM needed to give consent (i.e., sum up the TEFM results from the corresponding consent-related subcriteria above; a small comparison sketch is given after this list). [Ey:TEFM]

  • UPC.15.2 Do the data subjects perceive withdrawing consent to be as easy as giving consent? [P][S:Cognitive responses]
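To illustrate the comparison asked for in UPC.15.1, the following is a minimal sketch with hypothetical numbers; the tolerance used to decide “as easy” is our own assumption, not something prescribed by GDPR or by this criteria catalog.

# Hypothetical aggregated TEFM observations (time in seconds, effort in interaction steps):
give_consent     = {"time": 45.0, "effort": 4}
withdraw_consent = {"time": 180.0, "effort": 11}

def as_easy_to_withdraw(give, withdraw, tolerance=1.25):
    # Consider withdrawal "as easy" if no resource costs more than `tolerance`
    # times the corresponding cost of giving consent (the tolerance is an assumption).
    return all(withdraw[k] <= tolerance * give[k] for k in give)

print(as_easy_to_withdraw(give_consent, withdraw_consent))  # False: withdrawing costs much more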

\longPaper

UP criteria related to data subject rights

UPC.16 Are the rights of the data subjects under Articles 15 to 22 (i.e., right of access, right to rectification, right to erasure, right to restriction of processing, right to data portability, right to object, and rights related to automated individual decision-making) facilitated? [UPG.27][Type: Rights]

  • UPC.16.1 How much TEFM do the data subjects spend in order to exercise their rights? [Ey:TEFM]

  • UPC.16.2 How many of the rights under Articles 15 to 22 are facilitated; and to what degree? [Es:Completeness]

  • UPC.16.3 To what degree do the data subjects perceive that their rights are facilitated? [P][S:Cognitive responses]

  • UPC.16.4 What is the percentage of the data subjects that are able to exercise their rights with ease, when probed? [Es:Accuracy]

UPC.17 Are the data subjects aware of how to exercise their rights in relation to processing of personal data? [UPG.8][Type: Rights]

  • UPC.17.1 To what degree do the data subjects feel that they are aware of how to exercise their rights in relation to processing of personal data? [P][S:Cognitive responses]

  • UPC.17.2 What is the percentage of data subjects able to explain, when probed, the ways in which they can exercise their rights in relation to processing of personal data? [Es:Accuracy]

UPC.18 Do the data subjects have the right of access to personal data that has been collected concerning them, and can they exercise this right easily and at reasonable intervals, in order to be aware of, and verify, the lawfulness of the processing? [UPG.23][Type: Rights]

  • UPC.18.1 How much TEFM do the data subjects spend in order to access the personal data that has been collected concerning them? [Ey:TEFM]

  • To what degree do the data subjects perceive [ UPC.18.2 accessing the personal data as easy, UPC.18.3 the intervals they are given access to the data as reasonable, and UPC.18.4 themselves as being aware of the lawfulness of the processing ]? [P][S:Cognitive responses]

  • What is the percentage of the data subjects [ UPC.18.5 able to access the personal data as easily as intended, UPC.18.6 found to be aware of the lawfulness of the processing, UPC.18.7 that can verify the lawfulness of the processing], when probed? [Es:Accuracy]

  • UPC.18.8 How much of the personal data concerning them are the data subjects able to access? [O][Es:Completeness]

UPC.19 Is the right to object explicitly brought to the attention of the data subjects and presented clearly and separately from any other information, at the latest at the time of the first communication with the data subjects? [UPG.29][Type: Rights]

  • UPC.19.1 How much TEFM do the data subjects spend to find the information related to the right to object? [Ey:TEFM]

  • What is the percentage of the data subjects being able to [ UPC.19.2 separate the right to object from any other information, and UPC.19.3 exercise their right to object] – when probed? [Es:Accuracy]

  • To what degree do the data subjects perceive [ UPC.19.4 the right to object as clearly presented, and UPC.19.5 the way the right to object has been brought to their attention as explicit]? [P][S:Cognitive responses]

UPC.20 Are the data subjects aware of risks, rules, safeguards and rights in relation to the processing of their personal data? [UPG.7][Type: Rights]

  • UPC.20.1 How much TEFM do the data subjects spend in order to become aware of the risks, rules, safeguards and rights in relation to the processing of their personal data? [Ey:TEFM]

  • UPC.20.2 How accurately can the data subjects remember which are the risks, rules, safeguards and rights in relation to the processing of their personal data, when probed? [Ey:Cognitive responses]

  • UPC.20.3 How many of the risks, rules, safeguards and rights in relation to the processing of their personal data are the data subjects able to remember? [Es:Completeness]

  • UPC.20.4 To what degree do the data subjects feel that they are aware of the risks, rules, safeguards and rights to their personal data? [P][S:Cognitive responses]

  • UPC.20.5 What is the percentage of the data subjects being able to understand the implications of the risks, rules, safeguards and rights to their personal data? [P][Es:Accuracy]

\longPaper

UP criteria related to the purpose of processing

UPC.21 Is the specific purpose for which personal data are processed explicit? [UPG.9][Type: Purpose]

  • UPC.21.1 How accurately can the data subjects remember the purpose? [Ey:Cognitive responses]

  • UPC.21.2 How many of the purposes can the data subjects remember correctly when several purposes are given? [Es:Completeness]

  • UPC.21.3 What is the percentage of the data subjects able to show that they know the purpose for which their personal data are processed? [P][Es:Accuracy]

UPC.22 Are the personal data adequate, relevant and limited to what is necessary for the purposes for which they are processed? [UPG.10][Type: Purpose]

  • UPC.22.1 To what degree do the data subjects feel that the processing of their personal data is adequate, relevant and limited to what is necessary for the given purposes? [P][S:Cognitive responses]

  • UPC.22.2 How many aspects do the data subjects point out as being inadequate, irrelevant, or more (or less) than necessary? [Es:Completeness]

\longPaper

UP criteria related to the legitimate interest of either the processors or the data subjects

UPC.23 Is the existence of a legitimate interest of a controller carefully assessed, taking into consideration the reasonable expectations of the data subjects based on their relationship with the controller? [UPG.16][Type: Legitimate]

  • UPC.23.1 How much TEFM do the data subjects spend to assess the legitimate interest of the controller? [Ey:TEFM]

  • UPC.23.2 To what degree do the data subjects perceive the legitimate interest of the controller as carefully assessed? [P][S:Cognitive responses]

UPC.24 Has it been assessed whether the interests and fundamental rights of the data subjects could override the interest of the controllers where personal data are processed in circumstances where the data subjects do not reasonably expect further processing? [UPG.17][Type: Legitimate]

  • UPC.24.1 How much do the controllers know about the circumstances in which the data subjects do not reasonably expect further processing? How much of this knowledge has been confirmed by the data subjects? [Es:Completeness]

  • UPC.24.2 Do the data subjects and the controllers mutually agree on what is considered to be reasonable further processing? [Es:Accuracy]

8 Interactions between the three axes

Characteristic of the data protection legislation text is that it always refers to how principles and rights intersect and depend on each other. In this section, we give examples of such references found in the recitals of GDPR, relevant for some of the identified usability goals. The recitals, though not legally binding, are meant to provide more details to the GDPR’s articles. The lawfulness, fairness, and transparency of processing principles, and the right to be informed, appear to be closely interrelated, having also the highest occurrence of usability goals.

  1. The UP criterion UPC.1 refers to the control the data subjects have over their data. The criterion can be related to the right to data portability, through Recital (68), where, due to the aim of strengthening the control of the data subject, the “data subject should also be allowed to receive personal data concerning him or her, which he or she has provided to a controller in a structured, commonly used, machine-readable and interoperable format, and to transmit it to another controller …”. The same UP criterion can also be linked to the data security principle through the provision in Recital (75), where the “risk to the rights and freedoms of natural persons” can result in data subjects being deprived of their rights and freedoms or prevented from exercising control over their personal data. The “risk to the rights and freedoms of natural persons” is also mentioned in [16, pp. 131, 134] in the context of the data security principle.

  2. The UP criteria whose goals are extracted from Recital (58) are related to the transparency of processing principle, which is referred to there directly – “The principle of transparency requires that any information and communication related to the processing of those personal data …” – as well as to the principles of lawfulness and fairness, which are also directly referred to in Recital (39) – “Any processing of the personal data should be lawful and fair”.

  3. The goals behind these criteria relate to the fairness and transparency of processing principles, being placed under these respective categories also in [16, pp. 117, 120].

  4. The goals UPG.9 and UPG.10 are mentioned in the context of the transparency principle, in Recital (39), where the information to be given to the data subject relates to the purpose of processing. This connects the principle of transparency with the principle of purpose limitation in the case of UPG.9, and with the principle of data minimization in the case of UPG.10.

  5. The UP criterion UPC.22, about “the personal data [being] adequate, relevant and limited to what is necessary for the purposes for which they are processed”, is based on the goal UPG.10, extracted from Recital (39) of GDPR. This criterion is mentioned in Recital (39) as one of the requirements for complying with the transparency principle, while also referring to the purpose of processing. This connects the present criterion also with the principle of data minimization and, in addition, with the data protection by design principle. The link between the last two principles can also be seen in the EuroPriSe criteria catalog, where data minimization is the focus of [4, 1.2.1 Data Protection by Design and by Default, p. 18].

\workInProgress

9 Use Cases for Validation

The model we propose is exemplified through three use cases that make use of cyber-physical systems technologies (also known as Internet of Things – IoT – or ubiquitous computing [32]): Assisted Living and Community Care System, Air Quality Monitoring for healthy indoor environments, and Diabetes App. Our approach is especially relevant for IoT technologies [32, 31, 30], as their privacy protection is even more variable and context-dependent, given their nature characterized by: ubiquity (they can be embedded everywhere in our environments or be attached to our bodies), invisibility (their small sizes make them easy to hide), sensing (accurately perceiving the attributes of our environments through temperature, light, or noise sensors, or audio and video recording), and memory amplification (every action and movement of ourselves and our environments can be continuously and unobtrusively recorded) [23]. These attributes make IoT technologies able to generate granular and intimate data about people and everything or everyone in their surroundings, thereby reducing privacy to zero.

The model we propose is meant to identify and consider the specifics related to industry, technology and social-cultural context, while remaining sufficiently generic to be applicable to all IT services, systems and products. The model is thus suitable for context-aware systems, by eliciting meaningful information about the context where the systems are deployed and the environment (the physical space, other devices, or people) they are interacting with.

\workInProgress

10 UP criteria trade-offs (Future work?)

Achieving some privacy goals might imply renouncing or lowering the expectations for other goals. Which privacy goals are prioritized contributes to business differentiation and supports some types of users better than others. We thus add support for mapping between preferences/expectations and privacy goals. This lays the basis for future work, such as creating tools for the producers to clearly show why their products should be preferred over others. Furthermore, out of this initiative, we envision the emergence of services that will specialize in reviewing IT products based on their privacy characteristics.

11 Conclusion and Further Work

The benefits of the UP Cube model are multiple: (i) emphasizing both the perspectives of data subjects and of controllers; (ii) representing visually on the three variability axes the existing rights and principles criteria from EuroPriSe, together with our new UP criteria; (iii) visualizing intersections between the three axes; (iv) allowing ordering of the criteria on each axis.

The theory behind our usability evaluation of privacy is based on the well established standards ISO/IEC29100:2011 and ISO 9241-11:2018. We worked directly with the GDPR text, guided by [16], which also inspired our structuring of the EuroPriSe criteria into rights and principles. Our HCI and usability perspective on privacy is influenced by the seminal works [7, 17, 6, 22, 21, 27, 12].

To build the UP Cube we have:

  • identified from the GDPR text 30 UP goals,

  • created 24 UP criteria, each with measurable subcriteria, and

  • restructured the criteria of EuroPriSe, laying them at the basis of the UP Cube.

Further Work.

The UP Cube is meant as the groundwork for building a certification methodology, extending EuroPriSe to evaluate the usability of privacy. The proposed UP criteria are designed to produce measurable evaluations, useful for generating privacy labels in order to guide stakeholders when choosing technological products, by representing and visualizing the different levels of privacy. To achieve this larger goal, one needs to investigate which existing HCI methods for usability testing should be used for each of the UP criteria, and in what way.

One example of such a usability method for measuring the perceived usability of a system is the System Usability Scale (SUS) [9], a ten-item attitude Likert scale questionnaire. The standard [5, Annex B: Usability measurements] also gives examples of methods that produce measurements relevant for our UP criteria, s.a. observing the user behavior to identify the actual usability problems, or asking the users to carry out tasks in a real or simulated context of use and measuring the outcomes. The experts can also run heuristic evaluations following design principles, theories and standards from the design and cognitive fields. More concrete examples of HCI methods and how these could be used for privacy and security solutions can be found in [21].
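As an example of how such a method produces a number that can feed into the UP criteria, the following sketch computes the standard SUS score (0–100) from the ten 1–5 Likert responses of one participant; the responses shown are made up for illustration.

def sus_score(responses):
    # System Usability Scale: ten 1-5 Likert responses.
    # Odd-numbered items are positively worded (contribution = response - 1),
    # even-numbered items are negatively worded (contribution = 5 - response);
    # the summed contributions are scaled by 2.5 to the range 0-100.
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    contributions = [(r - 1) if i % 2 == 0 else (5 - r)   # 0-based index: even index = odd item
                     for i, r in enumerate(responses)]
    return sum(contributions) * 2.5

# Made-up responses of one participant to the ten SUS statements:
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0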

Which methods are appropriate to use, the number of test persons, and other test related concerns, depend on contextual factors, s.a. the type of technology, users and industry. Defining the required context is what our model offers support for. However, more work (e.g., providing guidelines and examples) is needed on how the context of use can be established.

HCI practices conduct user studies throughout the whole lifecycle of a product. These studies are run by the company itself, with the help of HCI (User Experience or Interaction Design) experts. For certification, the accredited data protection assessors would use the results provided by the company to answer the UP criteria questions. In cases where the results are insufficient or unreliable, the assessors can recommend/require further testing. It would be valuable to have guidelines, e.g., in the form of a check-list, to help the assessors establish whether the results from the company are reliable and sufficient. Recommendations for the businesses are useful as well, to guide how to conduct privacy-related user testing, so that the results are later reliable for certification.

With the same goal of achieving a complete methodology that can be taken into use by the accreditation bodies, building on the present model, one could create a visual representation of the evaluation, i.e., a translation of the measurements of usability of privacy provided by the UP criteria into a visually appealing privacy label. This should serve as a vertically graded scale to differentiate one customer product from another. According to ISO 9241-11:2018, “where usability is higher than expected, the system, product or service can have a competitive advantage (e.g. customer retention, or customers who are willing to pay a premium)”. The visuals are meant to come in addition to the GDPR compliance seal and to reflect the usability of the privacy implemented. The purpose will be the same as for the methodology: to help the businesses that have already achieved GDPR compliance to further differentiate themselves on the market. From the point of view of the user of the product, the visual scale would offer support for choosing the service or product that best respects her privacy expectations.

To further validate our UP Cube model and for exemplification, we are applying the UP criteria to three use cases taken from pilots done in an ongoing European project called Secure COnnected Trustable Things (SCOTT): (i) Assisted Living and Community Care System, (ii) Air Quality Monitoring for healthy indoor environments, and (iii) Diabetes App. These are examples of IoT systems [32, 31, 30] for which our model is especially relevant, as the privacy protection is even more variable and context-dependent. IoT technologies, due to their nature (i.e., ubiquity, invisibility, and continuous sensing) [23], are able to generate granular and intimate data about people and everything or everyone in their surroundings, by that reducing privacy to zero.

\longPaper

Appendix A Annexes

A.1 Annex A: A list of the Recitals and Articles of GDPR, in full text, from which the usability goals have been extracted.

(6) Rapid technological developments and globalisation have brought new challenges for the protection of personal data. The scale of the collection and sharing of personal data has increased significantly. Technology allows both private companies and public authorities to make use of personal data on an unprecedented scale in order to pursue their activities. Natural persons increasingly make personal information available publicly and globally. Technology has transformed both the economy and social life, and should further facilitate the free flow of personal data within the Union and the transfer to third countries and international organisations, while ensuring a high level of the protection of personal data.

(7) Those developments require a strong and more coherent data protection framework in the Union, backed by strong enforcement, given the importance of creating the trust that will allow the digital economy to develop across the internal market. Natural persons should have control of their own personal data. Legal and practical certainty for natural persons, economic operators and public authorities should be enhanced.

(32) Consent should be given by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of the data subject’s agreement to the processing of personal data relating to him or her, such as by a written statement, including by electronic means, or an oral statement. This could include ticking a box when visiting an internet website, choosing technical settings for information society services or another statement or conduct which clearly indicates in this context the data subject’s acceptance of the proposed processing of his or her personal data. Silence, pre-ticked boxes or inactivity should not therefore constitute consent. Consent should cover all processing activities carried out for the same purpose or purposes. When the processing has multiple purposes, consent should be given for all of them. If the data subject’s consent is to be given following a request by electronic means, the request must be clear, concise and not unnecessarily disruptive to the use of the service for which it is provided.

(39) Any processing of personal data should be lawful and fair. It should be transparent to natural persons that personal data concerning them are collected, used, consulted or otherwise processed and to what extent the personal data are or will be processed. The principle of transparency requires that any information and communi­cation relating to the processing of those personal data be easily accessible and easy to understand, and that clear and plain language be used. That principle concerns, in particular, information to the data subjects on the identity of the controller and the purposes of the processing and further information to ensure fair and transparent processing in respect of the natural persons concerned and their right to obtain confirmation and communication of personal data concerning them which are being processed. Natural persons should be made aware of risks, rules, safeguards and rights in relation to the processing of personal data and how to exercise their rights in relation to such processing. In particular, the specific purposes for which personal data are processed should be explicit and legitimate and determined at the time of the collection of the personal data. The personal data should be adequate, relevant and limited to what is necessary for the purposes for which they are processed. This requires, in particular, ensuring that the period for which the personal data are stored is limited to a strict minimum. Personal data should be processed only if the purpose of the processing could not reasonably be fulfilled by other means. In order to ensure that the personal data are not kept longer than necessary, time limits should be established by the controller for erasure or for a periodic review. Every reasonable step should be taken to ensure that personal data which are inaccurate are rectified or deleted. Personal data should be processed in a manner that ensures appropriate security and confidentiality of the personal data, including for preventing unauthorised access to or use of personal data and the equipment used for the processing.

(42) Where processing is based on the data subject’s consent, the controller should be able to demonstrate that the data subject has given consent to the processing operation. In particular in the context of a written declaration on another matter, safeguards should ensure that the data subject is aware of the fact that and the extent to which consent is given. In accordance with Council Directive 93/13/EEC (1) a declaration of consent pre-formulated by the controller should be provided in an intelligible and easily accessible form, using clear and plain language and it should not contain unfair terms. For consent to be informed, the data subject should be aware at least of the identity of the controller and the purposes of the processing for which the personal data are intended. Consent should not be regarded as freely given if the data subject has no genuine or free choice or is unable to refuse or withdraw consent without detriment.

(43) In order to ensure that consent is freely given, consent should not provide a valid legal ground for the processing of personal data in a specific case where there is a clear imbalance between the data subject and the controller, in particular where the controller is a public authority and it is therefore unlikely that consent was freely given in all the circumstances of that specific situation. Consent is presumed not to be freely given if it does not allow separate consent to be given to different personal data processing operations despite it being appropriate in the individual case, or if the performance of a contract, including the provision of a service, is dependent on the consent despite such consent not being necessary for such performance.

(47) The legitimate interests of a controller, including those of a controller to which the personal data may be disclosed, or of a third party, may provide a legal basis for processing, provided that the interests or the fundamental rights and freedoms of the data subject are not overriding, taking into consideration the reasonable expectations of data subjects based on their relationship with the controller. Such legitimate interest could exist for example where there is a relevant and appropriate relationship between the data subject and the controller in situations such as where the data subject is a client or in the service of the controller. At any rate the existence of a legitimate interest would need careful assessment including whether a data subject can reasonably expect at the time and in the context of the collection of the personal data that processing for that purpose may take place. The interests and fundamental rights of the data subject could in particular override the interest of the data controller where personal data are processed in circumstances where data subjects do not reasonably expect further processing. Given that it is for the legislator to provide by law for the legal basis for public authorities to process personal data, that legal basis should not apply to the processing by public authorities in the performance of their tasks. The processing of personal data strictly necessary for the purposes of preventing fraud also constitutes a legitimate interest of the data controller concerned. The processing of personal data for direct marketing purposes may be regarded as carried out for a legitimate interest.

(58) The principle of transparency requires that any information addressed to the public or to the data subject be concise, easily accessible and easy to understand, and that clear and plain language and, additionally, where appropriate, visualisation be used. Such information could be provided in electronic form, for example, when addressed to the public, through a website. This is of particular relevance in situations where the proliferation of actors and the technological complexity of practice make it difficult for the data subject to know and understand whether, by whom and for what purpose personal data relating to him or her are being collected, such as in the case of online advertising. Given that children merit specific protection, any information and communication, where processing is addressed to a child, should be in such a clear and plain language that the child can easily understand.

(60) The principles of fair and transparent processing require that the data subject be informed of the existence of the processing operation and its purposes. The controller should provide the data subject with any further information necessary to ensure fair and transparent processing taking into account the specific circumstances and context in which the personal data are processed. Furthermore, the data subject should be informed of the existence of profiling and the consequences of such profiling. Where the personal data are collected from the data subject, the data subject should also be informed whether he or she is obliged to provide the personal data and of the consequences, where he or she does not provide such data. That information may be provided in combination with standardised icons in order to give in an easily visible, intelligible and clearly legible manner, a meaningful overview of the intended processing. Where the icons are presented electronically, they should be machine-readable.

(63) A data subject should have the right of access to personal data which have been collected concerning him or her, and to exercise that right easily and at reasonable intervals, in order to be aware of, and verify, the lawfulness of the processing. This includes the right for data subjects to have access to data concerning their health, for example the data in their medical records containing information such as diagnoses, examination results, assessments by treating physicians and any treatment or interventions provided. Every data subject should therefore have the right to know and obtain communication in particular with regard to the purposes for which the personal data are processed, where possible the period for which the personal data are processed, the recipients of the personal data, the logic involved in any automatic personal data processing and, at least when based on profiling, the consequences of such processing. Where possible, the controller should be able to provide remote access to a secure system which would provide the data subject with direct access to his or her personal data. That right should not adversely affect the rights or freedoms of others, including trade secrets or intellectual property and in particular the copyright protecting the software. However, the result of those considerations should not be a refusal to provide all information to the data subject. Where the controller processes a large quantity of information concerning the data subject, the controller should be able to request that, before the information is delivered, the data subject specify the information or processing activities to which the request relates.

(100) In order to enhance transparency and compliance with this Regulation, the establishment of certification mechanisms and data protection seals and marks should be encouraged, allowing data subjects to quickly assess the level of data protection of relevant products and services.

CHAPTER II. Principles

Article 7. Conditions for consent

2. If the data subject’s consent is given in the context of a written declaration which also concerns other matters, the request for consent shall be presented in a manner which is clearly distinguishable from the other matters, in an intelligible and easily accessible form, using clear and plain language. Any part of such a declaration which constitutes an infringement of this Regulation shall not be binding.

3. The data subject shall have the right to withdraw his or her consent at any time. The withdrawal of consent shall not affect the lawfulness of processing based on consent before its withdrawal. Prior to giving consent, the data subject shall be informed thereof. It shall be as easy to withdraw as to give consent.
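To make the symmetry required by Article 7(3) concrete, the following minimal sketch shows a consent record where withdrawing consent is a single action, exactly like giving it. This is purely an illustration: the class, field, and method names are assumptions made for this sketch and are not prescribed by the GDPR or by any certification scheme.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """Illustrative consent record for a single processing purpose.

    The field and method names are assumptions made for this sketch;
    they are not prescribed by the GDPR or by any certification scheme.
    """
    purpose: str                        # e.g. "newsletter", "analytics"
    given_at: Optional[datetime] = None
    withdrawn_at: Optional[datetime] = None

    def give(self) -> None:
        # Giving consent is one action, recorded for this purpose only,
        # in line with Recital 43 on separate consent per operation.
        self.given_at = datetime.now(timezone.utc)
        self.withdrawn_at = None

    def withdraw(self) -> None:
        # Withdrawing is a single, symmetric action, mirroring
        # Article 7(3): as easy to withdraw as to give.
        self.withdrawn_at = datetime.now(timezone.utc)

    @property
    def active(self) -> bool:
        return self.given_at is not None and self.withdrawn_at is None


newsletter = ConsentRecord(purpose="newsletter")
newsletter.give()         # consent given with one call
newsletter.withdraw()     # withdrawn with one call, no extra steps
print(newsletter.active)  # False
```

Keeping one record per purpose also documents, for a later evaluation, that consent could be given and withdrawn separately for each processing operation.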

CHAPTER III. Rights of the data subject

Section 1. Transparency and modalities

Article 12. Transparent information, communication and modalities for the exercise of the rights of the data subject

1. The controller shall take appropriate measures to provide any information referred to in Articles 13 and 14 and any communication under Articles 15 to 22 and 34 relating to processing to the data subject in a concise, transparent, intelligible and easily accessible form, using clear and plain language, in particular for any information addressed specifically to a child. The information shall be provided in writing, or by other means, including, where appropriate, by electronic means. When requested by the data subject, the information may be provided orally, provided that the identity of the data subject is proven by other means.

2. The controller shall facilitate the exercise of data subject rights under Articles 15 to 22. In the cases referred to in Article 11(2), the controller shall not refuse to act on the request of the data subject for exercising his or her rights under Articles 15 to 22, unless the controller demonstrates that it is not in a position to identify the data subject.

7. The information to be provided to data subjects pursuant to Articles 13 and 14 may be provided in combination with standardised icons in order to give in an easily visible, intelligible and clearly legible manner a meaningful overview of the intended processing. Where the icons are presented electronically they shall be machine-readable.
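The GDPR does not fix any vocabulary or format for machine-readable icons. As one possible reading of Article 12(7), the sketch below encodes a set of standardised icons as JSON that could be published alongside the visible icons; all identifiers, labels, and URLs are hypothetical and serve only to illustrate what "machine-readable" could mean in practice.

```python
import json

# Hypothetical icon identifiers, labels and URLs; this encoding is an
# assumption made only for illustration, not a prescribed format.
processing_overview = {
    "controller": "Example Ltd.",
    "icons": [
        {"id": "purpose-marketing",
         "label": "Data is used for marketing",
         "href": "https://example.org/icons/purpose-marketing.svg"},
        {"id": "third-party-sharing",
         "label": "Data is shared with third parties",
         "href": "https://example.org/icons/third-party-sharing.svg"},
    ],
}

# Publishing this structure next to the visible icons would let browser
# extensions or evaluation tools parse the overview automatically, while
# the icons themselves remain legible to people.
print(json.dumps(processing_overview, indent=2))
```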

Section 2. Information and access to personal data

Article 15. Right of access by the data subject

1. The data subject shall have the right to obtain from the controller confirmation as to whether or not personal data concerning him or her are being processed, and, where that is the case, access to the personal data and the following information: (h) the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4) and, at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject.
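As an illustration of how a controller might structure a reply to such an access request, the sketch below mirrors the items listed in Article 15(1), including the information on automated decision-making under point (h). The field names and example values are assumptions made for this sketch; the Regulation prescribes no data format.

```python
import json
from dataclasses import dataclass, asdict
from typing import List

@dataclass
class AccessResponse:
    """Illustrative shape of a reply to an Article 15 access request.

    Field names are assumptions chosen to mirror Article 15(1);
    the Regulation does not prescribe any particular data format.
    """
    purposes: List[str]
    categories_of_data: List[str]
    recipients: List[str]
    storage_period: str
    automated_decision_making: bool
    logic_summary: str           # "meaningful information about the logic involved"
    envisaged_consequences: str

response = AccessResponse(
    purposes=["credit scoring"],
    categories_of_data=["payment history", "income bracket"],
    recipients=["Example Credit Bureau"],
    storage_period="5 years after the end of the contract",
    automated_decision_making=True,
    logic_summary="The score is a weighted sum of payment punctuality and income stability.",
    envisaged_consequences="A low score may lead to a refusal of credit.",
)
print(json.dumps(asdict(response), indent=2))
```

A structured response of this kind could also be offered through the "remote access to a secure system" envisaged in Recital 63, instead of, or in addition to, a written reply.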

Section 3. Rectification and erasure

Article 17. Right to erasure (’right to be forgotten’)

2. Where the controller has made the personal data public and is obliged pursuant to paragraph 1 to erase the personal data, the controller, taking account of available technology and the cost of implementation, shall take reasonable steps, including technical measures, to inform controllers which are processing the personal data that the data subject has requested the erasure by such controllers of any links to, or copy or replication of, those personal data.

Section 4. Right to object and automated individual decision-making

Article 21. Right to object

4. At the latest at the time of the first communication with the data subject, the right referred to in paragraphs 1 and 2 shall be explicitly brought to the attention of the data subject and shall be presented clearly and separately from any other information.

Article 22. Automated individual decision-making, including profiling

1. The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.

Footnotes

  1. A long version of this paper is available as [?].
  2. We would like to thank the anonymous reviewers for helping improve the paper.
  3. We would like to thank the anonymous reviewers for helping improve the paper, as well as the members of the PriSec group at Karlstad University, for useful discussions and help with the present research while doing a secondment there.
  4. GDPR – General Data Protection Regulation from European Union [2].
  5. Note that system/product/service are used interchangeably throughout the paper.
  6. For this work the authors have received the 2019 PET Award for Outstanding Research in Privacy Enhancing Technologies.
  7. GDPR – General Data Protection Regulation from European Union [2].
  8. The two concepts “privacy” and “data protection” are used interchangeably throughout this paper. The right to privacy, which emerged in 1948 in the Universal Declaration of Human Rights, is closely related to the present-day right to the protection of personal data. Though the two rights are related and protect similar values, the right to data protection is broader, as it applies to the processing of all kinds of personal data, beyond data related to privacy [16]. In addition, we address only the privacy issues concerning digital technology.
  9. At the time of writing, EuroPriSe’s criteria catalog has not yet been approved pursuant to Article 42(5) GDPR, and EuroPriSe GmbH has not yet been accredited as a certification body pursuant to Article 43 GDPR. EuroPriSe is committed to obtaining the approval of its certification criteria and the accreditation as a certification body in accordance with Art. 42 f. GDPR as soon as possible.
  10. This project has received funding from the European Union’s Horizon 2020 research and innovation program under the Marie Skłodowska-Curie grant agreement No 675730, within the Marie Skłodowska-Curie Innovative Training Networks (ITN-ETN) framework; https://privacyus.eu/
  11. https://www.ifip-summerschool.org/
  12. The Legal Design Alliance (LeDA) is formed of lawyers, designers, technologists, academics, and other professionals who are committed to making the legal system more human-centered and effective, through the use of design. https://www.legaldesignalliance.org/
  13. The Usable Privacy Policy Project, https://usableprivacy.org/. Visit also https://explore.usableprivacy.org/ to navigate privacy policy annotations extracted by both humans and machine learning techniques.
  14. CLAUDETTE (automated CLAUse DETectEr), http://claudette.eui.eu/about/index.html. See also their tool http://www.claudette.eu/gdpr/
  15. Following the requirement for a consistency mechanism set out in Article 63 of the GDPR, the work of the certification bodies and DPAs in Germany is coordinated and made consistent through “The Standard Data Protection Model” (https://www.datenschutz-mv.de/datenschutz/datenschutzmodell/), issued by the Conference of the Independent Data Protection Authorities of the Bund and the Länder (Germany) on 9–10 November 2016. This document is a good reference for methods and guidance for implementing the data protection principles.
  16. https://en.wikipedia.org/wiki/TrustArc#Criticism_and_Controversies

References

  1. Information technology – Security techniques – Privacy framework. Standard ISO/IEC 29100:2011 (2011)
  2. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC. Official Journal of the European Union L 119/1 (2016)
  3. The House of Lords EU Committee, European Union Committee’s report on Online Platforms and the Digital Single Market (2016), https://publications.parliament.uk/pa/ld201516/ldselect/ldeucom/129/12909.htm#_idTextAnchor235
  4. EuroPriSe Criteria for the certification of IT products and IT-based services – v201701. Tech. rep. (2017), https://www.european-privacy-seal.eu/AppFile/GetFile/6a29f2ca-f918-4fdf-a1a8-7ec186b2e78a
  5. Ergonomics of human-system interaction – Part 11: Usability: Definitions and concepts. Standard ISO 9241-11:2018 (2018)
  6. Ackerman, M.S., Mainwaring, S.D.: Privacy Issues and Human-Computer Interaction. In: Cranor, L., Garfinkel, S. (eds.) Security and usability: designing secure systems that people can use, pp. 381–399. O’Reilly (2005)
  7. Adams, A., Sasse, M.A.: Users are not the enemy. Communications of the ACM 42(12), 41–46 (1999)
  8. Balboni, P., Dragan, T.: Controversies and challenges of trustmarks: Lessons for privacy and data protection seals. In: Privacy and Data Protection Seals, pp. 83–111. Springer (2018)
  9. Brooke, J.: SUS – A quick and dirty usability scale. Usability evaluation in industry 189(194),  4–7 (1996)
  10. Cavoukian, A., Chibba, M.: Privacy seals in the USA, Europe, Japan, Canada and Australia. In: Privacy and Data Protection Seals, pp. 59–82. Springer (2018)
  11. Cooper, A.: The Inmates Are Running the Asylum – Why High-Tech Products Drive Us Crazy and How to Restore the Sanity. Sams Publishing (2004)
  12. Cranor, L.F.: SIGCHI Social Impact Award Talk – Making Privacy and Security More Usable. CHI EA ’18, ACM (2018). https://doi.org/10.1145/3170427.3185061
  13. Cranor, L.F., Garfinkel, S.: Security and usability: designing secure systems that people can use. O’Reilly (2005)
  14. Dumas, J.S., Redish, J.C.: A Practical Guide to Usability Testing. Intellect Books, Revised edn. (1999)
  15. Edelman, B.: Adverse selection in online “trust” certifications and search results. Electronic Commerce Research and Applications 10(1), 17–25 (2011)
  16. European Union Agency for Fundamental Rights: Handbook on European data protection law – 2018 edition. Luxembourg: Publications Office of the European Union (2018)
  17. Good, N.S., Krekelberg, A.: Usability and privacy: a study of Kazaa P2P file-sharing. In: Proceedings of the SIGCHI conference on Human factors in computing systems. pp. 137–144. ACM (2003)
  18. Hansen, M.: The Schleswig-Holstein data protection seal. In: Privacy and Data Protection Seals, pp. 35–48. Springer (2018)
  19. Iachello, G., Hong, J.: End-user Privacy in Human-Computer Interaction. Foundations and Trends in Human-Computer Interaction 1(1), 1–137 (2007)
  20. Kamara, I., De Hert, P.: Data protection certification in the EU: Possibilities, Actors and Building Blocks in a reformed landscape. In: Privacy and Data Protection Seals, pp. 7–34. Springer (2018)
  21. Karat, C.M., Brodie, C., Karat, J.: Usability design and evaluation for privacy and security solutions. In: Cranor, L., Garfinkel, S. (eds.) Security and usability: designing secure systems that people can use, pp. 47–74. O’Reilly (2005)
  22. Karat, C.M., Karat, J., Brodie, C.: Privacy Security and Trust: Human-Computer Interaction Challenges and Opportunities at their Intersection. The Human-Computer Interaction Handbook pp. 669–700 (2012)
  23. Langheinrich, M.: Privacy by design – Principles of privacy-aware ubiquitous systems. In: International Conference on Ubiquitous Computing. pp. 273–291. Springer (2001)
  24. Nissim, K., Bembenek, A., Wood, A., Bun, M., Gaboardi, M., Gasser, U., O’Brien, D.R., Steinke, T., Vadhan, S.: Bridging the gap between computer science and legal approaches to privacy. Harvard Journal of Law & Technology 31(2),  687 (Spring 2018)
  25. Papakonstantinou, V.: Introduction: Privacy and Data Protection Seals. In: Privacy and Data Protection Seals, pp. 1–6. Springer (2018)
  26. Patrick, A.S., Kenny, S.: From privacy legislation to interface design: Implementing information privacy in human-computer interactions. In: International Workshop on Privacy Enhancing Technologies. pp. 107–124. Springer (2003)
  27. Patrick, A.S., Kenny, S., Holmes, C., van Breukelen, M.: Human Computer Interaction. In: Handbook for Privacy and Privacy-Enhancing Technologies: The case of Intelligent Software Agents, chap. 12, pp. 249–290 (2003)
  28. Preece, J., Rogers, Y., Sharp, H.: Interaction design: beyond human-computer interaction. John Wiley & Sons (2015)
  29. Schneier, B.: Data and Goliath: The hidden battles to collect your data and control your world. WW Norton & Company (2015)
  30. Sicari, S., Rizzardi, A., Grieco, L.A., Coen-Porisini, A.: Security, privacy and trust in Internet of Things: The road ahead. Computer networks 76, 146–164 (2015)
  31. Stankovic, J.A.: Research directions for the internet of things. IEEE Internet of Things Journal 1(1),  3–9 (2014)
  32. Weiser, M.: Ubiquitous computing. Computer (10), 71–72 (1993)
  33. Whitten, A., Tygar, J.D.: Why Johnny Can’t Encrypt: A Usability Evaluation of PGP 5.0. In: USENIX Security Symposium. vol. 348 (1999)