Privacy in an age of the internet, social networks and Big Data

Position Paper | Year 2017
Class of Technical Sciences


The current use of the internet, social media and big data severely affects the privacy of ordinary users. This position paper is aimed primarily at private users, young and old, who have had no special education or training in ICT but nevertheless use these services intensively, and who worry, rightly or not, about the hazards to which their privacy is exposed. Addressing those worries requires not only a better and deeper understanding of the technological possibilities and limitations, but also of the commercial interests at play, and of how both relate to the constraints on and threats to our personal privacy when we use these many, often valuable, services. The specific privacy of patients, and the privacy regulations for companies and institutions that keep and process files with data on individuals, employees, students or customers, are not dealt with here; for these the reader is referred to other reports. This position paper was conceived by a working group of KVAB members and external experts covering the different aspects of this interdisciplinary subject, who met regularly over a period of one year.

Since the ICT world is awash with jargon whose meaning often fails to come across, and since newspapers sometimes describe alarming but poorly substantiated situations, we first discuss the main concepts, both at the level of machine learning, data extraction and big data and at the level of the privacy issues that arise, and finally the ways in which better privacy can be achieved.

To make this more concrete for the average reader, we discuss important privacy hazards in a number of concrete situations: the digital life of a family, big data policing based on passenger profiles, the internet of things, the context of smart cities, distributed information versus central collection, autonomous vehicles, and location information. Although this digital revolution is not over yet, the average user can already adjust his or her behavior.

There is an extensive scientific literature on this subject, but many widely accessible texts, including websites, have also appeared recently; the interested reader is referred to the bibliography.
The ten recommendations focus on various target groups and situations.

Recommendation 1: Responsibilities. Privacy in the age of big data is an issue for citizens, engineers, consumers, companies, institutions, media and governments. This calls for providing sufficient resources to the supervisory authorities, especially with regard to companies whose earnings model is based on big data analysis.

Recommendation 2: Alert citizens. Citizens whose data are being processed should exercise their rights under the GDPR to the fullest. Verifying one's personal data requires insight into the use and misuse of those data, as a precondition for genuine freedom of choice. Precisely because this is extremely difficult for individuals, we recommend that data subjects use the opportunity to exercise their claims by mandating consumer or privacy organizations (Article 80 GDPR).

Recommendation 3: Precaution, profile transparency and purpose limitation. Although profiles themselves do not relate to a particular person and are therefore not personal data, applying a profile to a person who fits it falls under the fundamental right to data protection (GDPR). The right to profile transparency implies the obligation to inform data subjects and to explain how they are profiled, going beyond a mere correlation or statistical relationship.
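
By way of illustration only (the position paper itself prescribes no technique, and all names and figures below are made up), an explanation that goes beyond a bare correlation can show, per input, how much each feature contributed to a profile score:

```python
def explain_score(weights, features):
    """Score a hypothetical linear profile and return a per-feature
    breakdown, so a data subject can see why the profile matched."""
    contributions = {name: weights[name] * value
                     for name, value in features.items()}
    return sum(contributions.values()), contributions

# Hypothetical profile: weights and feature values are fabricated.
weights  = {"visits_per_week": 0.5, "avg_basket_eur": 0.25}
features = {"visits_per_week": 3, "avg_basket_eur": 4.0}
score, why = explain_score(weights, features)
print(score)  # 2.5
print(why)    # {'visits_per_week': 1.5, 'avg_basket_eur': 1.0}
```

Such a breakdown tells the person concerned not merely that they matched a profile, but which of their data drove the match.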

Recommendation 4: Power imbalance. If the controller of an ICT service relies on consent for the use of personal data, that consent must be easy to withdraw and limited in time. Consent is not valid where there is a manifest power imbalance between the data subject and the controller or processor, e.g. because the controller provides the dominant (or only) service in the market. The controller must demonstrate that there is no power imbalance, or that the imbalance cannot affect the consent of the person concerned.

Recommendation 5: Privacy by design. The builders of ICT and IoT devices must use technologies that preserve privacy and provide transparency for the end user. They need to apply 'privacy by design', treating privacy as an important requirement from the start of the design rather than bolting it on afterwards. Service providers must allow users to combine services of different origins. The designers of algorithms must write their algorithms so as to ensure users' privacy. Application designers need to provide transparency and work on efficient and effective technologies that let users control how their data are used. In addition, applications should be certified so that users can be sure they are safe. Privacy must be the default.
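
A minimal sketch of what 'privacy by default' can look like in code, under the assumption of a hypothetical preference object and consent ledger (none of these names come from the paper): every sharing option starts disabled, consent is recorded per purpose, and withdrawal is a single call:

```python
from dataclasses import dataclass

@dataclass
class SharingPreferences:
    """Hypothetical preference object: every sharing option starts
    disabled, so doing nothing means sharing nothing."""
    share_location: bool = False
    share_contacts: bool = False
    personalized_ads: bool = False

class ConsentLedger:
    """Hypothetical ledger in which consent is explicit, per purpose,
    and as easy to withdraw as to grant (cf. Recommendation 4)."""
    def __init__(self):
        self._grants = {}

    def grant(self, purpose):
        self._grants[purpose] = True

    def withdraw(self, purpose):
        self._grants[purpose] = False  # one call, no obstacles

    def allowed(self, purpose):
        # No recorded decision means no consent: default deny.
        return self._grants.get(purpose, False)
```

With such a ledger, allowed("ads") returns False until the user has explicitly granted that purpose, and a single withdraw call reverses the grant.
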
Recommendation 6: Role of government and companies. It is the duty of governments and companies to check, for each big data solution, whether the risks to the protection of personal data and to society as a whole outweigh the benefits. In doing so, one should always check whether the same goal can be achieved by using less data or by aggregating data.
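
As one hypothetical illustration of 'using less data or aggregating data': release per-group counts instead of individual records, and suppress groups too small to hide an individual. The threshold of 5 is an arbitrary illustrative choice, not a figure from the paper:

```python
from collections import Counter

def aggregate_counts(records, key, min_group_size=5):
    """Release only per-group counts; suppress small groups that
    could identify individuals (illustrative threshold)."""
    counts = Counter(r[key] for r in records)
    return {g: n for g, n in counts.items() if n >= min_group_size}

records = [
    {"district": "A", "age": 34}, {"district": "A", "age": 51},
    {"district": "A", "age": 29}, {"district": "A", "age": 62},
    {"district": "A", "age": 45}, {"district": "B", "age": 40},
]
# District B has only one record and is suppressed from the output.
print(aggregate_counts(records, "district"))  # {'A': 5}
```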

Recommendation 7: Preventing unwanted data bias. Responsible designers and service providers must always check whether inaccurate or unfair 'data bias', 'algorithm bias' or 'output bias' is hidden in the data sets on which algorithms are trained, in the mathematical models themselves, or in the output (indirect discrimination).
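
One common heuristic for spotting such output bias, offered here as an illustration rather than as the paper's method, is the 'disparate impact' ratio: compare the rate of favorable outcomes between a protected group and the rest; a widespread rule of thumb flags ratios below 0.8 for review. The data below are fabricated:

```python
def disparate_impact(outcomes, groups, protected, favorable=1):
    """Ratio of favorable-outcome rates: protected group vs. the rest.
    A common rule of thumb flags ratios below 0.8 for review."""
    prot = [o for o, g in zip(outcomes, groups) if g == protected]
    rest = [o for o, g in zip(outcomes, groups) if g != protected]
    rate_prot = sum(o == favorable for o in prot) / len(prot)
    rate_rest = sum(o == favorable for o in rest) / len(rest)
    return rate_prot / rate_rest

# Hypothetical model decisions (1 = loan approved) per group.
outcomes = [1, 0, 0, 1, 1, 1, 0, 1, 1, 1]
groups   = ["x", "x", "x", "x", "y", "y", "y", "y", "y", "y"]
ratio = disparate_impact(outcomes, groups, protected="x")
print(f"disparate impact: {ratio:.2f}")  # 0.60 -> flag for review
```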

Recommendation 8: Limits to the use of big data by the government. The use of big data in the public sector, both for detecting tax and social security fraud and in the context of national security, crime prevention and law enforcement, should always be subject to review by the relevant supervisory authorities. In addition, legitimacy and the related proportionality must be paramount, which also requires a marginal efficiency test. It is essential that legislation determine how and when the results of data mining and statistical analyses (correlations) by the government may or may not be used as legal evidence for decisions in individual cases (e.g. in dealing with fraud, law enforcement ...).

Recommendation 9: Establishing a digital clearing house. It is advisable to set up a Digital Clearing House (DCH) that monitors the quality of the various digital market regulators.

Recommendation 10: Task of education. Specifically for young people, education has the task of fostering awareness, attitudes, skills and behavior, starting from their actual life spheres such as home, school and friends (e.g. youth associations). It is important to point out to young people the "pitfalls" of their own behavior, as expressed, for example, in the privacy paradox: people say they value their privacy, yet in practice give their personal data away freely.


Authors

  • Joos Vandewalle
  • Yolande Berbers
  • Mireille Hildebrandt
  • Willem Debeuckelaere
  • Paul De Hert
  • Yvo Desmedt
  • Frank De Smet
  • Karolien Poels
  • Jo Pierson
  • Bart Preneel