IDP2013 (X): Privacy (II)

Notes from the 9th Internet, Law and Politics Congress: Big Data: Challenges and Opportunities, organized by the Open University of Catalonia, School of Law and Political Science, and held in Barcelona, Spain, on 25-26 June 2013. More notes on this event: idp2013.

Moderator: Mònica Vilasau. Lecturer, School of Law and Political Science (UOC).

The use of Big Data to generate behaviours
Ramon Miralles, Coordinator of Auditing and Information Security, Catalan Data Protection Authority

Service providers are often accused of giving unclear information, of not specifying how they will use the data they collect, etc. On top of this lack of clarity, data is increasingly becoming a source of wealth, which in turn shifts relationships of power and produces new behaviours.

Can a detailed analysis of big data induce changes in behaviour? For instance, the Obama campaign team found that women aged 35-50 usually had many photos of George Clooney on Facebook. After that finding, there was a noticeable increase in the number of Barack Obama's public appearances alongside George Clooney and in the number of photos they shared, which were of course distributed on social networking sites.

But are there behaviours broadly agreed to be harmful (xenophobia, racism) that could or should be fought with big data? Is there still room for free will? Should we change our regulatory framework to adapt it to these new realities and policies? And would doing so be fair or legitimate?

Or perhaps the terms of use could include new clauses (premium clauses?) in which the service provider informs the user of how their personal data will be used.

Automated Journalism: Artificial Intelligence Transforms Data into Stories — When data protection principles and privacy protect the right to express opinions freely and to receive accurate information
Cédric Goblet. Lawyer at the Brussels Bar

Can a robot replace a journalist? Narrative Science's Quill can write human-readable articles or news pieces from a collection of specific data. But a robot implies the loss of all editorial autonomy, no verification of sources, no critical and independent analysis of the information, and the mistaken belief that a machine will be neutral and objective. It is very likely that machine-written news will tend towards infotainment and foster an echo chamber effect.

Big Data: A Challenge for Data Protection
Philipp E. Fischer, Ph.D. candidate (IN3 Research Institute, UOC Barcelona), LL.M. in intellectual property law (Queen Mary University of London / TU Dresden); Ricardo Morte Ferrer, Lawyer (Abogado), Master of Laws (UOC). Tutor for law studies (Grado en Derecho) at the UOC. Legal adviser for the TClouds Project at the ULD, Kiel

One of the main challenges in data protection is the strong asymmetry of power between service providers and end users: there may be no alternative to the service, the terms of service may not contain all the relevant information, or the information they do contain may be incomplete, etc.

The right to data protection in the public administration
Rosa Cernada Badía, Researcher, University of Valencia

In the administration of justice, official communications usually publish citizens' data. With one law promoting the reuse of public sector information and another protecting personal data, a conflict obviously arises.

But a technical or legal solution is not enough; a political commitment is also needed, one that settles interoperability, responsibilities, the allocation of resources to manage information and data, etc.



7th Internet, Law and Politics Congress (X). Right to be forgotten, data protection and privacy

Notes from the 7th Internet, Law and Politics Congress: Net Neutrality and other challenges for the future of the Internet, organized by the Open University of Catalonia, School of Law and Political Science, and held in Barcelona, Spain, on 11-12 July 2011. More notes on this event: idp2011.

Track on the Right to be forgotten, data protection and privacy
Chair: Mònica Vilasau Solana, Lecturer, School of Law and Political Science (UOC)

Pere Simon Castellano
The constitutional regime of the right to oblivion in the Internet

It is the principle of consent that legitimizes claims to a right to privacy or data protection.

Especially in relation to search engines (though not only there), the legality of a given piece of content is another important factor when claiming our privacy rights or the right to be forgotten.

Jelena Burnik
Behavioural advertising in electronic communications. A benefit to electronic communication development and an intrusion of individual’s right to privacy and data protection

Behavioural advertising tracks Internet users' activities online and delivers only relevant advertisements, based on the data collected and analysed over a given period of time. It is normally enabled by cookies placed by websites or by advertisements embedded in websites.

Behavioural advertising is defended in the name of more relevant advertisements, an enhanced user experience, precise segmentation with less money spent on non-relevant audiences, support for free Internet content, and as a driver of innovation.

But it is a controversial practice that requires a fair balance between the interests of the industry and the rights of individuals. As cookies assign a unique ID that can be combined with an IP address, data protection concerns arise. Moreover, cookies are normally placed on the computer by default, so a debate on opt-in vs. opt-out for cookie placement and cookie-based tracking should be considered.

A new European "cookie" directive should aim at shifting from an opt-out principle to an opt-in one, with cookies placed only after the user's explicit consent. But what would the technological solution for an opt-in cookie principle look like?
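As an illustration only (not something presented at the congress), the following sketch shows one way a website could honour an opt-in principle: a tracking cookie carrying a unique ID is placed only after the user has explicitly granted consent, for instance via a consent banner. All names and details are assumptions made for the example.

```typescript
// Illustrative sketch of opt-in cookie placement (assumed names, not a real API).

const CONSENT_COOKIE = "tracking_consent";

// Read a cookie value from document.cookie, or return null if it is absent.
function readCookie(name: string): string | null {
  const entry = document.cookie
    .split("; ")
    .find((c) => c.startsWith(name + "="));
  return entry ? decodeURIComponent(entry.split("=")[1]) : null;
}

// Place the tracking cookie only if the user has explicitly opted in.
function placeTrackingCookieIfConsented(): void {
  if (readCookie(CONSENT_COOKIE) !== "granted") {
    return; // opt-in principle: no recorded consent, no tracking cookie
  }
  const uniqueId = crypto.randomUUID(); // the unique ID mentioned above
  document.cookie =
    `ad_tracking_id=${uniqueId}; path=/; max-age=${60 * 60 * 24 * 30}; SameSite=Lax`;
}

// Called from the "Accept" button of a consent banner.
function grantConsent(): void {
  document.cookie =
    `${CONSENT_COOKIE}=granted; path=/; max-age=${60 * 60 * 24 * 365}; SameSite=Lax`;
  placeTrackingCookieIfConsented();
}

// On page load, tracking only resumes if consent was previously granted.
placeTrackingCookieIfConsented();
```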

In the US, though, the more widely accepted approach seems to be an enhanced opt-out model.

But only true opt-in provides transparency, and industry self-regulation will not suffice.

María Concepción Torres Diaz
Privacy and tracking cookies. A constitutional approach.

It is worth noting the difference between privacy, intimacy and personal data. Cookies can harm privacy, so users should get all the information they need on cookies and tracking to decide whether a specific behaviour puts their privacy at stake. If the user decides to go on, explicit consent should be given to the service before it performs its tracking activity.

We have to acknowledge that new technologies will bring with them new rights and new threats to old rights. Thus, we should be aware of the new technologies so that the law does not fall behind.

Philipp E. Fischer; Rafael Ferraz Vazquez
Data transfer from Germany or Spain to third countries – Questions of civil liability for privacy rights infringement

Data is continuously transferred at the international level. If that data gets "lost", the operator may have incurred liability for privacy rights infringement.

Under the European directive on data transfers, data can flow within the European Union (nationally or across borders) or to third countries with an adequate level of data protection. There are still some issues with the US, and other countries are simply excluded from data transfers with member states.

Faye Fangfei Wang
Legal Feasibility for Statistical Methods on Internet as a Source of Data Gathering in the EU

Privacy protection steps: suitable safeguards, a duty to inform prior to obtaining consent (transparency), consent, and enforcement. The request for consent should be seen as a very important step towards privacy protection. Consent must be freely given and informed.

There is an exemption clause in UK legislation for cases where gathering certain data is strictly necessary for a service to run, for scientific purposes, etc. But the exemption clause must be used lawfully.

Ricardo Morte Ferrer
The ADAMS database of the World Anti-Doping Agency. Data protection problems

The ADAMS database stores whereabouts information: athletes must report where they will be over a three-month period, within a daily time span from 6:00 to 23:00, including a detailed one-hour slot each day. Instead of presuming innocence, this database in a way presumes guilt.

That is a lot of information and, since the data holder is an international agency based in Canada, it poses a threat to data protection, as it implies a continuous international flow of personal data.

Inmaculada López-Barajas Perea
Privacy on the Internet and criminal investigation: challenges for justice in a globalized society

Is the possibility of law enforcement institutions remotely retrieving citizens' personal information just the digital version of the usual (and completely legal) surveillance methods, or is it something new that threatens citizens' privacy?




5th Internet, Law and Politics Conference (III). Data protection and Social Networking Sites

Notes from the 5th Internet, Law and Politics Conference: The Pros and Cons of Social Networking Sites, organized by the Open University of Catalonia, School of Law and Political Science, and held in Barcelona, Spain, on July 6th and 7th, 2009. More notes on this event: idp2009.

Data protection and Social Networking Sites
Chaired by Mònica Vilasau

Spain has circa 8,000,000 SNS users, who usually keep the default settings, which provide the lowest levels of data protection. It is difficult to find out who owns the data and who is liable for data protection. And it is common to find third parties' data being used not only without their consent but also without their knowledge.

Esther Mitjans, Director, Catalan Data Protection Agency


There’s an urgent need to raise awareness about the privacy risks of using social networking sites and being on the Net.

Parents letting their kids freely browse SNS is like letting them go outside and play on the street unsupervised and unaware of some basic issues.

On the other hand, we also have to build confidence in the digital environment, and the law should have a role in trying to bring back (or build) confidence in the system.

There are shared risks where one's actions have an impact on third parties. What happens when data usage goes beyond the household or domestic arena? It is known that SNS users increasingly use data for commercial purposes or, at the very least, not for strictly personal reasons.

But who is liable for data protection infringement on SNSs? If there has been a data mining process for commercial uses, liability is easy to trace back. But if the origin is misuse by a particular individual, liability is not that easy to establish.

SNS management is an approach to risk management. We should minimize risks for those acting legally, while prosecuting those who act illegally.

Pablo Pérez San-José, Manager of the Observatorio de la Seguridad de la Información [Information Security Observatory], Instituto Nacional de Tecnologías de la Comunicación (INTECO)


The Observatorio de la Seguridad de la Información [Information Security Observatory] aims to monitor and promote policies for the security of data and information.

Huge success of SNSs among kids and youngsters: 43% of kids use Tuenti, 80% use YouTube. They are attractive because of the online-offline combination.

Three key stages where security hazards arise in SNSs:

  • Creation of profile: the terms of service are not clear; they should be written in plain language. Quite often, users are asked to fill in lots of data that are legally very sensitive. Even if these data are not compulsory, they appear on the sign-up form and people normally fill them in. User age verification should be more effective (in Spain, parental consent is needed to share data of anyone under 14). Default privacy settings are very low, allowing maximum visibility (a minimal sketch of stricter defaults follows the recommendations below).
  • Participation in the SNS: excessive personal information made public on your profile; unauthorized indexing by search engines; installation and generic use of cookies without the user's knowledge; reception of hyper-contextualized advertising; giving away intellectual property rights; malware, phishing, pharming, etc. that use the information available on SNSs to further customize attacks on other users; spam based on fake profiles; content that is sensitive or inappropriate for minors; cyberbullying and grooming.
  • Signing off: the impossibility of completely and definitively erasing your profile; information that remains on third parties' profiles.

Recommendations:

  • Be proactive in complying with the law
  • Better age verification
  • Appropriate content (depending on environment and target) and well tagged
  • Awareness raising
  • Fostering secure environments, good practices, and a harmonized international law, while enabling the enforcement of law
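To make the points about low default settings and age verification more concrete, here is a minimal, purely illustrative sketch (not from the talk) of what privacy-by-default profile settings and a parental-consent check could look like; every type, field and value is an assumption made for the example.

```typescript
// Illustrative privacy-by-default settings for a new SNS profile (assumed names).

type Visibility = "only_me" | "friends" | "everyone";

interface ProfilePrivacySettings {
  profileVisibility: Visibility;
  searchEngineIndexing: boolean; // allow external search engines to index the profile
  photoTaggingRequiresApproval: boolean;
  shareDataWithAdvertisers: boolean;
}

// Most restrictive values applied by default; the user can relax them later.
const DEFAULT_PRIVACY: ProfilePrivacySettings = {
  profileVisibility: "friends",
  searchEngineIndexing: false,
  photoTaggingRequiresApproval: true,
  shareDataWithAdvertisers: false,
};

// In Spain, users under 14 need parental consent before their data can be shared.
function canRegister(age: number, hasParentalConsent: boolean): boolean {
  return age >= 14 || hasParentalConsent;
}

function createProfile(name: string, age: number, hasParentalConsent = false) {
  if (!canRegister(age, hasParentalConsent)) {
    throw new Error("Parental consent required for users under 14");
  }
  // Only strictly necessary data is collected at sign-up; everything else stays optional.
  return { name, age, settings: { ...DEFAULT_PRIVACY } };
}
```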

Facebook and the risks of de-contextualization of personal information
Franck Dumortier, Researcher, Centre de recherches informatique et droit (FUNDP-CRID)


Facebook's model is based on the presentation of users' profiles, the visualization of their network of relations to others and, most importantly, the use of real-world identification signs, including real names and real places.

When is there de-contextualization?

  • Behaviours and information used in another context from that for which they were intended
  • Violation of contextual norms of appropriateness or distribution

While in the real world anyone more distant than a friend of a friend is a stranger, on Facebook anyone you don't actively hate is a friend. This enables wider dissemination of sensitive and decontextualized content, driven by the presence of a visible network, tagging, pressure to join the network, etc.

Privacy is a human right, and the individual is normally treated as a data subject. But the individual is also a contextual human, so privacy should also be seen as a right to contextual integrity and as a right to self-emancipation from one's own context.

Facebook as one of Foucault's heterotopias: a place that contains all other places, along with their relationships.

In this sense, dealing with the "data subject" (identifying someone by reference to one or more factors specific to one aspect of their identity) is a partial approach, and the right to data protection is a right granted to "dividuals" (divided individuals, parts of individuals).

A prime effect of Facebook, as a heterotopic environment, is to artificially recompose individuals.

De-contextualization threatens data protection rights.

Proposals:

  • Define higher data-privacy compliant default settings
  • Raise awareness
  • Increasing liability of SNS operators is useless

Data protection in Google
Bárbara Navarro, European director of Institutional Relations of Google in Spain and Portugal.


Businesses are increasingly aware that data protection and privacy are important issues that need to be addressed. There is a general demand for privacy on demand: I want to upload everything and then be able to manage my own privacy, with the SNS provider respecting and protecting it.

Some questions on Google and privacy: excessive data retention; photos in Google Street View and Google Earth; contextual advertising (is it good or bad?); cloud computing and jurisdiction; health records; etc.

In most cases, the user can opt out (temporarily or permanently) of specific aspects: ask for the deletion of a photo, stop receiving contextualized advertising, etc. Google's commitment is that the user owns their own information and controls what Google does with it.

Three axes of action:

  • Education
  • Collaboration
  • Regulation

Q&A

Q: Should the government rank SNSs by data privacy compliance and publish the results? A: The government should enforce the law but, as happens with any kind of crime, most information cannot be made public.

James Grimmelmann: If we prohibit sites like Facebook, is there a threat in behaving as more integrated individuals? If it is our will not to be "dividuals", is there a threat against us if we ban heterotopias like Facebook? Franck Dumortier: Constituting a single unified space is wrong because contexts might not fit together, because different dimensions of the self might not overlap.

Q: How come there is so much content from TV channels on YouTube? Bárbara Navarro: Normally it is individuals who upload videos to YouTube, and it is the TV channels that have to ask for this content to be taken down. On the other hand, Google has created a scanning system that can identify protected content and prevent it from being uploaded. It is also true, nevertheless, that most broadcasters have their own YouTube channel and normally allow protected content to be uploaded by individuals, as it provides publicity.

Q: Imagine a user who joins an SNS focused on a specific disease or illness, then recovers and wants to quit the network and erase all their data. How? Esther Mitjans: We should not forget that the user made a cost-benefit analysis before joining and decided it was worth it. Nevertheless, they should follow the SNS's procedures to delete all their traces.

