
The PRIPARE Project

A Student–Faculty Collaboration on Online Privacy

A woman enters a liquor store. She slowly peruses the aisles. The man behind the counter watches her. Eventually, she walks up to the counter, casually hands him a twenty-dollar bill and shows her ID. Within seconds, merely by glancing at it, he knows her name, where she lives, her birthday, her height and her birthplace. Unwittingly, we do the same every single day: we hand over countless pieces of personal information that are irrelevant to the transactions we undertake, whether we are buying shoes online or signing up for a newsletter.

There is a dire need for policy to regulate this, argue Professors Claudia Roda and Susan Perry, of the Computer Science and International and Comparative Politics departments respectively. Together with Professor Kung, they participate in the PRIPARE project, PReparing Industry to Privacy-by-design by supporting its Application in Research, which aims to re-establish privacy as a human right. Eleven institutions have sponsored the project, from academic partners such as AUP to large companies such as Atos. The project is funded by the European Commission, which awards grants to as few as 7% of the project proposals it receives. “There is a pressing need for legislation that protects citizens’ privacy,” argue Professors Roda and Perry, “and an obligation to inform the general public about what that entails.”

Individuals like Edward Snowden have already raised the alarm about systems of global surveillance and the dangers they pose to individuals’ private data. However, most citizens do not understand the extent of the threat, and neither do companies. Perry and Roda suggest that “companies do not take the consequences of non-compliance seriously.” They hope this will change once the European Union’s Data Protection Regulation comes into force in 2017. “Right now,” they continue, “privacy is de facto regulated by private companies as opposed to democratically elected bodies.” Google and Facebook have control over privacy standards, which is, Perry argues, “like letting restaurants set their own hygiene standards.”

How does this translate into our everyday lives? Well, those private messages you send on Facebook might not be so private. Two years ago, an article on Buzzfeed revealed that Facebook has a team that reads through private messages if they have been flagged by an automated tool. The tool sifts through content, searching for violations of Facebook’s terms of service, such as “mature content.” If a private message is flagged, employees step in and read it. The consequences of online privacy violations can be extremely serious: damaged reputations, stalking, peer pressure, bullying and blackmail.
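
Facebook has never published how that tool actually works, but the kind of two-stage pipeline described above can be pictured with a short, purely illustrative sketch in Python. The policy terms and function names below are invented for the example, not drawn from any real system:

    # Hypothetical two-stage pipeline: software scans message text against a
    # policy list, and only flagged items are queued for human review.
    POLICY_TERMS = {"mature content", "harassment"}  # invented examples

    def flag_message(text):
        """Return the policy terms a message appears to violate, if any."""
        lowered = text.lower()
        return {term for term in POLICY_TERMS if term in lowered}

    def review_queue(messages):
        """Yield only the messages the automated pass has flagged."""
        for message in messages:
            hits = flag_message(message)
            if hits:
                yield message, hits

    sample = ["see you at 8", "this one mentions mature content"]
    for message, hits in review_queue(sample):
        print(hits, "->", message)  # a human reviewer would see only these

The point is structural: even a crude filter like this means that any private message matching the wrong words ends up in front of a human reader.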

Perry elaborates, “Article 17 of the Universal Declaration of Human Rights outlines the right to property. Right now property, in the form of data, is being stolen from its owners. Users need to be able to knowingly accept the privacy settings of their data.” Perry and Roda go on to explain that there is a fundamental disconnect in privacy practices: users are required to place their trust in data controllers, even though that trust is not articulated in law. One solution would be data minimization through Privacy Enhancing Technologies (PETs), which ensure that only essential data is collected and that it is destroyed as soon as it is no longer needed. This would be the equivalent of our protagonist buying alcohol by showing only proof that she is “of age.”
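
As a purely illustrative sketch of what data minimization means in practice, the age check from the opening anecdote could be reduced to a single yes-or-no answer. The function and the legal-age threshold below are hypothetical, but they show how a verifier can learn only the one fact it needs and nothing else printed on the ID:

    from datetime import date

    LEGAL_AGE = 18  # assumed threshold for this sketch

    def is_of_age(date_of_birth, today=None):
        """Disclose a single fact: whether the person is of legal age."""
        today = today or date.today()
        had_birthday = (today.month, today.day) >= (date_of_birth.month, date_of_birth.day)
        age = today.year - date_of_birth.year - (0 if had_birthday else 1)
        return age >= LEGAL_AGE

    # The cashier's terminal would receive only this boolean, never the name,
    # address, birthday or birthplace on the ID.
    print(is_of_age(date(1990, 5, 17)))  # True

Real PETs go further, with cryptographic proofs and strict retention rules, but the design principle is the same: collect the minimum, and keep it no longer than necessary.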

Roda explains, “Privacy needs to be more than just an afterthought; it needs to be implemented directly in the design. Unfortunately, most engineers are not well versed in ethics.” Perry adds, “Most people in the IT industry don’t understand that the consequences of privacy violations may have a negative impact on the environment, on the fight against cyber-crime and on their branding. The GAFA (Google, Apple, Facebook, Amazon) don’t take human rights into account in the design of their platforms and services.”

The PRIPARE project has been working closely with Atos, a French multinational IT services corporation, to ensure that security and privacy go hand in hand in its designs. “Data functions much like money: big data is the currency of the twenty-first century; we just need better banks,” concludes Perry.

All in all, data has become a valuable resource that enables companies to market to users, but it must also be used sparingly and in compliance with users’ privacy.

Professors Perry and Roda have worked on various projects together, including an upcoming book and a Topics course, both on Human Rights and Digital Technology. The European Union selected the latter as one of its recommended ENISA curricula for 2014. They hope to turn AUP into a privacy-by-design campus and set an example of how universities can better combine privacy and technology.