GDPR, data protection, and human-centered design

Why am I getting all of these privacy policy emails?

Curious about all the new privacy policy alerts flooding your inbox? They’re due in part to the compliance deadline for the General Data Protection Regulation (GDPR), which went into effect on Friday, May 25, 2018. GDPR is the European Union’s new law protecting its citizens’ data.

 

Corporations and institutions worldwide have been scrambling to get their data storage, handling, and sharing practices in line with GDPR to avoid massive non-compliance fines. With many companies playing fast and loose with users’ data, GDPR is a long-overdue push toward standardizing the ethics and norms of data collection and usage within industry. This has been an ongoing conversation within the IxDA Seattle community.

 
Local research experts on data handling

 

Privacy violations, fake news, and troll bots on social media were the impetus for a panel IxDA held in April, “Ethical Research for Human-centered Design.” The panel focused on what UX design and research could learn from federal guidelines for conducting human subjects research. It was composed of cross-domain experts who have conducted research in both academia and industry, Sheetal Agarwal, Michelle Bayles-Simon, and Liz Sanocki; a data ethnographer, Brittany Fiore-Gartland; and Neena Makhija, an Institutional Review Board (IRB) compliance analyst at UW.

 
Guiding principles

These panelists discussed how they apply their academic training in human subjects research to the design research they do in industry. They shared some of their guiding principles with the audience: do no harm, get voluntary, informed consent, build trust and empathy, ask rather than tell, incorporate different vantage points into human-centered design, and consider the future impact, including the unintended consequences, of our designs. Michelle Bayles-Simon pointed to IDEO’s The Little Book of Design Research Ethics as a practical guide, and Neena Makhija encouraged the audience to download the pocket-sized version of the federal guidelines on human subjects research.

The panel took place on the first day of Mark Zuckerberg’s testimony to Congress after the Cambridge Analytica scandal broke. This, of course, shaped much of the panel conversation. Panelist Sheetal Agarwal argued that GDPR is a step in the right direction toward averting such breaches of public trust in the future. Many companies are adopting GDPR as their general data protection standard since it’s too complicated to institute different standards for different populations and regions. 

 
What is the Common Rule?

 

The enforcement of GDPR, and the standardizing effect it is having, in many ways follows the precedent of human subjects research. The U.S. government did not proactively institute the federal guidelines on the protection of human subjects, also known as the Common Rule. It was compelled to do so only after the public was confronted with a slew of unethical research conducted on vulnerable populations.

Such research was initially heralded for saving humanity from infectious diseases. A similar optimism was invested in social media’s potential to democratize mass communication and break down socio-economic barriers. But people underestimated how both of these data collection vehicles could be exploited and the effect that would have on individuals, communities, and public trust.

The Common Rule was instituted to redress such wrongs and to ensure that federal funding was not complicit in unethical research on individuals and communities. It requires that participants give voluntary, informed consent, have access to information about the study, and have the right to opt out at any time. Researchers must also protect participants’ privacy and well-being and demonstrate that the benefits of the research outweigh any costs.

Cambridge Analytica violated the Common Rule’s tenets, most flagrantly the standards of informed consent and the privacy of Facebook’s users. Yet, technically, no law was broken. As a private company, Facebook is not required to uphold the Common Rule. In fact, the Common Rule does not protect much of our data at all. We’ve willingly handed it to private companies in exchange for services such as cloud accounts, or shared it through social media.

GDPR changes this by applying mandates akin to the Common Rule to user data. It requires that organizations and companies obtain explicit consent to access individuals’ data. Users must be informed about how their data is used, individuals’ data must be anonymized, and processes must be put in place that allow people to see what data a company holds about them and to have it deleted upon request.

Surprise! Design is not neutral.

The IxDA community is in the business of designing to support and extend human interaction in the world. Yet design is not neutral, just as algorithms and markets are not neutral. They are extensions of the people who create them, whose agendas are driven by their socio-cultural values. As Brittany Fiore-Gartland pointed out during the panel discussion, the unintended harmful consequences of data collection pose a high enough risk that there must be guiding principles. From this vantage point, GDPR is justified. Yet it is just one step toward instituting a human-centered ethos around data protection.

 

 