GDPR, data protection, and human-centered design
Corporations and institutions worldwide have been scrambling to bring their data storage, handling, and sharing practices into line with GDPR to avoid massive non-compliance fines. With many companies having played fast and loose with users’ data, GDPR is a long overdue push toward standardizing ethics and norms around data collection and usage within industry. This has been an ongoing conversation within the IxDA Seattle community.
Privacy violations, fake news, and troll bots on social media were the impetus for a panel IxDA held in April, “Ethical Research for Human-centered Design.” The panel focused on what UX design and research could learn from federal institutional guidelines for conducting human subjects research. The panelists were cross-domain experts who have conducted research in both academia and industry, Sheetal Agarwal, Michelle Bayles-Simon, and Liz Sanocki; data ethnographer Brittany Fiore-Gartland; and Neena Makhija, an Institutional Review Board (IRB) compliance analyst at UW.
These panelists discussed how they apply their academic training in human subjects research to the design research they do in industry. They shared some of their guiding principles with the audience: do no harm; get voluntary, informed consent; build trust and empathy; ask, don’t tell; incorporate different vantage points into human-centered design; and consider the future impact of our designs, including their unintended consequences. Michelle Bayles-Simon pointed to a great resource, IDEO’s The Little Book of Design Research Ethics, as a practical guide. There is also a pocket-sized version of the federal guidelines on human subjects research that Neena Makhija encouraged the audience to download.
The panel took place on the first day of Mark Zuckerberg’s testimony to Congress after the Cambridge Analytica scandal broke. This, of course, shaped much of the panel conversation. Panelist Sheetal Agarwal argued that GDPR is a step in the right direction toward averting such breaches of public trust in the future. Many companies are adopting GDPR as their general data protection standard since it’s too complicated to institute different standards for different populations and regions.
The enforcement of GDPR, and the standardizing effect it is having, in many ways follows the precedent of human subjects research. The U.S. government did not proactively institute the federal guidelines on the protection of human subjects, also known as the Common Rule. It was compelled to institute them only after the public was confronted with a slew of unethical research conducted on vulnerable populations.
Such research was initially heralded for saving humanity from infectious diseases. A similar optimism was invested in social media’s potential to democratize mass communication and break down socio-economic barriers. But people underestimated how both of these data collection vehicles could be exploited and the effect that would have on individuals, communities, and public trust.
The Common Rule was instituted to amend such wrongs and ensure that federal funding was not complicit in unethical research on individuals and communities. The Common Rule requires that participants give voluntary, informed consent, have access to information about the study, and have the right to opt out at any time. Researchers must also protect participants’ privacy and well-being and demonstrate that the benefits of the research outweigh any costs.
Cambridge Analytica violated the Common Rule’s tenets, most flagrantly the standards of informed consent and the privacy of Facebook’s users. Yet technically nothing illegal occurred. As a private company, Facebook is not required to uphold the Common Rule. In fact, the Common Rule does not protect much of our data. We’ve willingly given our data to private companies in exchange for services such as cloud accounts, or by sharing it through social media.
However, GDPR changes this by applying mandates akin to the Common Rule to user data. GDPR now requires that organizations and companies obtain explicit consent to access individuals’ data. Users must be informed about how their data is used, individuals’ data must be anonymized, and processes must be put in place allowing people to see what data a company holds about them and to delete it upon request.
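For teams wondering what those mandates look like once they reach the product, here is a minimal, hypothetical sketch in Python. The class and method names are illustrative assumptions, not a compliance implementation or any particular company’s API; it simply shows consent being recorded before data is processed, and access and erasure requests being honored.

    from dataclasses import dataclass
    from datetime import datetime, timezone
    from typing import Dict, List, Optional

    # Hypothetical sketch only: illustrates explicit consent, transparency
    # about purpose, and access/deletion on request. Not a compliance tool.

    @dataclass
    class ConsentRecord:
        user_id: str
        purpose: str                        # what the data will be used for
        granted_at: Optional[datetime] = None
        withdrawn_at: Optional[datetime] = None

        @property
        def is_active(self) -> bool:
            return self.granted_at is not None and self.withdrawn_at is None


    class UserDataStore:
        """Toy in-memory store showing consent checks, access, and erasure."""

        def __init__(self) -> None:
            self._data: Dict[str, dict] = {}                  # user_id -> stored data
            self._consents: Dict[str, List[ConsentRecord]] = {}

        def record_consent(self, user_id: str, purpose: str) -> None:
            record = ConsentRecord(user_id, purpose,
                                   granted_at=datetime.now(timezone.utc))
            self._consents.setdefault(user_id, []).append(record)

        def store(self, user_id: str, data: dict, purpose: str) -> None:
            # Only process data for purposes the person has actively consented to.
            if not any(c.purpose == purpose and c.is_active
                       for c in self._consents.get(user_id, [])):
                raise PermissionError(f"No active consent for purpose: {purpose}")
            self._data.setdefault(user_id, {}).update(data)

        def export(self, user_id: str) -> dict:
            # Right of access: show the person what is held about them.
            return dict(self._data.get(user_id, {}))

        def erase(self, user_id: str) -> None:
            # Right to erasure: delete the person's data and consent history.
            self._data.pop(user_id, None)
            self._consents.pop(user_id, None)

Even a toy sketch like this makes the design implication concrete: consent, transparency, access, and deletion have to be first-class parts of the product experience rather than afterthoughts bolted on for legal review.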
The IxDA community is in the business of designing to support and extend human interaction within the world. Yet design is not neutral, just as algorithms and markets are not neutral. They are extensions of the people who create them, whose agendas are driven by their socio-cultural values. As Brittany Fiore-Gartland pointed out during the panel discussion, the unintended harmful consequences of data collection pose a high enough risk that there must be guiding principles. From this vantage, GDPR is justified. Yet it is just one step in instituting a human-centered ethos around data protection.