Principled Artificial Intelligence: Mapping Consensus and Divergence in Ethical and Rights-Based Approaches

Alongside the rapid development of artificial intelligence, we've seen a proliferation of AI "principles," or guidelines for how AI should be built and used. Is there enough commonality among these efforts to suggest the emergence of sectoral norms? Where are the most significant points of divergence? Our data visualization presents thirty-two sets of principles side by side, enabling comparison among efforts from governments, companies, advocacy groups, and multi-stakeholder initiatives. The dataset itself and a white paper detailing our assumptions, methodology, and key findings will be made available later this summer. If you would like to be notified when the white paper is published, sign up here. Additionally, if you'd like to provide feedback on our draft data visualization, fill out this form.

Artificial Intelligence and Human Rights: Opportunities & Risks

This project seeks to advance the emerging conversation on AI and human rights by evaluating the human rights impacts of six current AI uses. Our framework recognizes that AI systems are not being deployed against a blank slate, but rather against the backdrop of social conditions that have complex pre-existing human rights impacts of their own. By digging deep into current AI implementations, we see how they impact the full range of human rights guaranteed by international law, privacy chief among them. We also gain insight into the unequal distribution of the positive and negative impacts of AI on human rights throughout society, and begin to explore the power of the human rights framework to address these disparate impacts.

Principled Artificial Intelligence
Authors: Jessica Fjeld, Hannah Hilligoss, Nele Achten, Maia Levy Daniel, Sally Kagay, and Joshua Feldman
Visualization Designed By: Arushi Singh

Artificial Intelligence and Human Rights: Opportunities & Risks
Authors: Filippo Raso, Hannah Hilligoss, Vivek Krishnamurthy, Christopher Bavitz, and Levin Kim
Visualization Designed By: Levin Kim

Published Sep 25, 2018