Nina Grgić-Hlača

Blaming Humans and Machines: What Shapes People's Reactions to Algorithmic Harm

Apr 05, 2023
Gabriel Lima, Nina Grgić-Hlača, Meeyoung Cha

The Conflict Between Explainable and Accountable Decision-Making Algorithms

May 11, 2022
Gabriel Lima, Nina Grgić-Hlača, Jin Keun Jeong, Meeyoung Cha

Human Perceptions on Moral Responsibility of AI: A Case Study in AI-Assisted Bail Decision-Making

Feb 01, 2021
Gabriel Lima, Nina Grgić-Hlača, Meeyoung Cha

Figure 1 for Human Perceptions on Moral Responsibility of AI: A Case Study in AI-Assisted Bail Decision-Making
Figure 2 for Human Perceptions on Moral Responsibility of AI: A Case Study in AI-Assisted Bail Decision-Making
Figure 3 for Human Perceptions on Moral Responsibility of AI: A Case Study in AI-Assisted Bail Decision-Making
Figure 4 for Human Perceptions on Moral Responsibility of AI: A Case Study in AI-Assisted Bail Decision-Making
Viaarxiv icon

Dimensions of Diversity in Human Perceptions of Algorithmic Fairness

May 02, 2020
Nina Grgić-Hlača, Adrian Weller, Elissa M. Redmiles

Human Perceptions of Fairness in Algorithmic Decision Making: A Case Study of Criminal Risk Prediction

Feb 26, 2018
Nina Grgić-Hlača, Elissa M. Redmiles, Krishna P. Gummadi, Adrian Weller

On Fairness, Diversity and Randomness in Algorithmic Decision Making

Jun 30, 2017
Nina Grgić-Hlača, Muhammad Bilal Zafar, Krishna P. Gummadi, Adrian Weller
