William Agnew

The Algorithmic Gaze: An Audit and Ethnography of the LAION-Aesthetics Predictor Model
Jan 14, 2026

How do data owners say no? A case study of data consent mechanisms in web-scraped vision-language AI training datasets
Nov 10, 2025

Expressing stigma and inappropriate responses prevents LLMs from safely replacing mental health providers
Apr 25, 2025

The Cake that is Intelligence and Who Gets to Bake it: An AI Analogy and its Implications for Participation
Feb 06, 2025

Data Defenses Against Large Language Models
Oct 17, 2024

Sound Check: Auditing Audio Datasets
Oct 17, 2024

'Simulacrum of Stories': Examining Large Language Models as Qualitative Research Participants
Sep 28, 2024

Who's in and who's out? A case study of multimodal CLIP-filtering in DataComp
May 13, 2024

The Surveillance AI Pipeline
Sep 26, 2023

Bound by the Bounty: Collaboratively Shaping Evaluation Processes for Queer AI Harms
Jul 25, 2023