Omar Shaikh

Social Skill Training with Large Language Models

Apr 05, 2024
Diyi Yang, Caleb Ziems, William Held, Omar Shaikh, Michael S. Bernstein, John Mitchell

Grounding or Guesswork? Large Language Models are Presumptive Grounders

Nov 15, 2023
Omar Shaikh, Kristina Gligorić, Ashna Khetan, Matthias Gerstgrasser, Diyi Yang, Dan Jurafsky

Rehearsal: Simulating Conflict to Teach Conflict Resolution

Sep 21, 2023
Omar Shaikh, Valentino Chai, Michele J. Gelfand, Diyi Yang, Michael S. Bernstein

Modeling Cross-Cultural Pragmatic Inference with Codenames Duet

Jun 04, 2023
Omar Shaikh, Caleb Ziems, William Held, Aryan J. Pariani, Fred Morstatter, Diyi Yang

Can Large Language Models Transform Computational Social Science?

Apr 12, 2023
Caleb Ziems, William Held, Omar Shaikh, Jiaao Chen, Zhehao Zhang, Diyi Yang

On Second Thought, Let's Not Think Step by Step! Bias and Toxicity in Zero-Shot Reasoning

Dec 15, 2022
Omar Shaikh, Hongxin Zhang, William Held, Michael Bernstein, Diyi Yang

ConceptEvo: Interpreting Concept Evolution in Deep Learning Training

Mar 30, 2022
Haekyu Park, Seongmin Lee, Benjamin Hoover, Austin Wright, Omar Shaikh, Rahul Duggal, Nilaksh Das, Judy Hoffman, Duen Horng Chau

NeuroCartography: Scalable Automatic Visual Summarization of Concepts in Deep Neural Networks

Aug 29, 2021
Haekyu Park, Nilaksh Das, Rahul Duggal, Austin P. Wright, Omar Shaikh, Fred Hohman, Duen Horng Chau

EnergyVis: Interactively Tracking and Exploring Energy Consumption for ML Models

Mar 30, 2021
Omar Shaikh, Jon Saad-Falcon, Austin P Wright, Nilaksh Das, Scott Freitas, Omar Isaac Asensio, Duen Horng Chau

RECAST: Enabling User Recourse and Interpretability of Toxicity Detection Models with Interactive Visualization

Feb 10, 2021
Austin P Wright, Omar Shaikh, Haekyu Park, Will Epperson, Muhammed Ahmed, Stephane Pinel, Duen Horng Chau, Diyi Yang
