Andrew Y. Ng

Learning Neighborhood Representation from Multi-Modal Multi-Graph: Image, Text, Mobility Graph and Beyond

May 06, 2021
Tianyuan Huang, Zhecheng Wang, Hao Sheng, Andrew Y. Ng, Ram Rajagopal

Effect of Radiology Report Labeler Quality on Deep Learning Models for Chest X-Ray Interpretation

Apr 01, 2021
Saahil Jain, Akshay Smit, Andrew Y. Ng, Pranav Rajpurkar

MedSelect: Selective Labeling for Medical Image Classification Combining Meta-Learning with Deep Reinforcement Learning

Mar 26, 2021
Akshay Smit, Damir Vrabac, Yujie He, Andrew Y. Ng, Andrew L. Beam, Pranav Rajpurkar

CheXbreak: Misclassification Identification for Deep Learning Models Interpreting Chest X-rays

Mar 24, 2021
Emma Chen, Andy Kim, Rayan Krishnan, Jin Long, Andrew Y. Ng, Pranav Rajpurkar

VisualCheXbert: Addressing the Discrepancy Between Radiology Report Labels and Image Labels

Mar 15, 2021
Saahil Jain, Akshay Smit, Steven QH Truong, Chanh DT Nguyen, Minh-Thanh Huynh, Mudit Jain, Victoria A. Young, Andrew Y. Ng, Matthew P. Lungren, Pranav Rajpurkar

CheXseen: Unseen Disease Detection for Deep Learning Interpretation of Chest X-rays

Mar 08, 2021
Siyu Shi, Ishaan Malhi, Kevin Tran, Andrew Y. Ng, Pranav Rajpurkar

MedAug: Contrastive learning leveraging patient metadata improves representations for chest X-ray interpretation

Feb 21, 2021
Yen Nhi Truong Vu, Richard Wang, Niranjan Balachandar, Can Liu, Andrew Y. Ng, Pranav Rajpurkar

CheXternal: Generalization of Deep Learning Models for Chest X-ray Interpretation to Photos of Chest X-rays and External Clinical Settings

Feb 21, 2021
Pranav Rajpurkar, Anirudh Joshi, Anuj Pareek, Andrew Y. Ng, Matthew P. Lungren
