Masashi Sugiyama

Online Multiclass Classification Based on Prediction Margin for Partial Feedback
Feb 04, 2019
Takuo Kaneko, Issei Sato, Masashi Sugiyama

Semi-Supervised Ordinal Regression Based on Empirical Risk Minimization
Jan 31, 2019
Taira Tsuchiya, Nontawat Charoenphakdee, Issei Sato, Masashi Sugiyama

New Tricks for Estimating Gradients of Expectations
Jan 31, 2019
Christian J. Walder, Richard Nock, Cheng Soon Ong, Masashi Sugiyama

On Possibility and Impossibility of Multiclass Classification with Rejection
Jan 30, 2019
Chenri Ni, Nontawat Charoenphakdee, Junya Honda, Masashi Sugiyama

Domain Discrepancy Measure Using Complex Models in Unsupervised Domain Adaptation
Jan 30, 2019
Jongyeong Lee, Nontawat Charoenphakdee, Seiichi Kuroki, Masashi Sugiyama

Imitation Learning from Imperfect Demonstration
Jan 30, 2019
Yueh-Hua Wu, Nontawat Charoenphakdee, Han Bao, Voot Tangkaratt, Masashi Sugiyama

Revisiting Sample Selection Approach to Positive-Unlabeled Learning: Turning Unlabeled Data into Positive rather than Negative
Jan 29, 2019
Miao Xu, Bingcong Li, Gang Niu, Bo Han, Masashi Sugiyama

Normalized Flat Minima: Exploring Scale Invariant Definition of Flat Minima for Neural Networks using PAC-Bayesian Analysis
Jan 28, 2019
Yusuke Tsuzuku, Issei Sato, Masashi Sugiyama

An analytic formulation for positive-unlabeled learning via weighted integral probability metric
Jan 28, 2019
Yongchan Kwon, Wonyoung Kim, Masashi Sugiyama, Myunghee Cho Paik
