Roger Wattenhofer

SPECTRE: Spectral Conditioning Helps to Overcome the Expressivity Limits of One-shot Graph Generators

Apr 04, 2022
Karolis Martinkus, Andreas Loukas, Nathanaël Perraudin, Roger Wattenhofer

A Theoretical Comparison of Graph Neural Network Extensions

Jan 30, 2022
Pál András Papp, Roger Wattenhofer

The Price of Majority Support

Jan 28, 2022
Robin Fritsch, Roger Wattenhofer

DropGNN: Random Dropouts Increase the Expressiveness of Graph Neural Networks

Nov 11, 2021
Pál András Papp, Karolis Martinkus, Lukas Faber, Roger Wattenhofer

EEGEyeNet: A Simultaneous Electroencephalography and Eye-tracking Dataset and Benchmark for Eye Movement Prediction

Nov 10, 2021
Ard Kastrati, Martyna Beata Płomecka, Damián Pascual, Lukas Wolf, Victor Gillioz, Roger Wattenhofer, Nicolas Langer

3D-RETR: End-to-End Single and Multi-View 3D Reconstruction with Transformers

Oct 17, 2021
Zai Shi, Zhao Meng, Yiran Xing, Yunpu Ma, Roger Wattenhofer

On Isotropy Calibration of Transformers

Sep 27, 2021
Yue Ding, Karolis Martinkus, Damian Pascual, Simon Clematide, Roger Wattenhofer

A Plug-and-Play Method for Controlled Text Generation

Sep 20, 2021
Damian Pascual, Beni Egressy, Clara Meister, Ryan Cotterell, Roger Wattenhofer

BERT is Robust! A Case Against Synonym-Based Adversarial Examples in Text Classification

Sep 15, 2021
Jens Hauser, Zhao Meng, Damián Pascual, Roger Wattenhofer

Self-Supervised Contrastive Learning with Adversarial Perturbations for Robust Pretrained Language Models

Jul 15, 2021
Zhao Meng, Yihan Dong, Mrinmaya Sachan, Roger Wattenhofer
