Glorianna Jagfeld

Understanding who uses Reddit: Profiling individuals with a self-reported bipolar disorder diagnosis

Apr 23, 2021
Glorianna Jagfeld, Fiona Lobban, Paul Rayson, Steven H. Jones

A computational linguistic study of personal recovery in bipolar disorder

Jun 03, 2019
Glorianna Jagfeld

Sequence-to-Sequence Models for Data-to-Text Natural Language Generation: Word- vs. Character-based Processing and Output Diversity

Oct 11, 2018
Glorianna Jagfeld, Sabrina Jenne, Ngoc Thang Vu

Comparing Attention-based Convolutional and Recurrent Neural Networks: Success and Limitations in Machine Reading Comprehension

Aug 27, 2018
Matthias Blohm, Glorianna Jagfeld, Ekta Sood, Xiang Yu, Ngoc Thang Vu

Encoding Word Confusion Networks with Recurrent Neural Networks for Dialog State Tracking

Aug 09, 2017
Glorianna Jagfeld, Ngoc Thang Vu
