"Sentiment Analysis": models, code, and papers

BERT Fine-Tuning for Sentiment Analysis on Indonesian Mobile Apps Reviews

Jul 14, 2021
Kuncahyo Setyo Nugroho, Anantha Yullian Sukmadewa, Haftittah Wuswilahaken DW, Fitra Abdurrachman Bachtiar, Novanto Yudistira

User reviews play an essential role in the success of mobile apps. Reviews in textual form are unstructured data, which makes them complex to process for sentiment analysis. Previous approaches often ignore the context of a review, and relatively small datasets make models prone to overfitting. BERT has been introduced as a transfer learning approach whose pre-trained models provide better context representations. This study examines the effectiveness of fine-tuning BERT for sentiment analysis using two different pre-trained models: a multilingual model and a model pre-trained only on Indonesian text. The dataset consists of Indonesian user reviews of the ten best apps of 2020 on Google Play. We perform hyper-parameter tuning to find the optimal model and test two training data labeling approaches, score-based and lexicon-based, to determine the model's effectiveness. The experimental results show that the Indonesian pre-trained model achieves better average accuracy on lexicon-based data, with a highest accuracy of 84% at 25 epochs and a training time of 24 minutes. These results outperform all of the machine learning baselines and the multilingual pre-trained model.
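As a concrete illustration of the fine-tuning setup described above, the minimal sketch below uses the HuggingFace transformers API with an Indonesian checkpoint. The checkpoint name, label scheme, and hyper-parameters are assumptions for illustration, not the paper's exact configuration.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "indobenchmark/indobert-base-p1"  # assumed Indonesian checkpoint
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

reviews = ["Aplikasinya bagus sekali", "Sering error setelah update"]  # toy examples
labels = torch.tensor([1, 0])  # 1 = positive, 0 = negative

batch = tokenizer(reviews, padding=True, truncation=True, max_length=128,
                  return_tensors="pt")
model.train()
for _ in range(3):  # toy loop; the paper reports training up to 25 epochs
    loss = model(**batch, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```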

  
Access Paper or Ask Questions

Multilogue-Net: A Context Aware RNN for Multi-modal Emotion Detection and Sentiment Analysis in Conversation

Feb 20, 2020
Aman Shenoy, Ashish Sardana

Sentiment analysis and emotion detection in conversation are key to a number of real-world applications, with different applications leveraging different kinds of data to achieve reasonably accurate predictions. Multimodal emotion detection and sentiment analysis can be particularly useful, since applications can use whatever subset of the available modalities their data provides to produce relevant predictions. Current multimodal systems fail to capture the context of the conversation across all modalities, the current speaker and listener(s), and the relevance of and relationships between the available modalities through an adequate fusion mechanism. In this paper, we propose a recurrent neural network architecture that addresses these drawbacks by keeping track of the conversational context, the interlocutor states, and the emotions conveyed by the speakers. Our proposed model outperforms the state of the art on two benchmark datasets across a variety of accuracy and regression metrics.
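The sketch below illustrates the general idea of tracking a global conversational context alongside per-interlocutor states with recurrent cells. It is not Multilogue-Net's actual architecture or fusion mechanism; the feature dimensions, class count, and update order are placeholders.

```python
import torch
import torch.nn as nn

class ConversationTracker(nn.Module):
    """Toy context-aware recurrence: one GRU state for the global conversation
    context and one per interlocutor, updated as each utterance arrives."""

    def __init__(self, utt_dim=128, state_dim=128, n_classes=3):
        super().__init__()
        self.state_dim = state_dim
        self.context_cell = nn.GRUCell(utt_dim, state_dim)            # conversation context
        self.party_cell = nn.GRUCell(utt_dim + state_dim, state_dim)  # per-speaker state
        self.classifier = nn.Linear(state_dim, n_classes)

    def forward(self, utterances, speakers, n_speakers):
        # utterances: (T, utt_dim) fused multimodal features; speakers: speaker id per step
        context = torch.zeros(1, self.state_dim)
        party = [torch.zeros(1, self.state_dim) for _ in range(n_speakers)]
        logits = []
        for t, s in enumerate(speakers):
            u = utterances[t:t + 1]
            context = self.context_cell(u, context)
            party[s] = self.party_cell(torch.cat([u, context], dim=-1), party[s])
            logits.append(self.classifier(party[s]))
        return torch.cat(logits)  # one sentiment/emotion prediction per utterance

model = ConversationTracker()
out = model(torch.randn(5, 128), speakers=[0, 1, 0, 1, 0], n_speakers=2)
```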

* 10 pages, 4 figures, 6 tables 
  
Access Paper or Ask Questions

Re-presenting a Story by Emotional Factors using Sentimental Analysis Method

Jul 13, 2016
Hwiyeol Jo, Yohan Moon, Jong In Kim, Jeong Ryu

Remembering an event is affected by personal emotional status. We examined the psychological status and personal factors of undergraduate students (N=64): depression (Center for Epidemiological Studies - Depression, Radloff, 1977), present affect (Positive and Negative Affect Schedule, Watson et al., 1988), life orientation (Life Orientation Test, Scheier & Carver, 1985), self-awareness (Core Self-Evaluation Scale, Judge et al., 2003), and social factors (Social Support, Sarason et al., 1983), and collected their summaries of a story, Chronicle of a Death Foretold (Gabriel Garcia Marquez, 1981). We implemented a sentiment analysis model based on a convolutional neural network (LeCun & Bengio, 1995) to evaluate each summary. In the vein of transfer learning (Pan & Yang, 2010), we collected 38,265 movie reviews to train the model and then used it to score each student's summary. The results on CES-D and PANAS show the relationship between emotion and memory retrieval as follows: depressed people tended to represent the story more negatively and were less expressive, while people high in PANAS retrieved their memory more expressively than others, using more negative words. The contributions of this study are twofold: first, shedding light on the relationship between emotion and its effect when storing or retrieving a memory; second, suggesting an objective method to evaluate the intensity of emotion expressed in natural language, using a sentiment analysis model.
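A minimal convolutional sentiment scorer in the spirit of the setup above (train on movie-review sentiment, then reuse the model to score the summaries). The vocabulary size, filter widths, and output range are illustrative, not the paper's configuration.

```python
import torch
import torch.nn as nn

class TextCNN(nn.Module):
    def __init__(self, vocab_size=20000, emb_dim=100, n_filters=64, widths=(3, 4, 5)):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.convs = nn.ModuleList(
            [nn.Conv1d(emb_dim, n_filters, w) for w in widths])
        self.out = nn.Linear(n_filters * len(widths), 1)  # scalar sentiment score

    def forward(self, token_ids):                 # (batch, seq_len) of word ids
        x = self.emb(token_ids).transpose(1, 2)   # (batch, emb_dim, seq_len)
        pooled = [conv(x).relu().max(dim=2).values for conv in self.convs]
        return torch.sigmoid(self.out(torch.cat(pooled, dim=1)))  # score in [0, 1]

scores = TextCNN()(torch.randint(0, 20000, (2, 30)))  # two tokenized summaries
```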

* Paper version of CogSci2016; We should correct poor English 
  
Access Paper or Ask Questions

Out of Context: A New Clue for Context Modeling of Aspect-based Sentiment Analysis

Jun 21, 2021
Bowen Xing, Ivor W. Tsang

Aspect-based sentiment analysis (ABSA) aims to predict the sentiment expressed in a review with respect to a given aspect. The core of ABSA is to model the interaction between the context and the given aspect in order to extract the aspect-related information. In prior work, attention mechanisms and dependency graph networks are commonly adopted to capture the relations between the context and the given aspect, and the weighted sum of the context hidden states is used as the final representation fed to the classifier. However, in the context modeling process of existing models, information related to the given aspect may already be discarded and adverse information retained. This problem cannot be solved by subsequent modules, for two reasons: first, their operations are conducted on the encoder-generated context hidden states, whose values cannot change after the encoder; second, existing encoders only consider the context, not the given aspect. To address this problem, we argue that the given aspect should be treated as a new clue, out of context, during context modeling. As solutions, we design several aspect-aware context encoders based on different backbones: an aspect-aware LSTM and three aspect-aware BERTs. They are dedicated to generating aspect-aware hidden states tailored to the ABSA task. In these aspect-aware context encoders, the semantics of the given aspect regulate the information flow, so that aspect-related information is retained and aspect-irrelevant information is excluded from the generated hidden states. We conduct extensive experiments on several benchmark datasets with empirical analysis, demonstrating the efficacy and advantages of our proposed aspect-aware context encoders.
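The sketch below illustrates the core idea of letting the aspect regulate the context encoder's information flow: an aspect-conditioned gate applied to each context hidden state. The paper's aspect-aware LSTM and BERT variants differ in detail; this is only an illustration of the gating principle, with placeholder dimensions.

```python
import torch
import torch.nn as nn

class AspectGatedEncoder(nn.Module):
    def __init__(self, emb_dim=100, hidden=128):
        super().__init__()
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True)
        self.gate = nn.Linear(hidden + emb_dim, hidden)

    def forward(self, context_emb, aspect_emb):
        # context_emb: (batch, seq_len, emb_dim); aspect_emb: (batch, emb_dim)
        h, _ = self.lstm(context_emb)                      # (batch, seq_len, hidden)
        a = aspect_emb.unsqueeze(1).expand(-1, h.size(1), -1)
        g = torch.sigmoid(self.gate(torch.cat([h, a], dim=-1)))
        return g * h   # keep aspect-relevant information, suppress the rest

states = AspectGatedEncoder()(torch.randn(2, 20, 100), torch.randn(2, 100))
```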

* Submitted to JAIR 
  
Access Paper or Ask Questions

The Multimodal Sentiment Analysis in Car Reviews (MuSe-CaR) Dataset: Collection, Insights and Improvements

Jan 15, 2021
Lukas Stappen, Alice Baird, Lea Schumann, Björn Schuller

Truly real-life data presents a strong but exciting challenge for sentiment and emotion research. The high variety of possible 'in-the-wild' properties makes large datasets such as these indispensable for building robust machine learning models. However, a sufficient quantity of data covering a deep enough variety of challenges in each modality to force exploratory analysis of the interplay of all modalities has not yet been made available in this context. In this contribution, we present MuSe-CaR, a first-of-its-kind multimodal dataset. The data is publicly available, having recently served as the testing bed for the 1st Multimodal Sentiment Analysis Challenge, which focused on the tasks of emotion, emotion-target engagement, and trustworthiness recognition by comprehensively integrating the audio-visual and language modalities. Furthermore, we give a thorough overview of the dataset in terms of collection and annotation, including annotation tiers not used in this year's MuSe 2020. In addition, for one of the sub-challenges, predicting the level of trustworthiness, no participant outperformed the baseline model, so we propose a simple but highly efficient Multi-Head-Attention network that, using multimodal fusion, exceeds the baseline by around 0.2 CCC (an almost 50% improvement).
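The challenge metric referenced above is the Concordance Correlation Coefficient (CCC); a small numpy helper for it is sketched below. Note that a 0.2 absolute CCC gain would correspond to roughly a 50% relative improvement over a baseline near 0.4, consistent with the figure cited.

```python
import numpy as np

def ccc(y_true, y_pred):
    """Concordance Correlation Coefficient between two 1-D sequences."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    mt, mp = y_true.mean(), y_pred.mean()
    vt, vp = y_true.var(), y_pred.var()
    cov = ((y_true - mt) * (y_pred - mp)).mean()
    return 2 * cov / (vt + vp + (mt - mp) ** 2)

print(ccc([0.1, 0.4, 0.5, 0.9], [0.2, 0.35, 0.6, 0.8]))
```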

  
Access Paper or Ask Questions

Would You Like Sashimi Even If It's Sliced Too Thin? Selective Neural Attention for Aspect Targeted Sentiment Analysis (SNAT)

Apr 27, 2020
Zhe Zhang, Chung-Wei Hang, Munindar P. Singh

Sentiments in opinionated text are often determined by both aspects and target words (or targets). We observe that targets and aspects interrelate in subtle ways, often yielding conflicting sentiments. Thus, a naive aggregation of sentiments from aspects and targets treated separately, as in existing sentiment analysis models, impairs performance. We propose SNAT, an approach that jointly considers aspects and targets when inferring sentiments. To capture and quantify relationships between targets and context words, SNAT uses a selective self-attention mechanism that handles implicit or missing targets. Specifically, SNAT involves two layers of attention mechanisms: selective attention between targets and context words, and attention over words based on aspects. On benchmark datasets, SNAT outperforms leading models by a large margin, yielding (absolute) gains in accuracy of 1.8% to 5.2%.
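A rough sketch of the two attention stages described above: attention between the target and context words, followed by aspect-conditioned attention over words. SNAT's selective mechanism for implicit or missing targets is not reproduced here; names and dimensions are placeholders.

```python
import torch
import torch.nn as nn

class TwoStageAttention(nn.Module):
    def __init__(self, dim=128, n_classes=3):
        super().__init__()
        self.target_attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        self.aspect_score = nn.Bilinear(dim, dim, 1)
        self.out = nn.Linear(dim, n_classes)

    def forward(self, context, target, aspect):
        # context: (B, T, dim); target: (B, Tt, dim); aspect: (B, dim)
        ctx, _ = self.target_attn(context, target, target)     # target-aware context
        scores = self.aspect_score(ctx, aspect.unsqueeze(1).expand_as(ctx))
        weights = torch.softmax(scores, dim=1)                  # attention over words
        pooled = (weights * ctx).sum(dim=1)
        return self.out(pooled)                                  # sentiment logits

logits = TwoStageAttention()(torch.randn(2, 20, 128), torch.randn(2, 3, 128),
                             torch.randn(2, 128))
```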

  
Access Paper or Ask Questions

Learning to Attend via Word-Aspect Associative Fusion for Aspect-based Sentiment Analysis

Dec 14, 2017
Yi Tay, Anh Tuan Luu, Siu Cheung Hui

Aspect-based sentiment analysis (ABSA) tries to predict the polarity of a given document with respect to a given aspect entity. While neural network architectures have been successful in predicting the overall polarity of sentences, aspect-specific sentiment analysis remains an open problem. In this paper, we propose a novel method for integrating aspect information into the neural model by modeling word-aspect relationships. Our novel model, Aspect Fusion LSTM (AF-LSTM), learns to attend based on associative relationships between sentence words and the aspect, which allows the model to adaptively focus on the correct words given an aspect term. This ameliorates the flaws of other state-of-the-art models that use naive concatenations to model word-aspect similarity. Instead, our model adopts circular convolution and circular correlation to model the similarity between the aspect and words, and elegantly incorporates this within a differentiable neural attention framework. Finally, our model is end-to-end differentiable and closely related to convolution-correlation (holographic-like) memories. Our proposed neural model achieves state-of-the-art performance on benchmark datasets, outperforming ATAE-LSTM by 4-5% on average across multiple datasets.
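The word-aspect associative fusion above relies on circular convolution and circular correlation, both of which have compact FFT forms. The sketch below shows the operators only, not the full AF-LSTM.

```python
import numpy as np

def circular_convolution(a, b):
    # (a * b)_k = sum_i a_i * b_{(k - i) mod d}
    return np.real(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)))

def circular_correlation(a, b):
    # (a ⋆ b)_k = sum_i a_i * b_{(k + i) mod d}
    return np.real(np.fft.ifft(np.conj(np.fft.fft(a)) * np.fft.fft(b)))

aspect = np.random.randn(8)
word = np.random.randn(8)
fused = circular_correlation(aspect, word)   # associative word-aspect feature
```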

* Accepted to AAAI2018 
  
Access Paper or Ask Questions

A New Approach To Text Rating Classification Using Sentiment Analysis

Mar 31, 2021
Thomas Konstantinovsky

Typical use cases of sentiment analysis revolve around assessing the probability that a text expresses a certain sentiment and deriving insight from it; little work has been done to explore further uses of those probabilities in the context of rating. In this paper, we redefine the sentiment proportion values as building blocks of a triangle structure, which allows us to derive variables for a new formula that classifies texts, given in the form of product reviews, into a group of higher ratings and a group of lower ratings, and we show that a dependence exists between the sentiments and the ratings.
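Since a review's positive, negative, and neutral proportions sum to one, each review can be placed inside a triangle (a 2-simplex). The sketch below illustrates only that mapping, under that assumed reading of "triangle structure"; it is not the paper's rating-classification formula.

```python
import numpy as np

# Triangle vertices for negative, positive, and neutral (illustrative layout).
VERTICES = np.array([[0.0, 0.0],     # negative
                     [1.0, 0.0],     # positive
                     [0.5, 0.866]])  # neutral

def to_triangle_point(p_neg, p_pos, p_neu):
    probs = np.array([p_neg, p_pos, p_neu], dtype=float)
    probs = probs / probs.sum()          # enforce a proper proportion vector
    return probs @ VERTICES              # barycentric -> Cartesian coordinates

print(to_triangle_point(0.2, 0.7, 0.1))  # a strongly positive review
```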

* 9 pages, 9 figures 
  
Access Paper or Ask Questions

Preparation of Sentiment tagged Parallel Corpus and Testing its effect on Machine Translation

Jul 28, 2020
Sainik Kumar Mahata, Amrita Chandra, Dipankar Das, Sivaji Bandyopadhyay

In the current work, we explore the improvement in machine translation output when the training parallel corpus is augmented with sentiment analysis. The paper discusses the preparation of a sentiment-tagged English-Bengali parallel corpus: the preparation of the raw parallel corpus, the sentiment analysis of its sentences, and the training of a character-based neural machine translation model on this corpus are covered extensively. The output of the translation model is compared with a baseline translation model using automated metrics such as BLEU and TER, as well as manually.
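One simple way to realise "sentiment-tagged" source sentences is to prepend a sentiment token to each English sentence before training the NMT model; the paper's exact tagging scheme may differ, so treat this as an illustration. The toy lexicon below stands in for a real sentiment analyser.

```python
POSITIVE = {"good", "great", "excellent", "happy"}
NEGATIVE = {"bad", "poor", "terrible", "sad"}

def sentiment_tag(sentence):
    words = set(sentence.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "<pos>" if score > 0 else "<neg>" if score < 0 else "<neu>"

def tag_parallel_corpus(pairs):
    # pairs: list of (english_sentence, bengali_sentence)
    return [(f"{sentiment_tag(en)} {en}", bn) for en, bn in pairs]

print(tag_parallel_corpus([("The movie was great", "(Bengali translation)")]))
```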

  
Access Paper or Ask Questions

Detection and Prediction of Users Attitude Based on Real-Time and Batch Sentiment Analysis of Facebook Comments

Jun 08, 2019
Hieu Tran, Maxim Shcherbakov

Most people have accounts on social networks (e.g., Facebook, VKontakte) where they express their attitude toward different situations and events. Facebook provides only positive marks in the form of the like and share buttons, yet it is important to know a user's position on a post even when the opinion is negative. Positive, negative, and neutral attitudes can be extracted from users' comments, and aggregating them brings an understanding of how people react to a given situation. Moreover, it is important to know how attitude changes over time. The contribution of this paper is a new method, based on sentiment analysis of text, for detecting and predicting negative and positive patterns in Facebook comments, which combines (i) real-time sentiment analysis for pattern discovery and (ii) batch data processing for building an opinion forecasting algorithm. To perform the forecast we propose a two-step algorithm in which (i) patterns are clustered using unsupervised clustering techniques and (ii) trend prediction is performed by finding the nearest pattern from the relevant cluster. Case studies show the efficiency and accuracy (Avg. MAE = 0.008) of the proposed method and its practical applicability. We also discovered three types of user attitude patterns and describe them.
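A sketch of the two-step idea above: cluster historical sentiment patterns, then forecast a new pattern's continuation from its nearest neighbour within the assigned cluster. Window lengths, cluster counts, and the random data are illustrative, not the paper's setup.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
history = rng.random((200, 12))           # 200 past patterns: 8 observed + 4 future steps
observed, future = history[:, :8], history[:, 8:]

# Step (i): unsupervised clustering of the observed parts of the patterns.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(observed)

# Step (ii): predict the continuation from the nearest pattern in the same cluster.
def forecast(new_observed):
    cluster = kmeans.predict(new_observed[None, :])[0]
    members = np.where(kmeans.labels_ == cluster)[0]
    nearest = members[np.argmin(np.linalg.norm(observed[members] - new_observed, axis=1))]
    return future[nearest]                # predicted next sentiment values

print(forecast(rng.random(8)))
```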

* 12 pages, 6 figures, CSoNet 2016 
  
Access Paper or Ask Questions