E-book, English, 326 pages
Agarwal / Nayak / Mittal: Deep Learning-Based Approaches for Sentiment Analysis
1st edition, 2020
ISBN: 978-981-15-1216-2
Publisher: Springer Nature Singapore
Format: PDF
Copy protection: 1 - PDF Watermark
Series: Algorithms for Intelligent Systems
This book covers deep-learning-based approaches for sentiment analysis, a relatively new but fast-growing research area that has changed significantly in the past few years. It presents a collection of state-of-the-art approaches, focusing on the best-performing, cutting-edge solutions for the most common and difficult challenges in sentiment analysis research. With its detailed explanations of the methodologies, the book is a valuable resource for researchers as well as newcomers to the field.
Dr. Basant Agarwal is an Assistant Professor at the Indian Institute of Information Technology Kota (IIIT-Kota), India. He holds a Ph.D. from MNIT Jaipur and worked as a Postdoctoral Research Fellow at the Norwegian University of Science and Technology (NTNU), Norway, under the prestigious ERCIM (European Research Consortium for Informatics and Mathematics) fellowship in 2016. He has also worked as a Research Scientist at Temasek Laboratories, National University of Singapore (NUS), Singapore.

Dr. Richi Nayak holds an M.E. degree from the Indian Institute of Technology, Roorkee, India, and received her Ph.D. in Computer Science from the Queensland University of Technology (QUT), Brisbane, Australia, in 2001. She is currently an Associate Professor of Computer Science at QUT, where she is also Head of Data Science. She has attained over $4 million in external research funding in the area of text mining over the past ten years and consults for a number of government agencies on data, text, and social media analytics projects. She is a member of the steering committee of the Australasian Data Mining Conference (AusDM) and the founder and leader of the Applied Data Mining Research Group at QUT. She has received a number of awards and nominations for teaching, research, and other activities.

Dr. Namita Mittal is an Associate Professor at the Department of Computer Science and Engineering, MNIT Jaipur, India. She is a recipient of the Career Award for Young Teachers (CAYT) from AICTE. She has published numerous research papers in respected international conferences and journals, and has also authored a book on sentiment analysis in the Springer book series 'Socio-Affective Computing'. She is an SMIEEE and a member of ACM, CCICI, and SCRS. She has been involved in various FDPs, conferences, and workshops, such as the Ph.D. Colloquium at FIRE 2017 and the International Workshop on Text Analytics and Retrieval (WI 2018), held in conjunction with Web Intelligence (WI), USA, to name a few.

Dr. Srikanta Patnaik is a Professor at the Department of Computer Science and Engineering, Faculty of Engineering and Technology, SOA University, Bhubaneswar, India. He received his Ph.D. in Computational Intelligence from Jadavpur University, India, in 1999. Dr. Patnaik was the Principal Investigator of the AICTE-sponsored TAPTEC project 'Building Cognition for Intelligent Robot' and the UGC-sponsored Major Research Project 'Machine Learning and Perception using Cognition Methods'. He is the Editor-in-Chief of the International Journal of Information and Communication Technology and the International Journal of Computational Vision and Robotics. Dr. Patnaik is also the Editor of the Journal of Information and Communication Convergence Engineering, published by the Korean Institute of Information and Communication Engineering, and the Editor-in-Chief of the Springer book series 'Modeling and Optimization in Science and Technology'.
Authors/Editors
More Info & Material
Preface (p. 6)
Contents (p. 9)
About the Editors (p. 11)
Application of Deep Learning Approaches for Sentiment Analysis (p. 13)
  1 Introduction (p. 14)
  2 Taxonomy of Sentiment Analysis (p. 14)
    2.1 Sentiment Analysis, Polarity, and Output (p. 16)
    2.2 Levels of Sentiment Analysis (p. 16)
    2.3 Domain Applicability, Training, and Testing Strategy (p. 17)
    2.4 Language Support (p. 18)
    2.5 Evaluation Measures (p. 18)
  3 Text Representation for Sentiment Analysis (p. 18)
    3.1 Embedded Vectors (p. 18)
    3.2 Strategy of Initializing the Embedded Vectors (p. 21)
    3.3 Enhancing the Embedded Vectors (p. 21)
    3.4 Approximation Methods (p. 22)
    3.5 Sampling-Based Approaches (p. 22)
    3.6 Softmax-Based Approaches (p. 23)
  4 Deep Learning Approaches for Sentiment Analysis (p. 23)
  5 Evaluation Metrics for Sentiment Analysis (p. 30)
  6 Benchmarked Datasets and Tools (p. 33)
  7 Conclusion (p. 35)
  References (p. 38)
Recent Trends and Advances in Deep Learning-Based Sentiment Analysis (p. 44)
  1 Introduction (p. 45)
  2 Related Work (p. 46)
  3 Machine Learning Approaches for Sentiment Analysis (p. 46)
  4 Study Rationale (p. 48)
  5 Deep Learning Architectures (p. 49)
    5.1 Convolutional Neural Networks (p. 49)
    5.2 Recurrent Neural Networks (p. 50)
    5.3 Bi-directional Recurrent Neural Networks (p. 50)
  6 Long Short-Term Memory (LSTM) (p. 51)
  7 Gated Recurrent Units (GRUs) (p. 53)
  8 Attention Mechanism (p. 54)
  9 Research Methodology (p. 55)
  10 Approach to Sentiment Analysis Task Categorization (p. 55)
  11 Coarse-Grain Sentiment Analysis (p. 56)
  12 Fine-Grain Sentiment Analysis (p. 58)
  13 Cross-Domain Sentiment Analysis (p. 61)
  14 Conclusion and Survey Highlights (p. 62)
  References (p. 62)
Deep Learning Adaptation with Word Embeddings for Sentiment Analysis on Online Course Reviews (p. 68)
  1 Introduction (p. 69)
  2 State of the Art (p. 71)
    2.1 Sentiment Analysis in E-Learning Systems (p. 71)
    2.2 Deep Learning for Sentiment Analysis (p. 72)
    2.3 Word Embeddings for Sentiment Analysis (p. 73)
  3 Word Embedding Representations for Text Mining (p. 74)
    3.1 Word2Vec (p. 75)
    3.2 GloVe (p. 75)
    3.3 FastText (p. 75)
    3.4 Intel (p. 76)
  4 Deep Learning Components for Text Mining (p. 76)
    4.1 Feed-Forward Neural Network (FNN) (p. 76)
    4.2 Recurrent Neural Network (RNN) (p. 77)
    4.3 Long Short-Term Memory (LSTM) Network (p. 78)
    4.4 Convolutional Neural Network (CNN) (p. 78)
    4.5 Normalization Layer (NL) (p. 79)
    4.6 Attention Layer (AL) (p. 79)
    4.7 Other Layers (p. 79)
  5 Our Sentiment Predictor for E-Learning Reviews (p. 80)
    5.1 Review Splitting (p. 80)
    5.2 Word Embedding Modeling (p. 81)
    5.3 Review Vectorization (p. 82)
    5.4 Sentiment Model Definition (p. 83)
    5.5 Sentiment Model Training and Prediction (p. 85)
  6 Experimental Evaluation (p. 85)
    6.1 Dataset (p. 85)
    6.2 Baselines (p. 85)
    6.3 Metrics (p. 86)
    6.4 Deep Neural Network Model Regressor Performance (p. 87)
    6.5 Contextual Word Embeddings Performance (p. 87)
  7 Conclusions, Open Challenges, and Future Directions (p. 90)
  References (p. 92)
Toxic Comment Detection in Online Discussions (p. 95)
  1 Online Discussions and Toxic Comments (p. 95)
    1.1 News Platforms and Other Online Discussion Forums (p. 96)
    1.2 Classes of Toxicity (p. 97)
  2 Deep Learning for Toxic Comment Classification (p. 99)
    2.1 Comment Datasets for Supervised Learning (p. 99)
    2.2 Neural Network Architectures (p. 101)
  3 From Binary to Fine-Grained Classification (p. 104)
    3.1 Why Is It a Hard Problem? (p. 104)
    3.2 Transfer Learning (p. 106)
    3.3 Explanations (p. 107)
  4 Real-World Applications (p. 109)
    4.1 Semi-automated Comment Moderation (p. 110)
    4.2 Troll Detection (p. 111)
  5 Current Limitations and Future Trends (p. 112)
    5.1 Misclassification of Comments (p. 112)
    5.2 Research Directions (p. 114)
  6 Conclusions (p. 115)
  References (p. 116)
Aspect-Based Sentiment Analysis of Financial Headlines and Microblogs (p. 120)
  1 Introduction (p. 121)
  2 Related Work (p. 123)
  3 State-of-the-Art Models (p. 124)
    3.1 ALA Model (p. 124)
    3.2 IIIT Delhi Model (p. 125)
  4 Our Methodology (p. 126)
    4.1 Features (p. 127)
  5 Aspect Classification Models (p. 129)
    5.1 Models (p. 129)
    5.2 Classification Model Training (p. 131)
  6 Sentiment Models (p. 131)
    6.1 Models (p. 131)
    6.2 Sentiment Model Training (p. 136)
  7 Evaluation (p. 137)
    7.1 Data Set (p. 137)
    7.2 Data Augmentation (p. 138)
    7.3 Data Pre-processing (p. 138)
    7.4 Metrics (p. 140)
    7.5 Results (p. 141)
  8 Conclusion and Future Work (p. 142)
  References (p. 143)
Deep Learning-Based Frameworks for Aspect-Based Sentiment Analysis (p. 147)
  1 Introduction (p. 147)
  2 Problem Formulation (p. 150)
    2.1 Aspect-Term Extraction (p. 150)
    2.2 Aspect-Category Detection (p. 150)
  3 Observation/Assumption in ABSA (p. 150)
  4 Input Representation (p. 151)
  5 Concepts Related to Deep Learning (p. 152)
    5.1 Word Embeddings (p. 152)
    5.2 Long Short-Term Memory (LSTM) (p. 153)
    5.3 Bi-directional Long Short-Term Memory (Bi-LSTM) (p. 154)
    5.4 RNN with Attention (p. 155)
    5.5 Convolutional Neural Network (CNN) (p. 157)
  6 Deep Learning Architectures Used in ABSA (p. 158)
    6.1 Sentiment Analysis (p. 158)
    6.2 Aspect-Term Extraction (p. 158)
    6.3 Aspect-Category Extraction (p. 159)
    6.4 Aspect-Based Sentiment Detection (p. 160)
  7 Conclusion (p. 164)
  References (p. 164)
Transfer Learning for Detecting Hateful Sentiments in Code Switched Language (p. 167)
  1 Introduction (p. 168)
    1.1 Hate Speech Problem (p. 168)
    1.2 Code Switched and Code Mixed Languages (p. 169)
    1.3 Challenges in Code Switched and Code Mixed Languages (p. 170)
    1.4 Deep Learning (p. 170)
    1.5 Overview (p. 170)
  2 Background and Related Work (p. 171)
    2.1 Language Identification (p. 172)
    2.2 POS Tagging (p. 172)
    2.3 Named Entity Recognition (p. 174)
    2.4 Sentiment Analysis (p. 175)
  3 Dataset and Evaluation (p. 177)
    3.1 HOT Dataset (p. 178)
    3.2 Bohra et al. Dataset (p. 179)
    3.3 HEOT Dataset (p. 179)
    3.4 Davidson Dataset (p. 181)
  4 Methodology (p. 181)
    4.1 SVM and Random Forest (p. 181)
    4.2 Ternary Trans-CNN Model (p. 183)
    4.3 LSTM-Based Model (p. 186)
    4.4 MIMCT Model (p. 188)
  5 Results (p. 191)
    5.1 SVM and Random Forest Classifier (p. 191)
    5.2 Ternary Trans-CNN Model (p. 192)
    5.3 LSTM Model with Transfer Learning (p. 192)
    5.4 MIMCT Model (p. 194)
  6 Conclusion (p. 195)
  7 Future Work (p. 195)
  References (p. 197)
Multilingual Sentiment Analysis (p. 201)
  1 Introduction (p. 202)
    1.1 Low Resource Languages (p. 202)
    1.2 Challenges of Sentiment Analysis (p. 203)
    1.3 Deep Learning (p. 204)
  2 Literature Survey (p. 205)
    2.1 High Resource Languages (p. 205)
    2.2 Lexicon-Based Approaches (p. 206)
    2.3 Traditional Machine Learning-Based Approaches (p. 207)
    2.4 Low Resource Languages (p. 208)
  3 Word Embeddings for Sentiment Analysis (p. 209)
    3.1 Refining Word Embeddings for Sentiment Analysis (p. 210)
    3.2 Improving Word Embedding Coverage in Low Resource Languages (p. 213)
  4 Deep Learning Techniques for Multilingual Sentiment Analysis (p. 216)
    4.1 Convolutional Neural Networks (p. 217)
    4.2 Recurrent Neural Networks (p. 220)
    4.3 Autoencoders (p. 228)
    4.4 Bilingual Constrained Recursive Autoencoders (p. 230)
    4.5 AROMA (p. 232)
    4.6 Siamese Neural Networks (p. 235)
  5 Discussion (p. 237)
  6 Conclusion (p. 241)
  References (p. 241)
Sarcasm Detection Using Deep Learning-Based Techniques (p. 245)
  1 Introduction (p. 245)
  2 Related Work (p. 248)
  3 Grice's Maxims (p. 251)
  4 Challenges in Sarcasm Detection (p. 255)
  5 Dataset Description (p. 256)
  6 Feature Description (p. 258)
  7 Process Outline (p. 263)
  8 Models Used (p. 263)
  9 Experiments and Results (p. 264)
  10 Future Scope (p. 265)
  References (p. 265)
Deep Learning Approaches for Speech Emotion Recognition (p. 267)
  1 Introduction (p. 268)
  2 Feature Extraction (p. 269)
  3 Feature Selection (p. 271)
  4 Classical Approaches (p. 271)
    4.1 Speaker-Dependent SER (p. 272)
    4.2 Speaker-Independent SER (p. 273)
    4.3 Other Models (p. 277)
  5 Deep Learning Approaches (p. 278)
  6 System Overview (p. 281)
    6.1 Classical Approach for SER (p. 281)
    6.2 Deep Learning Approaches for SER (p. 282)
    6.3 Critical Comparison (p. 287)
  7 Evaluation (p. 288)
    7.1 Dataset Description (p. 288)
    7.2 Original Results (p. 288)
    7.3 Results Obtained (p. 290)
  8 Comparison of Existing Approaches (p. 290)
  9 Conclusions (p. 291)
  References (p. 291)
Bidirectional Long Short-Term Memory-Based Spatio-Temporal in Community Question Answering (p. 298)
  1 Introduction (p. 299)
  2 Related Works (p. 301)
  3 Methodology (p. 304)
    3.1 Preprocessing Steps (p. 304)
    3.2 Best Answer Prediction (p. 305)
  4 Experimental Setup: Answer Classification (p. 310)
  5 Experiment II: Answer Ranking (p. 313)
  6 Conclusion (p. 315)
  References (p. 315)
Comparing Deep Neural Networks to Traditional Models for Sentiment Analysis in Turkish Language (p. 318)
  1 Introduction (p. 319)
  2 Methodology (p. 320)
  3 Experimental Setup and Results (p. 321)
    3.1 Dataset (p. 321)
    3.2 Traditional BOW Approach (p. 321)
    3.3 Deep Learning Architecture (p. 323)
  4 Conclusion (p. 325)
  References (p. 325)