E-book, English, 96 pages, eBook
Mou / Jin Tree-Based Convolutional Neural Networks
1st edition, 2018
ISBN: 978-981-13-1870-2
Publisher: Springer Singapore
Format: PDF
Copy protection: PDF watermark
Principles and Applications
Series: SpringerBriefs in Computer Science
This book proposes a novel neural architecture, tree-based convolutional neural networks (TBCNNs), for processing tree-structured data. TBCNNs are related to existing convolutional neural networks (CNNs) and recursive neural networks (RNNs), but combine the merits of both: thanks to their short propagation path, they are as efficient to train as CNNs, yet they are as structure-sensitive as RNNs.
In this book, readers will also find a comprehensive review of related work, detailed descriptions of TBCNNs and their variants, and experiments on program analysis and natural language processing tasks. It is also an enjoyable read for anyone with a general interest in deep learning.
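To make the idea concrete, here is a minimal NumPy sketch of the two operations the blurb alludes to: a tree-based convolution that slides a depth-1 window over each node and its direct children, followed by dynamic pooling to obtain a fixed-size vector regardless of tree size. The function names and the simple shared child weight are illustrative assumptions for this sketch, not the book's actual implementation (which, per the table of contents, also covers variants such as the continuous binary tree).

```python
import numpy as np

def tree_conv(features, children, W_self, W_child, b):
    """One tree-based convolution step (illustrative sketch).

    Each node's output mixes its own feature vector with those of its
    direct children -- a depth-1 convolution window over the tree.
    """
    out = []
    for i, kids in enumerate(children):
        h = W_self @ features[i] + b
        for k in kids:
            h = h + W_child @ features[k]
        out.append(np.tanh(h))
    return np.stack(out)  # shape: (num_nodes, out_dim)

def dynamic_max_pool(conv_out):
    """Max over all nodes: a fixed-size vector for any tree size."""
    return conv_out.max(axis=0)

# Toy tree: node 0 is the root with children 1 and 2.
rng = np.random.default_rng(0)
feats = rng.normal(size=(3, 4))           # 3 nodes, 4-dim features
kids = [[1, 2], [], []]                   # children of each node
W_s = rng.normal(size=(5, 4))             # weight for the node itself
W_c = rng.normal(size=(5, 4))             # shared weight for children
b = np.zeros(5)

pooled = dynamic_max_pool(tree_conv(feats, kids, W_s, W_c, b))
print(pooled.shape)  # (5,) -- fixed-size representation of the tree
```

Because the window looks only one level down, information reaches the pooled vector in a single step from every node, which is the "short propagation path" contrasted with the root-to-leaf chains of recursive networks.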
Target audience
Research
Authors/Editors
Further information & material
1 Introduction
1.1 Deep Learning Background
1.2 Structure-Sensitive Neural Networks
1.3 The Proposed Tree-Based Convolutional Neural Networks
1.4 Overview of the Book
2 Preliminaries and Related Work
2.1 General Neural Networks
2.1.1 Neurons and Multi-Layer Perceptrons
2.1.2 Training of Neural Networks: Backpropagation
2.1.3 Pros and Cons of Multi-Layer Perceptrons
2.1.4 Pretraining of Neural Networks
2.2 Neural Networks Applied in Natural Language Processing
2.2.1 The Characteristics of Natural Language
2.2.2 Language Models
2.2.3 Word Embeddings
2.3 Existing Structure-Sensitive Neural Networks
2.3.1 Convolutional Neural Networks
2.3.2 Recurrent Neural Networks
2.3.3 Recursive Neural Networks
2.4 Summary and Discussions
3 General Concepts of Tree-Based Convolutional Neural Networks (TBCNNs)
3.1 Idea and Formulation
3.2 Applications of TBCNNs
3.3 Issues in Designing TBCNNs
4 TBCNN for Programs’ Abstract Syntax Trees (ASTs)
4.1 Background of Program Analysis
4.2 Proposed Model
4.2.1 Overview
4.2.2 Representation Learning of AST Nodes
4.2.3 Encoding Layer
4.2.4 AST-Based Convolutional Layer
4.2.5 Dynamic Pooling
4.2.6 Continuous Binary Tree
4.3 Experiments
4.3.1 Unsupervised Representation Learning
4.3.2 Program Classification
4.3.3 Detecting Bubble Sort
4.3.4 Model Analysis
4.4 Summary and Discussions
5 TBCNN for Constituency Trees in Natural Language Processing
5.1 Background of Sentence Modeling and Constituency Trees
5.2 Proposed Model
5.2.1 Constituency Trees as Input
5.2.2 Recursively Representing Intermediate Layers
5.2.3 Constituency Tree-Based Convolutional Layer
5.2.4 Dynamic Pooling Layer
5.3 Experiments
5.3.1 Sentiment Analysis
5.3.2 Question Classification
5.4 Summary and Discussions
6 TBCNN for Dependency Trees in Natural Language Processing
6.1 Background of Dependency Trees
6.2 Proposed Model
6.2.1 Dependency Trees as Input
6.2.2 Dependency Tree-Based Convolutional Layer
6.2.3 Dynamic Pooling Layer
6.2.4 Dependency TBCNN Applied to Sentence Matching
6.3 Experiments
6.3.1 Sentence Classification
6.3.2 Sentence Matching
6.3.3 Model Analysis
6.3.4 Visualization
6.4 Summary and Discussions
7 Concluding Remarks
7.1 More Structure-Sensitive Neural Models
7.2 Conclusion




