
E-book, English, 320 pages

Series: Chapman & Hall/CRC Computer Science & Data Analysis

Abney: Semisupervised Learning for Computational Linguistics


Year of publication: 2010
ISBN: 978-1-4200-1080-0
Publisher: Taylor & Francis
Format: PDF
Copy protection: Adobe DRM



Rapid advances in the theoretical understanding of statistical and machine learning methods for semisupervised learning have made it difficult for nonspecialists to keep up with the field. Providing a broad, accessible treatment of the theory as well as linguistic applications, Semisupervised Learning for Computational Linguistics offers self-contained coverage of semisupervised methods, including background material on supervised and unsupervised learning.

The book presents a brief history of semisupervised learning and its place in the spectrum of learning methods before moving on to discuss well-known natural language processing methods, such as self-training and co-training. It then centers on machine learning techniques, including the boundary-oriented methods of perceptrons, boosting, support vector machines (SVMs), and the null-category noise model. In addition, the book covers clustering, the expectation-maximization (EM) algorithm, related generative methods, and agreement methods. It concludes with the graph-based method of label propagation as well as a detailed discussion of spectral methods.

Taking an intuitive approach to the material, this lucid book facilitates the application of semisupervised learning methods to natural language processing and provides the framework and motivation for a more systematic study of machine learning.


Target audience


Researchers, developers, and students in computational linguistics, machine learning, data mining, statistics, bioinformatics, and security assessment.


Authors/Editors


Further Information & Material


INTRODUCTION
A brief history
Semisupervised learning
Organization and assumptions

SELF-TRAINING AND CO-TRAINING
Classification
Self-training
Co-training

APPLICATIONS OF SELF-TRAINING AND CO-TRAINING
Part-of-speech tagging
Information extraction
Parsing
Word senses

CLASSIFICATION
Two simple classifiers
Abstract setting
Evaluating detectors and classifiers that abstain
Binary classifiers and ECOC

MATHEMATICS FOR BOUNDARY-ORIENTED METHODS
Linear separators
The gradient
Constrained optimization

BOUNDARY-ORIENTED METHODS
The perceptron
Game self-teaching
Boosting
Support vector machines (SVMs)
Null-category noise model

CLUSTERING
Cluster and label
Clustering concepts
Hierarchical clustering
Self-training revisited
Graph mincut
Label propagation
Bibliographic notes

GENERATIVE MODELS
Gaussian mixtures
The EM algorithm

AGREEMENT CONSTRAINTS
Co-training
Agreement-based self-teaching
Random fields
Bibliographic notes

PROPAGATION METHODS
Label propagation
Random walks
Harmonic functions
Fluids
Computing the solution
Graph mincuts revisited
Bibliographic notes

MATHEMATICS FOR SPECTRAL METHODS
Some basic concepts
Eigenvalues and eigenvectors
Eigenvalues and the scaling effects of a matrix
Bibliographic notes

SPECTRAL METHODS
Simple harmonic motion
Spectra of matrices and graphs
Spectral clustering
Spectral methods for semisupervised learning
Bibliographic notes

BIBLIOGRAPHY
INDEX


