Chen / Zhu / Hu | System Parameter Identification | E-Book | www.sack.de

E-book, English, 266 pages

Chen / Zhu / Hu System Parameter Identification

Information Criteria and Algorithms
1st edition, 2013
ISBN: 978-0-12-404595-8
Publisher: Elsevier Science & Techn.
Format: EPUB
Copy protection: ePub watermark




Criterion functions based on information-theoretic measures (entropy, mutual information, information divergence) have recently attracted attention and become an emerging area of study in signal processing and system identification. This book presents a systematic framework for system identification and information processing, investigating system identification from an information-theoretic point of view. The book is divided into six chapters, which cover the material needed to understand the theory and application of system parameter identification. It is grounded in the authors' own research but also incorporates results from the latest international research publications.

- Named a 2013 Notable Computer Book for Information Systems by Computing Reviews
- One of the first books to present system parameter identification with information-theoretic criteria, so readers can track the latest developments
- Contains numerous illustrative examples to help the reader grasp basic methods

Badong Chen received the B.S. and M.S. degrees in control theory and engineering from Chongqing University in 1997 and 2003, respectively, and the Ph.D. degree in computer science and technology from Tsinghua University in 2008. He was a Post-Doctoral Researcher at Tsinghua University from 2008 to 2010, and a Post-Doctoral Associate at the University of Florida Computational NeuroEngineering Laboratory (CNEL) from October 2010 to September 2012. He is currently a professor at the Institute of Artificial Intelligence and Robotics (IAIR), Xi'an Jiaotong University. His research interests are in system identification and control, information theory, machine learning, and their applications in cognition and neuroscience.



Symbols and Abbreviations


The main symbols and abbreviations used throughout the text are listed as follows.

|·|    absolute value of a real number

‖·‖    Euclidean norm of a vector

⟨·,·⟩    inner product

I(·)    indicator function

E[·]    expectation value of a random variable

f′(·)    first-order derivative of the function f

f″(·)    second-order derivative of the function f

∇_W f    gradient of the function f with respect to W

sign(·)    sign function

Γ(·)    Gamma function

(·)^T    vector or matrix transposition

I    identity matrix

A^(−1)    inverse of matrix A

det(A)    determinant of matrix A

Tr(A)    trace of matrix A

rank(A)    rank of matrix A

log(·)    natural logarithm function

z^(−1)    unit delay operator

ℝ    real number space

ℝ^n    n-dimensional real Euclidean space

ρ(X, Y)    correlation coefficient between random variables X and Y

Var(X)    variance of random variable X

Pr(A)    probability of event A

N(μ, Σ)    Gaussian distribution with mean vector μ and covariance matrix Σ

U(a, b)    uniform distribution over interval [a, b]

χ²(k)    chi-squared distribution with k degrees of freedom

H(X)    Shannon entropy of random variable X

H_φ(X)    φ-entropy of random variable X

H_α(X)    α-order Renyi entropy of random variable X

V_α(X)    α-order information potential of random variable X

S(X)    survival information potential of random variable X

H_Δ(X)    Δ-entropy of discrete random variable X

I(X; Y)    mutual information between random variables X and Y

D_KL(X‖Y)    KL-divergence between random variables X and Y

D_φ(X‖Y)    φ-divergence between random variables X and Y

J_F    Fisher information matrix

J̄_F    Fisher information rate matrix

p(·)    probability density function

κ(·, ·)    Mercer kernel function

K(·)    kernel function for density estimation

K_h(·)    kernel function with width h

G_h(·)    Gaussian kernel function with width h

H_κ    reproducing kernel Hilbert space induced by Mercer kernel κ

F_κ    feature space induced by Mercer kernel κ

W    weight vector

Ω    weight vector in feature space

W̃    weight error vector

η    step size

L    sliding data length
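Several of the quantities above (the α-order information potential, the Renyi entropy, and the Gaussian kernel for density estimation) are tied together by a simple sample-based estimator. As a minimal illustrative sketch — the Gaussian kernel width `sigma` and the sample sizes below are assumptions for demonstration, not values prescribed by the text — the quadratic information potential V_2(X) can be estimated from samples by a pairwise Parzen (KDE) sum, and the quadratic Renyi entropy is then H_2(X) = −log V_2(X):

```python
import numpy as np

def quadratic_information_potential(x, sigma=1.0):
    """Sample estimate of the quadratic information potential
    V_2(X) = E[p(X)] ~ (1/N^2) * sum_ij G_{sigma*sqrt(2)}(x_i - x_j),
    with a Gaussian kernel of width sigma (illustrative choice)."""
    x = np.asarray(x, dtype=float)
    s = sigma * np.sqrt(2.0)                 # kernel widths add under convolution
    diffs = x[:, None] - x[None, :]          # all pairwise sample differences
    g = np.exp(-diffs**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))
    return g.mean()                          # mean over all N^2 pairs

def quadratic_renyi_entropy(x, sigma=1.0):
    """H_2(X) = -log V_2(X): higher information potential means lower entropy."""
    return -np.log(quadratic_information_potential(x, sigma))

rng = np.random.default_rng(0)
narrow = 0.1 * rng.standard_normal(500)      # concentrated samples
wide = 5.0 * rng.standard_normal(500)        # spread-out samples
# The concentrated distribution has the larger information potential
# and hence the smaller quadratic Renyi entropy.
```

This is the connection exploited by the MEE criterion listed below: minimizing the error entropy is equivalent to maximizing the information potential of the error samples.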

MSE    mean square error

LMS    least mean square

NLMS    normalized least mean square

LS    least squares

RLS    recursive least squares

MLE    maximum likelihood estimation

EM    expectation-maximization

FLOM    fractional lower order moment

LMP    least mean p-power

LAD    least absolute deviation

LMF    least mean fourth

FIR    finite impulse response

IIR    infinite impulse response

AR    autoregressive

ADALINE    adaptive linear neuron

MLP    multilayer perceptron

RKHS    reproducing kernel Hilbert space

KAF    kernel adaptive filtering

KLMS    kernel least mean square

KAPA    kernel affine projection algorithm

KMEE    kernel minimum error entropy

KMC    kernel maximum correntropy

PDF    probability density function

KDE    kernel density estimation

GGD    generalized Gaussian density

SαS    symmetric α-stable

MEP    maximum entropy principle

DPI    data processing inequality

EPI    entropy power inequality

MEE    minimum error entropy

MCC    maximum correntropy criterion

IP    information potential

QIP    quadratic information potential

CRE    cumulative residual entropy

SIP    survival information potential

QSIP    quadratic survival information potential

KLID    Kullback–Leibler information divergence

EDC    Euclidean distance criterion

MinMI    minimum mutual information

MaxMI    maximum mutual information

AIC    Akaike’s information criterion

BIC    Bayesian information criterion

MDL    minimum description length

FIM    Fisher information matrix

FIRM    Fisher information rate matrix

MIH    minimum identifiable horizon

ITL    information theoretic learning

BIG    batch information gradient

FRIG    forgetting recursive information gradient

SIG    stochastic information gradient

SIDG    stochastic information divergence gradient

SMIG    stochastic mutual information gradient

FP    fixed point

FP-MEE    fixed-point minimum error entropy
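To make the relationship between some of these criteria concrete: under the maximum correntropy criterion (MCC), an adaptive FIR filter replaces the MSE/LMS error-driven update with a Gaussian-weighted one, so that large (possibly impulsive) errors are down-weighted. The sketch below is a minimal illustration — the plant coefficients, step size `eta`, and kernel width `sigma` are hypothetical values chosen for the demo, not parameters from the book:

```python
import numpy as np

def mcc_identify(x, d, num_taps=2, eta=0.1, sigma=1.0):
    """Stochastic-gradient FIR identification under the maximum correntropy
    criterion: w <- w + eta * exp(-e^2 / (2*sigma^2)) * e * u.
    As sigma -> infinity, the update reduces to the plain LMS update."""
    w = np.zeros(num_taps)
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]        # regressor [x[n], x[n-1], ...]
        e = d[n] - w @ u                           # prediction error
        w = w + eta * np.exp(-e**2 / (2 * sigma**2)) * e * u
    return w

rng = np.random.default_rng(1)
true_w = np.array([0.5, -0.3])                     # hypothetical 2-tap plant
x = rng.standard_normal(5000)                      # white input
d = np.convolve(x, true_w)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w_hat = mcc_identify(x, d)                         # converges near true_w
```

The Gaussian factor exp(−e²/2σ²) acts as a per-sample weight: samples producing outlier errors contribute almost nothing to the update, which is the source of MCC's robustness relative to MSE-based adaptation.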


