E-Book, Englisch, 169 Seiten
Huang / Yang / King Machine Learning
1. Auflage 2008
ISBN: 978-3-540-79452-3
Verlag: Springer Berlin Heidelberg
Format: PDF
Kopierschutz: 1 - PDF Watermark
Modeling Data Locally and Globally
Reihe: Advanced Topics in Science and Technology in China
Machine Learning - Modeling Data Locally and Globally presents a novel and unified theory that seeks to integrate different algorithms seamlessly. Specifically, the book characterizes the inner nature of machine learning algorithms as either 'local learning' or 'global learning.' This theory not only connects previous machine learning methods and serves as a roadmap through the various models, but, more importantly, it also motivates a theory that can learn from data both locally and globally. This helps researchers gain deeper insight into, and a more comprehensive understanding of, the techniques in this field. The book reviews current topics, new theories, and applications.

Kaizhu Huang was a researcher at the Fujitsu Research and Development Center and is currently a research fellow at the Chinese University of Hong Kong. Haiqin Yang leads the image processing group at HiSilicon Technologies. Irwin King and Michael R. Lyu are professors in the Department of Computer Science and Engineering at the Chinese University of Hong Kong.
Autoren/Hrsg.
Weitere Infos & Material
Preface – 6
Contents – 7
1 Introduction – 11
  1.1 Learning and Global Modeling – 11
  1.2 Learning and Local Modeling – 13
  1.3 Hybrid Learning – 15
  1.4 Major Contributions – 15
  1.5 Scope – 18
  References – 19
2 Global Learning vs. Local Learning – 23
  2.1 Problem Definition – 25
  2.2 Global Learning – 26
  2.4 Hybrid Learning – 33
  2.5 Maxi-Min Margin Machine – 34
  References – 35
3 A General Global Learning Model: MEMPM – 39
  3.1 Marshall and Olkin Theory – 40
  3.2 Minimum Error Minimax Probability Decision Hyperplane – 41
  3.3 Robust Version – 55
  3.4 Kernelization – 56
  3.5 Experiments – 60
  3.6 How Tight Is the Bound? – 66
  3.7 On the Concavity of MEMPM – 70
  3.8 Limitations and Future Work – 75
  3.9 Summary – 76
  References – 77
4 Learning Locally and Globally: Maxi-Min Margin Machine – 79
  4.1 Maxi-Min Margin Machine – 81
  4.2 Bound on the Error Rate – 92
  4.3 Reduction – 94
  4.4 Kernelization – 95
  4.5 Experiments – 98
  4.6 Discussions and Future Work – 103
  4.7 Summary – 103
  References – 104
5 Extension I: BMPM for Imbalanced Learning – 107
  5.1 Introduction to Imbalanced Learning – 108
  5.2 Biased Minimax Probability Machine – 108
  5.3 Learning from Imbalanced Data by Using BMPM – 110
  5.4 Experimental Results – 112
  5.5 When the Cost for Each Class Is Known – 124
  5.6 Summary – 125
  References – 125
6 Extension II: A Regression Model from M4 – 129
  6.1 A Local Support Vector Regression Model – 131
  6.2 Connection with Support Vector Regression – 132
  6.3 Link with Maxi-Min Margin Machine – 134
  6.4 Optimization Method – 134
  6.5 Kernelization – 135
  6.6 Additional Interpretation on wᵀΣᵢw – 137
  6.7 Experiments – 138
  6.8 Summary – 141
  References – 141
7 Extension III: Variational Margin Settings within Local Data in Support Vector Regression – 143
  7.1 Support Vector Regression – 144
  7.2 Problem in Margin Settings – 146
  7.3 General ε-insensitive Loss Function – 146
  7.4 Non-fixed Margin Cases – 149
  7.5 Experiments – 151
  7.6 Discussions – 165
  References – 168
8 Conclusion and Future Work – 171
  8.1 Review of the Journey – 171
  8.2 Future Work – 173
  References – 174
Index – 177