Kanal / Lemmer | Uncertainty in Artificial Intelligence 2 | E-Book | sack.de

E-book, English, Volume 5, 469 pages, Web PDF

Series: Machine Intelligence and Pattern Recognition

Kanal / Lemmer Uncertainty in Artificial Intelligence 2


1st edition, 2014
ISBN: 978-1-4832-9653-1
Publisher: Elsevier Science & Techn.
Format: PDF
Copy protection: PDF watermark




This second volume is arranged in four sections. The Analysis section contains papers that compare the attributes of various approaches to uncertainty. The Tools section provides sufficient information for the reader to implement uncertainty calculations. Papers in the Theory section explain various approaches to uncertainty. The Applications section describes the difficulties involved in, and the results produced by, incorporating uncertainty into actual systems.




Further Information & Material


1;Front Cover;1
2;Uncertainty in Artificial Intelligence 2;4
3;Copyright Page;5
4;Table of Contents;10
5;PREFACE;6
6;CONTRIBUTORS;8
7;PART I. ANALYSIS;14
7.1;CHAPTER 1. MODELS VS. INDUCTIVE INFERENCE FOR DEALING WITH PROBABILISTIC KNOWLEDGE;16
7.1.1;1. Introduction;16
7.1.2;2. Structures on Event Spaces;16
7.1.3;3. Webs;18
7.1.4;4. Induction and maximum-entropy;19
7.1.5;5. Discussion;20
7.1.6;References;21
7.2;CHAPTER 2. AN AXIOMATIC FRAMEWORK FOR BELIEF UPDATES;24
7.2.1;1. INTRODUCTION;24
7.2.2;2. SCOPE OF THE AXIOMIZATION;25
7.2.3;3. FUNDAMENTAL PROPERTIES FOR A MEASURE OF ABSOLUTE BELIEF;25
7.2.4;4. FUNDAMENTAL PROPERTIES FOR A MEASURE OF CHANGE IN BELIEF;27
7.2.5;5. A CONSEQUENCE OF THE AXIOMS;28
7.2.6;6. PROBABILISTIC BELIEF UPDATES;30
7.2.7;7. CONCLUSIONS;33
7.2.8;ACKNOWLEDGEMENTS;33
7.2.9;NOTES;33
7.3;CHAPTER 3. THE MYTH OF MODULARITY IN RULE-BASED SYSTEMS FOR REASONING WITH UNCERTAINTY;36
7.3.1;1. INTRODUCTION;36
7.3.2;2. OVERVIEW OF THE MYCIN CERTAINTY FACTOR MODEL;38
7.3.3;3. DEFINITION OF SEMANTIC MODULARITY;38
7.3.4;4. CONSEQUENCES OF MODULARITY;39
7.3.5;5. THE MYTH OF MODULARITY;43
7.3.6;6. A WEAKER NOTION OF MODULARITY;43
7.3.7;7. SUMMARY;46
7.3.8;ACKNOWLEDGEMENTS;46
7.3.9;NOTES;46
7.3.10;REFERENCES;47
7.4;CHAPTER 4. IMPRECISE MEANINGS AS A CAUSE OF UNCERTAINTY IN MEDICAL KNOWLEDGE-BASED SYSTEMS;48
7.4.1;1. INTRODUCTION: LEXICAL IMPRECISION;48
7.4.2;2. EFFECTS ON KNOWLEDGE-BASED SYSTEMS;50
7.4.3;3. LEXICAL IMPRECISION IS NOT LEXICAL AMBIGUITY;51
7.4.4;4. COPING WITH LEXICAL IMPRECISION;52
7.4.5;5. SUMMARY;53
7.4.6;ACKNOWLEDGMENTS;54
7.4.7;REFERENCES;54
7.5;CHAPTER 5. Evidence as Opinions of Experts;56
7.5.1;Abstract;56
7.5.2;1. Introduction;56
7.5.3;2. The Rule of Combination and Normalization;57
7.5.4;3. Spaces of Opinions of Experts;58
7.5.5;4. Equivalence with the Dempster/Shafer Rule of Combination;62
7.5.6;5. An Alternative Method for Combining Evidence;62
7.5.7;6. Conclusions;65
7.5.8;Acknowledgements;65
7.5.9;References;66
7.6;CHAPTER 6. PROBABILISTIC LOGIC: SOME COMMENTS AND POSSIBLE USE FOR NONMONOTONIC REASONING;68
7.6.1;Introduction;68
7.6.2;Probabilistic Logic;68
7.6.3;Nonmonotonic Reasoning;71
7.6.4;An Inconsistent System;71
7.6.5;Discussion of the Entailment Results;72
7.6.6;Some Computational Observations;74
7.6.7;Conclusions;74
7.6.8;References;75
7.7;CHAPTER 7. EXPERIMENTS WITH INTERVAL-VALUED UNCERTAINTY;76
7.7.1;1. INTRODUCTION;76
7.7.2;2. THE INFORMATION RETRIEVAL MODEL;76
7.7.3;3. MODELS OF UNCERTAINTY;77
7.7.4;4. THE INFERENCE ENGINE;79
7.7.5;5. EXPERIMENTS;80
7.7.6;6. DISCUSSION;86
7.7.7;REFERENCES;88
7.8;CHAPTER 8. EVALUATION OF UNCERTAIN INFERENCE MODELS I: PROSPECTOR;90
7.8.1;1. INTRODUCTION;90
7.8.2;2. OVERVIEW OF THE PROSPECTOR MODEL;91
7.8.3;3. METHOD;92
7.8.4;4. RESULTS;94
7.8.5;5. DISCUSSION AND CONCLUSIONS;98
7.8.6;REFERENCES;99
7.9;CHAPTER 9. Experimentally Comparing Uncertain Inference Systems to Probability;102
7.9.1;1. Abstract;102
7.9.2;2. Introduction;102
7.9.3;3. Outline of Our Method;102
7.9.4;4. Outline of Biases in MYC and TSM;106
7.9.5;5. Experiments on Performance;109
7.9.6;6. Best and Worst Results for MYC and TSM;110
7.9.7;7. Summary;111
7.9.8;8. References;111
8;PART II: TOOLS;114
8.1;CHAPTER 10. KNOWLEDGE ENGINEERING WITHIN A GENERALIZED BAYESIAN FRAMEWORK;116
8.1.1;1. INTRODUCTION;116
8.1.2;2. KNOWLEDGE ENGINEERING WITHIN THE GENERALIZED BAYESIAN FRAMEWORK;117
8.1.3;3. GENERALIZED BAYESIAN INFERENCE;122
8.1.4;4. GENERALIZED BAYESIAN EXPLANATION;125
8.1.5;5. FUTURE WORK;125
8.1.6;6. SUMMARY;126
8.1.7;REFERENCES;126
8.2;CHAPTER 11. LEARNING TO PREDICT: AN INDUCTIVE APPROACH;128
8.2.1;1. INTRODUCTION;128
8.2.2;2. BASIC INDUCTIVE COMPONENTS FOR INCREMENTAL LEARNING;131
8.2.3;3. CONCLUSIONS;136
8.2.4;ACKNOWLEDGEMENTS;136
8.2.5;REFERENCES;136
8.3;CHAPTER 12. TOWARDS A GENERAL-PURPOSE BELIEF MAINTENANCE SYSTEM;138
8.3.1;1. INTRODUCTION;138
8.3.2;2. DESIGN;138
8.3.3;3. EXAMPLES;142
8.3.4;4. CONCLUSIONS;144
8.3.5;ACKNOWLEDGEMENTS;144
8.3.6;REFERENCES;144
8.4;CHAPTER 13. A NON-ITERATIVE MAXIMUM ENTROPY ALGORITHM;146
8.4.1;1 Introduction;146
8.4.2;2 Formal Problem Definition;147
8.4.3;3 The Maximum Entropy Principle;148
8.4.4;4 Iterative Maximum Entropy Methods;149
8.4.5;5 Non-Iterative Techniques;150
8.4.6;6 Acyclic Hypergraphs;151
8.4.7;7 A New Maximum Entropy Method;153
8.4.8;8 Spiegelhalter's Algorithm;157
8.4.9;9 Comparisons;158
8.4.10;10 Conclusions and Open Problems;159
8.4.11;References;159
8.5;CHAPTER 14. Propagating Uncertainty in Bayesian Networks by Probabilistic Logic Sampling;162
8.5.1;1. Introduction;162
8.5.2;2. Bayesian Belief Networks;163
8.5.3;3. Dependent evidence and multiply connected networks;165
8.5.4;4. Probabilistic Logic sampling;167
8.5.5;5. Pulse: An Implementation;171
8.5.6;6. Explanation and sensitivity analysis;171
8.5.7;7. Precision and Computational effort;172
8.5.8;8. Improvements to efficiency;174
8.5.9;9. Final remarks;174
8.5.10;Acknowledgments;175
8.5.11;References;175
8.6;CHAPTER 15. AN EXPLANATION MECHANISM FOR BAYESIAN INFERENCING SYSTEMS;178
8.6.1;1. INTRODUCTION;178
8.6.2;2. THE GENERALIZED BAYESIAN INFERENCING SYSTEM;179
8.6.3;3. EXPLANATION FACILITIES;180
8.6.4;4. CONCLUDING REMARKS;185
8.6.5;FOOTNOTES AND REFERENCES;186
8.7;CHAPTER 16. ON THE RATIONAL SCOPE OF PROBABILISTIC RULE-BASED INFERENCE SYSTEMS;188
8.7.1;1. INTRODUCTION;188
8.7.2;2. BACKGROUND AND NOMENCLATURE;190
8.7.3;3. THE CF LANGUAGE AND ITS RATIONAL INTERPRETATION;191
8.7.4;4. THE CF LANGUAGE AS A SPECIAL CASE OF THE BAYESIAN LANGUAGE;192
8.7.5;5. DISCUSSION;193
8.7.6;6. IMPLICATIONS ON KNOWLEDGE ENGINEERING AND FUTURE RESEARCH;196
8.7.7;7. CONCLUSION;198
8.7.8;8. Appendix: Proofs;198
8.7.9;REFERENCES;201
8.8;CHAPTER 17. DAVID: Influence Diagram Processing System for the Macintosh;204
8.8.1;REFERENCES;209
8.9;CHAPTER 18. Qualitative Probabilistic Networks for Planning Under Uncertainty;210
8.9.1;1 Introduction;210
8.9.2;2 Probabilistic Networks;211
8.9.3;3 Qualitative Influences;211
8.9.4;4 An Example: The Generic Test/Treat Decision;215
8.9.5;5 Conclusions;219
8.9.6;References;220
8.10;CHAPTER 19. ON IMPLEMENTING USUAL VALUES;222
8.10.1;1. INTRODUCTION;222
8.10.2;2. ON POSSIBILITY-PROBABILITY GRANULES;223
8.10.3;3. ON USUAL VALUES AND THEIR REPRESENTATION;224
8.10.4;4. TRANSLATION OF COMPOUND STATEMENTS;225
8.10.5;5. LOGICAL TRANSLATION RULES;226
8.10.6;6. REASONING WITH USUAL VALUES;227
8.10.7;7. ARITHMETIC OPERATIONS WITH USUAL VALUES;228
8.10.8;8. CONCLUSION;229
8.10.9;REFERENCES;229
9;PART III: THEORY;232
9.1;CHAPTER 20. SOME EXTENSIONS OF PROBABILISTIC LOGIC;234
9.1.1;1. INTRODUCTION;234
9.1.2;2. EVIDENTIAL LOGIC;235
9.1.3;3. SEMANTICS AS RANDOM VARIABLES;236
9.1.4;4. PROBABILISTIC LOGIC AS CONSISTENT LABELING;238
9.1.5;REFERENCES;239
9.2;CHAPTER 21. Belief as Summarization and Meta-Support;242
9.2.1;1. Introduction;242
9.2.2;2. The Network Model;243
9.2.3;3. Computing Belief and Reliability values;245
9.2.4;4. A Network of Cognitive Units;247
9.2.5;5. Conclusions;248
9.2.6;References;248
9.3;CHAPTER 22. NON-MONOTONICITY IN PROBABILISTIC REASONING;250
9.3.1;1 Introduction;250
9.3.2;2 Probabilistic Logic;250
9.3.3;3 Non-Monotonic Probabilistic Theories;251
9.3.4;4 Default Inheritance of Probabilities;254
9.3.5;5 Specificity-Prioritized Maximization of Conditional Independence;255
9.3.6;6 Non-Monotonicity in "Evidential" Reasoning;255
9.3.7;7 Graphoids, Influence Diagrams, and Irrelevance;256
9.3.8;8 A Circumscriptive Formalization of (SP) MCI;257
9.3.9;9 Maximum Entropy;258
9.3.10;10 Discussion;259
9.3.11;11 Conclusion;260
9.3.12;12 Directions for Future Research;260
9.3.13;Acknowledgements;261
9.3.14;Notes;261
9.3.15;References;261
9.4;CHAPTER 23. A SEMANTIC APPROACH TO NON-MONOTONIC ENTAILMENTS;264
9.4.1;1. OVERVIEW;264
9.4.2;2. TRUTH;265
9.4.3;3. ENTAILMENT;267
9.4.4;4. PROBABILITY;271
9.4.5;5. CONCLUSION;275
9.4.6;ACKNOWLEDGEMENTS;275
9.4.7;REFERENCES;275
9.5;CHAPTER 24. KNOWLEDGE;276
9.5.1;1. BACKGROUND.;276
9.5.2;2. SUBJECTIVE MEASURES.;276
9.5.3;3. PROBABILITY;278
9.5.4;4. UNCERTAIN KNOWLEDGE;280
9.5.5;5. DECISION;284
9.5.6;BIBLIOGRAPHY;285
9.6;CHAPTER 25. Computing Reference Classes;286
9.6.1;1. Reference Classes.;286
9.6.2;2. Kyburg's Strategy and Its Capabilities.;288
9.6.3;3. Lessons from Implementation;294
9.6.4;4. Concluding Discussion.;299
9.6.5;Acknowledgements;302
9.6.6;References;302
9.7;CHAPTER 26. DISTRIBUTED REVISION OF BELIEF COMMITMENT IN COMPOSITE EXPLANATIONS;304
9.7.1;ABSTRACT;304
9.7.2;1. INTRODUCTION;304
9.7.3;2. REVIEW OF BELIEF UPDATING IN BAYESIAN BELIEF NETWORKS;306
9.7.4;3. BELIEF REVISION IN SINGLY-CONNECTED NETWORKS;309
9.7.5;4. COPING WITH LOOPS;316
9.7.6;5. A MEDICAL DIAGNOSIS EXAMPLE;319
9.7.7;CONCLUSIONS;325
9.7.8;ACKNOWLEDGMENT;326
9.7.9;REFERENCES;326
9.8;CHAPTER 27. A BACKWARDS VIEW FOR ASSESSMENT;330
9.8.1;1. INTRODUCTION;330
9.8.2;2. INFLUENCE DIAGRAMS;331
9.8.3;3. DETERMINISTIC MODELS;332
9.8.4;4. PROBABILISTIC MODELS;333
9.8.5;5. CONCLUSIONS;335
9.8.6;ACKNOWLEDGEMENTS;336
9.8.7;NOTES;336
9.8.8;REFERENCES;336
9.9;CHAPTER 28. PROPAGATION OF BELIEF FUNCTIONS: A DISTRIBUTED APPROACH;338
9.9.1;I. Abstract and Introduction;338
9.9.2;II. Belief Functions;339
9.9.3;III. Qualitative Markov Trees;340
9.9.4;IV. Propagating Belief Functions in Qualitative Markov Trees;342
9.9.5;V. Conclusion;347
9.9.6;VI. Acknowledgements;347
9.9.7;VII. References;348
9.10;CHAPTER 29. GENERALIZING FUZZY LOGIC PROBABILISTIC INFERENCES;350
9.10.1;1. INTRODUCTION;350
9.10.2;2. GENERATING FUNCTIONS FOR BOOLEAN FORMULAS;353
9.10.3;3. COMPUTING PROJECTIONS;364
9.10.4;4. COMPOSITION OF FACES;371
9.10.5;5. A TRANSFORMATION OF ANY BOOLEAN FORMULA TO A FACET OF BN 2;372
9.10.6;6. CONCLUSIONS;373
9.10.7;REFERENCES;373
10;PART IV: APPLICATIONS;376
10.1;CHAPTER 30. THE SUM-AND-LATTICE-POINTS METHOD BASED ON AN EVIDENTIAL-REASONING SYSTEM APPLIED TO THE REAL-TIME VEHICLE GUIDANCE PROBLEM;378
10.1.1;1. INTRODUCTION;378
10.1.2;2. PROBLEM;378
10.1.3;3. RELEVANCE;378
10.1.4;4. THE NECESSITY OF INTRODUCING GENERAL EVIDENTIAL REASONING;379
10.1.5;5. THE GENERAL EVIDENTIAL REASONING MODEL;380
10.1.6;6. APPROACH: SUM-AND-LATTICE-POINTS METHOD;381
10.1.7;7. THE EQUIVALENCE PROOF AND THE TRANSITION GRAPH;382
10.1.8;8. PARALLEL IMPLEMENTATION;382
10.1.9;9. ADVANTAGES;382
10.1.10;10. CONCLUSION;383
10.1.11;REFERENCES;383
10.2;CHAPTER 31. Probabilistic Reasoning About Ship Images;384
10.2.1;1. INTRODUCTION;384
10.2.2;2. REASONING ABOUT SHIP IMAGES;385
10.2.3;3. A SIMPLE PROTOTYPE;387
10.2.4;4. SCALING UP TO MORE REALISTIC PROBLEMS;388
10.2.5;5. THE BMS APPROACH;389
10.2.6;6. CONCLUSIONS;391
10.2.7;Acknowledgements;391
10.2.8;REFERENCES;392
10.3;CHAPTER 32. Information and Multi-Sensor Coordination;394
10.3.1;1 Introduction;394
10.3.2;2 A Team-Theoretic Formulation of Multi-Sensor Systems;395
10.3.3;3 Simulation Studies;400
10.3.4;4 Evaluation and Speculation;403
10.3.5;5 Conclusions and Future Research;405
10.3.6;References;406
10.4;CHAPTER 33. Planning, Scheduling, and Uncertainty in the Sequence of Future Events;408
10.4.1;Abstract;408
10.4.2;1. Statement of the Problem;408
10.4.3;2. Candidate Solutions;409
10.4.4;3. The Proposed Solution;410
10.4.5;4. Empirical Results;411
10.4.6;5. Conclusion;412
10.4.7;References;412
10.5;CHAPTER 34. EVIDENTIAL REASONING IN A COMPUTER VISION SYSTEM;416
10.5.1;ABSTRACT;416
10.5.2;1. INTRODUCTION;416
10.5.3;2. A SET-THEORETICAL EVIDENTIAL REASONING APPROACH TO COMPUTER VISION;417
10.5.4;3. PROGRAMMING RESULTS;421
10.5.5;4. CONCLUSION;425
10.5.6;ACKNOWLEDGEMENTS;425
10.5.7;REFERENCES;425
10.6;CHAPTER 35. BAYES IAN INFERENCE FOR RADAR IMAGERY BASED SURVEILLANCE;426
10.6.1;1. INTRODUCTION;426
10.6.2;2. EVIDENTIAL ACCRUAL;429
10.6.3;3. APPROXIMATE CONFLICT RESOLUTION;431
10.6.4;4. ISSUES;433
10.6.5;ACKNOWLEDGEMENTS;433
10.6.6;REFERENCES;433
10.7;CHAPTER 36. A CAUSAL BAYESIAN MODEL FOR THE DIAGNOSIS OF APPENDICITIS;436
10.7.1;1. INTRODUCTION AND OVERVIEW;436
10.7.2;2. STANDARD BAYESIAN ASSUMPTIONS AND CRITIQUES;437
10.7.3;3. THE CAUSAL BAYESIAN MODEL;438
10.7.4;4. THE KNOWLEDGE ENGINEERING;440
10.7.5;5. IMPLEMENTING THE MODEL;441
10.7.6;6. TESTING THE MODEL;443
10.7.7;7. WORK IN PROGRESS;444
10.7.8;8. CONCLUSIONS;445
10.7.9;REFERENCES;445
10.8;CHAPTER 37. Estimating Uncertain Spatial Relationships in Robotics;448
10.8.1;1 Introduction;448
10.8.2;2 The Stochastic Map;449
10.8.3;3 Reading the Map;454
10.8.4;4 Building the Map;460
10.8.5;5 Developed Example;466
10.8.6;6 Discussion and Conclusions;469
10.8.7;Appendix A;470
10.8.8;Relationships Using Euler Angles;470
10.8.9;Relationships Using Roll, Pitch and Yaw Angles;472
10.8.10;References;473


