E-book, English, Volume 9, 422 pages, Web PDF
Levitt / Kanal / Lemmer Uncertainty in Artificial Intelligence 4
1st edition, 2014
ISBN: 978-1-4832-9654-8
Publisher: Elsevier Science & Techn.
Format: PDF
Copy protection: 1 - PDF watermark
Series: Machine Intelligence and Pattern Recognition
This volume clearly illustrates the current relationship between uncertainty and AI. It has been said that research in AI revolves around five basic questions asked relative to some particular domain: What knowledge is required? How can this knowledge be acquired? How can it be represented in a system? How should this knowledge be manipulated in order to produce intelligent behavior? How can that behavior be explained? This volume addresses all of these questions. Viewed from the perspective of how uncertainty bears on these basic questions, the book divides naturally into four sections, which together highlight both the strengths and the weaknesses of the current state of the relationship between uncertainty and AI.
Authors/Editors
Further information & material
1;Front Cover;1
2;Uncertainty in Artificial Intelligence 4;4
3;Copyright Page;5
4;Table of Contents;8
5;PREFACE;6
6;LIST OF CONTRIBUTORS;12
7;Section I: CAUSAL MODELS;14
7.1;CHAPTER 1. ON THE LOGIC OF CAUSAL MODELS;16
7.1.1;1. INTRODUCTION AND SUMMARY OF RESULTS;16
7.1.2;2. SOUNDNESS AND COMPLETENESS;18
7.1.3;3. EXTENSIONS AND ELABORATIONS;22
7.1.4;ACKNOWLEDGMENT;24
7.1.5;REFERENCES;24
7.1.6;APPENDIX;26
7.2;Chapter 2. Process, Structure, and Modularity in Reasoning with Uncertainty;28
7.2.1;Abstract;28
7.2.2;1 Introduction;28
7.2.3;2 Related Research;29
7.2.4;3 Hybrid Uncertainty Management;29
7.2.5;4 Summary;36
7.2.6;References;37
7.3;Chapter 3. Probabilistic Causal Reasoning;40
7.3.1;Abstract;40
7.3.2;1 Introduction;41
7.3.3;2 Causal Theories;41
7.3.4;3 Probabilistic Projection;44
7.3.5;4 The Algorithm;47
7.3.6;5 Acquiring Rules;51
7.3.7;6 Conclusions;53
7.3.8;References;54
7.4;Chapter 4. Generating Decision Structures and Causal Explanations For Decision Making;56
7.4.1;ABSTRACT;56
7.4.2;1. INTRODUCTION;56
7.4.3;2. LEARNING A DECISION STRUCTURE;57
7.4.4;3. CAUSAL EXPLANATION IN A DETERMINISTIC UNIVERSE WITH PERFECT INFORMATION;61
7.4.5;4. CAUSAL EXPLANATION IN AN UNCERTAIN UNIVERSE;65
7.4.6;5. TESTING THE THEORY;68
7.4.7;6. CONCLUSIONS AND FUTURE RESEARCH;68
7.4.8;7. ACKNOWLEDGEMENTS;69
7.4.9;REFERENCES;69
7.5;Chapter 5. Control of Problem Solving: Principles and Architecture;72
7.5.1;1 Introduction;72
7.5.2;2 Decision-Theoretic Selection;73
7.5.3;3 The Architecture;76
7.5.4;4 Conclusion;79
7.5.5;5 Acknowledgements;80
7.5.6;References;80
7.6;CHAPTER 6. CAUSAL NETWORKS: SEMANTICS AND EXPRESSIVENESS;82
7.6.1;1. INTRODUCTION;82
7.6.2;2. UNDIRECTED GRAPHS;83
7.6.3;3. DIRECTED-ACYCLIC GRAPHS (DAGS);84
7.6.4;4. FUNCTIONAL DEPENDENCIES;88
7.6.5;5. CONCLUSIONS;88
7.6.6;ACKNOWLEDGMENT;88
7.6.7;REFERENCES;89
8;Section II: UNCERTAINTY CALCULI AND COMPARISONS;90
8.1;Part 1: Uncertainty Calculi;92
8.1.1;CHAPTER 7. STOCHASTIC SENSITIVITY ANALYSIS USING FUZZY INFLUENCE DIAGRAMS;92
8.1.1.1;1. INTRODUCTION AND OBJECTIVE;92
8.1.1.2;2. BAYESIAN FUZZY PROBABILITIES: BASICS;94
8.1.1.3;3. FUZZY PROBABILISTIC INFERENCE;97
8.1.1.4;4. SOLVING DECISION PROBLEMS;99
8.1.1.5;5. CONCLUSIONS;102
8.1.1.6;ACKNOWLEDGEMENTS;103
8.1.1.7;REFERENCES;103
8.1.2;CHAPTER 8. A LINEAR APPROXIMATION METHOD FOR PROBABILISTIC INFERENCE;106
8.1.2.1;1. INTRODUCTION;106
8.1.2.2;2. NOTATION AND BASIC FRAMEWORK;108
8.1.2.3;3. VARIABLE TRANSFORMATIONS;110
8.1.2.4;4. EXPERIMENTAL OBSERVATIONS;112
8.1.2.5;5. LINEAR APPROXIMATION ALGORITHM;113
8.1.2.6;6. CONCLUSIONS;114
8.1.2.7;ACKNOWLEDGEMENTS;115
8.1.2.8;REFERENCES;115
8.1.3;Chapter 9. Minimum Cross Entropy Reasoning in Recursive Causal Networks;118
8.1.3.1;1 Introduction;118
8.1.3.2;2 The Principle of Minimum Cross Entropy;120
8.1.3.3;3 Recursive Causal Networks;122
8.1.3.4;4 Reasoning with Multiple Uncertain Evidence;125
8.1.3.5;5 Other Important Issues;128
8.1.3.6;6 Conclusions;129
8.1.3.7;Acknowledgement;130
8.1.3.8;References;130
8.1.4;CHAPTER 10. PROBABILISTIC SEMANTICS AND DEFAULTS;134
8.1.4.1;1. INTRODUCTION;134
8.1.4.2;2. WHAT'S IN A DEFAULT?;135
8.1.4.3;3. INFERENCE GRAPHS;136
8.1.4.4;4. THE FAVOURS RELATION;138
8.1.4.5;5. EXAMPLES;140
8.1.4.6;6. CONCLUSIONS;141
8.1.4.7;ACKNOWLEDGEMENTS;142
8.1.4.8;REFERENCES;142
8.1.5;CHAPTER 11. Modal Logics of Higher-Order Probability;146
8.1.5.1;1 Introduction;146
8.1.5.2;2 Probability as a Modal Operator;147
8.1.5.3;3 Flat Probability Models;148
8.1.5.4;4 Coherence Principles;150
8.1.5.5;5 Staged Probability Models;152
8.1.5.6;6 Relation to Modal Logic;155
8.1.5.7;7 Summary and Future Research;158
8.1.5.8;Acknowledgements;159
8.1.5.9;Notes;159
8.1.5.10;References;160
8.1.6;CHAPTER 12. A GENERAL NON-PROBABILISTIC THEORY OF INDUCTIVE REASONING;162
8.1.6.1;1. INTRODUCTION;162
8.1.6.2;2. THE THEORY;163
8.1.6.3;3. A COMPARISON WITH PROBABILITY THEORY;165
8.1.6.4;4. OTHER COMPARISONS;167
8.1.6.5;NOTES;169
8.1.6.6;REFERENCES;170
8.1.7;CHAPTER 13. EPISTEMOLOGICAL RELEVANCE AND STATISTICAL KNOWLEDGE;172
8.1.7.1;1. BACKGROUND;172
8.1.7.2;2. ASSUMPTIONS;173
8.1.7.3;3. INTERFERENCE I;175
8.1.7.4;4. INTERFERENCE II;176
8.1.7.5;5. INTERFERENCE III;177
8.1.7.6;6. DISCUSSION;178
8.1.7.7;7. INEXACT KNOWLEDGE;179
8.1.7.8;8. COMPUTATION;180
8.1.7.9;9. CONCLUSIONS;180
8.1.7.10;REFERENCES;181
8.1.8;CHAPTER 14. AXIOMS FOR PROBABILITY AND BELIEF-FUNCTION PROPAGATION;182
8.1.8.1;1. INTRODUCTION;182
8.1.8.2;2. SOME CONCEPTS FROM GRAPH THEORY;183
8.1.8.3;3. AN AXIOMATIC FRAMEWORK FOR LOCAL COMPUTATION;186
8.1.8.4;4. PROBABILITY PROPAGATION;202
8.1.8.5;5. BELIEF-FUNCTION PROPAGATION;207
8.1.8.6;ACKNOWLEDGEMENTS;209
8.1.8.7;REFERENCES;209
8.1.9;Chapter 15. A Summary of A New Normative Theory of Probabilistic Logic;212
8.1.9.1;ABSTRACT;212
8.1.9.2;What is Probabilistic Logic?;214
8.1.9.3;New Axioms for Probabilistic Logic;215
8.1.9.4;Possible Interpretations for the Set P;216
8.1.9.5;When Must P Be Like The Real Numbers?;217
8.1.9.6;References;218
8.1.10;CHAPTER 16. HIERARCHICAL EVIDENCE AND BELIEF FUNCTIONS;220
8.1.10.1;1. INTRODUCTION;220
8.1.10.2;2. EXAMPLE;221
8.1.10.3;3. THREE WAYS OF DEFINING JOINT BELIEFS;223
8.1.10.4;4. SO WHICH METHOD DO I USE?;225
8.1.10.5;5. "CAVEAT MODELOR";226
8.1.10.6;REFERENCES;227
8.1.11;Chapter 17. On Probability Distributions Over Possible Worlds;230
8.1.11.1;Abstract;230
8.1.11.2;1 Introduction;231
8.1.11.3;2 The Propositional Case;231
8.1.11.4;3 First-Order Languages;232
8.1.11.5;4 The Representation of Statistical Knowledge;234
8.1.11.6;5 The Representation of Defaults;236
8.1.11.7;6 Conclusions;238
8.1.11.8;7 Acknowledgement;238
8.1.11.9;References;238
8.1.12;Chapter 18. A Framework of Fuzzy Evidential Reasoning;240
8.1.12.1;1 Introduction;240
8.1.12.2;2 Basics of the Dempster-Shafer Theory;241
8.1.12.3;3 Previous Work;242
8.1.12.4;4 Our Approach;243
8.1.12.5;5 Conclusions;251
8.1.12.6;Acknowledgements;252
8.1.12.7;References;252
8.2;Part 2: Comparisons;254
8.2.1;Chapter 19. Parallel Belief Revision;254
8.2.1.1;Abstract;254
8.2.1.2;1 Introduction;254
8.2.1.3;2 Spohnian Belief Revision;255
8.2.1.4;3 Influence Diagrams and Spohnian Conditional Independence;256
8.2.1.5;4 Soundness and Completeness Results;258
8.2.1.6;5 Spohnian Networks;259
8.2.1.7;6 Updating on a Single Piece of Uncertain Evidence;260
8.2.1.8;7 Simultaneous Updating on Multiple Evidence Events;261
8.2.1.9;8 Updating on Multiple Pieces of Uncertain Evidence;263
8.2.1.10;9 Discussion;263
8.2.1.11;References;264
8.2.2;CHAPTER 20. EVIDENTIAL REASONING COMPARED IN A NETWORK USAGE PREDICTION TESTBED: PRELIMINARY REPORT;266
8.2.2.1;1 TESTBED;266
8.2.2.2;2 BETTING;268
8.2.2.3;3 UNCERTAINTY CALCULI;268
8.2.2.4;4 SOME PRELIMINARY DATA;271
8.2.2.5;5 BIAS OF NET FOR REPEATED CHOICES;278
8.2.2.6;6 FUTURE WORK;279
8.2.2.7;7 REFERENCES;282
8.2.3;Chapter 21. A Comparison of Decision Analysis and Expert Rules for Sequential Diagnosis;284
8.2.3.1;Abstract;284
8.2.3.2;1. Introduction;284
8.2.3.3;2. Decision analytic approach;286
8.2.3.4;3. Experiment;287
8.2.3.5;4. Discussion;292
8.2.3.6;5. Conclusions;293
8.2.3.7;Acknowledgements;293
8.2.3.8;References;293
8.2.4;Chapter 22. An Empirical Comparison of Three Inference Methods;296
8.2.4.1;1 Introduction;296
8.2.4.2;2 The Domain;297
8.2.4.3;3 The Inference Methods;297
8.2.4.4;4 The Evaluation Procedure;301
8.2.4.5;5 Utility Assessment;306
8.2.4.6;6 Details of the Experiment;308
8.2.4.7;7 Results;309
8.2.4.8;8 Discussion;311
8.2.4.9;9 Future Work;314
8.2.4.10;Acknowledgments;314
8.2.4.11;References;314
8.2.5;CHAPTER 23. MODELING UNCERTAIN AND VAGUE KNOWLEDGE IN POSSIBILITY AND EVIDENCE THEORIES;316
8.2.5.1;1. INTRODUCTION;316
8.2.5.2;2. REPRESENTING UNCERTAINTY;316
8.2.5.3;3. A SHORT DISCUSSION OF COX'S AXIOMATIC FRAMEWORK FOR PROBABILITY;323
8.2.5.4;4. MODELING VAGUENESS;324
8.2.5.5;CONCLUSION;329
8.2.5.6;REFERENCES;330
8.2.6;CHAPTER 24. PROBABILISTIC INFERENCE AND NON-MONOTONIC INFERENCE;332
8.2.6.1;1. INTRODUCTION;332
8.2.6.2;2. McCARTHY AND HAYES;333
8.2.6.3;3. NON-MONOTONIC INFERENCE;334
8.2.6.4;4. THE CANONICAL EXAMPLES;335
8.2.6.5;5. CONSISTENCY;336
8.2.6.6;6. CONCLUSIONS;338
8.2.6.7;REFERENCES;338
8.2.7;Chapter 25. Multiple decision trees;340
8.2.7.1;1. Introduction;340
8.2.7.2;2. Overview of ID3;341
8.2.7.3;3. Background theory;342
8.2.7.4;4. Experiments;343
8.2.7.5;5. Results;344
8.2.7.6;6. Conclusion;346
8.2.7.7;Acknowledgement;347
8.2.7.8;References;347
9;Section III: KNOWLEDGE ACQUISITION AND EXPLANATION;350
9.1;Chapter 26. KNET: Integrating Hypermedia and Normative Bayesian Modeling;352
9.1.1;Abstract;352
9.1.2;1. Motivation;352
9.1.3;2. Knowledge engineering in the Bayesian framework;356
9.1.4;3. Using the Bayesian model;358
9.1.5;4. Applications;358
9.1.6;5. Future work;360
9.1.7;Acknowledgments;361
9.1.8;References;361
9.2;CHAPTER 27. GENERATING EXPLANATIONS OF DECISION MODELS BASED ON AN AUGMENTED REPRESENTATION OF UNCERTAINTY;364
9.2.1;1. Introduction;364
9.2.2;2. Motivation for Using a Decision Network Model;365
9.2.3;3. Augmenting the Uncertainty Representation;368
9.2.4;4. Defining a Generic Model;370
9.2.5;5. Efficient Generation of Patient-Specific Models;372
9.2.6;6. Computer-Generated Explanation;373
9.2.7;7. Conclusion;376
9.2.8;Acknowledgements;377
9.2.9;References;377
10;Section IV: APPLICATIONS;380
10.1;CHAPTER 28. INDUCTION AND UNCERTAINTY MANAGEMENT TECHNIQUES APPLIED TO VETERINARY MEDICAL DIAGNOSIS;382
10.1.1;ABSTRACT;382
10.1.2;1.0 INTRODUCTION;382
10.1.3;2.1 CLASSICAL STATISTICAL METHODS;384
10.1.4;3.1 OVERVIEW;387
10.1.5;4.0 CONCLUSIONS AND FURTHER WORK;392
10.1.6;Acknowledgements;392
10.1.7;References;392
10.2;Chapter 29. Predicting the Likely Behaviors of Continuous Nonlinear Systems in Equilibrium;396
10.2.1;1 Introduction;396
10.2.2;2 Other Techniques;397
10.2.3;3 Simple Example Using PV R;398
10.2.4;4 SAB: Overview;400
10.2.5;5 PV R Example Revisited;401
10.2.6;6 SAB: Details;402
10.2.7;7 Discussion;405
10.2.8;A Some Region Probability Bounds Derivations;405
10.2.9;Acknowledgments;407
10.2.10;References;408
10.3;Chapter 30. The structure of Bayes networks for visual recognition;410
10.3.1;I. The problem;410
10.3.2;II. Nature of the vision problem;410
10.3.3;III. Issues in formulation;411
10.3.4;IV. Single versus multiply connected networks;414
10.3.5;V. Further directions;417
10.3.6;References;417
10.4;Chapter 31. Utility-Based Control for Computer Vision;420
10.4.1;1 Introduction;420
10.4.2;2 Bayesian Network for Evidential Accrual;420
10.4.3;3 Computing Values for Inference Actions;423
10.4.4;4 Control of the Dynamic Influence Diagram;426
10.4.5;5 Examples;427
10.4.6;6 Conclusions;433
10.4.7;7 Acknowledgments;434
10.4.8;References;434