The Guide for Developers and Users
Book, English, 860 pages, format (W × H): 158 mm × 249 mm, weight: 1837 g
ISBN: 978-3-527-41339-3
Publisher: WILEY-VCH
Describes the hardware and software required for industrial image acquisition and image processing applications, and covers the latest advances in 3D imaging.
Preface Second Edition xxiii
Preface First Edition xxv
List of Contributors xxvii
1 Processing of Information in the Human Visual System 1
Frank Schaeffel
1.1 Preface 1
1.2 Design and Structure of the Eye 1
1.3 Optical Aberrations and Consequences for Visual Performance 3
1.4 Chromatic Aberration 10
1.5 Neural Adaptation to Monochromatic Aberrations 11
1.6 Optimizing Retinal Processing with Limited Cell Numbers, Space, and Energy 11
1.7 Adaptation to Different Light Levels 12
1.8 Rod and Cone Responses 14
1.9 Spiking and Coding 16
1.10 Temporal and Spatial Performance 17
1.11 ON/OFF Structure, Division of the Whole Illuminance Amplitude 18
1.12 Consequences of the Rod and Cone Diversity on Retinal Wiring 18
1.13 Motion Sensitivity in the Retina 19
1.14 Visual Information Processing in Higher Centers 20
1.14.1 Morphology 21
1.14.2 Functional Aspects – Receptive Field Structures and Cortical Modules 22
1.15 Effects of Attention 23
1.16 Color Vision, Color Constancy, and Color Contrast 23
1.17 Depth Perception 25
1.18 Adaptation in the Visual System to Color, Spatial, and Temporal Contrast 26
1.19 Conclusions 26
Acknowledgements 28
References 28
2 Introduction to Building a Machine Vision Inspection 31
Axel Telljohann
2.1 Preface 31
2.2 Specifying a Machine Vision System 32
2.2.1 Task and Benefit 32
2.2.2 Parts 33
2.2.2.1 Different Part Types 33
2.2.3 Part Presentation 33
2.2.4 Performance Requirements 34
2.2.4.1 Accuracy 34
2.2.4.2 Time Performance 34
2.2.5 Information Interfaces 34
2.2.6 Installation Space 35
2.2.7 Environment 35
2.2.8 Checklist 35
2.3 Designing a Machine Vision System 36
2.3.1 Camera Type 36
2.3.2 Field of View 37
2.3.3 Resolution 38
2.3.3.1 Camera Sensor Resolution 38
2.3.3.2 Spatial Resolution 38
2.3.3.3 Measurement Accuracy 38
2.3.3.4 Calculation of Resolution 39
2.3.3.5 Resolution for a Line Scan Camera 39
2.3.4 Choice of Camera, Frame Grabber, and Hardware Platform 40
2.3.4.1 Camera Model 40
2.3.4.2 Frame Grabber 40
2.3.4.3 Pixel Rate 40
2.3.4.4 Hardware Platform 41
2.3.5 Lens Design 41
2.3.5.1 Focal Length 42
2.3.5.2 Lens Flange Focal Distance 43
2.3.5.3 Extension Tubes 43
2.3.5.4 Lens Diameter and Sensor Size 43
2.3.5.5 Sensor Resolution and Lens Quality 43
2.3.6 Choice of Illumination 44
2.3.6.1 Concept: Maximize Contrast 44
2.3.6.2 Illumination Setups 44
2.3.6.3 Light Sources 45
2.3.6.4 Approach to the Optimum Setup 45
2.3.6.5 Interfering Lighting 46
2.3.7 Mechanical Design 46
2.3.8 Electrical Design 46
2.3.9 Software 46
2.3.9.1 Software Library 47
2.3.9.2 Software Structure 47
2.3.9.3 General Topics 48
2.4 Costs 48
2.5 Words on Project Realization 49
2.5.1 Development and Installation 49
2.5.2 Test Run and Acceptance Test 49
2.5.3 Training and Documentation 50
2.6 Examples 50
2.6.1 Diameter Inspection of Rivets 50
2.6.1.1 Task 50
2.6.1.2 Specification 51
2.6.1.3 Design 51
2.6.2 Tubing Inspection 55
2.6.2.1 Task 55
2.6.2.2 Specification 55
2.6.2.3 Design 56
3 Lighting in Machine Vision 63
Irmgard Jahr
3.1 Introduction 63
3.1.1 Prologue 63
3.1.2 The Involvement of Lighting in the Complex Machine Vision Solution 63
3.2 Demands on Machine Vision Lighting 67
3.3 Light Used in Machine Vision 70
3.3.1 What is Light? Axioms of Light 70
3.3.2 Light and Light Perception 73
3.3.3 Light Sources for Machine Vision 76
3.3.3.1 Incandescent Lamps/Halogen Lamps 77
3.3.3.2 Metal Vapor Lamps 78
3.3.3.3 Xenon Lamps 79
3.3.3.4 Fluorescent Lamps 81
3.3.3.5 LEDs (Light Emitting Diodes) 82
3.3.3.6 Lasers 85
3.3.4 The Light Sources in Comparison 86
3.3.5 Considerations for Light Sources: Lifetime, Aging, Drift 86
3.3.5.1 Lifetime 86
3.3.5.2 Aging and Drift 88
3.4 Interaction of Test Object and Light 91
3.4.1 Risk Factor Test Object 91
3.4.1.1 What Does the Test Object do With the Incoming Light? 92
3.4.1.2 Reflection/Reflectance/Scattering 92
3.4.1.3 Total Reflection 95
3.4.1.4 Transmission/Transmittance 96
3.4.1.5 Absorption/Absorbance 97
3.4.1.6 Diffraction 99
3.4.1.7 Refraction 100
3.4.2 Light Color and Part Color 101
3.4.2.1 Visible Light (VIS) – Monochromatic Light 101
3.4.2.2 Visible Light (VIS) – White Light 103
3.4.2.3 Infrared Light (IR) 104
3.4.2.4 Ultraviolet (UV) Light 106
3.4.2.5 Polarized Light 107
3.5 Basic Rules and Laws of Light Distribution 109
3.5.1 Basic Physical Quantities of Light 110
3.5.2 The Photometric Inverse Square Law 111
3.5.3 The Constancy of Luminance 113
3.5.4 What Light Arrives at the Sensor – Light Transmission Through the Lens 114
3.5.5 Light Distribution of Lighting Components 115
3.5.6 Contrast 118
3.5.7 Exposure 120
3.6 Light Filters 121
3.6.1 Characteristic Values of Light Filters 121
3.6.2 Influences of Light Filters on the Optical Path 123
3.6.3 Types of Light Filters 124
3.6.4 Anti-Reflective Coatings (AR) 126
3.6.5 Light Filters for Machine Vision 127
3.6.5.1 UV Blocking Filter 127
3.6.5.2 Daylight Suppression Filter 128
3.6.5.3 IR Suppression Filter 128
3.6.5.4 Neutral Filter/Neutral Density Filter/Gray Filter 129
3.6.5.5 Polarization Filter 130
3.6.5.6 Color Filters 130
3.6.5.7 Filter Combinations 131
3.7 Lighting Techniques and Their Use 131
3.7.1 How to Find Suitable Lighting? 131
3.7.2 Planning the Lighting Solution – Influence Factors 133
3.7.3 Lighting Systematics 135
3.7.3.1 Directional Properties of the Light 135
3.7.3.2 Arrangement of the Lighting 138
3.7.3.3 Properties of the Illuminated Field 138
3.7.4 The Lighting Techniques in Detail 140
3.7.4.1 Diffuse Bright Field Incident Light (No. 1, Table 3.14) 140
3.7.4.2 Directed Bright Field Incident Light (No. 2, Table 3.14) 142
3.7.4.3 Telecentric Bright Field Incident Light (No. 3, Table 3.14) 143
3.7.4.4 Structured Bright Field Incident Light (No. 4, Table 3.14) 145
3.7.4.5 Diffuse Directed Partial Bright Field Incident Light (Nos. 1 and 2, Table 3.14) 148
3.7.4.6 Diffuse/Directed Dark Field Incident Light (Nos. 5 and 6, Table 3.14) 152
3.7.4.7 The Limits of the Incident Lighting 154
3.7.4.8 Diffuse Bright Field Transmitted Lighting (No. 7, Table 3.14) 155
3.7.4.9 Directed Bright Field Transmitted Lighting (No. 8, Table 3.14) 157
3.7.4.10 Telecentric Bright Field Transmitted Lighting (No. 9, Table 3.14) 158
3.7.4.11 Diffuse/Directed Transmitted Dark Field Lighting (Nos. 10 and 11, Table 3.14) 161
3.7.5 Combined Lighting Techniques 162
3.8 Lighting Control 163
3.8.1 Reasons for Light Control – The Environmental Industrial Conditions 164
3.8.2 Electrical Control 164
3.8.2.1 Stable Operation 164
3.8.2.2 Brightness Control 166
3.8.2.3 Temporal Control: Static-Pulse-Flash 167
3.8.2.4 Some Considerations for the Use of Flash Light 168
3.8.2.5 Temporal and Local Control: Adaptive Lighting 171
3.8.3 Geometrical Control 173
3.8.3.1 Lighting from Large Distances 173
3.8.3.2 Light Deflection 175
3.8.4 Suppression of Ambient and Extraneous Light – Measures for a Stable Lighting 175
3.9 Lighting Perspectives for the Future 176
References 177
4 Optical Systems in Machine Vision 179
Karl Lenhardt
4.1 A Look at the Foundations of Geometrical Optics 179
4.1.1 From Electrodynamics to Light Rays 179
4.1.2 Basic Laws of Geometrical Optics 181
4.2 Gaussian Optics 183
4.2.1 Reflection and Refraction at the Boundary between two Media 183
4.2.2 Linearizing the Law of Refraction – The Paraxial Approximation 185
4.2.3 Basic Optical Conventions 186
4.2.3.1 Definitions for Image Orientations 186
4.2.3.2 Definition of the Magnification Ratio β 186
4.2.3.3 Real and Virtual Objects and Images 187
4.2.3.4 Tilt Rule for the Evaluation of Image Orientations by Reflection 188
4.2.4 Cardinal Elements of a Lens in Gaussian Optics 189
4.2.4.1 Focal Lengths f and f′ 192
4.2.4.2 Convention 192
4.2.5 Thin Lens Approximation 193
4.2.6 Beam-Converging and Beam-Diverging Lenses 193
4.2.7 Graphical Image Constructions 195
4.2.7.1 Beam-Converging Lenses 195
4.2.7.2 Beam-Diverging Lenses 195
4.2.8 Imaging Equations and Their Related Coordinate Systems 195
4.2.8.1 Reciprocity Equation 196
4.2.8.2 Newton’s Equations 197
4.2.8.3 General Imaging Equation 198
4.2.8.4 Axial Magnification Ratio 200
4.2.9 Overlapping of Object and Image Space 200
4.2.10 Focal Length, Lateral Magnification, and the Field of View 200
4.2.11 Systems of Lenses 202
4.2.12 Consequences of the Finite Extension of Ray Pencils 205
4.2.12.1 Effects of Limitations of the Ray Pencils 205
4.2.12.2 Several Limiting Openings 207
4.2.12.3 Characterizing the Limits of Ray Pencils 210
4.2.12.4 Relation to the Linear Camera Model 212
4.2.13 Geometrical Depth of Field and Depth of Focus 214
4.2.13.1 Depth of Field as a Function of the Object Distance p 215
4.2.13.2 Depth of Field as a Function of β 216
4.2.13.3 Hyperfocal Distance 217
4.2.13.4 Permissible Size for the Circle of Confusion d′ 218
4.2.14 Laws of Central Projection – Telecentric System 219
4.2.14.1 Introduction to the Laws of Perspective 219
4.2.14.2 Central Projection from Infinity – Telecentric Perspective 228
4.3 Wave Nature of Light 235
4.3.1 Introduction 235
4.3.2 Rayleigh–Sommerfeld Diffraction Integral 236
4.3.3 Further Approximations to the Huygens–Fresnel Principle 238
4.3.3.1 Fresnel’s Approximation 239
4.3.4 Impulse Response of an Aberration-Free Optical System 241
4.3.4.1 Case of Circular Aperture, Object Point on the Optical Axis 243
4.3.5 Intensity Distribution in the Neighborhood of the Geometrical Focus 244
4.3.5.1 Special Cases 246
4.3.6 Extension of the Point Spread Function in a Defocused Image Plane 248
4.3.7 Consequences for the Depth of Field Considerations 249
4.3.7.1 Diffraction and Permissible Circle of Confusion 249
4.3.7.2 Extension of the Point Spread Function at the Limits of the Depth of Focus 250
4.3.7.3 Useful Effective f-Number 251
4.4 Information Theoretical Treatment of Image Transfer and Storage 252
4.4.1 Physical Systems as Linear Invariant Filters 252
4.4.1.1 Invariant Linear Systems 255
4.4.1.2 Note to the Representation of Harmonic Waves 259
4.4.2 Optical Transfer Function (OTF) and the Meaning of Spatial Frequency 260
4.4.2.1 Note on the Relation Between the Elementary Functions in the Two Representation Domains 261
4.4.3 Extension to the Two-Dimensional Case 261
4.4.3.1 Interpretation of Spatial Frequency Components (r, s) 261
4.4.3.2 Reduction to One-Dimensional Representations 262
4.4.4 Impulse Response and MTF for Semiconductor Imaging Devices 265
4.4.5 Transmission Chain 267
4.4.6 Aliasing Effect and the Space-Variant Nature of Aliasing 267
4.4.6.1 Space-Variant Nature of Aliasing 274
4.5 Criteria for Image Quality 277
4.5.1 Gaussian Data 277
4.5.2 Overview on Aberrations of the Third Order 277
4.5.2.1 Monochromatic Aberrations of the Third Order (Seidel Aberrations) 278
4.5.2.2 Chromatic Aberrations 278
4.5.3 Image Quality in the Space Domain: PSF, LSF, ESF, and Distortion 278
4.5.3.1 Distortion 280
4.5.4 Image Quality in the Spatial Frequency Domain: MTF 281
4.5.4.1 Parameters that Influence the Modulation Transfer Function 282
4.5.5 Other Image Quality Parameters 283
4.5.5.1 Relative Illumination (Relative Irradiance) 283
4.5.5.2 Deviation from Telecentricity (for Telecentric Lenses only) 284
4.5.6 Manufacturing Tolerances and Image Quality 284
4.5.6.1 Measurement Errors due to Mechanical Inaccuracies of the Camera System 285
4.6 Practical Aspects: How to Specify Optics According to the Application Requirements? 285
4.6.1 Example for the Calculation of an Imaging Constellation 287
References 289
5 Camera Calibration 291
Robert Godding
5.1 Introduction 291
5.2 Terminology 292
5.2.1 Camera, Camera System 292
5.2.2 Coordinate Systems 292
5.2.3 Interior Orientation and Calibration 293
5.2.4 Exterior and Relative Orientation 293
5.2.5 System Calibration 293
5.3 Physical Effects 293
5.3.1 Optical System 293
5.3.2 Camera and Sensor Stability 294
5.3.3 Signal Processing and Transfer 294
5.4 Mathematical Calibration Model 295
5.4.1 Central Projection 295
5.4.2 Camera Model 295
5.4.3 Focal Length and Principal Point 297
5.4.4 Distortion and Affinity 297
5.4.5 Radial Symmetrical Distortion 297
5.4.6 Radial Asymmetrical and Tangential Distortion 299
5.4.7 Affinity and Nonorthogonality 299
5.4.8 Variant Camera Parameters 299
5.4.9 Sensor Flatness 301
5.4.10 Other Parameters 301
5.5 Calibration and Orientation Techniques 302
5.5.1 In the Laboratory 302
5.5.2 Using Bundle Adjustment to Determine Camera Parameters 302
5.5.2.1 Calibration Based Exclusively on Image Information 302
5.5.2.2 Calibration and Orientation with Additional Object Information 304
5.5.2.3 Extended System Calibration 307
5.5.3 Other Techniques 307
5.6 Verification of Calibration Results 308
5.7 Applications 309
5.7.1 Applications with Simultaneous Calibration 309
5.7.2 Applications with Precalibrated Cameras 311
5.7.2.1 Tube Measurement within a Measurement Cell 311
5.7.2.2 Online Measurements in the Field of Car Safety 312
5.7.2.3 High Resolution 3D Scanning with White Light Scanners 312
5.7.2.4 Other Applications 313
References 314
6 Camera Systems in Machine Vision 317
Horst Mattfeldt
6.1 Camera Technology 317
6.1.1 History in Brief 317
6.1.2 Machine Vision versus Closed Circuit TeleVision (CCTV) 317
6.2 Sensor Technologies 319
6.2.1 Spatial Differentiation: 1D and 2D 319
6.2.2 CCD Technology 320
6.2.2.1 Interline Transfer 321
6.2.2.2 Progressive Scan Interline Transfer 321
6.2.2.3 Interlaced Scan Readout 322
6.2.2.4 Enhancing Frame Rate by Multitap Sensors 324
6.2.2.5 SONY HAD Technology 325
6.2.2.6 SONY SuperHAD (II) and ExViewHAD (II) Technology 325
6.2.2.7 CCD Image Artifacts 326
6.2.2.8 Blooming 326
6.2.2.9 Smear 326
6.2.3 CMOS Image Sensor 328
6.2.3.1 Advantages of CMOS Sensor 328
6.2.3.2 CMOS Sensor Shutter Concepts 331
6.2.3.3 Performance Comparison of CMOS versus CCD 336
6.2.3.4 Integration Complexity of CCD versus CMOS Camera Technology 336
6.2.3.5 CMOS Sensor Sensitivity Enhancements 337
6.2.4 MATRIX VISION Available Cameras 338
6.2.4.1 Why So Many Different Models? How to Choose Among These? 338
6.2.4.2 Resolution and Video Standards 338
6.2.4.3 Sensor Sizes and Dimensions 344
6.3 Block Diagrams and Their Description 344
6.3.1