E-book, English, 402 pages
Steinicke / Visell / Campos Human Walking in Virtual Environments
1st edition, 2013
ISBN: 978-1-4419-8432-6
Publisher: Springer
Format: PDF
Copy protection: PDF watermark
Perception, Technology, and Applications
This book surveys past and recent developments in human walking in virtual environments, with an emphasis on human self-motion perception, the multisensory nature of the walking experience, conceptual design approaches, current technologies, and applications. Virtual reality and movement-simulation systems are becoming increasingly popular and accessible to a wide variety of research fields and applications. While simulation technologies have historically focused on developing realistic, interactive visual environments, it is becoming increasingly clear that our everyday interactions are highly multisensory. Investigators are therefore beginning to recognize the critical importance of developing and validating locomotor interfaces that allow for realistic, natural behaviours. The book presents an overview of what is currently understood about human perception and performance when moving in virtual environments, and situates this work within the broader scientific and engineering literature on human locomotion and locomotion interfaces. The contents include scientific background and recent empirical findings related to biomechanics, self-motion perception, and physical interactions. The book also discusses conceptual approaches to multimodal sensing, display systems, and interaction for walking in real and virtual environments. Finally, it presents current and emerging applications in areas such as gait and posture rehabilitation, gaming, sports, and architectural design.
Frank Steinicke is a professor of computer science in media at the Department of Computer Science and the Department of Human-Computer-Media at the University of Würzburg. He received his Ph.D. in computer science from the University of Münster.

Yon Visell is an assistant professor at Drexel University in Philadelphia, in the Department of Electrical and Computer Engineering. His research concerns engineering and scientific aspects of haptic and multisensory interaction in virtual and augmented reality environments.

Dr. Campos is a scientist at Toronto Rehab, where her research focuses on multisensory integration, perception-action coupling, and visuomotor control.

Anatole Lécuyer is a senior researcher at Inria in Rennes, France. His research concerns virtual reality, 3D user interfaces, haptic feedback, and brain-computer interfaces.
Authors/Editors
Further Information & Material
Foreword  5
Contents  9

Part I: Perception  11

1 Sensory Contributions to Spatial Knowledge of Real and Virtual Environments  12
  1.1 External Sensory Information  14
  1.2 Internal Sensory Information  16
  1.3 Efferent Sources of Information  17
  1.4 Relative Influence of External and Internal Sensory Information  19
    1.4.1 Sensory Contributions in the Real World  19
    1.4.2 Sensory Contributions in Virtual Environments  21
  1.5 Conclusion  30
  References  31

2 Perceptual and Cognitive Factors for Self-Motion Simulation in Virtual Environments: How Can Self-Motion Illusions ("Vection") Be Utilized?  36
  2.1 Introduction: The Challenge of Walking in VR  36
  2.2 Visually Induced Self-Motion Illusions  38
    2.2.1 Circular Vection  39
    2.2.2 Linear Vection  40
  2.3 Self-Motion Sensation from Walking  41
  2.4 Interaction of Walking and Other Modalities for Vection  42
    2.4.1 Walking and Auditory Cues  42
    2.4.2 Walking and Visual Cues  42
  2.5 Further Cross-Modal Effects on Self-Motion Perception in VR  45
  2.6 Simulator Sickness and Vection in VR  47
  2.7 Perceptual Versus Cognitive Contributions to Vection  47
    2.7.1 Lower-Level and Bottom-Up Contributions to Vection  47
    2.7.2 Cognitive and Top-Down Contributions to Vection  49
  2.8 Does Vection Improve Spatial Updating and Perspective Switches?  52
  2.9 Conclusions and Conceptual Framework  53
  2.10 Outlook  56
  References  57

3 Biomechanics of Walking in Real World: Naturalness we Wish to Reach in Virtual Reality  64
  3.1 Introduction  64
  3.2 Kinematics of Human Walking  65
    3.2.1 Global Description  66
    3.2.2 Joint Kinematics  71
  3.3 Dynamics of Human Walking  73
    3.3.1 Forces and Torques Description  74
    3.3.2 Energetics of Human Walking  76
    3.3.3 Balance  78
  3.4 Comparison Between Ground and Treadmill Walking  80
  3.5 Conclusion  81
  References  82

4 Affordance Perception and the Visual Control of Locomotion  87
  4.1 Introduction  87
  4.2 Taking Body Dimensions and Movement Capabilities into Account  88
    4.2.1 Theoretical Approach  88
    4.2.2 Affordance Perception and the Control of Locomotion  89
    4.2.3 Eyeheight-Scaled Information  89
  4.3 Perceiving Body-Scaled Affordances  91
  4.4 Perceiving Action-Scaled Affordances  94
    4.4.1 The Information-Based Approach  94
  4.5 Testing the Information-Based Approach  95
    4.5.1 An Alternative Account  97
  4.6 Testing the Affordance-Based Approach  99
  4.7 Extensions of the Affordance-Based Approach  101
  4.8 Affordance Perception and the Continuous Control of Locomotion  102
  4.9 Conclusions  103
  References  105

5 The Effect of Translational and Rotational Body-Based Information on Navigation  107
  5.1 Introduction  107
  5.2 Applications of Virtual Environments  108
  5.3 Ecological Validity  109
  5.4 The Effect of Body-Based Information  110
    5.4.1 Review Framework  111
    5.4.2 Studies Investigating the Effect of Body-Based Information  113
  5.5 Summary and Conclusions for VE Applications  116
    5.5.1 Model-Scale Environments  116
    5.5.2 Small-Scale Environments  117
    5.5.3 Large-Scale Environments  118
    5.5.4 Further Research  118
  References  119

6 Enabling Unconstrained Omnidirectional Walking Through Virtual Environments: An Overview of the CyberWalk Project  121
  6.1 Introduction  122
  6.2 Gait and Biomechanics  124
    6.2.1 Natural Unconstrained Walking  124
    6.2.2 Overground Versus Treadmill Walking  128
    6.2.3 Potential Implications for CyberWalk  132
  6.3 Multisensory Self-Motion Perception  132
    6.3.1 Multisensory Nature of Walking  133
    6.3.2 Integration of Vestibular and Proprioceptive Information in Human Locomotion  135
    6.3.3 "Vection" from Walking  139
    6.3.4 Potential Implications for CyberWalk  140
  6.4 Large Scale Navigation  141
    6.4.1 Potential Implications for CyberWalk  144
  6.5 Putting it All Together: The CyberWalk Platform  144
  References  147

Part II: Technologies  153

7 Displays and Interaction for Virtual Travel  154
  7.1 Introduction  154
  7.2 Display Systems  156
  7.3 Interaction Devices  161
  7.4 Travel Techniques  173
    7.4.1 Travel as a Control Task  173
    7.4.2 Direct Self Motion Control Techniques  177
    7.4.3 Indirect Self Motion Control Techniques  178
    7.4.4 Scene Motion Techniques  178
    7.4.5 Other Control Inputs  179
  7.5 Conclusion  179
  References  180

8 Sensing Human Walking: Algorithms and Techniques for Extracting and Modeling Locomotion  183
  8.1 Introduction  183
  8.2 Sensing and Interpreting Global Gait Parameters  184
    8.2.1 Step Length and Frequency  184
    8.2.2 Curvature and Non-linear Walking  185
    8.2.3 Gait Asymmetry and Regularity  189
  8.3 Joint Angles, Torques and Muscle Activity  189
    8.3.1 Measuring Joint Displacements  189
    8.3.2 Measuring Joint Angles  192
    8.3.3 Estimating Joint Torques with Inverse Dynamics  194
  8.4 Isolated Segments  194
  8.5 Global System and Controllers  195
  8.6 Conclusion About Inverse Dynamic Approaches  196
    8.6.1 Measuring or Estimating Muscle Activities  197
  8.7 Conclusion  201
  References  201

9 Locomotion Interfaces  204
  9.1 Introduction  204
  9.2 Sliding Shoes  206
    9.2.1 Virtual Perambulator  206
    9.2.2 Powered Shoes  207
    9.2.3 String Walker  208
    9.2.4 Evacuation Simulator Using the Virtual Perambulator  209
  9.3 Treadmills  210
    9.3.1 Related Works in Treadmill-Based Locomotion Interface  210
    9.3.2 Torus Treadmill  211
    9.3.3 Control Algorithm of the Torus Treadmill  213
    9.3.4 Effects of Walking on the Torus Treadmill  214
    9.3.5 Limitation of Torus Treadmill  214
  9.4 Foot Pad  215
    9.4.1 Related Works in Foot-Pad-Based Locomotion Interface  215
    9.4.2 GaitMaster  215
    9.4.3 Control Algorithm of the GaitMaster  218
    9.4.4 GaitMaster for Walking Rehabilitation  219
  9.5 Robotic Tiles  220
    9.5.1 The CirculaFloor  220
    9.5.2 User Study of the Robot Tile Approach  220
  9.6 Conclusion  223
  References  223

10 Implementing Walking in Virtual Environments  225
  10.1 Introduction  225
  10.2 Virtual Reality Workspaces  227
  10.3 Isometric Virtual Walking  229
    10.3.1 One-to-One Mappings  229
    10.3.2 Reference Coordinates  230
    10.3.3 Virtual Traveling  231
  10.4 Nonisometric Virtual Walking  231
    10.4.1 User-Centric Coordinates  232
    10.4.2 Scaling Self-Motions  234
    10.4.3 Redirected Walking  237
  10.5 Conclusion  241
  References  242

11 Stepping-Driven Locomotion Interfaces  245
  11.1 Designing Stepping-Driven Locomotion for Virtual Environment Systems  245
  11.2 Walking-in-Place Interfaces  248
    11.2.1 Setting Speed: Interpreting Stepping Gestures  248
    11.2.2 Setting Direction for Walking-in-Place  253
    11.2.3 The Future for Walking-in-Place Interfaces  255
  11.3 Real-Walking Interfaces  256
    11.3.1 Manipulating Speed  256
    11.3.2 Manipulating Direction  258
    11.3.3 Reorientation Techniques  262
    11.3.4 The Future for Real-Walking Interfaces for IVE Systems  264
  References  264

12 Multimodal Rendering of Walking Over Virtual Grounds  267
  12.1 Introduction  268
  12.2 Auditory Rendering  269
    12.2.1 Introduction  269
    12.2.2 Footstep Sound Synthesis  271
    12.2.3 Walking Sounds and Soundscape Reproduction  276
    12.2.4 Footstep Sound Design Toolkits  278
  12.3 From Haptic to Multimodal Rendering  279
    12.3.1 Introduction  279
    12.3.2 Touch Sensation in the Feet  282
    12.3.3 Multimodal Displays  285
    12.3.4 Display Configurations  286
    12.3.5 Interactive Scenarios  290
  12.4 Conclusion  294
  References  294

Part III: Applications and Interactive Techniques  300

13 Displacements in Virtual Reality for Sports Performance Analysis  301
  13.1 Introduction  301
    13.1.1 Why Virtual Reality for Sports?  302
    13.1.2 Requirements for Using Virtual Reality for Sports  306
    13.1.3 Some Applications of Virtual Reality for Sports  307
  13.2 Case Study 1: Deceptive Movements in Rugby  308
    13.2.1 Setup  308
    13.2.2 Method  309
    13.2.3 Results  310
    13.2.4 Discussion  310
  13.3 Case Study 2: Wall Configuration for Soccer Free Kicks  312
    13.3.1 Setup  313
    13.3.2 Methods  314
    13.3.3 Results  315
    13.3.4 Discussion  316
  13.4 Conclusion  316
  References  317

14 Redirected Walking in Mixed Reality Training Applications  321
  14.1 Locomotion in Virtual Environments  322
  14.2 Redirected Walking  323
  14.3 Practical Considerations for Training Environments  324
    14.3.1 Impact of Redirection on Spatial Orientation  324
    14.3.2 Augmenting Effectiveness of Redirected Walking  325
    14.3.3 Designing Experiences for Redirected Walking  327
  14.4 Redirection in Mixed Reality Environments  328
  14.5 Challenges and Future Directions  330
  References  331

15 VR-Based Assessment and Rehabilitation of Functional Mobility  334
  15.1 VR-Based Assessment and Rehabilitation to Promote Functional Mobility  337
    15.1.1 VR-Based Assessment and Rehabilitation Following Motor Dysfunction  337
    15.1.2 VR-Based Assessment and Rehabilitation Following Visual Dysfunction  340
  15.2 Dynamical Disease and VR-Based Assessment  342
    15.2.1 Dynamic Measures for Assessing Local Functional Mobility Using VR  343
    15.2.2 Dynamic Measures for Assessing Global Functional Mobility Using VR  345
  15.3 Conclusion  347
  References  348

16 Full Body Locomotion with Video Game Motion Controllers  352
  16.1 Introduction  352
  16.2 Video Game Motion Controllers  353
    16.2.1 Wiimote  354
    16.2.2 PlayStation Move  358
    16.2.3 Microsoft Kinect  360
  16.3 Dealing with the Data  363
    16.3.1 Understanding the Data Coming from the Device  364
    16.3.2 Research the Algorithm Options Suited for the Data  365
    16.3.3 Modifying the Models to Address Error and Uncertainty  369
    16.3.4 Applying All the Data Toward a Solution  370
  16.4 Creating an Interface  371
    16.4.1 Challenges  371
    16.4.2 Controlling Travel  372
    16.4.3 Understand Your Design Tradeoffs and Users  373
    16.4.4 Find How People Want to Interact  374
    16.4.5 Compensate For Technology Limitations  374
  16.5 Conclusion  376
  References  376

17 Interacting with Augmented Floor Surfaces  378
  17.1 Introduction  378
  17.2 Background  379
    17.2.1 Input from the Foot in Human-Computer Interaction  381
    17.2.2 Relevance to Virtual Reality  382
  17.3 Techniques and Technologies  383
    17.3.1 Indirect Optical Sensing  383
    17.3.2 Contact Sensing  384
    17.3.3 Usability  385
  17.4 Case Study: A Distributed, Multimodal Floor Interface  388
    17.4.1 Contact Localization  388
    17.4.2 Virtual Walking on Natural Materials  391
    17.4.3 Floor Touch-Surface Interaction Techniques  391
    17.4.4 Usability of Foot-Floor Touch-Surface Interfaces  392
    17.4.5 Application: Geospatial Data Navigation  394
    17.4.6 Foot-Based Gestures for Geospatial Navigation  394
  17.5 Conclusions  398
  References  398

Index  401