Suk-han / Ko / Hahn | Multisensor Fusion and Integration for Intelligent Systems | E-Book | www.sack.de

E-book, English, Volume 35, 479 pages

Series: Lecture Notes in Electrical Engineering

Suk-han / Ko / Hahn Multisensor Fusion and Integration for Intelligent Systems

An Edition of the Selected Papers from the IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems 2008
1st edition, 2009
ISBN: 978-3-540-89859-7
Publisher: Springer Berlin Heidelberg
Format: PDF
Copy protection: 1 - PDF watermark




The field of multi-sensor fusion and integration is growing in significance as our society transitions into ubiquitous computing environments, with robotic services everywhere under ambient intelligence. Surrounding us will be networks of sensors and actuators that monitor our environment, health, security, and safety, as well as service robots, intelligent vehicles, and autonomous systems of ever greater autonomy and dependability, built on integrated heterogeneous sensors and actuators. The field of multi-sensor fusion and integration plays a key role in making this transition possible by providing the fundamental theories and tools for implementation. This volume is an edition of the papers selected from the 7th IEEE International Conference on Multi-Sensor Integration and Fusion, IEEE MFI'08, held in Seoul, Korea, August 20-22, 2008. Only 32 of the 122 papers accepted for IEEE MFI'08 were chosen, and their authors were asked to revise and extend them for inclusion in this volume. The 32 contributions are organized into three parts: Part I is dedicated to Theories in Data and Information Fusion, Part II to Multi-Sensor Fusion and Integration in Robotics and Vision, and Part III to Applications to Sensor Networks and Ubiquitous Computing Environments. To aid the reader, each part opens with an introductory summary. The summaries of Parts I, II, and III were prepared by Prof. Hanseok Ko, Prof. Sukhan Lee, and Prof. Hernsoo Hahn, respectively.


Further information & material


Preface (p. 5)
Contents (p. 6)
Contributors (p. 10)

Part I: Theories in Data and Information Fusion (p. 16)
- Performance Analysis of GPS/INS Integrated System by Using a Non-Linear Mathematical Model (Khalid Touil, Mourad Zribi and Mohammed Benjelloun), p. 18
- Object-Level Fusion and Confidence Management in a Multi-Sensor Pedestrian Tracking System (Fadi Fayad and Véronique Cherfaoui), p. 30
- Effective Lip Localization and Tracking for Achieving Multimodal Speech Recognition (Wei Chuan Ooi, Changwon Jeon, Kihyeon Kim, Hanseok Ko and David K. Han), p. 47
- Optimal View Selection and Event Retrieval in Multi-Camera Office Environment (Han-Saem Park, Soojung Lim, Jun-Ki Min and Sung-Bae Cho), p. 58
- Fusion of Multichannel Biosignals Towards Automatic Emotion Recognition (Jonghwa Kim and Elisabeth André), p. 67
- A Comparison of Track-to-Track Fusion Algorithms for Automotive Sensor Fusion (Stephan Matzka and Richard Altendorfer), p. 81
- Effective and Efficient Communication of Information (Jan Willem Marck, Leon Kester, Miranda van Iersel, Jeroen Bergmans and Eelke van Foeken), p. 94
- Most Probable Data Association with Distance and Amplitude Information for Target Tracking in Clutter (Taek Lyul Song), p. 107
- Simultaneous Multi-Information Fusion and Parameter Estimation for Robust 3-D Indoor Positioning Systems (Hui Wang, Andrei Szabo, Joachim Bamberger and Uwe D. Hanebeck), p. 120
- Efficient Multi-Target Tracking with Sub-Event IMM-JPDA and One-Point Prime Initialization (Seokwon Yeom), p. 135
- Enabling Navigation of MAVs through Inertial, Vision, and Air Pressure Sensor Fusion (Clark N. Taylor), p. 151

Part II: Multi-Sensor Fusion and Integration in Robotics and Vision (p. 167)
- Enhancement of Image Degraded by Fog Using Cost Function Based on Human Visual Model (Dongjun Kim, Changwon Jeon, Bonghyup Kang and Hanseok Ko), p. 170
- Pedestrian Route Guidance by Projecting Moving Images (Takuji Narumi, Yasushi Hada, Hajime Asama and Kunihiro Tsuji), p. 179
- Recognizing Human Activities from Accelerometer and Physiological Sensors (Sung-Ihk Yang and Sung-Bae Cho), p. 193
- The "Fast Clustering-Tracking" Algorithm in the Bayesian Occupancy Filter Framework (Kamel Mekhnacha, Yong Mao, David Raulo and Christian Laugier), p. 206
- Compliant Physical Interaction Based on External Vision-Force Control and Tactile-Force Combination (Mario Prats, Philippe Martinet, Sukhan Lee and Pedro J. Sanz), p. 225
- iTASC: A Tool for Multi-Sensor Integration in Robot Manipulation (Ruben Smits, Tinne De Laet, Kasper Claes, Herman Bruyninckx and Joris De Schutter), p. 238
- Behavioral Programming with Hierarchy and Parallelism in the DARPA Urban Challenge and RoboCup (Jesse G. Hurdus and Dennis W. Hong), p. 258
- Simultaneous Estimation of Road Region and Ego-Motion with Multiple Road Models (Yoshiteru Matsushita and Jun Miura), p. 273
- Model-Based Recognition of 3D Objects using Intersecting Lines (Hung Q. Truong, Sukhan Lee and Seok-Woo Jang), p. 291
- Visual SLAM in Indoor Environments Using Autonomous Detection and Registration of Objects (Yong-Ju Lee and Jae-Bok Song), p. 303
- People Detection using Double Layered Multiple Laser Range Finders by a Companion Robot (Alexander Carballo, Akihisa Ohya and Shin'ichi Yuta), p. 317

Part III: Applications to Sensor Networks and Ubiquitous Computing Environments (p. 334)
- Path-Selection Control of a Power Line Inspection Robot Using Sensor Fusion (SunSin Han and JangMyung Lee), p. 336
- Intelligent Glasses: A Multimodal Interface for Data Communication to the Visually Impaired (Edwige Pissaloux, Ramiro Velázquez and Flavien Maingreaud), p. 350
- Fourier Density Approximation for Belief Propagation in Wireless Sensor Networks (Chongning Na, Hui Wang, Dragan Obradovic and Uwe D. Hanebeck), p. 359
- Passive Localization Methods Exploiting Models of Distributed Natural Phenomena (Felix Sawo, Thomas C. Henderson, Christopher Sikorski and Uwe D. Hanebeck), p. 374
- Study on Spectral Transmission Characteristics of the Reflected and Self-emitted Radiations through the Atmosphere (Jun-Hyuk Choi and Tae-Kuk Kim), p. 392
- 3D Reflectivity Reconstruction by Means of Spatially Distributed Kalman Filters (G.F. Schwarzenberg, U. Mayer, N.V. Ruiter and U.D. Hanebeck), p. 406
- T-SLAM: Registering Topological and Geometric Maps for Robot Localization (F. Ferreira, I. Amorim, R. Rocha and J. Dias), p. 421
- Map Fusion Based on a Multi-Map SLAM Framework (François Chanier, Paul Checchin, Christophe Blanc and Laurent Trassoudaine), p. 437
- Development of a Semi-Autonomous Vehicle Operable by the Visually-Impaired (Dennis W. Hong, Shawn Kimmel, Rett Boehling, Nina Camoriano, Wes Cardwell, Greg Jannaman, Alex Purcell, Dan Ross and Eric Russel), p. 453
- Process Diagnosis and Monitoring of Field Bus based Automation Systems using Self-Organizing Maps and Watershed Transformations (Christian W. Frey), p. 466


