
Dickmanns

Dynamic Vision for Perception and Control of Motion

Book, English, 474 pages, hardcover with rounded spine, format (W × H): 160 mm × 241 mm, weight: 995 g

ISBN: 978-1-84628-637-7
Publisher: Springer


The application of machine vision to autonomous vehicles is an increasingly important area of research, with exciting applications in industry, defense, and transportation likely in the coming decades.

Dynamic Vision for Perception and Control of Motion has been written by the world's leading expert on autonomous road-following vehicles. It brings together twenty years of innovation in the field by Professor Dickmanns and his colleagues at the University of the Bundeswehr Munich.

The book details a unique approach to real-time machine vision for understanding dynamic scenes viewed from a moving platform. The approach begins with spatio-temporal representations of motion for hypothesized objects; the parameters of these models are then adjusted by well-known prediction-error feedback and recursive estimation techniques.
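To illustrate the general idea of prediction-error feedback in recursive estimation, the sketch below runs a standard Kalman-filter predict/update loop on a single tracked quantity. It is a minimal sketch under assumed conditions (a linear constant-velocity model and hypothetical noise parameters chosen for illustration); the book itself develops extended Kalman filtering with full perspective-projection measurement models, which this toy example does not attempt to reproduce.

```python
import numpy as np

# Minimal sketch of recursive state estimation with prediction-error
# (innovation) feedback. Assumptions, not taken from the book: a linear
# constant-velocity model for one tracked quantity, and illustrative
# noise covariances.

dt = 0.1                                  # frame interval [s]
F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition (position, velocity)
H = np.array([[1.0, 0.0]])                # only position is measured
Q = np.diag([1e-4, 1e-2])                 # process noise covariance
R = np.array([[0.05]])                    # measurement noise covariance

x = np.array([[0.0], [0.0]])              # initial state hypothesis
P = np.eye(2)                             # initial state covariance

def step(x, P, z):
    """One predict/update cycle driven by the prediction error."""
    # Predict: propagate the temporal model one frame ahead.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Prediction error (innovation): measured minus predicted value.
    nu = z - H @ x_pred
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    # Feed the prediction error back to correct the hypothesis.
    x_new = x_pred + K @ nu
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

# Example: track a quantity moving at 1 unit/s under noisy measurements.
rng = np.random.default_rng(0)
for k in range(50):
    z = np.array([[1.0 * k * dt + rng.normal(0.0, 0.2)]])
    x, P = step(x, P, z)
print("estimated position/velocity:", x.ravel())
```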

The coverage of the subject matter is coherent and up to date, detailing both the machine vision and the control aspects, along with reports on the mission performance of the first vehicles built in Munich using these innovative techniques. Pointers to future developments and likely applications of this hugely important field of research are also presented.

Dynamic Vision for Perception and Control of Motion will be a key reference for technologists working in autonomous vehicles and mobile robotics in general who wish to access the leading research in this field, as well as for researchers and students in machine vision and dynamic control who are interested in one of the most promising applications of these techniques.

Target audience


Research

Further information & material


Basic Relations: Image Sequences — “the World”
Subjects and Subject Classes
Application Domains, Missions, and Situations
Extraction of Visual Features
Recursive State Estimation
Beginnings of Spatiotemporal Road and Ego-state Recognition
Initialization in Dynamic Scene Understanding
Recursive Estimation of Road Parameters and Ego State while Cruising
Perception of Crossroads
Perception of Obstacles and Vehicles
Sensor Requirements for Road Scenes
Integrated Knowledge Representations for Dynamic Vision
Mission Performance, Experimental Results
Conclusions and Outlook


From the reviewers: "Prof. Dickmanns is THE most authoritative person in the field; there is only one Dickmanns", Stefano Soatto, UCLA.

"He is one of the world's leading experts in autonomous road-following vehicles, he has conducted seminal research that is 2nd to none. To win such an author is very prestiguous for Springer", Ulrich Zehmzow, Univ. Essex.

Professor Dr.-Ing. Ernst D. Dickmanns has been a professor at the University of the Bundeswehr Munich for over 30 years. For the last two decades, he and his research group have conducted the most influential work worldwide on autonomous vehicles, producing seminal work on perception systems for dynamic vision. He is considered to have shaped research on autonomous vehicles and to have defined the modern field in this area.

