Moriarty | Practical Human Factors for Pilots | E-Book | www.sack.de

E-book, English, 304 pages

Moriarty Practical Human Factors for Pilots


1st edition, 2014
ISBN: 978-0-12-800786-0
Publisher: Elsevier Science & Techn.
Format: EPUB
Copy protection: ePub watermark




Practical Human Factors for Pilots bridges the divide between human factors research and one of the key industries that this research is meant to benefit: civil aviation. Human factors are now recognized as being at the core of aviation safety, and the training syllabus that flight crew trainees have to follow reflects that. This book will help student pilots pass exams in human performance and limitations, successfully undergo multi-crew cooperation training and crew resource management (CRM) training, and prepare for assessment in non-technical skills during operator and license proficiency checks in the simulator and during line checks when operating flights. Each chapter begins with an explanation of the relevant science behind that particular subject, along with mini-case studies that demonstrate its relevance to commercial flight operations. Of particular focus are practical tools and techniques that students can learn in order to improve their performance, as well as 'training tips' for the instructor.

- Provides practical, evidence-based guidance on issues often at the root of aircraft accidents
- Uses international regulatory material
- Includes concepts and theories that have practical relevance to flight operations
- Covers relevant topics in a step-by-step manner, describing how they apply to flight operations
- Demonstrates how human decision-making has been implicated in air accidents and equips the reader with tools to mitigate these risks
- Gives instructors a reliable knowledge base on which to design and deliver effective training
- Summarizes the current state of human factors, training, and assessment

David Moriarty is the founder of Zeroharms Solutions, a company that specializes in the science of safety. Dr. Moriarty was a medical doctor before becoming an airline captain and Crew Resource Management instructor. In addition to a medical degree, he holds degrees in Neuroscience (BSc) and Human Factors (MSc), is a Member of the Royal Aeronautical Society and the Resilience Engineering Association, and has extensive instructional experience.




Further information & material


Front Cover
Practical Human Factors for Pilots
Copyright Page
Dedication
Short contents
Full contents
Acknowledgements
Author biography
Preface
  What this book is not
  How to use this book
  A few other things to note
1 Introduction to human factors
  1.1 The start of modern human factors
  1.2 What is human factors?
  1.3 A picture of human factors in aviation
  1.4 Human factors and non-technical skills
  Chapter key points
  References
2 Information processing
  Introduction
  2.1 Introduction to brain structure and function
    2.1.1 Types of neuron
  2.2 Overview of information processing
  2.3 Sensation
    2.3.1 Sensory thresholds
    2.3.2 Sensory habituation
    2.3.3 Somatogravic sensory illusions
    2.3.4 Strategies for dealing with sensory limitations
  2.4 Attention
    2.4.1 Selective attention
      2.4.1.1 Strategies for selective attention
    2.4.2 Sustained attention (passive monitoring/vigilance)
      2.4.2.1 Strategies for sustained attention
    2.4.3 Divided attention and multitasking
      2.4.3.1 Strategies for divided attention
  2.5 Perception
    2.5.1 Mental models
    2.5.2 Perceptual difficulties with sensory-induced spatial disorientation
    2.5.3 Strategies for dealing with perceptual limitations
  2.6 Decision making
    2.6.1 The anatomy of decision making
      2.6.1.1 Goal module
      2.6.1.2 Imaginal (mental manipulation) module
      2.6.1.3 Production (pattern-matching) module
      2.6.1.4 Summary of the anatomy of decision making
    2.6.2 The two systems of human decision making
      2.6.2.1 System 1
      2.6.2.2 System 2
      2.6.2.3 Summary of Systems 1 and 2
      2.6.2.4 How Systems 1 and 2 interact and the role of workload
    2.6.3 Heuristics and biases
      2.6.3.1 Evolutionary origins of heuristics and biases
      2.6.3.2 Heuristics and biases in aviation
      2.6.3.3 Decision making and memory retrieval heuristics
        Confirmation bias
        Availability heuristic
        Plan continuation bias
        Representativeness heuristic
        Automation bias
      2.6.3.4 Social heuristics
        Overconfidence heuristic
        Halo effect heuristic
        Hindsight bias
        Fundamental attribution error
    2.6.4 Strategies for decision making
      2.6.4.1 Managing high task-load
      2.6.4.2 Managing high time-pressure
      2.6.4.3 Managing problem underspecificity
      2.6.4.4 Managing System 1 effects
    2.6.5 An operational decision-making strategy: TDODAR
  2.7 Response
  2.8 Fear-potentiated startle and freezing
    2.8.1 Strategies for managing fear-potentiated startle and freezing
  2.9 A note about the models used in this chapter
  Chapter key points
  References
  Recommended reading
3 Error management and standard operating procedures for pilots
  Introduction
  3.1 Performance levels
  3.2 Errors and violations at different performance levels
  3.3 Detection of errors
  3.4 The Swiss Cheese Model
  3.5 Threat and Error Management 2
    3.5.1 Threat management
      3.5.1.1 Threat management opportunities
      3.5.1.2 Types of threat
      3.5.1.3 Threats associated with serious negative outcomes
        Loss of control
        Runway excursions
        Controlled flight into terrain
        Runway incursions
        Airborne conflict
        Ground handling
        Fire
        Summary of main threats
      3.5.1.4 Threat identification framework
      3.5.1.5 Generic threat management strategies
      3.5.1.6 Threat management tool for briefings
    3.5.2 Unsafe act (error and violation) management
      3.5.2.1 Unsafe act prevention where no threat has been identified
      3.5.2.2 Unsafe act detection: checklists and crew communication
        Detection of skill-based errors: checklists
        Detection of skill-based violations, and rule- and knowledge-based errors: crew interaction
        Detection of rule- and knowledge-based violations
        Summary of unsafe act detection strategies
      3.5.2.3 Unsafe act management strategies
    3.5.3 Undesired aircraft state management
    3.5.4 Summary of TEM2
    3.5.5 History of Threat and Error Management and differences between original TEM and TEM2
  3.6 TEM2 and unstabilized approaches
    3.6.1 Threat management for unstabilized approaches
    3.6.2 Unsafe act management for unstabilized approaches
    3.6.3 Undesired aircraft state management for unstabilized approaches
  Chapter key points
  References
  Recommended reading
4 Error management and standard operating procedures for organizations
  Introduction
  4.1 Beyond human error
  4.2 Systems thinking
    4.2.1 Normal Accident Theory
    4.2.2 The Old View versus the New View of human error
  4.3 Resilience engineering
  4.4 Safety culture
    4.4.1 Just culture
    4.4.2 Moving from Safety I to Safety II
  4.5 Principles of managing organizational resilience
  Chapter key points
  References
  Recommended reading
5 Personality, leadership and teamwork
  Introduction
  5.1 Personality
    5.1.1 Personality structure
    5.1.2 Personality and behavior
    5.1.3 Personality management strategies
  5.2 Leadership and command
    5.2.1 Leadership and personality
    5.2.2 Non-technical skills for leadership
    5.2.3 Leadership, personality and flight safety
      5.2.3.1 The toxic captain
      5.2.3.2 The toxic first officer
  5.3 Flight deck gradient
    5.3.1 Culture and flight deck gradient
  5.4 Cooperation and conflict solving
    5.4.1 Individual differences and conflict-solving strategies
    5.4.2 Situational variables and conflict-solving strategies
  Chapter key points
  References
6 Communication
  Introduction
  6.1 The Sender–Message–Channel–Receiver model of communication
    6.1.1 Barriers to communication
    6.1.2 The NITS briefing
  6.2 Communication between pilots
    6.2.1 Introduction to transactional analysis
    6.2.2 Ego states
    6.2.3 Complementary and crossed transactions
    6.2.4 Ulterior transactions
  6.3 Establishing a positive team atmosphere
  6.4 Communication strategies for effective briefings
  6.5 Communication strategies for assertiveness
    6.5.1 First officer assertiveness at the threat management stage
    6.5.2 First officer assertiveness at the unsafe act management stage
    6.5.3 First officer assertiveness at the undesired aircraft state management stage
    6.5.4 How captains can encourage assertiveness
  Chapter key points
  References
7 Fatigue risk management
  Introduction
  7.1 Introduction to sleep
  7.2 Fatigue
    7.2.1 Cognitive effects of fatigue
  7.3 Role of sleep in managing fatigue
    7.3.1 Physiology of sleep
    7.3.2 Sleep inertia
    7.3.3 Homeostatic sleep drive and sleep need
    7.3.4 Biological clock, circadian rhythms, chronotypes and sleep urge
      7.3.4.1 Biological clock and circadian rhythms
      7.3.4.2 Chronotypes
      7.3.4.3 Melatonin
      7.3.4.4 Sleep urge
    7.3.5 Sleep debt
  7.4 Fatigue risk management strategies
    7.4.1 How to achieve sufficient high-quality sleep
      7.4.1.1 Planning your sleep
      7.4.1.2 Sleep hygiene
        Sleep hygiene before sleeping
        Sleep hygiene in your bedroom
    7.4.2 How to nap effectively
    7.4.3 How to deal with insomnia and sleep disorders
      7.4.3.1 Chronic insomnia
      7.4.3.2 Obstructive sleep apnea
    7.4.4 How to mitigate risk if you find yourself fatigued
    7.4.5 How to use sleep medications
    7.4.6 How to deal with jet lag
    7.4.7 Organizational strategies for fatigue risk management
  Chapter key points
  References
  Recommended reading
8 Stress management and alcohol
  Introduction
  8.1 Chronic stress
    8.1.1 Prevention of allostatic load due to chronic stress
    8.1.2 Management of allostatic load due to chronic stress
      8.1.2.1 Management strategies aimed at the stressor
      8.1.2.2 Management strategies aimed at the individual
    8.1.3 Critical incident stress management
  8.2 Alcohol
    8.2.1 Alcoholism in aviation
  Chapter key points
  References
  Recommended reading
9 Automation management
  Introduction
  9.1 Systems of aircraft automation
  9.2 Flight control laws
  9.3 Levels of automation and their uses
  9.4 Flight mode annunciators
  9.5 Automation, perception and Newton’s laws of motion
  9.6 The ironies of automation
  9.7 Skill fade and automation dependency
  9.8 Automation complacency
  9.9 Automation bias
  9.10 Automation surprises
  Chapter key points
  References
Conclusion
Index


Preface
As many of you will know, every year pilots and cabin crew have to receive training in crew resource management (CRM). This is essentially human factors training, but specifically for aviation. Because this is a legal requirement, it has to be completed within a specified time-frame. Occasionally, because of illness or scheduling difficulties, we have to run a one-to-one CRM course so that the crew member remains legal to operate flights. As a freelance CRM instructor, I also do training for a variety of different airlines as well as my own airline. A while back, I carried out some CRM training for a business jet operator based in Europe. Unfortunately, one senior training captain was unable to attend the CRM training. Because he was due to start a work trip within the next couple of days, we had to arrange for him to complete his CRM training before leaving. In the event, the easiest way to achieve this was for him to fly to the UK and undertake a one-to-one CRM course with me.

If there are any CRM instructors reading this, you will know that it can be incredibly difficult to run a successful course with just one person. One of the key benefits of CRM training is the discussion that occurs between different crew members. Opinions and experiences are shared, situations are analyzed from a variety of different perspectives and, hopefully, consensus is reached. Needless to say, this becomes a lot more difficult when there is just one student in the room. The pilot in question was a highly experienced training captain who had served for many years in the military. He had flown for both commercial and business jet operators and had been exposed to CRM training since its development in the 1980s. Given that there were just the two of us, I thought I would start the session by asking a candid question, so I asked him what he thought of CRM. At first he gave me the answers that I have come to expect. He talked about how important it was, how communication is important, teamwork is important, and so on. He also said that it was relevant because it is the human rather than the machine that leads to aviation accidents these days. I pressed him a little further on this. These were answers that I had heard on just about every other CRM course where I had been an instructor or a participant: well-rehearsed, timeworn phrases that I could imagine hearing from any pilot, cabin crew member, aviation instructor or training department manager. Given that his answers were pretty standard, I took a different tack and asked him how CRM training had affected his behavior during day-to-day operations. This was clearly a more difficult question, as he was unable to give me any examples of how his 20 years of CRM training had affected his performance in any way whatsoever. Based on this, I repeated my first question: “What do you think of CRM?” Finally, he gave me his honest answer: “It’s a waste of time.”

Unfortunately, I think he is right. When CRM training came into being back in the 1980s, it was generally felt that aircraft were becoming safer and safer and that it was now the human element that led to accidents and incidents. The old image of the gallant, swashbuckling captain, wrestling the burning aircraft on to the runway while all around him panicked, was beginning to fade. It was becoming increasingly apparent that successful management of abnormal situations was best achieved by utilizing the resources of all the crew members. This became the basis of modern CRM training.
It was a profound step and, indeed, many other industries looked, and continue to look, to aviation as the pioneers of this new and exciting field. But what is CRM today? Has it lived up to the promise of revolutionizing aeronautical safety? Or is it, as I have heard from so many students, just common sense by a different name? I suspect that many crew members feel the same as that business jet training captain does: a good idea in principle; a waste of time in reality. I think it is important that, from the outset, I give you my answers to these questions, as they explain why I thought it was important to write this book.

I think that we, as an industry, have failed. Although our initial analysis was correct, insofar as the human element had become the key to understanding the accidents and incidents happening in aviation, we have not followed this up in either our training philosophy or our legislation. An often-repeated statistic is that 70% of aviation accidents are due to human error. While I take some exception to this figure (show me an accident that does not include an element of human error!), it leads to an interesting observation. When we consider how we train pilots and cabin crew, the vast majority of the time allotted is spent training them to deal with technical failures. We spend comparatively little time looking at human factors. Why should this be so, given that we know that the majority of accidents are due to human error? For me, the answer is simple: it is easier to teach technical subjects. As will become apparent to you as you work through this book, human factors is a truly huge subject. It encompasses just about every aspect of human performance. It is far easier to teach crew members how to deal with technical failure than it is to equip them with accurate, research-based human factors strategies and behaviors that they can use in the workplace to enhance safety and efficiency.

Another major drawback of human factors in aviation is that it is rarely assessed. Although it is common for national regulators to stipulate that crew members’ “non-technical skills” are assessed in the simulator and during line checks, we do not routinely assess an individual’s knowledge of human factors. I am reminded of a conversation I had with someone who worked for the national regulatory authority. We were talking about the importance of CRM training, and I expressed my view that it was strange that it was the only subject that was not formally assessed, and that this might, inaccurately, suggest to students that the subject is not important. The response I received was, “If you can get the subject across to just one or two people in the room, then you’re doing well.” Would we have the same laissez-faire view if the subject were “hydraulics” or “safety and emergency procedures”? Of course not! We require all crew members to learn and be able to recall the details of these technical subjects. Failure to do so may have a negative impact on an individual’s career. Why, then, if we have all agreed on the importance of human factors with regard to aviation safety, do we not take the same approach to teaching and assessing this subject? The only answer that occurs to me is that the science of human factors is so broad and so “messy” that we have not, as an industry, agreed on what is important and what is not. For example, the concept of “situational awareness” is still hotly debated.
There are some human factors academics who feel that this is not a useful term.1 There are others who think that it is a useful term. Who is right? The sad answer is that we do not know. Unlike hydraulics or safety and emergency procedures, the subject of human factors is not black and white. While national legislation specifies that human factors must be taught in the form of CRM, and lists various topics that must be covered in such training, it does not, as yet, specify which approach should be taken to each of these topics. As you will soon see, there are multiple approaches that can be taken with each topic. If there is no uniformity in how different operators decide a topic should be taught, is it fair to assess people on it?

The unfortunate side-effect of this dilemma is that when crew members are scheduled to attend CRM training, they are well aware that their knowledge of the subject will not be assessed at the end of the day. I believe that this, tragically, leads to the impression that the subject is not important. If it is not important enough to assess, then where is the motivation to pay attention and to learn? Sadly, for the time being, the legislation is what it is, although there do seem to be some changes on the horizon in Europe, with recent proposals put forward by the European Aviation Safety Agency.

When I look back on my experience of teaching this subject, I have come to the conclusion that the term “CRM” is tainted. This generation of crew has been brought up in a system that has not made up its mind on the most effective way to teach this safety-critical subject. Although some people are willing to engage with the subject, I believe that the majority of people are unconvinced of its importance and, for that, instructors like me must bear some responsibility. If there are any other CRM instructors reading this now, I am sure that you, as I have, will have had the heartbreaking experience of standing up in front of a group that you taught the year before, asking them to recall anything of what they learned previously, and being met with blank stares. If the information is not being retained or, more importantly, not being put into practice, then what is the point of CRM training? This is a question that troubles me deeply. Unfortunately, I am not a legislator. If I were, I would drop the term “CRM” and replace it with “human factors”. I would completely revise and expand the training syllabus, give specific objectives to be achieved for each subject, and make it compulsory for each subject to be assessed formally. As it stands, I am a humble instructor trying to communicate the importance of these...


