A Technical Analysis of Ergonomics and Human Factors in Modern Flight Deck Design

I. Introduction

Since the dawn of the aviation era, cockpit design has become increasingly complicated owing to the advent of new technologies enabling aircraft to fly farther, faster, and more efficiently than ever before. With greater workloads imposed on pilots as fleets modernize, the reality of pilots exceeding their workload limits has become manifest. Because of the unpredictable nature of man, this problem is impossible to eliminate completely. However, the frequency of such occurrences can be drastically reduced by examining the nature of man, how he operates in the cockpit, and what engineers must do to design a system in which man and machine are ideally interfaced. The latter point involves an in-depth analysis of system design with an emphasis on human factors, biomechanics, cockpit controls, and display systems. By analyzing these components of cockpit design, and determining which variables of each will yield the fewest errors, a system can be designed in which the Liveware-Hardware interface promotes safety and reduces mishap frequency.

II. The History Of Human Factors in Cockpit Design

The history of cockpit design can be traced as far back as the first balloon flights, where a barometer was used to measure altitude. The Wright brothers incorporated a string attached to the aircraft to indicate slips and skids (Hawkins, 241). However, the first real efforts toward implementing human factors in cockpit design began in the early 1930s. During this time, the United States Postal Service began flying aircraft in all-weather missions (Kane, 4:9). The greater reliance on instrumentation raised the question of where to put each display and control. However, not much attention was focused on this area, as engineers cared more about getting the instrument into the cockpit than about how it would interface with the pilot (Sanders & McCormick, 739).

In the mid- to late 1930s, the development of the first gyroscopic instruments forced engineers to make their first major human factors-related decision. Rudimentary attitude indicators raised the question of whether the display should reflect the view as seen from inside the cockpit, with the horizon moving behind a fixed miniature airplane, or the view as seen from outside the aircraft, with the miniature airplane moving against a fixed horizon. Until the end of World War II, aircraft were manufactured using both types of display. This caused confusion among pilots who were familiar with one type of display and were flying an aircraft with the other. Several safety incidents were attributed to this confusion, none of which were fatal (Fitts, 20-21).

Shortly after World War II, aircraft cockpits were standardized to the 'six-pack' configuration: a collection of the six critical flight instruments arranged in two rows of three directly in front of the pilot. The top row, from left to right, held the airspeed indicator, artificial horizon, and altimeter; the bottom row held the turn coordinator, heading indicator, and vertical speed indicator. This arrangement of instruments made transition training easier for pilots going from one aircraft to another. In addition, instrument scanning was enhanced, because the instruments were strategically placed so the pilot could reference each instrument against the artificial horizon in a hub-and-spoke method (Fitts, 26-30).

Since then, the bulk of human factors development in the cockpit has been driven largely by technological achievements. The dramatic increase in the complexity of aircraft after the dawn of the jet age brought with it a greater need than ever for automation that exceeded a simple autopilot. Human factors studies in other industries and within the military paved the way for some of the most recent technological innovations, such as the glass cockpit, the Head-Up Display (HUD), and other advanced panel displays. Although these systems are on the cutting edge of technology, they too are susceptible to design problems, some of which are responsible for the incidents and accidents mentioned earlier. They will be discussed in further detail in another chapter (Hawkins, 249-54).

III. System Design

A design team should support the concept that the pilot's interface with the system, including task needs, decision needs, feedback requirements, and responsibilities, must be a primary consideration in defining the system's functions and logic, as opposed to the system concept coming first and the user interface coming later, after the system's functionality is fully defined. There are numerous examples where human-centered design principles and processes could be better applied to improve the design process and final product. Although manufacturers utilize human factors specialists to varying degrees, they are typically brought into the design effort in limited roles or late in the process, after the operational and functional requirements have been defined (Sanders & McCormick, 727-8). When joining the design process late, the human factors specialist's ability to influence the final design and facilitate incorporation of human-centered design principles is severely compromised. Human factors should be considered on par with the other disciplines involved in the design process.

The design process can be seen as a six-step sequence: determining the objectives and performance specifications, defining the system, basic system design, interface design, facilitator design, and testing and evaluation of the system. This model is theoretical, and few actual design programs meet its performance objectives. Each step directly involves input from human factors data and incorporates it into the design philosophy (Bailey, 192-5).
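
As a minimal illustration only (not drawn from the cited sources), the six phases and the human factors input each one expects can be sketched as an ordered checklist; the phase names follow the text above, while the idea of a per-phase sign-off is a hypothetical addition.

    # Illustrative sketch of the six-step design model described above.
    # The phase names follow the text; the notion of a per-phase human
    # factors sign-off is an assumption added for illustration.

    DESIGN_PHASES = [
        ("Objectives and performance specifications", "identify users, skills, and activity-based needs"),
        ("System definition", "match required functions to operator needs"),
        ("Basic system design", "allocate functions among Liveware, Hardware, Software"),
        ("Interface design", "gather and apply human factors data (e.g. body dimensions)"),
        ("Facilitator design", "design manuals, placards, graphs that support the operator"),
        ("Testing and evaluation", "verify human factors input and component interfaces"),
    ]

    def review_design(completed_phases):
        """Report which phases of the theoretical model still lack sign-off."""
        for name, hf_input in DESIGN_PHASES:
            status = "done" if name in completed_phases else "pending"
            print(f"{status:>7}  {name}: {hf_input}")

    review_design({"Objectives and performance specifications"})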

Determining the objectives and performance specifications includes defining the fundamental purpose of the system and evaluating what the system must do to achieve that purpose. It also includes identifying the intended users of the system and what skills those operators will have. Fundamentally, this first step addresses a broad definition of what activity-based needs the system must address. The second step, definition of the system, determines the functions the system must perform to achieve the performance specifications (unlike the broader purpose-based evaluation of the first step). Here, the human factors specialists ensure that the functions match the needs of the operator. During this step, functional flow diagrams can be drafted, but the design team must keep in mind that only general functions can be listed at this stage. More specific system characteristics are covered in step three, basic system design (Sanders & McCormick, 728-9).

The basic system design phase determines a number of variables, one of which is the allocation of functions among Liveware, Hardware, and Software. A sample allocation model considers five methods: mandatory, balance of value, utilitarian, affective and cognitive support, and dynamic. Mandatory allocation is the distribution of tasks based on limitations: there are some tasks which Liveware is incapable of handling, and likewise for Hardware. Other considerations in mandatory allocation are laws and environmental restraints. Balance of value allocation holds that each task is either incapable of being done by Liveware or Hardware, better done by one or the other, or can only be done by one or the other. Utilitarian allocation is based on economic restraints. With the avionics package in many commercial jets costing as much as 15% of the overall aircraft price (Hawkins, 243), it would be very easy for design teams to allocate as many tasks to the operator as possible. This, in fact, was standard practice before the advent of automation as it exists today. The antithesis of that philosophy is to automate as many tasks as possible to relieve pressure on the pilot. Affective and cognitive support allocation recognizes the unique needs of the Liveware component and assigns tasks to Hardware to provide as much information and decision-making support as possible. It also takes into account limitations, such as emotions and stress, which can impede Liveware performance. Finally, dynamic allocation refers to an operator-controlled process in which the pilot determines which functions should be delegated to the machine and which he or she should control at any given time. Again, this allocation model is only theoretical, and a design process will often encompass all, or sometimes none, of these philosophies (Sanders & McCormick, 730-4).
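
To make the five allocation philosophies concrete, the sketch below (a hypothetical illustration, not taken from the cited sources) tags each candidate task with one method and picks out the dynamically allocated tasks the pilot may reassign in flight; the task names and assignments are invented for the example.

    # Hypothetical sketch of the five function-allocation methods described above.
    from enum import Enum, auto

    class Allocation(Enum):
        MANDATORY = auto()            # forced by capability limits, law, or environment
        BALANCE_OF_VALUE = auto()     # whichever of Liveware/Hardware does it better
        UTILITARIAN = auto()          # driven by economic constraints
        AFFECTIVE_COGNITIVE = auto()  # Hardware supports Liveware decision-making
        DYNAMIC = auto()              # pilot reassigns the task in flight

    # Example task list; the assignments are illustrative only.
    tasks = {
        "structural load limiting": Allocation.MANDATORY,
        "fuel quantity monitoring": Allocation.BALANCE_OF_VALUE,
        "altitude hold": Allocation.DYNAMIC,
    }

    pilot_reassignable = [t for t, a in tasks.items() if a is Allocation.DYNAMIC]
    print(pilot_reassignable)  # ['altitude hold']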

Basic system design also establishes Liveware performance requirements: characteristics that the operator must possess for the system to meet design specifications (such as accuracy, speed, training, and proficiency). Once those are determined, an in-depth task description and analysis is created. This phase is essential to the human factors interface, because it analyzes the nature of the task and breaks it down into every step necessary to complete that task. The steps are further broken down to determine the following criteria: the stimulus required to initiate the step, the decision making which must be accomplished (if any), the actions required, the information needed, the feedback provided, the potential sources of error, and what must be done to complete the step successfully. Task analysis is the foremost method of defining the Liveware-Hardware interface. It is imperative that a cockpit be designed using a process similar to this if it is to maintain effective communication between the operator and the machine (Bailey, 202-6). It is widely accepted that the equipment determines the job. Based on that assumption, operator participation in this design phase can greatly enhance job enlargement and enrichment (Sanders & McCormick, 737; Hawkins, 143-4).
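
The per-step criteria listed above lend themselves to a simple record structure. The sketch below is a hypothetical illustration of how one task-analysis step might be captured, not a reproduction of any published format; the gear-retraction example and its field values are invented.

    # Hypothetical record for one step of a task analysis, mirroring the
    # criteria listed above: initiating stimulus, decisions, actions,
    # information needed, feedback, and potential error sources.
    from dataclasses import dataclass, field

    @dataclass
    class TaskStep:
        stimulus: str                     # what cues the operator to start the step
        decisions: list = field(default_factory=list)
        actions: list = field(default_factory=list)
        information_needed: list = field(default_factory=list)
        feedback: str = ""
        error_sources: list = field(default_factory=list)

    gear_up = TaskStep(
        stimulus="positive rate of climb confirmed",
        decisions=["is climb performance adequate?"],
        actions=["select landing gear lever UP"],
        information_needed=["vertical speed indicator", "airspeed"],
        feedback="gear-unsafe lights extinguish",
        error_sources=["confusing gear lever with flap lever"],
    )
    print(gear_up.error_sources)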

Interface design, the fourth process in the design model, analyzes the interfaces between all components of the SHEL model, with an emphasis on the human factors role in gathering and interpreting data. During this stage, evaluations are made of suggested designs, human factors data is gathered (such as statistical data on body dimensions), and any gathered data is applied. Any application of data goes through a sub-process that determines the data's practical significance, its interface with the environment, the risks of implementation, and any trade-offs involved. The last item in this phase is conducting Liveware performance studies to determine the capabilities and limitations of that component in the suggested design. The fifth step in the design process is facilitator design. Facilitators are essentially Software designs that enhance the Liveware-Hardware interface, such as operating manuals, placards, and graphs. Finally, the last design step is to test the proposed design and evaluate the human factors input and the interfaces between all components involved. Applying this process to each system design will enhance the operator's ability to control the system within desired specifications. Some of the specific design characteristics can be found in subsequent chapters.

IV. Biomechanics

In December of 1981, a Piper Comanche temporarily lost directional control in gusty conditions that were within the performance specifications of the aircraft. The pilot later reported that with the control column full aft, he was unable to maintain adequate aileron control because his knees were interfering with proper control movement (NTSB database). Although this was a minor incident, it should alert engineers to a potential problem area. Probably the most fundamental, and easiest to quantify, interface in the cockpit is between the physical dimensions of the Liveware component and the Hardware designs which must accommodate them. The comfort of the workspace has long been known to alleviate fatigue over long periods of time, and its absence to perpetuate it (Hawkins, 282-3). These facts indicate a need to discuss the factors involved in workspace design.

When designing a cockpit, the engineer must determine the physical dimensions of the operator. Given the variable dimensions of the human body, it is naturally impossible to design a system that will accommodate all users. An industry standard is to design for the central 95% of the population, discarding the top and bottom 2.5% of any dimension. From this, a general design can be accomplished by incorporating the reach and strength limitations of smaller people and the clearance limitations of larger people. Three basic design considerations must be adhered to when designing around physical dimensions: reach and clearance envelopes, user position with respect to the display area, and the position of the body (Bailey, 273).
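
Assuming a body dimension is roughly normally distributed, the 2.5th and 97.5th percentile cutoffs that bound the central 95% follow directly from the mean and standard deviation. The sketch below illustrates that arithmetic; the mean and standard deviation values are placeholders, not figures from the cited sources.

    # Sketch: central-95% design band for a body dimension, assuming a
    # normal distribution. Mean/SD values are placeholders for illustration.
    from statistics import NormalDist

    stature = NormalDist(mu=175.0, sigma=7.0)   # cm, hypothetical stature distribution

    lower = stature.inv_cdf(0.025)   # 2.5th percentile: drives reach and strength limits
    upper = stature.inv_cdf(0.975)   # 97.5th percentile: drives clearance limits

    print(f"design for users between {lower:.1f} cm and {upper:.1f} cm")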

Other differences must also be taken into account when designing a system, such as ethnic and gender differences. It is known, for example, that women are, on average, 7% shorter than men (Pheasant, 44). If the 95th-percentile convention is used, the question arises: on which gender do we base it? One way to frame the comparison is the F/M ratio, the average female characteristic divided by the average male characteristic. Although this ratio does not take into account the possibility of overlap (i.e., the shortest 5 percent of males are likely to be shorter than the tallest 5 percent of females), that is not an issue in cockpit design (Pheasant, 44). The other variable, ethnicity, must also be evaluated in system design. Some Asian populations, for example, have a sitting height almost ten centimeters lower than Europeans (Pheasant, 50). This can raise a potential problem when designing an instrument panel or windshield.
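
The F/M ratio itself is simple arithmetic: the female mean for a dimension divided by the male mean. The sketch below uses invented placeholder values (not data from Pheasant) and also shows one way a combined design band could be formed, with reach governed by the smaller users and clearance by the larger.

    # Sketch: F/M ratio and a combined design band. All values are
    # hypothetical placeholders, not figures from the cited sources.
    from statistics import NormalDist

    male = NormalDist(mu=175.0, sigma=7.0)     # cm
    female = NormalDist(mu=162.0, sigma=6.5)   # cm

    fm_ratio = female.mean / male.mean
    print(f"F/M ratio for stature: {fm_ratio:.2f}")   # ~0.93, i.e. roughly 7% shorter

    # Reach limits come from the smaller users, clearance from the larger ones.
    reach_limit = min(male.inv_cdf(0.025), female.inv_cdf(0.025))
    clearance_limit = max(male.inv_cdf(0.975), female.inv_cdf(0.975))
    print(f"reach design point: {reach_limit:.1f} cm, clearance: {clearance_limit:.1f} cm")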

Some design guides have been established to help the engineer with conceptual problems such as these, but for the most part, systems designers are limited to data gathered from human factors research (Tillman & Tillman, 80-7). As one story goes, during the final design phase of the Boeing 777, the chairman of United Airlines was invited to preview it. When he stood up at his first-class seat, his head collided with an overhead baggage rack. Boeing officials were apologetic, but the engineers were grinning inside. A few months later, the first 777 entered service with overhead baggage racks mounted much higher and less likely to be involved in a collision. Unlike this experience, designing clearances and reach envelopes for a cockpit is too expensive to be a trial-and-error venture.

V. Controls

In early 1974, the NTSB released a recommendation to the FAA regarding control inconsistencies:

“A-74-39. Amend 14 CFR 23 to include specifications for standardizing fuel selection valve handle designs, displays, and modes of operation” (NTSB database).

A series of accidents occurred during transition training when pilots moving to the Beechcraft Bonanza and Baron aircraft confused the flap and gear handles:

“As part of a recently completed special investigation, the safety board reviewed its files for every inadvertent landing gear retraction accident between 1975 and 1978. These accidents typically happened because the pilot was attempting to put the flaps control up after landing, and moved the landing gear control instead. This inadvertent movement of the landing gear control was often attributed to the pilot’s being under stress or distracted, and being more accustomed to flying aircraft in which these two controls were in exactly opposite locations. Two popular light aircraft, the Beech Bonanza and Baron, were involved in the majority of these accidents. The Bonanza constituted only about 30 percent of the active light single-engine fleet with retractable landing gear, but was involved in 16 of the 24 accidents suffered by this category of aircraft. Similarly, the Baron constituted only 16 percent of the light twin fleet, yet suffered 21 of the 39 such accidents occurring to these aircraft” (NTSB database).

Like biomechanics, the design of controls is the study of physical relationships within the Liveware-Hardware interface. However, control design philosophy tends to be more subtle, and there is slightly more emphasis on psychological components. A designer determines what kind of control to use in a system only after the purpose of the system has been established, and what operator needs