COGNITIVE TASKS: DIAGNOSIS, DECISION MAKING AND ERGONOMICS
After this brief review of behavioral and cognitive models, we can now focus on two complex cognitive processes: diagnosis and decision making. These processes are brought into play in many problem-solving situations where task goals may be insufficiently specified and responses may not benefit from past knowledge. Such characteristics affect how people shift between different modes of cognitive control. Problem solvers may use experiences from similar cases in the past, apply generic rules that cover a whole category of problems, or try alternative courses of action and assess their results. Optimizing problem solving purely on the basis of knowledge-based behavior, however, may be time consuming and laborious. People therefore tend to use several heuristics to regulate their performance between rule-based and knowledge-based processing. This increases task speed but may result in errors that are difficult to recover from.
In the following, we present a set of common heuristics used by experienced diagnosticians and decision makers, the potential biases and errors that may arise from them, and, finally, ergonomic interventions that can support human performance.
Diagnosis
Diagnosis is a cognitive process whereby a person tries to identify the causes of an undesirable event or situation. Technical failures and medical problems are two well-known application areas of human diagnosis. Figure 6 presents some typical stages of the diagnosis process. Diagnosis starts with the perception of signals alerting one to a system's failure or malfunction. Following this, diagnosticians may choose whether to search for more information to develop a mental representation of the current system state. At the same time, knowledge about system structure and functioning can be retrieved from long-term memory. On the basis of this evidence, diagnosticians may generate hypotheses about possible causes of the failure. Depending on the outcome of further tests, hypotheses may be confirmed, completing the diagnosis process, or rejected, leading to the selection of new hypotheses. Compensation for failures may start when the diagnostician feels confident that a correct interpretation of the situation has been made.
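These stages can be read as an iterative generate-and-test loop. The following minimal Python sketch is only an illustration of that reading; the fault model, cue names, and functions are invented for the example and are not drawn from the source.

```python
# A minimal sketch (toy fault model, invented cues) of the
# generate-and-test loop behind the stages in Figure 6.

# Each candidate failure is described by the cues it would produce.
FAULT_MODEL = {
    "pump failure": {"low pressure", "high temperature"},
    "valve stuck":  {"low pressure", "no flow"},
    "sensor drift": {"implausible reading"},
}

def generate_hypotheses(evidence):
    """Retain only the failures consistent with all observed cues."""
    return [fault for fault, cues in FAULT_MODEL.items() if evidence <= cues]

def diagnose(initial_alert, further_cues):
    """Alternate between testing hypotheses and gathering more evidence."""
    evidence = {initial_alert}           # perception of the alerting signal
    for cue in further_cues:             # optional search for information
        if len(generate_hypotheses(evidence)) == 1:
            break                        # a single hypothesis is confirmed
        evidence.add(cue)                # otherwise collect another cue
    hypotheses = generate_hypotheses(evidence)
    return hypotheses[0] if len(hypotheses) == 1 else None

print(diagnose("low pressure", ["no flow"]))  # -> valve stuck
```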
Fault diagnosis is a demanding cognitive activity in which the choice of diagnostic strategy is influenced by the amount of information to be processed in developing a mental model of the situation, the number of hypotheses consistent with the available evidence, and the activities required for testing hypotheses. In turn, these factors are influenced by several characteristics of the system or the work environment, including:
• The number of interacting components of the system
• The degrees of freedom in the operation of the system
• The number of system components that can fail simultaneously
• The transparency of the mechanisms of the system
Time constraints and high risk may compound these factors, further increasing the difficulty of fault diagnosis. A particularly demanding situation is dynamic fault management (Woods 1994), whereby operators have to maintain system functions despite technical failures or disturbances. Typical fields of practice where dynamic fault management occurs are flight deck operations, control of space systems, anesthetic management, and process control.
When experienced personnel perform fault diagnosis, they tend to use several heuristics to overcome their cognitive limitations. Although heuristics may decrease mental workload, they may often lead to cognitive biases and errors. It is worth considering some heuristics commonly used when searching for information, interpreting a situation, and making diagnostic decisions.
Failures in complex systems may give rise to large amounts of information to be processed. Experienced people tend to filter information according to its informativeness, that is, the degree to which it helps distinguish one failure from another. However, this filtering may itself be biased. For instance, diagnosticians may accord erroneous or equal informative value to a pattern of cues ("as-if" bias: Johnson et al. 1973). In other cases, salient cues, such as noises, bright lights, and abrupt onsets of intensity or motion, may receive disproportionate attention (salience bias: Payne 1980).
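As an illustration of informativeness as a filtering criterion, the toy sketch below scores a cue by how evenly it splits the remaining candidate failures; the faults and cues are invented, and real diagnostic settings would call for probabilistic measures such as information gain.

```python
# Toy scoring of cue informativeness (invented faults and cues): a cue
# shared by all candidate failures cannot distinguish between them.
FAULTS = {
    "pump failure": {"low pressure", "high temperature", "noise"},
    "valve stuck":  {"low pressure", "no flow"},
    "leak":         {"low pressure", "noise"},
}

def informativeness(cue, candidates):
    """0 when the cue separates nothing; maximal for an even split."""
    present = sum(cue in FAULTS[fault] for fault in candidates)
    return min(present, len(candidates) - present)

for cue in ("low pressure", "noise", "no flow"):
    print(cue, informativeness(cue, list(FAULTS)))
# "low pressure" scores 0 because every fault produces it; "noise" and
# "no flow" each score 1 because they actually separate the hypotheses.
```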
Diagnosis may be carried out under psychological stress, which can impose limitations on entertaining a set of hypotheses about the situation (Rasmussen 1981; Mehle 1982; Lusted 1976). To overcome this cognitive limitation, experienced diagnosticians may use alternative strategies for hypothesis generation and testing. For instance, diagnosticians may start with the hypothesis that is seen as the most probable one; this probability may be subjective, based on past experience, or communicated by colleagues or the organizational hierarchy. Alternatively, they may start with a hypothesis associated with a high-risk failure, or with a hypothesis that can be easily tested. Finally, they may start with a hypothesis that readily comes to mind (availability bias: Tversky and Kahneman 1974).
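These ordering strategies can be expressed as different sort keys over the same candidate set. The sketch below is purely illustrative; the hypotheses, probabilities, risks, and test costs are invented numbers.

```python
# Invented candidates: (name, subjective probability, risk if true, test cost)
hypotheses = [
    ("sensor drift", 0.60, 0.1, 1.0),
    ("valve stuck",  0.30, 0.5, 2.0),
    ("pump failure", 0.10, 0.9, 5.0),
]

most_probable = max(hypotheses, key=lambda h: h[1])  # start with the likeliest
highest_risk  = max(hypotheses, key=lambda h: h[2])  # start with the riskiest
easiest_test  = min(hypotheses, key=lambda h: h[3])  # start with the cheapest test

print(most_probable[0])  # -> sensor drift
print(highest_risk[0])   # -> pump failure
print(easiest_test[0])   # -> sensor drift
```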
Another heuristic used in diagnosis is anchoring, whereby initial evidence provides a cognitive anchor that sustains the diagnostician's belief in the first hypothesis over the others (Tversky and Kahneman 1974). Consequently, people may tend to seek information that confirms the initial hypothesis and avoid any disconfirming evidence (Einhorn and Hogarth 1981; DeKeyser and Woods 1990). This bias is also known as cognitive tunnel vision (Sheridan 1981).
Finally, Einhorn and Hogarth (1981), Schustack and Sternberg (1981), and DeKeyser and Woods (1990) describe situations where diagnosticians tend to seek, and therefore find, information that confirms the chosen hypothesis and to avoid information or tests whose outcomes could reject it. This is known as confirmation bias. Two possible causes of confirmation bias have been proposed: (1) people seem to have greater cognitive difficulty dealing with negative information than with positive information (Clark and Chase 1972); and (2) abandoning a hypothesis and formulating a new one requires more cognitive effort than searching for and acquiring information consistent with the first hypothesis (Einhorn and Hogarth 1981; Rasmussen 1981).
Decision Making
Decision making is a cognitive process whereby a person tries to choose a goal and a method that will stabilize the system or increase its effectiveness. In real-world situations, goals may be insufficiently defined, and thus goal setting becomes part of decision making. Furthermore, the evaluation criteria for choosing among options may vary, including economic, safety, and quality considerations. In such cases, performing well on most criteria may become the basis for a good decision. Managerial planning, political decisions, and system design are typical examples of decision making.
Traditional models of decision making have adopted a rational approach that entails the following stages (a toy sketch follows the list):
1. Determining the goal(s) to be achieved and the evaluation criteria
2. Examining aspects of the work environment
3. Developing alternative courses of action
4. Assessing alternatives
5. Choosing an optimal course of action
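As a toy sketch of stages 3 to 5, the example below uses a weighted sum of evaluation criteria; the alternatives, criteria scores, and weights are invented for illustration and would come from the actual decision problem.

```python
# Invented alternatives scored (0-1) on economic, safety, and quality criteria.
WEIGHTS = {"economic": 0.3, "safety": 0.4, "quality": 0.3}

alternatives = {
    "repair in place": {"economic": 0.8, "safety": 0.6, "quality": 0.5},
    "replace unit":    {"economic": 0.4, "safety": 0.9, "quality": 0.9},
    "defer action":    {"economic": 0.9, "safety": 0.2, "quality": 0.3},
}

def score(criteria):
    """Weighted-sum evaluation of one alternative (stage 4)."""
    return sum(WEIGHTS[name] * value for name, value in criteria.items())

best = max(alternatives, key=lambda a: score(alternatives[a]))  # stage 5
print(best)  # -> replace unit, under these particular weights
```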
Rational models of decision making view these stages as successive; however, in real fields of practice their sequence may change (Marmaras et al. 1992; Klein 1997; Dreyfus 1997) as a function of several factors in the work environment.
Factors that could influence the difficulty of decision making include:
• The amount of information that the decision maker has to consider
• The uncertainty of available information
• The dynamic nature of the work environment
• The complexity of evaluation criteria
• The number of alternative courses of action that can be developed
• The time constraints imposed on the decision maker
• The risk related to possible decisions
To overcome limitations in human cognition, experienced decision makers may use a number of heuristics. As in diagnosis, heuristics may allow decision makers to cope with the complexity of real-world situations, but they may also lead to cognitive biases and errors. For instance, decision makers may use information selectively and make speculative inferences based on limited data. Hayes-Roth (1979) has called this heuristic "opportunistic thinking," which is similar to Simon's (1978) concept of "satisficing and search cost"; in other words, decision makers may pursue a trade-off between seeking more information and minimizing the cost of obtaining it. Although this strategy can simplify decision making, important data may be neglected, leading to erroneous or suboptimal decisions.
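A minimal sketch of this trade-off, assuming an invented aspiration level, per-option search cost, and option utilities: search stops at the first option that is good enough rather than continuing to the best one.

```python
ASPIRATION_LEVEL = 0.7   # invented "good enough" threshold
SEARCH_COST = 0.05       # invented cost of evaluating one more option

def satisfice(options):
    """Return the first acceptable option and the search cost incurred."""
    cost = 0.0
    for name, utility in options:
        cost += SEARCH_COST               # every evaluation has a price
        if utility >= ASPIRATION_LEVEL:   # good enough: stop searching
            return name, cost
    return None, cost                     # nothing cleared the threshold

options = [("option A", 0.55), ("option B", 0.75), ("option C", 0.95)]
print(satisfice(options))  # -> ('option B', 0.1); option C is never examined
```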
To cope with complexity and time pressure, decision makers tend to acquire just enough evidence to form a mental representation of the situation and to examine a rather limited set of alternative actions (Marmaras et al. 1992). Instead of generating a complete set of alternatives at the outset and subsequently performing an evaluation based on optimizing several criteria, decision makers may start with an option that has been only incompletely assessed. This heuristic can be attributed to the fact that experienced decision makers possess a repertoire of well-practiced responses accessed through recognition rather than conscious search (Simon 1978). Limited consideration of alternatives, however, can lead to ineffective practices. For instance, if the situation at hand differs in subtle ways from previous ones, suboptimal solutions may be adopted. Furthermore, in domains such as system design and managerial planning, innovative solutions and radical departures may be of great advantage.
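The recognition-based access described here can be caricatured as a lookup into a repertoire rather than a search through alternatives; the situation patterns and responses below are invented, and the exact-match lookup also hints at the caveat above, since a subtly different situation simply fails to match.

```python
# Invented repertoire: situation patterns mapped to well-practiced responses.
REPERTOIRE = {
    ("low pressure", "no flow"):   "close bypass valve",
    ("high temperature", "noise"): "shut down pump",
}

def recognize(situation):
    """Return the practiced response for a recognized pattern, or None,
    which would force slower, deliberate reasoning about alternatives."""
    return REPERTOIRE.get(tuple(sorted(situation)))

print(recognize({"no flow", "low pressure"}))  # -> close bypass valve
print(recognize({"vibration"}))                # -> None (a novel situation)
```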
In dynamic systems, the distinguishing features of a situation may change over time and new events may accumulate. Operators should reflect on their thinking and revise their assessment of the situation or their earlier decisions in order to take account of new evidence, interruptions, and negative feedback (Weick 1983; Schon 1983; Lindblom 1980). Under stress, however, experienced decision makers may fixate on earlier decisions and fail to revise them at later stages. Thinking/acting cycles may compensate to some extent for cognitive fixation on earlier decisions. That is, initial decisions can be made on the basis of experiences from similar situations, but their suitability can be evaluated after the first outcomes; in this way, decision makers can undertake corrective actions and tailor earlier decisions to new circumstances.
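As a schematic of such thinking/acting cycles, the sketch below revises an initial setpoint from feedback after each action; the system dynamics, gain, and target are invented.

```python
def act(setpoint):
    """Stand-in for effecting a decision; returns the observed deviation."""
    TARGET = 0.42                    # the (initially unknown) best setpoint
    return TARGET - setpoint         # feedback observed after acting

def thinking_acting_cycles(initial_decision, rounds=5, gain=0.5):
    decision = initial_decision      # based on experience of similar cases
    for _ in range(rounds):
        feedback = act(decision)     # evaluate the first outcomes
        decision += gain * feedback  # corrective action instead of fixation
    return decision

print(round(thinking_acting_cycles(0.0), 3))  # -> 0.407, approaching 0.42
```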
Supporting Diagnosis and Decision Making
To overcome these weaknesses of human cognition, many engineering disciplines, such as artificial intelligence, operations research, and supervisory control, have pursued the development of stand-alone expert systems that support humans in diagnosis and decision making. Although these systems have made a significant contribution to system design, their performance tends to degrade in unfamiliar situations, and humans find it rather awkward to cooperate with them. Operators are forced to repeat the whole diagnostic or decision-making process instead of taking over from the computer advisor. There is a need, therefore, to develop cognitive advisors that enhance the cognitive processes of operators rather than computer advisors capable of independent performance (Woods and Hollnagel 1987; Roth et al. 1987).
To combat several biases related to diagnosis and decision making, support can be provided to human operators in the following ways:
• Bring together all information required to form a mental representation of the situation.
• Present information in appropriate visual forms.
• Provide memory aids.
• Design cognitive aids for overcoming biases.
• Make systems transparent to facilitate perception of different system states.
• Incorporate intelligent facilities for hypothesis testing and evaluation of options.
Ecological interfaces (Vicente and Rasmussen 1992), following the arguments presented earlier, provide a good example of artifacts that meet some of these requirements. Other cognitive aids are presented in Section 5.
Real-world situations requiring diagnosis and decision making vary widely, each presenting its own specific characteristics. As a result, designing artifacts and cognitive aids requires an exhaustive analysis of the demands of the situation, the user requirements, and the strategies users follow in achieving satisfactory performance. This type of analysis, usually referred to as cognitive analysis, is valuable in putting human performance models into actual practice.