Engineering Psychology and Human-Machine Interaction

As a specific sub-discipline of industrial psychology, engineering psychology (EP) underpins many of the services we provide that address problems rooted in the structure of organizations. These include services covered on other pages of this site, such as Error and Accident Troubleshooting, and Risk Handling. On this page we explain in greater detail what EP is, how it relates to human-machine interaction, and how Psychonomics can apply knowledge and principles from the field to add value to your organization.

Engineering psychology is about designing a system around the information-processing capacities and constraints of the human brain. In contrast, human factors design, which you may be more familiar with, is about designing a system around the physical properties and constraints of the human body - a very different discipline. The EP we practice also draws on a wide literature of documented experiments that elucidate governing principles which transfer to other situations. The specialty of human factors, on the other hand, is much more context-specific, with little transferability. In practice, these distinctions matter to organizations because they determine the type of expertise that needs to be called on.

To a large extent EP is about optimizing the system, and when we talk about systems we mean the system in its widest context. Hence, at Psychonomics, we work to design a system that optimizes the relationships and interactions within the organizational network, whose parts include machines (including computers), operations, human decisions, tasks, knowledge, procedures, and human functioning. As can be seen, there are many people-parts to an organizational system, and any intervention Psychonomics applies with the aim of creating improvement will be directed at these people in some way. When successful, this is also how value is added to companies.

Leaving aside for the moment the wider need to re-engineer the whole system where required, within the system itself human performance is key: how the brain perceives, decides and acts (the perception-decision-action cycle). This bears on the design not only of tasks, but also of displays and controls - the interface.

An extreme example of how a lack of good system design can end in disaster is the partial core meltdown at the Three Mile Island nuclear power plant, where four major system errors were committed:
1. The alternate feed-water supply pipe was left blocked by a maintenance crew, so the radioactive core no longer received cold water to carry away heat
2. A pressure relief valve failed to close due to a mechanical malfunction
3. The indicator display showed that the pressure relief valve had closed when it had not
4. Operators decided to manually override the system and shut off the emergency pump, believing the coolant level was too high when it was, in fact, too low
Responsibility for the resulting failure was distributed across several sources within the system, and human error was evident at several levels: incorrect decisions, the design of the relief valve, the complexity of confusing information, and the way that information was displayed. Together, these overloaded the cognitive capacities of the people involved. The human mind has intrinsic limitations in the way it attends, perceives, remembers, decides and acts.

At the interface level, much of the EP we do is concerned with re-engineering human-machine interaction (HMI) wherever there are dials, meters, screens and indicators that employees have to work with. Consider: how would you react if a safety warning appeared at the side of a screen as opposed to its center? Research suggests that such a warning should generally be centered, where it is more noticeable. We specialize in addressing these types of problems concerning the interaction of people with computer displays. In this context it is worth bearing in mind that actual performance often departs from predicted or assumed optimal performance - that is, people tend to make systematic errors. Human processing capacities and constraints should therefore drive how a system interface is designed. In cockpit design, for example, dials and displays provide information about current aerodynamic operating parameters, engine performance, and primary, secondary and ancillary systems. The pilot needs to be able to get feedback and to control the system when necessary. Color, layout, control input (button, switch, stick, lever) and alarms all affect usability. Similarly, indicator lights can vary not only in color but also in intensity, in whether they flash, and in the frequency of the flash. All these factors and more underlie the effective use of interfaces by a human operator.
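To give a concrete flavor of the design space, the parameters of a single warning indicator can be captured and sanity-checked in a few lines of code. The following Python sketch is purely illustrative - the numeric thresholds are assumptions standing in for whatever human-factors guidelines apply, not published standards:

```python
from dataclasses import dataclass

@dataclass
class IndicatorLight:
    """Design parameters an EP review of a warning indicator covers."""
    color: str              # e.g. "red" for danger, "amber" for caution
    intensity_cd_m2: float  # luminance, in candela per square meter
    flashes: bool = False
    flash_hz: float = 0.0   # flash frequency; 0 for a steady light

def review_indicator(light: IndicatorLight) -> list[str]:
    """Flag parameter choices that may undermine noticeability.
    All thresholds here are illustrative placeholders."""
    issues = []
    if light.flashes and not (1.0 <= light.flash_hz <= 5.0):
        issues.append("flash rate outside the assumed 1-5 Hz band")
    if light.color not in ("red", "amber"):
        issues.append("non-standard warning color may slow recognition")
    if light.intensity_cd_m2 <= 0:
        issues.append("intensity must exceed ambient luminance")
    return issues

print(review_indicator(
    IndicatorLight(color="red", intensity_cd_m2=150.0,
                   flashes=True, flash_hz=12.0)))
# -> ['flash rate outside the assumed 1-5 Hz band']
```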

From the EP perspective, the system design we apply also includes training, recruitment and selection components. Relevant psychological findings are used to determine how an operator can best be taught the skills for safe and efficient operation of the system's mechanisms. Of concern here are such factors as conceptual training versus rote learning; part-task versus whole-task learning; forgetting; verbal versus spatial abilities; and learning with words versus pictures or graphs. It is our job to determine what will work best in which situation. A similar approach is used for the selection of operators: we help select the most appropriate individuals for your organization, bearing in mind the competencies required and the assessed ability of potential employees to learn new skills. This is where assessment centers really come into their own (see: Selection and Recruitment).

So what kinds of factors do we look at when a problem materializes? A change in human functioning affecting quality, say, or any other organizational function - or the appearance of error - can result from a breakdown in any of the following components, among others, when a person has to deal with a new stimulus (by stimulus we mean any input registered by one of the five senses):

Perception: The recognition and encoding of a stimulus. The stimulus may also make contact with a previously learned neural code, such as the different forms of the letter 'a' - spoken or written, produced by a male or female voice, and so on - creating a complex response.

Sensory memory: A very brief store that prolongs a representation of a stimulus for a short time after the stimulus has ended, even when attention has been diverted elsewhere.

Nature of stimulus: A stimulus may have different levels of complexity that have to be recognized. The operator's task requirements must also be considered: a task may require only a dichotomous yes/no response (detecting a blip on a radar screen), or it may require recognition, identification and categorization - so-called absolute judgment tasks, such as judging the smoothness of a steel plate, the loudness of a note, or the size of a crowd. The information demands of such tasks can be quantified, as sketched below.
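In information-theoretic terms, a perfect absolute judgment over n equally likely categories transmits log2(n) bits, and classic work on the 'magical number seven' reports that people reliably transmit only around 2-3 bits along a single stimulus dimension. A minimal Python sketch of the arithmetic:

```python
import math

def bits_required(n_categories: int) -> float:
    """Information a perfect absolute judgment over n equally
    likely categories would transmit: log2(n) bits."""
    return math.log2(n_categories)

# Dichotomous blip detection vs. rating smoothness on a 10-point scale.
print(bits_required(2))   # 1.0 bit
print(bits_required(10))  # ~3.32 bits - already past the ~2.5-bit
                          # capacity classically reported for one dimension
```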

Feedback and information flow: The constant monitoring and updating of the decision and execution process, which can also be altered by internal thoughts.

Attention: Characterized as a searchlight that selects information sources to process. Less attentional resource is required for a given task as skill increases. An example is holding a conversation while driving, where attention is largely automatic - but the conversation stops when a traffic situation suddenly demands full concentration.

Signal detection and vigilance: People perceive external signals differently, and the complexity of a signal-detection task, such as radar operation, depends on factors like the number of events to be processed, the time the operator has to process each one, and the difficulty of the task. Events near the perceptual threshold are often hard to detect - for example, a guard trying to notice a thief across a bank of security monitors. Signal detection theory gives this a quantitative footing, as sketched below.
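Signal detection theory separates an operator's sensitivity from their response bias: the index d' is the difference between the z-transformed hit rate and false-alarm rate. A minimal Python sketch, using made-up figures for illustration:

```python
from statistics import NormalDist

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Sensitivity index from signal detection theory:
    d' = z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

# Hypothetical guard: detects 80% of genuine intrusions, but raises a
# false alarm on 10% of quiet intervals.
print(round(d_prime(0.80, 0.10), 2))  # -> 2.12
```

Tracking d' across a shift is one way to detect the classic vigilance decrement.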

Memory: Characterized as both short-term (working) memory and long-term memory. The latter can be depicted as having three stages: 1) putting information in - encoding, learning and training; 2) the way information is held - storage; 3) getting information out - retrieval. Each stage has potential difficulties for organizations. Forgetting is one: pilots may forget complex navigational instructions before implementing them, and a plant operator may forget which dial he was looking at. Complexity is a related problem: adding ever more labels to a computer screen - as on air traffic control displays - is likely to confuse or overload the operator. 'Interference' is another: concurrent activities that require additional mental processing can disrupt items held in working memory at the same time.

Decision-making: Perception may trigger the need to decide on a course of action, but the information available is probabilistic and imperfect. An internal hypothesis is formed (e.g. about why the dial is in the red), involving an interplay between short-term memory (alternative hypotheses based on current cues) and long-term memory (previously stored hypotheses). Risk is also evaluated in some way - costs versus benefits (what would it cost to shut down the reactor, versus the potential benefit of doing so). Feeding into this is the tendency for people to bring biases to the decision-making process and to rely on heuristics (mental shortcut rules). The cost-benefit side can be made explicit, as in the sketch below.
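One simple way to make the cost-benefit trade-off explicit is an expected-cost calculation over the operator's current belief that a malfunction is real. The probabilities and costs in this Python sketch are entirely hypothetical:

```python
def expected_costs(p_malfunction: float, cost_if_ignored: float,
                   cost_of_shutdown: float) -> dict:
    """Expected cost of each action, given the operator's belief
    that a genuine malfunction is under way."""
    return {
        "continue":  p_malfunction * cost_if_ignored,
        "shut down": cost_of_shutdown,  # incurred with certainty
    }

# Hypothetical figures: a 5% chance of a malfunction costing 100m if
# ignored, against a certain 2m cost for a precautionary shutdown.
costs = expected_costs(0.05, 100e6, 2e6)
for action, cost in costs.items():
    print(f"{action}: {cost:,.0f}")
print("cheaper in expectation:", min(costs, key=costs.get))
# continue: 5,000,000 / shut down: 2,000,000 -> shut down
```

Real operators rarely compute this way, of course; the biases and heuristics mentioned above are precisely the ways their judgments depart from such a calculation.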

Reaction time: How fast the operator responds to a stimulus. This ability interacts with working-memory retrieval - whether it involves serial or parallel processing, and how the information is actually retrieved. When two stimuli arrive in close succession they may compete, slowing the response to the second (the psychological refractory period). Reaction time also grows with the number of response alternatives, as sketched below.
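A classical quantitative result here is the Hick-Hyman law: choice reaction time grows roughly with the logarithm of the number of equally likely alternatives. The coefficients in this Python sketch are illustrative; in practice they are fitted to data for a given operator and task:

```python
import math

def choice_reaction_time(n_alternatives: int,
                         a: float = 0.2, b: float = 0.15) -> float:
    """Hick-Hyman law: RT = a + b * log2(N + 1) seconds,
    with a and b estimated empirically."""
    return a + b * math.log2(n_alternatives + 1)

for n in (1, 2, 4, 8):
    print(n, "alternatives:", round(choice_reaction_time(n), 3), "s")
# 1 -> 0.35 s, 2 -> 0.438 s, 4 -> 0.548 s, 8 -> 0.675 s
```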

Arousal: This relates to alertness. Very high levels of stress and arousal degrade performance, and the optimal level of arousal for a simple task is higher than for a complex one - which matters for an operator faced with a complex, critical situation. High-arousal situations also cause strategic shifts in processing, from accurate to fast (and therefore more error-prone), often driven by the felt need to 'do something'. Hence the requirement in a nuclear power plant that, following an alarm, operators perform no physical action until the nature of the malfunction is more clearly known. The inverted-U relationship at work here is caricatured below.
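The inverted-U relationship between arousal and performance (the Yerkes-Dodson law) can be caricatured numerically. The Gaussian form and parameter values in this sketch are our own illustrative choices, not a fitted model:

```python
import math

def performance(arousal: float, optimum: float, width: float = 0.25) -> float:
    """Inverted-U caricature of the Yerkes-Dodson law: performance
    peaks at an optimum arousal level and falls off on either side."""
    return math.exp(-((arousal - optimum) ** 2) / (2 * width ** 2))

# Simple tasks tolerate - indeed benefit from - higher arousal than
# complex ones, so their assumed optimum sits further to the right.
for arousal in (0.2, 0.5, 0.8):
    print(arousal,
          "simple:",  round(performance(arousal, optimum=0.7), 2),
          "complex:", round(performance(arousal, optimum=0.4), 2))
```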

After conducting an extensive evaluation using methods such as task analysis and heuristic evaluation, Psychonomics takes into account all the psychological factors in evidence and then creates workable design solutions for the part of the organizational system that is most problematic - or, where necessary, for the entire organizational system. As we've discussed, this often involves redesigning or adding to training, procedures, protocols, recruitment practices, or user interfaces. Sometimes the solution is simple, such as making an interface present information to operators in chunks rather than all at once (see the sketch below), or implementing a basic training simulation; sometimes it is complex, as when a major interactive part of the system involving many people, procedures and machines has to be changed. Yet whatever the problem clients ask us to deal with, the thrust of our approach is to implement EP methods that reduce the load on human processing, and strategies that allow employees to cope better with the task demands they face. Throughout, we also aim to create error-tolerant systems that will stand the test of time.
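The 'chunking' fix just mentioned can be as small as regrouping a long code before it reaches the screen. A minimal Python sketch:

```python
def chunk(code: str, size: int = 3) -> str:
    """Regroup a long identifier into small chunks so it sits more
    comfortably within working-memory limits (cf. Miller's 7 +/- 2)."""
    return " ".join(code[i:i + size] for i in range(0, len(code), size))

# A raw valve identifier versus its chunked display form:
print(chunk("170584236912"))  # -> "170 584 236 912"
```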
