Automation is great, but it can create some weird problems

By intouch • posted 20-09-2018 10:42
Although automation is undoubtedly benefitting many industries, an experienced forensic engineer has warned that “weird things happen” when automation collides with humans.


Forensic structural engineer Dr Sean Brady opened the Queensland Mining Industry Health and Safety Conference on the Gold Coast with a keynote address titled ‘The Ironies of Automation’, exploring how increasing automation in an industry often produces unexpected and negative results.

Using the near-catastrophic partial meltdown of the Three Mile Island nuclear power plant and a study into Swedish maternity hospitals as examples, Brady said “weird things happen” when automation is introduced into human systems, and vice versa.

“We have to be really careful when we have a system that’s controlled by humans, with lots of human tasks, and we then go and introduce automation on top of that,” he said.

“People are complex, and introducing automation among people has funny side effects.”

With the Foundation for Young Australians estimating that 40% of Australian jobs are at high risk of automation in the next 10-15 years, it’s clear that automation is going to play a greater role in many industries, including engineering.

Brady explained that although it’s often thought that automating certain tasks ensures errors won’t occur, it can actually create new, unexpected errors.  

“We believe that when we automate tasks in a system we remove the human error. It turns out we don’t – we just create a whole new type of error,” he said.

“Interfaces are created between tasks that didn’t even exist in the old system. Every one of them is what we class as a new pathway for success, but it’s also a new pathway for failure. That’s why we’re able to fail systems in ways designers never even envisaged – we’re not really taking away the human error.”

Automation in the maternity ward

In the Swedish maternity hospital example, Brady explained that an attempt to automate what appeared to be a simple system showed just how complex human decision making is.

Before automation was introduced, each expectant mother was connected to an ECG that monitored, among other things, the baby’s heart rate. There was always a nurse or midwife with the patient, and if the ECG readings indicated that intervention might be necessary, the nurse would call the doctor, who would decide whether to go ahead with an intervention. Intervening too late could have catastrophic consequences, but intervening too early could also have unfortunate results.

The decision was made that all patients’ readings would be displayed on an electronic board in the hospital breakroom, so that nurses didn’t have to constantly check patients’ readings and would know immediately if intervention was needed.

What should, in theory, have made the hospital run more smoothly had the opposite effect – it threw the hospital into chaos.

Why? Because it ignored the fact that nurses judged when to call a doctor based not only on the ECG readings, but also on whether the doctor on call was known to be jumpy and prone to intervening early, or inclined to delay intervention.

Likewise, doctors judged how quickly they responded to a nurse’s call based on their knowledge of that nurse, and whether they typically called for intervention too early or usually timed it right.

When the ECG readings were broadcast to the whole ward, these complex interpersonal relationships and multilayered decision-making were disrupted – nurses felt pressure to call a doctor even when they would usually have waited longer, and doctors observing the readings became frustrated if the nurse didn’t call them.

A nuclear debacle

Referring to the partial meltdown of the Three Mile Island nuclear power station, Brady explained that the physical cause of the disaster – a valve that was stuck open, allowing nuclear reactor coolant to escape – went undetected for a dangerous amount of time because the design of the pilot-operated relief valve indicator light was fundamentally flawed.

The bulb was simply wired in parallel with the valve solenoid, so a dark lamp merely implied that the pilot-operated relief valve was shut; it did not verify the valve’s actual position.

When everything was operating correctly, the indication was true and the operators got used to relying on it. However, when things went wrong and the main relief valve stuck open, the unlighted lamp was actually misleading the operators by implying that the valve was shut. This caused the operators considerable confusion because the pressure, temperature and coolant levels in the primary circuit, so far as they could observe them via their instruments, were not behaving as they would have if the pilot-operated relief valve were shut.
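To make the flaw concrete, here is a minimal, hypothetical sketch (the class and function names are illustrative only, not the plant’s actual control logic) contrasting an indicator that reports the command sent to the valve with one that reports the valve’s measured position:

```python
# Hypothetical sketch of the indicator-light flaw. Names are illustrative,
# not taken from the Three Mile Island control system.

class ReliefValve:
    def __init__(self):
        self.solenoid_energised = False  # what the control system commands
        self.stuck_open = False          # physical fault the command cannot see

    def command_close(self):
        self.solenoid_energised = False  # de-energise the solenoid to close the valve

    def is_physically_open(self):
        # True position: follows the command unless the valve is mechanically stuck
        return self.solenoid_energised or self.stuck_open


def indicator_lamp_command_driven(valve: ReliefValve) -> bool:
    """Lamp wired in parallel with the solenoid: it reports the COMMAND only."""
    return valve.solenoid_energised


def indicator_lamp_feedback_driven(valve: ReliefValve) -> bool:
    """Lamp driven by a position (limit) switch: it reports the VALVE itself."""
    return valve.is_physically_open()


valve = ReliefValve()
valve.solenoid_energised = True    # valve opened to relieve pressure
valve.stuck_open = True            # ...and then mechanically sticks open
valve.command_close()              # operators command it shut

print(indicator_lamp_command_driven(valve))   # False -> lamp dark, "valve shut" (misleading)
print(indicator_lamp_feedback_driven(valve))  # True  -> lamp lit, valve actually still open
```

In this sketch, the command-driven lamp goes dark as soon as the close command is issued, even though the valve is still stuck open – exactly the kind of misleading “everything is fine” signal the Three Mile Island operators were given.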

Brady referred to this unexpected change in the situation as moving to a different “mode”; the operators weren’t able to deal with the problem because the automated equipment was no longer operating by the rules they knew.

Automation down the road

The implications for driverless cars are interesting, to say the least. Brady's advice was to design automation with human behaviour in mind. 

“The key thing, if you really want to design and automate a system so it won’t fail, is that you have to view that first implementation of the automation as a first draft,” Brady said.

“Then, you have to monitor very carefully how that’s being used, and if it’s being used in a different way than it was intended – which it will be – don’t go and slap people on the wrist and give them new procedures; go back and recut the automation to be more consistent with how the humans work.

“Automation is unstoppable; we’re going to keep automating things, and it’s great! The word of caution is if you automate and you expect the systems to remain the same, that won’t happen. Those systems will change, and because they change they will open up new pathways for failure that weren’t there before.”

You can watch the full presentation here. 