Automation bias is the tendency to favor suggestions from automated decision-making systems and to discount contradictory information from non-automated sources, even when that information is correct. It stems from the psychological tendency to evaluate decisions made by machines or automated systems more favorably than those made by humans, and it can lead to serious errors in critical environments such as aviation, healthcare, and nuclear facilities.
For instance, a pilot who relies solely on an automated navigation system during flight may overlook engine-performance warnings that the system does not display, allowing a dangerous situation to develop.
To mitigate automation bias, individuals should be trained to critically evaluate automated suggestions and to cross-check system outputs regularly against independent sources or additional data.
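The idea of an independent check can be illustrated with a minimal sketch in Python; the names, values, and tolerance below are hypothetical and serve only to show the pattern of not accepting an automated output on its own authority.

# A minimal sketch of an independent cross-check. The function name,
# example values, and tolerance are hypothetical, chosen only to
# illustrate escalating disagreement to a human instead of deferring
# to the automated reading.

def cross_check(automated_value: float,
                independent_value: float,
                tolerance: float = 0.05) -> str:
    """Compare an automated reading with an independent one.

    Returns "accept" when the two agree within the relative tolerance,
    and "review" when they diverge, signalling that a human should
    inspect the disagreement rather than defer to the automation.
    """
    reference = max(abs(automated_value), abs(independent_value), 1e-9)
    relative_gap = abs(automated_value - independent_value) / reference
    return "accept" if relative_gap <= tolerance else "review"

# Example: an automated fuel-flow estimate against a backup gauge.
print(cross_check(1200.0, 1195.0))  # small gap -> "accept"
print(cross_check(1200.0, 350.0))   # large gap -> "review"

In practice the independent source could be a second sensor, a manual calculation, or another person's judgement; the essential point is that disagreement triggers scrutiny rather than automatic deference to the automated system.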