Trust in artificial intelligence: how to find a balance between safety and economic efficiency
In an environment where artificial intelligence (AI) is increasingly used in industry, especially at high-risk facilities, it is extremely important to understand how people interact with these new technologies. Alexander Venger, a professor in the Department of Psychology at Dubna State University, and Victor Dozortsev, a professor at MIPT, have investigated this topic. The results of their work were published in the international journal Mathematics.
Today, AI can identify situations that could lead to an industrial accident. Having detected such a threat, the system prompts the operator to take certain actions, for example, changing the process mode or even stopping it altogether. However, a complete shutdown of production entails serious economic losses. At the same time, AI predictions cannot be absolutely accurate, and the decision on further actions always remains with the human operator.
How can the need to prevent an accident be balanced against the desire to avoid unjustified economic losses? The researchers set out to answer this question.
They developed a mathematical model that helps the operator make decisions based on AI recommendations. The model "incorporates" a balance between safety, potential economic efficiency, and the individual psychological characteristics of the operator.
The developers of the model focused on situations where the cost of an error is disproportionately high: high-risk industries, such as nuclear power plants, where an incorrect decision can lead to catastrophic consequences.
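To see the kind of trade-off such a model has to formalize, consider a simple expected-cost comparison. The sketch below is only an illustration under assumed names and figures; it is not the authors' model.

```python
# Illustrative sketch (not the authors' model): when an AI alarm is raised, the
# operator effectively weighs the certain cost of a shutdown against the expected
# cost of continuing operation. All names and numbers here are assumptions.

def expected_loss_if_continuing(p_accident: float, accident_cost: float) -> float:
    """Expected loss of keeping the process running despite the warning."""
    return p_accident * accident_cost

def should_stop(p_accident: float, accident_cost: float, shutdown_cost: float) -> bool:
    """Stop when the expected loss of continuing exceeds the cost of a shutdown."""
    return expected_loss_if_continuing(p_accident, accident_cost) > shutdown_cost

# Example: even a 2% chance of a catastrophic accident outweighs a costly shutdown.
print(should_stop(p_accident=0.02, accident_cost=1_000_000_000, shutdown_cost=5_000_000))
# -> True
```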
The researchers identified two key factors that influence how much an operator trusts AI emergency warnings:
The first is "operator caution." This characteristic reflects an individual's level of risk tolerance: the more cautious the operator, the lower the probability of an accident they consider acceptable.
The second factor is "tendency to doubt." This characteristic determines how reliable information must be before the operator will act on it: the stronger the tendency to doubt, the more time the operator spends collecting and verifying data.
Each characteristic has both positive and negative sides. For example, an overly cautious operator may be prone to unjustified production stops, which lead to large economic losses, while an operator with a weak tendency to doubt risks making an erroneous decision based on unverified information.
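A minimal sketch of how these two traits could enter a decision rule is given below: "caution" is treated as the accident probability an operator is willing to tolerate, and "tendency to doubt" as the reliability an AI warning must reach before the operator acts on it. The parameterization and thresholds are assumptions chosen for illustration, not the model published in Mathematics.

```python
# Minimal sketch (assumed parameterization, not the published model): "caution" sets
# the accident probability an operator will tolerate, while "tendency to doubt" sets
# the reliability an AI warning must reach before the operator acts on it.
from dataclasses import dataclass

@dataclass
class Operator:
    acceptable_risk: float       # lower value = more cautious operator
    required_reliability: float  # higher value = stronger tendency to doubt

def operator_decision(op: Operator, predicted_accident_prob: float,
                      alarm_reliability: float) -> str:
    """Return the action an operator with the given traits would take."""
    if alarm_reliability < op.required_reliability:
        return "verify"    # the doubting operator gathers more data before acting
    if predicted_accident_prob > op.acceptable_risk:
        return "stop"      # the cautious operator shuts the process down
    return "continue"      # the risk is within what the operator tolerates

cautious = Operator(acceptable_risk=0.01, required_reliability=0.6)
trusting = Operator(acceptable_risk=0.10, required_reliability=0.3)

# The same AI warning leads to different actions depending on the operator's profile.
print(operator_decision(cautious, predicted_accident_prob=0.05, alarm_reliability=0.7))  # stop
print(operator_decision(trusting, predicted_accident_prob=0.05, alarm_reliability=0.7))  # continue
```

As the example shows, identical warnings can produce different decisions depending on an operator's individual profile, and it is precisely this individual variation that the planned simulators are intended to reveal.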
In the future, the developers plan to create simulators that reproduce real production processes in order to identify operators' individual decision strategies. This will allow for more effective professional selection of personnel for high-risk facilities. In addition, based on the data obtained, it will be possible to develop training programs aimed at teaching operators to make optimal decisions under conditions of high responsibility.
The work of Venger's team has already laid the groundwork for research on training personnel for the nuclear industry.