Do you trust a self-driving car?

A smart thermostat that warms up your house before you get home from work, a Google Home speaker that puts on your favorite music, or a navigation system that effortlessly guides you to the right location. Every day we use all kinds of technologies to which we give more and more responsibility and on which we gradually become more dependent. Aren't we relying too much on all these devices and systems?

To answer this question, I visited the lab of the Faculty of Behavioural, Management and Social Sciences (BMS) at the University of Twente, where researchers study behaviour, safety and many other aspects of the interaction between humans and technology. They use virtual reality (VR) to do so.

Confidence meter

The confidence meter of the BMS lab at the University of Twente. It lets test subjects indicate, without looking, how much or how little confidence they have in the car.

Renée Moezelaar for NEMO Kennislink

The BMS lab turns out to occupy an entire wing of the building. It consists of various rooms, each with its own purpose, often equipped with large computer screens, projectors and observation equipment. In one of the rooms we find a huge screen that takes up almost the entire wall, with a stripped-down version of a car in front of it: a car seat with a steering wheel, pedals and a gear lever.

“People sit in the car with VR glasses on. We then start a simulation, which makes it seem as if you are really driving,” says lead programmer Lucía Rábago. “We can adapt everything about the car and the environment. Meanwhile, we measure how well people drive, what they look at and even how much they sweat.” The Twente researchers have even developed their own confidence meter: a very simple wooden box with a slider that can be moved from high to low confidence. Rábago: “Test subjects can indicate without looking to what extent they still trust the system.”
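For readers curious what combining such measurements might look like in software, here is a minimal, purely illustrative sketch in Python of logging simulator telemetry together with the trust-slider reading. Every name, field and value in it is a hypothetical stand-in; the lab's actual software is not described in this article.

```python
# Illustrative sketch only: log simulator telemetry plus a continuous trust
# rating to a CSV file. All functions and fields are hypothetical stand-ins.
import csv
import random
import time


def read_confidence_slider() -> float:
    """Stand-in for reading the slider position (0 = no trust, 1 = full trust)."""
    return random.random()  # replace with a real sensor read-out


def read_telemetry() -> dict:
    """Stand-in for one sample of simulator telemetry."""
    return {
        "speed_kmh": random.uniform(0, 220),
        "gaze_on_road": random.random() > 0.2,
        "skin_conductance": random.uniform(2.0, 12.0),
    }


def log_session(path: str, samples: int = 10, interval_s: float = 0.5) -> None:
    """Write timestamped telemetry plus the trust rating to a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t", "speed_kmh", "gaze_on_road", "skin_conductance", "trust"])
        for _ in range(samples):
            data = read_telemetry()
            writer.writerow([time.time(), data["speed_kmh"], data["gaze_on_road"],
                             data["skin_conductance"], read_confidence_slider()])
            time.sleep(interval_s)


if __name__ == "__main__":
    log_session("session_log.csv")
```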

Overestimation

The simulation has been used, among other things, to measure confidence in self-driving cars. “Francesco Walker, a former PhD student, looked at what people expected from a self-driving car beforehand and how that changed while they were driving,” says Rábago. “He also looked at whether different systems on the car's dashboard influenced this confidence.”

Many test subjects who had never driven a self-driving car before turned out to overestimate the car's capabilities. They had too much faith in the technology: they relied too heavily on the car's sensors and paid less attention themselves, even when traffic got busier or the sensors stopped working properly because of fog or snow. Changes to the dashboard helped bring this picture in line with reality. Rábago: “Together with the PhD student, we tested various ways to adjust the dashboard without distracting the driver too much. A screen that shows how well the sensors are working, for example, turned out to really help the driver.”

Fairly realistic

To see what such a ride in a VR car feels like, I take a seat behind the wheel myself. The environment is more realistic than I expected, yet little things give away that you're not actually driving. For example, I don't see my own hands on the steering wheel, and it turns out afterwards that I was driving at more than 200 kilometers per hour without noticing it. Rábago recognizes the problem: “We are still working on adding movement to the seat, so that you are really pushed back a little when you accelerate hard and therefore also notice that you are going fast.”

Editor Renée Moezelaar takes a tour of a virtual city.

Renée Moezelaar for NEMO Kennislink

This naturally raises the question of whether such a VR environment is realistic enough to draw real conclusions. “It remains a simulation, but we see that the experience people have comes very close to reality,” says Rábago. “Sometimes we get complaints that the car doesn’t realistically accelerate or brake, but that experience is usually very similar to driving a car that you don’t know yet. You always have to get used to that, and it is no different here.”

And although it is not one hundred percent realistic, the results are valuable nonetheless, according to Jan Willem van ‘t Klooster, director of the BMS lab. “It remains a tool, but it is also an easy way to test adjustments without rebuilding an entire car. In addition, these tests do not endanger people. That’s why we work a lot with car manufacturers, who use our results before they modify their cars.”

A test subject conducts a house search in virtual reality.

Via Esther Kox, with permission

House search

PhD student Esther Kox also thinks that VR can be a good tool for measuring the relationship between humans and machines. Together with the University of Twente and research institute TNO she is studying trust in drones, and she too uses VR for this. In her case, subjects stand on a white platform, held at the waist by a metal ring, with VR glasses on their heads. Kox: “We used to put people behind a computer to play a kind of game. This setup allows a person to walk around and really explore for themselves. That is much more realistic than sitting in front of a computer.”

In the VR environment that Kox uses, the subjects walk through an abandoned house to check whether anyone is still there. They are given a drone that continuously scans for danger and provides feedback about what to expect around the corner. “At some point, the drone says that it is not detecting any danger, but then it turns out that the drone has missed a threat such as an explosive device or a burglar,” says Kox. “The participants are shocked by this. We are looking at how this affects their trust in the drone and how we can prevent or repair a possible breach of trust.”

The first results show that people forgive the drone more quickly if the uncertainty in the measurements is made clear. “When you hear that there is no danger with eighty percent certainty, you also understand that you still have to be a bit on your guard,” says Kox. It could also help to make the drone appear more human. “Making mistakes is human, so people may be more inclined to forgive the drone.”
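To make the idea of communicating uncertainty concrete, here is a minimal, purely illustrative sketch in Python of how a drone's feedback message might state its confidence instead of giving a flat all-clear. The thresholds and wording are assumptions for illustration, not the actual design of the TNO and University of Twente study.

```python
# Illustrative sketch: report detection results together with the drone's
# confidence, and add a warning when that confidence is low.

def feedback_message(danger_detected: bool, confidence: float) -> str:
    """Turn a detection result plus a confidence value (0-1) into spoken feedback."""
    pct = round(confidence * 100)
    if danger_detected:
        return f"Warning: possible threat ahead ({pct}% certain)."
    if confidence < 0.9:
        # Low certainty: tell the user to stay alert rather than implying safety.
        return f"No danger detected, but I am only {pct}% certain. Stay alert."
    return f"No danger detected ({pct}% certain)."


print(feedback_message(False, 0.8))  # "No danger detected, but I am only 80% certain. Stay alert."
print(feedback_message(True, 0.65))  # "Warning: possible threat ahead (65% certain)."
```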

Exploding bomb

Despite the reasonably realistic environment, Kox notices that many test subjects still treat it as a bit of a game: “They know that in principle they can just start again if a bomb explodes.” That is why she also tested soldiers. “There you see a greater breach of trust when the drone makes a mistake, and they are also less likely to forgive the drone,” says Kox. “This is probably because they know what it is like to encounter a bomb, and what the consequences can be if such a bomb explodes.”

Of course it would be even better to investigate the interaction between humans and technology in real life, Kox admits: “That would give you an even more realistic picture, but in our case it also involves a lot of danger. And drones are not allowed everywhere, which makes it difficult. With VR we get around those problems.” That does not mean the researcher never wants to repeat the experiments in real life. “This is just the beginning. Ultimately, we would like to test our results in real situations,” says Kox. “For example, with firefighters who bring along drones to scan for hazardous substances. That already happens now and then. That would provide very valuable information.”

On the market square

Of course, most people will never conduct house searches with drones, which is why the researchers also focus on more everyday situations. “A student, for example, recently looked at how people react to a drone that flies over a park or a festival, for instance for surveillance,” says Van ‘t Klooster. For this research, the student set up the mobile VR lab, a large bus full of VR equipment, on the market square in Enschede. Van ‘t Klooster: “We don’t just want to use students as test subjects; we also want to see what other target groups think. We often use our mobile lab for that.”

According to Van ‘t Klooster, the results mainly showed that the presence of a drone raises a lot of questions: “In both settings, people chiefly wondered what the drone was doing, and trust was not immediately very high. At a festival, people are more inclined to accept the drone, perhaps because drones are already being used there as cameras.”

Not perfect

The question of whether we have too much faith in technology is not easy to answer, but with all these experiments the researchers at the BMS lab hope to prepare us for the future. As long as the role of technology in our lives keeps growing, we had better learn how to deal with it, says Kox. “I don’t think we have much to fear at this point, but we have to realize that technology isn’t perfect. Every technology has its limits, even the most advanced drones, so we will always have to keep paying attention ourselves. And it is our job to find out which communication strategies we can best use to make that clear, so that people know that they also have to rely on themselves.”


Source: NEMO Kennislink (www.nemokennislink.nl).
