
AI still not ready to drive your car safely

Hallucination problem getting in the way of fully autonomous vehicles
Image: Shutterstock via Dennis

1 March 2024

Artificial intelligence (AI) is already used in many aspects of the auto industry, from production to design to marketing. Now it's getting closer to drivers' daily commutes, but don't get excited just yet.

At CES 2024, the annual consumer electronics show held in Las Vegas, BMW and Volkswagen both announced new AI-enabled chat assistants for their upcoming vehicles. The carmakers say the voice assistants will be able to control a limited number of features in the vehicle, like navigation and climate. That's now. The future looks much different.

There will come a time when an AI assistant controls everything, including how fast the vehicle is going, but according to experts, that time is still a ways off.

Today's assistants use large language models like those behind OpenAI's ChatGPT and Amazon's Alexa. These aren't yet all-seeing, all-connected technology 'beings'. They are interactive technologies developed specifically for use in an automobile, with all expected safety systems in place, both for the driver and for their privacy.

“The concept presented at CES is based on the so-called ‘retrieval augmented generation’ approach of natural language processing. This combines query-based methods with generative methods. Based on voice input information, a retrieval model is used to retrieve relevant information from a large pool of information consisting of the vehicle’s operating instructions and other product information such as marketing literature or press releases,” a BMW spokesperson told Newsweek.

“The generative model is then used to formulate an answer based on the retrieved information. The retrieval model therefore functions as a search engine that selects the most relevant information. This retrieved information is then used as input to the generative model, which generates a coherent and contextually relevant response,” they said.

This hybrid approach reduces the risk of hallucinations and offers a good balance between factual accuracy and creative, human-like dialogues.
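The retrieval-augmented generation approach the spokesperson describes can be sketched in a few lines. This is a toy illustration only, not BMW's implementation: the document pool, the word-overlap retriever, and the templated "generator" are all stand-ins for the real manual corpus, search engine, and language model.

```python
# Minimal sketch of retrieval-augmented generation (RAG): a retrieval
# step selects the most relevant passages from a fixed pool of product
# documentation, and those passages ground the generative step so the
# answer stays tied to factual source material.

# Hypothetical owner's-manual snippets standing in for the real corpus.
DOCUMENT_POOL = [
    "To activate cruise control, press the SET button on the steering wheel.",
    "The climate system can be adjusted by voice or the center console dial.",
    "Navigation destinations can be entered by voice command.",
]

def retrieve(query: str, pool: list[str], k: int = 1) -> list[str]:
    """Toy retriever: rank passages by word overlap with the query."""
    q_words = set(query.lower().split())
    scored = sorted(
        pool,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def generate(query: str, context: list[str]) -> str:
    """Stand-in for an LLM call: answer strictly from retrieved context."""
    return f"Based on the manual: {' '.join(context)}"

def answer(query: str) -> str:
    # Grounding the generator in retrieved text is what reduces the
    # risk of hallucinated answers.
    return generate(query, retrieve(query, DOCUMENT_POOL))

print(answer("How can the climate system be adjusted?"))
```

In a production system the word-overlap scorer would be replaced by a vector-similarity search over embedded documents, and the template by a real language model, but the division of labor, retrieve first, then generate from what was retrieved, is the same.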

"Hallucination" is the term for when an AI language model produces incorrect or misleading output. These errors can be caused by a variety of factors, including insufficient training data, incorrect assumptions made by the model, or biases in the data used to train it.

History of collaboration

Amazon and BMW have a history of working together, starting in 2019, when BMW first put Alexa in its vehicles. The companies have worked to include sounds and commands in BMW's models that are already familiar to Alexa users. Today, those commands control smart-home features, but in the near future they will also control vehicle and service features. They won't yet be able to control your accelerator.

"It's unlikely that a driver would ever prefer to use their voice to issue a command such as to increase the speed of the vehicle when it would be so much easier and faster to do so with a single touch of a button or tap of the accelerator," Kelly Funkhouser, associate director of vehicle technology at Consumer Reports, told Newsweek.

“Think of how annoying it would be to stop jamming to your favorite song to tell your car to increase the speed, wait for the system to process what you said, confirm what you said, and execute the action? Instead, you could have hit a button on your steering wheel, perhaps even to the beat of your song, to bump the speed up then be on your way in a fraction of the time,” she said.

Funkhouser also noted that some of the language models at CES gave worrisome answers when tested: more than once, a system misinterpreted a request and gave an unexpected answer.

“This technology is certainly not ready to be used and trusted at a level that would make me feel safe or confident in controlling driving and safety aspects of the vehicle.”

Funkhouser explained that, as of right now, consumers don't love using voice controls while driving. A big reason has been that drivers needed to learn set phrases and speech recognition was far from perfect. But over the last several years, most systems have begun using natural language processing and have greatly improved.

“Yet there still isn’t huge satisfaction and reported use. That’s because drivers don’t want to use them for other reasons, such as pausing the radio example or it’s just easier to quickly push a button,” said Funkhouser.

“I don’t foresee consumers shopping for their next vehicle based on the AI or voice assistant. It’s more likely they will continue to look for the other aspects of safety, utility, driving dynamics, and comfort while the voice assistant will just be a feature that exists in the car,” she said.

News Wires
