Researchers at Stanford University in California used a type of machine-learning algorithm known as a neural network, an artificially intelligent computing system loosely modeled on how human brains work.
The network incorporates data from recent maneuvers and previous driving experiences. To train the autonomous car not to spin out of control, the team fed it more than 200,000 motion samples recorded during test drives on an icy track near the Arctic Circle. From that record of past motion, the network learns to predict how the car will move on various road surfaces, and the control system adjusts the steering accordingly.
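To make the idea concrete, the sketch below shows, in Python with PyTorch, what a model of this kind might look like: a small feed-forward network trained to map recent vehicle states and control inputs to the motion that followed. The layer sizes, input features, and synthetic data are assumptions chosen for illustration only; they are not the researchers' actual architecture or dataset.

```python
# Illustrative sketch only: a small feed-forward network that, in the spirit of
# the Stanford model, maps a recent vehicle state and control input to a
# predicted motion (here, yaw rate and lateral velocity). All dimensions and
# data below are assumptions for demonstration, not the paper's setup.
import torch
import torch.nn as nn

class MotionPredictor(nn.Module):
    def __init__(self, state_dim=4, control_dim=2, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + control_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 2),  # predicted yaw rate and lateral velocity
        )

    def forward(self, state, control):
        return self.net(torch.cat([state, control], dim=-1))

# Train on (state, control) -> observed-motion pairs, standing in for the
# roughly 200,000 motion samples collected on the test track.
model = MotionPredictor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

states = torch.randn(1024, 4)    # e.g. speed, yaw rate, lateral velocity, steering angle
controls = torch.randn(1024, 2)  # e.g. steering command, throttle/brake
observed = torch.randn(1024, 2)  # motion actually measured after each sample

for epoch in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(states, controls), observed)
    loss.backward()
    optimizer.step()
```

Once trained, a predictor like this can be queried by the car's controller to anticipate how a planned steering or throttle command will play out on the current surface.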
Then, the team equipped a Volkswagen GTI with the algorithm and tested it on an oval-shaped racetrack. Drawing on what it had learned from earlier runs, the car adjusted its steering and acceleration to take each turn successfully while driving as fast as possible.
The researchers explained that driverless cars can operate safely only if their control systems can brake, accelerate, or steer quickly in crucial situations. That allows them to drive safely at the limits of friction, the point just before the tires lose their grip on the road and the car spins out.
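As a rough, back-of-the-envelope illustration of what that friction limit means, the short Python sketch below uses the standard point-mass relation v_max = sqrt(mu * g * r): the lower the friction coefficient mu of the surface, the lower the speed at which a curve of radius r can be taken before the tires begin to slide. The friction values and curve radius are assumed numbers for illustration, not figures from the study.

```python
# Back-of-the-envelope illustration of the friction limit described above.
# With friction coefficient mu, the tires can supply at most mu * m * g of
# lateral force, so on a curve of radius r the highest speed before sliding
# is v_max = sqrt(mu * g * r). The numbers below are assumed, not measured.
import math

g = 9.81  # gravitational acceleration, m/s^2

def max_cornering_speed(mu, radius_m):
    """Top speed (m/s) through a curve before exceeding the friction limit."""
    return math.sqrt(mu * g * radius_m)

radius = 50.0  # assumed 50 m curve
for surface, mu in [("dry asphalt", 0.9), ("ice", 0.15)]:
    v = max_cornering_speed(mu, radius)
    print(f"{surface}: about {v * 3.6:.0f} km/h")
```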
"Our work is motivated by safety, and we want autonomous vehicles to work in many scenarios, from normal driving on high-friction asphalt to fast, low-friction driving in ice and snow," said Nathan Spielberg, a graduate student in mechanical engineering at Stanford and lead author of the paper.
The researchers believed that their system could be useful during emergency situations, where sudden swerves are needed. While their results were encouraging, the team stressed that their neural network system cannot perform well in conditions outside the ones it has experienced. They explained that as self-driving cars gather more data to train their networks, the cars should be able to handle a wider range of conditions.
The idea of driverless vehicles is becoming more acceptable to Americans. However, experts warn that people should not be completely comfortable with them just yet.
Experts say that autonomous vehicles are still under continual development, with each new generation gaining capabilities. Even so, the technology has a long way to go, a fact that not every driver keeps in mind.
Adding to the concern, passengers may trust a self-driving car to make sophisticated decisions it cannot actually handle, which can lead to accidents or close calls. Overestimating the technology's capabilities lulls drivers into a false sense of security and can encourage them to relax behind the wheel, eating or even sleeping while the car drives.
“People are already inclined to be distracted. We’re on our phones, eating burgers, driving with our knees,” Nidhi Kalra, a senior information scientist at the Rand Corporation, told The Guardian.
Experts reiterate that people should be careful not to put too much trust in autonomous vehicles.