
A Mind-Controlled Car

A research team in Berlin has come up with cars that can be controlled by the mind. This breakthrough in technology could empower physically challenged people to drive their own cars.

From manual cars to automatic cars to driverless cars, automobile technology has come a long way. Now another milestone has been added: a car controlled by our thoughts, thanks to a research team at the Free University of Berlin led by Prof Raul Rojas. The mind-controlled car, named Brain Driver, uses a computer or microprocessor to control the different sections of the car, such as the brakes, accelerator and steering wheel, through sophisticated electronics. The driver of the car therefore interacts with this computer through his or her mind.

Brain Driver achieves this human-computer interface through a high-level EEG (electroencephalogram) device. The EEG is a recording of the electrical activity of the brain, similar to the ECG for the heart. When a subject thinks something or experiences an emotion, groups of neurons fire in the brain, giving rise to small but detectable voltage traces on the surface of the scalp. An EEG device can pick up these signals through sensors placed on different parts of the head, corresponding to different areas of the brain such as the frontal cortex, temporal cortex, etc. The EEG device used in Brain Driver is the Epoc neuro-headset, designed by Emotiv Systems. The signals from the Epoc's sensors are sent wirelessly to a PC running dedicated software that extracts information from the sensor data. With this software, the computer can detect up to 12 different facial expressions and four different emotions of the driver in real time. The system also detects simple direct mental commands.
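
To make the pipeline concrete, here is a minimal sketch of how a system like this might match incoming EEG feature vectors against trained command templates. The feature values, the TEMPLATES table and the classify function are all hypothetical illustrations; Emotiv's actual signal processing and classification are far more sophisticated.

```python
import math

# Hypothetical per-command feature templates, one per mental command,
# learned during the training phase described below.
TEMPLATES = {
    "left":  [0.8, 0.1, 0.2, 0.1],
    "right": [0.1, 0.9, 0.1, 0.2],
    "up":    [0.2, 0.1, 0.9, 0.1],   # up = accelerate
    "down":  [0.1, 0.2, 0.1, 0.8],   # down = decelerate
}

def classify(features, threshold=0.5):
    """Return the command whose template is nearest to the incoming
    feature vector, or None if nothing is close enough (neutral)."""
    best_cmd, best_dist = None, float("inf")
    for cmd, template in TEMPLATES.items():
        dist = math.dist(features, template)
        if dist < best_dist:
            best_cmd, best_dist = cmd, dist
    return best_cmd if best_dist < threshold else None

# Example: a feature vector close to the "left" template.
print(classify([0.75, 0.15, 0.25, 0.10]))  # -> left
```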

Before the user or driver can use the Epoc to give commands to the car, he or she has to spend some time training the Epoc's cognitive suite to recognise those commands. The headset can register up to four concurrent mental actions within a few minutes, but most users need further practice before control becomes reliable; the total training time varies from a few hours to a few days. The key to training is the user's ability to repeatedly generate distinct patterns of neuronal firing in the brain, which the Epoc software can then recognise as corresponding to a particular thought or mental command. The Epoc system is unique in that it can also use the user's facial expressions as commands for the brain-computer interface.
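
One plausible reading of this training step is a simple template-averaging scheme, sketched below: the user repeats each mental action several times, and the software averages the resulting feature vectors into a per-command template. The function and data are hypothetical assumptions, not Emotiv's actual method.

```python
def train_command(trials):
    """Average repeated feature vectors for one mental action into a
    single template (centroid) that the classifier can match against."""
    n = len(trials)
    return [sum(column) / n for column in zip(*trials)]

# Hypothetical recordings of the user thinking "left" three times.
left_trials = [
    [0.78, 0.12, 0.22, 0.11],
    [0.82, 0.09, 0.18, 0.10],
    [0.80, 0.10, 0.20, 0.09],
]
print(train_command(left_trials))  # roughly [0.80, 0.10, 0.20, 0.10]
```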

Using the present brain-computer interface, it is possible to distinguish four directions on the computer screen: left, right, up and down. Up indicates accelerating the car and down indicates decelerating it. Whenever the driver wants the car to go left, the mental command goes to the car's microprocessor through the Epoc headset interface, and the microprocessor commands the steering wheel to advance a small angle to the left. Using such small increments of steering or acceleration, it is possible to turn or accelerate the car with mental commands.
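
The incremental scheme could look something like the sketch below, where each recognised command nudges the steering angle or speed by one small step. The step sizes and the CarState class are illustrative assumptions; the article does not give the actual BrainDriver control parameters.

```python
STEER_STEP_DEG = 2.0   # assumed steering increment per "left"/"right"
SPEED_STEP_KMH = 1.0   # assumed speed increment per "up"/"down"

class CarState:
    def __init__(self):
        self.steering_deg = 0.0   # negative = left, positive = right
        self.speed_kmh = 0.0

    def apply(self, command):
        # Each mental command moves the actuators by one small step.
        if command == "left":
            self.steering_deg -= STEER_STEP_DEG
        elif command == "right":
            self.steering_deg += STEER_STEP_DEG
        elif command == "up":      # accelerate
            self.speed_kmh += SPEED_STEP_KMH
        elif command == "down":    # decelerate, never below standstill
            self.speed_kmh = max(0.0, self.speed_kmh - SPEED_STEP_KMH)

car = CarState()
for cmd in ["up", "up", "left", "left", "left"]:
    car.apply(cmd)
print(car.steering_deg, car.speed_kmh)  # -6.0 2.0
```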

Autonomous car combined with brain-computer interface

One problem with brain-computer interfaces is that the driver has to concentrate intensely to produce the distinct brain patterns the interface has been trained to translate into sustained physical motion of the car. Moreover, many users will never manage to produce the required brain patterns reliably, even after sustained effort.

Prof Raul Rojas told TrafficInfraTech, “Working with a computer-brain interface is very intense and tiring; a 10-minute session with the interface is like working for an hour otherwise.” To avoid these problems and excessive brain fatigue, the research team has modelled the mind-controlled car as an autonomous car that requires input via the brain-computer interface only at critical decision points, such as road corners. The car drives by itself for most of its journey, and only at the corners does the driver tell it to turn left or right using the brain-computer interface. Thus, the driver needs to concentrate for only a few seconds at a time. This combination of autonomous driving and a brain-computer interface such as the Epoc neuro-headset could make brain-controlled cars a reality in the near future, at least until brain-computer interfaces become sophisticated enough to control cars exclusively, without the above problems.
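
The division of labour the team describes can be illustrated with a small sketch: the car drives autonomously between intersections and queries the brain-computer interface only at marked decision points. The route representation and the ask_bci stub are hypothetical, purely to show the control flow.

```python
def ask_bci(prompt):
    """Stand-in for a few seconds of focused BCI input at a corner."""
    print(f"[BCI] {prompt}")
    return "left"   # pretend the driver thought "left"

# A toy route: autonomous segments punctuated by decision points.
route = ["straight", "DECISION", "straight", "straight", "DECISION"]

for segment in route:
    if segment == "DECISION":
        turn = ask_bci("Intersection ahead: think left or right")
        print(f"Turning {turn} as commanded")
    else:
        print("Autonomous driving: lane keeping, obstacle avoidance")
```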

To facilitate this combination of the brain-computer interface with an autonomous car, Prof Rojas and his team are developing an autonomous car named Made in Germany.

Made in Germany is a commercial car that has been modified for autonomous use. It has an additional motor for moving the steering wheel automatically, and a channel that communicates commands from the computer to the different sections of the car: the brakes, accelerator, steering wheel, etc. The computer is plugged into this channel, and sensors perceive the environment around the car for threats, obstacles and pedestrians. The central part of the car carries three video cameras and seven laser scanners, including one rotating scanner on the roof. Together these give a three-dimensional view of everything happening within a 60-metre radius around the car, much more than a human driver can perceive. The car knows its location on the street because it carries a street map and an onboard GPS navigation system; it knows whether there are objects in front of and behind it, and whether the traffic light is green or red. A computer program constantly monitors the sensors, computes the best trajectory for the car and issues the commands for the safest possible route.
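
A highly simplified sketch of that monitor-and-plan loop follows. The 60-metre perception radius comes from the article; the distance thresholds and the plan_step function are illustrative assumptions, not the team's actual planner.

```python
PERCEPTION_RADIUS_M = 60.0   # sensing range reported in the article

def plan_step(obstacles, light_is_green):
    """Pick a driving action from fused sensor observations.
    obstacles: list of (distance_m, bearing) pairs from lasers/cameras."""
    nearest = min((d for d, _ in obstacles), default=PERCEPTION_RADIUS_M)
    if not light_is_green or nearest < 5.0:
        return "brake"
    if nearest < 15.0:
        return "slow_down"
    return "follow_planned_trajectory"

# Example: a pedestrian detected 12 m ahead while the light is green.
print(plan_step([(12.0, "ahead")], light_is_green=True))  # -> slow_down
```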

“The car (Made in Germany) has been extensively tested at a closed airport in Berlin over many kilometres of test runs. We generated artificial traffic there by creating traffic congestion and other problems at street corners, which the autonomous car has to negotiate in real-life situations. We also threw objects in the car’s way to see whether it stops correctly, and at the correct time, on perceiving obstacles. We did not have a single collision in the numerous tests we have done,” Rojas told the magazine.

According to Prof Rojas, the technology could be ready for commercial use within the next ten years in restricted environments such as airports, for delivering meals from the kitchens to the planes or for moving people from one location to another.

The obstacle

Some situations will still pose difficulties for the autonomous car. When two cars are waiting at a traffic intersection and the light turns green, human drivers make eye contact and, from each other’s expressions, arrive at an understanding of who will go first. It is very difficult for an autonomous car to interpret a driver’s expression and take such a decision. This is one of the main problems the research team faces in making humans and autonomous cars work in the same environment, and it may take quite some time to solve.

Prof Rojas and his team have been trying out different ways of controlling Made in Germany. One such application is the i-driver, which uses a mobile device such as an iPhone or iPad to control the autonomous car. The device lets the user see all the parameters of the car: when the car is in the garage or at home, the user can turn on the iPhone or iPad and view all the sensors, the information the car receives from them, and the car’s location. In future, users may be able to hail an autonomous taxi or their own autonomous car from such a device. They would not have to enter their location, because the phone would already know where they are, and they would be able to watch on the screen where the car coming to pick them up is driving on the street.
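
On the client side, this kind of monitoring could be as simple as periodically polling the car’s telemetry, as in the sketch below. The field names and the get_status stub are hypothetical; the article does not describe the actual i-driver protocol.

```python
def get_status():
    """Stand-in for a network request to the car's telemetry channel."""
    return {
        "position": (52.4563, 13.2957),  # hypothetical lat/lon in Berlin
        "speed_kmh": 0.0,
        "sensors": {"front_laser_m": 18.4, "rear_laser_m": 32.1},
        "parked": True,
    }

status = get_status()
print(f"Car at {status['position']}, speed {status['speed_kmh']} km/h")
for name, value in status["sensors"].items():
    print(f"  {name}: {value}")
```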
