While fully autonomous vehicles might not be here for a while, new driver assistance systems are so advanced you’ll feel like you’re in a self-driving car.
What’s coming down the pipeline makes Tesla’s semi-autonomous Autopilot system look positively polite and passive. Instead of quiet beeps urging you to put your hands back on the wheel or a light flashing when something’s in your blind spot, these features hand control over to the vehicle itself.
Chipmaker Nvidia has dubbed these features “Level 2+.” Others call them driver-vehicle interactions or human-machine interactions (HMI). Whatever they’re called, they are coming to our cars — and fast.
Self-driving features are already in a lot of newer cars. Heck, cruise control is considered part of Level 1 autonomy and has been around for years. Look at cars from BMW: heads-up displays, parking assistance, active cruise control, and more. This isn’t the future; it’s already here.
But the next level of autonomy is essentially Autopilot on steroids. Daimler showed this off at CES with new trucks that can auto-brake, self-steer, turn on the windshield wipers, and gradually bring the vehicle to a complete stop: basically, drive without you. But you can’t check out entirely. The system will warn you if your hands are off the wheel for too long.
Those features will be available on its trucks starting in July.
Nvidia says its advanced system, Drive AutoPilot, will be in cars starting in 2020. Another chipmaker, Qualcomm, unveiled its third-generation Snapdragon automotive cockpit platform at CES.
Nakul Duggal, Qualcomm’s SVP of product management, said in a phone call last month that components from the cockpit should be in cars by the middle of 2021. This includes new abilities like face detection, drowsiness detection, and heads-up display info.
The car isn’t supposed to be driving itself, he explained, but sharing enough information to make the driver safer and better at driving.
The best example of this shared interaction was at a CES demo with Swedish mobility company Veoneer. On a course full of challenges like construction crews, distracting signs, errant shopping carts, and dark tunnels, we saw how the machine can take over in an instant.
The car can steer, brake, accelerate, and navigate itself in many situations, and Veoneer plans to bring this co-driving ability to cars by 2020.
The scenarios Veoneer is researching could easily come to life with Nuance’s emotional AI detector. Built on technology from the MIT startup Affectiva, which Nuance acquired last year, the system evaluates your facial expressions and eye movements, then responds accordingly.
If you repeatedly trigger the drowsiness detector, the car increasingly worries and frets over you, suggesting you pull over. It’ll even find you nearby gas stations and rest stops. Eventually the car could take over and make the executive decision that you can’t keep driving.
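That escalating response can be sketched as a simple mapping from trigger count to action. This is a hypothetical illustration of the behavior described above, not Nuance’s or Affectiva’s actual logic; the function name and thresholds are invented for the example.

```python
# Hypothetical sketch: the car ratchets up its response the more often
# the drowsiness detector fires. Thresholds and action names are
# illustrative, not a real vendor API.

def drowsiness_response(trigger_count: int) -> str:
    """Map the number of recent drowsiness triggers to a car action."""
    if trigger_count == 0:
        return "none"
    if trigger_count == 1:
        return "gentle_alert"        # chime or dashboard message
    if trigger_count == 2:
        return "suggest_rest_stop"   # offer nearby gas stations and rest stops
    return "initiate_safe_stop"      # the car makes the executive decision
```

The point is that the car doesn’t jump straight to taking over; it nudges first, then insists.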
In a similar vein, B-Secur’s HeartKey EKG-measuring steering wheel opens the potential to alert the car when your heart rate is dangerously high or low. It also measures your stress levels and can warn you when you’re getting drowsy. The technology is still in development, but B-Secur hopes to have it in car steering systems in the next few years. Eventually it could be incorporated into a semi-autonomous system that pulls you over, or even calls medical services for you.
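The kind of check such a biometric steering wheel might feed into a safety system looks something like the sketch below. The thresholds and function names are invented for illustration; B-Secur hasn’t published its actual alert logic.

```python
# Hypothetical illustration of biometric alerting from an EKG steering
# wheel. The bpm and stress thresholds here are made up for the example.

DANGER_LOW_BPM = 40
DANGER_HIGH_BPM = 140

def assess_driver(heart_rate_bpm: float, stress_level: float) -> str:
    """Return an alert level from raw readings (stress_level is 0.0-1.0)."""
    if heart_rate_bpm < DANGER_LOW_BPM or heart_rate_bpm > DANGER_HIGH_BPM:
        return "medical_alert"   # could trigger a safe pull-over or a call for help
    if stress_level > 0.8:
        return "stress_warning"  # suggest the driver take a break
    return "ok"
```

A semi-autonomous system would then decide what to do with the alert, from a dashboard nudge to pulling the car over.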
Other parts of driving lend themselves well to the advanced self-driving features coming to cars. Parking or “valet” maneuvers are the perfect place to let the car do the work.
Continental — the German car parts maker — has various systems to give the car more responsibility. A new feature is its self-parking and obstacle detection system. The car isn’t just pulling into a parking spot but making complex turns and avoiding objects in its path. All while you just sit there. Yes, it’s driving at extremely slow speeds, but it’s doing it on its own and safely.
You may be in the driver’s seat, but soon you’ll barely be driving