Today, cars are designed around the lifestyles of digitally savvy drivers. Dashboards and steering wheels house buttons for everything from the radio and lights to phone activation and cruise control. However, the growing availability of technologies such as touchscreens and phone syncing could reduce the number of knobs and buttons in the driving area.
And if the widespread use of self-driving vehicles becomes a reality, will we need any controls at all in the car interior? Car technologies such as 3D gesture recognition could do away with the need for almost all buttons in a car cockpit. A movement of the arm could start the radio or even the car itself, while Augmented Reality (AR) apps could display information and imagery on otherwise clean surfaces.
Voice command removes the need to touch any screen or button, further aiding driver safety. Without having to lean across the dashboard or glance at the wheel to reach controls, the driver can keep their eyes on the road. We can already see this car technology in use with Android Auto and Apple CarPlay, which sync our smart devices with the vehicle’s onboard computer system. For example, virtually everything in Android Auto can be accessed through spoken commands: apart from an initial tap on the console, there’s no further need to touch the screen. In addition, the Hyundai i40 comes with a dedicated voice command button on the steering wheel.
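To picture how a spoken command turns into an in-car action, here is a minimal sketch of a keyword-based intent matcher. It is an illustration only: the intent names and trigger phrases are assumptions, not part of the real Android Auto or CarPlay software, and real systems use far more sophisticated speech understanding.

```python
# Toy intent matcher: route a transcribed voice command to a car function.
# All intent names and phrases below are illustrative assumptions.

INTENTS = {
    "navigate": ["navigate to", "take me to", "directions to"],
    "call": ["call", "phone", "dial"],
    "play_music": ["play", "put on"],
}

def match_intent(utterance: str) -> str:
    """Return the first intent whose trigger phrase starts the utterance."""
    text = utterance.lower().strip()
    for intent, phrases in INTENTS.items():
        if any(text.startswith(phrase) for phrase in phrases):
            return intent
    return "unknown"
```

So a command like "Navigate to the office" would be routed to the navigation function, while anything unrecognised falls through to "unknown" and could prompt the driver to rephrase.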
Hyundai Motor is also the first large-scale car maker to connect vehicles with homes using Amazon Echo and its new Blue Link® skill for Amazon Alexa. Hyundai owners in the United States can lock, unlock and remote start their cars without even needing to be sitting in them.
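A custom Alexa skill of this kind is typically backed by a cloud function that receives Alexa's JSON request and replies with spoken text. The sketch below shows that general shape only, under stated assumptions: the intent names, the `send_vehicle_command` helper, and its behaviour are all hypothetical, since the actual Blue Link skill's implementation is not public.

```python
# Illustrative AWS Lambda-style handler for a custom Alexa skill.
# Intent names and send_vehicle_command are assumptions for the sketch.

def send_vehicle_command(command: str) -> bool:
    """Stand-in for a call to a connected-car service; hypothetical."""
    return command in {"lock", "unlock", "remote_start"}

def lambda_handler(event, context):
    intent = event["request"]["intent"]["name"]
    commands = {
        "LockCarIntent": "lock",
        "UnlockCarIntent": "unlock",
        "RemoteStartIntent": "remote_start",
    }
    ok = send_vehicle_command(commands.get(intent, ""))
    speech = "Done." if ok else "Sorry, I couldn't do that."
    # Standard Alexa skill response envelope with plain-text speech.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```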
Head-Up Displays, or HUDs, use projectors and computer-generated imagery to cast information such as speed and navigation data onto a car’s windscreen. In a first for Hyundai, the new KONA offers this latest car technology, projecting an eight-inch image into the driver’s sight line so they don’t have to take their focus away from the road.
Building on that idea, Augmented Reality has the power to make the real world interactive. Using computer-generated content such as video, audio, graphics, and GPS data, information can be overlaid onto physical objects and viewed through hand-held devices. For example, when a smart device is pointed at an area of a car, an AR app such as the Hyundai Virtual Guide uses the built-in camera to recognise and display the vehicle’s different components, whether under the hood or inside the car itself.
Hyundai has already showcased 3D-gesture control car technology with its Connectivity Cockpit Concept. The system uses infrared and camera sensors to recognise the driver’s hand commands, allowing them to select functions such as navigation, infotainment, audio, temperature control and smartphone features. Basic commands are available while driving, but more become available when the car is at a standstill.
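The idea of a command set that widens when the car stops can be sketched in a few lines. This is an illustration only: the gesture names and the split between basic and standstill commands are assumptions, not details of Hyundai's concept system.

```python
# Toy dispatcher: gate the available gesture commands on vehicle speed.
# Gesture names and command sets are illustrative assumptions.

BASIC_COMMANDS = {
    "swipe_left": "previous_track",
    "swipe_right": "next_track",
    "palm_push": "mute_audio",
}

# At a standstill, the full set becomes available.
STANDSTILL_COMMANDS = {
    **BASIC_COMMANDS,
    "circle": "open_navigation",
    "point": "select_smartphone_feature",
}

def dispatch_gesture(gesture: str, speed_kmh: float):
    """Map a detected gesture to a car function, restricting the set while moving."""
    commands = BASIC_COMMANDS if speed_kmh > 0 else STANDSTILL_COMMANDS
    return commands.get(gesture)
```

Gating on speed like this keeps the richer, more attention-hungry interactions out of reach while the vehicle is moving.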
The cockpit concept is also paired with wearable smart devices such as a smartwatch integrated with the in-car system to monitor the driver’s heart-rate and alertness, recommend rest periods, or relay navigation details such as upcoming blind spots. The cockpit is designed to offer a vision of a future with even closer human/vehicle interaction.
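How wearable data might feed a rest recommendation can be sketched as a simple rule check. The thresholds and inputs below are invented for illustration and bear no relation to the logic in Hyundai's concept.

```python
# Toy fatigue check driven by smartwatch-style inputs.
# Thresholds and field names are illustrative assumptions.

def rest_recommendation(heart_rate_bpm: int, hours_driving: float) -> str:
    """Suggest a break when heart rate or time at the wheel suggests fatigue."""
    if heart_rate_bpm < 55 and hours_driving > 1.5:
        return "Low heart rate after a long stint: consider a rest stop."
    if hours_driving >= 2.0:
        return "You've been driving a while: a short break is recommended."
    return "No rest needed yet."
```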
Touchscreen technology is now common in modern car models. Features such as the radio, apps, and navigation, or data such as weather information, can be accessed with the press of a fingertip. For example, the New Generation Hyundai i30 boasts a five-inch or optional eight-inch LCD touchscreen which also integrates a rear-view camera for safety while reversing. This navigation system can relay real-time travel and weather information, via a seven-year subscription to LIVE services, and can also display 3D maps.