
Autonomous Driving Technology for Connected Cars

Cars that Connect with People

Integrated HMI Technology Designed to Achieve Harmony with People

    Highlight

    Clarion Co., Ltd. aims to overcome the challenges that driving automation and the connected car pose for automotive HMIs by supplying new in-vehicle information systems that provide seamless access to information. This article describes key technologies for achieving this goal, namely InfoSeat (a seat-based HMI), workload estimation, cloud-based voice recognition, and gesture control.

    Author introduction

    Masashi Koga, Ph.D.

    • Software Development Department, R&D Division, Clarion Co., Ltd.
    • Current work and research : Development of automotive human–machine interfaces (HMI).
    • Society memberships : IEEE, the Information Processing Society of Japan (IPSJ), and the Institute for Electronics, Information and Communication Engineers (IEICE).

    Mitsuharu Shibazaki

    • System Process Strategy Department, R&D Strategy Division, Clarion Co., Ltd.
    • Current work and research : Research and development strategy planning for in-vehicle device development.

    Noriyuki Abe

    • Software Development Department, R&D Division, Clarion Co., Ltd.
    • Current work and research : Advanced engineering of automotive virtual assistants.
    • Society memberships : Society of Automotive Engineers of Japan, Inc. (JSAE).

    Nobuyasu Kunii

    • Clarion Co., Ltd.
    • Current work and research : Business strategy planning and R&D strategy planning for in-vehicle device development as executive officer, CTO, of the R&D Strategy Division.

    Introduction

    Recent years have seen progress being made on vehicle connectivity and driving automation. Advances in high-speed, high-capacity communications mean that always-on access to a variety of cloud-based services is now available from inside cars. This is bringing significant changes to how people spend their time in their cars and to the relationship between vehicles and drivers. At the same time, highly autonomous vehicles are emerging that not only control speed and steering but also understand their environment and decide their own maneuvers.

    This article describes Clarion Co., Ltd.'s activities to address these changes. It first gives an overview of the seamless mobility that the connected car and autonomous driving will make possible in the future, and then explains the challenges these developments pose for human-machine interfaces (HMIs). Finally, it describes the technologies being developed to overcome these challenges, namely InfoSeat (a seat-based HMI), workload estimation, cloud-based voice recognition, and gesture control.

    Future of Seamless Mobility

    People usually gather information before going on a drive. Detailed information about the destination and the roads to get there can be found via the cloud using a personal computer or smartphone. This section describes a future in which a seamless environment is available for making use of the information so obtained while driving.

    In this scenario, the driver saves his or her destination and other plans for the drive on the cloud instead of taking notes, as people typically do. At the beginning of the trip, the vehicle identifies the driver and automatically downloads the stored information from the cloud. The vehicle then provides directions on the best route, taking account of real-time information on things like congestion, nearby events, and the weather. It also switches to autonomous driving on roads where it is permitted. Based on the route specified on the cloud, a smartphone app starts automatically upon arriving in the parking lot and provides walking directions to the final destination. In this way, the user is provided with end-to-end navigation (see Figure 1).

    Figure 1 - Seamless Travel Experience Using Information from Cloud The driver uses a tablet or other device in advance to indicate what they want to do; the trip plan then appears on-screen when they get into the car, which starts providing directions. After parking somewhere close to the destination, a smartphone app provides directions for the remainder of the journey on foot.
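
    To illustrate the handoff in this scenario, the following is a minimal sketch of how a trip plan stored on the cloud might be represented and picked up first by the vehicle and then by a smartphone app. The record fields, the TripPlanStore class, and the driver-identification step are hypothetical illustrations, not part of Clarion's published design.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TripPlan:
    """Hypothetical trip plan saved to the cloud before the journey."""
    driver_id: str
    destination: str            # final destination (e.g. a venue)
    parking_lot: Optional[str]  # where the driving portion of the trip ends
    notes: list = field(default_factory=list)

class TripPlanStore:
    """Toy in-memory stand-in for a cloud service shared by car and phone."""
    def __init__(self):
        self._plans = {}

    def save(self, plan: TripPlan):
        self._plans[plan.driver_id] = plan

    def fetch(self, driver_id: str) -> Optional[TripPlan]:
        # The vehicle calls this after identifying the driver;
        # the smartphone app calls it again after the car is parked.
        return self._plans.get(driver_id)

# Before the trip: the driver saves the plan from a tablet or PC.
store = TripPlanStore()
store.save(TripPlan("driver-001", destination="Concert hall",
                    parking_lot="Station-west car park"))

# In the car: identify the driver, download the plan, start guidance.
plan = store.fetch("driver-001")
print("Navigate to:", plan.parking_lot or plan.destination)

# On foot: the phone fetches the same plan and guides the last leg.
print("Walk to:", plan.destination)
```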

    Clarion will supply an integrated HMI designed to provide seamless operation and the Smart Access function for connectivity to the cloud (see Figure 2). The integrated HMI links the digital devices in the cabin, enabling a safe, stress-free, and convenient driving experience. Smart Access provides access to cloud services via the network. It will provide interactions tailored to each user's preferences and will draw on the latest artificial intelligence (AI) technologies, such as video and voice recognition, natural language communication, and behavior prediction, to make use of the vast amount of content available on the cloud. Clarion will draw on its accumulated know-how about in-vehicle information systems to supply solutions that cover everything from the cloud to the onboard devices.

    Figure 2 - New In-vehicle Information System Clarion intends to equip new in-vehicle information systems with displays, InfoSeat (a seat-based HMI), gesture control, voice HMI, and cloud-based voice recognition.

    Challenges for Integrated HMI

    Integrated HMI faces the following three challenges.

    (1) How to present large amounts of information in an easily intelligible form.
    Connectivity and autonomous driving are resulting in large volumes of information flowing from onboard sensors and networks that subject the driver to cognitive overload (see Figure 3).

    Figure 3 - Changes in Information Flow In comparison to conventional driving, where drivers determine what is happening by obtaining information from their surroundings, the amount of information obtained via the HMI of the in-vehicle information system has increased. Moreover, while automation has reduced the input from the driver, the vehicle has a greater need to monitor the driver.

    Drivers used to understand the environment directly with their visual, auditory, and tactile senses. Information from onboard sensors and networks, in contrast, is presented via an HMI (such as a display or audio interface). Adding autonomous driving functions into this mix further increases the complexity of information coming through the HMI, as shown for (B) in Figure 3. Given all the displays and audio interfaces already in use, any further increase in this information risks confusing or distracting the driver.

    Clarion aims to overcome this problem by using an integrated HMI that fully utilizes the visual, auditory, and tactile senses in a coordinated way.

    (2) Monitoring the driver
    While autonomous driving requires less input from the driver, the vehicle needs a way to determine what the driver is doing. For example, the driver's readiness needs to be verified before control is transferred back to them. Moreover, presenting large amounts of information puts safety at risk because of the greater workload it imposes on the driver. For these reasons, ways of determining things like driver readiness and workload are needed.

    Clarion is developing techniques for estimating the driver's cognitive workload and for using sensors to monitor drivers.

    (3) HMIs for safe and stress-free access to non-driving activities
    A lower driving workload frees up the time spent in vehicles for other uses, such as entertainment or catching up on the news. Moreover, as vehicles become more connected, demand is growing for seamless access, from the vehicle and elsewhere, to things like social networking services (SNSs), video, games, and e-mail. New HMIs are needed so that users can engage in these activities safely and without stress.

    Clarion is seeking to overcome these challenges by making enhancements to voice recognition and gesture control.

    Work on Integrated HMIs for In-vehicle Information Systems

    To overcome the challenges described above, Clarion is developing an integrated HMI that links the digital devices in vehicles. It has been said that utilizing our human senses and our innate spatial cognition makes it easier for people to understand what is happening around them(1). For this reason, the integrated HMI works through a combination of screens, audio, and vibration in a coordinated way that is natural, like the real world.

    Clarion built a prototype of the integrated HMI to prove the concepts behind it (see Figure 4). In addition to using displays to present a variety of information, the HMI uses InfoSeat to deliver information through sound and vibration. It is also equipped with gesture sensors and driver monitoring sensors (using cameras and electromagnetic waves) for interactive operation. The prototype was used to test a variety of warning and autonomous driving HMI use cases. It has also been exhibited at trade shows and other events, giving a large number of people a chance to try the integrated HMI, and the feedback gained there was incorporated into the HMI specifications.
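
    To make the idea of coordinated presentation concrete, here is a minimal sketch of how a single warning might fan out simultaneously to the display, headrest audio, and seat vibration so that all three modalities tell a consistent story. The present_warning function, the device classes, and the cue choices are illustrative assumptions and do not represent the prototype's actual software.

```python
# Minimal sketch: one warning event fans out to display, audio, and seat
# vibration.  All device classes here are hypothetical stand-ins.

class Display:
    def show(self, text, position):
        print(f"[display] {text} at {position}")

class HeadrestAudio:
    def play(self, message, direction):
        print(f"[audio] '{message}' panned to the {direction}")

class SeatVibration:
    def pulse(self, side, frequency_hz, intensity):
        print(f"[seat] {side} vibrator, {frequency_hz} Hz, intensity {intensity}")

def present_warning(hazard_side, display, audio, seat):
    """Present the same hazard through all three modalities at once."""
    display.show("Vehicle approaching", position=hazard_side)
    audio.play("Vehicle approaching from the " + hazard_side, direction=hazard_side)
    seat.pulse(side=hazard_side, frequency_hz=60, intensity=0.8)

present_warning("left", Display(), HeadrestAudio(), SeatVibration())
```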

    The following sections describe the development of technologies for building integrated HMIs.

    Figure 4 - Prototype for Testing Concepts behind Integrated HMI A prototype was fitted with displays, InfoSeat, gesture sensors, and sensors for driver monitoring to test autonomous driving and warning use cases.

    InfoSeat

    Numerous initiatives have been undertaken for presenting information through screens and audio(2). The following describes Clarion's approach to using InfoSeat.

    Figure 5 shows a schematic diagram of InfoSeat. Microphones and speakers are embedded in the headrest and vibrators in the seat. It generates vibration and audio based on signals from the vehicle and in-vehicle information systems, while speech recorded by the microphone is used for applications such as voice recognition or hands-free calling.

    Figure 5 - InfoSeat The system comprises vibrators mounted in the driver seat, a microphone and speakers mounted in the headrest, and a controller. The controller controls the InfoSeat based on information from the vehicle and in-vehicle information system.

    Sound produced from the headrest is easily noticed by the driver, even if music is playing. Being largely inaudible to passengers, however, it does not interfere with their listening to music. Vibration, too, is a reliable way of getting the driver's attention even if focused elsewhere. For the audio processing and vibration required by InfoSeat, Clarion has drawn on the sound technology it has built up through its involvement with audio systems.

    While seats fitted with eccentric motors to convey information are already on the market, changing the vibration of these motors is difficult because only their intensity can be controlled. InfoSeat, in contrast, uses a special type of vibrator that can generate a variety of vibrations by providing independent control over frequency and intensity.

    The National Highway Traffic Safety Administration (NHTSA) recommends shifting the location of the vibration to make it easier for the driver to notice(2). The problem was that doing this by conventional means required a number of eccentric motors to be built into the seat. What Clarion did instead was develop a technique that can shift the vibration location using only two vibration devices.
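
    The article does not detail how the two-vibrator technique works. One plausible way to illustrate it is amplitude panning, in which a common waveform is split between the two vibrators with complementary weights so that the perceived location shifts, while frequency and intensity remain independently controllable. The sketch below is an illustration under that assumption, not Clarion's actual implementation.

```python
import numpy as np

def vibration_pattern(frequency_hz, intensity, position, duration_s=0.5,
                      sample_rate=1000):
    """Illustrative drive signals for two seat vibrators.

    frequency_hz, intensity : controlled independently (unlike an
                              eccentric motor, where speed sets both).
    position : 0.0 = fully at vibrator A, 1.0 = fully at vibrator B;
               sweeping it over time shifts the apparent location.
    """
    t = np.arange(0, duration_s, 1.0 / sample_rate)
    waveform = intensity * np.sin(2 * np.pi * frequency_hz * t)
    # Complementary amplitude weights split the waveform between devices.
    drive_a = (1.0 - position) * waveform
    drive_b = position * waveform
    return drive_a, drive_b

# Sweep the vibration from the A side to the B side of the seat in five
# steps to draw the driver's attention in that direction.
for pos in np.linspace(0.0, 1.0, 5):
    a, b = vibration_pattern(frequency_hz=60, intensity=0.8, position=pos)
    print(f"position {pos:.2f}: peak A={a.max():.2f}, peak B={b.max():.2f}")
```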

    Research has also been published indicating that using vibration to convey information can help with situation assessment when switching back from autonomous to manual driving(3).

    Monitoring the Driver

    The integrated HMI requires two types of driver status: the driver's physical and psychological state(4), and statuses that depend on the surrounding environment. This section describes a technique for estimating the cognitive workload imposed on the driver, which belongs to the second category(5).

    Techniques have been developed for controlling display and input devices based on driving conditions. These have included attempts to estimate workload from vehicle signals, with high estimation accuracy having been reported. Clarion has extended this work by developing a technique that uses a model able to replicate a wide range of driving conditions, estimating workload under various conditions while also taking account of individual differences (see Figure 6).

    In testing, the technique estimated workload to within 20% of continuous subjective assessments based on National Aeronautics and Space Administration task load index (NASA-TLX) reference data.

    Figure 6 - Block Diagram of Workload Estimation The required tasks are selected and the workload calculated using information obtained from sources such as the car navigation system, the controller area network (CAN), and advanced driver assistance systems (ADASs).
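
    The article does not give the details of the model, but a simple way to illustrate workload estimation of this kind is a VACP-style weighted sum, in which each active task contributes demand on visual, auditory, cognitive, and psychomotor channels(5). The task catalogue, demand values, and per-driver scaling factor below are illustrative assumptions, not Clarion's actual model.

```python
# Illustrative VACP-style workload estimate: each active task places a
# demand on the visual, auditory, cognitive, and psychomotor channels,
# and the totals are combined into a single workload score.
TASK_DEMANDS = {
    #                 visual, auditory, cognitive, psychomotor (assumed values)
    "follow_navigation":  (3.0, 1.0, 2.0, 0.0),
    "merge_onto_highway": (5.0, 1.0, 4.6, 4.0),
    "read_hud_warning":   (3.0, 0.0, 1.2, 0.0),
    "hands_free_call":    (0.0, 4.2, 4.6, 0.0),
}

def estimate_workload(active_tasks, personal_scale=1.0):
    """Sum channel demands over active tasks, scaled per driver."""
    totals = [0.0, 0.0, 0.0, 0.0]
    for task in active_tasks:
        for i, demand in enumerate(TASK_DEMANDS[task]):
            totals[i] += demand
    return personal_scale * sum(totals)

# In practice the active tasks would be selected from car navigation,
# CAN, and ADAS signals; here they are simply listed by hand.
print(estimate_workload(["follow_navigation", "merge_onto_highway"],
                        personal_scale=1.1))
```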

    Implementation of New Ways of Operating Devices

    HMIs that use techniques such as voice recognition or gesture are a good way to allow devices to be operated while driving without compromising safety. The following sections describe cloud-based voice recognition, Quad View, and gesture control.

    Cloud-based Voice Recognition

    Voice recognition previously implemented on onboard devices was limited in recognition accuracy and in the number of phrases or words it could recognize because of the performance constraints of the devices. In recent years, however, cloud-based voice recognition services that are not subject to these constraints have become available. Unfortunately, many such cloud-based voice recognition services are not sufficiently accurate when used in vehicles, where levels of ambient noise are high. In response, Clarion has developed techniques for voice activity detection and background noise suppression that can deal with this noise(6) (see Figure 7). In performance testing, these techniques reduced recognition errors by 58%. The function has been installed in car navigation systems since 2013 under the name “Intelligent VOICE.” Following further enhancements, it was also installed in the latest NXV977D navigation system released in 2017 and in other products. This technology will become increasingly important as advances in autonomous driving give drivers more time to access cloud services from their vehicles.

    Figure 7 - Block Diagram of System for Connecting to Cloud-based Voice Recognition Service (1) Voice activity detection is handled by an onboard device. Compressed voice data is sent via the communication network to an intermediate server that performs noise suppression before forwarding it to a voice recognition service. (2) The photograph shows a system that uses this function (a 2017 NXV977D).
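
    As a rough illustration of the pipeline in Figure 7, the sketch below detects voice activity with a simple short-term energy threshold and applies spectral-subtraction-style noise suppression before the audio would be forwarded to a recognition service. The actual Clarion algorithms(6) are more sophisticated; the thresholds, frame size, and send_to_recognizer stub here are assumptions.

```python
import numpy as np

FRAME = 320  # 20 ms frames at 16 kHz (assumed)

def voice_frames(audio, threshold=0.02):
    """Very simple energy-based voice activity detection (onboard step)."""
    frames = [audio[i:i + FRAME] for i in range(0, len(audio) - FRAME, FRAME)]
    return [f for f in frames if np.sqrt(np.mean(f ** 2)) > threshold]

def suppress_noise(frame, noise_spectrum):
    """Spectral-subtraction-style suppression (intermediate-server step)."""
    spectrum = np.fft.rfft(frame)
    cleaned = np.maximum(np.abs(spectrum) - noise_spectrum, 0.0)
    return np.fft.irfft(cleaned * np.exp(1j * np.angle(spectrum)), n=len(frame))

def send_to_recognizer(frames):
    """Stub for forwarding cleaned speech to a cloud recognition service."""
    print(f"sending {len(frames)} frames for recognition")

# Synthetic example: a speech-like tone buried in cabin noise.
rng = np.random.default_rng(0)
noise = 0.01 * rng.standard_normal(16000)
speech = 0.1 * np.sin(2 * np.pi * 200 * np.arange(8000) / 16000)
audio = noise.copy()
audio[4000:12000] += speech

noise_spectrum = np.abs(np.fft.rfft(noise[:FRAME]))
active = voice_frames(audio)
cleaned = [suppress_noise(f, noise_spectrum) for f in active]
send_to_recognizer(cleaned)
```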

    Quad View

    Touch panels are a common way of using hand movement as a means of input. Clarion developed its new Quad View touch screen with the aim of providing a larger screen with a wider range of functions (see Figure 8). Quad View splits the screen into four regions and displays a different app in each. The regions can be resized by dragging the diamond-shaped icon in the middle of the screen, and the display updates to the new sizes automatically. As this makes it possible to use multiple apps on a single large screen rather than on a number of separate screens, it provides an HMI that can cope with the larger volumes of information that come with greater connectivity and autonomous driving. The function was first introduced on NXV977D car navigation systems in 2017.

    Figure 8 - Quad View (2017 NXV977D) Different apps can display on each of the four screen regions. The regions can be resized as needed by dragging the diamond-shaped icon in the middle of the screen. This provides an easy way to view four screens of information on the same display, as needed.
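
    To illustrate the kind of layout logic involved, the sketch below computes the four region rectangles from the position of the central split point that the user drags. The coordinate convention and the layout_quad_view function are illustrative assumptions, not the product's actual code.

```python
def layout_quad_view(width, height, split_x, split_y):
    """Return (x, y, w, h) rectangles for the four Quad View regions.

    (split_x, split_y) is where the draggable diamond icon sits;
    moving it resizes all four regions at once.
    """
    return {
        "top_left":     (0,       0,       split_x,         split_y),
        "top_right":    (split_x, 0,       width - split_x, split_y),
        "bottom_left":  (0,       split_y, split_x,         height - split_y),
        "bottom_right": (split_x, split_y, width - split_x, height - split_y),
    }

# Dragging the icon toward the lower right enlarges the top-left region
# (for example, giving a navigation map more space).
for name, rect in layout_quad_view(1280, 720, split_x=800, split_y=480).items():
    print(f"{name:12s} -> {rect}")
```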

    Gesture Control

    A downside of touch panels is that the driver needs to look at the location to be touched, which distracts them from paying attention to the road ahead. Accordingly, Clarion has been working on the development of a gesture HMI that does not require touch operation, instead using a camera(7) and a near-infrared proximity sensor(8) to detect hand movements. Used in combination with a head-up display (HUD) like that shown in Figure 9, this provides visual feedback together with a way to use gestures to choose from a menu, something that was difficult to accomplish in the past(8). The HMI has demonstrated an ability to improve the lane-keeping performance of the driver by significantly reducing the time spent looking away from the road, from an average of 2.36 seconds to just 0.16 seconds.

    Figure 9 - New Gesture Control A menu appears on the HUD when the user passes their hand over the sensor. The user selects menu items by moving their hand in the direction indicated.
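
    The article does not describe the recognition pipeline in detail, so the following is only a schematic of the interaction: a hand passing over the sensor opens a menu on the HUD, and the direction of the subsequent hand movement selects an item. The state machine, event names, and menu contents are illustrative assumptions.

```python
# Schematic of the gesture interaction described above: hand over the
# sensor opens a HUD menu, then a directional hand movement selects an
# item.  States, event names, and menu layout are assumptions.
MENU = {"up": "Music", "down": "Map", "left": "Phone", "right": "News"}

class GestureMenu:
    def __init__(self):
        self.state = "idle"

    def on_event(self, event):
        if self.state == "idle" and event == "hand_over_sensor":
            self.state = "menu_open"
            print("HUD: show menu", MENU)
        elif self.state == "menu_open" and event in MENU:
            self.state = "idle"
            print("HUD: selected", MENU[event])
        elif self.state == "menu_open" and event == "hand_withdrawn":
            self.state = "idle"
            print("HUD: hide menu")

ui = GestureMenu()
for event in ["hand_over_sensor", "right"]:   # open the menu, then swipe right
    ui.on_event(event)
```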

    Conclusions

    Advances in connectivity and autonomous driving have posed challenges for HMIs, namely the presentation of large amounts of information, monitoring the situation of the driver, and new types of display and input. In response, Clarion has developed a range of technologies that include InfoSeat, workload estimation, voice recognition, and gesture control, and has demonstrated their efficacy. The results of this work have been deployed in products such as car navigation systems.

    The progress of information and communications technology is rapid, with new applications and services appearing all the time. Further progress is also being made on automating driving and providing more sophisticated forms of assistance. It is anticipated that this will in turn raise new challenges for HMIs due to the larger amounts of information to be dealt with in the vehicle. In the future, Clarion intends to identify these new challenges quickly and to supply customers with further enhancements to its HMIs.

    REFERENCES

    1)
    D. Norman, “The Design of Future Things,” (May 2009).
    2)
    “Human Factors Design Guidance For Driver-vehicle Interfaces,” NHTSA (2016).
    3)
    K. Sonoda et al., “An Assistant Method for the Situation Awareness of a Driver using Haptic Seat in Highly Automated Driving,” Proceedings of JSAE Annual Congress (May 2017) in Japanese.
    4)
    S. Kaplan et al., “Driver Behavior Analysis for Safe Driving: A Survey,” IEEE Transactions on Intelligent Transportation Systems, Vol. 16, Issue 6 (Dec. 2015).
    5)
    N. Uchida et al., “A Study of Driver Workload Estimation by the VACP Method,” Transactions of the Society of Automotive Engineers of Japan 46(6), pp. 1171–1176 (Nov. 2015) in Japanese.
    6)
    T. Homma et al., “Development of Speech Processing Technology for In-vehicle Device Use of Cloud-type Speech Recognition Service,” IPSJ SIG Technical Report, Spoken Language Processing (SLP), 2013-SLP-98(6) (Oct. 2013) in Japanese.
    7)
    T. Yoshinaga et al., “Hand-snap Gesture Recognition by Using Hand-motion and Hand-shape Estimated with Random Forest Classifier,” IEICE technical report, 113 (197), pp. 119–124 (Sep. 2013) in Japanese.
    8)
    S. Takada et al., “Study of Reducing Driver Distraction Caused by Operating in Vehicle Apparatus while Driving, by an Effective Combination of Gesture Operation and HUD,” The 77th National Convention of IPSJ, 3E-02 (Mar. 2015) in Japanese.