Objectives and activities

Psychophysiological model of the driver

Assessing the driver’s state is essential for adapting and tailoring the interaction between the driver and the vehicle in the shared control model. The goal of this Work Package is to assess several psychophysiological indicators (e.g., ECG, EDA and EEG) and eye movements in experimentally manipulated special driving situations in a driving simulator. These situations include monotonous driving, driving under sleepiness and fatigue, in-cabin noise and distraction (e.g., from family members), and high attentional load (e.g., fog and limited visibility, high traffic density, speeding). In addition, the same indicators will be assessed in the field, with test drivers driving a real car. Advanced signal processing techniques will be applied to the various measures in order to model the driver’s fitness to drive continuously and in real time. In this regard, intelligent multimodal fusion of brain and peripheral signals will be investigated, with the expectation of detecting the driver’s fitness to drive more accurately.
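
As an illustration only, the sketch below shows one possible form such feature-level fusion could take: simple ECG, EDA and EEG features are computed per analysis window and combined in a single classifier. The sampling rate, window length, feature choices and the binary fitness label are assumptions made for the example, not design decisions of the project.

```python
# Minimal sketch of feature-level fusion of ECG, EDA and EEG windows.
# All window lengths, thresholds and the binary "fit to drive" label are
# illustrative assumptions, not project specifications.
import numpy as np
from scipy.signal import welch, find_peaks
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

FS = 256  # assumed common sampling rate (Hz) for the example

def window_features(ecg, eda, eeg, fs=FS):
    """Concatenate simple per-signal features for one analysis window."""
    # ECG: heart rate estimated from the R-peak count in the window
    peaks, _ = find_peaks(ecg, distance=int(0.4 * fs))
    heart_rate = 60.0 * len(peaks) / (len(ecg) / fs)
    # EDA: tonic level approximated by the window mean
    eda_level = float(np.mean(eda))
    # EEG: alpha/beta band-power ratio, often related to vigilance
    f, pxx = welch(eeg, fs=fs, nperseg=min(len(eeg), fs * 2))
    alpha = pxx[(f >= 8) & (f < 13)].sum()
    beta = pxx[(f >= 13) & (f < 30)].sum()
    return np.array([heart_rate, eda_level, alpha / (beta + 1e-9)])

# Toy training data: one feature vector per labelled window (1 = fit, 0 = unfit)
rng = np.random.default_rng(0)
X = np.stack([
    window_features(rng.standard_normal(FS * 10),
                    rng.standard_normal(FS * 10),
                    rng.standard_normal(FS * 10))
    for _ in range(40)
])
y = rng.integers(0, 2, size=40)

clf = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, y)
# At run time, each new window is scored as a probability of fitness to drive
print(clf.predict_proba(X[:1])[0, 1])
```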

 

Human-Vehicle interaction model for intervention

This activity focuses on the Human-Vehicle Interaction aspects involved when the driver intervenes in the system. In particular, three main objectives can be highlighted:

♦ Design of a guidance system to support the physical interaction with the vehicle
♦ Design of multimodal interaction for warning messages
♦ Dynamic adaptation of the interactions, taking into account the context and the driver state

Overall, the purpose is to improve alert messages and guidance feedback to the driver during take-over requests.
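
As an illustration of the third objective, the sketch below shows one way the warning modalities for a take-over request could be selected from the driver state and the driving context. The state fields, thresholds and modality names are assumptions made for the example, not the project’s actual interaction design.

```python
# Minimal sketch of context- and state-dependent selection of warning
# modalities for a take-over request. The state fields, thresholds and
# modality names are illustrative assumptions, not project specifications.
from dataclasses import dataclass

@dataclass
class DriverState:
    drowsiness: float      # 0 (alert) .. 1 (asleep), e.g. from the driver model
    visual_load: float     # 0 .. 1, e.g. derived from gaze dispersion

@dataclass
class Context:
    cabin_noise_db: float  # in-cabin noise level
    time_budget_s: float   # time available before manual control is needed

def takeover_warning(state: DriverState, ctx: Context) -> list[str]:
    """Return the set of modalities used to announce the take-over request."""
    modalities = ["visual"]                    # baseline: icon on the cluster
    if state.visual_load > 0.6 or state.drowsiness > 0.3:
        modalities.append("auditory")          # add a chime if eyes are busy
    if ctx.cabin_noise_db > 70:
        modalities.append("haptic")            # seat/wheel vibration if noisy
    if state.drowsiness > 0.7 or ctx.time_budget_s < 5:
        modalities = ["visual", "auditory", "haptic"]  # escalate fully
    return modalities

print(takeover_warning(DriverState(drowsiness=0.8, visual_load=0.2),
                       Context(cabin_noise_db=65, time_budget_s=8)))
```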

 

Human-Vehicle interaction model for supervision

The objective of this Work Package is to investigate advanced interaction modalities based on full-body and multisensory experiences, with the aim of improving safety and user experience in semi-autonomous vehicles. In particular, it investigates how human-vehicle interfaces can be designed to be operated at different levels of attention, with a special focus on peripheral interaction to support supervision and situational awareness. Wearable and ubiquitous computing will act as technological enablers of novel interaction modalities that are more natural and can additionally reduce the driver’s cognitive workload.
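
Purely as an illustration of peripheral interaction, the sketch below maps the automation’s confidence to a low-salience ambient cue that escalates to the focal warning logic only when supervision actually requires attention. The channels, thresholds and mapping are hypothetical.

```python
# Minimal sketch of a peripheral (ambient) cue for supervision: the
# automation's confidence is mapped to an unobtrusive signal that only
# demands focal attention when confidence drops. Channel names and
# thresholds are illustrative assumptions.
def ambient_cue(automation_confidence: float) -> dict:
    """Map confidence in [0, 1] to a low-salience wearable/ambient output."""
    if automation_confidence > 0.8:
        # calm state: slow ambient light pulse at the edge of vision
        return {"channel": "ambient_light", "rate_hz": 0.2, "intensity": 0.2}
    if automation_confidence > 0.5:
        # degraded state: faster pulse plus a gentle wristband tap
        return {"channel": "ambient_light+haptic", "rate_hz": 1.0, "intensity": 0.5}
    # low confidence: hand over to the focal warning logic (previous sketch)
    return {"channel": "focal_warning", "rate_hz": None, "intensity": 1.0}

print(ambient_cue(0.9))
```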