The Adaptive Multimodal In-Car Interaction (AMICI) project pages are finally ready:
The AMICI project develops methods for machine understanding of the driving situation, in particular monitoring the driver's state (e.g., gaze direction and alertness). This makes it possible to tailor the information presented to the situation and its urgency. Another focus is multimodal interaction: information can be shown on screens, played as audio, or delivered as haptic feedback, with the modality chosen according to the kind of information and the current driving situation. Entirely new possibilities could come from LIDAR sensing inside the car, which would improve monitoring of the occupants' state and enable new interaction methods.
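To make the modality-selection idea concrete, here is a minimal rule-based sketch in Python. The function name, inputs, and thresholds are illustrative assumptions for this post, not AMICI's actual policy, which would presumably be learned from or adapted to real driver-state estimates.

```python
from enum import Enum, auto

class Modality(Enum):
    VISUAL = auto()
    AUDIO = auto()
    HAPTIC = auto()

def select_modality(urgency: float, gaze_on_road: bool,
                    cognitive_load: float) -> Modality:
    """Pick an output modality from message urgency and driver state.

    All thresholds below are placeholder values for illustration only.
    """
    if urgency > 0.8:
        # Urgent warnings bypass the visual channel entirely.
        return Modality.HAPTIC
    if not gaze_on_road:
        # Driver is looking away from the screens: prefer audio.
        return Modality.AUDIO
    if cognitive_load > 0.7:
        # High load: fall back to a less visually demanding channel.
        return Modality.AUDIO
    return Modality.VISUAL
```

For example, a routine navigation hint while the driver watches the road would come out as `VISUAL`, while the same hint with the driver glancing at a passenger would switch to `AUDIO`.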