Our MMIG group carries out joint projects with other leading research groups globally. We often collaborate with companies in joint research projects or under research contracts. The funding typically comes from Business Finland, the Academy of Finland or European Union programs.

Current Projects – MMIG group

2021-2023 Adaptive Multimodal In-Car Interaction (Amici)

Project Goals

  • Develop technologies for human–car interaction
  • Prepare for autonomous, electric transportation
  • Understand and predict the needs of future drivers
  • Create a software/hardware framework for in-car HCI that multiple companies can easily integrate

Cars are computers-on-wheels

  • Autonomous, electric, connected
  • Software companies will dominate the auto industry
  • How to make that software safe, informative, easy?
  • In semi-autonomous cars, drivers are just supervisors, like airline pilots (but lack training)

Rethink Future Vehicle Interaction

  • AI to understand traffic & driver (e.g. fatigue)
  • Rich sensors: LIDAR, 3D cameras, 4D point clouds, head and gaze tracking, speech, haptics
  • Optimize safety & reliability of car–human symbiosis, show the right info at the right time via multiple senses
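One common camera-based indicator of driver fatigue is PERCLOS, the percentage of time the eyes are (nearly) closed over a time window. The sketch below is purely illustrative – the function names, thresholds and sample data are assumptions, not part of any project deliverable:

```python
# Illustrative sketch of PERCLOS-style fatigue detection from eye-tracking data.
# Thresholds and names are hypothetical, chosen only to make the idea concrete.

def perclos(eye_openness, closed_threshold=0.2):
    """Fraction of samples in which the eye counts as closed.

    eye_openness: per-frame openness values in [0, 1] from an eye tracker.
    """
    if not eye_openness:
        return 0.0
    closed = sum(1 for o in eye_openness if o < closed_threshold)
    return closed / len(eye_openness)

def is_fatigued(eye_openness, perclos_limit=0.15):
    """Flag fatigue when the eyes are closed for more than 15% of the window."""
    return perclos(eye_openness) > perclos_limit

# Alert driver: eyes stay open throughout the window.
alert = [0.9, 0.8, 0.85, 0.6, 0.9, 0.95]
# Drowsy driver: long stretches of near-closed eyes.
drowsy = [0.9, 0.1, 0.05, 0.1, 0.15, 0.9]

print(is_fatigued(alert))   # False
print(is_fatigued(drowsy))  # True
```

A real system would compute this over a sliding time window and fuse it with other signals (head pose, steering behavior) before alerting the driver.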

2018-2022 Multimodal Interaction in Autonomous Vehicles (MIAuV)

2018-2021 Digital and Physical Immersion in Radiology and Surgery (DPI)

The DPI research project aims at co-creating and developing innovative technical solutions for medical and surgical use. The project begins with a co-creation phase, followed by a co-innovation phase in which virtual reality, artificial intelligence and deep learning technologies are applied. The focus is both on creating a new channel for communication and on creating new value for radiology. Interpreting three-dimensional body tissues from two-dimensional image slices takes doctors years to learn. This project produces more realistic, immersive 3D models with AR and VR technologies, providing surgeons and other medical staff with new visual data.

DPI is funded by Business Finland.

XAI (Explainable AI): an area of AI research that seeks to make the behavior of AI systems understandable to humans.
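One simple way to make a model's output understandable is occlusion-based feature attribution: zero out each input in turn and measure how much the prediction changes. The toy model and feature names below are illustrative stand-ins, not from any project:

```python
# Minimal sketch of one XAI technique: occlusion-based feature attribution.
# The "model" is a hand-coded linear scorer standing in for a black box.

def model(features):
    """Toy risk score over named inputs (weights are arbitrary)."""
    weights = {"speed": 0.6, "rain": 0.3, "hour": 0.1}
    return sum(weights[name] * value for name, value in features.items())

def explain(features):
    """Attribute the prediction to each input by zeroing it out in turn."""
    baseline = model(features)
    attributions = {}
    for name in features:
        occluded = dict(features, **{name: 0.0})
        attributions[name] = baseline - model(occluded)
    return attributions

x = {"speed": 1.0, "rain": 0.5, "hour": 0.2}
print({name: round(a, 3) for name, a in explain(x).items()})  # speed dominates
```

The appeal of occlusion is that it treats the model as a black box; for deep networks, gradient-based methods play a similar role.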

2019-2022 Human-Centered Artificial Intelligence (KITE)

Acceptance of artificial intelligence (AI) applications requires that the developers of AI solutions consider end-users’ needs and concerns. This project focuses on co-creating AI solutions, and the methods for designing them, together with companies, drawing on applied research on human-centered technology. The aim is to create knowledge and skills for designing and developing AI solutions that are meaningful, understandable, acceptable and ethically sustainable for their users.

KITE is funded by the European Regional Development Fund.

2018-2022 Augmented Eating Experiences (AEE)

The aim of the Augmented Eating Experiences project is to understand how augmented reality (AR) and virtual reality (VR) technologies that stimulate the senses of sight, hearing, touch, taste and smell contribute to the perception of food texture and to eating experiences.

Tampere Unit for Computer-Human Interaction (TAUCHI) will cooperate with VTT Technical Research Centre of Finland to create novel technologies for augmented eating and evaluate their impact on eating experience and promotion of health and wellbeing.

AEE is funded by the Academy of Finland.

The project seeks to transform eating experiences through digital means – for instance, machine-generated scents or augmented reality views.

Finished Projects – MMIG group

2019-2021 Multimodal In-Vehicle Interaction and Intelligent Information Presentation (MIVI)

Today’s cars have already become computers with wheels, and this trend will continue at full pace. Car manufacturers are developing new information technologies and new ways to present the driver with different types of information as needed. For the driver, however, the new systems unfortunately mean more things to learn and control. Especially in exceptional circumstances – such as near misses, traffic jams and unfavorable weather conditions – it is important that people receive exactly the information they need without being overloaded.

The MIVI project worked in the field of automotive human-technology interaction, studying and developing ways by which drivers are able to control the functions of vehicles more naturally. Another key area of research was the presentation of information through different human senses.

MIVI was funded by Business Finland.

Picture of fictitious car dashboard. Design by siili_auto.

2016-2018 Virtual and Augmented Reality Content Production and Use (VARPU)

The VARPU project helped companies adopt virtual reality (VR) and augmented reality (AR) techniques in their production and business. We saw two major bottlenecks: first, 3D content production required a great deal of manual artistry and was thus costly; second, interaction techniques for these systems were still cumbersome. Together with leading research groups, we addressed these problems in selected use cases of our partner companies.


VARPU was funded by Tekes/Business Finland.

2012-2016 Haptic Gaze Interaction (HAGI)

The HAGI project worked in the field of human-technology interaction, more specifically on multimodal interaction techniques. The project began with basic research on combining eye pointing and haptic interaction. The basic research findings were then applied in a constructive research process in two areas: augmentative and alternative communication systems, and interaction with mobile devices. Haptic feedback was expected to make gaze-based user interfaces far more practical on mobile devices than previously possible, opening new opportunities for natural and efficient interaction.
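A classic way to combine eye pointing with haptics is dwell-time selection: the target is picked once the gaze rests on it long enough, and a haptic pulse confirms the pick. The sketch below is a hypothetical illustration of that pattern, not the HAGI implementation; all names and thresholds are assumptions:

```python
# Illustrative dwell-time gaze selection with a haptic confirmation pulse.
# Gaze samples are (x, y) points arriving at a fixed sample rate.

DWELL_SAMPLES = 5      # consecutive on-target samples needed to select
TARGET_RADIUS = 50.0   # hit radius around the target, in pixels

def hit(target, sample):
    """True if the gaze sample falls within the target's hit radius."""
    tx, ty = target
    x, y = sample
    return (tx - x) ** 2 + (ty - y) ** 2 <= TARGET_RADIUS ** 2

def dwell_select(target, gaze_stream, haptics):
    """Select the target after a sustained fixation, confirming via haptics."""
    run = 0
    for sample in gaze_stream:
        run = run + 1 if hit(target, sample) else 0  # reset on gaze departure
        if run >= DWELL_SAMPLES:
            haptics.append("pulse")  # stand-in for a vibrotactile actuator call
            return True
    return False

pulses = []
steady_fixation = [(100, 100)] * 6           # gaze resting on the target
print(dwell_select((100, 100), steady_fixation, pulses), pulses)  # True ['pulse']
```

The haptic pulse lets the user feel the selection immediately, without waiting for visual confirmation – the kind of benefit the project studied for mobile, eyes-busy settings.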

HAGI was a joint project with VIRG, funded by the Academy of Finland.

2012-2015 DIGILE Digital Services Program (DS)

The DIGILE Digital Services program was a major investment in the future of the digital services business in Finland. The academic coordination of the research program was at the University of Tampere. TAUCHI’s main research topics in this program were connected to different kinds of education services.

2010-2015 UXUS

In the UXUS project, we developed advanced user interfaces for the machinery industry and studied the resulting new user experience in this context.

2011-2015 RYM Indoor Environment

The Tekes-funded project developed innovative learning environments.

2012-2014 AVO2

The project developed advanced methods for teaching, employing augmented reality technologies.

2012-2013 Active Learning Spaces (Aktiiviset oppimistilat)

The Tekes-funded project developed innovative and multimodal learning spaces for various education levels.

2009-2012 DREX

In the DREX project we used theatre work and interactive spatial technology to study new types of evental spaces. We built demonstrations and applications and worked in close collaboration with industrial partners (including STX Finland, Heureka, the Turku 2011 Foundation and SKS Mechatronics). In DREX we collaborated with the Centre for Practice as Research in Theatre and with Theatre and Drama Research. The project was funded by the Tekes Spaces and Places programme. International cooperation partners included Trinity College Dublin, the University of Queensland and Teesside University.

2011-2013 EnergyLand

The Tekes-funded project developed innovative energy solutions.

2011-2012 MOBSTER

The MOBSTER project developed speech-based applications for health care.

2009-2012 DIYSE

Do-it-Yourself Smart Experiences (DiYSE) was a Eureka ITEA2 consortium project. DiYSE aimed at enabling ordinary people to easily create, set up and control applications in their smart living environments as well as in the public Internet-of-Things space, allowing them to leverage aware services and smart objects to obtain highly personalised, social, interactive, flowing experiences at home and in the city.

2008-2012 DIEM

The Tekes-funded project developed scalable smart space interoperability solutions and novel user interfaces.

2011-2013 Haptic Auto

The Tekes-funded project developed new ways of interacting in cars.

2009-2012 Haptic Depth

The project, funded by the Academy of Finland, explored depth perception and non-pictorial cues in haptically reinforced interaction with 3D.


2009-2012 Global Face Analysis

The project developed new ways of utilizing automatic face analysis in social-emotional human-technology interaction (HTI) and vision-based user interfaces. The main objective was to design and implement a new type of user interface that incorporates different machine vision techniques for analyzing global facial information received from the user.

2010-2012 Human-Technology Interaction

The development program created multidisciplinary collaboration and new projects.

2009-2011 HAPIMM

The Tekes-funded project developed haptic user interfaces.

2006-2011 Multimodal Interfaces for Transfer of Skills

SKILLS was an Integrated Project in the European IST FP6 programme. It dealt with the acquisition, interpretation, storing and transfer of human skills by means of multimodal interfaces, robotics, virtual environments and interaction design. The novel approach of SKILLS was based on enactive paradigms of interaction between the human and the interface system.

2008-2009 Multimodal Gaming for Promoting Health

The objective of the TERVI project was to carry out user-centered research and apply the newest interaction and software technology to developing interactive games designed to influence the health awareness and habits of young people. The aim was to have an impact across the whole life cycle by educating the young, through gaming, to be aware of and care for their personal health.

2006-2009 Mobile Haptics

The joint research project between TAUCHI and Stanford University, USA, focused on haptic feedback in unimodal and multimodal user interfaces. Novel device and software prototypes from constructive research were studied.

2007-2009 Täplä

The Tekes-funded project developed methods for new types of ubiquitous applications based on sound, speech, machine vision and multimodality. The new technology aimed to improve the safety, comfort, automation, cost and quality of end-user applications.

2007-2010 VISCOLE – Multimodal Interaction for Visually Impaired School Children

VISCOLE was funded by the Academy of Finland. A multimodal learning environment was constructed and studied. It supported the collaboration and conceptual learning of visually impaired children in primary school groups. Multimodal interaction technology was used to make the system accessible, and methods to support children’s collaboration were developed.

2008 Haptic Teaching 2.0

We created applications with haptic feedback for teaching physical phenomena. The research was conducted at Ylöjärvi comprehensive school in collaboration with teachers. By mimicking real-world physics, and by enabling the sense of touch as a new way to sense and control objects, forces and materials, the applications offered new methods for learning.