Project Details
SPP 2236: Auditory Cognition in Interactive Virtual Environments – AUDICTIVE
Subject Area
Mechanical and Industrial Engineering
Computer Science, Systems and Electrical Engineering
Medicine
Social and Behavioural Sciences
Term
since 2020
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 422686707
Considerable progress has been made over the past years in the understanding of auditory cognitive processes and capabilities: from perception, attention, and memory to complex performances such as scene analysis and communication. AUDICTIVE aims to significantly extend the knowledge of hearing-related cognitive performances in real-life scenes and to enable the creation of "auditory-cognition-validated" VR technology. AUDICTIVE targets fundamental research addressing the three research priorities (a) "auditory cognition," (b) "interactive audiovisual virtual environments," and (c) "quality evaluation methods," the latter being located at the interface between (a) and (b).

The first phase of AUDICTIVE focussed mostly on porting the well-controlled but often unrealistic stimulus presentations used in auditory cognition research to more comprehensive virtual or mixed reality environments. This reflected recent developments in hardware and software technologies, with audiovisual virtual and mixed reality (VR, MR) reaching a high level of perceptual plausibility. The results of the first phase have laid the foundation for the second phase of AUDICTIVE, which aims to identify or (further) develop suitable paradigms for more realistic scenes that elicit close-to-natural perception, experience, and/or behaviour. Hence, in the second phase of AUDICTIVE, we seek to further expand the scientific level of knowledge, theories, and models developed within basic auditory perception and cognition research to even more realistic daily-life situations. New knowledge, methods, and techniques shall be generated (e.g., psychometric and cognitive assessment, QoE evaluation, physiological or behavioural analysis, signal acquisition and analysis, VR/MR technology enhancement) for richer and more complex scenarios involving interactive VR and/or MR technology.

The coordination project will foster an open science approach in order to continue developing a comprehensive database of results. All projects are to include an evaluation of the quality of VR and/or MR environments for research into auditory cognition, or of the validity of research results on auditory cognition in VR and/or MR environments. All projects have to contribute to the central research data management (RDM). In addition, the coordination project will put the RDM into practice by conducting a round-robin test at different facilities. Such a test makes the value of good RDM evident, because exchanging experiments, scenes, and the technology used is only possible if all the various steps are well documented and available; only in this way are reproducible experiments possible. Furthermore, the coordination project will coordinate a book publication that aims to disseminate the results from both funding periods to a broad audience. Contributing authors and editors will include experts from the fields of acoustics, psychology, computer science, and beyond.
DFG Programme
Priority Programmes
Projects
- APlausE-MR - Audiovisual Plausibility and Experience in Multi-Party Mixed Reality (Applicants Brandenburg, Karlheinz; Fröhlich, Bernd; Raake, Alexander)
- Audio-visual perception of vehicles in traffic, phase 2 (Applicants Altinsoy, Ercan; Oberfeld-Twistel, Daniel)
- Cognitive and signal-driven factors in static and dynamic distance perception (Applicants Ewert, Stephan; Flanagin, Virginia; van de Par, Steven)
- Coordination Funds (Applicant Fels, Janina)
- Cortical and behavioural measures of active communication (Applicants Debener, Stefan; Hohmann, Volker)
- Evaluating auditory cognition in classroom environments across age groups from preschool children to adults using audiovisual virtual reality - EArAge-VR (Applicants Fels, Janina; Klatte, Maria; Raake, Alexander)
- EVOLVE-QoE - Ecological Validity Evaluation of Interactive Virtual Environments: A QoE Framework for Audiovisual Scenes (Applicants Habets, Emanuël; Raake, Alexander)
- Examining attention, memory performance, and listening effort (exAMPLE): Understanding listeners' cognitive performances in complex audiovisual communication settings with embodied conversational agents (Applicants Fels, Janina; Kuhlen, Torsten Wolfgang; Schlittmeier, Sabine J.)
- Influence of audio rendering in virtual environments on realism, presence, and socio-cognitive processing (Applicants Blau, Matthias; Kroczek, Leon; Mühlberger, Andreas; van de Par, Steven)
- Investigating Auditory and Audio-Visual Motion Perception in Real and Virtual 3D-Spaces as well as Effects of Attention and Training (Applicants Getzmann, Stephan; Martin, Rainer)
- Mechanisms of spatial attention and integration along the three dimensions of auditory space (Applicants Schönwiesner, Marc; Weinzierl, Stefan)
Spokesperson
Professor Dr.-Ing. Janina Fels