Project Details

Auditory perception of sound reflections and source localization in dynamic scenes

Subject Area: Acoustics
Term: 2014 to 2019
Project identifier: Deutsche Forschungsgemeinschaft (DFG) - Project number 249703531
 
In addition to visual information, acoustic information enables people to orient themselves in their surroundings. This is especially true in emergencies in smoke-filled rooms, in dark or foggy environments, or for visually impaired people. In complex environments, listeners use early reflections from boundaries (walls, ceilings, floors, large objects) and their relation to the sound source for acoustic orientation. This project studies the auditory perception of such complex acoustic scenes. Different scenes will be evaluated, such as free-field scenes, large rooms, and outdoor scenes (e.g. orientation in a city).

In real-time room acoustic simulations, the perceived early reflections are currently generated with image sources (see the first code sketch below). If the relative orientation or the position of a wall changes due to an approximation in the geometric model of the environment, the position of the image source changes as well, and with it the auditory perception of the scene. The environmental model underlying this project will contain only the dominant early reflections; exterior spaces and listener positions close to the walls of very large rooms are therefore not taken into account.

First, the perception of people with normal hearing (not visually impaired) will be analyzed using simple geometric room models. The maximum inaudible variation of a boundary surface (distance/rotation) depends on the binaural parameters, i.e. the interaural time and level differences, and on psychoacoustic factors. In the first phase, the correlation between the binaural parameters, the psychoacoustic factors, and the (few) image sources will be studied as a function of the wall size, the head movement, and the distance of the test subject from the wall.

In the second phase, the perception of a virtual acoustic scene will be studied, taking into account head movement as well as the distance between source and listener. Several sources at different positions will be integrated into the scenes. The degree of coherence, i.e. the similarity of the source signals, will be kept to a minimum to avoid phantom sources (see the coherence sketch below). Experiments will determine how the distance between sound source and test subject and the head movement influence the maximum number of audible sources.

The results of this project can be used to create room models for virtual acoustic reality. Another possible application is virtual acoustic training for visually impaired persons, enabling them to orient themselves in a city or an unfamiliar environment.
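To make the image-source construction concrete, here is a minimal sketch in Python (the helper name and scene geometry are illustrative; the abstract does not prescribe an implementation). It mirrors a source position across a wall plane and shows how shifting or rotating the wall displaces the image source, which is exactly the kind of geometric variation whose audibility the first phase investigates.

```python
import numpy as np

def image_source(source, wall_point, wall_normal):
    """Mirror a source position across a plane (first-order image-source model).

    The image source lies at s' = s - 2 * ((s - p0) . n) * n,
    where p0 is any point on the wall and n is its unit normal.
    """
    n = np.asarray(wall_normal, dtype=float)
    n = n / np.linalg.norm(n)
    s = np.asarray(source, dtype=float)
    p0 = np.asarray(wall_point, dtype=float)
    return s - 2.0 * np.dot(s - p0, n) * n

# Source 2 m in front of a wall lying in the plane x = 0.
source = np.array([2.0, 1.0, 1.5])
img = image_source(source, wall_point=[0.0, 0.0, 0.0], wall_normal=[1.0, 0.0, 0.0])

# Perturb the wall: shift it outward by 5 cm, or rotate its normal by 2 degrees.
# Both perturbations move the image source and thereby the perceived reflection.
img_shifted = image_source(source, wall_point=[-0.05, 0.0, 0.0],
                           wall_normal=[1.0, 0.0, 0.0])
theta = np.radians(2.0)
img_rotated = image_source(source, wall_point=[0.0, 0.0, 0.0],
                           wall_normal=[np.cos(theta), np.sin(theta), 0.0])

print("image source:         ", img)
print("after 5 cm wall shift:", img_shifted,
      "displacement:", np.linalg.norm(img_shifted - img))
print("after 2 deg rotation: ", img_rotated,
      "displacement:", np.linalg.norm(img_rotated - img))
```

Note that a 5 cm wall shift moves the image source by 10 cm, twice the wall displacement, since the mirror path lengthens on both legs.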
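The binaural parameters can be illustrated in the same spirit. A common textbook approximation, assumed here rather than taken from the abstract, is Woodworth's spherical-head model for the interaural time difference of a far-field source:

```python
import numpy as np

def itd_woodworth(azimuth_deg, head_radius=0.0875, c=343.0):
    """Interaural time difference (s) from Woodworth's spherical-head model.

    For a far-field source at azimuth theta (0 = straight ahead,
    90 deg = fully lateral): ITD = (a / c) * (theta + sin(theta)).
    """
    theta = np.radians(azimuth_deg)
    return head_radius / c * (theta + np.sin(theta))

for az in (0, 30, 60, 90):
    print(f"azimuth {az:2d} deg -> ITD = {itd_woodworth(az) * 1e6:6.1f} us")
```

With the assumed head radius of 8.75 cm this yields roughly 660 us at 90 degrees, the usual order of magnitude for the maximum human ITD.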
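The coherence between two source signals, which the second-phase experiments aim to minimize to avoid phantom sources, can be quantified with the magnitude-squared coherence. Below is a sketch using scipy.signal's Welch-based estimator (one possible estimator, not specified by the project):

```python
import numpy as np
from scipy.signal import coherence

fs = 44100
rng = np.random.default_rng(0)

# Two independent noise signals: coherence near 0, so no phantom source forms.
a = rng.standard_normal(fs)
b = rng.standard_normal(fs)

# A third signal that partly copies the first: coherence well above 0, which
# in playback could fuse the two loudspeakers into a single phantom source.
c = 0.7 * a + 0.3 * rng.standard_normal(fs)

f, coh_indep = coherence(a, b, fs=fs, nperseg=1024)
f, coh_corr = coherence(a, c, fs=fs, nperseg=1024)

print("mean coherence, independent sources:", coh_indep.mean())
print("mean coherence, correlated sources: ", coh_corr.mean())
```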
DFG Programme: Research Grants
 
 
