Sunday, March 28, 2010

INTERACTING WITH SOUND

INTERACTING WITH SOUND - AN INTERACTION PARADIGM FOR VIRTUAL AUDITORY WORLDS

1. INTRODUCTION

Vision is considered to be the most important of our senses, the one from which we derive the majority of our environmental information. Hearing is often assumed to play only a minor role, but it extends the horizon set by our visual system. Using visual information, we can clearly identify the environment in front of us within the defined viewing angle. However, as we cannot see through most objects, we can only identify objects that are directly visible or visible through reflection or refraction. One advantage of the auditory field of view is that it enables us to perceive audible information from objects that are hidden or outside the viewing cone. Combined with visual information, it draws a complete picture of our local environment, thus enabling us to interact with it properly [18]. In this paper we describe methods for sonification and interaction in virtual auditory worlds. The focus is on techniques that provide the listener with enough information for clear interaction and navigation. The motivation behind this work is to create a catalogue of sonification and interaction techniques suitable for the exploration of virtual auditory spaces.

2. AUDITORY ELEMENTS

All operations performed in virtual environments require either visual or auditory information to be present. Some applications have been developed that substitute visual knowledge with auditory descriptions. Examples include audio books and radio plays [1], audio-only games [2], sonification techniques for scientific data sets [28], and assistive auditory displays for the navigation of visually impaired people [17], [10]. The pattern of the auditory signals used and their functionality vary depending on the application's requirements. In physical terms, sound is a mechanical vibration transmitted through an elastic medium. It can also be described as the audible part of a transmitted signal that was emitted by a physical process. On a larger level, these auditory phenomena can be grouped together, revealing three basic auditory elements of which every audible sequence in our environment is composed:

Speech,

Music, and

Natural or artificial sounds.

The auditory spectrum is composed of auditory sequences, which themselves are constructed from these three elements. Speech is the verbal transmission of information using words as an abstract representation and is mainly used for communication. Music is the concatenation of tones, resulting in harmonic compositions, and is often used to express or trigger emotions. Music is generally layered on top of speech and sound to accompany the presented information.

The largest group is formed by natural and artificial sounds, which describe audio signals that depict a physical object or process, e.g., starting a car's engine or the sound of leaves rustling in a tree. Technically, music and speech are special cases of sound. Depending on the type and importance of the information, sounds can be further grouped into main sounds, supporting sounds, and ambient information. Each of these auditory elements is best suited to express a certain kind of information. In general, speech is mainly used to transmit knowledge, news, or advice, such as the oral description of a scene.

3. SONIFICATION AND INTERACTION

The most crucial part of audio-only applications is the correct transmission of non-auditory information through auditory channels. It is nearly impossible to describe an image using non-speech sound, or to visualize an opera using a picture or an animation. As a result, speech has emerged to assist in the communication process. Several attempts exist in both directions, some of them with good results.

3.1. Sonification

Sonification is defined as the mapping of abstract data to non-speech sound and is used to transmit arbitrary information through auditory channels. We constantly perceive information from our environment through our sensory apparatus, mostly vision and hearing, which is filtered, analyzed, and interpreted. As we receive large amounts of data, some of it is filtered out at early stages and not actively perceived. Interaction is strongly correlated with sonification. Focusing on the sonification of auditory worlds, several information groups can be identified. These groups can be summarized by the following questions:

Where is something?

What is this?

What can I do with it?

The first question deals with environmental information that allows for orientation and navigation within the world. The second question characterizes the information necessary to identify and analyze objects, while the last question identifies the possible interactions with interactable objects.
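The data-to-sound mapping described above can be illustrated with a minimal parameter-mapping sketch. The function below is not from the paper; all names and parameter values (frequency range, sample rate, note length) are illustrative assumptions. It linearly maps each data value onto a pitch and renders a sequence of sine-tone samples:

```python
import math

def sonify(values, f_min=220.0, f_max=880.0, sample_rate=8000, note_s=0.25):
    """Parameter-mapping sonification sketch: map each data value to a
    pitch between f_min and f_max and render the tone sequence as raw
    floating-point samples in [-1, 1]."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero for constant data
    samples = []
    for v in values:
        # Linear mapping: smallest value -> f_min, largest -> f_max.
        freq = f_min + (v - lo) / span * (f_max - f_min)
        n = int(sample_rate * note_s)
        samples.extend(
            math.sin(2 * math.pi * freq * i / sample_rate) for i in range(n)
        )
    return samples

# A rising data series becomes a rising three-note melody.
tones = sonify([0.0, 0.5, 1.0])
```

A real auditory display would add amplitude envelopes and timbre choices to avoid clicks and to encode further data dimensions; this sketch only shows the core pitch mapping.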

3.2. Interaction

To effectively convey information to the listener, human hearing behavior has to be incorporated into the sonification process. Humans often tilt their head to determine the location of a sound source in ambiguous cases or when listening very carefully. Technically, this behavior results in a different input angle for the sound signal and helps both to locate the sound source accurately and to listen in a more focused way. When interacting in virtual auditory environments, special care has to be taken with the sound-localization process, as it is mandatory for determining one's own position within the environment. Additionally, spatialized sound can assist in discriminating between several audio signals if they originate from different locations. However, if too many sound sources are presented at the same time, the auditory display can easily become cluttered, resulting in a meaningless sonification.
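The localization cues mentioned here can be sketched numerically. The snippet below is not from the paper: it combines Woodworth's classic spherical-head approximation of the interaural time difference (ITD) with simple constant-power panning as a stand-in for the interaural level difference; the head radius is an assumed average value.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C
HEAD_RADIUS = 0.0875    # m, assumed average human head radius

def interaural_cues(azimuth_deg):
    """Return (itd_seconds, left_gain, right_gain) for a source at the
    given azimuth: 0 deg = straight ahead, positive = to the right.
    ITD follows Woodworth's formula r/c * (sin(theta) + theta);
    gains use constant-power panning so left^2 + right^2 == 1."""
    az = math.radians(azimuth_deg)
    itd = HEAD_RADIUS / SPEED_OF_SOUND * (math.sin(az) + az)
    pan = (math.sin(az) + 1.0) / 2.0  # 0 = hard left, 1 = hard right
    left_gain = math.cos(pan * math.pi / 2.0)
    right_gain = math.sin(pan * math.pi / 2.0)
    return itd, left_gain, right_gain
```

A frontal source (0 degrees) yields zero ITD and equal gains, which is exactly the ambiguous case in which listeners tilt their head: the head movement changes the effective azimuth and thus produces a nonzero, disambiguating ITD.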

4. DESIGNING A FRAMEWORK

Based on the discussion in the previous sections, this section lays out an initial design for a framework that allows intuitive and easy interaction with narrative environments. The focus of this framework is its later use in narrative environments for interactive adventures, utilizing only positional and environmental audio as information sources. A further goal is to design the framework as openly as possible, to allow easy adaptation to other fields and to explore possible applications in tele-conferencing, audio-action games, mobile auditory displays, and general non-visual user interfaces based on 3D audio. This section discusses work in progress. The motivation behind this work is to create an immersive non-visual user interface that can interactively guide a user through entertaining auditory worlds, or to use such interfaces for mobile applications, where keeping the visual field free is mandatory. The design was mainly motivated by the actions the listener should be able to accomplish.

By Niklas Röber and Maic Masuch

From: Games Research Group, Institut for Simulation and Graphics, Magdeburg, Germany

" A study of the interaction we have with sound in the most diverse environments.

The interplay between speech, music, and natural or artificial sounds within a game is extremely important for the atmosphere and for orienting the player. The sound has to match the sequence of actions the player takes in order to add realism, as does the game's design."
