Spatially Coordinated Auditory/Tactile Interactive Scenario


SCATIS - 6358

Work Area: Multi-Modal Human/Machine Interaction, Virtual Reality

Keywords: multi-modal human/computer interaction, multi-modal human factors, virtual reality, multimedia, advanced interfaces, telepresence and control, HDTV, information retrieval, robotics


Start Date: 1 September 92 / Duration: 36 months / Status: running



Abstract Research based on the following scenario is proposed: A subject is exposed to a virtual space with various (invisible) auditory/tactile objects distributed in it. He/she will localise and identify these virtual objects auditorily, and be able to reach for them and grasp them individually. Upon tactile contact, contour, texture, and thermal attributes of the virtual objects will be perceived. It is the task of the subject to move the objects around manually, ie to re-arrange their spatial position and orientation according to an experimental plan. Auditory feedback is given.


Aims

SCATIS is a basic-research project in the area of Virtual Reality and Multimodal Human/Machine Interaction. It provides an advanced laboratory system, SCAT-LAB, for the generation of an interactive virtual auditory/tactile scenario - plus Human Factors research in this scenario. The scenario definition is as follows:

A subject is exposed to a virtual space with various (invisible) auditory/tactile objects distributed in it. He/she will localise and identify these virtual objects auditorily, and be able to reach for them and grasp them individually. Upon tactile contact, contour, texture, and thermal attributes of the virtual objects will be perceived. It is the task of the subject to move the objects around manually, ie to re-arrange their spatial position and orientation according to an experimental plan. Auditory feedback is given.

Approach and Methods

The SCATIS experimental scenario is considered to be representative of a variety of engineering problems in Multi-Media and Virtual-Reality applications with human beings involved in, eg, control, guidance, surveillance, access or evaluation tasks - not to mention training and entertainment. It has been chosen because the auditory and tactile domains are essential for intuitive Human/Machine Interaction, but still lack sufficient exploration with IT engineering in mind. Vision is not addressed in the project, so as to avoid distraction and masking effects due to sensory dominance which might affect selective analysis of the auditory/tactile modalities; however, binocular visors can be interfaced to SCAT-LAB for software verification purposes.

Hardware includes a novel 36-channel binaural processor, a hand-mounted assembly with novel tactile effectors to transmit contour, texture and thermal information, as well as the equipment necessary for tracking head and wrist-palm-finger position and orientation. Software includes a virtual-world system, renderers, drivers, interfaces, a subject monitor, a task controller and an editor. The complete system runs in real time. As soon as the components and the complete SCAT-LAB become available, multisensory pilot studies in human factors will be conducted. The research issues to be addressed are selected in order to support the construction of interactive VR systems. Towards the end of the current project, a detailed experimental plan for the application of SCAT-LAB in multimodal interactive Human-Factors research will be established.
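As a generic illustration of one step such a binaural renderer must perform - standard geometry only, not the project's actual algorithm - the processor needs each object's azimuth relative to the tracked head before it can spatialise its sound:

```cpp
#include <cmath>

// Relative azimuth (radians) of a sound source at (x, z) in the
// horizontal plane, seen from a listener at (lx, lz) whose head is
// turned headYaw radians from the +z axis. Positive means "to the
// right". Generic computation; names and conventions are illustrative.
double relativeAzimuth(double lx, double lz, double headYaw,
                       double x, double z) {
    double absolute = std::atan2(x - lx, z - lz); // world-frame bearing
    double rel = absolute - headYaw;              // rotate into head frame
    // Wrap to (-pi, pi] so left/right is unambiguous.
    while (rel <= -M_PI) rel += 2.0 * M_PI;
    while (rel >   M_PI) rel -= 2.0 * M_PI;
    return rel;
}
```

In a head-tracked system this value is recomputed every time the tracker reports a new head pose, so that the virtual source stays fixed in world coordinates while the listener turns.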

Progress and Results

The work in the project is separated into three main phases: definition and specification, implementation, and integration and evaluation. The definition and specification phase was finished in April 1993 and the first milestone document compiled. To this end, a thorough state-of-the-art analysis was carried out dealing with hardware, software, and psychophysics. The architecture of the SCAT-LAB was defined in terms of logical functionalities and information flow with respect to the interaction of the single SCAT-LAB components. Additionally, the corresponding hardware setup was defined, and the logical interfaces between all components of the SCAT-LAB were formulated.

By October 1993 it is planned to describe the interfaces in a high level programming language (C++) and to develop simple prototypes for most of the software components of the SCAT-LAB. Apart from some simple benchmarks, experimental results are not expected before mid-1994.
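As a rough illustration of what such a C++ interface description might look like - all names below are hypothetical and are not taken from the project documents - a virtual auditory/tactile object could expose its spatial pose to the renderers and the task controller:

```cpp
#include <string>
#include <utility>

// Hypothetical sketch only; the real SCAT-LAB interfaces are not public.
// A 3-D position in the virtual space, in metres.
struct Vec3 {
    double x = 0.0, y = 0.0, z = 0.0;
};

// A virtual auditory/tactile object as the scenario describes it:
// it carries a name and a spatial pose that the subject can change
// by manually re-arranging the object.
class VirtualObject {
public:
    VirtualObject(std::string name, Vec3 position)
        : name_(std::move(name)), position_(position) {}

    // Called when the subject grasps the object and moves it.
    void moveTo(const Vec3& p) { position_ = p; }

    const Vec3& position() const { return position_; }
    const std::string& name() const { return name_; }

private:
    std::string name_;
    Vec3 position_;
};
```

The auditory and tactile renderers would then query such objects each frame, while the task controller compares their poses against the experimental plan.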

Potential

Virtual Reality systems constitute an enabling technology for various applications in modern information technology such as teleconferencing, high-definition TV, robotics (sensing and control), information retrieval, multi-media, and human/computer interaction. This broad technological potential is basically due to the fact that Virtual Reality systems offer the possibility of designing advanced interactive multi-modal human/machine interfaces. The SCATIS consortium is composed of three universities and two SMEs: the universities will disseminate the knowledge and experience gained from this project by means of publication, education and consulting; the SMEs plan on direct exploitation through collaborative product development.

Latest Publications

Information Dissemination Activities

SCATIS has been introduced at various national and international conferences during oral presentations (eg Danske Akustiske Dage). An information package was compiled and is available from the contact point.


Coordinator

Lehrstuhl für allgemeine Elektrotechnik und Akustik
Ruhr-Universität Bochum - D
Universitätsstraße 150
Postfach 102 148
D - 33615 BOCHUM

Partners

Scuola Superiore S. Anna - I

Associate Partners

Head Acoustics GmbH - D
Aalborg Universitet - DK
S.M. Scienza Machinale srl - I

CONTACT POINT

Mr. Jens Blauert and Dr.-Ing. Hilmar Lehnert
tel +49/234 700 2496
fax +49/234 709 4165
e-mail: blauert@aea.ruhr-uni-bochum.de



SCATIS - 6358, August 1994


please address enquiries to the ESPRIT Information Desk

html version of synopsis by Nick Cook