
Home page for DanielArfib

Daniel ARFIB has been working at the Laboratoire de Mecanique et Acoustique, CNRS, in Marseille on fields linked to Computer Music since 1977 (see http://w3lma.cnrs-mrs.fr/~IM/en_index.htm). He first worked on the digital synthesis of sounds; his PhD thesis dealt with the combination of waveshaping synthesis and amplitude modulation, and he was responsible for improvements and implementations of the Music V program. He then worked on sound transformations, the culmination of this period being his coordination of the COST DAFx action and, in particular, his contribution to the DAFx book. In recent years he has been developing both theoretical and practical aspects of the design of digital musical instruments (a project named "creative gesture in computer music"), and is now participating in a new COST action named ConGAS (gesture controlled audio systems) as the leader of a working group on gesture analysis. He is currently an invited researcher in the Geneva Emotion Research Group.

In parallel with this scientific work, he is a composer and performer. He has composed electroacoustic works using computer programs, such as "Voyelles d'Eveil", "Le Souffle du Doux", "L'Imminence de la lumière", "Etoiles" and "Fragments complets". In recent years he has been performing with the "Tutti quanti computing orchestra" (see videos at http://tqco.free.fr/videos.htm) and the "Fotosonix" group, using instruments derived from his research.

His main concern within the "Humaine" field is that digital musical instruments are perfect testbeds for emotional interfaces. As an example, retrieving the intention in a gesture and applying it to the expressivity of a sound requires not only data capture and a synthesis algorithm, but also a mapping strategy, which may include a perceptual analysis of the sound produced. This matter of mapping is a key point in the gesture-sound link, with a focus on feedback, be it sonic, visual or haptic. This can be called "motion and emotion" in digital musical instruments.
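To illustrate the kind of gesture-to-sound mapping layer described above, here is a minimal Python sketch: gesture data is captured as frames, mapped onto synthesis parameters, and smoothed to avoid audible jumps. The gesture features, parameter names and scalings are hypothetical choices for illustration only, not Arfib's actual instruments or mapping strategies.

```python
from dataclasses import dataclass


@dataclass
class GestureFrame:
    position: float   # normalised sensor position, 0.0 .. 1.0
    velocity: float   # normalised gesture speed, 0.0 .. 1.0
    pressure: float   # normalised pressure, 0.0 .. 1.0


@dataclass
class SynthParams:
    amplitude: float   # loudness of the synthesised sound
    brightness: float  # e.g. a waveshaping index or filter control
    vibrato: float     # depth of a slow modulation


def map_gesture(frame: GestureFrame, prev: SynthParams,
                smoothing: float = 0.1) -> SynthParams:
    """Map one gesture frame onto synthesis parameters.

    A one-to-many style mapping: pressure drives both amplitude and
    brightness, velocity adds vibrato.  Exponential smoothing stands in
    for a perceptually informed transition between parameter values.
    """
    target = SynthParams(
        amplitude=frame.pressure,
        brightness=0.3 + 0.7 * frame.pressure * frame.position,
        vibrato=0.5 * frame.velocity,
    )
    s = smoothing
    return SynthParams(
        amplitude=prev.amplitude + s * (target.amplitude - prev.amplitude),
        brightness=prev.brightness + s * (target.brightness - prev.brightness),
        vibrato=prev.vibrato + s * (target.vibrato - prev.vibrato),
    )


# Example: feed a short stream of gesture frames through the mapping.
params = SynthParams(amplitude=0.0, brightness=0.3, vibrato=0.0)
for frame in [GestureFrame(0.2, 0.1, 0.4), GestureFrame(0.5, 0.6, 0.8)]:
    params = map_gesture(frame, params)
    print(params)
```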

His personal plans for the future are to explore, in this context, the domain of "sonic textures" in a three-fold situation: the perception of natural and synthetic textures, the gestural control of texture synthesis or analysis/synthesis, and the use of such textures and devices in human-machine interaction schemes.

