Papers by Vincent Verfaille
This is the template file for the proceedings of the 9th International Conference on Digital Audio Effects (DAFx-06). This template has been generated from the WASPAA'99 templates and aims at producing conference proceedings in electronic form. The format is essentially the one used for ICASSP conferences.
DAFX: Digital Audio Effects, 2011
Taylor & Francis
Author index (excerpt): Bangert, Marc 243; Batlle, Eloi 65; Bello, Juan Pablo 223; Benbasat, Ari Y. 395; Blaine, Tina 411; Bonada, Jordi 95; Bonnet, Madeleine 65; Bresin, Roberto 317; Byrd, Don 223; … Canazza, Sergio 281; Cano, Pedro 65; Cook, Perry R. 143, 351; Crawford, Tim 223; … Dannenberg, Roger B. 153; Davidson, Philip 351; de C. T. Gomes, Leandro 65; De Poli, Giovanni 281, 295; Dillon, Roberto 327; Dovey, Matthew 223; Downie, J. Stephen 121; Dubnov, Shlomo 3; … Eggen, Berry 179; Ermolinskyi, Andrey 143; Essl, Georg 351; … Gouyon, Fabien 41; Fels, Sidney 411; …
Université de la Méditerranée - Aix-Marseille I
We present the implementation of two different sonifications of ancillary gestures from clarinetists. The sonifications are data-driven from the clarinetist's posture, which is captured with a VICON motion-tracking system. Further, we develop a simple complementary visual display with similar information content to match the sonification. The effect of the sonifications on movement perception is studied in an experiment in which test subjects annotate the clarinetists' performance represented by various combinations of the resulting uni- and multimodal displays.
Journées d'Informatique Musicale, May 1, 2002
The control parameters of digital audio effects can evolve over time, either manually under user control or automatically by a program. We propose automating the effect from parameters extracted from the sound itself. Three auto-adaptive effects are thus proposed: a selective slow-down/speed-up, an adaptive robotization, and an adaptive granular echo.
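An auto-adaptive effect of this kind can be sketched as follows: a feature extracted from the input sound (here short-term RMS energy) drives a parameter of the effect (here the hop size, and hence the imposed pitch, of a robotization by phase zeroing). This is a minimal illustration in Python/NumPy; the feature-to-parameter mapping and all ranges are assumptions, not the paper's exact design.

```python
import numpy as np

def rms(frame):
    """Short-term RMS energy, used here as the control feature."""
    return np.sqrt(np.mean(frame ** 2))

def adaptive_robotization(x, base_hop=64, max_hop=512):
    """Robotization (phase zeroing per FFT frame) whose hop size --
    and hence the imposed 'robot pitch' sr/hop -- follows the RMS
    energy of the input. Parameter ranges are hypothetical."""
    n = 1024                           # analysis window length
    win = np.hanning(n)
    y = np.zeros(len(x) + n)
    pos = 0
    while pos + n <= len(x):
        frame = x[pos:pos + n] * win
        # feature -> control mapping: louder input -> larger hop -> lower pitch
        level = min(rms(frame) * 4.0, 1.0)
        hop = int(base_hop + level * (max_hop - base_hop))
        spec = np.fft.rfft(frame)
        robot = np.fft.irfft(np.abs(spec))   # keep magnitudes, zero all phases
        y[pos:pos + n] += robot * win
        pos += hop
    return y[:len(x)]
```

The same structure (feature extraction feeding an effect parameter) carries over to the selective time-scaling and the granular echo.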
In this paper we present an approach for the real-time indirect acquisition of specific fingerings that produce harmonic notes on the flute. We analyse both temporal and spectral characteristics of the attack of harmonic notes produced by specific control gestures involving fingering and, potentially, overblowing. We then show that it is possible to acquire this effect through signal analysis, using a principal component analysis on the subharmonics of harmonic notes. An 8-fold cross-validation showed this approach to be successful for a single performer playing isolated notes.
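The analysis pipeline described above (PCA on subharmonic features, validated by 8-fold cross-validation) can be sketched in NumPy. The data here are synthetic stand-ins for measured subharmonic amplitudes, and the nearest-centroid classifier is an illustrative choice, not the paper's actual classifier.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in: 160 notes x 12 subharmonic amplitudes. Notes played
# with the harmonic fingering (class 1) get systematically stronger
# subharmonics; all values are illustrative only.
normal = rng.normal(0.0, 1.0, (80, 12))
harmonic = rng.normal(0.0, 1.0, (80, 12)) + np.linspace(2.0, 0.5, 12)
X = np.vstack([normal, harmonic])
y = np.array([0] * 80 + [1] * 80)

def pca_fit(X, k):
    """Principal components via SVD of the centred data."""
    mu = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, vt[:k].T                # mean and projection matrix

# 8-fold cross-validation with a nearest-centroid classifier in PCA space.
idx = rng.permutation(len(X))
accs = []
for f in np.array_split(idx, 8):
    train = np.setdiff1d(idx, f)
    mu, W = pca_fit(X[train], 3)
    Z, Zt = (X[train] - mu) @ W, (X[f] - mu) @ W
    c0 = Z[y[train] == 0].mean(axis=0)
    c1 = Z[y[train] == 1].mean(axis=0)
    pred = (np.linalg.norm(Zt - c1, axis=1) < np.linalg.norm(Zt - c0, axis=1)).astype(int)
    accs.append(np.mean(pred == y[f]))
acc = float(np.mean(accs))             # mean accuracy across the 8 folds
```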

Expert musical performance is rich with movements that facilitate performance accuracy and expressive communication. Studying these movements quantitatively using high-resolution motion capture systems has been fruitful, but analysis is arduous due to the size of the data sets and performance idiosyncrasies. Compared to visual-only methods, sonification provides an interesting alternative that can ease the process of data analysis and provide additional insights. To this end, a sonification tool was designed in Max/MSP that provides interactive access to synthesis mappings and data preprocessing functions that are specific to expressive movement. The tool is evaluated in terms of its ability to fulfil the goals of sonification in this domain and the goals of expressive movement analysis more generally. Additional benefits of sonification are discussed in light of the expressive and musical context.
This paper presents a mapping strategy for the control of a signal-based digital synthesis model of the clarinet. This strategy combines a synthesis model based on a physical model with a signal model based on additive synthesis. From the output of the physical model, specific sound descriptors are extracted and used to control an additive synthesis model based on the analysis of natural sounds. This approach benefits both from the control quality of the physical model and from the sound quality provided by the additive resynthesis of natural sounds. However, some improvements are still required to better interface and tune the two models, particularly concerning transients.
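The second stage of that chain, driving an additive resynthesis from time-varying descriptors, can be sketched as below. The fundamental-frequency and amplitude tracks stand in for descriptors extracted from the physical model's output, and the partial gains stand in for amplitudes obtained from analysing a natural sound; all numeric values are hypothetical.

```python
import numpy as np

def synth_from_descriptors(f0, amp, sr, partial_gains):
    """Additive resynthesis driven by time-varying descriptors.
    f0, amp: per-sample fundamental frequency (Hz) and amplitude
    envelope, as would be extracted from a physical model's output.
    partial_gains: relative partial amplitudes taken from the
    analysis of a natural sound (hypothetical values here)."""
    phase = 2 * np.pi * np.cumsum(f0) / sr      # running phase of the fundamental
    out = np.zeros(len(f0))
    for k, g in enumerate(partial_gains, start=1):
        out += g * np.sin(k * phase)            # k-th harmonic partial
    return amp * out / sum(partial_gains)       # normalise to the envelope

sr = 16000
t = np.arange(sr) / sr
f0 = 200 + 20 * np.sin(2 * np.pi * 5 * t)       # vibrato-like pitch track
amp = np.minimum(t * 10, 1.0)                   # simple attack envelope
y = synth_from_descriptors(f0, amp, sr, [1.0, 0.5, 0.3, 0.1])
```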
This article intends to demonstrate how a specific digital audio effect can benefit from proper control, be it from sounds and/or from gestures. When this control is from sounds, it can be called "adaptive" or "sound-automated". When this control is from gestures, it can be called "gesturally controlled". The audio effects we use for this demonstration are time-scaling and pitch-shifting in the particular contexts of vibrato, prosody change, time unfolding and rhythm change.
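The time-scaling effect mentioned above can be illustrated with a naive overlap-add sketch: frames are read from the input at one rate and written to the output at another, changing duration without a deliberate pitch change. This is a bare-bones illustration, not the article's method; practical implementations (e.g. SOLA or phase-vocoder approaches) add waveform or phase alignment to avoid artifacts.

```python
import numpy as np

def time_stretch_ola(x, ratio, frame=1024, hop=256):
    """Naive overlap-add time-scaling: frames are read at hop/ratio
    and written at hop, so ratio > 1 lengthens the sound. A sketch
    only -- no alignment between overlapping frames is performed."""
    win = np.hanning(frame)
    n_out = int(len(x) * ratio)
    y = np.zeros(n_out + frame)
    norm = np.zeros(n_out + frame)          # window-overlap normalisation
    out_pos = 0
    while out_pos + frame < n_out:
        in_pos = int(out_pos / ratio)       # read position in the input
        if in_pos + frame > len(x):
            break
        y[out_pos:out_pos + frame] += x[in_pos:in_pos + frame] * win
        norm[out_pos:out_pos + frame] += win
        out_pos += hop
    norm[norm < 1e-8] = 1.0                 # avoid division by zero at edges
    return y[:n_out] / norm[:n_out]
```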

Through advanced recording and simulation possibilities, the amount of 3D movement data is constantly growing. This data type stems from origins ranging from sports and HCI to music research (body movements and gestures). Although these data sources look very disparate at first glance, they are structurally very similar, since they represent dynamic processes in 3D space with many degrees of freedom. The standard technique for investigating such data is the scientific 3D visualization of moving points or models. However, the rather young research field of sonification offers novel inspection techniques that complement visual analysis by transforming the data into audible sound, so that particularly dynamic patterns can be understood by listening. This is beneficial for several reasons: firstly, if properly used, sonification is ideal for representing multivariate data sets with temporally complex information such as fast transient motions. Secondly, in many applications t...
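A common way to turn such 3D movement data into sound is parameter-mapping sonification: geometric or kinematic features of the trajectory drive synthesis parameters. The sketch below maps height to pitch and speed to loudness; the mapping choices and ranges are illustrative assumptions, not taken from the abstract.

```python
import numpy as np

def sonify_trajectory(pos, sr=8000, dur_per_sample=0.01):
    """Parameter-mapping sonification of an (N, 3) trajectory:
    height (z) -> pitch in 220..660 Hz, speed -> loudness.
    Mapping ranges are illustrative choices."""
    z = pos[:, 2]
    # per-frame speed (first frame gets zero via prepend)
    speed = np.linalg.norm(np.diff(pos, axis=0, prepend=pos[:1]), axis=1)
    n = int(len(pos) * dur_per_sample * sr)          # output length in samples
    t_idx = np.linspace(0, len(pos) - 1, n)
    f = np.interp(t_idx, np.arange(len(pos)),
                  220 + 440 * (z - z.min()) / (np.ptp(z) + 1e-9))
    a = np.interp(t_idx, np.arange(len(pos)),
                  speed / (speed.max() + 1e-9))
    phase = 2 * np.pi * np.cumsum(f) / sr            # frequency -> phase
    return a * np.sin(phase)
```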

The aim of this paper is to propose an interdisciplinary classification of digital audio effects to facilitate communication and collaboration between DSP programmers, sound engineers, composers, performers and musicologists. After reviewing classifications reflecting technological, technical and perceptual points of view, we introduce a transverse classification to link discipline-specific classifications into a single network containing various layers of descriptors, ranging from low-level to high-level features. Simple tools using the interdisciplinary classification are introduced to facilitate navigation between effects, underlying techniques, perceptual attributes and semantic descriptors. Finally, concluding remarks are presented on implications for teaching purposes and for the development of audio-effect user interfaces based on perceptual features rather than technical parameters.
This research project concerns the simulation of interpretation in live performance using digital instruments. It addresses mapping strategies between gestural controls and synthesizer parameters. It requires the design and development of a real-time additive synthesizer with flexible control, allowing for morphing, interpolating and extrapolating instrumental notes from a sound-parameters database. The scope of this paper is to present the synthesizer, its additive and spectral envelope control units, and the morphing they allow for.
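The morphing between analysed notes mentioned above can be sketched as interpolation between their sets of partial amplitudes. Interpolating in log-amplitude is one common choice for keeping intermediate spectra perceptually plausible; the instrument spectra below are hypothetical, not from the database the paper describes.

```python
import numpy as np

def morph_partials(amps_a, amps_b, alpha):
    """Interpolate (0 <= alpha <= 1) or extrapolate (alpha outside
    [0, 1]) between the partial amplitudes of two analysed notes,
    working in log-amplitude. Floors at 1e-9 to keep log finite."""
    la = np.log(np.maximum(amps_a, 1e-9))
    lb = np.log(np.maximum(amps_b, 1e-9))
    return np.exp((1 - alpha) * la + alpha * lb)

# Hypothetical partial amplitudes for two instruments:
clarinet = np.array([1.0, 0.05, 0.6, 0.04, 0.3])   # odd harmonics dominate
flute = np.array([1.0, 0.5, 0.25, 0.12, 0.06])     # smoother rolloff
mid = morph_partials(clarinet, flute, 0.5)          # halfway morph
```

The same call with alpha outside [0, 1] extrapolates beyond either source note.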
The confproc package is a new LaTeX2e document class for building conference proceedings. It derives from LaTeX scripts written for the DAFx-06 conference proceedings, mainly based on the pdfpages package for including the proceedings papers and the hyperref package for creating a proper table of contents, bookmarks and general bibliography back-references. It also uses many other packages for fine-tuning the table of contents, bibliography and index of authors. The added value of this class resides in its time-saving aspects when designing conference proceedings. See readme.txt for a short overview and additional (legal) information, and example.tex and the corresponding files and scripts for an example of use.
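The underlying mechanism confproc builds on can be sketched with pdfpages and hyperref directly. This is not confproc's own interface, only an illustration of the paper-inclusion step; `paper1.pdf` and its title are placeholders.

```latex
\documentclass{article}
\usepackage{pdfpages}  % inclusion of the papers' PDF files
\usepackage{hyperref}  % ToC entries, PDF bookmarks, back-references
\begin{document}
\tableofcontents
% Include one paper and register its first page in the ToC and the
% PDF bookmarks (page number, type, level, heading, label).
\includepdf[pages=-,
            addtotoc={1, section, 1, Paper Title, paper1}]{paper1.pdf}
\end{document}
```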