The flexibility of current hardware and software has made the mapping of relationships between a sound's parameters and a physical source of control a relatively trivial task. Consequently, sophisticated digital instrument design has been accessible to the creative community for several years, resulting in a host of new instruments that explore a variety of physical mappings. The emphasis on physicality exhibited by so-called "gestural controllers" stands in contrast to the practice of conventional laptop performance. While the laptop computer is certainly a digital musical instrument, its associated performance practice is often criticized for a perceived lack of physicality. This paper examines motivations behind the foregrounding of gesture in computer-based performance, drawing on critical theory and neuroscience research to consider ways in which the desire for connections between sound and motion amounts to more than mere fascination with virtuosity.
Proceedings of the 2014 International Workshop on Movement and Computing - MOCO '14, 2014
This paper describes the implementation of gestural mapping strategies for performance with a traditional musical instrument and electronics. The approach adopted is informed by embodied music cognition and functional categories of musical gestures. Within this framework, gestures are not seen as means of control subordinate to the resulting musical sounds but rather as significant elements that, like auditory features, contribute to the formation of musical meaning. Moreover, the ecological knowledge of the instrument's gestural repertoire is taken into account, as it defines the action-sound relationships between the instrument and the performer and contributes to forming expectations in listeners. Finally, mapping strategies from a case study of electric guitar performance are illustrated, describing what motivated the choice of a multimodal motion capture system and how different solutions were adopted in light of both gestural meaning formation and technical constraints.
Computer-mediated music control devices compel us to reexamine the relationship between performer and sound, the nature and complexity of which is theoretically unlimited. This essay attempts to formulate some of the key aesthetic issues raised by the use of new control interfaces in the development of new musical works and new performance paradigms: mapping the gesture-sound relationship, identifying successful uses of “virtual” instruments, questioning the role of “interactivity” in performance, and positing future areas of exploration.
2015
With the many advances in computer music over recent decades, musical gestures have come to be used as parameters for controlling the dynamic processes that make up a musical performance. This article describes the processes and methods employed in a composition of this type: "Gest'Ação I" for classical guitar and live electronics, which uses a set of 'instrumental gestures' for the guitar as part of the musical performance. The intent was to create a piece that could freely join the idea of 'extended technique' with that of the 'extended writing/instrument' using open-source music software. The article briefly discusses the concept of "gesture" as defined by Wanderley (2000) and Delande (1998), especially as it concerns guitar performance, as well as the concepts of "extended technique" and "extended instrument" as defined by Padovani and Ferraz (2011). Two open-source programs were used: MuseScore 2.0 (for score writing) and Pure Data (for real-time audio processing).
2007
Laptop performance of computer music has become widespread in the electronic music community. It brings with it many issues pertaining to the communication of musical intent. Critics argue that performances of this nature fail to engage audiences: many performers use the mouse and keyboard to control their musical works, leaving no visual cues to guide the audience as to the correlation between performance gestures and musical outcomes. Interfaces need to communicate something of their task. The author argues that cognitive affordances associated with the performance interface become paramount if the musical outcomes are to be perceived as clearly tied to real-time performance gestures, i.e., that the audience is witnessing the creation of the music in that moment, as distinct from the manipulation of pre-recorded or pre-sequenced events.
MUSICA movet : affectus, ludus, corpus, 2019
According to the theory of embodied cognition, gestures are at the very heart of human cognitive processes. The idea of embodied cognition is based on cognitive schemata and categories that emerge from the amassed experience of being and acting in the world. Many features of cognition are shaped by aspects of the organism's entire body, so the physical domain serves, through metaphor, as a source domain for understanding an idea or conceptual domain. As a basic element of the physical domain, the phenomenon of gesture has garnered particular attention and has recently been studied in fields as varied as phenomenology, the Extended Mind Thesis (EMT), psychology, neuro-phenomenology, neo-cognitivism, robotics, critical theory, linguistics, neuroscience, and constructivism, as well as in music theory. In music, gestures encompass a large territory, from the purely physical (bodily) on one side of the axis to the mental (imagined) on the other. The phenomenon of gesture in music can be explored and perceived from many viewpoints: a student's adoption of a teacher's posture, even facial expressions; the syndrome of "watching" music, as in conductors' and players' gestures, both practical (sound-producing) and expressive (auxiliary); metaphorical concepts of up and down in intervals and scales; and recognizable idioms of a composer's language (strategic) or style (stylistic). In this paper, the issue of the inseparability of body and sound in musical practice will be explored, especially how these two basic types of gesture in music can intertwine and help deepen its performance. For this purpose, Alexandra Pierce's embodied analytical exercises will be used, which enable the performer to explore gesturally the expressive meaning of a musical piece. It will be demonstrated that musical gesture supports performance-oriented analysis more than we think, know, or imagine.
Building on the insights of the first volume on Music and Gesture (Gritten and King, Ashgate 2006), the rationale for this sequel volume is twofold: first, to clarify the way in which the subject is continuing to take shape by highlighting both central and developing trends, as well as popular and less frequent areas of investigation; second, to provide alternative and complementary insights into the particular areas of the subject articulated in the first volume. The thirteen chapters are structured in a broad narrative trajectory moving from theory to practice, embracing Western and non-Western practices, real and virtual gestures, live and recorded performances, physical and acoustic gestures, visual and auditory perception, among other themes of topical interest. The main areas of enquiry include psychobiology; perception and cognition; philosophy and semiotics; conducting; ensemble work and solo piano playing. The volume is intended to promote and stimulate further research in Musical Gesture Studies.
This dissertation presents a new and expanded context for interactive music based on Moore's model for computer music (Moore 1990) and contextualises its findings using Lesaffre's taxonomy for musical feature extraction and analysis (Lesaffre et al. 2003). In doing so, the dissertation examines music as an expressive art form in which musically significant data is present not only in the audio signal but also in human gestures and in physiological data. The dissertation shows the model's foundation in human perception of music as a performed art, and points to the relevance and feasibility of including expression and emotion as a high-level signal processing means for bridging man and machine. The resulting model is multi-level (physical, sensorial, perceptual, formal, expressive) and multi-modal (sound, human gesture, physiological), which makes it applicable to purely musical contexts as well as intermodal contexts where music is combined with visual and/or physiological data. The model implies evaluating an interactive music system as a musical instrument design. Several properties are examined during the course of the dissertation; models based on acoustic musical instruments have been avoided owing to the expanded feature set of interactive music systems. A narrowing down of the properties is attempted in the dissertation's conclusion, together with a preliminary model circumscription. In particular, it is pointed out that high-level features of real-time analysis, data storage and processing, and synthesis make the system a musical instrument, and that the capability of real-time data storage and processing distinguishes the digital system as an unprecedented instrument, qualitatively different from all previous acoustic musical instruments. A digital system's particular form of sound synthesis is considered to qualify it only as belonging to a category parallel to the acoustic instrument categories.
The model is the result of the author’s experiences with practical work with interactive systems developed 2001-06 for a body of commissioned works. The systems and their underlying procedures were conceived and developed addressing needs inherent to the artistic ambitions of each work, and have all been thoroughly tested in many performances. The papers forming part of the dissertation describe the artistic and technological problems and their solutions. The solutions are readily expandable to similar problems in other contexts, and they all relate to general issues of their particular applicative area.
2004
New Interfaces for Musical Expression must speak to the nature of 'instrument'; that is, it must always be understood that the interface binds to a complex musical phenomenon. This paper explores the nature of engagement, the point of performance that occurs when a human being engages with a computer-based instrument. It asks questions about the nature of the instrument in computer music and offers some conceptual models for the mapping of gesture to sonic outcomes.
Proceedings of the Audio Mostly 2015 on Interaction With Sound - AM '15, 2015
Following a call for clear movement-sound relationships in motion-controlled digital musical instruments (DMIs), we developed a sound design concept and a DMI implementation with a focus on transparency through intuitive control metaphors. In order to benefit from the listener's and performer's natural understanding of the physical processes around them, we use gestures with strong physical associations as control metaphors, which are then mapped to sound modules specifically designed to represent these associations sonically. The required motion data can be captured by any low-latency sensor device worn on the hand or wrist that has an inertial measurement unit with six degrees of freedom. A dimension space analysis was applied to the current implementation in order to compare it to existing DMIs and illustrate its characteristics. In conclusion, our approach resulted in a DMI with strong results in transparency, intuitive control metaphors, and a coherent audio-visual link.
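The abstract above describes mapping gestures with strong physical associations, captured from a six-degree-of-freedom inertial unit, onto sound parameters. A minimal sketch of one such metaphor, assuming hypothetical field names and thresholds (the paper does not specify them): a sharp "strike" detected from the accelerometer triggers a percussive event whose loudness scales with the strike's energy.

```python
import math

# Hypothetical control-metaphor sketch: a "strike" gesture detected from
# the accelerometer part of a 6-DOF IMU sample. Field names ("ax" etc.)
# and the threshold value are illustrative assumptions, not from the paper.

STRIKE_THRESHOLD = 2.5  # in g; peak acceleration that counts as a strike

def accel_magnitude(sample):
    """Magnitude of the accelerometer vector in an IMU sample."""
    ax, ay, az = sample["ax"], sample["ay"], sample["az"]
    return math.sqrt(ax * ax + ay * ay + az * az)

def map_strike(sample):
    """Return an amplitude in [0, 1] if the sample is a strike, else None."""
    mag = accel_magnitude(sample)
    if mag < STRIKE_THRESHOLD:
        return None
    # Scale the energy above the threshold into a bounded amplitude.
    return min(1.0, (mag - STRIKE_THRESHOLD) / STRIKE_THRESHOLD)

# Gentle motion produces no event; a sharp jab produces a scaled amplitude.
quiet = map_strike({"ax": 0.5, "ay": 0.5, "az": 0.5})
strike = map_strike({"ax": 4.0, "ay": 1.0, "az": 1.0})
```

The point of such a mapping is transparency: the audience can see the strike and hear a percussive result whose intensity matches the visible effort.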
2010
The Arcontinuo is an electronic musical instrument designed from a perspective based on the study of its potential users and their interaction with existing musical interfaces. The study of this interaction yields information that could give rise to better interfaces, which respond to natural gestures in musical situations while incorporating the latest advances in digital technology. The article explains the fundamentals of the methodology, both qualitative and quantitative, focusing on two key elements in the performer's interaction with instruments: gestures and objects. The statistical analysis of the results leads to the proposal of several mockups, one of which is chosen and described in detail, including its hardware and software implementation.
Kansei, The Technology of …, 1997
This paper presents ongoing work on gesture mapping strategies and applications to sound synthesis by signal models controlled via a standard MIDI wind controller. Our approach consists in considering different mapping strategies in order to achieve "fine" (therefore in ...
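The (truncated) abstract above concerns mapping strategies for a MIDI wind controller. One common strategy in this literature is a one-to-many mapping, where a single control stream drives several synthesis parameters at once. The sketch below illustrates the idea for the breath-pressure controller (MIDI Control Change 2, values 0-127); the parameter names and response curves are illustrative assumptions, not the paper's actual mapping.

```python
# One-to-many mapping sketch for a wind controller's breath pressure
# (MIDI CC 2, 0-127). Parameter names and curves are assumptions.

def map_breath(cc_value):
    """Map breath CC (0-127) to amplitude, brightness and vibrato depth."""
    x = max(0, min(127, cc_value)) / 127.0  # normalise to [0, 1]
    return {
        "amplitude": x ** 2,                          # gentler onset curve
        "brightness": 0.2 + 0.8 * x,                  # filter opens with breath
        "vibrato_depth": max(0.0, (x - 0.7) / 0.3),   # only at high pressure
    }

soft = map_breath(0)
full = map_breath(127)
```

Giving each destination its own curve is what lets one physical gesture (blowing harder) produce a coordinated, acoustically plausible change across the whole timbre.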
2000
In this paper, we present the SensOrg, a musical CyberInstrument designed as a modular assembly of input/output devices and musical software, mapped and arranged according to functional characteristics of the Man-Instrument system. We discuss how the cognitive ergonomics of non-verbal and symbolic task modalities influenced the design of our hardware interface for asynchronous as well as synchronous task situations. Using malleable atoms and tangible bits, we externally represented the musical functionality in a physical interface which is totally flexible yet completely freezable. Musicians strive for many years to connect their neural pathways to a vibrating segment of string, wood, metal or air. In many ways, learning how to play a musical instrument is dictated by the physical idiosyncrasies of the instrument's design. A good instrumentalist typically needs to start almost from scratch when trying to play a new instrument. Even when musicians master their instrument...
Organised Sound
This article explores how computation opens up possibilities for new musical practices to emerge through technology design. Using the notion of the cultural probe as a lens, we consider the digital musical instrument as an experimental device that yields findings across the fields of music, sociology and acoustics. As part of an artistic-research methodology, the instrumental object as a probe is offered as a means for artists to answer questions that are often formulated outside semantic language. This article considers how computation plays an important role in the authors’ personal performance practices in different ways, which reflect the changed mode-of-being of new musical instruments and our individual and collective relations with them.
Leonardo Electronic Almanac (Touch and Go) , 2012
Think about your body. Consider its capability of channeling articulate information with a single gaze, the dramatic force of a gesture propelled by muscle tissue contractions, the sympathetic rhythmic changes in the heartbeat when listening to someone else's palpitations, the meaningful shifting patterns of brain wave cycles when drifting from relaxation to heightened mental activity. These are physiological and intimate processes that become externalized to affect the people and the space surrounding us. Once tangible, those processes can be captured, observed, instrumentalized or augmented through technology, and therefore become informative (or shall we say informatic) media that are biological in nature. In contemporary electronic music performance this paradigm has exposed creative strategies that had previously been overlooked. This article places biological media in the 'broken ground' where body and computational system interact musically with each other. It questions and defines the qualities of a gesture in the context of biologically sensitive musical instruments, thereby providing a framework to introduce a visceral model of electronic music performance: one in which the sonic matter incarnated within the tissues of the body rises and breaks through the skin to become tangible, shared experience.
Journal on Multimodal …, 2010
This paper contributes to the development of a multimodal musical tool that extends the natural action range of the human body to communicate expressiveness in the virtual music domain. The core of this musical tool is a low-cost, highly functional computational model developed on the Max/MSP platform that (1) captures real-time movement of the human body in a 3D coordinate system on the basis of the orientation output of any inertial sensor system that is OSC-compatible, (2) extracts low-level movement features that specify the amount of contraction/expansion as a measure of how a subject uses the surrounding space, (3) recognizes these movement features as expressive gestures, and (4) creates a mapping trajectory between these expressive gestures and a sound synthesis process that adds harmonically related voices to an originally monophonic voice. A user-oriented and intuitive mapping strategy was of central importance; this was achieved by conducting an empirical experiment based on theoretical concepts from the embodied music cognition paradigm. Based on empirical evidence, this paper proposes a mapping trajectory that facilitates the interaction between musicians and their instruments, artistic collaboration between (multimedia) artists, and the communication of expressiveness in a social, musical context.
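The contraction/expansion feature described in the abstract above measures how much of the surrounding space a set of tracked 3D body points occupies. One simple way to realise such a measure, offered here only as an illustrative sketch (the function name and the bounding-box choice are assumptions, not the authors' implementation), is the volume of the axis-aligned bounding box of the tracked points.

```python
# Illustrative contraction/expansion measure for tracked 3D body points:
# the volume of their axis-aligned bounding box. A larger value means a
# more expanded posture. This is a sketch, not the paper's actual code.

def contraction_index(points):
    """Bounding-box volume of (x, y, z) points; larger = more expanded."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    zs = [p[2] for p in points]
    return ((max(xs) - min(xs)) *
            (max(ys) - min(ys)) *
            (max(zs) - min(zs)))

# Arms held close to the torso vs. stretched outward:
contracted = [(0.0, 0.0, 0.0), (0.1, 0.2, 0.1), (0.2, 0.1, 0.1)]
expanded = [(0.0, 0.0, 0.0), (1.5, 0.2, 0.1), (0.2, 1.8, 0.4)]
```

A stream of such values, computed per frame from sensor-derived joint positions, is the kind of low-level feature that step (2) of the model could then classify as an expressive gesture.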
This work describes a new approach to gesture mapping in a performance with a traditional musical instrument and live electronics, inspired by theories of embodied music cognition (EMC) and musical gestures. Considerations on EMC and how gestures affect the experience of music inform different mapping strategies. Our intent is to enhance the expressiveness and liveness of the performance by tracking gestures via a multimodal motion capture system and using the motion data to control several features of the music. We then describe an application of this approach to a performance with electric guitar and live electronics, focusing both on aspects of meaning formation and on motion capture.