The chapter discusses the evolution and impact of sensor-based musical instruments within the context of the New Interfaces for Musical Expression (NIME) movement. It highlights how technologies have transformed traditional musical practices by enabling new forms of gestural interaction and musical expression, emphasizing the historical development leading up to NIME's founding. The text also explores the integration of analog and digital technologies in capturing musical gestures and outlines current trends such as the use of biometric signals and location tracking in musical performance.
williambrent.conflations.com
The flexibility of current hardware and software has made mapping relationships between a sound's parameters and a physical source of control a relatively trivial task. Consequently, sophisticated digital instrument design has been accessible to the creative community for several years, which has resulted in a host of new instruments that explore a variety of physical mappings. The emphasis on physicality exhibited by so-called "gestural controllers" stands in contrast to the practice of conventional laptop performance. While the laptop computer is certainly a digital musical instrument, its associated performance practice is often criticized for a perceived lack of physicality. This paper examines motivations behind the foregrounding of gesture in computer-based performance, drawing on critical theory and neuroscience research to consider ways in which the desire for connections between sound and motion amounts to more than mere fascination with virtuosity.
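As a concrete illustration of how routine such a mapping has become, here is a minimal Python sketch that maps a normalized 0..1 control value to a synthesis parameter both linearly and exponentially. The function names and frequency ranges are invented for illustration and are not drawn from the paper.

    import math

    def map_linear(x, out_lo, out_hi):
        # Rescale a normalized 0..1 control value to a parameter range.
        return out_lo + x * (out_hi - out_lo)

    def map_pitch(x, low_hz=110.0, high_hz=880.0):
        # Exponential mapping: equal fader motion yields equal pitch intervals.
        return low_hz * math.pow(high_hz / low_hz, x)

    print(map_linear(0.5, 110.0, 880.0))  # 495.0 Hz, the arithmetic midpoint
    print(map_pitch(0.5))                 # ~311.1 Hz, the geometric midpoint

The choice between the two is already a design decision: the exponential curve places the fader midpoint at the perceptual (pitch) midpoint rather than the numeric one.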
Is there a distinction between New Interfaces for Musical Expression and New Interfaces for Controlling Sound? This article begins with a brief overview of expression in musical performance, and examines some of the characteristics of effective “expressive” computer music instruments. It becomes apparent that sophisticated musical expression requires not only a good control interface but also virtuosic mastery of the instrument it controls. By studying effective acoustic instruments, choosing intuitive but complex gesture-sound mappings that take advantage of established instrumental skills, designing intelligent characterizations of performance gestures, and promoting long-term dedicated practice on a new interface, computer music instrument designers can enhance the expressive quality of computer music performance.
Proceedings of the 2014 International Workshop on Movement and Computing - MOCO '14, 2014
This paper describes the implementation of gestural mapping strategies for performance with a traditional musical instrument and electronics. The approach adopted is informed by embodied music cognition and functional categories of musical gestures. Within this framework, gestures are not seen as means of control subordinated to the resulting musical sounds, but rather as significant elements contributing to the formation of musical meaning in much the same way as auditory features. Moreover, the ecological knowledge of the gestural repertoire of the instrument is taken into account, as it defines the action-sound relationships between the instrument and the performer and contributes to forming expectations in listeners. Mapping strategies from a case study of electric guitar performance are then illustrated, describing what motivated the choice of a multimodal motion capture system and how different solutions were adopted in light of both gestural meaning formation and technical constraints.
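The paper's actual strategies rely on a multimodal motion capture system; purely to illustrate the underlying idea of distinguishing sound-producing from ancillary movement, here is a toy Python sketch that routes smoothed accelerometer energy to an effect parameter only above a threshold. All names and constants are assumptions, not the authors' method.

    import math

    class GestureMapper:
        # Toy mapping stage: small ancillary movements leave the sound
        # untouched; only sufficiently energetic gestures modulate it.
        def __init__(self, threshold=0.15, smoothing=0.9):
            self.threshold = threshold
            self.smoothing = smoothing
            self.energy = 0.0

        def process(self, ax, ay, az):
            # Instantaneous motion energy from a 3-axis accelerometer frame.
            mag = math.sqrt(ax * ax + ay * ay + az * az)
            # One-pole low-pass to suppress jitter.
            self.energy = self.smoothing * self.energy + (1 - self.smoothing) * mag
            if self.energy < self.threshold:
                return None  # ancillary movement: no sonic effect
            return min(1.0, self.energy)  # normalized effect depth

    mapper = GestureMapper()
    for frame in [(0.0, 0.01, 0.02), (0.4, 0.5, 0.3), (0.6, 0.7, 0.5)]:
        print(mapper.process(*frame))  # None, None, ~0.17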
Organised Sound
This article explores how computation opens up possibilities for new musical practices to emerge through technology design. Using the notion of the cultural probe as a lens, we consider the digital musical instrument as an experimental device that yields findings across the fields of music, sociology and acoustics. As part of an artistic-research methodology, the instrumental object as a probe is offered as a means for artists to answer questions that are often formulated outside semantic language. This article considers how computation plays an important role in the authors’ personal performance practices in different ways, which reflect the changed mode-of-being of new musical instruments and our individual and collective relations with them.
Computer-mediated music control devices compel us to reexamine the relationship between performer and sound, the nature and complexity of which is theoretically unlimited. This essay attempts to formulate some of the key aesthetic issues raised by the use of new control interfaces in the development of new musical works and new performance paradigms: mapping the gesture-sound relationship, identifying successful uses of “virtual” instruments, questioning the role of “interactivity” in performance, and positing future areas of exploration.
Current Research in Systematic Musicology, 2017
This paper reports on a survey conducted in the autumn of 2006 with the objective to understand people's relationship to their musical tools. The survey focused on the question of embodiment and its different modalities in the fields of acoustic and digital instruments. The questions of control, instrumental entropy, limitations and creativity were addressed in relation to people's activities of playing, creating or modifying their instruments. The approach used in the survey was phenomenological, i.e. we were concerned with the experience of playing, composing for and designing digital or acoustic instruments. At the time of analysis, we had 209 replies from musicians, composers, engineers, designers, artists and others interested in this topic. The survey was mainly aimed at instrumentalists and people who create their own instruments or compositions in flexible audio programming environments such as SuperCollider, Pure Data, ChucK, Max/MSP, CSound, etc.
Proceedings of the 2nd Biennial Research Through Design Conference, 25-27 March 2015, Cambridge, UK, Article 26, 2015
Like any traditional instrument, the potential of a new instrument and its possible music can only be revealed through playing. How can we treat technological matter as yet another material from which our notions of possible future instruments can be constructed, intrinsically intertwined with and informed by a practice of performance? Our approach to developing instruments for musical performance strives not just to make technology, but to make aesthetic and cultural objects. A musical instrument is not an interface and should not be designed as such; instead, instruments are the source of the new in new music. We would like to present an instrument design process conducted with a musician as visionary and agenda-setter. As the instrument grows and evolves through its various stages, it remains playable and faithful to the desire to make music. The resulting objects are experimental prototypes of technological matter, which allow analysis and meaning to be specified through physical and tactile interaction with the object itself. At RTD2015 we will present a range of these intermediate prototypes and play the finished instrument.
Lecture Notes in Computer Science, 2014
The new interfaces are changing the way we interact with computers. In the musical context, these new technologies open a wide range of possibilities for the creation of New Interfaces for Musical Expression (NIME). Despite ten years of NIME research, it is hard to find artifacts that have been widely or convincingly adopted by musicians. In this paper, we discuss some NIME design challenges, highlighting particularities related to the digital and musical nature of these artifacts, such as virtuosity, cultural elements, context of use, creation catalysis, success criteria, and adoption strategy. With these challenges, we aim to call attention to the intersection of music, computing and design, which can be an interesting area for people working on product design and interaction design.
2001
The conception and design of new musical interfaces is a multidisciplinary area that tightly relates technology and artistic creation. In this paper, the author first presents some of the questions he has posed himself during more than a decade of experience as a performer, composer, interface and software designer, and educator. He then illustrates these topics with examples of his work.
Music interactivity is a sub-field of human-computer interaction studies. Interactive situations have different degrees of structural openness and musical "ludicity" or playfulness. Discussing music seems inherently impossible since it is essentially a non-verbal activity. Music can produce an understanding (or at least prepare for an understanding) of creativity that is of an order neither verbal nor written. A human listener might perceive beauty to be of this kind in a particular music. But can machine-generated music be considered creative, and if so, wherein lies the creativity? What are the conceptual limits of notions such as instrument, computer and machine? A work of interactive music might be more pertinently described by the processes involved than by one or several instantiations. While humans spontaneously deal with multiple process descriptions (verbal, visual, kinetic…) and are very good at synthesising, the computer is limited to handling processes describable in a formal language such as computer code. But if the code can be considered a score, does it not make a musician out of the computer? Since the dawn of computer music, composers have created musical systems employing artificial intelligence in different forms as tools for creative stimulus. A large part of music interactivity research concerns interface design, which involves ergonomics and traditional instrument-making concepts. I will show examples of how I work with interactivity in my compositions, from straightforward applications as composition tools to more complex artistic work.
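To make the "code as score" point concrete: below is a short generative process in Python in which the program itself, rather than any single output, specifies the piece; each run is a different instantiation of the same description. The transition table and pitches are invented for illustration.

    import random

    # A tiny process-as-score: the program, not a list of notes, is the piece.
    transitions = {
        60: [62, 64, 67],  # from C4, move to D4, E4 or G4
        62: [60, 64],
        64: [62, 67],
        67: [60, 64, 69],
        69: [67],
    }

    def realize(start=60, length=16, seed=None):
        # One instantiation of the process; every run is a different
        # performance of the same "score".
        rng = random.Random(seed)
        note, out = start, []
        for _ in range(length):
            out.append(note)
            note = rng.choice(transitions[note])
        return out

    print(realize(seed=1))
    print(realize(seed=2))  # same process, different instantiation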
This dissertation presents a new and expanded context for interactive music based on Moore's model for computer music (Moore 1990) and contextualises its findings using Lesaffre's taxonomy for musical feature extraction and analysis (Lesaffre et al. 2003). In doing so, the dissertation examines music as an expressive art form in which musically significant data is present not only in the audio signal but also in human gestures and in physiological data. The dissertation shows the model's foundation in human perception of music as a performed art, and points to the relevance and feasibility of including expression and emotion as a high-level signal-processing means for bridging man and machine. The resulting model is multi-level (physical, sensorial, perceptual, formal, expressive) and multi-modal (sound, human gesture, physiological), which makes it applicable to purely musical contexts as well as intermodal contexts where music is combined with visual and/or physiological data. The model implies evaluating an interactive music system as a musical instrument design. Several properties are examined during the course of the dissertation, and models based on acoustic instruments have been avoided because of the expanded feature set of interactive music systems. A narrowing down of the properties is attempted in the dissertation's conclusion, together with a preliminary model circumscription. In particular, it is pointed out that the high-level features of real-time analysis, data storage and processing, and synthesis make the system a musical instrument, and that the capability of real-time data storage and processing distinguishes the digital system as an unprecedented instrument, qualitatively different from all previous acoustic instruments. A digital system's particular form of sound synthesis only qualifies it as belonging to a category parallel to the acoustic instrument categories. The model is the result of the author's experiences with practical work on interactive systems developed in 2001-06 for a body of commissioned works. The systems and their underlying procedures were conceived and developed to address needs inherent in the artistic ambitions of each work, and have all been thoroughly tested in many performances. The papers forming part of the dissertation describe the artistic and technological problems and their solutions. The solutions are readily expandable to similar problems in other contexts, and they all relate to general issues of their particular applicative area.
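A minimal sketch of the distinguishing capability claimed above, assuming nothing from the dissertation itself: a toy "instrument" in Python that retains its own input history for later processing, something no acoustic instrument can do. All names are illustrative.

    from collections import deque

    class MemoryInstrument:
        # A digital instrument that stores its own gesture history and can
        # reuse it later; the storage, not the synthesis, is what has no
        # acoustic counterpart.
        def __init__(self, capacity=64):
            self.history = deque(maxlen=capacity)  # real-time data storage

        def play(self, gesture_value):
            self.history.append(gesture_value)  # input is retained
            return gesture_value                # immediate sonic response

        def echo(self, scale=0.5):
            # Later processing of stored material: replay the gesture
            # stream transformed, e.g. at reduced intensity.
            return [v * scale for v in self.history]

    inst = MemoryInstrument()
    for v in [0.2, 0.8, 0.5]:
        inst.play(v)
    print(inst.echo())  # [0.1, 0.4, 0.25]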
Dissertation, 2013
Professor David Borgo. Without his support, motivation and persistence this dissertation would not have been possible. Special thanks also to Professor Nancy Guy for her insight, thought-provoking seminars, friendship and encouragement. Phil Skaller and Ben Power, two members of my cohort, provided invaluable help with my work, editing and providing meticulous comments. Most importantly: they were great friends to me through the whole process. My heartfelt thanks, I am deeply indebted to both of you. I also wish to acknowledge friends and family that were constant companions, encouraging me through the process. If they weren't present in person, they were on
2017
Rationale: There has been little chance for researchers, performers and designers in the UK to come together to explore the use and design of new and evolving technologies for performance. This workshop examines the interplay between people, musical instruments, performance and technology. Now, more than ever, technology is enabling us to augment the body, develop new ways to play and perform, and augment existing instruments that can span the physical and digital realms. By bringing together performers, artists, designers and researchers, we aim to develop new understandings of how we might design new performance technologies. Some topics: Methods and Approaches (what methods and approaches can we employ to understand interaction and interplay in performance, and what impact does technology have on this?); Sonic Augmentation (can performance and sound change the experiential attributes of places, e.g. make them more accessible, more playful?); Physical/digital aug...
2015
With the computer music advances of recent decades, musical gestures have come to be used as parameters for controlling the dynamic processes that make up a musical performance. This article describes the processes and methods employed in a composition of this type: "Gest'Ação I" for classical guitar and live electronics, which uses a set of 'instrumental gestures' for the guitar as part of the musical performance. The intent was to elaborate a musical piece that could join, in a free process, the idea of 'extended technique' with that of the 'extended writing/instrument', using open-source musical software. The article briefly discusses the concept of "gesture" as defined by Wanderley (2000) and Delande (1998), especially concerning guitar performance, as well as the concepts of "extended technique" and "extended instrument" as defined by Padovani and Ferraz (2011). Two open-source programs were used: MuseScore 2.0 (for score writing) and Pure Data (for real-time audio processing).
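The piece itself uses Pure Data for its real-time processing; as a language-neutral illustration of the kind of routing logic involved, here is a Python sketch that fires an event when the input level jumps sharply, as a percussive extended technique on the guitar body might. The detector and its constants are invented, not taken from the article.

    class OnsetTrigger:
        # Toy onset detector of the kind one might patch in Pure Data:
        # when the input level rises by more than `rise` between frames,
        # fire an event that could switch on a live-electronics process.
        def __init__(self, rise=0.3):
            self.rise = rise
            self.prev = 0.0

        def step(self, level):
            fired = (level - self.prev) > self.rise
            self.prev = level
            return fired

    trig = OnsetTrigger()
    levels = [0.05, 0.06, 0.65, 0.6, 0.1]  # e.g. a tap on the guitar body
    print([trig.step(l) for l in levels])  # [False, False, True, False, False]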
2010
The Arcontinuo is an electronic musical instrument designed from a perspective based on the study of its potential users and their interaction with existing musical interfaces. The study of this interaction yields information that could give rise to better interfaces, which respond to natural gestures in musical situations while incorporating the latest advances in digital technology. The article explains the methodological fundamentals both qualitatively and quantitatively, focusing on two key elements in the performer's interaction with instruments: gestures and objects. The statistical analysis of the results leads to the proposition of several mockups, one of which is chosen and described in detail, including its hardware and software implementation.
This author commentary chapter accompanies the re-publication of my co-authored 2006 paper 'Mobile Music Technology: Report on an Emerging Community' - one of 30 papers selected from 1,200 NIME papers to be included in the book 'A NIME Reader: Fifteen Years of New Interfaces for Musical Expression', published by Springer and edited by Alexander Refsum Jensenius and Michael J. Lyons.
Organised Sound, 2000
This article considers the notion of the invention as an instrumental concept in designing an interactive music system. The derivation of the idea is traced from Dreyfus' Bach and the Patterns of Invention. Issues of the nature of the interactive musical context are problematised on the basis of ideas from Adorno and Lyotard. The invention is presented as a mechanism for implementing the concepts of the embodiment and distribution of musical activity, which are shown to be generalisable. The relationship of composer and computer is considered in the light of a ‘prosthetic culture’. It is suggested that a crucial property of the invention is that of self-simulation.
CHI '01 extended abstracts on Human factors in computer systems - CHI '01, 2001
Topic: The impact of interface technology on all aspects of musical expression.