2010
The goal of this project is to provide an interface that frees the 'laptop musician' from the laptop and in doing so, encourages more interactive performances within the genre of electronic music. This paper examines interactive systems that use motion and other sensors to control computer music applications and discusses interface design in this context. The various interfaces already tackling this goal demonstrate issues with regard to complexity, affordability and usability. We are particularly interested in interfaces that utilise simple design concepts to provide a flexible, intuitive controller that is appealing and affordable to a broad range of electronic musicians.
A NIME Reader
One driving philosophy of the NIME community is that controller design greatly influences the sort of music one can make (Cook 2001). With respect to this, we are interested in developing control methods in specific settings (e.g., laptop ensembles), and in encouraging the community to consider all impacts of controller choice in scenarios presenting practical limitations. Customization of controller to musical task is desirable, yet so are availability, ease of use, development time, and portability. In a real-world environment, these needs must all be addressed to maximize musicality, efficiency, and fun. To achieve this end, one must think creatively about all inputs at one's disposal. Music performed using laptops has a large and growing body of practitioners in both academic and popular realms. Recent proliferation of software tools (PD (Puckette 1996), SuperCollider (McCartney 1996), ChucK (Wang and Cook 2003), also new uses for Perl (McLean 2004), Python, etc.) has greatly reduced the barriers of expertise, time, and money required to create music, opening the door to hobbyists
Lecture Notes in Computer Science, 2014
New interfaces are changing the way we interact with computers. In the musical context, these new technologies open a wide range of possibilities for the creation of New Interfaces for Musical Expression (NIME). Despite 10 years of research in NIME, it is hard to find artifacts that have been widely or convincingly adopted by musicians. In this paper, we discuss some NIME design challenges, highlighting particularities related to the digital and musical nature of these artifacts, such as virtuosity, cultural elements, context of use, creation catalysis, success criteria, and adoption strategy. With these challenges, we aim to call attention to the intersection of music, computing, and design, which can be an interesting area for people working on product design and interaction design.
Journal of New Music Research, 2003
Throughout history, each set of technologies, from woodcraft to water pumps and from electricity to computers, has ushered its own set of changes into the way people generate and interact with music. Acoustic musical instruments have settled into canonical forms, taking centuries, if not millennia, to evolve their balance between sound production, ergonomics, playability, potential for expression, and aesthetic design. In contrast, electronic instruments have been around for little more than a century, during which rapid, often exponential (Kurzweil, 2000) advances in technology have continually opened new possibilities for sound synthesis and control, keeping the field in continual revolution and allowing few instruments to be mastered by a significant community of players. In fact, one might well ask what "mastery" of an electronic musical instrument means. A by-product of the power to create and manipulate artificial sounds is the ability to decouple the synthesis of an instrument's sound from the physics of the instrument's sound-producing mechanism, allowing musical controllers to diverge from the familiar physical forms of acoustic instruments. While this frees controllers to evolve along different paths, it also imposes constraints on an instrument's playing qualities because of limited control over, and lack of connection to, the sound production mechanism. Thus the normal path to learning an instrument (acquiring an increasingly complex representation of the relationship between one's actions as a player and the instrument's response) no longer holds, because each instance of an instrument, controller plus synthesis model, requires that this mapping be learned anew. Indeed, electronic music controllers evolve so rapidly that it is rare for a musician to work long enough with one to develop virtuosic technique.
Procedia Manufacturing, 2015
A large variety of musical instruments, whether acoustic or digital, are based on a keyboard scheme. Keyboard instruments can produce sounds through acoustic means, but in today's music they are increasingly used to control digital sound synthesis processes. Interestingly, across all the different possibilities of sonic outcome, the input remains a musical gesture. In this paper we present the conceptualization of a Natural User Interface (NUI), named the Intangible Musical Instrument (IMI), aiming to support both the learning of expert musical gestures and the performance of music as a unified user experience. The IMI is designed to recognize metaphors of pianistic gestures, focusing on subtle uses of the fingers and upper body. Based on a typology of musical gestures, a gesture vocabulary has been created, hierarchized from basic to complex. These piano-like gestures are then recognized and transformed into sounds.
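A hierarchized gesture vocabulary of the kind described above can be sketched as a small lookup structure. The gesture names, complexity levels, and note mappings below are hypothetical illustrations, not taken from the IMI itself:

```python
# Sketch of a gesture vocabulary hierarchized from basic to complex, with
# each recognized gesture mapped to a MIDI-style sound event. All names
# and values here are assumptions for illustration.

GESTURE_VOCABULARY = {
    # gesture name: (complexity level, sound event)
    "single_key_press": (1, {"note": 60, "velocity": 64}),
    "two_finger_chord": (2, {"note": 64, "velocity": 72}),
    "arpeggio_sweep":   (3, {"note": 67, "velocity": 96}),
}

def recognize(gesture_name):
    """Return the sound event for a recognized gesture, or None."""
    entry = GESTURE_VOCABULARY.get(gesture_name)
    return entry[1] if entry else None

def by_complexity(max_level):
    """List gestures a learner at a given level could practice first."""
    return sorted(name for name, (level, _) in GESTURE_VOCABULARY.items()
                  if level <= max_level)

print(recognize("two_finger_chord"))  # {'note': 64, 'velocity': 72}
print(by_complexity(2))               # ['single_key_press', 'two_finger_chord']
```

In a real system the keys of such a table would come from a gesture recognizer rather than being passed by name; the table itself only captures the vocabulary's basic-to-complex ordering.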
2020
The development of musical interfaces has moved from static to malleable, where the interaction mode can be designed by the user. However, the user still has to specify which input parameters to adjust and, implicitly, how they affect the generated sound. We propose a novel, explorative way to learn mappings from movements to sound generation parameters. The goal is to make the user interface evolve with the user, creating a unique, tailor-made interaction mode with the instrument.
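One simple way to realize such a learned movement-to-parameter mapping is to let the user demonstrate pairs of movements and sound settings, then look up the nearest demonstration at play time. The sketch below is a minimal nearest-neighbour stand-in, not the model the paper actually proposes; the movement vectors and parameter names are assumptions:

```python
import math

class ExplorativeMapper:
    """Learns a movement -> sound-parameter mapping from user demonstrations.

    Nearest-neighbour lookup over recorded examples; a deliberately simple
    stand-in for whatever learning method the paper uses.
    """

    def __init__(self):
        self.examples = []  # list of (movement_vector, parameter_dict)

    def demonstrate(self, movement, params):
        """User pairs a movement (e.g. an accelerometer reading) with parameters."""
        self.examples.append((list(movement), dict(params)))

    def map(self, movement):
        """Return the parameters of the closest recorded demonstration."""
        if not self.examples:
            raise ValueError("no demonstrations recorded yet")
        _, params = min(self.examples,
                        key=lambda ex: math.dist(ex[0], movement))
        return params

mapper = ExplorativeMapper()
mapper.demonstrate([0.0, 0.0], {"cutoff": 200, "gain": 0.2})   # rest pose
mapper.demonstrate([1.0, 0.5], {"cutoff": 2000, "gain": 0.9})  # raised arm
print(mapper.map([0.9, 0.4]))  # {'cutoff': 2000, 'gain': 0.9}
```

Replacing the lookup with interpolation or a regression model would give continuous control between demonstrations, which is closer to the "evolving with the user" idea in the abstract.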
2007
We draw on our experiences with the Princeton Laptop Orchestra to discuss novel uses of the laptop's native physical inputs for flexible and expressive control. We argue that instruments designed using these built-in inputs offer benefits over custom standalone controllers, particularly in certain group performance settings; creatively thinking about native capabilities can lead to interesting and unique new interfaces. We discuss a variety of example instruments that use the laptop's native capabilities and suggest avenues for future work. We also describe a new toolkit for rapidly experimenting with these capabilities.
Proceedings of the 2006 conference on New interfaces …, 2006
Using mobile devices as instruments in computer music is one of the goals of the Pure Data anywhere project [5]. An obstacle we encounter is controllability, because most such devices do not offer the necessary interface, such as MIDI or USB, through which they could be controlled by ...
CHI '01 extended abstracts on Human factors in computer systems - CHI '01, 2001
Topic: The impact of interface technology on all aspects of musical expression.
Squatouch: object oriented sounds/tangible musical interface, 2009
This article investigates the theoretical and practical terms in the formation of Squatouch, a computer-based musical interface that offers a multitouch user interface through direct interaction. The multidisciplinary field known today as Interaction Design continues to carry our methods of communicating with computers to new levels. Interaction paradigms that were initially based only on mouse and keyboard and oriented toward a single user now provide multi-user-oriented alternative interaction methods, thanks to rapid improvements in technology. However, the technology providing users these opportunities also expects them to learn a new language. Multitouch interfaces are among these new languages, through which more than one user can interact directly with their hands, without a mouse or a keyboard. In that context, the following article presents Squatouch, which carries human-computer interaction to a higher level by providing a tangible inter...
2020
Non-rigid interfaces allow for exploring new interactive paradigms that rely on deformable input and shape change, and whose possible applications span several branches of human-computer interaction (HCI). While extensively explored as deformable game controllers, bendable smartphones, and shape-changing displays, non-rigid interfaces are rarely framed in a musical context, and their use for composition and performance is rather sparse and unsystematic. With this work, we start a systematic exploration of this relatively uncharted research area, by means of (1) briefly reviewing existing musical interfaces that capitalize on deformable input, and (2) surveying 11 experts and pioneers in the field about their experience with and vision on non-rigid musical interfaces. Based on the experts' input, we suggest possible next steps of musical appropriation with deformable and shape-changing technologies. We conclude by discussing how cross-overs between NIME and HCI research will benefit...
2001
The conception and design of new musical interfaces is a multidisciplinary area that tightly relates technology and artistic creation. In this paper, the author first presents some of the questions he has posed himself during more than a decade of experience as a performer, composer, interface and software designer, and educator. He then illustrates these topics with examples of his work.
Proceedings of the Sixth International Conference on Tangible, Embedded and Embodied Interaction - TEI '12, 2012
Computers offer a wealth of promises for real-time musical control. One of them is to enable musicians to change the structure of their instruments while they are playing them, allowing them to adapt their tools to their wills and needs. Few interaction styles provide enough freedom to achieve this. Improvised interfaces are tangible interfaces made out of found objects and tailored by their users. We propose to take advantage of these improvised interfaces to turn the surrounding physical environment into a dynamic musical instrument with tremendous possibilities. Methods dealing with design issues are presented, and an implementation of this novel approach is described.
Proceedings of the 24th BCS Interaction Specialist …, 2010
This paper presents observations on the creation of digital music controllers and the music that they generate from the perspectives of the designer and the artist. In the case of musical instruments, what is the role of the form (the hardware) where it concerns the function (the production of musically interesting sounds)? Specific projects are presented, and a set of operational principles is supported from those examples. The associated encounter session will allow delegates to experiment with the interfaces exhibited, further informing these principles.
Lecture Notes in Computer Science, 2011
Compared to pop music, the audience for classical music has decreased dramatically. One reason might be the way classical music communicates with its audience, which depends on musical expression such as timbre, rhythm, and melody in the performance. The fine details of classical music, as well as the emotion implied among the notes, remain implicit to the audience. We therefore apply a new medium called dynamic skin to build an interface between performers and audiences. This interface, called "Musical Skin", is implemented through a dynamic skin design process using the results of gesture analysis of performers and audiences. Two skin systems of Musical Skin are implemented with virtual visualization, actuators, and sensible spaces. The implementation is tested using scenarios and interviews.
2009
In this project we have developed reactive instruments for performance. Reactive instruments provide feedback for the performer thereby providing a more dynamic experience. This is achieved through the use of haptics and robotics. Haptics provide a feedback system to the control surface. Robotics provides a way to actuate the instruments and their control surfaces. This allows a highly coordinated "dance" between performer and the instrument. An application for this idea is presented as a linear slide interface. Reactive interfaces represent a dynamic way for music to be portrayed in performance.
Human-computer interaction with music systems has not yet produced a solution that lets non-musicians create music. In this paper I present "Musikus", an application meant to allow non-musicians to interact with and play good-sounding music in an intuitive and simple manner. Three different user interfaces were created, each allowing users to interact with the music in a different way. A thorough user study was conducted, and the user feedback is presented and analyzed. Results show that the interface is indeed intuitive, but the level of control the user has over the music is a fine balance that needs to be further examined. This opens the discussion for further work in human-computer interaction studies to find the point at which users have enough control to express themselves while not being overwhelmed by complicated control paradigms that can frustrate them.
2017
We developed an interactive system for music performance, able to control sound parameters responsively with respect to the user's movements. The system is conceived as a mobile application, provided with beat tracking and expressive parameter modulation, interacting with motion sensors and effector units connected to a music output such as synthesizers or sound effects. We describe the various types of usage of our system and our achievements, aimed at increasing the expressiveness of music performance and providing an aid to musical interaction. The results obtained outline a first level of integration and point toward future cognitive and technological research.
Lecture Notes in Computer Science, 2004
This paper describes a performance interface called iFP that enables players to perform music as if they had the hands of a virtuoso. iFP is a tapping-style musical interface that draws on a pianist's expressiveness described in a performance template. The paper describes the scheduler that allows a player to mix his or her own intention with the expressiveness in the performance template, as well as the user interfaces. The results of a subjective study suggest that using the expression template and the tapping-style interface contributes to the subjects' joy of playing music. This result is also supported by a brain activation study done using near-infrared spectroscopy.
Proceedings of the 13th annual ACM international conference on Multimedia - MULTIMEDIA '05, 2005
Music is a very important part of our lives. People enjoy listening to music, and many of us find a special pleasure in creating it. Computers have further extended many aspects of our musical experience: listening to, recording, and creating music is now easier and more accessible to a wide range of users. Conversely, various computing applications exploit music in order to better support interaction with users. However, listening to music is generally a passive experience. Although we may change many parameters, the music we listen to generally does not reflect our response, or does so only very roughly. In this paper we present a flexible framework that enables active creation of instrumental music based on the implicit dynamics and content of human-computer interaction. Our approach is application independent, and it provides a mapping of musical features to an abstraction of user interaction. This mapping is based on analysis of the dynamics and content of the human-computer interaction. In contrast to most existing interactive music composition tools, which require explicit interaction with the system, we provide a more flexible solution that implicitly maps user interaction parameters to various musical features.
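The idea of implicitly deriving musical features from interaction dynamics can be sketched as a simple normalizing map. The specific input measures (typing rate, mouse speed) and output features (tempo, intensity) below are assumptions chosen for illustration, not the framework's actual feature set:

```python
# Illustrative implicit mapping from interaction dynamics to musical
# features, in the spirit of the framework above. Inputs and outputs
# are hypothetical examples.

def clamp(x, lo=0.0, hi=1.0):
    """Restrict x to the range [lo, hi]."""
    return max(lo, min(hi, x))

def interaction_to_music(typing_rate_cps, mouse_px_per_s):
    """Map raw interaction dynamics to abstract musical features.

    typing_rate_cps : characters typed per second (0..10 typical)
    mouse_px_per_s  : cursor speed in pixels per second (0..2000 typical)
    """
    activity = clamp(typing_rate_cps / 10.0)   # normalize to 0..1
    motion = clamp(mouse_px_per_s / 2000.0)    # normalize to 0..1
    return {
        "tempo_bpm": 60 + 80 * activity,           # busier typing -> faster music
        "intensity": round(0.3 + 0.7 * motion, 2), # busier cursor -> louder/denser
    }

print(interaction_to_music(5.0, 1000.0))  # {'tempo_bpm': 100.0, 'intensity': 0.65}
```

The point of such a mapping is that the user never issues explicit musical commands; the music simply tracks how actively they are working, which is the "implicit interaction" contrast the abstract draws with explicit composition tools.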
2003
New digital musical instruments designed for professional and trained musicians can be quite complex and challenging, offering in return a great amount of creative freedom and control possibilities to their players. On the other hand, instruments designed for amateur musicians, or for audiences in interactive sound installations, tend to be quite simple, often trying to give their users the illusion of control and interaction while still producing 'satisfactory' outputs. Logically, these two classes of instruments are often mutually exclusive. But wouldn't it be possible to design instruments that appeal to both sectors? In this paper we will show, with two projects developed in our research group, how visual feedback can greatly increase the intuitiveness of an interactive music system, making complex principles understandable.