2015, Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems
Deformable interfaces offer new possibilities for gestures, some of which have been shown effective in controlled laboratory studies. Little work, however, has attempted to match deformable interfaces to a demanding domain and evaluate them outside the lab. We investigate how musicians use deformable interfaces to perform electronic music. We invited musicians to three workshops, where they explored 10 deformable objects and generated ideas on how to use these objects to perform music. Based on the results of the workshops, we implemented sensors in the five preferred objects and programmed them to control sound. Next, we ran a performance study in which six musicians performed music with these objects at their studios. Our results show that (1) musicians systematically map deformations to certain musical parameters, (2) musicians use deformable interfaces especially to filter and modulate sounds, and (3) musicians think that deformable interfaces embody the parameters they control. We discuss what these results mean for research on deformable interfaces.
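To make finding (1) concrete: a deformation such as a squeeze is typically mapped to a continuous sound parameter such as a filter cutoff. The sketch below illustrates one such mapping; the sensor range, curve, and function names are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch: mapping a normalized squeeze reading to a
# low-pass filter cutoff, one of the deformation-to-parameter
# mappings the study reports musicians gravitating toward.

def squeeze_to_cutoff(raw, raw_min=0, raw_max=1023,
                      f_min=200.0, f_max=8000.0):
    """Map a raw squeeze-sensor value to a filter cutoff in Hz.

    An exponential curve keeps equal squeeze increments sounding
    like roughly equal steps in the filter sweep.
    """
    t = (raw - raw_min) / (raw_max - raw_min)   # normalize to 0..1
    t = min(max(t, 0.0), 1.0)                   # clamp to valid range
    return f_min * (f_max / f_min) ** t         # exponential mapping

# Example: a half-depressed sensor lands mid-sweep.
print(squeeze_to_cutoff(512))  # ~1270 Hz
```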
2020
Non-rigid interfaces allow for exploring new interaction paradigms that rely on deformable input and shape change, and whose possible applications span several branches of human-computer interaction (HCI). While extensively explored as deformable game controllers, bendable smartphones, and shape-changing displays, non-rigid interfaces are rarely framed in a musical context, and their use for composition and performance is rather sparse and unsystematic. With this work, we begin a systematic exploration of this relatively uncharted research area by (1) briefly reviewing existing musical interfaces that capitalize on deformable input, and (2) surveying 11 experts and pioneers in the field about their experience with, and vision for, non-rigid musical interfaces. Based on the experts' input, we suggest possible next steps for musical appropriation of deformable and shape-changing technologies. We conclude by discussing how cross-overs between NIME and HCI research will benefit...
2018
Deformable user interfaces are currently a popular topic in the Human-Computer Interaction (HCI) community, as they enable intuitive manipulation. Deformable interfaces have also been reported in the field of musical expression, for example as sound controllers. However, the user experience of manipulating these interfaces has not been well studied so far. This paper therefore focuses on clarifying the pleasantness of manipulating deformable interfaces for musical expression. First, evaluation experiments were conducted to investigate the pleasantness and user impressions of deformable interface manipulation using 36 dissimilar interface mockups. The results showed that impressions related to the activity factor correlate particularly strongly with pleasantness. In addition, Hayashi's quantification theory type I revealed the relationship between the physical features of the interfaces and pleasantness. Based on these findings, the ideal deformable interface des...
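For readers unfamiliar with it, Hayashi's quantification theory type I is essentially multiple regression on dummy-coded categorical predictors. The sketch below shows the idea applied to relating categorical physical features of mockups to pleasantness ratings; the features and data are invented for illustration, not the paper's dataset.

```python
# Sketch of quantification theory type I: least-squares regression
# on indicator (dummy) variables for each category level, yielding
# a "category score" for how much each feature level shifts the
# pleasantness rating. Data below is illustrative only.
import numpy as np

# Each mockup described by (material, size); response = pleasantness.
mockups = [("sponge", "small"), ("sponge", "large"),
           ("gel", "small"), ("gel", "large")]
pleasantness = np.array([4.2, 3.8, 3.1, 2.5])

materials = sorted({m for m, _ in mockups})
sizes = sorted({s for _, s in mockups})

def dummy_code(m, s):
    # One indicator column per category level, plus an intercept.
    row = [1.0]
    row += [1.0 if m == lvl else 0.0 for lvl in materials]
    row += [1.0 if s == lvl else 0.0 for lvl in sizes]
    return row

X = np.array([dummy_code(m, s) for m, s in mockups])
# Each fitted weight estimates how a feature level pushes the
# rating up or down (minimum-norm solution for the redundant coding).
scores, *_ = np.linalg.lstsq(X, pleasantness, rcond=None)
print(dict(zip(["intercept"] + materials + sizes, scores.round(3))))
```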
2020
The development of musical interfaces has moved from static to malleable, where the interaction mode can be designed by the user. However, the user still has to specify which input parameters to adjust and, inherently, how each affects the generated sound. We propose a novel way to learn mappings from movements to sound-generation parameters in an explorative way. The goal is to make the user interface evolve with the user, creating a unique, tailor-made interaction mode with the instrument.
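One plausible reading of such a learned mapping is an example-based regressor that interpolates sound parameters from user demonstrations. The sketch below uses a small k-nearest-neighbour scheme as a stand-in; the paper does not specify this particular learner, and all names and values are illustrative.

```python
# Explorative mapping sketch: the user records (gesture, parameters)
# pairs they like, and new movements are mapped by distance-weighted
# interpolation over those demonstrations.
import numpy as np

class ExploratoryMapper:
    def __init__(self, k=3):
        self.k = k
        self.gestures = []   # feature vectors, e.g. accelerometer stats
        self.params = []     # matching synth parameter vectors

    def demonstrate(self, gesture, params):
        """User pairs a movement with a sound they like."""
        self.gestures.append(np.asarray(gesture, float))
        self.params.append(np.asarray(params, float))

    def map(self, gesture):
        """Interpolate parameters for a new movement."""
        g = np.asarray(gesture, float)
        d = np.array([np.linalg.norm(g - x) for x in self.gestures])
        idx = np.argsort(d)[:self.k]
        w = 1.0 / (d[idx] + 1e-6)            # closer examples weigh more
        w /= w.sum()
        return sum(wi * self.params[i] for wi, i in zip(w, idx))

mapper = ExploratoryMapper(k=2)
mapper.demonstrate([0.1, 0.0], [200.0, 0.2])   # slow tilt -> dark, quiet
mapper.demonstrate([0.9, 0.8], [2000.0, 0.9])  # fast shake -> bright, loud
print(mapper.map([0.5, 0.4]))                  # something in between
```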
2003
In this paper, we describe a new interface for musical performance that uses interaction with a graphical user interface in a powerful manner: the user directly touches a screen where graphical objects are displayed and can use several fingers simultaneously to interact with the objects. The concept of this interface is based on the superposition of the gesture space and the visual feedback space; it gives the impression that the graphical objects are real. This concept enables great freedom in designing interfaces. The gesture device we have created gives the positions of four fingertips using 3D sensors, and the data is processed in the Max/MSP environment. We have realized two practical examples of musical use of such a device, using Photosonic Synthesis and Scanned Synthesis.
Shape-retaining freely-deformable interfaces can take innumerable distinct shapes, and creating specific target configurations can be a challenge. In this paper, we investigate how audio can guide a user in this process, through the use of either musical or metaphoric sounds. In a formative user study, we found that sound encouraged action possibilities and made the affordances of the interface perceivable. We also found that adding audio as a modality along with vision and touch, made a positive contribution to guiding users’ interactions with the interface.
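One simple way audio can guide shaping is to sonify the distance between the current deformation state and the target configuration, so that the cue resolves as the user closes in. The sketch below assumes a vector of normalized per-region bend values; the representation and mapping are illustrative, not the study's implementation.

```python
# Illustrative guidance sonification: the mismatch between current
# and target shape detunes a reference tone, which comes back in
# tune exactly when the target configuration is reached.
import math

def guidance_pitch(current, target, base_hz=440.0, max_detune=1.0):
    """Return a tone frequency that converges to base_hz at the target.

    current/target: lists of normalized per-region bend values (0..1).
    The mismatch detunes the tone by up to one octave (max_detune).
    """
    err = math.sqrt(sum((c - t) ** 2 for c, t in zip(current, target)))
    err = min(err / math.sqrt(len(target)), 1.0)   # normalize to 0..1
    return base_hz * 2 ** (err * max_detune)       # in tune when err == 0

print(guidance_pitch([0.2, 0.8, 0.5], [0.2, 0.8, 0.5]))  # 440.0, matched
print(guidance_pitch([0.9, 0.1, 0.5], [0.2, 0.8, 0.5]))  # ~654 Hz, keep going
```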
2017
This paper presents FabricKeyboard: a novel deformable keyboard interface based on a multi-modal fabric sensate surface. Multi-layer fabric sensors that detect touch, proximity, electric field, pressure, and stretch are machine-sewn in a keyboard pattern on a stretchable substrate. The result is a fabric-based musical controller that combines the discrete controls of a keyboard with various continuous controls from the embedded fabric sensors. This enables unique tactile experiences and new interactions with both physical and non-contact gestures: physical by pressing, pulling, stretching, and twisting the keys or the fabric, and non-contact by hovering and waving towards/against the keyboard and an electromagnetic source. We have also developed additional fabric-based modular interfaces, such as a ribbon controller and trackpad, allowing performers to add more expressive and continuous controls. This paper will discuss implementation strategies for our system-on-textile, fabric-ba...
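The control model described here, discrete key events carrying notes while continuous sensor readings ride along as modulation, can be sketched as follows. The sensor-reading functions are placeholders, and mido is one common way to emit MIDI from Python, not necessarily what the authors used.

```python
# Sketch: discrete fabric keys send note-on events; the continuous
# stretch reading is forwarded as a MIDI mod-wheel control change.
import random
import time
import mido  # virtual ports require the python-rtmidi backend

def read_key_press():
    """Placeholder for the fabric key-matrix scan (key index or None)."""
    return random.choice([None, 0, 4, 7])

def read_stretch():
    """Placeholder for the normalized fabric-stretch reading (0..1)."""
    return random.random()

out = mido.open_output("FabricKeyboard", virtual=True)
for _ in range(20):                      # short demo loop
    key = read_key_press()
    if key is not None:                  # discrete: key press -> note
        out.send(mido.Message('note_on', note=60 + key, velocity=100))
    # continuous: stretch rides along as CC 1 (mod wheel)
    out.send(mido.Message('control_change', control=1,
                          value=int(read_stretch() * 127)))
    time.sleep(0.05)
```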
williambrent.conflations.com
The flexibility of current hardware and software has made the mapping of relationships between a sound's parameters and a physical source of control a relatively trivial task. Consequently, the endeavor of sophisticated digital instrument design has been accessible to the creative community for several years, which has resulted in a host of new instruments that explore a variety of physical mappings. The emphasis on physicality exhibited by so-called "gestural controllers" stands in contrast to the practice of conventional laptop performance. While the laptop computer is certainly a digital musical instrument, its associated performance practice is often criticized based on a perceived lack of physicality. This paper examines motivations behind the foregrounding of gesture in computer-based performance. Critical theory and neuroscience research are drawn upon in order to consider ways in which the desire for connections between sound and motion amounts to more than mere fascination with virtuosity.
2016
This paper explores how an actuated pin-based shape display may serve as a platform on which to build musical instruments and controllers. We designed and prototyped three new instruments that use the shape display not only as an input device, but also as a source of acoustic sound. These cover a range of interaction paradigms to generate ambient textures, polyrhythms, and melodies. This paper first presents existing work from which we drew interactions and metaphors for our designs. We then introduce each of our instruments and the back-end software we used to prototype them. Finally, we offer reflections on some central themes of NIME, including the relationship between musician and machine.
2011
Flexible sensors find many useful applications in detecting vibrations, contacts and impacts, air and liquid flows, pressures and compressions, and displacements and motions, and so are used in fields such as robotics, medicine, fitness, assistive technology, and gaming. Here, however, we point out the adoption of such sensors for realizing a data glove capable of associating a sound with each movement of every joint of the fingers of a human hand. In addition, force-sensing resistors applied to each fingertip measure the pressure applied to a surface when the hands mimic the gestures of a pianist.
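As described, the glove's mapping can be sketched as: a flex sensor per joint triggers a sound when the joint bends past a threshold, with the fingertip force-sensing resistor supplying the pressure (used here as note velocity). Joint-to-note assignments and thresholds below are assumptions for illustration.

```python
# Illustrative glove mapping: joint bends past a threshold fire
# notes; fingertip FSR pressure becomes the note velocity.

BEND_THRESHOLD = 0.6   # normalized flex beyond which a joint "fires"
JOINT_NOTES = {        # hypothetical joint -> MIDI note assignment
    ("index", 0): 60, ("index", 1): 62,
    ("middle", 0): 64, ("middle", 1): 65,
}

def glove_events(flex, pressure):
    """Turn one frame of glove readings into (note, velocity) events.

    flex: {(finger, joint): 0..1 bend}; pressure: {finger: 0..1 FSR}.
    """
    events = []
    for joint, bend in flex.items():
        if bend > BEND_THRESHOLD and joint in JOINT_NOTES:
            finger = joint[0]
            velocity = int(pressure.get(finger, 0.0) * 127)
            events.append((JOINT_NOTES[joint], velocity))
    return events

frame_flex = {("index", 0): 0.8, ("middle", 1): 0.3}
frame_pressure = {"index": 0.7}
print(glove_events(frame_flex, frame_pressure))  # [(60, 88)]
```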
Proceedings of the 1st …, 2007
In recent years we have seen a proliferation of musical tables. Believing that this is not just the result of a tabletop trend, in this paper we first discuss several of the reasons for which live music performance and HCI in general, and musical instruments and tabletop ...
TEI Conference, Barcelona, 2013
This paper presents the current stage of implementation of a new type of malleable tangible object for musical expression. Sculpton is an autonomous sonic object that uses the metaphor of sound sculpting to connect physical information to digital audio. By manipulating the object, the user can literally sculpt the sound through real-time sound synthesis that reflects the object's structure. This project explores a novel approach compared to previous work and research on sound sculpting: user gestures are sensed not externally but within the artifact itself. Sculpton is an attempt to develop a new kind of embodied musical instrument that combines multidimensional control and tangible, malleable characteristics with organic handling. We briefly describe the research, our framework, and the current state of the experimental prototypes.
Proceedings of the fifth international conference on Tangible, embedded, and embodied interaction, 2011
We present an ensemble of tangible objects and software modules designed for musical interaction and performance. The tangible interfaces form an ensemble of connected objects communicating wirelessly. A central concept is to let users determine the final musical function of the objects, favoring customization, assembly, and repurposing. This might imply combining the wireless interfaces with existing everyday objects or musical instruments. Moreover, gesture analysis and recognition modules allow users to define their own actions/motions for the control of sound parameters. Various sound engines and interaction scenarios were built and experimented with. Some examples developed in a music pedagogy context are described.
International Journal of Arts and Technology, 2008
This paper explores one of the application domains in which tangible and tabletop interfaces have so far shown the most positive results, studying and unveiling the essential reasons that turn live music performance and tabletop interaction into promising and exciting fields of multi-disciplinary research and experimentation. The paper is structured in three parts. The first presents the main reasons that make live music performance an ideal test-bed for tangible interaction and advanced human-computer interaction. Reciprocally, the second part studies why tabletop interfaces promise remarkable new musical instruments. The third part describes the main design issues that led to the development of the Reactable, a tabletop musical instrument conceived on the basis of many of the criteria presented in the previous two parts.
Proceedings of the Sixth International Conference on Tangible, Embedded and Embodied Interaction - TEI '12, 2012
Computers offer a wealth of promises for real-time musical control. One of them is to enable musicians to change the structure of their instruments at the same time as they play them, allowing them to adapt their tools to their will and needs. Few interaction styles provide enough freedom to achieve this. Improvised interfaces are tangible interfaces made out of found objects and tailored by their users. We propose to take advantage of these improvised interfaces to turn the surrounding physical environment into a dynamic musical instrument with tremendous possibilities. Methods dealing with design issues are presented, and an implementation of this novel approach is described.
2010
The goal of this project is to provide an interface that frees the 'laptop musician' from the laptop and in doing so, encourages more interactive performances within the genre of electronic music. This paper examines interactive systems that use motion and other sensors to control computer music applications and discusses interface design in this context. The various interfaces already tackling this goal demonstrate issues with regard to complexity, affordability and usability. We are particularly interested in interfaces that utilise simple design concepts to provide a flexible, intuitive controller that is appealing and affordable to a broad range of electronic musicians.
2006
Design practice in the Human-Computer Interaction (HCI) tradition often focuses on developing a task-based model of behavior and extrapolating system requirements from this model. Some tasks, however, are too complex to model. Consider the problem of text input beyond the traditional desktop and laptop computing paradigm. Natural, seamless, efficient and comfortable text input is a complex activity involving the translation of language into psychomotor rhythms acting on a spatial topology. Accurate models with appropriate emphases are difficult, if not impossible, to construct. Fortunately, as this paper shows, we may make informed design decisions not by modelling, but by leveraging phenomena. In this design study I leverage the patterns of language and innate tendencies of the human typist to create an imprint of text input activity on a technological artifact.
Lecture Notes in Computer Science, 2011
Compared to pop music, the audience for classical music has decreased dramatically. One reason might be that communication between classical music and its audience depends on expressive qualities such as timbre, rhythm, and melody in the performance; the fine details of classical music, as well as the emotion implied among the notes, remain implicit to the audience. We therefore apply a new medium called dynamic skin to build an interface between performers and audiences. This interface, called "Musical Skin", is implemented with a dynamic skin design process based on the results of gesture analysis of performers and audiences. A two-skin system of Musical Skin is implemented with virtual visualization, actuators, and sensible spaces. The implementation is tested using scenarios and interviews.
2000
In this paper we describe the MoGMI project, which explores ways of enabling the mobile phone to become a musical instrument for naïve users. Two applications enabled 10 subjects to use physical gestures either to record and play back continuous musical pieces using an onboard MIDI player, or to play back simple, short digital sound files in real time. A user study explored which accelerometer axis-mapping model users prefer. Results show that subjects preferred the three-axis model, in which every axis is mapped to a different dimension of music generation (attack, amplitude, and pitch). Subjects deemed this mapping better than simpler or more complicated mapping models on three of five dimensions (easier to learn, produces "nicer" music, and easier to understand the relationship between the gestures performed and the music subsequently generated).
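The preferred three-axis model can be sketched directly: each accelerometer axis drives one musical dimension. Ranges and scaling below are illustrative assumptions, not the MoGMI implementation.

```python
# Sketch of the three-axis model: x -> attack, y -> amplitude,
# z -> pitch, with each reading folded from +-1g into 0..1.

def three_axis_mapping(ax, ay, az, g=9.81):
    """Map accelerometer readings (m/s^2) to note parameters."""
    def norm(a):                            # fold +-1g into 0..1
        return min(max((a / g + 1.0) / 2.0, 0.0), 1.0)
    attack_ms = 5 + norm(ax) * 495          # x: 5..500 ms attack
    amplitude = norm(ay)                    # y: 0..1 amplitude
    pitch = int(48 + norm(az) * 24)         # z: two octaves of MIDI notes
    return attack_ms, amplitude, pitch

print(three_axis_mapping(0.0, 4.9, -9.81))  # (~252 ms, 0.75, note 48)
```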