2020
Solitária is a performance that employs puppetry techniques to manipulate digital media. The performance was designed around the duality between material and non-material existence, between physicality and virtuality. A gestural vocabulary was developed to manipulate both sound and visuals in real time. We built an interactive system to support both tangible and intangible manipulation, as well as to respond and react to specific body movements. This work explores puppetry techniques with sound and digital animation, engaging the audience in a dialogue between body language and digital media. We found that sound and visuals can be manipulated in a similar manner through puppetry methods, as if we were pulling the strings of the same instrument. The framework was implemented successfully, meeting the requirements of the performance. Solitária was presented at the Festival Internaci...
Proceedings of the …, 2010
Lecture Notes in Computer Science, 2012
Pictures at an Exhibition is a physical/digital puppetry piece that uses tangible interface puppets to modify, in real time, a virtual scene projected at the back of the stage. The piece merges traditional puppeteering practices with tangible interaction technologies and virtual environments to create a novel performance for the live stage. This paper describes the design and development of the piece, as well as the lessons we learned from this process and from on-stage performances of Pictures at an Exhibition in a puppetry theatre.
Puppetry is one of the most ancient forms of performance in the world. Even though it was universally popular in the past, most traditional puppet theatres have lost their appeal and their vital communication with communities. In ancient times, puppetry played an important role as a village "ritual", as shown in puppetry traditions around the world. There is clear evidence of how ritual objects such as masks have been transformed into puppets over time, showing the inherent connection between ritual and puppetry [1]. As an example, Wayang Kulit, Indonesian shadow puppetry, is one of the most ancient forms of puppetry, storytelling, and ritual. The shadows are considered spirits of the deceased, in keeping with the traditional Javanese animistic belief that everything has a soul. Wayang Kulit functions as a ritual for calling spirits in order to ask for advice or help in overcoming problems related to disharmony, and to bring balance between the positive and negative forces of the community. In this ritualistic context, the puppeteer played the role of shaman, entering into a transformative relationship with his ritual object, the "puppet". This later gave rise to forms of freely improvised storytelling and of lively interaction with the public in the community. As in Wayang Kulit, we define the puppet as a source of energy that continuously sends users into altered states of consciousness, breaking the constraints and boundaries of the material world. However, in the digital realm the word "puppet" frequently indicates a form of avatar, manifested mainly as a representation of the materialism of the real world. Deeply rooted in the Cartesian separation of subject and object, digital media culture treats the puppet as something to be manipulated and controlled, ignoring its transformative relationship with the user. In the digital translation of puppetry, we are interested in how interactive technology can support the ancient wisdom of ritual, revealing the transformative relationship between puppet and puppeteer, resulting in performances of great excitement, public engagement and community reflection, and creating rich layers of mixed-reality environments. In fact, Wayang Kulit offers intricate layers of mixed reality, allowing viewers to walk around the screen and watch the real puppet and its virtual form as a shadow at the same time. Mixed reality arises not only in the viewer's and puppeteer's consciousness at the moment the shadow comes alive with its own spirit, but also in the viewer's perception, which struggles between real and virtual forms of presentation while walking around the screen. This kind of setting creates a rich platform for discussing reality, virtuality and mixed reality together, which we adopt as our methodology for exploring the full potential of virtual puppetry.
This paper explores the challenge of achieving nuanced control and physical engagement with gestural interfaces in performance. Performances with a prototype gestural performance system, Gestate, provide the basis for insights into the application of gestural systems in live contexts. These reflections stem from a performer's perspective, summarising the experience of prototyping and performing with augmented instruments that extend vocal or instrumental technique through gestural control. Successful implementation of rapidly evolving gestural technologies in real-time performance calls for new approaches to performing and musicianship, centred on a growing understanding of the body's physical and creative potential. For musicians hoping to incorporate gestural control seamlessly into their performance practice, a balance of technical mastery and kinaesthetic awareness is needed to adapt existing approaches to their own purposes. Within non-tactile systems, visual feedback mechanisms can support this process by providing explicit visual cues that compensate for the absence of haptic feedback. Experience gained through prototyping and performance can yield a deeper understanding of the broader nature of gestural control and the way in which performers inhabit their own bodies.
2004
This article introduces a conceptual design for an interactive artwork called Feel-in-Touch! Its aim is to improve the use of imagination in artworks that employ abstract images in the formats of interactive media and vibro-tactile aids. New technologies can visually realize any surrealistic narration we can imagine, but they limit our perception by presenting only one way of imagining instead of multiple alternatives, which restricts creative thinking.
Proceedings of the International Conference on New Interfaces for Musical Expression, 2020
This paper outlines the development process of an audio-visual gestural instrument—the AirSticks—and elaborates on the role 'miming' has played in the formation of new mappings for the instrument. The AirSticks, although fully functioning, were used as props in live performances in order to evaluate potential mapping strategies that were only later implemented for real. This use of mime when designing Digital Musical Instruments (DMIs) can help overcome choice paralysis, break from established habits, and liberate creators to realise more meaningful parameter mappings. Bringing this process into an interactive performance environment acknowledges the audience as stakeholders in the design of these instruments, and also leads us to reflect upon the beliefs and assumptions made by an audience when engaging with the performance of such 'magical' devices. This paper establishes two opposing strategies to parameter mapping, 'movement-first' mapping, and the less conven...
Puppetry is one of the most ancient forms of representation, found all over the world in different shapes, with different degrees of freedom of movement and forms of manipulation. Puppets provide an ideal environment for creative collaboration, inspiring the development of supporting technologies (from carpentry to digital worlds). The CoPuppet project explores the possibilities offered by multimodal and cooperative interaction, in which performers, or even audience members, are called upon to affect different parts of a puppet through gestures and voice. In particular, we exploit an existing architecture for the creation of multimodal interfaces to develop the CoPuppet framework for designing, deploying and interacting during performances in which virtual puppets are steered by multiple multimodal controls. The paper illustrates the steps needed to define performances, also showing the integration of digital and real puppetry for the case of Wayang shadow play.
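The CoPuppet abstract gives no implementation detail here; as a rough illustration of this kind of multimodal routing, the Python sketch below (all names hypothetical, not the CoPuppet framework itself) lets one input channel drive a puppet's mouth from voice loudness while another steers an arm from a tracked gesture.

    # Hypothetical sketch of multimodal control routing in the spirit of
    # CoPuppet: independent input channels (voice, gesture), possibly owned
    # by different performers, each drive a different part of one shared
    # virtual puppet. All names are invented for illustration.

    from dataclasses import dataclass, field

    @dataclass
    class PuppetState:
        mouth_open: float = 0.0  # 0..1, driven by voice amplitude
        arm_angles: dict = field(default_factory=lambda: {"left": 0.0, "right": 0.0})

    class MultimodalRouter:
        """Routes events from several performers to parts of one puppet."""

        def __init__(self, puppet: PuppetState):
            self.puppet = puppet

        def on_voice(self, rms: float) -> None:
            # Voice loudness (0..1) opens the mouth; clamp for safety.
            self.puppet.mouth_open = max(0.0, min(1.0, rms))

        def on_gesture(self, side: str, angle_deg: float) -> None:
            # A tracked hand gesture steers one arm; each performer
            # (or audience member) can own a different side.
            if side in self.puppet.arm_angles:
                self.puppet.arm_angles[side] = angle_deg

    puppet = PuppetState()
    router = MultimodalRouter(puppet)
    router.on_voice(0.6)             # performer A speaks
    router.on_gesture("left", 35.0)  # performer B raises the left arm
    print(puppet)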
This paper focuses on the research project, Projecting Performance, in which off-stage technical operators take on the role of performer through the live manipulation of digital ‘sprites’ in a theatrical environment. The sprites are projected onto gauzes in the stage space and operators control them with graphics tablets and pens to perform with on-stage dancers. Operators have frequently described experiences of dislocation or translocation during the experience of operating and this paper investigates the reasons for such reports. It presents the tripartite models of Zich and Castronova from the fields of theatre studies and human-computer interaction respectively, cross-referencing them to analyse the relationship between performer-operator and sprite. Merleau-Ponty's phenomenological theories are then employed through the writings of Crowther and Fraleigh to explore the experience of the performer-operator. The paper proposes an understanding of the digital interface in Projecting Performance as embodied and experienced both visually and kinaesthetically by the performer-operator.
Digital Theatrograph is a hybrid media performance supported by a cinematic theatrical object. We call this device the Cinetroscope: a miniaturized live interactive studio for theatrical performances. The object was developed in response to the multidisciplinary challenge of adapting the literary work "Peregrinação" into an augmented paper theatre. By fusing theatrical performance with cinematic techniques, we arrived at a novel puppetry genre that we call "live cinematic-puppetry". This genre incorporates the improvisation and spontaneity that characterize puppet theatre, as well as the narrative structure of cinematography supported by its visual techniques. In this paper we present the concepts associated with this performative object and describe our methodology. The feasibility of the technological platform was partially demonstrated through several successful performances.
2010
The proposition in the title of this paper is intended to draw a link between psychological processes involved in aesthetic gestural performance (e.g. music, dance) for both performers and perceivers. In the performance scenario, the player/dancer/etc., perceptually guides their actions, and acquires the skill for a performance through their previous perceptions. On the other side, the perceiver watching, listening to and experiencing another's motor performance, simulates the actions of the performance within the range of their own motor capabilities. These phenomena are possible due to common mechanisms of action and perception, and in tandem provide the basis for the rich experience of gestural performance.
GeKiPe, a gesture-based interface for audiovisual performance, 2017
We present here GeKiPe, a gestural interface for musical expression, combining images and sounds generated and controlled in real time by a performer. GeKiPe is developed as part of a creation project exploring the control of virtual instruments through the analysis of gestures specific to instrumentalists, and to percussionists in particular. GeKiPe was used for the creation of a collaborative stage performance (Sculpt), in which the musician and their movements are captured by different methods (infrared Kinect cameras and gesture sensors on controller gloves). The use of GeKiPe as an alternate sound and image controller allowed us to combine body movement, musical gestures and audiovisual expression to create challenging collaborative performances.
International Journal of Advanced Computer Science, 2013
Puppets can be great storytellers when performed dramatically. They create the illusion of life, making the audience believe in the story. But animating puppets using traditional keyframe animation is not a trivial task, requiring considerable time and practice, particularly for non-expert artists. Digital puppetry offers performance-driven animation, making the puppet react to the motion of the performer in real time. Motion-capture methods make puppet animation fast and simple, based on the performer's acting, but they are out of reach for most consumers. We present a low-cost performance-driven technique that allows real-time puppet animation based on inter-acting, which can be used in live storytelling. In this paper we study how users can interpret simple actions such as walking, using their bodies as puppet controllers. The system was deployed using the Microsoft Kinect, assuming marionette aesthetics and constraints, showing how low-cost devices can provide a new means of motion-capture representation. We extend the previous study by presenting another method of interacting with puppets through indirect mapping, connecting the puppet to the puppeteer with virtual strings, as sketched below. Finally, we performed a pilot experiment animating silhouettes and 3D puppets to better understand differences in the interaction: an audience had to identify, from the final output animation, the actions performed by non-expert artists using their bodies to drive the puppets. We conclude that inter-acting with 2D puppets is similar to marionette manipulation and requires more interpretation than with 3D puppets.
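The abstract names the virtual-string mapping but not its formulation; one plausible reading, sketched below under our own assumptions (a damped spring per control point, with gains invented for illustration), makes the puppet trail the tracked hand like a marionette on elastic strings rather than mirroring it rigidly.

    # Hypothetical sketch of "virtual string" indirect mapping: each puppet
    # control point is pulled toward the corresponding tracked hand position
    # by a damped spring, so the puppet lags and sways instead of copying
    # the performer exactly. Constants are made up for illustration.

    STIFFNESS = 40.0  # spring constant (1/s^2)
    DAMPING = 8.0     # velocity damping (1/s)
    DT = 1.0 / 60.0   # simulation step, 60 fps

    def step_string(puppet_pos, puppet_vel, hand_pos):
        """Advance one control point one time step along its virtual string."""
        new_pos, new_vel = [], []
        for p, v, h in zip(puppet_pos, puppet_vel, hand_pos):
            accel = STIFFNESS * (h - p) - DAMPING * v
            v = v + accel * DT          # semi-implicit Euler integration
            new_vel.append(v)
            new_pos.append(p + v * DT)
        return new_pos, new_vel

    # Example: the hand jumps 0.5 m to the right; the puppet handle follows
    # smoothly over several frames instead of teleporting.
    pos, vel = [0.0, 1.5, 0.0], [0.0, 0.0, 0.0]
    hand = [0.5, 1.5, 0.0]
    for _ in range(5):
        pos, vel = step_string(pos, vel, hand)
    print([round(c, 3) for c in pos])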
Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, 2015
Deformable interfaces offer new possibilities for gestures, some of which have been shown effective in controlled laboratory studies. Little work, however, has attempted to match deformable interfaces to a demanding domain and evaluate them out of the lab. We investigate how musicians use deformable interfaces to perform electronic music. We invited musicians to three workshops, where they explored 10 deformable objects and generated ideas on how to use these objects to perform music. Based on the results from the workshops, we implemented sensors in the five preferred objects and programmed them for controlling sounds. Next, we ran a performance study where six musicians performed music with these objects at their studios. Our results show that (1) musicians systematically map deformations to certain musical parameters, (2) musicians use deformable interfaces especially to filter and modulate sounds, and (3) musicians think that deformable interfaces embody the parameters that they control. We discuss what these results mean to research in deformable interfaces.
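Finding (2) suggests mappings in which a deformation reading drives a filter. The sketch below is a hedged illustration of that idea, not the study's implementation: a normalized squeeze value is mapped exponentially to the cutoff of a one-pole low-pass, with the curve and frequency range chosen as assumptions.

    # Hypothetical mapping from a normalized deformation reading (0 = relaxed,
    # 1 = fully squeezed) to a one-pole low-pass filter, in the spirit of the
    # finding that deformations are mostly used to filter and modulate sound.

    import math

    SAMPLE_RATE = 48_000
    F_MIN, F_MAX = 80.0, 12_000.0  # assumed cutoff range in Hz

    def deformation_to_cutoff(squeeze: float) -> float:
        """Map squeeze 0..1 to cutoff Hz on an exponential (perceptual) curve."""
        squeeze = max(0.0, min(1.0, squeeze))
        return F_MIN * (F_MAX / F_MIN) ** squeeze

    def one_pole_coeff(cutoff_hz: float) -> float:
        """Coefficient for the recursion y[n] = y[n-1] + a * (x[n] - y[n-1])."""
        return 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / SAMPLE_RATE)

    def filter_block(samples, squeeze, state=0.0):
        """Low-pass one block of samples at the cutoff implied by the squeeze."""
        a = one_pole_coeff(deformation_to_cutoff(squeeze))
        out = []
        for x in samples:
            state += a * (x - state)
            out.append(state)
        return out, state

    # Squeezing harder opens the filter: compare cutoffs at two pressures.
    print(round(deformation_to_cutoff(0.2)), round(deformation_to_cutoff(0.9)))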
SOUND AND MUSIC COMPUTING, 2022
We present a novel human-centered gestural system for vocal improvisation, "Kinesthesis", to be used in new opera and musical theatre. We demonstrate the relationship between music composition, gesture and programming. An advantage of our system is its multimodal interaction technique through an invisible interface, which examines the use of electroacoustic techniques on the human voice in a theatre context. It explores the use of live or recorded, digitally processed voice as a sound source for the development of music cues for playback through a multi-channel speaker system. We also describe the application of the proposed system in a real-time performance during a theatre season, where "Kinesthesis" was controlled through hand gestures from the actors, reinforcing the interactivity and highlighting the use of electroacoustic techniques in music theatre.
International Journal of Arts and Technology, 2008
This paper explores one of the application domains in which tangible and tabletop interfaces have so far shown the most positive results, studying and unveiling the essential reasons that make live music performance and tabletop interaction promising and exciting fields for multi-disciplinary research and experimentation. The paper is structured in three parts. The first presents the main reasons that make live music performance an ideal test-bed for tangible interaction and advanced human-computer interaction. Reciprocally, the second part studies why tabletop interfaces promise remarkable new musical instruments. The third part describes the main design issues that led to the development of the reactable, a tabletop musical instrument conceived on the basis of many of the criteria exposed in the previous two parts.
Madhya Bharti - Humanities and Social Science, 2023
This study focuses on a new age that emerged in the early 2000s when both modern and traditional puppeteers began experimenting with new performing arts mediums with puppetry at their centre, resulting in what is known as a hybrid performance that is puppet-centric. I argue that the hybrid forms extend the creative and animated expression of emotions and narratives, sustaining the presence of the performer, the space and the audience.
2021
"How do you prove to be human?" It is the head of a Hamlet-like robot that interrogates the audience at the end of the show Robot Dreams by the German company Meinhardt Krauss Feigl, a performance that includes dancers, automata, robots and animatronics. The title is a clear reference to the work of Asimov, whose main themes are present within the show through references and quotations, such as the recurring reflection on "What do robots dream of?". The show was performed at the Festival mondial des théâtres de marionnettes in Charleville-Mézières in 2019, in the section devoted to new experiments, confirming its belonging to the world of puppetry. This is not an isolated case: increasingly, new technologies and robotic elements are appearing in puppetry performances. From puppet robots to mechanical gloves, from animatronics to giant puppets: puppet theatre is moving much faster than other theatrical techniques towards the discovery of new technological means. The present study aims to investigate the relationship between puppeteer and robot on the contemporary scene, exploring the interaction between the artist and the object manipulated through sensors and new technologies. In glove-puppet theatre, the puppeteer, wearing the glove, gives life to the puppet: the body of the puppet is formed by the hand of its manipulator. What happens when the puppet is made through a prosthesis created with animatronics? How is manipulation handled during the performance? Who manages the machine in relation to the puppet master's work? The aim of this research is precisely to try to answer these questions, starting from some embryonic examples, from Bit, the electronic puppet animated through a cyberglove created by Giacomo Verde, to the prostheses created as animatronics in the show Robot Dreams by the company Meinhardt Krauss Feigl.
SMC (SOUND AND MUSIC COMPUTING), 2024
Kinesthesis 2.0 is an innovative interactive tool designed for augmenting the voice of singers and actors, enabling them to dynamically modify their voice in real time during music theatre performances. This tool enhances immersive theatre by integrating technology and artistic expression, allowing for an interplay between verbal and gestural elements. Kinesthesis 2.0 focuses on achieving embodiment, where actors use multi-sensory stimuli, integrating voice, facial expressions, and gestures to create a comprehensive portrayal of characters. This embodiment is facilitated by the tool's ability to capture and translate these modulations, offering a dynamic interaction between auditory and visual storytelling. Technically, Kinesthesis 2.0 uses Python, the Open Sound Control (OSC) protocol, and Max/MSP to map facial movements onto predefined gestures. MediaPipe is used for face and hand detection, while OpenCV processes the webcam feed, capturing head movements, facial expressions, and hand gestures. Through the gesture and facial-expression detector, the artist controls various effects (pitch shifter, multi-channel vocoder, reverb, multi-tap delay and reverse delay). Artistically, Kinesthesis 2.0 enriches the expressive capabilities of performers by amplifying and augmenting their voice, thus enhancing their ability to convey emotions and narratives. As a case study, the tool was used by the singers and actors in the theatre performance "Rayman Scream".
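The abstract names the stack (OpenCV capture, MediaPipe detection, OSC into Max/MSP) but not the glue code; the minimal Python sketch below follows that description, with the OSC address, port and mouth-openness mapping invented for illustration rather than taken from the paper.

    # Minimal sketch of the described pipeline: OpenCV reads the webcam,
    # MediaPipe detects face landmarks, and python-osc forwards a control
    # value to Max/MSP. The address "/kinesthesis/mouth", the port, and the
    # landmark-based mapping are assumptions, not the paper's gesture set.

    import cv2
    import mediapipe as mp
    from pythonosc.udp_client import SimpleUDPClient

    client = SimpleUDPClient("127.0.0.1", 9000)  # assumed Max/MSP OSC port
    face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1)
    cap = cv2.VideoCapture(0)

    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            lm = results.multi_face_landmarks[0].landmark
            # Rough mouth openness: vertical gap between the upper (13) and
            # lower (14) inner-lip landmarks, in normalized image coordinates.
            openness = abs(lm[13].y - lm[14].y)
            # Forward as a control value, e.g. to scale one of the effects.
            client.send_message("/kinesthesis/mouth", openness)
        cv2.imshow("preview", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
            break

    cap.release()
    cv2.destroyAllWindows()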
2011
This paper introduces GAVIP, an interactive and immersive platform allowing audio-visual virtual objects to be controlled in real time by physical gestures with a high degree of intermodal coherency. The focus is placed in particular on two scenarios exploring the interaction between a user and the audio, visual, and spatial synthesis of a virtual world. The platform can be seen as an extended virtual musical instrument that allows interaction across three modalities: audio, visual and spatial.
Proceedings of the International Symposium on Computational Aesthetics in Graphics, Visualization, and Imaging - CAe '11, 2011
In this paper we discuss the design of Performing Animator, an expressive instrument for live media that we developed in support of our situated interdisciplinary performance practice. The concept of a cinema of braided processes is introduced as a basic structure for media-instrument design. This media performance instrument is described in terms of its conceptual, design and performative aspects. The Performing Animator instrument is analogous to a musical instrument, enabling generative animation, film editing and compositing tailored for improvisational expression of projected visual media elements. The design of the instrument evolved over eight years of development (2003–2011), initiated by a number of interdisciplinary and cross-cultural performance productions as well as inspirations drawn from our study of Balinese Shadow Play (Wayang Kulit). Our instrument presents the performer with a large set of techniques that enable flexible media manipulation and generation. The paper also addresses issues related to the tensions between narrative structure and performative expression, live and recorded media, and the structuring of improvised media.