Papers by Liam O Sullivan

International Journal of Mobile HCI
Tactile surfaces can display information in a variety of applications for all users, but can be of particular benefit to blind and visually impaired individuals. One example is the use of paper-based tactile maps as navigational aids for interior and exterior spaces; visually impaired individuals may use these to practice and learn a route prior to journeying. The addition of an interactive auditory display can enhance such interfaces by providing additional information. This article presents a prototype system which tracks the actions of a user's hands over a tactile surface and responds with sonic feedback. The initial application is an Audio-Tactile Map (ATM); the auditory display provides verbalised information as well as environmental sounds useful for navigation. Two versions of the interface are presented: a desktop version intended as a large-format information point and a mobile version which uses a tablet computer overlain with tactile paper. Details of these implementations are provided, including observations drawn from the participation of a partially-sighted individual in the design process. A usability test with five visually impaired subjects also gives a favourable assessment of the mobile version.
Algorithmic composition is the creation of music using algorithms, or more specifically, computer programming. Real-time Algorithmic Composition seeks to extend this compositional technique to real-time scenarios such as concert performances or interactive installations. As such, a novel mode of interaction between the composer and the system is needed for effective expression and parameter control. We summarise the design and implementation of a Real-time Algorithmic Composition system that employs a tabletop musical interface for input control. Several algorithms developed for real-time performance are discussed. The application and inclusion of the tabletop interface is then outlined and evaluated.

Tactile surfaces can display useful information in a variety of applications for blind, visually-impaired and even sighted users. One example is the use of paper-based tactile maps as navigational aids for interior and exterior spaces; visually-impaired individuals may use these to practice and learn a route prior to journeying. The addition of an interactive auditory display could enhance such tactile interfaces by providing additional information. This paper presents preliminary work on a prototype multi-modal interface which tracks the actions of a user's hands over a tactile surface and responds with sonic feedback. The initial application being considered is an Auditory Tactile Map (ATM); the auditory display provides verbalised information as well as environmental sounds useful for navigation. Another proposed implementation adds interactivity to reproductions of museum exhibits, making these more accessible to the visually-impaired and allowing exploration of their tactile affordances while preserving the original works.

Navigation within historic spaces requires analysis of a variety of acoustic, proprioceptive and tactile cues, a task that is well-developed in many visually-impaired individuals but for which sighted individuals rely almost entirely on vision. For the visually-impaired, the creation of a cognitive map of a space can be a long process for which the individual may repeat various paths numerous times. While this action is typically performed by the individual on-site, it is of some interest to investigate to what degree this task can be performed off-site using a virtual simulator. We propose a tactile map navigation system with interactive auditory display. The system is based on a paper tactile map upon which the user's hands are tracked. Audio feedback provides: (i) information on user-selected map features, (ii) dynamic navigation information as the hand is moved, (iii) guidance on how to reach the location of one hand (arrival point) from the location of the other hand (departure point) and (iv) additional interactive 3D-audio cues useful for navigation. This paper presents an overview of the initial technical development stage, reporting observations from preliminary evaluations with a blind individual. The system will be beneficial to visually-impaired visitors to heritage sites; we describe one such site which is being used to further assess our prototype.
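
As a rough illustration of feedback type (iii), the departure-to-arrival guidance can be reduced to a distance and a clock-face bearing between the two tracked hand positions. The sketch below shows this geometry only; the coordinate convention, the clock-face phrasing and all names are assumptions for illustration, not details taken from the paper.

    import math

    def guidance(departure, arrival):
        """Distance and clock-face bearing from the departure hand
        position to the arrival hand position (map coordinates,
        y increasing toward the top of the map)."""
        dx = arrival[0] - departure[0]
        dy = arrival[1] - departure[1]
        distance = math.hypot(dx, dy)
        # atan2(dx, dy) is 0 deg when the target is straight "up" the map,
        # 90 deg when it is directly to the right.
        angle = math.degrees(math.atan2(dx, dy)) % 360
        hour = round(angle / 30) % 12 or 12   # 30 deg per clock hour
        return distance, f"{hour} o'clock"

    d, direction = guidance((0.2, 0.1), (0.6, 0.7))
    print(f"Target is {d:.2f} map units toward {direction}")
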
We investigated the impact of exploratory head movements on sound localization accuracy using real and virtual 5.1 loudspeaker arrays. Head orientation data in the horizontal plane was provided either by Microsoft Kinect face tracking or by the Oculus Rift's built-in Inertial Measurement Unit (IMU), which resulted in significantly different precision and accuracy of measurements. In both cases, results suggest improvements in virtual source localization accuracy in the front and rear quadrants.
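
The abstract does not describe the rendering pipeline, but any head-tracked virtual loudspeaker array rests on the same geometric step: subtracting the tracked head yaw from each virtual speaker's azimuth, so that sources stay fixed in the room as the listener turns. A minimal sketch of that step, assuming the nominal 5.1 azimuths of ITU-R BS.775; the function name is hypothetical.

    # Nominal 5.1 loudspeaker azimuths in degrees (ITU-R BS.775 layout):
    # 0 = front centre, positive = toward the listener's right.
    SPEAKER_AZIMUTHS = {"C": 0, "L": -30, "R": 30, "Ls": -110, "Rs": 110}

    def head_relative_azimuths(head_yaw_deg):
        """Rotate the virtual array against the tracked head yaw and
        wrap each angle back into the range (-180, 180]."""
        return {name: ((az - head_yaw_deg + 180) % 360) - 180
                for name, az in SPEAKER_AZIMUTHS.items()}

    # Listener turns 30 deg to the right: the front-right speaker should
    # now sit straight ahead (0 deg) in head-relative coordinates.
    print(head_relative_azimuths(30.0)["R"])   # 0.0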

Proceedings of the 7th Annual Irish Human-Computer Interaction Conference (iHCI13), Jun 12, 2013
Visual Music refers to those visual arts which attempt to elicit the feelings normally associated with traditional aural music. From paintings, through film, to computer graphics, dynamic musical qualities are sometimes represented through abstract graphics; the composer may encode subjective associations in mapping musical qualities to the properties and motion of on-screen events, for example. The nature of such cross-domain mappings may be more generally appreciated, however, offering potential avenues of investigation for the design of more effective musical controllers. This paper briefly reviews examples from two pioneering artists in the field and documents ongoing attempts to practically implement the inherent audiovisual relationships involved in new software applications. With the enriched interaction possibilities facilitated by electronic interfaces such as mobile touch-screens, this approach potentially provides enhanced expressive capability and heightens engagement through more natural manipulation of musical material.
Linux Audio Conference 2013, May 9-12 @ IEM, Graz, Austria (Forthcoming)
MorphOSC is a new toolkit for building graphical user interfaces for the control of sound using morphing between parameter presets. It uses the multidimensional interpolation space paradigm seen in some other systems, but hitherto unavailable as open-source software in the form presented here. The software is delivered as a class library for the Processing Development Environment and is cross-platform for desktop computers and Android mobile devices. This paper positions the new library within the context of similar software, introduces the main features of the initial code release and details future work on the project.
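
MorphOSC's exact interpolation scheme is not specified in this summary. A common way to realise the interpolation-space paradigm, shown here purely as a sketch and not as the library's actual algorithm, is inverse-distance weighting: each preset is pinned to a point on a 2D surface, and dragging a cursor blends presets in proportion to their proximity.

    def morph(cursor, presets):
        """Blend parameter presets by inverse-distance weighting.
        presets: list of ((x, y), {param: value}) pairs pinned to the surface."""
        weights, total = [], 0.0
        for (px, py), params in presets:
            d2 = (cursor[0] - px) ** 2 + (cursor[1] - py) ** 2
            if d2 == 0:
                return dict(params)          # cursor exactly on a preset
            w = 1.0 / d2                     # closer presets dominate
            weights.append((w, params))
            total += w
        names = weights[0][1].keys()
        return {n: sum(w * p[n] for w, p in weights) / total for n in names}

    presets = [((0.0, 0.0), {"cutoff": 200.0, "resonance": 0.1}),
               ((1.0, 0.0), {"cutoff": 2000.0, "resonance": 0.8}),
               ((0.5, 1.0), {"cutoff": 800.0, "resonance": 0.4})]
    print(morph((0.5, 0.2), presets))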

Proceedings of the International Conference on New Interfaces for Musical Expression (NIME), May 19, 2012
Development of new musical interfaces often requires experimentation with the mapping of available controller inputs to output parameters. Useful mappings for a particular application may be complex in nature, with one or more inputs being linked to one or more outputs. Existing development environments are commonly used to program such mappings, while code libraries provide powerful data-stream manipulation. However, room exists for a standalone application with a graphical user interface for dynamically patching between inputs and outputs. This paper presents the prototype version of a software tool that allows the user to route control signals in real time. It is cross-platform and runs as a standalone program. The program has a number of potential applications, including fast prototyping of musical interfaces and the real-time re-mapping of controllers to musical parameters during performance.
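
The tool itself is a GUI application; the sketch below only illustrates the underlying idea of a dynamically re-patchable routing layer between named control inputs and musical parameters. All names and transforms here are hypothetical.

    class Router:
        """Minimal dynamic patch-bay: many-to-many routes from controller
        inputs to synthesis parameters, re-patchable while running."""
        def __init__(self):
            self.routes = {}   # (input, output) -> transform function

        def connect(self, inp, out, transform=lambda v: v):
            self.routes[(inp, out)] = transform

        def disconnect(self, inp, out):
            self.routes.pop((inp, out), None)

        def process(self, inp, value):
            """Fan one incoming control value out to every patched output."""
            return {out: f(value)
                    for (i, out), f in self.routes.items() if i == inp}

    r = Router()
    r.connect("fader1", "filter_cutoff", lambda v: 20.0 * 1000 ** v)  # 20 Hz-20 kHz
    r.connect("fader1", "reverb_mix")             # identity mapping
    print(r.process("fader1", 0.5))               # one input drives two outputs
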
Audio Engineering Society 41st Conference: Audio for Games (2011).
Developments in abstract representations of sound from the field of computer music have potential applications for designers of musical computer games. Research in cognition has identified correlations in the perception of visual objects and audio events; experiments show that test subjects associate certain qualities of graphical shapes with those of vocal sounds. Such 'sound symbolism' has been extended to non-vocal sounds, and this paper describes attempts to exploit this and other phenomena in the visualization of audio. The ideas are expanded upon to propose control of sound synthesis through the manipulation of virtual shapes. Mappings between parameters in the auditory and visual feedback modes are discussed. An exploratory user test examines the technique using a prototype system.
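
As a hedged example of how shape qualities might be mapped to sound in such a system (not the paper's actual mapping), one crude 'sound symbolism' descriptor is a polygon's perimeter relative to that of a circle of equal area: rounder ('bouba') shapes read as darker timbres, spikier ('kiki') shapes as brighter ones.

    import math

    def spikiness(vertices):
        """Perimeter relative to a circle of the same area
        (1.0 = circle, larger = spikier)."""
        n = len(vertices)
        area = abs(sum(vertices[i][0] * vertices[(i + 1) % n][1]
                       - vertices[(i + 1) % n][0] * vertices[i][1]
                       for i in range(n))) / 2.0              # shoelace formula
        perim = sum(math.dist(vertices[i], vertices[(i + 1) % n])
                    for i in range(n))
        return perim / (2.0 * math.sqrt(math.pi * area))

    def brightness(vertices, lo=200.0, hi=8000.0):
        """Map spikier shapes to brighter filter cutoffs on a log scale."""
        s = min(max(spikiness(vertices) - 1.0, 0.0), 1.0)     # clamp to [0, 1]
        return lo * (hi / lo) ** s

    square = [(0, 0), (1, 0), (1, 1), (0, 1)]
    star = [(0, 1), (0.2, 0.2), (1, 0), (0.2, -0.2), (0, -1),
            (-0.2, -0.2), (-1, 0), (-0.2, 0.2)]
    print(brightness(square), brightness(star))   # star comes out brighter
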
Proc. of the 13th Int. Conference on Digital Audio Effects (DAFx-10), 2010
A fuzzy logic-based approach can be used to simulate human agents in many control situations. Numerous authors have noted that this methodology has advantages for a variety of tasks within the realm of computer music. In this paper, a review of such projects is conducted and a rudimentary example application of fuzzy logic techniques is presented. This automatically achieves a basic level of 'humanisation' of a drum pattern through strike velocity modification. Such a tool could significantly reduce the time spent on editing individual drum hits in a music production environment and has potential applications for rhythmic composition and performance.
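
A toy version of the fuzzy approach, with invented membership functions and rule weights rather than the paper's actual rules: hits near the beat are accented, off-beat hits are softened, and the rule outputs are defuzzified by weighted average.

    def tri(x, a, b, c):
        """Triangular fuzzy membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def humanise_velocity(beat_pos, base_velocity):
        """Fuzzy-rule velocity adjustment: IF the hit is ON-BEAT THEN
        accent it; IF OFF-BEAT THEN soften it. beat_pos is the position
        within the beat, 0.0-1.0; velocities are MIDI-style 1-127."""
        on_beat = tri(beat_pos, -0.25, 0.0, 0.25) + tri(beat_pos, 0.75, 1.0, 1.25)
        off_beat = tri(beat_pos, 0.25, 0.5, 0.75)
        # Defuzzify by weighted average of the rule outputs
        # (+12 velocity for an accent, -8 to soften).
        total = on_beat + off_beat
        offset = (on_beat * 12 - off_beat * 8) / total if total else 0.0
        return max(1, min(127, round(base_velocity + offset)))

    for pos in (0.0, 0.25, 0.5):
        print(pos, humanise_velocity(pos, 90))   # 102, 90, 82
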
"Analogous perceptual experiences in multiple sensory modalities provide possibilities for abstra... more "Analogous perceptual experiences in multiple sensory modalities provide possibilities for abstract expression using modern computer-based audio-visual systems. This work examines the phenomenon of such cross-modal links with a focus on mappings between the auditory and visual realms.
The design and implementation of a graphics-based sound synthesis controller using such inter-sensory associations is presented. A review of literature relevant to the design of computer-based musical instruments is provided, including discussions of parameter mapping, the use of graphical displays and computer vision as a gesture capture mechanism.
An analysis of the software instruments realised and the physical interface setup is provided. System improvements and possible applications are also discussed with the direction of future work with the system being suggested."
Talks by Liam O Sullivan

Interfaces for computer music systems commonly provide a graphical user interface (GUI) for visual feedback and/or control. These mainly use traditional on-screen widgets (such as dials, sliders and buttons) for modification of parameter values and scheduling of musical events. There is potential for richer, more engaging interfaces through reconsideration of these standard visual presentations, however. This paper outlines a number of applications under development by the author that emphasise the manipulation of visual material as a means of musical control. These prototype interfaces are inspired by the perceptual analogies between domains found in common metaphor, in artistic works, and in empirical results of subjective studies. They afford control of sound and image using a number of approaches. In one example, the metaphor of interacting with a textured image is used to control the parameters of a granular synthesiser. In another, an artistic representation of harmonic structure (inspired by the Visual Music works of James Whitney) allows novel interaction with sound from an additive synthesiser. The sonification and processing of image data is used in a further example to unify material in both sensory domains, providing a true audio-visual instrument. Initial subjective observations are made from the use of the prototypes in the author's arts practice, with more focussed discussion of a particular performance piece that utilises the image sonification method (rasterisation). This treats digital image data in the way music synthesisers treat source signals, affording a level of live musical improvisation not typically seen in audio-visual practice. The author then generalises these observations to suggest future research directions for the development of interfaces which incorporate novel interactive visualization techniques to provide useful, meaningful and engaging feedback in musical systems. The paper is an attempt to stimulate discussion on more experimental use of graphical content in expressive musical interfaces.
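
Rasterisation itself is simple to sketch: the pixels of an image are scanned row by row and read directly as an audio waveform. The following illustration (assuming NumPy; not the author's implementation) loops the scan to fill a requested duration.

    import numpy as np

    def rasterise(image, sample_rate=44100, duration=2.0):
        """Scan a greyscale image row by row and read the pixel values
        directly as an audio waveform. image is a 2D array of
        brightness values in [0, 255]."""
        samples = (image.astype(np.float32).ravel() / 127.5) - 1.0  # map to [-1, 1]
        n = int(sample_rate * duration)
        reps = int(np.ceil(n / samples.size))
        return np.tile(samples, reps)[:n]   # loop the scan to fill the duration

    # A horizontal gradient sonifies as a looping ramp (sawtooth-like) wave.
    img = np.tile(np.arange(256, dtype=np.uint8), (64, 1))
    audio = rasterise(img)
    print(audio.shape, audio.min(), audio.max())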