2009
This paper presents AudioCycle, a prototype application for browsing through music loop libraries. AudioCycle provides the user with a graphical view where the audio extracts are visualized and organized according to their similarity in terms of musical properties, such as timbre, harmony, and rhythm. The user is able to navigate in this visual representation and listen to individual audio extracts, as well as query the database by providing audio examples. AudioCycle draws from a range of technologies, including audio analysis from music information retrieval research, 3D visualization, spatial auditory rendering, and audio time-scaling and pitch modification. The proposed approach extends previously described music and audio browsers. A possible extension to multimedia libraries is also suggested.
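As a rough illustration of the descriptor-based similarity layout this abstract describes, the sketch below (not the AudioCycle implementation; librosa and scikit-learn are assumed, and the loop file names are placeholders) computes timbre, harmony, and rhythm descriptors for a handful of loops and projects them to a 2-D layout in which similar loops land close together.

```python
# Minimal sketch (not the AudioCycle implementation): extract timbre, harmony,
# and rhythm descriptors for a few audio loops and lay them out so that similar
# loops end up close together. Assumes librosa and scikit-learn; file names are
# placeholders.
import numpy as np
import librosa
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

def describe_loop(path):
    y, sr = librosa.load(path, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)      # timbre
    chroma = librosa.feature.chroma_stft(y=y, sr=sr)        # harmony
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)          # rhythm (tempo only)
    return np.concatenate([mfcc.mean(axis=1), chroma.mean(axis=1),
                           np.atleast_1d(tempo).astype(float)[:1]])

paths = ["loop_01.wav", "loop_02.wav", "loop_03.wav"]       # placeholder library
features = StandardScaler().fit_transform([describe_loop(p) for p in paths])

# 2-D stand-in for the 3-D similarity layout the paper describes.
layout = PCA(n_components=2).fit_transform(features)
for path, (x, y) in zip(paths, layout):
    print(f"{path}: ({x:.2f}, {y:.2f})")
```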
2009
This paper presents AudioCycle, a prototype application for browsing through music loop libraries. AudioCycle provides the user with a graphical view where the audio extracts are visualized and organized according to their similarity in terms of musical properties, such as timbre, harmony, and rhythm. The user is able to navigate in this visual representation and listen to individual audio extracts, searching for those of interest. AudioCycle draws from a range of technologies, including audio analysis from music information retrieval research, 3D visualization, spatial auditory rendering, and audio time-scaling and pitch modification. The proposed approach extends previously described music and audio browsers. Concepts developed here will be of interest to DJs, remixers, musicians, and soundtrack composers, as well as to sound designers and foley artists. Possible extensions to multimedia libraries are also suggested.
2014 27th SIBGRAPI Conference on Graphics, Patterns and Images, 2014
Users interact extensively with their personal music collections, typically through standard text-based interfaces that offer constrained functionalities based on assigned metadata or tags. Alternative visual interfaces have been developed, either to display graphical views of music collections that attempt to reflect some chosen property or organization, or to display abstract visual representations of specific songs. Yet, there are many dimensions involved in the perception and handling of music, and mapping musical information into computer-tractable models is a challenging problem. There is a wide variety of possible approaches, and the search for novel strategies to visually represent songs and/or collections persists, targeted either at the general public or at musically trained individuals. In this paper we describe a visual interface to browse music collections that relies on a graphical metaphor designed to convey the basic musical structure of a song. An iconic representation of individual songs is coupled with a spatial placement of groups of songs that reflects their structural similarity. The iconic representation is derived from features extracted from MIDI files, rather than from audio signals. The very nature of MIDI descriptions enables the identification of simple, yet meaningful, musical structures, allowing us to extract features that support both a music comparison function and the generation of the icon. A similarity-based spatial placement is obtained by projecting the feature vectors with the Least Square Projection multidimensional projection technique, with feature similarity evaluated with the Dynamic Time Warping distance function. We describe the process of generating such visual representations and illustrate potentially interesting usage scenarios. Keywords: Visualization of Music Collections, Multidimensional Projection, High-Dimensional Data Visualization, Similarity-based Visualizations.
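The following sketch illustrates the Dynamic Time Warping distance mentioned in this abstract, applied to two toy songs represented as sequences of per-bar feature vectors; the feature values are illustrative, not the paper's actual MIDI descriptors.

```python
# Sketch of a Dynamic Time Warping distance between songs described as
# sequences of per-bar feature vectors (toy features, not the paper's actual
# MIDI descriptors).
import numpy as np

def dtw_distance(a, b):
    """DTW between two sequences of feature vectors, shapes (n, d) and (m, d)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # local distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

# Two toy songs, each bar summarized by a 3-dimensional feature vector.
song_a = np.array([[0.1, 0.4, 0.2], [0.2, 0.5, 0.1], [0.8, 0.1, 0.3]])
song_b = np.array([[0.1, 0.4, 0.3], [0.7, 0.2, 0.3]])
print(dtw_distance(song_a, song_b))
```

DTW allows songs of different lengths or with locally stretched structure to be compared, which is why it is a natural fit for sequence-level similarity before projection.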
128th AES Convention, London, …, 2010
This paper presents a prototype tool for browsing through multimedia libraries using content-based multimedia information retrieval techniques. It is composed of several groups of components for multimedia analysis, data mining, and interactive visualization, as well as for connection with external hardware controllers. The musical application of this tool uses descriptors of timbre, harmony, and rhythm, and two different approaches for exploring/browsing content. First, a dynamic data-mining mode allows the user to group sounds into clusters according to these different criteria, whose relative importance can be weighted interactively. In a second mode, sounds that are similar to a query are returned to the user and can be used to further refine the search. This approach also borrows from multi-criteria optimization concepts to return a relevant list of similar sounds.
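A minimal sketch of the interactively weighted clustering mode described above, assuming scikit-learn; the descriptor group names, their sizes, and the random feature values are hypothetical stand-ins for real timbre/harmony/rhythm descriptors.

```python
# Sketch of interactively weighted clustering (not the tool's actual code):
# each descriptor group (timbre / harmony / rhythm) is scaled by a user weight
# before k-means, so raising a weight makes that criterion dominate the grouping.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_sounds = 40
groups = {"timbre": 13, "harmony": 12, "rhythm": 3}   # hypothetical descriptor sizes
features = {g: rng.normal(size=(n_sounds, d)) for g, d in groups.items()}

def cluster(weights, k=4):
    blocks = [StandardScaler().fit_transform(features[g]) * w
              for g, w in weights.items()]
    X = np.hstack(blocks)
    return KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)

# The user raises the harmony weight and lowers rhythm; the grouping updates.
print(cluster({"timbre": 1.0, "harmony": 2.0, "rhythm": 0.2}))
```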
Computer Music Journal, 2006
2006
Collections of electronic music are mostly organized as playlists based on artist names and song titles. Music genres are inherently ambiguous and, to make matters worse, are assigned manually by a diverse user community. People tend to organize music based on its similarity to other music and on its emotional qualities. Taking this into account, we have designed a music player that derives a set of criteria from the actual music data and then provides a coherent visual metaphor for similarity-based navigation of the music collection.
2010
In this paper we present MuVis, an interactive visualization and exploration tool for large music collections, based on music content and metadata. We combined a user-centered design with three main components: information visualization techniques (based on semantic ordered treemaps), music information retrieval mechanisms (for semantic and content-based information extraction), and dynamic queries, to offer users a more efficient, flexible, and yet easy-to-use solution for browsing music collections and creating playlists. Preliminary results reveal that our solution is faster and easier to use than Windows Media Player, allowing users to navigate more quickly and effectively while gaining a deeper knowledge of their library. A satisfaction survey revealed that users liked our approach for browsing, filtering, and creating playlists, and that they were able to "re-discover" forgotten music thanks to the similarity mechanisms incorporated in our solution.
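A minimal sketch of the dynamic-query idea behind a tool like MuVis, under an assumed data model (not the authors' implementation): filters over metadata and a content-based descriptor recompute instantly as the user adjusts them, and the surviving tracks would feed the treemap view.

```python
# Sketch of a dynamic query over a music library (illustrative data model,
# not MuVis's actual code): the filter is re-evaluated as the user moves
# sliders or toggles genres, and the result drives the visualization.
from dataclasses import dataclass

@dataclass
class Track:
    title: str
    genre: str
    year: int
    energy: float  # content-based descriptor in [0, 1], assumed

library = [
    Track("Song A", "electronic", 2004, 0.8),
    Track("Song B", "jazz",       1999, 0.3),
    Track("Song C", "electronic", 2009, 0.6),
]

def dynamic_query(tracks, genre=None, year_range=None, min_energy=0.0):
    result = tracks
    if genre is not None:
        result = [t for t in result if t.genre == genre]
    if year_range is not None:
        lo, hi = year_range
        result = [t for t in result if lo <= t.year <= hi]
    return [t for t in result if t.energy >= min_energy]

print([t.title for t in dynamic_query(library, genre="electronic", min_energy=0.5)])
```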
2018
This paper describes computational methods for the visual display and analysis of music information. We provide a concise description of the software, music descriptors, and data visualization techniques commonly used in music information retrieval. Finally, we present use cases in which the described software, descriptors, and visualizations are showcased.
2016
FEATUR.UX (Feature ous) is an audio visualisation tool, currently under development, which proposes a new approach to sound visualisation using pre-mixed, independent multitracks and audio feature extraction. Sound visualisation is usually performed on a mixed mono or stereo audio track. Audio feature extraction is commonly used in the field of music information retrieval to build search and recommendation systems for large music databases, rather than to generate live visualisations. Visualising multitrack audio circumvents problems related to the source separation of mixed audio signals and presents an opportunity to examine interdependent relationships within and between separate streams of music. This novel approach to sound visualisation aims to provide an enhanced listening experience in a use case that employs non-tonal, non-notated forms of electronic music. Findings from prior research studies focused on live performance and preliminary quantit...
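A hedged sketch of per-stem feature extraction of the kind this abstract describes, assuming librosa; the stem file names and the feature-to-visual mappings are assumptions for illustration, not the project's actual design.

```python
# Sketch of per-stem analysis for multitrack visualisation (assumed mapping,
# not FEATUR.UX's code): extract frame-level features from each stem and map
# them to visual parameters such as size, brightness, and colour hue.
import numpy as np
import librosa

stems = {"drums": "drums.wav", "bass": "bass.wav", "synth": "synth.wav"}  # placeholders

def visual_params(path):
    y, sr = librosa.load(path, mono=True)
    rms = librosa.feature.rms(y=y)[0]                            # loudness -> size
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)[0]  # brightness
    chroma = librosa.feature.chroma_stft(y=y, sr=sr)             # pitch class -> hue
    return {
        "size": float(np.mean(rms)),
        "brightness": float(np.mean(centroid) / (sr / 2)),       # normalise to [0, 1]
        "hue": float(np.argmax(chroma.mean(axis=1)) / 12.0),
    }

for name, path in stems.items():
    print(name, visual_params(path))
```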