
Abstract

Listening to music affects human brain activity. An emotion-based music player with automated playlist generation can help users maintain a particular emotional state. This research proposes an emotion-based music player that creates playlists from captured photos of the user. Manually sorting a playlist and annotating songs to match the current emotion is time-consuming and tedious. Numerous algorithms have been implemented to automate this process; however, existing approaches are slow, increase system cost by requiring additional hardware (e.g., EEG systems and sensors), and achieve low accuracy. This paper presents an algorithm that automates audio playlist generation, classifies newly added songs, and, as its main task, captures the user's current mood and plays songs accordingly, making the system more efficient, faster, and automatic. The main goals are to reduce the overall computational time and cost of the designed system while increasing its accuracy. A further goal is to shift the user's mood when it is negative, such as sad or depressed. The model is validated by testing the system against user-dependent and user-independent datasets.
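The pipeline described above (capture photo, classify emotion, select a playlist, and redirect negative moods toward uplifting music) can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the classifier is a stub standing in for a facial-expression model, and the song library, function names, and emotion labels are all hypothetical.

```python
# Illustrative sketch of an emotion-based playlist pipeline.
# All names and data here are assumptions, not from the paper.

# Hypothetical song library annotated with one emotion label per track.
LIBRARY = {
    "happy":   ["Track A", "Track B"],
    "sad":     ["Track C"],
    "angry":   ["Track D"],
    "neutral": ["Track E", "Track F"],
}


def classify_emotion(photo_bytes: bytes) -> str:
    """Stub for a facial-expression classifier.

    A real system would run a trained model on the captured photo;
    this placeholder simply returns 'neutral'.
    """
    return "neutral"


def build_playlist(photo_bytes: bytes) -> list:
    """Map the detected emotion to a playlist.

    Following the paper's stated goal of changing a negative mood,
    sad or angry detections are redirected to the 'happy' playlist
    rather than reinforcing the detected emotion.
    """
    emotion = classify_emotion(photo_bytes)
    if emotion in ("sad", "angry"):
        emotion = "happy"
    return LIBRARY[emotion]
```

A usage example: `build_playlist(captured_photo)` would return the tracks tagged with the detected (or redirected) emotion, ready to be queued by the player.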