A user's emotion or mood can be detected from his/her facial expressions, which can be derived from the live feed of the system's camera. A great deal of research is being conducted in the fields of Computer Vision and Machine Learning (ML), where machines are trained to identify various human emotions or moods, and Machine Learning provides a variety of techniques through which human emotions can be detected. People often use music as a means of mood regulation, specifically to change a bad mood, increase their energy level or reduce tension. Listening to the right kind of music at the right time may also improve mental health. Thus, human emotions have a strong relationship with music.
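The live-feed capture described above can be sketched with OpenCV: grab a frame from the system camera and locate the face that a downstream emotion model would classify. This is a minimal illustration assuming OpenCV's bundled Haar cascade; the emotion classifier itself is not shown here.

```python
# A minimal sketch of grabbing a frame from the system camera and locating a face
# with OpenCV's bundled Haar cascade. The downstream emotion classifier is assumed
# to exist elsewhere and is not part of this snippet.
import cv2

# Frontal-face Haar cascade shipped with OpenCV
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_detector = cv2.CascadeClassifier(cascade_path)

cap = cv2.VideoCapture(0)          # default system camera
ok, frame = cap.read()             # single frame from the live feed
cap.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    for (x, y, w, h) in faces:
        face_roi = gray[y:y + h, x:x + w]   # region that would be passed to an emotion classifier
        print("face found at", (x, y, w, h))
```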
IRJET, 2020
Recent studies confirm that humans respond and react to music, and that music has a strong impact on brain activity. The average American listens to up to four hours of music daily, and people tend to listen to music that suits their mood and interests. This project focuses on building an application that suggests songs to the user based on their mood, captured through facial expressions. Facial expression is a form of nonverbal communication. Computer vision is an interdisciplinary field that helps convey a high-level understanding of digital images or videos to computers. In this system, computer vision components are used to determine the user's emotion from facial expressions. Once the emotion is recognized, the system suggests a playlist for that emotion, saving the user a great deal of time over selecting and playing songs manually. The Emotion-Based Music Player also keeps track of user details such as the number of plays for each song and the types of songs by category and interest level, and reorganizes the playlist each time. The system also notifies the user about songs that are never played so that they can be deleted or changed. Listening to music affects human brain activity, and an emotion-based music player with an automated playlist can help users maintain a particular emotional state. This work proposes an emotion-based music player that creates playlists based on captured photos of the user, making the system more efficient, faster and automatic. The main goal is to reduce the overall computational time and the cost of the designed system, while also increasing its accuracy. The most important goal is to change the mood of a person if it is a negative one, such as sad or depressed. The model is validated by testing the system against user-dependent and user-independent datasets. The approach taken is to build a fully functional application (front end and back end): a simple, understandable interface that anyone can use, connected to the back end. On the back end, the core of the project is a convolutional neural network, chosen to achieve a high accuracy rate, since convolutional neural networks are well suited to working with images and many similar research papers have reported strong results in this field. A fully functional desktop application was built, trained on nearly 28,000 images covering four emotional states (Happy, Sad, Angry, and Normal), reaching an accuracy of 85% for training and 83% for testing; the application successfully recommends music by suggesting individual songs that match the user's emotion.
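As a rough illustration of the back end described above, the following Keras sketch builds a small convolutional network for the four emotional states (Happy, Sad, Angry, Normal). The 48x48 grayscale input size and the exact layer stack are assumptions, not details given in the abstract; only the class count and the reported 85%/83% accuracy figures come from the text.

```python
# A minimal sketch of the kind of CNN the abstract describes for four emotion classes
# (Happy, Sad, Angry, Normal). Input size 48x48 grayscale and the layer stack are
# assumptions; the abstract reports ~85% training / ~83% test accuracy on ~28,000 images.
from tensorflow.keras import layers, models

NUM_CLASSES = 4  # Happy, Sad, Angry, Normal

model = models.Sequential([
    layers.Input(shape=(48, 48, 1)),
    layers.Conv2D(32, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Conv2D(128, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_images, train_labels, validation_data=(test_images, test_labels))
```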
IRJET, 2022
A user's emotion or mood can be detected from his/her facial expressions, which can be derived from the live feed of the system's camera. A great deal of research is being conducted in the fields of Computer Vision and Machine Learning, where machines are trained to identify various human emotions or moods, and Machine Learning provides a variety of techniques through which human emotions can be detected. Music is a great connector. Music players and other streaming apps are in high demand because they can be used anytime, anywhere and can be combined with daily activities, travelling, sports, etc. People often use music as a means of mood regulation, specifically to change a bad mood, increase their energy level or reduce tension. Listening to the right kind of music at the right time may also improve mental health. Thus, human emotions have a strong relationship with music. In our proposed system, a mood-based music player is created which performs real-time mood detection and suggests songs according to the detected mood. The objective of this system is to analyze the user's image, predict the user's expression and suggest songs suited to the detected mood.
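The inference step of such a system can be sketched as follows: preprocess a detected face crop, run it through a trained model, and map the predicted mood to a list of songs. The model file name, label order and playlist contents below are hypothetical placeholders, not details from the abstract.

```python
# A sketch of the inference step: classify a detected face crop with a trained model
# and map the predicted mood to a song list. Model file name, label order, and
# playlist contents are hypothetical.
import cv2
import numpy as np
from tensorflow.keras.models import load_model

LABELS = ["angry", "happy", "normal", "sad"]           # assumed label order
PLAYLISTS = {                                          # hypothetical catalogue
    "happy": ["upbeat_track_1", "upbeat_track_2"],
    "sad": ["soothing_track_1", "soothing_track_2"],
    "angry": ["calming_track_1"],
    "normal": ["ambient_track_1"],
}

def suggest_songs(face_roi_gray, model):
    """face_roi_gray: grayscale face crop as a NumPy array."""
    face = cv2.resize(face_roi_gray, (48, 48)).astype("float32") / 255.0
    face = face.reshape(1, 48, 48, 1)                  # batch of one
    mood = LABELS[int(np.argmax(model.predict(face)))]
    return mood, PLAYLISTS[mood]

# model = load_model("emotion_cnn.h5")                 # hypothetical trained weights
# mood, songs = suggest_songs(face_roi, model)
```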
Research Square, 2023
Music has always had a special connection to our emotions; it is a way of connecting people all over the world through pure emotion. With all of this considered, it is exceedingly difficult to generalize music and claim that everyone would enjoy the same type. Mood-based music recommendation is therefore greatly needed, as it can help people relieve stress and listen to soothing music suited to their current emotions. Its primary goal is to accurately predict the user's mood; the application then plays songs based on the user's selection as well as his current mood. This bot recognizes human emotions from facial images using Human-Computer Interaction (HCI). The extraction of facial elements from the user's face is another critical factor. We used a CNN algorithm to accurately detect the user's face in the live webcam feed and to detect emotion based on facial attributes such as the arrangement of the user's mouth and eyes. There is also a questionnaire through which the user can directly select a suitable option.
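The selection logic implied above, where the camera-detected emotion drives the recommendation but an explicit questionnaire answer can override it, reduces to a small rule; the option names below are assumptions for illustration.

```python
# A small sketch of the implied selection logic: the detected emotion is used unless
# the user explicitly picks an option in the questionnaire. Option names are hypothetical.
def choose_mood(detected_emotion, questionnaire_choice=None):
    """Prefer the user's explicit questionnaire answer over the detected emotion."""
    valid = {"happy", "sad", "angry", "neutral"}
    if questionnaire_choice in valid:
        return questionnaire_choice
    return detected_emotion

# choose_mood("sad")            -> "sad"   (camera only)
# choose_mood("sad", "happy")   -> "happy" (questionnaire wins)
```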
Emotion is considered a physiological state that appears whenever an individual observes a transformation in their environment or body. While studying the literature, it has been observed that combining the electrical activity of the brain with other physiological signals for the accurate analysis of human emotions is yet to be explored in greater depth. On the basis of physiological signals, this work proposes a model using machine learning approaches for the calibration of music mood and human emotion. The proposed model consists of three phases: (a) prediction of the mood of the song based on audio signals, (b) prediction of human emotion based on physiological signals using EEG, GSR, ECG and a pulse detector, and finally (c) mapping between the music mood and the human emotion, with classification in real time. Extensive experiments have been conducted on different music mood and human emotion datasets for influential feature extraction, training, testing and performance evaluation. An effort has been made to observe and measure human emotions with a reasonable degree of accuracy and efficiency by recording a person's bio-signals in response to music. Further, to test the applicability of the proposed work, playlists are generated based on the user's real-time emotion, determined using features from different physiological sensors, and the mood depicted by musical excerpts. This work could prove helpful for improving mental and physical health by scientifically analyzing physiological signals.
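Phase (b) of the proposed model could be approximated along these lines: compute simple features from the physiological signals (EEG band power, GSR statistics, mean heart rate) and feed them to a standard classifier. The sampling rate, feature set and choice of a random forest are assumptions for illustration only, not the paper's exact setup.

```python
# A sketch of phase (b): extract simple features from physiological signals and
# classify emotion with a standard scikit-learn model. Sampling rates, feature
# choices and the classifier are assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.ensemble import RandomForestClassifier

def eeg_band_power(eeg, fs=256, band=(8, 13)):
    """Mean power of an EEG channel in a frequency band (default: alpha, 8-13 Hz)."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def make_features(eeg, gsr, heart_rate):
    """One feature vector per recording window."""
    return np.array([
        eeg_band_power(eeg, band=(4, 8)),    # theta power
        eeg_band_power(eeg, band=(8, 13)),   # alpha power
        eeg_band_power(eeg, band=(13, 30)),  # beta power
        gsr.mean(), gsr.std(),               # skin conductance statistics
        heart_rate.mean(),                   # mean heart rate from ECG / pulse detector
    ])

# X = np.stack([make_features(e, g, h) for e, g, h in windows])  # hypothetical windows
# clf = RandomForestClassifier().fit(X, emotion_labels)
```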
International Journal of Research and Innovation in Applied Science
Face emotion detection has recently attracted a lot of interest because of its uses in computer vision and human-computer interaction. Various methods and applications have been suggested and put into use as a result of the ongoing research in this area. In this study, we present an emotion-recognition recommender system that can identify a user's feelings and offer a selection of suitable songs that might lift his spirits. To gather information and enable us to give users a selection of music tracks that are effective at lifting their spirits, a quick search was undertaken to learn how music may impact the user's mood in the short term. The suggested system recognizes emotions, and if the individual is feeling down, a special playlist including the best kinds of music will be played to lift his spirits. On the other hand, if a favorable mood is recognized, an appropriate playlist will be offered that contains several genres of music that will amplify the pleasant fee...
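The playlist rule described above, where a negative detected mood maps to an uplifting playlist while a positive mood is matched and amplified, reduces to a small lookup; the playlist names and mood labels below are hypothetical.

```python
# A sketch of the selection rule: negative moods get an uplifting playlist, positive
# moods get a matching playlist. Playlist names and mood labels are hypothetical.
UPLIFTING = ["feel_good_pop", "upbeat_funk", "cheerful_indie"]
MATCHING = {
    "happy": ["dance_hits", "summer_mix"],
    "calm": ["acoustic_chill", "lo_fi_beats"],
}

def pick_playlist(mood):
    negative = {"sad", "angry", "depressed", "stressed"}
    if mood in negative:
        return UPLIFTING                       # lift the user's spirits
    return MATCHING.get(mood, MATCHING["calm"])  # amplify the pleasant mood
```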
International Journal for Research in Applied Science & Engineering Technology (IJRASET), 2024
This research paper presents the development of an emotional sensor system that combines computer vision techniques with physiological sensors to detect and analyse facial expressions. The system utilizes a webcam for capturing facial images and integrates Arduino-based hardware components such as a pulse sensor, temperature sensor, galvanic skin sensor, and heart rate sensor for comprehensive emotional analysis. The proposed system aims to contribute to various fields such as human-computer interaction, psychology, and healthcare by providing a non-invasive and real-time method for emotion detection and monitoring.
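One way the Arduino-side sensors described above could be read from the host machine is over a serial link with pyserial; the port name and the comma-separated line format assumed to be printed by the Arduino sketch are illustrative assumptions, not details from the paper.

```python
# A sketch of reading the Arduino-side sensors (pulse, temperature, GSR, heart rate)
# over a serial link with pyserial. The port name and the comma-separated line format
# emitted by the Arduino firmware are assumptions.
import serial

def read_sensors(port="/dev/ttyUSB0", baud=9600):
    """Return one sample as a dict; assumes the Arduino prints 'pulse,temp,gsr,hr' per line."""
    with serial.Serial(port, baud, timeout=2) as link:
        line = link.readline().decode("ascii", errors="ignore").strip()
    pulse, temp, gsr, hr = (float(v) for v in line.split(","))
    return {"pulse": pulse, "temperature": temp, "gsr": gsr, "heart_rate": hr}

# sample = read_sensors()
# These values can then be fused with the webcam-based expression estimate.
```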
Human expression plays a vital role in determining the current state and mood of an individual; it helps in extracting and understanding the emotion an individual feels based on various features of the face such as the eyes, cheeks, forehead, or even the curve of the smile. Music is an art form that soothes and calms the human brain and body. Blending these two aspects together, our project detects the emotion of an individual through facial expression and plays music according to the detected mood, to alleviate the mood or simply calm the individual, while also finding a suitable song more quickly and saving the time spent looking up different songs. In parallel, we develop software that can be used anywhere and provides the functionality of playing music according to the detected emotion. By developing a recommendation system, it can assist a user in deciding which music to listen to, helping the user reduce his/her stress levels. The user does not have to waste any time searching or looking up songs; the best track matching the user's mood is detected, and songs are shown to the user according to his/her mood. The image of the user is captured with a webcam, and based on the user's mood/emotion an appropriate song from the user's playlist is shown, matching the user's requirement.
International journal of scientific research in science, engineering and technology, 2024
2019 International Conference on Signal Processing and Communication (ICSC), 2019
The ability of music to produce an emotional response in its listeners is one of its most exciting and yet least understood properties. Music not only conveys emotion and meaning but can also stir a listener's mood. This paper studies various classification-based algorithms to provide a clear methodology to i) classify songs into 4 mood categories and ii) detect the user's mood through his facial expressions, and then combines the two to generate a user-customized music playlist. Songs have been classified by two approaches: by directly training models, namely KNN, Support Vector Machines (SVM), Random Forest and MLP, on selected audio features, and by predicting a song's arousal and valence values from these audio features. The first approach attains a maximum accuracy of 70% using MLP, while the latter achieves an accuracy of 81.6% using SVM regression. The face mood classifier, using a Haar cascade classifier and the Fisherface algorithm, attains a precision of 92%.
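The second approach, predicting a song's arousal and valence from audio features and then mapping the pair to one of four mood categories, can be sketched with librosa and scikit-learn. The particular feature set (MFCC and tempo summaries) and the quadrant thresholds are assumptions; the paper's exact feature selection is not reproduced here.

```python
# A sketch of the arousal/valence approach: regress a song's arousal and valence from
# audio features and map the two values onto four mood quadrants. The feature set and
# thresholds are assumptions, not the paper's exact setup.
import numpy as np
import librosa
from sklearn.svm import SVR

def audio_features(path):
    y, sr = librosa.load(path, duration=30.0)          # 30-second excerpt
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1), np.atleast_1d(tempo)])

def mood_quadrant(valence, arousal):
    """Russell-style quadrants on values assumed to be scaled to [-1, 1]."""
    if valence >= 0:
        return "happy" if arousal >= 0 else "calm"
    return "angry" if arousal >= 0 else "sad"

# X = np.stack([audio_features(p) for p in song_paths])   # hypothetical song corpus
# valence_model = SVR().fit(X, valence_labels)
# arousal_model = SVR().fit(X, arousal_labels)
```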
Listening to music affects human brain activity. An emotion-based music player with an automated playlist can help users maintain a particular emotional state. This research proposes an emotion-based music player that creates playlists based on captured photos of the user. Manually sorting a playlist and annotating songs in accordance with the current emotion is time consuming and quite tedious. Numerous algorithms have been implemented to automate this process; however, existing algorithms are slow, increase the cost of the system by using additional hardware (e.g. EEG systems and sensors) and achieve relatively low accuracy. This paper presents an algorithm that not only automates the process of generating an audio playlist but also classifies newly added songs; the main task is to capture the current mood of the person and play songs accordingly. This makes the system more efficient, faster and automatic. The main goal is to reduce the overall computational time and the cost of the designed system. It also aims at increasing the accuracy of the system. The most important goal is to change the mood of a person if it is a negative one, such as sad or depressed. The model is validated by testing the system against user-dependent and user-independent datasets.