Papers by Danilo De Rossi

Journal of Human-Robot Interaction, Jun 1, 2014
Human-Robot Interaction (HRI) studies have recently received increasing attention in various fields, from academic communities to engineering firms and the media. Many researchers have been focusing on the development of tools to evaluate the performance of robotic systems and studying how to extend the range of robot interaction modalities and contexts. Because people are emotionally engaged when interacting with computers and robots, researchers have been focusing attention on the study of affective human-robot interaction. This new field of study requires the integration of various approaches typical of different research backgrounds, such as psychology and engineering, to gain more insight into human-robot affective interaction. In this paper, we report the development of a multimodal acquisition platform called HIPOP (Human Interaction Pervasive Observation Platform). HIPOP is a modular data-gathering platform based on various hardware and software units that can be easily used to create a custom acquisition setup for HRI studies. The platform uses modules for physiological signals, eye gaze, video and audio acquisition to perform an integrated affective and behavioral analysis. It is also possible to include new hardware devices in the platform. The open-source hardware and software revolution has made many high-quality commercial and open-source products freely available for HRI and HCI research. These devices are currently most often used for data acquisition and robot control, and they can be easily included in HIPOP. Technical tests demonstrated the ability of HIPOP to reliably acquire a large set of data in terms of failure management and data synchronization. The platform was able to automatically recover from errors and faults without affecting the entire system, and the misalignment observed in the acquired data was not significant and did not affect the multimodal analysis. HIPOP was also tested in the context of the FACET (FACE Therapy) project, in which a humanoid robot called FACE (Facial Automaton for Conveying Emotions) was used to convey affective stimuli to children with autism. In the FACET project, psychologists without technical skills were able to use HIPOP to collect the data needed for their experiments without dealing with hardware issues, data integration challenges, or synchronization problems. The FACET case study highlighted the core feature of the HIPOP platform, i.e., multimodal data integration and fusion. This analytical approach allowed psychologists to study both behavioral and psychophysiological reactions to obtain a more complete view of the subjects' state during interaction with the robot. These results indicate that HIPOP could become an innovative tool for affective HRI studies aimed at inferring a more detailed view of a subject's feelings and behavior during interaction with affective and empathic robots.
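
The abstract stresses two engineering points: per-module fault isolation and timestamp-based synchronization of heterogeneous streams. As a rough illustration of that idea only (not the actual HIPOP code; the class name, rates and queue-based design below are hypothetical), a threaded acquisition module in Python might look like this sketch:

```python
import queue
import threading
import time


class AcquisitionModule:
    """Minimal sketch: each device runs in its own thread and pushes
    timestamped samples to a shared queue, so downstream code can align
    heterogeneous streams (e.g., ECG, eye gaze, audio) on a common clock."""

    def __init__(self, name, read_sample, rate_hz, out_queue):
        self.name = name
        self.read_sample = read_sample   # callable returning one raw sample
        self.period = 1.0 / rate_hz
        self.out_queue = out_queue
        self._stop = threading.Event()

    def start(self):
        threading.Thread(target=self._run, daemon=True).start()

    def stop(self):
        self._stop.set()

    def _run(self):
        while not self._stop.is_set():
            try:
                sample = self.read_sample()
            except Exception:
                # A failing device should not bring down the whole platform:
                # skip this sample and keep trying (simple fault isolation).
                time.sleep(self.period)
                continue
            # Tag every sample with a shared monotonic timestamp for alignment.
            self.out_queue.put((self.name, time.monotonic(), sample))
            time.sleep(self.period)


if __name__ == "__main__":
    # Two mock devices feeding one common, timestamped stream.
    stream = queue.Queue()
    ecg = AcquisitionModule("ecg", lambda: 0.0, rate_hz=100, out_queue=stream)
    gaze = AcquisitionModule("gaze", lambda: (0.5, 0.5), rate_hz=30, out_queue=stream)
    ecg.start(); gaze.start()
    time.sleep(0.1)
    ecg.stop(); gaze.stop()
    while not stream.empty():
        print(stream.get())
```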

In this work, the development of a sensing seat for human authentication is reported. Such a system can be used in all critical scenarios where a seat is available to the human subject. To address the authentication task, the sensing seat was developed by means of a novel unobtrusive sensing technology. This choice was mainly driven by two factors: the unavailability of an existing sensing-seat system for human authentication, and the inadequacy of existing sensor technology for both recognizing the human subject and integrating the sensors into the seat. Building on a redundant sensor network, we adopted a hierarchical architecture: three cooperating classifiers (a distance-based classifier, a KSOM and an MLP) share the input data and supply three different classification results. A final classifier (a weighted averager) fuses these results and supplies the final response.
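
As a rough illustration of the final fusion stage described above, the sketch below shows generic weighted-average fusion of per-class scores from three classifiers. It is not the paper's trained system: the score vectors and reliability weights are made up for the example.

```python
import numpy as np


def weighted_average_fusion(scores, weights):
    """Fuse per-class score vectors from several classifiers with a weighted average.

    scores:  list of arrays, one per classifier, each of shape (n_classes,)
    weights: array of shape (n_classifiers,), summing to 1
    Returns the index of the winning class."""
    scores = np.vstack(scores)        # (n_classifiers, n_classes)
    fused = weights @ scores          # weighted average per class
    return int(np.argmax(fused))


# Made-up outputs standing in for the distance-based classifier, the KSOM and the MLP:
distance_scores = np.array([0.2, 0.7, 0.1])
ksom_scores     = np.array([0.3, 0.5, 0.2])
mlp_scores      = np.array([0.1, 0.8, 0.1])
weights         = np.array([0.2, 0.3, 0.5])   # hypothetical reliability weights

print(weighted_average_fusion([distance_scores, ksom_scores, mlp_scores], weights))
```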

Zenodo (CERN European Organization for Nuclear Research), May 7, 2003
Electronic noses, instruments for automatic recognition of odours, are typically composed of an array of partially selective sensors, a sampling system, a data acquisition device and a data processing system. For the purpose of evaluating the quality of olive oil, an electronic nose based on an array of conducting polymer sensors capable of discriminating olive oil aromas was developed. The selection of suitable pattern recognition techniques for a particular application can enhance the performance of electronic noses. Therefore, an advanced neural recognition algorithm for improving the measurement capability of the device was designed and implemented. This method combines multivariate statistical analysis and a hierarchical neural-network architecture based on self-organizing maps and error back-propagation. The complete system was tested using samples composed of characteristic olive oil aromatic components in refined olive oil. The results obtained show that this approach is effective in grouping aromas into different categories representative of their chemical structure.
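
A minimal sketch of the kind of pipeline the abstract describes: multivariate statistics (here PCA) feeding a back-propagation-trained network. The data, layer sizes and number of aroma categories are invented for illustration, and the intermediate self-organizing-map stage of the paper's hierarchical architecture is omitted; this is not the authors' actual system.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Toy data: rows are sensor-array responses (one value per conducting-polymer
# sensor), labels are aroma categories. Real data would come from the e-nose.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 12))       # 60 measurements, 12 sensors (hypothetical)
y = rng.integers(0, 3, size=60)     # 3 aroma categories (hypothetical)

# Multivariate statistics (PCA) followed by a back-propagation-trained network.
model = make_pipeline(
    PCA(n_components=5),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
model.fit(X, y)
print(model.predict(X[:5]))
```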

EMBEC & NBC 2017, 2017
Thanks to new advanced tools and innovative methods for imaging deep in the brain at cell resolution, neuroanatomy is quickly redefining its protocols for quantitatively studying neurons in their own three-dimensional arrangement. The huge amount of data generated has to be managed and shared among labs: this need led us to develop DataBrain, an online archive of three-dimensional single-neuron reconstructions and their associated morphometrics. The DataBrain interface allows users to upload and download data, to easily search for neurons using filters, and to view online both three-dimensional reconstructions and morphological parameters. Here we describe DataBrain's main features and show an example of how it can be used to store morphological quantitative datasets of Purkinje cells from murine clarified cerebellum slices acquired using a confocal microscope.

Sensors, 2015
Bipolar disorder is one of the most common mood disorders, characterized by large and invalidating mood swings. Several projects focus on the development of decision support systems that monitor and advise patients, as well as clinicians. Voice monitoring and speech signal analysis can be exploited to reach this goal. In this study, an Android application was designed for analyzing running speech using a smartphone device. The application can record audio samples and estimate the speech fundamental frequency, F0, and its changes. F0-related features are estimated locally on the smartphone, with some advantages with respect to remote processing approaches in terms of privacy protection and reduced upload costs. The raw features can be sent to a central server and further processed. The quality of the audio recordings, the reliability of the algorithm and the performance of the overall system were evaluated in terms of voiced segment detection and feature estimation. The results demonstrate that the mean F0 of each voiced segment can be reliably estimated, thus describing prosodic features across the speech sample. In contrast, features related to F0 variability within each voiced segment performed poorly. A case study performed on a bipolar patient is presented.
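
The abstract does not specify the F0 algorithm, so the following is only a toy illustration of per-frame pitch estimation by autocorrelation, with an arbitrary voicing threshold and a synthetic test frame (none of these choices are taken from the paper):

```python
import numpy as np


def frame_f0_autocorr(frame, fs, fmin=75.0, fmax=400.0):
    """Estimate F0 of one speech frame via the autocorrelation peak.
    Returns 0.0 when the frame looks unvoiced (weak periodicity)."""
    frame = frame - np.mean(frame)
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    if ac[0] <= 0:
        return 0.0
    ac = ac / ac[0]                             # normalize
    lag_min = int(fs / fmax)                    # search plausible pitch lags
    lag_max = int(fs / fmin)
    lag = lag_min + int(np.argmax(ac[lag_min:lag_max]))
    return fs / lag if ac[lag] > 0.3 else 0.0   # 0.3: crude voicing threshold


# Example on a synthetic 150 Hz voiced frame
fs = 8000
t = np.arange(0, 0.04, 1 / fs)
frame = np.sin(2 * np.pi * 150 * t)
print(round(frame_f0_autocorr(frame, fs), 1))   # close to 150 Hz
```
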
In this study, we propose an automatic method for separating convolutive mixtures. The independent components are extracted by frequency-domain analysis, where the convolutive model can be solved with an instantaneous mixing model approach. The signals are then reconstructed in the observation space, resolving the ICA model ambiguities. Simulations are carried out to test the validity of the proposed method.
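
In standard notation (not taken from the paper), the reduction the abstract relies on can be written as follows: a time-domain convolutive mixture becomes, after a short-time Fourier transform with a window longer than the mixing filters, an approximately instantaneous mixture in each frequency bin, so ordinary ICA can be applied bin by bin before resolving the resulting scaling and permutation ambiguities.

```latex
% Convolutive mixing in the time domain:
x_i(t) = \sum_{j} \sum_{k} a_{ij}(k)\, s_j(t-k)
% After an STFT whose window exceeds the mixing-filter length, each
% frequency bin f is approximately an instantaneous mixture over frames \tau:
X(f,\tau) \approx A(f)\, S(f,\tau)
```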

Frontiers in Bioengineering and Biotechnology, 2015
Non-verbal signals expressed through body language play a crucial role in multi-modal human communication during social relations. Indeed, in all cultures, facial expressions are the most universal and direct signs for expressing innate emotional cues. A human face conveys important information in social interactions and helps us to better understand our social partners and establish empathic links. Recent research shows that humanoid and social robots are becoming increasingly similar to humans, both esthetically and expressively. However, their visual expressiveness is a crucial issue that must be improved to make these robots more realistic and intuitively perceivable by humans as not different from them. This study concerns the capability of a humanoid robot to exhibit emotions through facial expressions. More specifically, emotional signs performed by a humanoid robot were compared with corresponding human facial expressions in terms of recognition rate and response time. The set of stimuli included standardized human expressions taken from an Ekman-based database and the same facial expressions performed by the robot. Furthermore, participants' psychophysiological responses were explored to investigate whether there could be differences induced by interpreting robot or human emotional stimuli. Preliminary results show a trend toward better recognition of expressions performed by the robot than of 2D photos or 3D models. Moreover, no significant differences in the subjects' psychophysiological state were found during the discrimination of facial expressions performed by the robot in comparison with the same task performed with 2D photos and 3D models.

Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2011
People with ASD (Autism Spectrum Disorders) have difficulty managing interpersonal relationships and common social situations of everyday life. A modular platform for Human Robot Interaction and Human Machine Interaction studies has been developed to manage and analyze therapeutic sessions in which subjects are guided by a psychologist through simulated social scenarios. This innovative therapeutic approach uses a humanoid robot called FACE capable of expressing and conveying emotions and empathy. Using FACE as a social interlocutor, the psychologist can emulate real-life scenarios in which the emotional state of the interlocutor is adaptively adjusted through a semi-closed-loop control algorithm that uses the ASD subject's inferred "affective" state as input. Preliminary results demonstrate that the platform is well accepted by subjects with ASD and can consequently be used as a novel therapy for social skills training.

Today's increasingly large and complex databases require novel and machine-aided ways of exploring data. To optimize the selection and presentation of data, we suggest an unconventional approach: instead of relying exclusively on explicit user input to specify relevant information or to navigate through a data space, we additionally exploit the power and potential of the users' unconscious processes. To this end, the user is immersed in a mixed-reality environment while their bodily reactions are captured using unobtrusive wearable devices. The users' reactions are analyzed in real time and mapped onto higher-level psychological states, such as surprise or boredom, in order to trigger appropriate system responses that direct the users' attention to areas of potential interest in the visualizations. The realization of such a close experience-based human-machine loop raises a number of technical challenges, such as the real-time interpretation of psychological user states. This paper describes a sensing architecture for empathetic data systems that has been developed as part of such a loop and how it tackles these diverse challenges.

2009 Ninth International Conference on Intelligent Systems Design and Applications, 2009
The profound, pervasive and enduring consequences of an ageing population present enormous challenges as well as enormous opportunities for Information and Communication Technology. The EU-funded OASIS project, a Large Scale Integrated Project, aims to develop an open and innovative reference architecture, based upon ontologies and semantic services, that will allow plug-and-play and cost-effective interconnection of existing and new services in all domains required for the independent and autonomous living of the elderly and their enhanced quality of life. Among other technological advances, in OASIS we are developing a smart multisensorial platform for monitoring lower limb movements as well as muscular activations. We are using unobtrusive integrated sensors to transduce posture and kinematic variables and to acquire surface electromyography (sEMG). The platform is able to analyze and merge the sEMG signals and kinematic variables to provide a single coherent dynamic description of the acquired movements. A Predictive Dynamic Model (PDM) based on machine learning techniques assesses physiological muscular recruitment as well as muscular fatigue and physiological conditions.
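
A minimal sketch of the signal-merging step described above, assuming hypothetical sampling rates and a moving-RMS envelope as the activation measure (the abstract does not detail the actual fusion algorithm):

```python
import numpy as np


def semg_rms_envelope(emg, fs, win_s=0.1):
    """Moving-RMS envelope of a surface-EMG signal (a common activation measure)."""
    win = max(1, int(win_s * fs))
    padded = np.pad(emg.astype(float) ** 2, (win // 2, win - win // 2 - 1), mode="edge")
    kernel = np.ones(win) / win
    return np.sqrt(np.convolve(padded, kernel, mode="valid"))


def merge_streams(t_emg, emg_env, t_kin, joint_angle):
    """Resample the sEMG envelope onto the (slower) kinematic time base so that
    muscular activation and joint angle can be analyzed as one coherent record."""
    env_on_kin = np.interp(t_kin, t_emg, emg_env)
    return np.column_stack([t_kin, joint_angle, env_on_kin])


# Synthetic example: 1 s of sEMG at 1 kHz and a knee angle at 100 Hz (hypothetical rates)
fs_emg, fs_kin = 1000, 100
t_emg = np.arange(0, 1, 1 / fs_emg)
t_kin = np.arange(0, 1, 1 / fs_kin)
emg = np.random.default_rng(0).normal(scale=1 + np.sin(2 * np.pi * t_emg) ** 2, size=t_emg.size)
knee = 30 + 20 * np.sin(2 * np.pi * t_kin)
print(merge_streams(t_emg, semg_rms_envelope(emg, fs_emg), t_kin, knee)[:3])
```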

Frontiers in Neuroscience, 2014
Compared to standard laboratory protocols, the measurement of psychophysiological signals in real-world experiments poses technical and methodological challenges due to external factors that cannot be directly controlled. To address this problem, we propose a hybrid approach based on an immersive and human-accessible space called the eXperience Induction Machine (XIM), which incorporates the advantages of a laboratory within a life-like setting. The XIM integrates unobtrusive wearable sensors for the acquisition of psychophysiological signals suitable for ambulatory emotion research. In this paper, we present results from two different studies conducted to validate the XIM as a general-purpose sensing infrastructure for the study of human affective states under ecologically valid conditions. In the first investigation, we recorded and classified signals from subjects exposed to pictorial stimuli corresponding to a range of arousal levels, while they were free to walk and gesticulate. In the second study, we designed an experiment that follows the classical conditioning paradigm, a well-known procedure in the behavioral sciences, with the additional feature that participants were free to move in the physical space, as opposed to similar studies measuring physiological signals in constrained laboratory settings. Our results indicate that, by using our sensing infrastructure, it is indeed possible to infer human event-elicited affective states through measurements of psychophysiological signals under ecological conditions.

2012 International Conference on Privacy, Security, Risk and Trust and 2012 International Conference on Social Computing, 2012
Autism Spectrum Disorder (ASD) is a neurodevelopmental disorder characterized by specific patterns of behavioral and social difficulties. Beyond these core symptoms, additional problems are often present, such as difficulties in identifying gender differences and interactional distortions of environmental and family responses. Taking these emotional and behavioral problems into account, researchers and clinicians are working hard to design innovative therapeutic approaches aimed at improving the social capabilities of subjects with ASD. Thanks to the technological and scientific progress of recent years, it is now possible to create human-like robots with social and emotional capabilities. Furthermore, it is also possible to analyze physiological signals to infer subjects' psychophysiological state, which can be compared with a behavioral analysis in order to obtain a deeper understanding of subjects' reactions to treatments. In this work, a preliminary evaluation of an innovative social robot-based treatment for subjects with ASD is described. The treatment consists of a complex stimulation and acquisition platform composed of a social robot, a multi-parametric acquisition system and a therapeutic protocol. During the preliminary tests of the treatment, the subjects' physiological signals and behavioral parameters were recorded and used, together with the therapists' annotations, to infer the subjects' induced reactions. Physiological signals were analyzed and statistically evaluated, demonstrating the possibility of correctly discriminating the two groups (subjects with ASD and typically developing subjects) with a classification accuracy higher than 92%. Statistical analysis also highlighted the treatment's capability to induce different affective states in subjects with ASD more than in control subjects, demonstrating that the treatment is well designed and tuned to the deficits and behavioral impairments associated with ASD.

Frontiers in Bioengineering and Biotechnology, 2014
We describe here a wearable, wireless, compact and lightweight tactile display able to mechanically stimulate the fingertip of users, so as to simulate contact with soft bodies in virtual environments. The device was based on dielectric elastomer actuators, as high-performance electromechanically active polymers. The actuator was arranged at the user's fingertip, integrated within a plastic case, which also hosted compact high-voltage circuitry. A custom-made wireless control unit was arranged on the forearm and connected to the display via low-voltage leads. We present the structure of the device and its characterization in terms of electromechanical response and stress relaxation. Furthermore, we present the results of a psychophysical test aimed at assessing the ability of the system to generate different levels of force that can be perceived by users.

Proceedings of the 2014 Virtual Reality International Conference, 2014
The development of systems that allow multimodal interpretation of human-machine interaction is crucial to advance our understanding and validation of theoretical models of user behavior. In particular, a system capable of collecting, perceiving and interpreting unconscious behavior can provide rich contextual information for an interactive system. One possible application for such a system is the exploration of complex data through immersion, where massive amounts of data are generated every day, both by humans and by computer processes that digitize information at different scales and resolutions, thus exceeding our processing capacity. We need tools that accelerate our understanding of and hypothesis generation over these datasets, guide our searches and prevent data overload. We describe XIMengine, a bio-inspired software framework designed to capture and analyze multimodal human behavior in an immersive environment. The framework allows studies that can advance our understanding of the use of conscious and unconscious reactions in interactive systems.
2009 Ninth International Conference on Intelligent Systems Design and Applications, 2009
This paper presents a feasibility study and the development of a Field Programmable Gate Array (FPGA)-based system for recognizing the most significant cardiac arrhythmias by means of a Kohonen Self-Organizing Map. A feasibility study of an implementation on the Xilinx Virtex-4 FX12 FPGA is proposed, in which QRS complexes are extracted and classified in real time as normal or pathological. The complete digital implementation is validated for integration into wearable cardiac monitoring systems.
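
To make the classification idea concrete, here is a minimal self-organizing-map sketch in Python operating on made-up two-dimensional QRS features. The real system runs on an FPGA with its own feature set and map size, so everything below (features, map dimensions, learning schedule) is a hypothetical illustration of the technique, not the paper's implementation.

```python
import numpy as np


class TinySOM:
    """Minimal Kohonen self-organizing map for small feature vectors
    (e.g., a few morphological features of a QRS complex)."""

    def __init__(self, rows, cols, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(size=(rows, cols, dim))
        self.grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                         indexing="ij"), axis=-1).astype(float)

    def winner(self, x):
        d = np.linalg.norm(self.w - x, axis=-1)
        return np.unravel_index(np.argmin(d), d.shape)

    def train(self, data, epochs=50, lr0=0.5, sigma0=1.5):
        for e in range(epochs):
            lr = lr0 * (1 - e / epochs)
            sigma = max(0.5, sigma0 * (1 - e / epochs))
            for x in data:
                bmu = np.array(self.winner(x), dtype=float)
                dist2 = np.sum((self.grid - bmu) ** 2, axis=-1)
                h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]  # neighborhood
                self.w += lr * h * (x - self.w)


# Hypothetical 2-D QRS features (e.g., normalized width and amplitude):
rng = np.random.default_rng(1)
normal = rng.normal([0.3, 0.8], 0.05, size=(40, 2))
pathologic = rng.normal([0.7, 0.4], 0.05, size=(40, 2))
som = TinySOM(4, 4, 2)
som.train(np.vstack([normal, pathologic]))

# In a full classifier one would label each map node by the majority class of
# the beats it wins; here we only print the winning nodes for one normal-like
# and one pathologic-like feature vector to show the mapping.
print(som.winner([0.32, 0.78]), som.winner([0.68, 0.42]))
```
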
Physical Review E, 2008
The complex dynamics of intracellular calcium regulates cellular responses to information encoded in extracellular signals. Here, we study the encoding of these external signals in the context of the Li-Rinzel model. We show that, by controlling biophysical parameters, the information can be encoded in amplitude modulation (AM), frequency modulation (FM) or mixed (AM and FM) modulation. We briefly discuss the possible implications of this new role of information encoding for astrocytes.
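
For reference, the Li-Rinzel model is a two-variable reduction of the De Young-Keizer IP3-receptor model. The schematic form below follows the commonly used formulation (symbols and groupings are standard ones written from that formulation, not copied from the paper, with volume-ratio factors folded into the flux coefficients):

```latex
\frac{dc}{dt} = J_{\mathrm{chan}} + J_{\mathrm{leak}} - J_{\mathrm{pump}}, \qquad
\frac{dh}{dt} = \frac{h_\infty(c) - h}{\tau_h(c)},
\quad \text{with} \quad
J_{\mathrm{chan}} = v_1\, m_\infty^3\, n_\infty^3\, h^3\, (c_{\mathrm{ER}} - c), \;
J_{\mathrm{leak}} = v_2\, (c_{\mathrm{ER}} - c), \;
J_{\mathrm{pump}} = \frac{v_3\, c^2}{c^2 + k_3^2}
```

Here $c$ is the cytosolic Ca$^{2+}$ concentration, $h$ the fraction of non-inactivated IP$_3$ receptors, and $m_\infty$, $n_\infty$, $h_\infty$ are Hill-type functions of the IP$_3$ and Ca$^{2+}$ concentrations.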

Measurement Science and Technology, 2006
Electronic nose (e-nose) architectures usually consist of several modules that process various tasks such as control, data acquisition, data filtering, feature selection and pattern analysis. Heterogeneous techniques derived from chemometrics, neural networks and fuzzy rules used to implement such tasks may lead to issues concerning module interconnection and cooperation. Moreover, a new learning phase is mandatory once new measurements have been added to the dataset, thus causing changes in the previously derived model. Consequently, if a loss of the previous learning occurs (catastrophic interference), real-time applications of e-noses are limited. To overcome these problems, this paper presents an architecture for the dynamic and efficient management of multi-transducer data processing techniques and for saving an associative short-term memory of the previously learned model. The architecture implements an artificial model of a hippocampus-based working memory, enabling the system to be ready for real-time applications. Starting from the base models available in the architecture core, dedicated models for neurons, maps and connections were tailored to an artificial olfactory system devoted to analysing olive oil. To verify the associative and short-term memory capabilities of the processing architecture, a paired-associate learning test was applied. The avoidance of catastrophic interference was observed.

IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2005
It is well documented that the processing of social and emotional information is impaired in people with autism. Recent studies have shown that individuals, particularly those with high-functioning autism, can learn to cope with common social situations if they are made to enact, during therapy, possible scenarios they may encounter in real life. The main aim of this work is to describe an interactive life-like facial display (FACE) and a supporting therapeutic protocol that will enable us to verify whether the system can help children with autism to learn, identify, interpret and use emotional information and extend these skills in a socially appropriate, flexible and adaptive context. The therapeutic setup consists of a specially equipped room in which the subject, under the supervision of a therapist, can interact with FACE. The android display and its associated control system provide automatic facial tracking, expression recognition and eye tracking. The treatment scheme is based on a series of therapist-guided sessions in which a patient communicates with FACE through an interactive console. Preliminary data regarding the exposure of two children to FACE are reported.