2014
Explanations in recommender systems have gained increasing importance in recent years. Explanations have been found to help increase users' acceptance of collaborative filtering recommender systems, to help them make decisions more quickly, to convince them to buy, and even to develop trust in the system as a whole. They can also help in decision support and problem solving. While the majority of research has focused on the algorithms behind recommender systems, little emphasis has been put on the interface, which is crucial for improving user experience, especially since communicating reasoning to users is considered an important aspect of assessing recommender systems. The importance of this paper is that it lies in the area of controlling the recommendation process, which has gained little attention so far. The focus is on the visualization of explanations in recommender systems. We examine which modalities (e.g., text, graphs, tables, and images) can better present explanations to users, through a review of a selection of papers from the literature over the last few years. The results show that explanations with simple graphs and descriptions work better, meaning that complex graphical interfaces can confuse users. The rest of this paper is organized as follows: the next section gives an introduction to explanations in recommender systems. Then, we discuss the relationship between visualization of explanations and other disciplines such as human-computer interaction and decision making. We then discuss the different methods of information visualization, especially those used when explanations are involved. The paper ends with conclusions and perspectives for future work.
2017
This report discusses explanations in the domain of recommender systems: a review of the research papers in the domain, the different explanation interfaces and evaluation criteria, our vision in this domain, and its application to the e-learning project “METAL”.
2013
Recommender systems are software tools that supply users with suggestions for items to buy. However, many recommender systems have been found to function as black boxes, providing no transparency or information on how their internal parts work. Therefore, explanations have been used to show why a specific recommendation was provided. The importance of explanations has been demonstrated in a number of fields such as expert systems, decision support systems, intelligent tutoring systems and data explanation systems. It was found that failing to generate a suitable explanation might degrade the performance of recommender systems, their applicability and eventually their value for monetization. Our goal in this paper is to provide a comprehensive review of the main research fields of explanations in recommender systems, along with suitable examples from the literature. Open challenges in the field are also presented. The results show that most of the work in the field focuses on the set of characteristics that can be associated with explanations: transparency, validity, scrutability, trust, relevance, persuasiveness, comprehensibility, effectiveness, efficiency, satisfaction and education. All of these characteristics can increase the system's trustworthiness. Other research areas include explanation interfaces, over- and underestimation, and decision making.
User Modeling and User-Adapted Interaction, 2012
When recommender systems present items, these can be accompanied by explanatory information. Such explanations can serve seven aims: effectiveness, satisfaction, transparency, scrutability, trust, persuasiveness, and efficiency. These aims can be incompatible, so any evaluation needs to state which aim is being investigated and use appropriate metrics. This paper focuses particularly on effectiveness (helping users to make good decisions) and its trade-off with satisfaction. It provides an overview of existing work on evaluating effectiveness and the metrics used. It also highlights the limitations of the existing effectiveness metrics, in particular the effects of under- and overestimation and of recommendation domain. In addition to this methodological contribution, the paper presents four empirical studies in two domains: movies and cameras. These studies investigate the impact of personalizing simple feature-based explanations on effectiveness and satisfaction. Both approximated and real effectiveness are investigated. Contrary to expectation, personalization was detrimental to effectiveness, though it may improve user satisfaction. The studies also highlighted the importance of considering opt-out rates and the underlying rating distribution when evaluating effectiveness.
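A common way to operationalize effectiveness in this line of work is to compare a user's rating of an item before consumption (based on the explanation alone) with the rating after consumption; overestimation and underestimation are the signed directions of that gap. The sketch below illustrates that idea; the function and variable names are illustrative, not taken from the paper.

```python
# Sketch of an effectiveness metric: the gap between a user's rating of an item
# before consuming it (judged from the explanation) and after consuming it.
# A positive gap indicates overestimation, a negative gap underestimation.

def effectiveness_gap(rating_before: float, rating_after: float) -> float:
    """Signed gap: positive = overestimation, negative = underestimation."""
    return rating_before - rating_after

def mean_absolute_effectiveness(pairs) -> float:
    """Mean absolute gap over (before, after) rating pairs; lower is better."""
    return sum(abs(effectiveness_gap(b, a)) for b, a in pairs) / len(pairs)

pairs = [(4.0, 3.0), (2.0, 2.5), (5.0, 5.0)]
print(mean_absolute_effectiveness(pairs))  # (1.0 + 0.5 + 0.0) / 3 = 0.5
```

Note that averaging absolute gaps hides the direction of the error, which is why the paper above treats under- and overestimation separately.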
arXiv (Cornell University), 2021
Explanations are used in recommender systems for various reasons. Users have to be supported in making (high-quality) decisions more quickly. Developers of recommender systems want to convince users to purchase specific items. Users should better understand how the recommender system works and why a specific item has been recommended. Users should also develop a more in-depth understanding of the item domain. Consequently, explanations are designed to achieve specific goals such as increasing the transparency of a recommendation or increasing a user's trust in the recommender system. In this paper, we provide an overview of existing research related to explanations in recommender systems, and specifically discuss aspects relevant to group recommendation scenarios. In this context, we present different ways of explaining and visualizing recommendations determined on the basis of preference aggregation strategies.
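Preference aggregation strategies of the kind mentioned above are well established in the group recommendation literature (e.g., average, least misery, most pleasure), and each one suggests a natural explanation template ("recommended because no group member rates it below X"). A minimal sketch, with illustrative data:

```python
# Sketch of three standard preference aggregation strategies for group
# recommendation. `ratings` maps each item to the list of individual
# member ratings; the result is one group score per item.

def aggregate(ratings: dict, strategy: str) -> dict:
    funcs = {
        "average": lambda rs: sum(rs) / len(rs),
        "least_misery": min,   # group score = unhappiest member's rating
        "most_pleasure": max,  # group score = happiest member's rating
    }
    f = funcs[strategy]
    return {item: f(rs) for item, rs in ratings.items()}

group = {"song_a": [5, 1, 4], "song_b": [3, 3, 3]}
print(aggregate(group, "least_misery"))  # {'song_a': 1, 'song_b': 3}
```

Under least misery, song_b wins even though song_a has the higher average, and the explanation can state exactly that: no member rates song_b below 3.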
Proceedings of the 24th International Conference on Intelligent User Interfaces, 2019
Recommender systems have become pervasive on the web, shaping the way users see information and thus the decisions they make. As these systems get more complex, there is a growing need for transparency. In this paper, we study the problem of generating and visualizing personalized explanations for hybrid recommender systems, which incorporate many different data sources. We build upon a hybrid probabilistic graphical model and develop an approach to generate real-time recommendations along with personalized explanations. To study the benefits of explanations for hybrid recommender systems, we conduct a crowd-sourced user study where our system generates personalized recommendations and explanations for real users of the last.fm music platform. We experiment with (1) different explanation styles (e.g., user-based, item-based), (2) manipulating the number of explanation styles presented, and (3) manipulating the presentation format (e.g., textual vs. visual). We apply a mixed model statistical analysis to consider user personality traits as a control variable and demonstrate the usefulness of our approach in creating personalized hybrid explanations with different style, number, and format.
User Modeling and User-Adapted Interaction, 2017
With the recent advances in the field of artificial intelligence, an increasing number of decision-making tasks are delegated to software systems. A key requirement for the success and adoption of such systems is that users must trust system choices or even fully automated decisions. To achieve this, explanation facilities have been widely investigated as a means of establishing trust in these systems since the early years of expert systems. With today's increasingly sophisticated machine learning algorithms, new challenges in the context of explanations, accountability, and trust towards such systems constantly arise. In this work, we systematically review the literature on explanations in advice-giving systems. This is a family of systems that includes recommender systems, which is one of the most successful classes of advice-giving software in practice. We investigate the purposes of explanations as well as how they are generated, presented to users, and evaluated. As a result, we derive a novel comprehensive taxonomy of aspects to be considered when designing explanation facilities for current and future decision support systems. The taxonomy includes a variety of different facets, such as explanation objective, responsiveness, content and presentation. Moreover, we identified several challenges that remain unaddressed so far, for example related to fine-grained issues associated with the presentation of explanations and how explanation facilities are evaluated.
2022
Recommender systems (RS) have become an integral component of our daily lives by making decision making easier for us. The use of recommendations has, however, increased the demand for explanations that are convincing enough to help users trust the provided recommendations. Users want recommendations to be understandable as well as personalized to their individual needs and preferences. Research on personalized explainable recommendation has emerged only recently. To help researchers quickly become familiar with this promising research field and recognize future research directions, we present a multi-dimensional conceptualization framework for personalized explanations in RS, based on five dimensions: WHAT to personalize?, TO WHOM to personalize?, WHO does the personalization?, WHY do we personalize?, and HOW to personalize? Furthermore, we use this framework to systematically analyze and compare studies on personalized explainable recommendation.
Proceedings of the 2013 international conference on Intelligent user interfaces - IUI '13, 2013
Research on recommender systems has traditionally focused on the development of algorithms to improve the accuracy of recommendations. So far, little research has been done to enable user interaction with such systems as a basis to support exploration and control by end users. In this paper, we present our research on the use of information visualization techniques to interact with recommender systems. We investigated how information visualization can improve user understanding of the typically black-box rationale behind recommendations in order to increase their perceived relevance and meaning and to support exploration and user involvement in the recommendation process. Our study has been performed using TalkExplorer, an interactive visualization tool developed for attendees of academic conferences. The results of user studies performed at two conferences allowed us to obtain interesting insights to enhance user interfaces that integrate recommendation technology. More specifically, effectiveness and probability of item selection both increase when users are able to explore and interrelate multiple entities, i.e., items bookmarked by users, recommendations and tags.
ACM Transactions on Interactive Intelligent Systems, 2020
Recommender systems are ubiquitous and shape the way users access information and make decisions. As these systems become more complex, there is a growing need for transparency and interpretability. In this article, we study the problem of generating and visualizing personalized explanations for recommender systems that incorporate signals from many different data sources. We use a flexible, extendable probabilistic programming approach and show how we can generate real-time personalized recommendations. We then turn these personalized recommendations into explanations. We perform an extensive user study to evaluate the benefits of explanations for hybrid recommender systems. We conduct a crowd-sourced user study where our system generates personalized recommendations and explanations for real users of the last.fm music platform. First, we evaluate the performance of the recommendations in terms of perceived accuracy and novelty. Next, we experiment with (1) different explanation st...
This paper describes an approach for generating rich and compelling explanations in recommender systems, based on opinions mined from user-generated reviews. The explanations highlight the features of a recommended item that matter most to the user and also relate them to other recommendation alternatives and the user's past activities to provide a context.
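One common way such opinion-based explanations are assembled is to score each mined feature of the recommended item by combining review sentiment with the user's interest in that feature, then surface the top-scoring features. The sketch below illustrates that idea under assumed inputs; the function name, weights, and data are hypothetical, not from the paper.

```python
# Sketch: rank a recommended item's features for an opinion-based explanation.
# `item_sentiment` holds per-feature sentiment mined from reviews (assumed);
# `user_weights` holds the user's interest in each feature (assumed).

def explanation_features(item_sentiment: dict, user_weights: dict, top_k: int = 2):
    """Score each feature as user interest x review sentiment; return top-k names."""
    scored = {f: user_weights.get(f, 0.0) * s
              for f, s in item_sentiment.items()}
    return sorted(scored, key=scored.get, reverse=True)[:top_k]

item = {"battery": 0.9, "screen": 0.1, "price": 0.3}
user = {"battery": 0.8, "price": 0.7, "screen": 0.2}
print(explanation_features(item, user))  # ['battery', 'price']
```

The returned features would then be rendered in a textual explanation, e.g. "Recommended because reviewers praise its battery and price, which you care about."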