How do people make decisions in disclosing personal information in tourism group recommendations in competitive versus cooperative conditions?

Shabnam Najafian*, Geoff Musick, Bart Knijnenburg, Nava Tintarev

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer review

Abstract

When deciding where to visit next while traveling in a group, people using an interactive group recommender system face a trade-off between (a) disclosing personal information to explain and support their arguments about which places to visit or avoid (e.g., "this place is too expensive for my budget") and (b) protecting their privacy by not disclosing too much. Arguably, this trade-off depends crucially on who the other group members are and how cooperative one aims to be in making the decision. This paper studies how an individual's personality, trust in the group, and general privacy concern, as well as their preference scenario and the task design, serve as antecedents to their trade-off between disclosure benefit and privacy risk when disclosing personal information (e.g., their current location, financial information, etc.) in a group recommendation explanation. We aim to design a model that helps us understand the relationship between risk and benefit, and how moderating factors shape final information disclosure in the group. To create realistic group decision-making scenarios in which users can control the amount of information disclosed, we developed TouryBot, a chatbot agent that generates natural language explanations to help group members argue for their suggestions to the group in the tourism domain (specifically, the initial POI options were selected from the "Food" category in Amsterdam; see Sect. 3.2 for details). To understand the dynamics between the factors mentioned above and information disclosure, we conducted an online, between-subjects user experiment with 278 participants, each exposed to either a competitive task (i.e., instructed to convince the group to visit or skip a recommended place) or a cooperative task (i.e., instructed to reach a decision as a group).
Results show that participants' personality and whether their preferences align with the majority affect their general privacy concern. This, in turn, affects their trust in the group, which affects their perception of privacy risk and disclosure benefit when disclosing personal information in the group, and ultimately the amount of personal information they disclose. A surprising finding was that the effect of privacy risk on information disclosure differs by task type: privacy risk significantly impacts information disclosure when the task of finding a suitable destination is framed competitively, but not when it is framed cooperatively. These findings contribute to a better understanding of the moderating factors of information disclosure in group decision making and shed new light on the role of task design in information disclosure. We conclude with design recommendations for developing explanations in group decision-making systems. Further, we propose a theory of user modeling that shows which factors need to be considered when generating such group explanations automatically.
Original language: English
Number of pages: 33
Journal: User Modeling and User-Adapted Interaction
DOIs
Publication status: E-pub ahead of print - 1 Jul 2023

Keywords

  • Explanation
  • Group recommendation
  • Privacy calculus
  • Information privacy
  • Personal information disclosure
