Trust Dynamics and Verbal Assurances in Human Robot Physical Collaboration

Alhaji, Basel and Prilla, Michael and Rausch, Andreas (2021) Trust Dynamics and Verbal Assurances in Human Robot Physical Collaboration. Frontiers in Artificial Intelligence, 4. ISSN 2624-8212

Text: pubmed-zip/versions/2/package-entries/frai-04-703504-r1/frai-04-703504.pdf - Published Version, Download (2MB)

Abstract

Trust is the foundation of successful human collaboration. This has also been found to be true for human-robot collaboration, where trust also influences over- and under-reliance issues. Correspondingly, the study of trust in robots is usually concerned with detecting the current trust level of the human collaborator and keeping it within certain limits to avoid undesired consequences, a process known as trust calibration. However, while there is intensive research on human-robot trust, little is known about the factors that affect it in synchronous, co-located teamwork, and hardly anything about how these factors shape the dynamics of trust during the collaboration. These factors, together with the characteristics of trust evolution, are prerequisites for a computational model that allows robots to adapt their behavior dynamically based on the current human trust level, which in turn is needed to enable dynamic and spontaneous cooperation. To address this, we conducted a two-phase lab experiment in a mixed-reality environment in which thirty-two participants collaborated with a virtual CoBot on disassembling traction batteries in a recycling context. In the first phase, we explored the (dynamics of) relevant trust factors during physical human-robot collaboration. In the second phase, we investigated the impact of the robot's reliability and feedback on human trust in robots. Results show stronger trust dynamics while trust dissipates than while it accumulates and highlight different relevant factors as more interactions occur. Moreover, the factors that are relevant as trust accumulates differ from those that appear as trust dissipates: we detected four factors while trust accumulates (perceived reliability, perceived dependability, perceived predictability, and faith) that do not appear while it dissipates. This points to the interesting conclusion that, depending on the stage of the collaboration and the direction of trust evolution, different factors may shape trust. Further, the robot's feedback accuracy has a conditional effect on trust depending on the robot's reliability level: it preserves human trust when a failure is expected but does not affect it when the robot works reliably. This gives designers a hint as to when assurances are necessary and when they are redundant.
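To make the reported asymmetry concrete, the following is a minimal, hypothetical sketch (not the authors' model) of an asymmetric trust-update rule in which trust drops faster after a robot failure than it rises after a success. The function name, parameters, and numeric values (gain, loss) are illustrative assumptions only.

```python
# Hypothetical illustration, not the model from the paper: an asymmetric
# trust-update rule where dissipation outpaces accumulation, mirroring the
# finding that trust dynamics are stronger while trust dissipates.

def update_trust(trust, outcome_success, gain=0.05, loss=0.15):
    """Return the new trust level, kept within [0, 1].

    gain -- increment rate after a successful robot action (assumed value)
    loss -- decrement rate after a failure; larger than gain to reflect the
            stronger dynamics observed while trust dissipates (assumed value)
    """
    if outcome_success:
        # Accumulate slowly, saturating as trust approaches 1.
        trust = min(1.0, trust + gain * (1.0 - trust))
    else:
        # Dissipate more sharply, proportional to the current trust level.
        trust = max(0.0, trust - loss * trust)
    return trust


# Example: a single failure erodes more trust than a single success rebuilds.
t = 0.5
t = update_trust(t, outcome_success=True)   # small increase
t = update_trust(t, outcome_success=False)  # larger decrease
print(round(t, 3))
```

Such a rule is only one way a computational trust model could encode the observed asymmetry; the paper's contribution is the empirical identification of the factors and dynamics, not this particular update scheme.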

Item Type: Article
Subjects: Eurolib Press > Multidisciplinary
Depositing User: Managing Editor
Date Deposited: 11 Mar 2023 07:11
Last Modified: 29 Jun 2024 09:45
URI: http://info.submit4journal.com/id/eprint/991
