
Trust in Automation: Research Review and Future Perspectives

Dong Wen-Li, Fang Wei-Ning

doi: 10.16383/j.aas.c200432
Funds: Supported by Beijing Natural Science Foundation (L191018)

Citation: Dong Wen-Li, Fang Wei-Ning. Trust in automation: research review and future perspectives. Acta Automatica Sinica, 2020, 46(x): 1−18 doi: 10.16383/j.aas.c200432

Author information:

    Dong Wen-Li: Ph.D. candidate at the School of Electronic and Information Engineering, Beijing Jiaotong University. She received her bachelor's degree in rail transit signaling and control from Zhengzhou University in 2017. Her main research interests are trust in automation and computational cognitive modeling. E-mail: wldong_bjtu@163.com

    Fang Wei-Ning: Professor at the State Key Laboratory of Rail Traffic Control and Safety, Beijing Jiaotong University. He received his Ph.D. from Chongqing University in 1996. His main research interests are human factors engineering and rail transit safety modeling and simulation. Corresponding author of this paper. E-mail: wnfang@bjtu.edu.cn
  • Abstract: As automation capabilities advance rapidly, the human-machine relationship is changing profoundly: the human role is gradually shifting from primary controller of the automation to a collaborator who shares control with it. To achieve performance and safety goals, human-machine cooperative control requires operators to appropriately calibrate their trust in automated machines, and trust in automation has become one of the greatest challenges to achieving safe and effective human-machine cooperative control. This paper reviews the literature on trust in automation and provides a detailed summary of the main theoretical and empirical work in the field to date, covering the concept of trust in automation, its models, its influencing factors, and its measurement methods. Finally, on the basis of this review and an analysis of the related literature, the paper identifies limitations of existing research on trust in automation and offers suggestions for future research from the perspective of human-machine system design.
  • Fig.  1  Diagram of calibration of trust in automation

    Fig.  2  Important characteristics involved in the definitions of trust in automation

    Fig.  3  Time distribution of the literature on models of trust in automation

    Fig.  4  Summary of factors influencing trust in automation

    Fig.  5  Application ratio of three trust in automation measures

    Fig.  6  Results of the literature analysis: publication trends, key application areas, and research objects of trust in automation

    Table  1  Summary of computational models of trust in automation

    Offline trust models
        Input: prior parameters
        Function: simulate across the range of possible scenarios to predict the level of trust in automation
        Application: the design stage of automated systems
        Outcome: static improvement of the automation design
    Online trust models
        Input: prior parameters plus real-time behavioural, physiological, and neural data
        Function: estimate the level of trust in automation in real time during actual system operation
        Application: the deployment stage of automated systems
        Outcome: dynamic adjustment of automation behaviour
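The offline/online distinction summarised in Table 1 can be illustrated with a minimal sketch (not taken from any of the cited models; the class name and update rule are invented for illustration): an online estimator that maintains a Beta-distributed belief over the operator's trust and refines it from each observed reliance decision, whereas an offline model would stop at the prior parameters.

```python
class OnlineTrustEstimator:
    """Toy online trust estimator: Beta-distributed belief over trust in [0, 1]."""

    def __init__(self, prior_success: float = 1.0, prior_failure: float = 1.0):
        # Prior parameters: the only input an offline model would use.
        self.a = prior_success
        self.b = prior_failure

    def update(self, relied_on_automation: bool) -> None:
        # Real-time behavioural input: each reliance decision nudges the belief.
        if relied_on_automation:
            self.a += 1.0
        else:
            self.b += 1.0

    def trust_level(self) -> float:
        # Posterior mean of the Beta(a, b) belief.
        return self.a / (self.a + self.b)


est = OnlineTrustEstimator()
for decision in [True, True, False, True]:   # logged reliance decisions
    est.update(decision)
print(round(est.trust_level(), 2))  # → 0.67
```

Deployed inside a running system, such an estimate could drive the "dynamic adjustment of automation behaviour" row of Table 1, for example by increasing system transparency when estimated trust drops.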

    Table  2  Summary of common behavioural measures of trust in automation

    Reliance: ① Handing control over to the automation, or taking it back from the automation [134]. ② Reducing the degree to which the automation is monitored [135, 136].
    Compliance: ① Accepting advice provided, or actions selected, by the automation [137]. ② Abandoning one's own decision in order to follow the automation's decision [138].
    Other: ① Choosing whether to complete a task manually or with the automation [58, 84]. ② The level of automation selected [139] (the higher the level an operator selects, the higher the trust). ③ Reaction time [140] (longer reaction times indicate higher trust).
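The reliance and compliance measures in Table 2 are typically computed as rates over logged interaction events. A minimal sketch, assuming a hypothetical log format (the `Event` fields are invented for illustration, not a format used in the cited studies):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Event:
    automation_engaged: bool          # reliance: control was left with the automation
    advice_accepted: Optional[bool]   # compliance: None if no advice was issued

def reliance_rate(events: list) -> float:
    # Fraction of opportunities on which the operator relied on the automation.
    return sum(e.automation_engaged for e in events) / len(events)

def compliance_rate(events: list) -> float:
    # Fraction of automation recommendations that the operator followed.
    advised = [e for e in events if e.advice_accepted is not None]
    return sum(e.advice_accepted for e in advised) / len(advised)

log = [Event(True, True), Event(True, None), Event(False, False), Event(True, True)]
print(reliance_rate(log), round(compliance_rate(log), 2))  # → 0.75 0.67
```

Keeping the two rates separate matters: the reliance/compliance distinction [130, 133] implies they can diverge for the same operator.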

    Table  3  Important physiological and neural measures of trust in automation and their basis

    Measure: Continuous measurement of trust in automation by capturing the operator's gaze behaviour with eye tracking.
    Basis: Overt behaviours such as monitoring are closely linked to subjective trust in automation [77]. Although the experimental evidence on the relation between trust and monitoring behaviour is not uniform [142], most empirical studies show a significant negative correlation between subjective trust ratings and operators' monitoring frequency [48]. Gaze behaviour, which characterises the degree of monitoring, can therefore provide reliable information for real-time trust measurement [133, 142, 143].

    Measure: Detecting the operator's trust state from pattern features of EEG signals.
    Basis: Many studies have examined the neural correlates of interpersonal trust [144-148], so it is feasible to examine the neural correlates of trust in automation with neuroimaging tools. EEG offers better temporal dynamics than other tools such as functional magnetic resonance imaging [149], and EEG patterns already identify users' cognitive and affective states with good accuracy in brain-computer interface design [149]. Since trust in automation is a cognitive construct, detecting operators' trust calibration from EEG pattern features is feasible and has achieved high accuracy [67, 68, 150].

    Measure: Inferring the level of trust in automation from EDA levels.
    Basis: Studies have shown that lower trust in automation may be associated with higher EDA levels [151]. Combining EDA with other physiological and neural measures, such as eye tracking [142] or EEG [67, 68], yields more accurate trust measurement than any single method used alone.
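An illustrative sketch of the multi-signal fusion that Table 3 recommends, combining monitoring frequency (from eye tracking) and EDA level into a single index. The direction of each effect follows the table (more monitoring and higher EDA indicate lower trust), but the weights, scaling, and function names are invented for illustration and are not taken from the cited studies:

```python
from statistics import mean, pstdev

def zscores(xs):
    # Standardise a signal so that the two modalities are comparable.
    m, s = mean(xs), pstdev(xs)
    return [(x - m) / s for x in xs]

def trust_index(monitor_freq, eda_level, w_gaze=0.5, w_eda=0.5):
    # Higher monitoring frequency and higher EDA both indicate lower trust,
    # so both z-scored signals enter with a negative sign.
    gz, ez = zscores(monitor_freq), zscores(eda_level)
    return [-(w_gaze * g + w_eda * e) for g, e in zip(gz, ez)]

# Three time windows in which monitoring and arousal rise: the index falls.
idx = trust_index([2.0, 4.0, 6.0], [0.1, 0.2, 0.3])
assert idx[0] > idx[1] > idx[2]
```

In practice the cited studies replace this hand-weighted sum with trained classifiers [67, 68, 151]; the sketch only shows why fusing modalities with consistent signs can give a more stable signal than either alone.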

    Table  4  Main research groups of trust in automation and their research contributions

    1. USA. Chen J Y C, Human Research and Engineering Directorate, U.S. Army Research Laboratory. Contribution: proposed a series of transparency-based methods for calibrating trust in automation. Publications: 26.
    2. USA. Lyons J B, Human Trust and Interaction Branch, U.S. Air Force Research Laboratory. Contribution: applied studies of trust in automation in military contexts. Publications: 24.
    3. USA. Hancock P A, Institute for Simulation and Training, University of Central Florida. Contribution: established a theoretical framework for human-robot trust and conducted empirical studies of its influencing factors. Publications: 21.
    4. USA. Saeidi H and Wang Y, Department of Mechanical Engineering, Clemson University. Contribution: developed autonomy allocation strategies based on computational trust models to improve human-robot collaboration performance. Publications: 20.
    5. USA. de Visser E J, Department of Psychology, George Mason University. Contribution: developed and refined the theory of trust repair for automation, focusing on the effect of anthropomorphic features of automation on trust repair. Publications: 18.
    6. Japan. Itoh M and Inagaki T, Department of Risk Engineering, University of Tsukuba. Contribution: design methods for human-automated-vehicle cooperative systems based on trust calibration. Publications: 14.
  • [1] Schörnig N. Unmanned Systems: The Robotic Revolution as a Challenge for Arms Control. Wiesbaden: Springer, 2019. 233−256.
    [2] Meyer G, Beiker S. Road vehicle automation. Cham: Springer, 2019. 73−109.
    [3] Bahrin M A K, Othman M F, Azli N N, Talib M F. Industry 4.0: A review on industrial automation and robotic. Jurnal Teknologi, 2016, 78(6-13): 137−143
    [4] Musen M A, Middleton B, Greenes R A. Clinical decision-support systems. New York: Springer, 2014. 643−674.
    [5] Janssen C P, Donker S F, Brumby D P, Kun A L. History and future of human-automation interaction. International Journal of Human-Computer Studies, 2019, 131(54): 99−107
    [6] 许为, 葛列众. 智能时代的工程心理学. 心理科学进展, 2020, 28(9): 1409−1425

    XU Wei, GE Lie-zhong. Engineering psychology in the era of artificial intelligence. Advances in Psychological Science, 2020, 28(9): 1409−1425
    [7] Parisi G I, Kemker R, Part J L, Kanan C, Wermter S. Continual lifelong learning with neural networks: A review. Neural Networks, 2019, 113(22): 54−71
    [8] Grigsby S S. Artificial Intelligence for Advanced Human-Machine Symbiosis.
    [9] Gogoll J, Uhl M. Rage against the machine: Automation in the moral domain. Journal of Behavioral and Experimental Economics, 2018, 74(54): 97−103
    [10] Gunning D. Explainable artificial intelligence (xai) [Online], available: https://www.darpa.mil/attachments/XAIProgramUpdate.pdf, April 26, 2020.
    [11] Endsley M R. From here to autonomy: lessons learned from human–automation research. Human factors, 2017, 59(1): 5−27
    [12] Blomqvist K. The many faces of trust. Scandinavian journal of management, 1997, 13(3): 271−286
    [13] Rotter J B. A new scale for the measurement of interpersonal trust. Journal of personality, 1967, 35(4): 651−665
    [14] Muir B M. Trust in automation: Part I. Theoretical issues in the study of trust and human intervention in automated systems. Ergonomics, 1994, 37(11): 1905−1922
    [15] Lewandowsky S, Mundy M, Tan G. The dynamics of trust: Comparing humans to automation. Journal of Experimental Psychology: Applied, 2000, 6(2): 104−123
    [16] Muir B M. Trust between humans and machines, and the design of decision aids. International journal of man-machine studies, 1987, 27(5-6): 527−539
    [17] Parasuraman R, Riley V. Humans and automation: Use, misuse, disuse, abuse. Human factors, 1997, 39(2): 230−253
    [18] Levin S. Tesla fatal crash: ‘autopilot’ mode sped up car before driver killed, report finds [Online], available: https://www.theguardian.com/technology/2018/jun/07/tesla-fatal-crash-silicon-valley-autopilot-mode-report, June 8, 2020.
    [19] The Tesla Team. An Update on Last Week’s Accident [Online], available: https://www.tesla.com/en_GB/blog/update-last-week%E2%80%99s-accident, March 20, 2020.
    [20] Mayer R C, Davis J H, Schoorman F D. An integrative model of organizational trust. Academy of management review, 1995, 20(3): 709−734
    [21] McKnight D H, Cummings L L, Chervany N L. Initial trust formation in new organizational relationships. Academy of Management review, 1998, 23(3): 473−490
    [22] Jarvenpaa S L, Knoll K, Leidner D E. Is anybody out there? Antecedents of trust in global virtual teams. Journal of management information systems, 1998, 14(4): 29−64
    [23] Siau K, Shen Z. Building customer trust in mobile commerce. Communications of the ACM, 2003, 46(4): 91−94
    [24] Gefen D. E-commerce: the role of familiarity and trust. Omega, 2000, 28(6): 725−737
    [25] McKnight D H, Choudhury V, Kacmar C. Trust in e-commerce vendors: a two-stage model.
    [26] Li X, Hess T J, Valacich J S. Why do we trust new technology? A study of initial trust formation with organizational information systems. The Journal of Strategic Information Systems, 2008, 17(1): 39−71
    [27] Lee J D, See K A. Trust in automation: Designing for appropriate reliance. Human factors, 2004, 46(1): 50−80
    [28] Pan B, Hembrooke H, Joachims T, Lorigo L, Gay G, Granka L. In Google we trust: Users’ decisions on rank, position, and relevance. Journal of computer-mediated communication, 2007, 12(3): 801−823
    [29] Riegelsberger J, Sasse M A, McCarthy J D. The researcher's dilemma: evaluating trust in computer-mediated communication. International Journal of Human-Computer Studies, 2003, 58(6): 759−781
    [30] Hancock P A, Billings D R, Schaefer K E, Chen J Y C, de Visser E J, Parasuraman R. A meta-analysis of factors affecting trust in human-robot interaction. Human factors, 2011, 53(5): 517−527
    [31] Billings D R, Schaefer K E, Chen J Y C, Hancock P A. Human-robot interaction: developing trust in robots.
    [32] Madhavan P, Wiegmann D A. Similarities and differences between human–human and human–automation trust: an integrative review. Theoretical Issues in Ergonomics Science, 2007, 8(4): 277−301
    [33] Walker G, Stanton N, Salmon P. Trust in vehicle technology. International journal of vehicle design, 2016, 70(2): 157−182
    [34] Siau K, Wang W. Building trust in artificial intelligence, machine learning, and robotics. Cutter Business Technology Journal, 2018, 31(2): 47−53
    [35] Schaefer K E, Chen J Y, Szalma J L, Hancock P A. A meta-analysis of factors influencing the development of trust in automation: Implications for understanding autonomy in future systems. Human factors, 2016, 58(3): 377−400
    [36] Hoff K A, Bashir M. Trust in automation: Integrating empirical evidence on factors that influence trust. Human factors, 2015, 57(3): 407−434
    [37] Kaber D B. Issues in human–automation interaction modeling: Presumptive aspects of frameworks of types and levels of automation. Journal of Cognitive Engineering and Decision Making, 2018, 12(1): 7−24
    [38] Hancock P A. Imposing limits on autonomous systems. Ergonomics, 2017, 60(2): 284−291
    [39] Schaefer K. The perception and measurement of human-robot trust[Ph. D. dissertation], University of Central Florida, 2013.
    [40] Schaefer K E, Billings D R, Szalma J L, Adams J K, Sanders T L, Chen J Y C, et al. A Meta-Analysis of Factors Influencing the Development of Trust in Automation: Implications for Human-Robot Interaction, Technical Report ARL-TR-6984, Army Research Laboratory, USA, 2014.
    [41] Nass C, Fogg B, Moon Y. Can computers be teammates? International Journal of Human-Computer Studies, 1996, 45(6): 669−678
    [42] Madhavan P, Wiegmann D A. A new look at the dynamics of human-automation trust: Is trust in humans comparable to trust in machines?. In: Proceeding of Human Factors and Ergonomics Society Annual Meeting. Los Angeles, CA, USA: SAGE, 2004. 581−585.
    [43] Dimoka A. What does the brain tell us about trust and distrust? Evidence from a functional neuroimaging study. MIS Quarterly, 2010, 34(2): 373−396
    [44] Riedl R, Hubert M, Kenning P. Are there neural gender differences in online trust? An fMRI study on the perceived trustworthiness of eBay offers. MIS quarterly, 2010, 34(2): 397−428
    [45] Billings D, Schaefer K, Llorens N, Hancock P A. What is Trust? Defining the construct across domains. In: Proceeding of the american psychological association conference. Florida, USA: APA, 2012. 76−84.
    [46] Barber B. The logic and limits of trust. New Jersey: Rutgers University Press, 1983. 15−22.
    [47] Rempel J K, Holmes J G, Zanna M P. Trust in close relationships. Journal of personality and social psychology, 1985, 49(1): 95−95
    [48] Muir B M, Moray N. Trust in automation. Part II. Experimental studies of trust and human intervention in a process control simulation. Ergonomics, 1996, 39(3): 429−460
    [49] Ajzen I. The theory of planned behavior. Organizational behavior and human decision processes, 1991, 50(2): 179−211
    [50] Dzindolet M T, Pierce L G, Beck H P, Dawe L A, Anderson B W. Predicting misuse and disuse of combat identification systems. Military Psychology, 2001, 13(3): 147−164
    [51] Madhavan P, Wiegmann D A. Effects of information source, pedigree, and reliability on operator interaction with decision support systems. Human Factors, 2007, 49(5): 773−785
    [52] Goodyear K, Parasuraman R, Chernyak S, de Visser E, Madhavan P, Deshpande G, et al. An fMRI and effective connectivity study investigating miss errors during advice utilization from human and machine agents. Social neuroscience, 2017, 12(5): 570−581
    [53] Hoffmann H, Söllner M. Incorporating behavioral trust theory into system development for ubiquitous applications. Personal and ubiquitous computing, 2014, 18(1): 117−128
    [54] Ekman F, Johansson M, Sochor J. Creating appropriate trust in automated vehicle systems: A framework for HMI design. IEEE Transactions on Human-Machine Systems, 2017, 48(1): 95−101
    [55] Xu A Q, Dudek G. OPTIMo: Online Probabilistic Trust Inference Model for Asymmetric Human-Robot Collaborations. In: Proceeding of the 2015 10th ACM/IEEE International Conference on Human-Robot Interaction (HRI). New York, USA: IEEE, 2015. 221−228.
    [56] Nam C, Walker P, Lewis M, Sycara K. Predicting trust in human control of swarms via inverse reinforcement learning. In: Proceeding of the 26th IEEE International Symposium on Robot and Human Interactive Communication(RO-MAN). Lisbon, Portugal: IEEE, 2017. 528−533.
    [57] Akash K, Polson K, Reid T, Jain N. Improving Human-Machine Collaboration Through Transparency-based Feedback–Part I: Human Trust and Workload Model. IFAC-PapersOnLine, 2019, 51(34): 315−321
    [58] Gao J, Lee J D. Extending the decision field theory to model operators' reliance on automation in supervisory control situations. IEEE Transactions on Systems, Man, and Cybernetics-Part A: Systems and Humans, 2006, 36(5): 943−959
    [59] Wang Y, Shi Z, Wang C, Zhang F. Human-robot mutual trust in (semi) autonomous underwater robots. Berlin: Springer, 2014. 115−137.
    [60] Clare A S. Modeling real-time human-automation collaborative scheduling of unmanned vehicles, Technical Report, Aerospace Engineering with Information Technology, Massachusetts Institute of Technology, USA, 2013.
    [61] Gao F, Clare A S, Macbeth J C, Cummings M L. Modeling the impact of operator trust on performance in multiple robot control. In: Proceeding of AAAI Spring Symposium Series. Stanford, CA, USA: AAAI, 2013. 164−169.
    [62] Hoogendoorn M, Jaffry S W, Treur J. Cognitive and neural modeling of dynamics of trust in competitive trustees. Cognitive Systems Research, 2012, 14(1): 60−83
    [63] Hussein A, Elsawah S, Abbass H. Towards Trust-Aware Human-Automation Interaction: An Overview of the Potential of Computational Trust Models. In: Proceedings of the 53rd Hawaii International Conference on System Sciences. Hawaii, USA: University of Hawaii, 2020. 47−57.
    [64] Akash K, Hu W-L, Reid T, Jain N. Dynamic modeling of trust in human-machine interactions. In: Proceedings of the American Control Conference. Seattle, WA, USA: IEEE, 2017. 1542−1548.
    [65] Hu W-L, Akash K, Reid T, Jain N. Computational modeling of the dynamics of human trust during human–machine interactions. IEEE Transactions on Human-Machine Systems, 2018, 49(6): 485−497
    [66] Akash K, Reid T, Jain N. Improving Human-Machine Collaboration Through Transparency-based Feedback–Part II: Control Design and Synthesis. IFAC-PapersOnLine, 2019, 51(34): 322−328
    [67] Hu W-L, Akash K, Jain N, Reid T. Real-time sensing of trust in human-machine interactions. IFAC-PapersOnLine, 2016, 49(32): 48−53
    [68] Akash K, Hu W-L, Jain N, Reid T. A classification model for sensing human trust in machines using EEG and GSR. ACM Transactions on Interactive Intelligent Systems (TiiS), 2018, 8(4): 1−20
    [69] Akash K, Reid T, Jain N. Adaptive Probabilistic Classification of Dynamic Processes: A Case Study on Human Trust in Automation. In: Proceedings of the Annual American Control Conference. New York, USA: IEEE, 2018. 246−251.
    [70] Merritt S M, Ilgen D R. Not all trust is created equal: Dispositional and history-based trust in human-automation interactions. Human Factors, 2008, 50(2): 194−210
    [71] Bagheri N, Jamieson G A. The impact of context-related reliability on automation failure detection and scanning behaviour. In: Proceedings of the International Conference on Systems, Man and Cybernetics. New York, USA: IEEE, 2004. 212−217.
    [72] Cahour B, Forzy J-F. Does projection into use improve trust and exploration? An example with a cruise control system. Safety science, 2009, 47(9): 1260−1270
    [73] Cummings M L, Clare A, Hart C. The role of human-automation consensus in multiple unmanned vehicle scheduling. Human Factors, 2010, 52(1): 17−27
    [74] Kraus J M, Forster Y, Hergeth S, Baumann M. Two Routes to Trust Calibration: Effects of Reliability and Brand Information on Trust in Automation. International Journal of Mobile Human Computer Interaction (IJMHCI), 2019, 11(3): 1−17
    [75] Kelly C, Boardman M, Goillau P, Jeannot E. Guidelines for trust in future ATM systems: A literature review, Technical Report HRS/HSP-005-GUI-01, European Organization for the Safety of Air Navigation, Belgium, 2003.
    [76] Riley V. A general model of mixed-initiative human-machine systems. In: Proceedings of the Human Factors Society Annual Meeting. Los Angeles, USA: SAGE, 1989. 124−128.
    [77] Parasuraman R, Manzey D H. Complacency and bias in human use of automation: An attentional integration. Human Factors: The Journal of the Human Factors and Ergonomics Society, 2010, 52(3): 381−410
    [78] Bailey N R, Scerbo M W, Freeman F G, Mikulka P J, Scott L A. Comparison of a brain-based adaptive system and a manual adaptable system for invoking automation. Human factors, 2006, 48(4): 693−709
    [79] Moray N, Hiskes D, Lee J, Muir B M. Trust and human intervention in automated systems. New Jersey: L. Erlbaum Associates Inc, 1995. 183−194.
    [80] Yu K, Berkovsky S, Taib R, Conway D, Zhou J L, Chen F. User trust dynamics: An investigation driven by differences in system performance. In: Proceedings of the 22nd International Conference on Intelligent User Interfaces. New York, USA: Association for Computing Machinery, 2017. 307−317.
    [81] Lee J, Moray N. Trust, control strategies and allocation of function in human-machine systems. Ergonomics, 1992, 35(10): 1243−1270
    [82] De Visser E J, Monfort S S, McKendrick R, Smith M A B, McKnight P E, Krueger F, et al. Almost human: Anthropomorphism increases trust resilience in cognitive agents. Journal of Experimental Psychology: Applied, 2016, 22(3): 331−349
    [83] Pak R, Fink N, Price M, Bass B, Sturre L. Decision support aids with anthropomorphic characteristics influence trust and performance in younger and older adults. Ergonomics, 2012, 55(9): 1059−1072
    [84] De Vries P, Midden C, Bouwhuis D. The effects of errors on system trust, self-confidence, and the allocation of control in route planning. International Journal of Human-Computer Studies, 2003, 58(6): 719−735
    [85] Moray N, Inagaki T, Itoh M. Adaptive automation, trust, and self-confidence in fault management of time-critical tasks. Journal of experimental psychology: Applied, 2000, 6(1): 44−58
    [86] Lewis M, Sycara K, Walker P. The role of trust in human-robot interaction. Florence: Springer, 2018. 135−159.
    [87] Verberne F M, Ham J, Midden C J. Trust in smart systems: Sharing driving goals and giving information to increase trustworthiness and acceptability of smart systems in cars. Human factors, 2012, 54(5): 799−810
    [88] de Visser E, Parasuraman R. Adaptive aiding of human-robot teaming: Effects of imperfect automation on performance, trust, and workload. Journal of Cognitive Engineering and Decision Making, 2011, 5(2): 209−231
    [89] Endsley M R. Situation awareness in future autonomous vehicles: Beware of the unexpected. In: Proceedings of Congress of the International Ergonomics Association. Cham, Switzerland: Springer, 2018. 303−309.
    [90] Dadashi N, Stedmon A W, Pridmore T P. Semi-automated CCTV surveillance: The effects of system confidence, system accuracy and task complexity on operator vigilance, reliance and workload. Applied Ergonomics, 2013, 44(5): 730−738
    [91] Wang L, Jamieson G A, Hollands J G. Trust and reliance on an automated combat identification system. Human factors, 2009, 51(3): 281−291
    [92] Dzindolet M T, Peterson S A, Pomranky R A, Pierce G L, Beck H P. The role of trust in automation reliance. International journal of human-computer studies, 2003, 58(6): 697−718
    [93] Davis S E. Individual Differences in Operators’ Trust in Autonomous Systems: A Review of the Literature, Technical Report DST-Group-TR-3587, Joint and Operations Analysis Division, Defence Science and Technology Group, Australia, 2019.
    [94] Merritt S M. Affective processes in human–automation interactions. Human Factors, 2011, 53(4): 356−370
    [95] Stokes C K, Lyons J B, Littlejohn K, Natarian J, Case E, Speranza N. Accounting for the human in cyberspace: Effects of mood on trust in automation. In: Proceedings of the 2010 International Symposium on Collaborative Technologies and Systems. Chicago, IL, USA: IEEE, 2010. 180−187.
    [96] Merritt S M, Heimbaugh H, LaChapell J, Lee D. I trust it, but I don’t know why: Effects of implicit attitudes toward automation on trust in an automated system. Human factors, 2013, 55(3): 520−534
    [97] Arder J, Hughes D K, Rowe P H, Mottram D R, Green C F. Attitudes and opinions of nursing and medical staff regarding the supply and storage of medicinal products before and after the installation of a drawer-based automated stock-control system. International Journal of Pharmacy Practice, 2009, 17(2): 95−99
    [98] Gao J, Lee J D, Zhang Y. A dynamic model of interaction between reliance on automation and cooperation in multi-operator multi-automation situations. International Journal of Industrial Ergonomics, 2006, 36(5): 511−526
    [99] Reichenbach J, Onnasch L, Manzey D. Human performance consequences of automated decision aids in states of sleep loss. Human factors, 2011, 53(6): 717−728
    [100] Chen J, Terrence P. Effects of imperfect automation and individual differences on concurrent performance of military and robotics tasks in a simulated multitasking environment. Ergonomics, 2009, 52(8): 907−920
    [101] Chen J Y, Barnes M J. Supervisory control of multiple robots in dynamic tasking environments. Ergonomics, 2012, 55(9): 1043−1058
    [102] Naef M, Fehr E, Fischbacher U, Schupp J, Wagner G. Decomposing trust: Explaining national and ethnical trust differences. International Journal of Psychology, 2008, 43(3-4): 577−577
    [103] Huerta E, Glandon T, Petrides Y. Framing, decision-aid systems, and culture: Exploring influences on fraud investigations. International Journal of Accounting Information Systems, 2012, 13(4): 316−333
    [104] Chien S-Y, Lewis M, Sycara K, Liu J-S, Kumru A. Influence of cultural factors in dynamic trust in automation. In: Proceedings of the International Conference on Systems, Man, and Cybernetics (SMC). New York, USA: IEEE, 2016. 2884−2889.
    [105] Donmez B, Boyle L N, Lee J D, McGehee D V. Drivers’ attitudes toward imperfect distraction mitigation strategies. Transportation research part F: traffic psychology and behaviour, 2006, 9(6): 387−398
    [106] Kircher K, Thorslund B. Effects of road surface appearance and low friction warning systems on driver behaviour and confidence in the warning system. Ergonomics, 2009, 52(2): 165−176
    [107] Ho G, Wheatley D, Scialfa C T. Age differences in trust and reliance of a medication management system. Interacting with Computers, 2005, 17(6): 690−710
    [108] Steinke F, Fritsch T, Silbermann L. Trust in ambient assisted living (AAL)-a systematic review of trust in automation and assistance systems. International Journal on Advances in Life Sciences, 2012, 4(3): 23−26
    [109] McBride M, Morgan S. Trust calibration for automated decision aids [Online], available: https://www.ihssnc.org/portals/0/Documents/VIMSDocuments/McBride_Research_Brief.pdf, May 15, 2020.
    [110] Gaines Jr S O, Panter A, Lyde M D, Steers W N, Rusbult C E, Cox C L, et al. Evaluating the circumplexity of interpersonal traits and the manifestation of interpersonal traits in interpersonal trust. Journal of Personality and Social Psychology, 1997, 73(3): 610−610
    [111] Looije R, Neerincx M A, Cnossen F. Persuasive robotic assistant for health self-management of older adults: Design and evaluation of social behaviors. International Journal of Human-Computer Studies, 2010, 68(6): 386−397
    [112] Szalma J L, Taylor G S. Individual differences in response to automation: The five factor model of personality. Journal of Experimental Psychology: Applied, 2011, 17(2): 71−71
    [113] Balfe N, Sharples S, Wilson J R. Understanding is key: an analysis of factors pertaining to trust in a real-world automation system. Human factors, 2018, 60(4): 477−495
    [114] Rajaonah B, Anceaux F, Vienne F. Trust and the use of adaptive cruise control: a study of a cut-in situation. Cognition, Technology & Work, 2006, 8(2): 146−155
    [115] Fan X, Oh S, McNeese M, Yen J, Cuevas H, Strater L, et al. The influence of agent reliability on trust in human-agent collaboration. In: Proceedings of the 15th European conference on Cognitive ergonomics: the ergonomics of cool interaction. New York, USA: Association for Computing Machinery, 2008. 2130−2134.
    [116] Sanchez J, Rogers W A, Fisk A D, Rovira E. Understanding reliance on automation: effects of error type, error distribution, age and experience. Theoretical issues in ergonomics science, 2014, 15(2): 134−160
    [117] Riley V. Operator reliance on automation: Theory and data. Automation and human performance: Theory and applications, 1996, 7(12): 19−35
    [118] Lee J D, Moray N. Trust, self-confidence, and operators' adaptation to automation. International journal of human-computer studies, 1994, 40(1): 153−184
    [119] Perkins L, Miller J E, Hashemi A, Burns G. Designing for human-centered systems: Situational risk as a factor of trust in automation. In: Proceedings of the human factors and ergonomics society annual meeting. Los Angeles, USA: SAGE, 2010. 2130−2134.
    [120] Bindewald J M, Rusnock C F, Miller M E. Measuring Human Trust Behavior in Human-Machine Teams. Cham: Springer, 2018. 47−58.
    [121] Biros D P, Daly M, Gunsch G. The influence of task load and automation trust on deception detection. Group Decis Negot, 2004, 13(2): 173−189
    [122] Workman M. Expert decision support system use, disuse, and misuse: a study using the theory of planned behavior. Computers in Human Behavior, 2005, 21(2): 211−231
    [123] Jian J-Y, Bisantz A M, Drury C G. Foundations for an empirically determined scale of trust in automated systems. International journal of cognitive ergonomics, 2000, 4(1): 53−71
    [124] Buckley L, Kaye S-A, Pradhan A K. Psychosocial factors associated with intended use of automated vehicles: A simulated driving study. Accident Analysis & Prevention, 2018, 115(45): 202−208
    [125] Mayer R C, Davis J H. The effect of the performance appraisal system on trust for management: A field quasi-experiment. Journal of applied psychology, 1999, 84(1): 123−123
    [126] Madsen M, Gregor S. Measuring human-computer trust. In: Proceedings of the 11th Australasian conference on information systems. Brisbane, Australia: Australasian Association for Information Systems, 2000. 6−8.
    [127] Chien S-Y, Semnani-Azad Z, Lewis M, Sycara K. Towards the development of an inter-cultural scale to measure trust in automation. In: Proceedings of the International conference on cross-cultural design. Cham, Switzerland: Springer, 2014. 35−36.
    [128] Garcia D, Kreutzer C, Badillo-Urquiola K, Mouloua M. Measuring trust of autonomous vehicles: a development and validation study. In: Proceedings of the International Conference on Human-Computer Interaction. Cham, Switzerland: Springer, 2015. 610−615.
    [129] Yagoda R E, Gillan D J. You want me to trust a ROBOT? The development of a human–robot interaction trust scale. International Journal of Social Robotics, 2012, 4(3): 235−248
    [130] Dixon S R, Wickens C D. Automation reliability in unmanned aerial vehicle control: A reliance-compliance model of automation dependence in high workload. Human factors, 2006, 48(3): 474−486
    [131] Chiou E K, Lee J D. Beyond reliance and compliance: human-automation coordination and cooperation. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting. Los Angeles, CA, USA: SAGE, 2015. 159−199.
    [132] Bindewald J M, Rusnock C F, Miller M E. Measuring human trust behavior in human-machine teams. In: Proceedings of the International Conference on Applied Human Factors and Ergonomics. Cham, Switzerland: Springer, 2017. 47−58.
    [133] Chancey E T, Bliss J P, Yamani Y, Handley H A H. Trust and the compliance–reliance paradigm: The effects of risk, error bias, and reliability on trust and dependence. Human factors, 2017, 59(3): 333−345
    [134] Hergeth S, Lorenz L, Vilimek R, Krems J F. Keep your scanners peeled: Gaze behavior as a measure of automation trust during highly automated driving. Human factors, 2016, 58(3): 509−519
    [135] Gremillion G M, Metcalfe J S, Marathe A R, Paul, V J, Christensen J, Drnec K, et al. Analysis of trust in autonomy for convoy operations. In: Proceedings of the Micro-and Nanotechnology Sensors, Systems, and Applications VIII. Washington, USA: SPIE, 2016. 9836−9838.
    [136] Basu C, Singhal M. Trust dynamics in human autonomous vehicle interaction: A review of trust models. In: Proceedings of the AAAI Spring Symposium Series. California, USA: AAAI, 2016. 238−245.
    [137] Hester M, Lee K, Dyre B P. “Driver Take Over”: A Preliminary Exploration of Driver Trust and Performance in Autonomous Vehicles. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting. Los Angeles, CA, USA: SAGE, 2017. 1969−1973.
    [138] De Visser E J, Monfort S S, Goodyear K, Lu L, O’Hara M, Lee M R, et al. A little anthropomorphism goes a long way: Effects of oxytocin on trust, compliance, and team performance with automated agents. Human factors, 2017, 59(1): 116−133
    [139] Gaudiello I, Zibetti E, Lefort S, Chetouani M, Ivaldi S. Trust as indicator of robot functional and social acceptance. An experimental study on user conformation to iCub answers. Computers in Human Behavior, 2016, 61(67): 633−655
    [140] Desai M, Kaniarasu P, Medvedev M, Steinfeld A, Yanco H. Impact of robot failures and feedback on real-time trust. In: Proceedings of the 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI). New York, USA: IEEE, 2013. 251−258.
    [141] Payre W, Cestac J, Delhomme P. Fully automated driving: Impact of trust and practice on manual control recovery. Human factors, 2016, 58(2): 229−241
    [142] Khalid H M, Shiung L W, Nooralishahi P, Rasool Z, Martin G, Kiong C L, Ai-vyrn C. Exploring psycho-physiological correlates to trust: Implications for human-robot-human interaction. In: Proceedings of the human factors and ergonomics society annual meeting. Los Angeles, CA, USA: SAGE, 2016. 697−701.
    [143] Gold C, Körber M, Hohenberger C, Lechner D, Bengler K. Trust in automation–Before and after the experience of take-over scenarios in a highly automated vehicle. Procedia Manufacturing, 2015, 3(53): 3025−3032
    [144] Walker F, Verwey W, Martens M. Gaze behaviour as a measure of trust in automated vehicles. In: Proceedings of the 6th Humanist Conference. Washington, DC, USA: HUMANIST, 2018. 117−123.
    [145] Adolphs R. Trust in the brain. Nature Neuroscience, 2002, 5(3): 192−193
    [146] Delgado M R, Frank R H, Phelps E A. Perceptions of moral character modulate the neural systems of reward during the trust game. Nature Neuroscience, 2005, 8(11): 1611−1618
    [147] King-Casas B, Tomlin D, Anen C, Camerer C F, Quartz S R, Montague P R. Getting to know you: reputation and trust in a two-person economic exchange. Science, 2005, 308(5718): 78−83
    [148] Krueger F, McCabe K, Moll J, Kriegeskorte N, Zahn R, Strenziok M, et al. Neural correlates of trust. Proceedings of the National Academy of Sciences, 2007, 104(50): 20084−20089
    [149] Long Y, Jiang X, Zhou X. To believe or not to believe: trust choice modulates brain responses in outcome evaluation. Neuroscience, 2012, 200(28): 50−58
    [150] Minguillon J, Lopez-Gordo M A, Pelayo F. Trends in EEG-BCI for daily-life: Requirements for artifact removal. Biomedical Signal Processing and Control, 2017, 31(36): 407−418
    [151] Choo S, Sanders N, Kim N, Kim W, Nam C S, Fitts E P. Detecting Human Trust Calibration in Automation: A Deep Learning Approach. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting. Los Angeles, CA, USA: SAGE, 2019. 88−90.
    [152] Morris D M, Erno J M, Pilcher J J. Electrodermal Response and Automation Trust during Simulated Self-Driving Car Use. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting. Los Angeles, CA, USA: SAGE, 2017. 1759−1762.
    [153] Drnec K, Marathe A R, Lukos J R, Metcalfe J S. From trust in automation to decision neuroscience: applying cognitive neuroscience methods to understand and improve interaction decisions involved in human automation interaction. Frontiers in Human Neuroscience, 2016, 10(64): 290−297
    [154] 刘伟. 人机融合智能的现状与展望. 国家治理, 2019, 4(04): 7−15

    LIU Wei. Current situation and prospects of human-machine fusion intelligence. Governance, 2019, 4(04): 7−15
    [155] 许为. 五论以用户为中心的设计: 从自动化到智能时代的自主化以及自动驾驶车. 应用心理学, 2020, 26(02): 108−128 doi: 10.3969/j.issn.1006-6020.2020.02.002

    XU Wei. User-Centered Design (V): From Automation to the Autonomy and Autonomous Vehicles in the Intelligence Era. Chinese Journal of Applied Psychology, 2020, 26(02): 108−128 doi: 10.3969/j.issn.1006-6020.2020.02.002
    [156] 王新野, 李苑, 常明, 游旭群. 自动化信任和依赖对航空安全的危害及其改进. 心理科学进展, 2017, 25(09): 1614−1622

    WANG Xin-ye, LI Yuan, CHANG Ming, YOU Xu-qun. The detriments and improvement of automation trust and dependence to aviation safety. Advances in Psychological Science, 2017, 25(09): 1614−1622
    [157] 曹清龙. 自动化信任和依赖对航空安全的危害及其改进分析. 技术与市场, 2018, 25(04): 160 doi: 10.3969/j.issn.1006-8554.2018.04.082

    CAO Qing-long. Analysis of the detriments and improvement of automation trust and dependence to aviation safety. Technology and Market, 2018, 25(04): 160 doi: 10.3969/j.issn.1006-8554.2018.04.082
    [158] Adams B D, Webb R D. Trust in small military teams. In: Proceedings of the 7th international command and control technology symposium. Virginia, USA: DTIC, 2002. 1−20.
    [159] Parasuraman R, Manzey D H. Complacency and bias in human use of automation: An attentional integration. Human Factors, 2010, 52(3): 381−410
    [160] Beggiato M, Pereira M, Petzoldt T, Krems J. Learning and development of trust, acceptance and the mental model of ACC. A longitudinal on-road study. Transportation research part F: traffic psychology and behaviour, 2015, 35(32): 75−84
    [161] Reimer B. Driver assistance systems and the transition to automated vehicles: A path to increase older adult safety and mobility? Public Policy & Aging Report, 2014, 24(1): 27−31
    [162] Large D R, Burnett G E. The effect of different navigation voices on trust and attention while using in-vehicle navigation systems. Journal of Safety Research, 2014, 49(69): 61−75
    [163] Zhang T, Tao D, Qu X, Zhang X, Zeng J, Zhu H, et al. Automated vehicle acceptance in China: Social influence and initial trust are key determinants. Transportation research part C: emerging technologies, 2020, 112(76): 220−233
    [164] Choi J K, Ji Y G. Investigating the importance of trust on adopting an autonomous vehicle. International Journal of Human-Computer Interaction, 2015, 31(10): 692−702
    [165] Kaur K, Rampersad G. Trust in driverless cars: Investigating key factors influencing the adoption of driverless cars. Journal of Engineering and Technology Management, 2018, 48(24): 87−96
    [166] Lee J D, Kolodge K. Exploring trust in self-driving vehicles through text analysis. Human Factors, 2020, 62(2): 260−277
Publication history
  • Received:  2020-06-17
  • Revised:  2020-08-11