Shared Visualization and Collaborative Interaction Based on Multiple User Eye Tracking Data
Author: Cheng Shiwei, Shen Xiaoquan, Sun Lingyun, Hu Yilin
Affiliation:

Fund Project:

National Key Research & Development Program of China (2016YFB1001403); National Natural Science Foundation of China (61772468, 61672451)

    Abstract:

    With the development of digital image processing and computer-supported cooperative work, eye tracking has been applied to multi-user collaborative interaction. However, existing eye tracking techniques can only track a single user's gaze, and no mature computing framework exists for tracking multiple users' gaze data; moreover, the calibration process is complex, and the mechanisms for recording, transmitting, and visualizing eye tracking data need further exploration. Hence, this study proposes a new collaborative calibration method based on gradient optimization algorithms to simplify the calibration process, and then, to optimize the transmission and management of eye tracking data, a computing framework oriented to multi-user eye tracking. Furthermore, to explore how sharing visualizations of eye tracking data among multiple users affects visual attention, visualizations such as dots, clusters, and trajectories are designed, and it is validated that the dot visualization improves the efficiency of collaborative visual search tasks. Finally, a collaborative code review system is designed and built on the basis of eye tracking; during code review, it records, transmits, and visualizes eye tracking data in the forms of dots, code borders, code background highlighting, and lines connecting code segments. The user experiment shows that, compared with the condition without eye tracking data sharing, sharing eye tracking data among multiple users reduces bug search time by 20.1% and significantly improves collaborative work efficiency, which validates the effectiveness of the proposed approach.
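    The calibration method itself is not detailed on this page. As a rough, hypothetical illustration of the kind of gradient-optimization gaze calibration the abstract refers to, the Python sketch below fits a second-order polynomial mapping from pupil-center features to screen coordinates by plain gradient descent on a mean-squared-error loss. The function names, the polynomial model, and all parameters (fit_calibration, lr, steps, the 9-point grid) are illustrative assumptions, not the authors' actual algorithm.

```python
import numpy as np

def poly_features(p):
    # p: (N, 2) pupil-center coordinates -> (N, 6) second-order polynomial basis
    x, y = p[:, 0], p[:, 1]
    return np.stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2], axis=1)

def fit_calibration(pupil_xy, screen_xy, lr=0.05, steps=2000):
    """Fit W so that poly_features(pupil_xy) @ W approximates screen_xy,
    minimizing mean squared error with plain gradient descent (illustrative)."""
    X = poly_features(pupil_xy)           # (N, 6) design matrix
    W = np.zeros((X.shape[1], 2))         # (6, 2) mapping to screen x, y
    for _ in range(steps):
        err = X @ W - screen_xy           # (N, 2) residuals
        grad = 2.0 * X.T @ err / len(X)   # gradient of the MSE loss w.r.t. W
        W -= lr * grad                    # gradient descent step
    return W

# Toy usage: a hypothetical 9-point calibration grid with synthetic pupil samples.
rng = np.random.default_rng(0)
screen = np.array([[x, y] for x in (0.1, 0.5, 0.9) for y in (0.1, 0.5, 0.9)])
pupil = 0.5 * screen + 0.1 + 0.01 * rng.standard_normal(screen.shape)
W = fit_calibration(pupil, screen)
pred = poly_features(pupil) @ W
print("mean absolute calibration error:", float(np.abs(pred - screen).mean()))
```

    In a collaborative setting, each user would run such a per-user calibration once, after which the mapped screen coordinates could be exchanged and rendered as the shared dot, cluster, or trajectory visualizations described above.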

Get Citation

Cheng SW, Shen XQ, Sun LY, Hu YL. Shared visualization and collaborative interaction based on multiple user eye tracking data. Journal of Software, 2019,30(10):3037-3053 (in Chinese with English abstract)

History
  • Received: August 18, 2018
  • Revised: November 01, 2018
  • Online: May 16, 2019