Human-Computer Interaction Behavior and Intention Prediction Model Based on Eye Movement Characteristics
LIANG Yong-qiang1, WANG Wei1, QU Jue1,2, YANG Jie1, LIU Xiao-wei1
1. Air and Missile Defense College, Air Force Engineering University, Xi'an, Shaanxi 710051, China;
2. School of Aeronautics, Northwestern Polytechnical University, Xi'an, Shaanxi 710072, China
Abstract: To meet the need of predicting user intention in adaptive interfaces, this paper presents a method for classifying human-computer interaction behavior and predicting intention based on eye movement characteristics. A simplified interface model is established and the user's operating behaviors are divided into 5 categories, and a visual interaction experiment is designed to collect eye movement data for the corresponding states. The SVM (Support Vector Machine) algorithm is used to build the classification and prediction model, and the eye movement feature components are selected by difference analysis. The results show that taking the X coordinate, Y coordinate, fixation duration, saccade amplitude and pupil diameter of 3 consecutive fixation points as characteristic parameters yields good prediction performance, with a prediction accuracy above 90%.
LIANG Yong-qiang, WANG Wei, QU Jue, YANG Jie, LIU Xiao-wei. Human-Computer Interaction Behavior and Intention Prediction Model Based on Eye Movement Characteristics. Acta Electronica Sinica, 2018, 46(12): 2993-3001.
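The following is a minimal sketch, not the authors' code, of the kind of classification pipeline the abstract describes: an SVM trained on feature vectors formed from 3 consecutive fixation points, each contributing 5 eye movement features (X position, Y position, fixation duration, saccade amplitude, pupil diameter), and predicting one of the 5 interaction-behavior categories. The RBF kernel, the standardization step, and the synthetic placeholder data are assumptions for illustration only.

```python
# Sketch of an SVM-based behavior/intention classifier on eye-movement features.
# Assumptions: 3 consecutive fixations x 5 features per sample, 5 behavior classes,
# RBF kernel; the data here is synthetic and stands in for eye-tracker recordings.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

N_FIXATIONS = 3   # consecutive fixation points per sample
N_FEATURES = 5    # x, y, fixation duration, saccade amplitude, pupil diameter
N_CLASSES = 5     # the 5 operating-behavior categories

rng = np.random.default_rng(0)
# Placeholder data; in practice each row would be the concatenated features
# of 3 consecutive fixations extracted from eye-tracker recordings.
X = rng.normal(size=(500, N_FIXATIONS * N_FEATURES))
y = rng.integers(0, N_CLASSES, size=500)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

# Standardize the features, then fit a multi-class SVM (one-vs-one by default).
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```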