TobiiGlassesPySuite: An open-source suite for using the Tobii Pro Glasses 2 in eye-tracking studies

Abstract.

In this paper we present the TobiiGlassesPySuite, an open-source suite for using the Tobii Pro Glasses 2 wearable eye-tracker in custom eye-tracking studies. We provide a platform-independent solution for controlling the device and for managing its recordings. The software consists of Python modules, integrated into a single package and accompanied by sample scripts and recordings. The proposed solution aims at providing methods complementary to those of the manufacturer's software, allowing users to better exploit the device's capabilities and the existing software. Our suite is available for download from the repository indicated in the paper and may be used under the terms of the GNU GPL v3.0 license.

eye-tracking, human-computer interaction, Tobii Pro Glasses 2, wearable eye-tracker, wearable computing, open-source

1. Introduction

Recent developments in eye-tracking technology have produced remarkable progress, especially for wearable devices. Indeed, recent market studies predict a notable growth of interest in wearable eye-tracking technology over the next few years. The commercial products currently available are designed to look like normal eyeglasses, ensuring easy wearability and low weight, and they can track binocular eye movements at up to 200 Hz. Tobii AB, Ergoneers GmbH and Pupil Labs are some of the main companies producing wearable eye-trackers. Although these devices provide similar features, they differ in performance and design, namely: gaze sampling frequency, tracking technique, communication protocol and programming tools.

In this paper we present an open-source suite developed for the Tobii Pro Glasses 2 [tobiipro.com, 2019a], the mobile eye-tracker produced by Tobii AB. The TobiiGlassesPySuite offers a unified solution for controlling the device and for processing data in cross-platform environments. It provides experimenters with simplified methods for accessing the device and the recordings. In fact, the suite integrates the functionalities of the Tobii Pro Glasses 2 API while hiding the implementation details from the user. Our work aims at extending the functionalities provided by the manufacturer's software, making the device easier to integrate in custom eye-tracking studies.

The paper is organized as follows. In Section 2 we present related open-source solutions supporting eye-tracking research. In Section 3 we highlight the reasons that motivated the software development presented in the paper. Section 4 contains a more detailed description of the product, in terms of hardware specifications and available software tools. Section 5 presents the open-source suite, focusing more on the functionalities provided than on the implementation details, and Section 6 illustrates its use through examples. In conclusion, we discuss the current limits and future improvements of the proposed solution.

2. Related Work

Several open-source solutions have been produced over the years to support eye-tracking research and applications. The availability of high-level development tools, accessible also to non-expert programmers, has enabled researchers to share the results of their efforts in implementing prototypes of new analysis techniques and custom controllers for eye-tracking equipment. The PyGaze software, for example, is an open-source package for creating eye-tracking experiments in Python [Dalmaijer et al., 2014]. PyGaze implements methods for presenting visual and auditory stimuli and for collecting responses using standard and custom input devices. It is compatible with several commercial eye-trackers and provides developers with interfaces to implement their own custom controllers. In addition, one of the main advantages of PyGaze lies in being able to access all the libraries and frameworks already available for the Python programming language. A similar open-source framework, also written in Python, is GazeParser [Sogo, 2013]. GazeParser, originally developed for Windows, offers the possibility to record eye movements using video-based techniques, to create an eye-tracking study using the PsychoPy [Peirce, 2007] and VisionEgg [Straw, 2008] experimental control libraries, and later to extract fixations and saccades. Regarding data analysis tools, many software packages have been released to address the most common tasks, such as processing eye movement data over static and dynamic scenes, detecting and filtering artefacts, detecting gaze events, generating AOIs (Areas Of Interest) and visualizing data using heatmaps and gaze plots. Some of these frameworks are available for MATLAB (GazeAlyze [Berger et al., 2012], EyeMMV [Krassanakis et al., 2014], EALab [Andreu-Perez et al., 2016], SacLab [Cercenelli et al., 2017]), Python (PyGazeAnalyzer [Dalmaijer et al., 2014], PSOVIS [Mardanbegi et al., 2017]) and R (ETRAN-R [Zhegallo and Marmalyuk, 2015]).

3. Motivation

Wearable solutions have introduced new technical challenges in eye-tracking technology due to the user-centered point of view. Contrary to static eye-trackers, the gaze coordinates are not referred to a fixed frame of reference (a display screen, for example). The lack of a fixed frame presents a significant challenge for the analyst, who is confronted with a highly complex data stream. In mobile eye-trackers, gaze points are registered with respect to the scene camera (placed on the front of the glasses), which captures different images depending on the user's point of view. Currently, the definition of AOIs and the corresponding gaze mapping rely on computer vision algorithms that still have room for improvement in terms of robustness and ease of use. Some successful attempts have already been proposed in this direction by integrating gaze data with object recognition [Kurzhals et al., 2017; Benjamins et al., 2018; Pfeiffer et al., 2016] and machine learning algorithms [Wolf et al., 2018]. The precision and accuracy of the collected gaze data represent another critical point, because they vary depending on the calibration procedure and the target distance. In [MacInnes et al., 2018], the authors compare calibration accuracy and precision among three commercial models of wearable eye-trackers, including the Tobii Pro Glasses 2. Another important open issue is how to determine fixations and saccades with wearable devices, due to the non-static nature of the observed scene. In [Kasneci et al., 2014], the authors propose the use of Hidden Markov and Bayesian Mixture models for the online recognition of gaze events. Their results show that these probabilistic methods work well in such scenarios thanks to their ability to adapt to the viewing behavior and to changes in the scene. In [Munn et al., 2008], the authors present an automated process based on the I-VT (Velocity-Threshold Identification) gaze filter, applied to different types of scene and eye motions recorded with a mobile eye-tracker. The obtained results show encouraging performance with respect to manual coding. Tobii provides commercial analysis software (the Tobii Pro Lab Analyzer [tobiipro.com, 2019b]) that extracts filtered gaze data using the Tobii I-VT Fixation filter (Velocity-Threshold Identification Gaze Filter) [Olsen, 2012], although its use with the Tobii Pro Glasses 2 is not recommended.

Considering the expected growth of interest in wearable eye-tracking technology over the next few years and the number of possible application fields, we believe that open-source solutions may facilitate access to this technology and significantly help the scientific community to bring out new solutions. In terms of applications, the Tobii Pro Glasses 2 has proved suitable for use in various fields. Recently, it has been employed in research areas such as Cognitive and Social Psychology [Szulewski et al., 2015; Ioannidou et al., 2017; Arai et al., 2017; Rogers et al., 2018], Visual Attention [Jensen et al., 2017; Willemse and Wykowska, 2018], Clinical Research [Monem and Fillmore, 2017; King et al., 2018], and Training [Sanchez-Ferrer et al., 2017].

Our work aims at providing open-source tools for experimenters who want to exploit the capabilities of the Tobii Pro Glasses 2 in their research field. Developers can access the source code, modify it and distribute it according to their needs, following the terms and conditions of the GNU GPL v3.0 license. The TobiiGlassesPySuite is written in Python and is cross-platform, so it does not require any commercial software and can be installed on the most common operating systems. The suite provides easy programming tools and examples to facilitate the integration of the device in custom experimental studies. Another aspect that motivated our choices is that the proposed solution can benefit from the many tools already available for Python, including scientific computing, machine learning, data analysis and computer vision libraries. Moreover, the TobiiGlassesPySuite is compatible with PsychoPy [Peirce, 2007] and integrated in PyGaze [Dalmaijer et al., 2014], extending its native features with others for the development and post-analysis of eye-tracking experiments.

Due to the diffusion of the Tobii Pro Glasses 2 and the lack of any open-source suite for managing the device, we expect a growing interest from the research community in the proposed solution and potentially significant contributions from developers. More technical details can be found in the GitHub repository [De Tommaso, 2019d].

Figure 1. Software architecture of the open-source suite

4. Tobii Pro Glasses 2: A general overview

The Tobii Pro Glasses 2 consists of two main units: the Head Unit, containing the sensors, and the Recording Unit, an embedded system. The Head Unit has the structure of normal eyeglasses and includes a set of infrared projectors and infrared cameras for tracking both pupil positions simultaneously, using corneal reflection and dark pupil techniques. The Recording Unit is connected to the Head Unit through an HDMI cable. It is able to stream video and data over the network (WiFi or Ethernet) and stores the recordings on a removable SD card. For a more detailed description of the technical specifications of the product, please refer to the official documentation available at [tobiipro.com, 2019a].

The device is also supported by the Tobii Pro Glasses 2 API, a programming interface for accessing all the live data streamed by the Tobii Pro Glasses 2, as well as basic functionalities such as managing calibrations, recordings, projects and participants.

5. TobiiGlassesPySuite: A Python package for the Tobii Pro Glasses 2

In this section we present the open-source solution we developed for interfacing with the Tobii Pro Glasses 2. The source code, available from [De Tommaso, 2019d], is compatible with Python 2.x and 3.x and is implemented to be platform-independent. Our solution has been tested on Windows, GNU/Linux and Mac OS systems. The software is packaged for pip and available from the Python Package Index (PyPI) repository, which makes the installation relatively simple, requiring only a single shell command. The suite consists of two main parts: a stand-alone module for controlling the device (e.g. calibrating, recording, data streaming), namely the TobiiGlassesPyController, and a set of modules for parsing, extracting and filtering the data (Livedata, Gazedata, Recordings, Filters, Exporters and Events). These functionalities are provided by Python classes developed following the OOP (Object Oriented Programming) paradigm. All these modules are based on and implemented on top of the Tobii Pro Glasses 2 API (see Figure 1). In this way, any future change in the official API will not affect the user code, but only the involved classes of the suite. The same API is used by the manufacturer's software for accessing the device and storing the data; this spares us from performing comparisons between our solution and the manufacturer's software, because the API operations are managed internally by the device. In the following sections the modules of the suite are presented in more detail.

Table 1. Features provided by the existing software (Tobii Pro Glasses Controller) and by the proposed solution (TobiiGlassesPyController). The compared features are: GUI, live view with mapped gaze, logged events, Tobii Pro Lab custom events, cross-platform support and data streaming.

5.1. TobiiGlassesPyController

Tobii provides free-of-charge software for controlling the Tobii Pro Glasses 2, namely the Tobii Pro Glasses Controller. It allows users to access a set of features, such as recording, calibrating and live viewing, through a GUI (Graphical User Interface). The complete list of features can be found in [tobiipro.com, 2019a]. More experienced users may find some important limitations in this software that may preclude the use of the device in interactive scenarios or custom eye-tracking experiments. Firstly, the software is compatible only with Windows systems. Secondly, the recordings stored on the SD card of the Recording Unit are not trivial to retrieve. In fact, data are stored in folders named with unique identifiers generated by the Tobii Pro Glasses 2 API, of which the non-expert user is not necessarily aware. Moreover, the recordings are not saved in human-readable formats (CSV or text files), but in the form of JSON objects. Thirdly, gaze data are not accessible on-line during a recording. In order to address these limitations we developed the TobiiGlassesPyController, a Python wrapper that uses the functionalities of the Tobii Pro Glasses 2 API, the same way the manufacturer's software does, and integrates some additional features. The controller is also available from its own repository [De Tommaso, 2019b]. These features, summarized in Table 1, are illustrated by the examples in Section 6. In addition to the controller, Tobii also provides commercial software for analyzing data, the Tobii Pro Lab Analyzer [tobiipro.com, 2019b]. A relevant feature implemented in the TobiiGlassesPyController allows sending specific JSON messages during a recording that will be shown in the Tobii Pro Lab Analyzer as custom events. This feature is of particular interest for experimenters who need to analyze different conditions separately within the same recording.

5.2. Livedata and Gazedata

As mentioned in the previous section, data processed by the Tobii Pro Glasses 2 are stored on the SD card in the form of JSON objects, according to the models described in the Tobii Pro Glasses 2 API. They are located in a file named livedata.json.gz, which contains all the samples processed during a recording. Each sample includes a timestamp (ts) and a status flag (s) indicating the presence of any anomalies. The Livedata module parses the JSON objects, recognizing their type and converting them into specific Python objects. The Gazedata module, in turn, maintains the gaze data in more efficient structures ordered by timestamp, discarding invalid or expired samples.
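As an illustration of the underlying format, the following minimal sketch (not part of the suite) reads the raw JSON objects with the Python standard library only; it assumes one JSON object per line, and the interpretation of the status flag (s == 0 meaning no anomaly) is an assumption to be checked against the API documentation.

import gzip
import json

# Inspect the raw live data stored by the Recording Unit
# (assumed here: one JSON object per line).
with gzip.open('livedata.json.gz', 'rt') as f:
    for line in f:
        sample = json.loads(line)
        # Every sample carries a timestamp 'ts' and a status flag 's'.
        # Assumption: s == 0 indicates a valid sample with no anomalies.
        if sample.get('s', 0) == 0:
            print(sample['ts'], sample)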

5.3. Importing and exporting the recordings

The Recordings module provides methods for automatically importing the recordings from the SD card. Recording objects contain information about the project name, participant name, recorded video, gaze positions, gaze directions, head movements, logged events, and so on. The Exporter module, instead, implements the mechanisms for exporting the gaze data in CSV format. Specifically, the user can decide to export raw data or filtered data. Raw data consist of a list of ordered samples as they are received by the TobiiGlassesPyController, while filtered data are the output of one or more filtering functions (such as logged event filters, timestamp filters or fixation/saccade filters). Currently, only the I-DT fixation filter described in [Salvucci and Goldberg, 2000] is implemented.
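To clarify the idea behind the dispersion-threshold approach, the following self-contained sketch implements a basic I-DT pass in the spirit of [Salvucci and Goldberg, 2000]; it is not the suite's FilterDT class, and the sample format (timestamp, x, y) and the threshold units are only illustrative.

def idt_fixations(samples, dispersion_threshold, duration_threshold):
    """Basic I-DT sketch. 'samples' is a list of (timestamp, x, y) tuples
    sorted by timestamp; thresholds use the same units as the samples."""
    fixations = []
    i = 0
    while i < len(samples):
        # Initialize a window spanning at least the minimum fixation duration.
        j = i
        while j < len(samples) and samples[j][0] - samples[i][0] < duration_threshold:
            j += 1
        if j >= len(samples):
            break
        window = samples[i:j + 1]
        xs = [s[1] for s in window]
        ys = [s[2] for s in window]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) <= dispersion_threshold:
            # Extend the window while the dispersion stays below the threshold.
            while j + 1 < len(samples):
                xs.append(samples[j + 1][1])
                ys.append(samples[j + 1][2])
                if (max(xs) - min(xs)) + (max(ys) - min(ys)) > dispersion_threshold:
                    break
                j += 1
            window = samples[i:j + 1]
            duration = window[-1][0] - window[0][0]
            centroid = (sum(s[1] for s in window) / len(window),
                        sum(s[2] for s in window) / len(window))
            fixations.append((window[0][0], duration, centroid))
            i = j + 1
        else:
            # Dispersion too high: drop the first sample and try again.
            i += 1
    return fixations

The FilterDT class used later in Listing 2 exposes the same two parameters (dispersion_threshold and duration_threshold).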

6. Examples on how to use the suite

In the following, we present Python examples aimed at helping the user start using the TobiiGlassesPySuite for controlling the eye-tracker and for managing the recordings. These scripts are publicly available and can be found in [De Tommaso, 2019a] and [De Tommaso, 2019c]. Moreover, in order to explain how to design an eye-tracking study with the TobiiGlassesPySuite, a full example of an experiment with recordings is provided.

 1  TG = TobiiGlassesPyController()
 2  # TG = TobiiGlassesPyController("192.168.71.50")
 3  # TG = TobiiGlassesPyController("fe80::fffe:ffff:ff00:ff00%eth0")
 4
 5  pjt_name = input("Project's name: ")
 6  pjt_id = TG.create_project(pjt_name)
 7
 8  ppt_name = input("Participant's name: ")
 9  ppt_id = TG.create_participant(pjt_id, ppt_name)
10
11  calib_id = TG.create_calibration(pjt_id, ppt_id)
12  input("Press ENTER to start calibrating")
13  TG.start_calibration(calib_id)
14
15  res = TG.wait_until_calibration_is_done(calib_id)
16
17  if res == False:
18      print("Calibration failed!")
19      exit(1)
20
21  rec_id = TG.create_recording(ppt_id)
22
23  input("Press ENTER to start recording")
24
25  TG.start_streaming()
26  TG.start_recording(rec_id)
27
28  TG.send_logged_event("recording_start")
29  TG.send_tobiipro_event("recording_event", "start")
30
31  while True:
32      print("Press 's' to stop the recording, 'g' to get data")
33      c = sys.stdin.read(1)
34      if c == 's':
35          break
36      elif c == 'g':
37          TG.get_data()
38
39  TG.send_logged_event("recording_stop")
40  TG.send_tobiipro_event("recording_event", "stop")
41
42  TG.stop_recording(rec_id)
43  TG.stop_streaming()
Listing 1: A Python example showing some basic functionalities of the TobiiGlassesPyController (import statements are omitted for brevity)

6.1. Connecting through the network (WLAN/LAN)

The script connect.py shows how to connect to the glasses. When the TobiiGlassesPyController constructor is used without any parameter (see Listing 1, line 1), a discovery process is started. Specifically, the controller periodically sends discovery packets through all the available network interfaces until a response from the glasses is captured; that response contains the network address of the glasses. Once the discovery process has terminated successfully, the network address of the eye-tracker can be specified as an argument of the TobiiGlassesPyController constructor to make subsequent connections faster. As shown in the sample script in Listing 1, line 2 uses an IPv4 address in the case of a WLAN connection, while line 3 specifies an IPv6 address and the network interface alias (eth0) in the case of a LAN connection.

6.2. Data streaming

The script streaming.py shows how to access live data. Specifically, the methods start_streaming() and stop_streaming() allow the developer to control the streaming mode of the eye-tracker (see also Listing 1, lines 25 and 43). The TobiiGlassesPyController collects the gaze data in a Python dictionary, making them accessible through the method get_data() (see also Listing 1, line 37).
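As a minimal polling sketch (assuming the controller object TG from Listing 1 has already been created), gaze samples could be read as follows; the dictionary keys used below ('gp' and its nested 'ts'/'gp' fields) are assumptions about the live data layout and should be verified by inspecting the dictionary returned by get_data().

import time

TG.start_streaming()
try:
    for _ in range(100):
        data = TG.get_data()
        # Assumed layout: data['gp'] holds the latest 2D gaze point on the
        # scene camera image together with its timestamp.
        gp = data.get('gp', {})
        print(gp.get('ts'), gp.get('gp'))
        time.sleep(0.01)
finally:
    TG.stop_streaming()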

6.3. Managing projects, calibrations, participants and recordings

The script calibrate_and_record.py is a complete example of managing recordings. According to the API, a recording is associated with a single participant, belongs to a specific project and requires a successful calibration process. The example shows how to create a project and a participant profile, how to complete a calibration and, finally, how to control a recording. The script also allows accessing the absolute path where the recordings are stored on the SD card. The functionalities just mentioned are present in Listing 1 as well. To retrieve information about recordings that are already stored, we implemented a set of methods for retrieving information about projects, recordings and segments, as shown in the examples 01_get_projects.py, 02_get_recordings.py and 03_get_segments.py.
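A hypothetical usage sketch of these retrieval methods is given below; the method names simply mirror the example script names and are assumptions, so the example scripts should be consulted for the exact API.

# Hypothetical calls mirroring 01_get_projects.py and 02_get_recordings.py;
# method names are assumptions, not the verified API.
for project_id in TG.get_projects():
    print("Project:", project_id)
    for recording_id in TG.get_recordings(project_id):
        print("  Recording:", recording_id)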

6.4. Live video scene and mapped gaze

Live video streaming of the scene camera is accessible through an RTSP (Real-Time Streaming Protocol) server running on the Recording Unit. The live_scene.py example uses OpenCV [Bradski, 2000] to show the on-line video from the scene camera, while livescene_and_gaze.py shows how to map the gaze data onto the video on-line. On the other hand, the example video_and_gaze.py implements the off-line mapping between video and gaze data.
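For reference, a minimal OpenCV sketch for displaying the live scene video might look as follows; the RTSP port and path are assumptions, and live_scene.py in the examples repository shows the exact endpoint exposed by the Recording Unit (the IPv4 address reuses the one from Listing 1, line 2).

import cv2

# Assumed RTSP endpoint of the Recording Unit; check live_scene.py for the
# exact URL (address, port and path).
cap = cv2.VideoCapture("rtsp://192.168.71.50:8554/live/scene")

while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
    cv2.imshow("Tobii Pro Glasses 2 - scene camera", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()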

6.5. Sending custom events

During a recording it is possible to send external triggers that indicate the occurrence of specific events. The API allows sending specific JSON objects containing information about these events. We implemented two mechanisms for sending custom events: the first lets the user store this information as logged events in the CSV files (Listing 1, lines 28 and 39), while the second additionally makes the custom events available in Tobii Pro Lab (Listing 1, lines 29 and 40).
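For instance, an experimenter could mark the boundaries of a condition with both mechanisms, using the same methods shown in Listing 1; the labels below are purely illustrative.

# Mark the start and end of an experimental condition so that it can be
# isolated later in the exported CSV files and in Tobii Pro Lab.
TG.send_logged_event("condition_A_start")
TG.send_tobiipro_event("condition", "A_start")
# ... present the stimuli belonging to condition A ...
TG.send_logged_event("condition_A_stop")
TG.send_tobiipro_event("condition", "A_stop")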

from tobiiglasses.recordings import Recording
from tobiiglasses.filters.fixationsDT import FilterDT

rec = Recording(project_dir='data/', project_id='xejxnds', recording_id='k3l4jms')
rec.exportRawData(filename='rawdata.csv')
ff = FilterDT(dispersion_threshold=5, duration_threshold=100)
rec.exportFixations(fixation_filter=ff, filename='fdata.csv')
Listing 2: A Python example showing how to export gaze data from a stored recording

6.6. Exporting data in CSV files

Once a recording is successfully stored on the SD card of the Recording Unit, the experimenter can export the data using our suite. Listing 2 shows how to export raw and fixation data from a stored recording into CSV files. The output file rawdata.csv contains all the valid samples present in livedata.json.gz, including gaze data, IMU data and logged events. On the other hand, fdata.csv contains the fixation points with their durations, as identified by the I-DT filter. Additional exporting functions are shown in the example export_data.py.
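The exported CSV files can then be loaded with any standard tool; the short pandas sketch below is illustrative only, and the column name 'duration' is an assumption to be checked against the header of the generated file.

import pandas as pd

# Load the fixations exported in Listing 2.
fixations = pd.read_csv('fdata.csv')
print(fixations.head())

# Assumption: the fixation duration is stored in a column named 'duration'.
if 'duration' in fixations.columns:
    print("Mean fixation duration:", fixations['duration'].mean())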

7. Conclusion

A Python-based open-source suite for using the Tobii Pro Glasses 2 has been presented in this paper. The availability of the source code in an open-access repository and the possibility of installing the entire suite as a single Python package make our solution ready to use and especially suitable for researchers' requirements. They can benefit from the many examples and the full example experiment provided with the suite, which aim at facilitating the design of custom eye-tracking studies. In addition, the modularity of the software architecture and the use of a development platform for hosting the source code are intended to ease future contributions from active developers, including bug fixes. The proposed solution is still under development, so there is considerable room for improvement in terms of robustness and number of implemented features. As future work, three main features are planned. Firstly, the possibility to manage AOIs in the recordings. Secondly, the possibility to export eye-tracking metrics for fixations and AOIs. Thirdly, a set of visualization tools for exporting heat maps and gaze plots.

Acknowledgements.
This project has received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant awarded to AW, titled "InStance: Intentional Stance for Social Attunement", grant agreement No. 715058).


References

  1. EALab (eye activity lab): a matlab toolbox for variable extraction, multivariate analysis and classification of eye-movement data. Neuroinformatics 14 (1), pp. 51–67. External Links: ISSN 1559-0089, Document Cited by: §2.
  2. Analysis of gaze information on actual pedestrian behavior in open space — which body part of an oncoming pedestrian do people gaze at?. In 2017 IEEE/SICE International Symposium on System Integration (SII), Vol. , pp. 704–709. External Links: Document, ISSN 2474-2325 Cited by: §3.
  3. Gazecode: open-source software for manual mapping of mobile eye-tracking data. In Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, ETRA ’18, New York, NY, USA, pp. 54:1–54:4. External Links: ISBN 978-1-4503-5706-7, Document Cited by: §3.
  4. GazeAlyze: a matlab toolbox for the analysis of eye movement data. Behavior Research Methods 44 (2), pp. 404–419. External Links: ISSN 1554-3528, Document Cited by: §2.
  5. The OpenCV Library. Dr. Dobb’s Journal of Software Tools. Cited by: §6.4.
  6. SacLab: a toolbox for saccade analysis to increase usability of eye tracking systems in clinical ophthalmology practice. Computers in Biology and Medicine 80, pp. 45 – 55. External Links: ISSN 0010-4825 Cited by: §2.
  7. PyGaze: an open-source, cross-platform toolbox for minimal-effort programming of eyetracking experiments. Behavior Research Methods 46 (4), pp. 913–921. External Links: ISSN 1554-3528, Document Cited by: §2, §3.
  8. TobiiGlassesPyController examples. GitHub. Note: www.github.com/ddetommaso/TobiiGlassesPyController-examples. Cited by: §6.
  9. TobiiGlassesPyController. GitHub. Note: www.github.com/ddetommaso/TobiiGlassesPyController. Cited by: §5.1.
  10. TobiiGlassesPySuite examples. GitHub. Note: www.github.com/ddetommaso/TobiiGlassesPySuite-examples. Cited by: §6.
  11. TobiiGlassesPySuite. GitHub. Note: www.github.com/ddetommaso/TobiiGlassesPySuite. Cited by: §3, §5.
  12. Mind your step: the effects of mobile phone use on gaze behavior in stair climbing. Journal of Technology in Behavioral Science 2 (3), pp. 109–120. External Links: ISSN 2366-5963, Document Cited by: §3.
  13. Wearable gaze trackers: mapping visual attention in 3d. In Image Analysis, P. Sharma and F. M. Bianchi (Eds.), Cham, pp. 66–76. External Links: ISBN 978-3-319-59126-1 Cited by: §3.
  14. The applicability of probabilistic methods to the online recognition of fixations and saccades in dynamic scenes. In Proceedings of the Symposium on Eye Tracking Research and Applications, ETRA ’14, pp. 323–326. External Links: ISBN 978-1-4503-2751-0, Document Cited by: §3.
  15. Using eye tracking technology to compare the effectiveness of malignant hyperthermia cognitive aid design. Korean Journal of Anesthesiology 71. External Links: Document Cited by: §3.
  16. EyeMMV toolbox: an eye movement post-analysis tool based on a two-step spatial dispersion threshold for fixation identification. Journal of Eye Movement Research 7 (1). Cited by: §2.
  17. Visual analytics for mobile eye tracking. IEEE Transactions on Visualization and Computer Graphics 23 (1), pp. 301–310. External Links: Document, ISSN 1077-2626 Cited by: §3.
  18. Wearable eye-tracking for research: automated dynamic gaze mapping and accuracy/precision comparisons across devices. bioRxiv. External Links: Document Cited by: §3.
  19. PSOVIS: an interactive tool for extracting post-saccadic oscillations from eye movement data. Cited by: §2.
  20. Measuring heightened attention to alcohol in a naturalistic setting: a validation study. Experimental and clinical psychopharmacology 25 (6), pp. 496–502 (en). External Links: ISSN 1064-1297, Document Cited by: §3.
  21. Fixation-identification in dynamic scenes: comparing an automated algorithm to manual coding. In Proceedings of the 5th Symposium on Applied Perception in Graphics and Visualization, APGV ’08, pp. 33–42. External Links: ISBN 978-1-59593-981-4, Document Cited by: §3.
  22. The Tobii I-VT fixation filter: algorithm description. Note: https://stemedhub.org/resources/2173/download/Tobii_WhitePaper_TobiiIVTFixationFilter.pdf. Cited by: §3.
  23. PsychoPy—psychophysics software in python. Journal of Neuroscience Methods 162 (1), pp. 8 – 13. External Links: ISSN 0165-0270 Cited by: §2, §3.
  24. EyeSee3D 2.0: model-based real-time analysis of mobile eye-tracking in static and dynamic three-dimensional scenes. In Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, ETRA ’16, pp. 189–196. External Links: ISBN 978-1-4503-4125-7, Document Cited by: §3.
  25. Using dual eye tracking to uncover personal gaze patterns during social interaction. Scientific Reports 8. External Links: Document Cited by: §3.
  26. Identifying fixations and saccades in eye-tracking protocols. In Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, ETRA ’00, New York, NY, USA, pp. 71–78. External Links: ISBN 1-58113-280-8, Document Cited by: §5.3.
  27. Use of eye tracking as an innovative instructional method in surgical human anatomy. Journal of Surgical Education 74 (4), pp. 668 – 673. External Links: ISSN 1931-7204 Cited by: §3.
  28. GazeParser: an open-source and multiplatform library for low-cost eye tracking and analysis. Behavior Research Methods 45 (3), pp. 684–695. External Links: ISSN 1554-3528, Document Cited by: §2.
  29. Vision egg: an open-source library for realtime visual stimulus generation. Frontiers in Neuroinformatics 2, pp. 4. External Links: Document, ISSN 1662-5196 Cited by: §2.
  30. The use of task-evoked pupillary response as an objective measure of cognitive load in novices and trained physicians: a new tool for the assessment of expertise. Academic medicine : journal of the Association of American Medical Colleges 90. External Links: Document Cited by: §3.
  31. Tobii Pro Glasses 2 product page. tobiipro.com. External Links: Link Cited by: §1, §4, §5.1.
  32. Tobii Pro Lab product page. tobiipro.com. External Links: Link Cited by: §3, §5.1.
  33. In natural interaction with embodied robots we prefer it when they follow our gaze: a gaze-contingent mobile eyetracking study. PsyArXiv. External Links: Document Cited by: §3.
  34. Automating areas of interest analysis in mobile eye tracking experiments based on machine learning. Journal of Eye Movement Research 11. External Links: Document Cited by: §3.
  35. ETRAN: R extension package for eye tracking results analysis. Perception 44 (8–9), pp. 1129–1135. Cited by: §2.