Welcome to the eye tracking tool I am building on top of WebGazer.js! The intended use case for this tool is for gathering several trials of eye movement data from participants.
Basic steps of usage:
- Set your desired trial length with Auto-stop countdown duration.
- Set your desired measurement interval (at least 100ms for now).
- Click Start Tracking.
- Allow camera access.
- Calibrate the gaze prediction model by keeping your head inside the green square shown in the video stream, keeping your eyes on the cursor, and clicking on (for example) nine imaginary dots spread across the screen (see https://webgazer.cs.brown.edu/calibration.html for a demonstration).
- Click Begin auto-stop countdown when you are ready to start measurement, or click Stop tracking to stop tracking manually.
- After the tracking ends, click Download data and/or Download audio.
- Throughout tracking, use the Marker buttons to set markers for certain timestamps that will appear in the final CSV.
- As another form of annotating the data, use the audio recording functionality (audio recording also auto-stops at the end of the countdown).
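The sampling and marker steps above can be sketched as a small recorder. This is a minimal illustration, not the tool's actual implementation: `makeRecorder` and its method names are hypothetical, and in the real tool the prediction would come from WebGazer rather than being passed in directly.

```javascript
// Hypothetical sketch of the per-interval recording loop. The real tool
// obtains (x, y) predictions from WebGazer.js; here they are passed in.
function makeRecorder(startTime) {
  const rows = [];
  let marker = "";
  return {
    // Called once per measurement interval with the current gaze prediction.
    sample(now, prediction) {
      rows.push({
        Time: now - startTime, // ms since trial start
        X: prediction.x,
        Y: prediction.y,
        Marker: marker,
      });
      marker = ""; // a marker tags only the next recorded timestamp
    },
    // Corresponds to pressing a Marker button during tracking.
    setMarker(label) {
      marker = label;
    },
    // Build the kind of CSV the Download data button produces.
    toCsv() {
      const header = "Time,X,Y,Marker";
      const lines = rows.map((r) => `${r.Time},${r.X},${r.Y},${r.Marker}`);
      return [header, ...lines].join("\n");
    },
  };
}
```

For example, sampling twice with a marker set before the second sample yields a three-line CSV: a header row, one unmarked row, and one row carrying the marker label.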
Other important notes:
- To run an experiment, I recommend minimizing the window for this tool and dragging a separate window with your desired stimuli over and on top of it. For experimental consistency, the positions of both windows should be replicated exactly across trials so that the gaze coordinates are comparable.
- You can (but maybe shouldn't, see below) gather multiple separate trials of data without refreshing the page by clicking Start, Stop, Start, Stop, etc.
- As of now, the gaze prediction model is saved between trials, so for scientific purposes, to keep measurement conditions identical, separate trials should be gathered after refreshing the page (meaning you will, sadly, have to collect a separate CSV for each trial).
- In the CSV, the intervals between timestamps are not perfectly uniform because parts of the program have unpredictable execution times, and I'm not aware of an easy way to fix this.
- I did not create the gaze predictor, I only wrote a bit of code on top of it that might make WebGazer suitable for psychology research. All credit for the gaze predictor belongs to the developers of WebGazer.js: https://webgazer.cs.brown.edu/
- My understanding of WebGazer.js is that it uses a combination of your facial features, cursor movements, and cursor clicks to predict your gaze.
- The original intention of this project was to use WebGazer to obtain data that multifractal analysis could be applied to, but it is still unclear to us whether WebGazer is accurate enough for finding statistically significant (multifractal) patterns.
Known bugs:
- You can't record more than one audio file without refreshing the page.
- Sometimes the first few data points in a trial aren't what they should be (e.g. Time is zero for the 2nd row, or Time is very large initially), so those points should generally be excluded from analysis.
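A simple way to handle these anomalous leading rows is to drop a fixed number of initial samples and re-zero the remaining timestamps. The row shape below matches the CSV's Time column, but `trimLeading` is a hypothetical cleanup helper, not part of the tool:

```javascript
// Drop the first n (anomalous) rows of a trial and re-base the Time
// column so the first kept sample starts at t = 0.
function trimLeading(rows, n) {
  const kept = rows.slice(n);
  if (kept.length === 0) return kept;
  const t0 = kept[0].Time;
  return kept.map((r) => ({ ...r, Time: r.Time - t0 }));
}
```

For example, if the first two rows of a trial show a huge Time followed by a zero, `trimLeading(rows, 2)` discards them and shifts the remaining timestamps to start from zero.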
Citation information:
If you're using this eye-tracking tool for your research, please cite both me and WebGazer.js. Example for this site:
Jack Ryan. (2023). WebGazer Eye Tracking for Psychology Experiments. August 1, 2023 version. Retrieved from https://jackaldenryan.com/pages/eye_tracking/eye_tracking.html.
Visit the WebGazer website for their citation information.