This proof of concept demonstrates how we could use pupil tracking to investigate how people ingest information from atmospheric science visualisations.
You will need pipenv, which you can install with `pip install pipenv`. Then:

```shell
git clone [email protected]:niallrobinson/pupiltracker.git
cd pupiltracker
pipenv install
pipenv shell
python server.py
```
- In a browser, go to http://localhost:5000
- Dismiss the browser's security warning if one appears
If you want to use `analysis.py`, you will have to install Iris using Conda.
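Iris is distributed through conda-forge, so (assuming a working Conda installation) the install would look something like:

```shell
# Install Iris from the conda-forge channel into the active Conda environment
conda install -c conda-forge iris
```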
- Wait until the app has locked on to your face
- Start the app by clicking
- You can calibrate the system by looking at the mouse pointer and clicking:
- The best way to do this is to press "Hide/Display tracking" first, otherwise it's very hard not to look at the dot
- Then move the mouse, look at the mouse, and click. It's surprisingly hard to look where you think you're looking
- Make sure you do this all over the screen
- Press "Hide/Display tracking" again to check the tracking is doing something sensible
- Press "Hide/Display tracking" once more to get the controls out of the way
- Press "Measure accuracy". This pops up a box which asks you to stare at a dot in the centre of the screen for 5 s. If you feel you accidentally looked at something else, you can do it again.
- Press "Start recording" and inspect the image!
- When you're done, press the "Stop recording" button.
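The accuracy measurement above boils down to comparing gaze estimates against a known fixation point. A minimal sketch of that calculation (the sample format, function name, and screen coordinates here are illustrative assumptions, not the app's actual data structures):

```python
import math

def gaze_accuracy(samples, target):
    """Mean Euclidean distance (in pixels) between recorded gaze
    samples and the fixation target the user was asked to stare at.
    Smaller is better."""
    dists = [math.hypot(x - target[0], y - target[1]) for x, y in samples]
    return sum(dists) / len(dists)

# Hypothetical gaze samples collected while staring at the screen centre
samples = [(958, 545), (962, 538), (949, 541)]
print(gaze_accuracy(samples, target=(960, 540)))
```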
To view a heatmap of your session:

```shell
cd ./analysis
python analysis.py
```
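Under the hood, a gaze heatmap is essentially a 2-D histogram of the recorded gaze coordinates. A rough sketch of the idea (the sample format, function name, and screen dimensions are assumptions, not necessarily what `analysis.py` does):

```python
import numpy as np

def gaze_heatmap(xs, ys, screen_w=1920, screen_h=1080, bins=50):
    """Bin gaze coordinates into a 2-D grid; cells that were looked
    at more often get higher counts."""
    heat, _, _ = np.histogram2d(
        ys, xs,  # rows correspond to y, columns to x
        bins=bins,
        range=[[0, screen_h], [0, screen_w]],
    )
    return heat

xs = np.array([100, 105, 900, 910, 905])
ys = np.array([200, 210, 500, 505, 495])
heat = gaze_heatmap(xs, ys)
# Each recorded sample lands in exactly one cell
print(heat.sum())  # prints 5.0
```

The resulting grid can then be rendered with, for example, matplotlib's `imshow`, optionally smoothed with a Gaussian filter for a softer look.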
- Make sure your face/eyes are well illuminated
- A bigger screen will increase the accuracy. You can use an external monitor as long as your webcam has a good view of your face.
- Try to keep your head still and inside the box
There are two servers: one serves the HTML, the other provides an API for saving data. Start them from the root directory:

```shell
python simple-htts-server.py
python data-server.py
```
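The split keeps static file serving separate from the endpoint that persists gaze data. A minimal stdlib-only sketch of what a JSON-saving API server could look like (the route, port, payload format, and filenames are assumptions, not the actual `data-server.py`):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class SaveHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON body POSTed by the web app
        length = int(self.headers.get("Content-Length", 0))
        record = json.loads(self.rfile.read(length))
        # Append each record to a newline-delimited JSON log
        with open("gaze_log.ndjson", "a") as f:
            f.write(json.dumps(record) + "\n")
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"saved")

def run(port=5001):
    """Serve forever; stop with Ctrl-C."""
    HTTPServer(("localhost", port), SaveHandler).serve_forever()

# run()  # uncomment to start the server
```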
The web app code is located in `src`. Most JavaScript files are dependencies and standard WebGazer template code, apart from `data_logging.js`, which sets up the data logging capability. There are also some edits to `main.js`.
Analysis code is in the `analysis` directory. `generate.py` takes the local NetCDF file and generates a borderless image plot. `analyse.py` takes pupil data and creates a plot of dwell time as a function of binned data value.
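The core of that analysis is a lookup-then-bin operation: for each gaze sample, look up the data value rendered at that pixel, then accumulate the sample's dwell time into the matching value bin. A toy sketch (the array shapes, function name, and sample format are assumptions, not `analyse.py`'s actual interface):

```python
import numpy as np

def dwell_by_value(data, gaze, dt, bins=5):
    """data: 2-D array of plotted values, aligned with screen pixels.
    gaze: (N, 2) integer array of (row, col) gaze positions.
    dt: seconds represented by each gaze sample.
    Returns (dwell seconds per data-value bin, bin edges)."""
    values = data[gaze[:, 0], gaze[:, 1]]      # value under each fixation
    edges = np.linspace(data.min(), data.max(), bins + 1)
    idx = np.clip(np.digitize(values, edges) - 1, 0, bins - 1)
    dwell = np.zeros(bins)
    np.add.at(dwell, idx, dt)                  # accumulate seconds per bin
    return dwell, edges

data = np.linspace(0, 1, 100).reshape(10, 10)
gaze = np.array([[0, 0], [9, 9], [9, 8]])      # two fixations on high values
dwell, edges = dwell_by_value(data, gaze, dt=0.1)
print(dwell)  # most dwell time lands in the top bin
```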
Todo:
- investigate accuracy [DONE?]
- adapt the data saving to send a proper json and not hide stuff in the parameters [DONE]
- calibrate on mouseclick [DONE]
- heatmap creator [DONE]
- add comment box below
- add Iris to pipenv
- add heatmap processing to backend API
- add more images