Pupil Labs is made up of people who are curious by nature. We are researchers, designers, toolmakers, and professional tinkerers. We enjoy building quick prototypes and demos to explore our curiosities. We built Alpha Lab so that we can have a centralized place to collect the results of our explorations and to share them with the world.
Alpha Lab is not a place for official product documentation. Everything you find here should be considered a work in progress, and may even be a bit rough around the edges. That is the nature of exploration!
We encourage you to read through the results and go further - play around, build from the ideas here, hack away!
Define AOIs and Calculate Gaze Metrics
Here we demonstrate how to define areas of interest (AOIs) using data downloaded from Pupil Cloud’s Reference Image Mapper.
Map and visualize gaze onto display content using the Reference Image Mapper
Here we show you how you can use Pupil Cloud’s Reference Image Mapper to map gaze onto dynamic on-screen content - like a video.
Map gaze onto body parts using DensePose
Use Detectron2's DensePose model to segment body parts and determine which part of a person's body is being looked at.
Map and visualize gaze on multiple reference images taken from the same environment
We pushed the limits of markerless mapping with Pupil Cloud’s Reference Image Mapper - scanning an entire apartment.
Generate static and dynamic scanpaths with Reference Image Mapper
Discover how to generate static and dynamic scanpaths with Pupil Cloud's Reference Image Mapper.
Uncover gaze behaviour on phone screens with Neon
Use Neon and existing Alpha Lab content to capture and characterise viewing behaviour on mobile phone screens.
Create 3D models of your environment using Reference Image Mapper and nerfstudio
Use Pupil Cloud's Reference Image Mapper and nerfstudio to create 3D models of your environment.
A practical guide to implementing gaze contingency for assistive technology
Build gaze-contingent assistive applications with Neon!