Alpha Lab

Pupil Labs is made up of people who are curious by nature. We are researchers, designers, toolmakers, and professional tinkerers. We enjoy building quick prototypes and demos to explore our curiosities. We built Alpha Lab as a central place to collect the results of our explorations and share them with the world.

Alpha Lab is not a place for official product documentation. Everything you find here should be considered a work in progress, and may even be a bit rough around the edges. That is the nature of exploration!

We encourage you to read through the results and go further - play around, build from the ideas here, hack away!

Build an AI Vision Assistant

Experiment with assistive scene understanding applications using GPT-4V (an extension of GPT-4 that can interpret images) and Pupil Labs eye tracking.


Detect Eye Blinks With Neon

Apply Pupil Labs' blink detection algorithm to Neon recordings programmatically, offline or in real time, using Pupil Labs' Real-Time Python API.
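The actual Pupil Labs algorithm works on Neon's eye videos; purely for flavor, here is a much simpler, hypothetical stand-in that thresholds a one-dimensional "eye openness" signal to find blink onsets and offsets:

```python
# Simplified, illustrative blink detector: NOT the Pupil Labs algorithm.
# We threshold a hypothetical "eye openness" signal (1.0 = fully open,
# 0.0 = closed) sampled at a fixed rate and report (onset, offset)
# sample-index pairs for each blink.

def detect_blinks(openness, threshold=0.3, min_samples=2):
    """Return (onset, offset) index pairs where the signal stays
    below `threshold` for at least `min_samples` samples."""
    blinks = []
    onset = None
    for i, value in enumerate(openness):
        if value < threshold and onset is None:
            onset = i  # signal just dropped below threshold
        elif value >= threshold and onset is not None:
            if i - onset >= min_samples:
                blinks.append((onset, i))
            onset = None
    if onset is not None and len(openness) - onset >= min_samples:
        blinks.append((onset, len(openness)))  # blink runs to end of data
    return blinks


signal = [1.0, 0.9, 0.2, 0.1, 0.15, 0.95, 1.0, 0.1, 0.05, 0.9]
print(detect_blinks(signal))  # [(2, 5), (7, 9)]
```

A real detector would also enforce minimum and maximum blink durations and smooth the signal before thresholding; the tutorial covers running the proper algorithm on Neon data.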


Build Gaze-Contingent Assistive Applications

Build your very own gaze-contingent assistive applications (such as a gaze-controlled input device) using Neon eye tracking and our real-time screen gaze package.
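As a taste of the gaze-contingent idea (independent of the real-time screen gaze package's actual API), a minimal dwell-time trigger might look like this, assuming gaze has already been mapped to screen pixel coordinates:

```python
# Illustrative dwell-time trigger for a gaze-controlled "button":
# the region activates once gaze has rested inside it continuously
# for a dwell threshold. All names here are hypothetical; this is
# not the real-time screen gaze package's API.

class DwellButton:
    def __init__(self, x, y, width, height, dwell_s=1.0):
        self.rect = (x, y, width, height)
        self.dwell_s = dwell_s
        self._enter_time = None  # timestamp when gaze entered the region

    def _contains(self, gx, gy):
        x, y, w, h = self.rect
        return x <= gx < x + w and y <= gy < y + h

    def update(self, gx, gy, timestamp):
        """Feed one gaze sample; return True once the dwell completes."""
        if not self._contains(gx, gy):
            self._enter_time = None  # gaze left: reset the dwell timer
            return False
        if self._enter_time is None:
            self._enter_time = timestamp
        return timestamp - self._enter_time >= self.dwell_s


button = DwellButton(100, 100, 200, 80, dwell_s=0.5)
samples = [(150, 130, 0.0), (160, 140, 0.3), (170, 120, 0.6)]
fired = [button.update(gx, gy, t) for gx, gy, t in samples]
print(fired)  # [False, False, True]
```

Real applications typically add visual feedback during the dwell (e.g. a shrinking ring) so users know a selection is in progress.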


Map Gaze Onto a 3D Model of an Environment

Map gaze onto a 3D model of an environment and visualise gaze patterns as 3D heatmaps using Pupil Cloud's Reference Image Mapper and Nerfstudio.


Uncover Gaze Behaviour on Phones

Capture and analyze users' viewing behaviour when focusing on small icons and features of mobile applications using Neon eye tracking alongside existing Cloud and Alpha Lab tools.


Generate Scanpath Visualisations

Generate both static and dynamic scanpath visualisations using exported data from Pupil Cloud's Reference Image Mapper.


Map Gaze Throughout an Entire Room

Use Pupil Cloud's Reference Image Mapper to map gaze onto multiple areas of an entire room as participants freely navigate around it.


Map Gaze Onto Body Parts

Map gaze behaviour on body parts that appear in the scene video of Neon or Pupil Invisible eye tracking footage.


Map Gaze Onto Dynamic Screen Content

Map and visualise gaze on a screen showing dynamic content, e.g. video playback or web browsing, using Pupil Cloud's Reference Image Mapper and screen recording software.


Define Areas of Interest and Gaze Metrics

Define areas of interest and compute gaze metrics, such as dwell time and time to first fixation, with data downloaded from Pupil Cloud's Reference Image Mapper.
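To illustrate what these metrics mean, here is a minimal sketch that computes dwell time and time to first fixation per AOI. The fixation fields and rectangular AOIs below are invented stand-ins, not the actual Reference Image Mapper export schema, which the tutorial documents:

```python
# Illustrative AOI metrics: dwell time (summed fixation durations inside
# the AOI) and time to first fixation (start time of the first fixation
# that lands inside it). Field names and AOI shapes are hypothetical.

def aoi_metrics(fixations, aois):
    """fixations: list of dicts with 'x', 'y', 'start_s', 'duration_s'
    (coordinates in reference-image pixels).
    aois: dict mapping AOI name -> (x, y, width, height)."""
    metrics = {name: {"dwell_s": 0.0, "ttff_s": None} for name in aois}
    for fix in fixations:
        for name, (ax, ay, aw, ah) in aois.items():
            if ax <= fix["x"] < ax + aw and ay <= fix["y"] < ay + ah:
                m = metrics[name]
                m["dwell_s"] += fix["duration_s"]
                if m["ttff_s"] is None:  # fixations are in time order
                    m["ttff_s"] = fix["start_s"]
    return metrics


fixations = [
    {"x": 50, "y": 50, "start_s": 0.2, "duration_s": 0.3},
    {"x": 250, "y": 60, "start_s": 0.6, "duration_s": 0.5},
    {"x": 60, "y": 40, "start_s": 1.2, "duration_s": 0.2},
]
aois = {"logo": (0, 0, 100, 100), "menu": (200, 0, 100, 100)}
print(aoi_metrics(fixations, aois))
```

In practice AOIs are often arbitrary polygons drawn on the reference image rather than rectangles, and timestamps come from the Pupil Cloud fixation export.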


Use Neon with Pupil Capture

Use your Neon module as if you were using Pupil Core. Connect it to a laptop, and record using Pupil Capture.


Undistort Video and Gaze Data

Learn how to remove scene camera lens distortion from the video and apply the same correction to gaze positions.
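The underlying idea can be sketched with the standard Brown-Conrady radial model on normalized image coordinates. A real workflow would use the scene camera's calibrated intrinsics and a library such as OpenCV; the coefficients below are invented for illustration:

```python
# Minimal sketch of radial lens undistortion (Brown-Conrady model,
# radial terms k1 and k2 only) on normalized image coordinates.
# Coefficients here are made up; real pipelines use the scene
# camera's calibration.

def distort(x, y, k1, k2):
    """Forward model: apply radial distortion to an undistorted point."""
    r2 = x * x + y * y
    scale = 1 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

def undistort(xd, yd, k1, k2, iterations=20):
    """Invert the forward model by fixed-point iteration."""
    x, y = xd, yd  # initial guess: the distorted point itself
    for _ in range(iterations):
        r2 = x * x + y * y
        scale = 1 + k1 * r2 + k2 * r2 * r2
        x, y = xd / scale, yd / scale
    return x, y


k1, k2 = -0.12, 0.03                  # invented coefficients
xd, yd = distort(0.4, -0.25, k1, k2)  # distort a known point...
xu, yu = undistort(xd, yd, k1, k2)    # ...then recover it
print(round(xu, 6), round(yu, 6))     # ~(0.4, -0.25)
```

Applying the same inverse mapping to gaze coordinates (rather than only to video pixels) is what keeps gaze and undistorted video aligned.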

Copyright 2023 Pupil Labs GmbH. All rights reserved.