# Welcome to Alpha Lab!

Pupil Labs is made up of people who are curious by nature. We are researchers, designers, toolmakers, and professional tinkerers. We enjoy building quick prototypes and demos to explore our curiosities. We built Alpha Lab so that we would have a centralized place to collect the results of our explorations and share them with the world.

Alpha Lab is not a place for official product documentation. Everything you find here should be considered a work in progress, and may even be a bit rough around the edges. That is the nature of exploration!

We encourage you to read through the results and go further - play around, build from the ideas here, hack away!

# Sharing is caring

Some of the prototypes and demos you see here came from our own home-grown curiosities. Others were inspired by questions from you - members of our community - via discussions on Discord.

If you have an idea that you want us to explore, let us know. It could be a demo using our existing tools, or an out-of-the-box idea that combines a new state-of-the-art deep learning pipeline with eye tracking.

# Show and tell

Enough talk; let’s dive in.

## AOIs

Here we demonstrate how to define areas of interest (AOIs) using data downloaded from Pupil Cloud's Reference Image Mapper.
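The core idea can be sketched in a few lines: once gaze has been mapped into the reference image's pixel coordinates, assigning each gaze point to an AOI is a simple hit test. The AOI names, rectangle coordinates, and gaze values below are illustrative placeholders, not fields from an actual Pupil Cloud export.

```python
# Minimal sketch: assign gaze points (in reference-image pixel
# coordinates) to rectangular areas of interest. All AOI names,
# rectangles, and gaze samples here are made-up example values.

# AOIs as (left, top, right, bottom) rectangles in reference-image pixels
AOIS = {
    "painting": (100, 50, 400, 300),
    "door": (500, 0, 700, 600),
}

def aoi_for(x, y):
    """Return the name of the AOI containing (x, y), or None if outside all."""
    for name, (left, top, right, bottom) in AOIS.items():
        if left <= x <= right and top <= y <= bottom:
            return name
    return None

# Example gaze samples in reference-image pixels
gaze = [(150, 100), (600, 200), (50, 400)]
print([aoi_for(x, y) for x, y in gaze])  # ['painting', 'door', None]
```

From here, counting hits per AOI gives you dwell-style metrics; the real column names in your export may differ, so check the downloaded CSV.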

## Netflix and fixate

Here we show you how to use Pupil Invisible + Pupil Cloud's Reference Image Mapper to map gaze onto dynamic on-screen content, like a video.
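One piece of the puzzle is aligning gaze timestamps with the frame of the video that was playing at that moment. A minimal sketch, assuming a known frame rate and a known timestamp for the video's first frame (both values below are illustrative, not Pupil Cloud fields):

```python
# Sketch: map a gaze timestamp to the index of the video frame that was
# on screen at that moment. FPS and VIDEO_START are assumed example
# values; in practice you would derive them from your recording.

FPS = 30.0          # video frame rate
VIDEO_START = 100.0  # timestamp (s) at which the video's first frame appeared

def frame_index(gaze_ts):
    """Index of the frame shown at gaze_ts, clamped to frame 0."""
    return max(0, int((gaze_ts - VIDEO_START) * FPS))

print(frame_index(100.0))  # 0
print(frame_index(101.5))  # 45
```

With the frame index in hand, gaze mapped onto the screen's reference image can be overlaid on the corresponding video frame.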

## RIM Room

We pushed the limits of markerless mapping with Pupil Cloud's Reference Image Mapper by scanning an entire apartment.

## Look at my hand!

Use Detectron2's DensePose model to segment a person's body and determine which body part they are looking at.
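Conceptually, once DensePose has produced a per-pixel body-part index map, finding the looked-at part is just an array lookup under the gaze point. The sketch below uses a toy array in place of real model output, and the part names in the lookup table are illustrative placeholders; consult the DensePose documentation for the actual 24-part scheme.

```python
import numpy as np

# Illustrative part labels for a few indices (0 = background in DensePose
# output). Treat these names as placeholders, not the real DensePose scheme.
PART_NAMES = {0: "background", 4: "hand"}

def looked_at_part(part_map, gaze_x, gaze_y):
    """Return the label of the body part under the gaze point."""
    idx = int(part_map[gaze_y, gaze_x])  # note: row = y, column = x
    return PART_NAMES.get(idx, f"part {idx}")

# Toy 4x4 part map standing in for DensePose output: a "hand" region
# occupies the top-left corner.
part_map = np.zeros((4, 4), dtype=int)
part_map[0:2, 0:2] = 4

print(looked_at_part(part_map, 1, 1))  # hand
print(looked_at_part(part_map, 3, 3))  # background
```

In the actual demo, the part map would come from running DensePose on the scene video frame, and the gaze point from the mapped gaze data.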