This chapter does not appear in the book.
Did you ever want to control your computer's mouse cursor by simply looking at it? Guiding it from one side of the screen to the other with a flick of your eye, at the speed of thought?
To be clear, we're talking about pupil or iris tracking, with movements calculated relative to the eye's border. The idea is that as the user's pupil moves left, right, up, or down, the cursor moves with it.
Pupil tracking using a PC/laptop's webcam was the subject of my student Chonmaphat Roonnapak's final year project. He examined a variety of tracking techniques, chose the best, and wrote a game controlled by pupil/iris movements.
I'll describe a simplified version of his work in this chapter, using a Haar classifier and blob detection for the tracking, and employing the results to move a cursor inside a window. An overview of the approach is shown in the diagram below.
Chonmaphat had to give up on the idea of using his laptop's webcam as the input source, since the quality of the captured images wasn't good enough. To reliably track a pupil, the camera has to be positioned close to the eye, and be provided with plentiful, constant illumination (as seen in the top-left picture in the diagram). The ideal situation would be to attach the webcam to a helmet or cap so that it would stay mostly stationary relative to the eye as the user's head moves.
The image processing has two main stages: first, the eye is located inside the webcam image using a pre-existing Haar classifier trained to detect a left eye; second, a dark-colored blob detector finds the pupil or, more usually, the iris.
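To give a feel for the second stage, here's a toy illustration of the "dark blob" idea (this is not the OpenCV detector used in the project, and the threshold value is my own assumption): inside the detected eye region, the pupil/iris shows up as a cluster of dark pixels, so averaging the positions of all pixels below a darkness threshold approximates the blob's center.

```python
# Toy sketch of dark-blob detection on a grayscale image, represented
# as a plain 2D list of values (0 = black, 255 = white). The real code
# uses a proper blob detector; this only shows the underlying principle.

def dark_blob_center(gray, threshold=60):
    """Return the centroid (x, y) of pixels darker than threshold, or None."""
    xs, ys = [], []
    for y, row in enumerate(gray):
        for x, val in enumerate(row):
            if val < threshold:     # a 'dark' pixel, probably pupil/iris
                xs.append(x)
                ys.append(y)
    if not xs:                      # no dark pixels found at all
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# A tiny synthetic 'eye': bright background with a dark 2x2 'pupil'.
eye = [[200] * 6 for _ in range(4)]
for y in (1, 2):
    for x in (3, 4):
        eye[y][x] = 10

print(dark_blob_center(eye))   # → (3.5, 1.5)
```

A real detector also filters blobs by size and circularity so that eyelashes or shadows aren't mistaken for the pupil, but the centroid idea is the same.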
The center of the eye rectangle is treated as the 'origin', and the offset of the center of the pupil/iris rectangle from that origin is calculated. This offset is scaled to generate coordinates relative to the center of the application window (see the bottom right of the diagram), and the target cursor is drawn at that position.
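The offset-and-scale step can be sketched as follows (a minimal sketch, not Chonmaphat's actual code; the (x, y, w, h) rectangle format, the scale factors, and the clamping to the window edges are all my assumptions):

```python
# Map the pupil's offset from the eye-rectangle center (the 'origin')
# into cursor coordinates relative to the center of the application window.

def rect_center(rect):
    """Center point of an (x, y, w, h) rectangle."""
    x, y, w, h = rect
    return (x + w / 2.0, y + h / 2.0)

def cursor_position(eye_rect, pupil_rect, win_size, scale=(8.0, 8.0)):
    """Scale the pupil's offset from the eye center into window coordinates.

    eye_rect, pupil_rect -- (x, y, w, h) rectangles in webcam-image coords
    win_size             -- (width, height) of the application window
    scale                -- gain applied to the offset (an assumed tuning value)
    """
    ex, ey = rect_center(eye_rect)        # the 'origin'
    px, py = rect_center(pupil_rect)
    off_x, off_y = (px - ex, py - ey)     # pupil offset from the origin
    win_w, win_h = win_size
    # position relative to the window's center, clamped to the window
    cx = min(max(win_w / 2.0 + off_x * scale[0], 0), win_w)
    cy = min(max(win_h / 2.0 + off_y * scale[1], 0), win_h)
    return (cx, cy)

# A centered pupil maps to the middle of a 640x480 window.
print(cursor_position((100, 100, 80, 40), (130, 110, 20, 20), (640, 480)))
# → (320.0, 240.0)
```

The gain controls how far the cursor travels for a given eye movement; too high and the cursor jitters, too low and the user can't reach the window edges.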