Developers can now integrate the accessibility feature into their apps, allowing users to control the cursor with facial gestures or by moving their head. For example, they can open their mouth to move the cursor or raise their eyebrows to click and drag.
Announced during last year’s Google I/O for desktop, Project Gameface uses the device’s camera and a database of facial expressions, via the MediaPipe Face Landmark Detection API, to manipulate the cursor.
“Through the device’s camera, it seamlessly tracks facial expressions and head movements, translating them into intuitive, personalized control,” Google explained in its announcement. “Developers can now create apps where their users can configure their experience by customizing facial expressions, gesture sizes, cursor speed, and more.”
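The gesture-to-cursor mapping Google describes can be sketched in a few lines. This is a minimal, illustrative sketch only, not Gameface's actual code: the blendshape names (`jawOpen`, `browInnerUp`) match the categories MediaPipe's Face Landmarker actually emits, but the `GestureConfig` settings, thresholds, and `interpret` function are assumptions standing in for the configurable "facial expressions, gesture sizes, cursor speed, and more."

```python
# Sketch: turning per-frame facial-expression scores into cursor commands.
# Blendshape names mirror MediaPipe Face Landmarker output; everything
# else (class, thresholds, field names) is a hypothetical illustration.
from dataclasses import dataclass


@dataclass
class GestureConfig:
    """User-tunable settings: which expression triggers an action,
    how pronounced the gesture must be, and cursor speed."""
    open_mouth_threshold: float = 0.3   # "gesture size" for jawOpen
    raise_brow_threshold: float = 0.5   # "gesture size" for browInnerUp
    cursor_speed: float = 10.0          # pixels per frame while moving


def interpret(blendshapes: dict, cfg: GestureConfig) -> dict:
    """Map one frame of blendshape scores (0.0-1.0) to cursor commands."""
    move = blendshapes.get("jawOpen", 0.0) >= cfg.open_mouth_threshold
    drag = blendshapes.get("browInnerUp", 0.0) >= cfg.raise_brow_threshold
    return {
        "move": move,
        "speed": cfg.cursor_speed if move else 0.0,
        "drag": drag,
    }


# Example frame: mouth open wide, brows relaxed.
frame = {"jawOpen": 0.8, "browInnerUp": 0.1}
print(interpret(frame, GestureConfig()))
```

Because the thresholds live in a config object, an app can expose them directly as the per-user customization Google mentions.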
While Gameface was initially made for gamers, Google says it has also partnered with Inclusion — a social enterprise in India focused on accessibility — to see how they can expand it to other settings like work, school and social situations.
Project Gameface was inspired by quadriplegic video game streamer Lance Carr, who has muscular dystrophy. Carr collaborated with Google on the project, aiming to create a more affordable and accessible alternative to expensive head tracking systems.