
Developers can now integrate accessibility into their apps, allowing users to control the cursor with facial gestures or head movements. For example, they can open their mouth to move the cursor or raise their eyebrows to click and drag.

Announced for desktop at Google I/O last year, Project Gameface uses the device’s camera, a database of facial expressions, and MediaPipe’s Face Detection API to manipulate the cursor.

“Through the device’s camera, it perfectly tracks facial expressions and head movements, turning them into intuitive and personalized controls,” Google said in its announcement. “Now developers can create apps where users can configure their experience by customizing facial expressions, gesture sizes, cursor speed and more.”
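To make the customization described above concrete, here is a minimal, hypothetical sketch of how a gesture score from a face-landmark model (MediaPipe-style models report blendshape values in the range 0 to 1, e.g. `jawOpen` for an open mouth) could be mapped to cursor movement with a user-configurable threshold and cursor speed. The names, defaults, and structure are illustrative assumptions, not Gameface’s actual API.

```python
# Illustrative sketch only: maps a facial-gesture score to cursor movement.
# Assumes a face-landmark model that reports gesture ("blendshape") scores
# in [0, 1]; names like "jawOpen" and the config fields are hypothetical.
from dataclasses import dataclass

@dataclass
class GestureConfig:
    gesture: str          # blendshape name, e.g. "jawOpen" (open mouth)
    threshold: float      # score above which the gesture counts as active
    cursor_speed: float   # pixels moved per frame at full gesture strength

def cursor_delta(scores: dict[str, float], cfg: GestureConfig) -> float:
    """Return how far the cursor should move this frame.

    The score is compared against the user's threshold; the excess is
    scaled by cursor speed, so a stronger gesture moves the cursor faster.
    """
    score = scores.get(cfg.gesture, 0.0)
    if score < cfg.threshold:
        return 0.0
    return (score - cfg.threshold) * cfg.cursor_speed

cfg = GestureConfig(gesture="jawOpen", threshold=0.3, cursor_speed=100.0)
print(cursor_delta({"jawOpen": 0.8}, cfg))   # gesture active: prints 50.0
print(cursor_delta({"jawOpen": 0.1}, cfg))   # below threshold: prints 0.0
```

Exposing the threshold and speed as per-user settings is what allows the "customizing facial expressions, gesture sizes, cursor speed and more" behavior the announcement describes.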

Although Gameface was originally developed for gamers, Google said it is also partnering with Incluzza, a social enterprise in India that focuses on accessibility, to explore how the technology can be expanded to other settings such as work, school and social situations.

Project Gameface was inspired by Lance Carr, a quadriplegic video game streamer with muscular dystrophy. Carr partnered with Google on the project to create a more affordable and accessible alternative to expensive head-tracking systems.