Announced earlier this month, Google’s Project Gameface is coming to Android. It is an open-source, hands-free gaming mouse that allows differently abled people to control a computer’s cursor using facial gestures and head movements. Initially available only for PCs, it has now been expanded to the Android operating system, Google announced at its annual developer-focused conference, Google I/O. So what is Project Gameface and how does it work? Let us explain:
What is Project Gameface
On May 10, Google announced Project Gameface, an open-source project that allows developers to create software for a hands-free gaming mouse. The technology uses head tracking and facial gesture recognition to control a computer’s cursor. Google said the project was inspired by quadriplegic video game streamer Lance Carr, who has muscular dystrophy, a progressive disease that weakens muscles. According to Google, the aim of Project Gameface is to give differently abled people new means to operate devices, while remaining cost-effective for wider reach.
(Image: Project Gameface on desktop)
How Project Gameface works
Through the open-source code, Google gives developers the means to track facial expressions and head movements using a device's built-in camera or an external camera. The gestures and movements are then translated into intuitive, personalised cursor control. Using the Project Gameface code, developers can also offer users personalised configurations, with the ability to customise facial expressions, gesture sizes, cursor speed, and more.
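To illustrate the general approach, here is a minimal sketch of head-tracked cursor control in Python using MediaPipe’s Face Landmarker task, which Project Gameface builds on. This is not Gameface’s actual code: the model file name, nose-tip landmark index, and the SENSITIVITY “cursor speed” value are illustrative assumptions, and moving the cursor relies on the third-party pyautogui library.

```python
# Minimal sketch: head-tracked cursor control with MediaPipe Face Landmarker.
# Assumptions: a local 'face_landmarker.task' model file, a webcam at index 0,
# and the third-party pyautogui library for moving the OS cursor.
import cv2
import mediapipe as mp
import pyautogui
from mediapipe.tasks import python
from mediapipe.tasks.python import vision

SENSITIVITY = 2.5   # illustrative "cursor speed" setting, user-configurable
NOSE_TIP = 1        # index of the nose-tip landmark in the face mesh

options = vision.FaceLandmarkerOptions(
    base_options=python.BaseOptions(model_asset_path="face_landmarker.task"),
    num_faces=1,
)
landmarker = vision.FaceLandmarker.create_from_options(options)

screen_w, screen_h = pyautogui.size()
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    image = mp.Image(image_format=mp.ImageFormat.SRGB, data=rgb)
    result = landmarker.detect(image)
    if result.face_landmarks:
        nose = result.face_landmarks[0][NOSE_TIP]  # normalised 0..1 coords
        # Amplify head movement around the frame centre by SENSITIVITY,
        # then clamp to the screen so small head motions cover the display.
        x = (0.5 + (nose.x - 0.5) * SENSITIVITY) * screen_w
        y = (0.5 + (nose.y - 0.5) * SENSITIVITY) * screen_h
        pyautogui.moveTo(min(max(x, 1), screen_w - 2),
                         min(max(y, 1), screen_h - 2))

cap.release()
```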
Google said that it has collaborated with Incluzza, a social enterprise in India, to learn how to expand the technology beyond gaming. The company said it is working on improvements to support other abilities and settings, such as typing.
Project Gameface on Android
At its developer conference on May 14, Google announced that it is expanding Project Gameface to work on Android devices.
To work on Android smartphones and tablets, Google has replicated the same idea, incorporating a new virtual cursor into the operating system (OS) using the Android accessibility service. Google said it is leveraging MediaPipe’s Face Landmarks Detection application programming interface (API), which recognises 52 facial gestures, such as raising the left eyebrow or opening the mouth. Google uses these recognised expressions to offer control over a wide range of functions. The company said this will also give developers the option to set a different threshold for recognising each expression, offering further customisation.
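As a sketch of how per-expression thresholds might work, the snippet below reads the Face Landmarker’s blendshape scores and reports a gesture only when its score crosses a user-configured threshold. The gesture names match MediaPipe’s actual blendshape categories, but the threshold values and the print-based trigger are illustrative assumptions, not Gameface’s implementation.

```python
# Sketch: mapping MediaPipe blendshape scores to gesture triggers with
# per-gesture thresholds. Threshold values here are illustrative.
import mediapipe as mp
from mediapipe.tasks import python
from mediapipe.tasks.python import vision

# User-configurable thresholds (0..1) for each expression; a higher value
# requires a more pronounced expression before the gesture is recognised.
GESTURE_THRESHOLDS = {
    "browOuterUpLeft": 0.6,  # raise left eyebrow
    "jawOpen": 0.4,          # open mouth
    "mouthSmileLeft": 0.5,   # smile (left corner of the mouth)
}

options = vision.FaceLandmarkerOptions(
    base_options=python.BaseOptions(model_asset_path="face_landmarker.task"),
    output_face_blendshapes=True,  # enables the 52 blendshape scores
    num_faces=1,
)
landmarker = vision.FaceLandmarker.create_from_options(options)

# Detect on a single still frame for simplicity; a real app would feed
# live camera frames instead.
image = mp.Image.create_from_file("frame.png")
result = landmarker.detect(image)

if result.face_blendshapes:
    for category in result.face_blendshapes[0]:
        threshold = GESTURE_THRESHOLDS.get(category.category_name)
        if threshold is not None and category.score >= threshold:
            print(f"Gesture triggered: {category.category_name} "
                  f"(score {category.score:.2f} >= {threshold})")
```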
(Image: Project Gameface on Android)
Project Gameface: Availability
The open-source code for Project Gameface is now available on Google’s GitHub repository for developers and enterprises.
How is Google’s Project Gameface different from Apple’s Eye Tracking
Apple on May 15 announced an array of accessibility features that will be coming to its platforms later this year. These include a new Eye Tracking feature that Apple said will allow users to navigate their iPhone or iPad with just their eyes, without requiring any supplementary hardware. Apple said the feature is powered entirely by on-device capabilities, and the data it uses is stored on the device itself and not shared, even with Apple.
While Google’s Gameface offers similar capabilities, it uses head movements and facial expressions, rather than eye tracking, to control navigation. Additionally, Apple’s Eye Tracking will be built into the devices’ operating system, while Gameface is open-source code that enables third-party developers to bring the accessibility feature to their own apps and software.