Etail giant eBay has released a tool for head-based control of smartphones, dubbed HeadGaze, under an open-source licence, offering it freely to anyone working on accessibility for users with physical impairments.
The near-perfection of the smartphone is undeniable: a pocket-sized glass-and-metal slab which replaces the telephone, still camera, video camera, Walkman, notepad, portable games console, credit card, fax machine, email terminal, web kiosk, and increasingly the computer itself. It's not necessarily an accessible device, however: while most smartphones are well suited to single-handed use, those without limbs, or without the use of their limbs, are left unable to take advantage of their capabilities - and that's something eBay, of all companies, has addressed with a new tool called HeadGaze.
'Do you like shopping alone? I wish I could shop by myself, without my mother’s incessant color suggestions and best friend’s unwarranted comments on my brand preferences. As someone with extensive motor impairments, I do not have full control of my limbs,' explains Muratcan Cicek. 'Consequently, I am unable to walk or grab anything with my hands. These limitations hinder my ability to perform everyday tasks, like going to the grocery store and shopping independently - even though I have my own income.
'This year as part of my internship project at eBay, my team and I developed HeadGaze, a reusable technology library that tracks head movement on your iPhone X and starting today, the technology is available via open source on GitHub.com. The first of its kind, this technology uses Apple ARKit and the iPhone X camera to track your head motion so you can navigate your phone easily without using your hands.'
The HeadGaze system works by creating a 'virtual stylus' which follows the motion of the user's head, as read by the depth-sensing front-facing camera on the Apple iPhone X through the ARKit augmented reality application programming interface (API). The head pose, mapped into 3D space, is then used to move a cursor across the screen and to recognise gestures which trigger actions such as scrolling and clicking.
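The core idea - turning head rotation into a 2D cursor position - can be sketched in a few lines of Swift. The following is a minimal, hypothetical illustration of that mapping, not eBay's actual implementation: in a real app the pitch and yaw angles would come from ARKit's `ARFaceAnchor` transform, and the function name, angle range, and clamping behaviour here are all assumptions.

```swift
import Foundation

// Hypothetical sketch of a 'virtual stylus': map head pose angles
// (in radians, as ARKit might report them) to a 2D screen position.
struct ScreenPoint {
    let x: Double
    let y: Double
}

/// Maps head pitch (up/down) and yaw (left/right) to screen coordinates.
/// Assumes a usable head range of +/- maxAngle radians (roughly 20 degrees
/// here); angles beyond that are clamped to the screen edge. Looking
/// straight at the phone (0, 0) lands the cursor in the centre.
func cursorPosition(pitch: Double, yaw: Double,
                    screenWidth: Double, screenHeight: Double,
                    maxAngle: Double = 0.35) -> ScreenPoint {
    // Normalise an angle to [0, 1], with 0.5 meaning 'looking straight on'.
    func normalise(_ angle: Double) -> Double {
        let clamped = max(-maxAngle, min(maxAngle, angle))
        return (clamped / maxAngle + 1.0) / 2.0
    }
    // Yaw drives horizontal movement, pitch drives vertical movement.
    return ScreenPoint(x: normalise(yaw) * screenWidth,
                       y: normalise(pitch) * screenHeight)
}
```

A dwell-based 'click' could then be layered on top by firing a tap whenever the cursor stays within a small radius of a button for a fixed time, which is one common pattern in gaze-style interfaces.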
Cicek sees HeadGaze as having potential beyond accessibility, too: 'HeadGaze enables you to scroll and interact on your phone with only subtle head movements. Think of all the ways that this could be brought to life. Tired of trying to scroll through a recipe on your phone screen with greasy fingers while cooking? Too messy to follow the how-to manual on your cell phone while you’re tinkering with the car engine under the hood? Too cold to remove your gloves to use your phone? In addition to this head gazing experience, we’re exploring an experience that tracks eye movements. The fusion of these gazing experiences open up a broader possibility on defining various hands-free gestures, enabling much more interesting applications.'
November 22 2019 | 13:00