Matt Moss had a chance to play around with the iOS 12 developer beta. In the process, he came to a neat realization: ARKit 2.0 opened up some unexpected possibilities.
“I saw that ARKit 2 introduced eye tracking and quickly wondered if it’s precise enough to determine where on the screen a user is looking,” he explained in a Twitter direct message. “Initially, I started to build the demo to see if this level of eye tracking was even possible.”
It turns out that it most definitely is, as the video demo he tweeted shows.
Control your iPhone with your eyes. Just look at a button to select it and blink to press. Powered by ARKit 2. #ARKit #ARKit2 #WWDC #iOS pic.twitter.com/ow8TwEkC8J
— Matt Moss (@thefuturematt) June 7, 2018
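For developers curious how a demo like this is even possible, ARKit 2's face tracking exposes the raw ingredients: in iOS 12, ARFaceAnchor gained a lookAtPoint estimate of where the user's eyes converge, alongside the per-eyelid blend shapes ARKit has offered since iOS 11. The Swift sketch below is not Moss's code; the class name and the 0.8 blink threshold are illustrative, and the math to project the gaze point into screen coordinates is left out. It shows roughly what reading those signals looks like on a TrueDepth-equipped device.

import UIKit
import ARKit

// A rough sketch (not Moss's actual implementation) of reading ARKit 2's
// eye-tracking signals. Requires a device with a TrueDepth camera.
final class GazeViewController: UIViewController, ARSCNViewDelegate {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.delegate = self
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Called each time ARKit updates the tracked face.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let face = anchor as? ARFaceAnchor else { return }

        // New in ARKit 2: a point in face coordinate space estimating
        // where the user's eyes converge.
        let gaze = face.lookAtPoint

        // Blend shapes report eyelid closure from 0.0 (open) to 1.0 (shut);
        // treating both eyes past a threshold as a deliberate blink is one
        // plausible way to register a "press." The 0.8 cutoff is a guess.
        let left = face.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0
        let right = face.blendShapes[.eyeBlinkRight]?.floatValue ?? 0
        let blinked = left > 0.8 && right > 0.8

        // Mapping `gaze` into screen space to hit-test UI buttons is the
        // part a real demo adds on top; it is omitted from this sketch.
        print("gaze: \(gaze), blinked: \(blinked)")
    }
}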
As Moss was quick to point out, the potential of such tech goes well beyond catering to the extremely lazy.
“Once the demo started to work, I began to think of all the possible use cases, the most important of which being accessibility,” he wrote. “I think this kind of technology could really improve the lives of people with disabilities.”
Nope ads won’t abuse this. Nah. No way. This is fine. What could happen. https://t.co/r474HjxUNd
— Josh Pitts (@midnite_runr) June 8, 2018
And while concerns about advertisers abusing extremely precise eye tracking are legitimate, it's worth remembering that Moss didn't build the underlying technology; Apple shipped it with the new ARKit. What he did was find a way to put it to a potentially positive use.