Controlling apps with your feet is the focus of new research at the University of Waterloo.
Researchers found that slight variations in how you walk could be enough to tell augmented reality devices what you want to do.
Daniel Vogel, a computer science professor at the university, was inspired by part of his regular routine – ordering Starbucks remotely while out on a walk.
“I pull up my phone and I have mitts on. I have to take them off, I fumble with it, it’s very cold. And I thought there has to be another way,” said Vogel.
He always wants to think one step ahead, so he looked toward his feet.
“The idea is to change subtly the way you walk to control this digital information. I don’t have to take my phone out. I can leave my mittens on,” he said.
With hands occupied, something as simple as the tap of a foot or a slower stride can mimic how we’d usually select commands with our hands.
This method allows for movements that draw a little less attention than someone in a big headset waving their arms and hands around to make a selection.
The goal is to focus on normal gait gestures – intentional variations in how you walk.
But there is a fine line.
“Using your foot might be risky because in some way, it’s maybe less socially acceptable [if people exaggerate the movements] like Monty Python’s walk,” said lead author and former visiting scholar at UW Ching-Yi Tsai.
Still, the researchers appear to have found techniques that make the futuristic foot movements look rather normal. Their tests have shown the gestures can handle several different simple tasks, too.
“Scrolling through a web page or scrolling through a menu, scrolling through a list of apps, adjusting the brightness or … volume,” said Tsai.
As augmented reality glasses inch closer to mainstream use, the researchers have shown what is possible and that this concept has legs.