Everyday objects in the house can now easily become remote controls. You could even change the channel by moving your cup of coffee.
Christopher Clarke, a Ph.D. student at Lancaster University’s School of Computing and Communications and developer of the technology, said: “Spontaneous spatial coupling is a new approach to gesture control that works by matching movement instead of asking the computer to recognize a specific object.”
In a paper – ‘Matchpoint: Spontaneous spatial coupling of body movement for touch-less pointing’ – researchers from Lancaster University show how the novel technique allows body movement, or movement of objects, to be used to interact with screens.
How Matchpoint technology works
‘Matchpoint’ technology only requires a simple webcam.
It works by displaying moving targets that orbit around a small circular widget in the corner of the screen.
These targets correspond to different functions – such as volume, changing the channel or viewing a menu.
The user synchronizes the movement of their hand, head or an object with the direction of a target’s movement to achieve what researchers call ‘spontaneous spatial coupling’, which activates the desired function.
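The matching step described above can be illustrated with a minimal sketch. The paper does not publish its matching algorithm, so this is only an assumption: it compares the user’s motion against each orbiting target’s motion using Pearson correlation over a window of velocity samples, and couples to the best-matching target above a threshold. The function names (`pearson`, `match_target`) and the 0.9 threshold are hypothetical.

```python
import math

def pearson(a, b):
    """Pearson correlation between two equal-length sample lists."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    if sa == 0 or sb == 0:
        return 0.0
    return cov / (sa * sb)

def match_target(targets, input_vel, threshold=0.9):
    """Return the name of the orbiting target whose velocity profile
    best correlates with the tracked input, or None if none matches.

    targets:   dict of name -> list of (vx, vy) velocity samples
    input_vel: list of (vx, vy) samples for the tracked hand/object
    """
    ix = [v[0] for v in input_vel]
    iy = [v[1] for v in input_vel]
    best, best_score = None, threshold
    for name, tv in targets.items():
        tx = [v[0] for v in tv]
        ty = [v[1] for v in tv]
        # Average the per-axis correlations into one match score.
        score = (pearson(ix, tx) + pearson(iy, ty)) / 2
        if score > best_score:
            best, best_score = name, score
    return best

# Two targets orbiting in opposite directions; the hand follows "volume".
ts = [i * 0.1 for i in range(30)]
volume = [(math.cos(t), math.sin(t)) for t in ts]
channel = [(-math.cos(t), -math.sin(t)) for t in ts]
hand = [(math.cos(t), math.sin(t)) for t in ts]
matched = match_target({"volume": volume, "channel": channel}, hand)
```

Because only the *pattern* of movement is compared, any moving point — a hand, a head, or a coffee cup — can be matched, which is what lets the system skip object recognition entirely.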
Unlike existing gesture control technology, the Matchpoint software does not look for a specific body part, such as a hand. Instead, it looks for rotating movement, so it requires neither calibration nor prior knowledge of the objects being tracked.
In addition to short-term couplings, users can also link stationary objects to controls; these objects retain their control function even when left unused for prolonged periods.
For example, a mug sitting on a table could change a track on a music player when moved left or right, and a rolling toy car could be used to adjust volume.
An object loses its coupling with a control simply by being removed from the camera’s field of view.
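The coupling lifecycle described in the last few paragraphs — link an object to a control, keep the link however long the object sits still, drop it once the object leaves the frame — can be sketched as a small bookkeeping class. This is an illustration only, not the paper’s implementation; the class and method names are hypothetical.

```python
class CouplingManager:
    """Tracks which visible objects are coupled to which controls."""

    def __init__(self):
        self.couplings = {}  # object id -> control name

    def couple(self, obj_id, control):
        """Link an object (e.g. a mug) to a control (e.g. next-track)."""
        self.couplings[obj_id] = control

    def update(self, visible_ids):
        """Call once per camera frame. A stationary but visible object
        keeps its coupling; an object outside the field of view loses it."""
        for obj_id in list(self.couplings):
            if obj_id not in visible_ids:
                del self.couplings[obj_id]

    def control_for(self, obj_id):
        """Return the control an object is coupled to, or None."""
        return self.couplings.get(obj_id)

mgr = CouplingManager()
mgr.couple("mug", "next_track")
mgr.update({"mug", "toy_car"})  # mug still in view: coupling retained
mgr.update({"toy_car"})         # mug left the frame: coupling dropped
```

The design choice worth noting is that the mapping is keyed on the tracked object, not on any recognized object class, consistent with the system needing no prior knowledge of what it is tracking.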
“Our method allows for a much more user-friendly experience where you can change channels without having to put down your drink, or change your position, whether that is relaxing on the sofa or standing in the kitchen following a recipe,” Clarke said.
Researchers believe Matchpoint could also serve as an accessibility tool for people who are unable to use traditional pointing devices, such as remote controls or a mouse and keyboard.
As well as televisions, the technology can also be used with other screens. For example, YouTube videos could be easily paused and rewound on tablets.
Computer-Human Interaction and Computer Graphics
Matchpoint also allows users to manipulate images on whiteboards by using two hands to zoom in and out and rotate images. Multiple pointers can also be created to allow more than one user to point at drawings or pictures on interactive whiteboards simultaneously.
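The two-hand zoom and rotate interaction above reduces to simple geometry once both hands are tracked as points: the change in distance between the hands gives the zoom factor, and the change in the angle of the line between them gives the rotation. The sketch below shows this standard calculation; the function name and signature are assumptions, not part of the paper.

```python
import math

def two_hand_transform(p1_old, p2_old, p1_new, p2_new):
    """Given two tracked hand positions before and after a movement,
    return (scale, rotation) for the on-screen image.

    scale    > 1 means zoom in, < 1 means zoom out
    rotation is in radians, counter-clockwise positive
    """
    def dist(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])

    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])

    scale = dist(p1_new, p2_new) / dist(p1_old, p2_old)
    rotation = angle(p1_new, p2_new) - angle(p1_old, p2_old)
    return scale, rotation

# Hands move apart and the line between them turns 90 degrees:
scale, rotation = two_hand_transform((0, 0), (2, 0), (0, 0), (0, 4))
```

Here the hands end up twice as far apart, so the image doubles in size while rotating a quarter turn.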
The researchers on the paper ‘Matchpoint: Spontaneous spatial coupling of body movement for touch-less pointing’ are Christopher Clarke and Professor Hans Gellersen, both of Lancaster University’s School of Computing and Communications. The paper will be presented at the UIST2017 conference in Quebec City this October.
The Association for Computing Machinery (ACM) Symposium on User Interface Software and Technology (UIST) is a forum for innovations in Human-Computer Interfaces.
Sponsored by ACM Special Interest Groups on Computer-Human Interaction and Computer Graphics, UIST brings together people from graphical & web user interfaces, tangible & ubiquitous computing, virtual & augmented reality, multimedia, new input & output devices, and Computer-Supported Cooperative Work and Social Computing (CSCW).