The app uses the smartphone's built-in camera to register its environment.
By mimicking the firing of a pistol, for example, a user can switch to another browser tab, change the map's view from satellite to standard, or shoot down enemy planes in a game.
Spreading out fingers magnifies a section of a map or scrolls the page of a book forward.
The information that the app registers, such as the shape of the gesture and the visible parts of the hand, is reduced to a simple outline that is then matched against a set of stored gestures.
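The researchers have not published the algorithm itself, but the outline-and-match idea described above can be sketched roughly with off-the-shelf tools. The sketch below uses OpenCV as a stand-in; the template files, gesture names and matching threshold are illustrative assumptions, not details of the app.

```python
# Rough sketch of "reduce the hand to an outline, then match against stored
# gestures". OpenCV stands in for the researchers' own (unpublished) code.
import cv2


def hand_outline(gray_frame):
    """Reduce a grayscale frame to the largest foreground contour.

    Simplification: assumes the hand separates cleanly from the background
    with a global Otsu threshold.
    """
    _, mask = cv2.threshold(gray_frame, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # OpenCV 4 returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea) if contours else None


def classify(outline, templates, max_distance=0.3):
    """Compare an outline with stored gesture outlines via shape matching."""
    best_name, best_score = None, float("inf")
    for name, template in templates.items():
        # matchShapes returns a dissimilarity score (0 = identical shape).
        score = cv2.matchShapes(outline, template, cv2.CONTOURS_MATCH_I1, 0.0)
        if score < best_score:
            best_name, best_score = name, score
    return best_name if best_score < max_distance else None


# Stored gestures: outlines pre-computed from template images (assumed files).
templates = {
    name: hand_outline(cv2.imread(path, cv2.IMREAD_GRAYSCALE))
    for name, path in [("pistol", "pistol.png"), ("spread", "spread.png")]
}

frame = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)
print(classify(hand_outline(frame), templates))
```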
The app also recognises the hand's distance from the camera and warns the user when the hand is either too close or too far away.
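The article does not describe how the distance check is done. One simple way to approximate it, continuing the sketch above, is to treat the apparent size of the hand outline as a proxy for distance; the reference width, calibration distance and comfort range below are assumed values, not figures from the app.

```python
# Approximate "too close / too far" check using the apparent width of the
# hand outline as a proxy for distance. All constants are assumptions.
import cv2

REF_WIDTH_PX = 220.0      # hand width in pixels at the calibration distance
REF_DISTANCE_CM = 40.0    # distance at which the reference was captured
MIN_CM, MAX_CM = 25.0, 70.0


def estimated_distance_cm(outline):
    """Apparent width shrinks roughly in proportion to distance."""
    _, _, width_px, _ = cv2.boundingRect(outline)
    return REF_DISTANCE_CM * REF_WIDTH_PX / width_px


def range_warning(outline):
    """Return a warning string when the hand leaves the comfortable range."""
    distance = estimated_distance_cm(outline)
    if distance < MIN_CM:
        return "Hand too close to the camera"
    if distance > MAX_CM:
        return "Hand too far from the camera"
    return None
```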
"Many movement-recognition programmes need plenty of processor and memory power," said Hilliges, adding that their new algorithm uses a far smaller portion of computer memory and is thus ideal for smartphones.
The app's minimal processing footprint means it could also run on smart watches or in augmented-reality glasses.