The bionic hand is fitted with a camera that instantly takes a picture of the object in front of it, assesses its shape and size, and triggers the appropriate series of movements in the hand.
Bypassing the usual process, in which the user sees the object, physically stimulates the muscles in the arm and triggers a movement in the prosthetic limb, the hand 'sees' and reacts in one fluid movement.
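To make that 'glance and grasp' flow concrete, the Python sketch below outlines one plausible version of the pipeline. The camera index, the classifier object and the hand controller are illustrative assumptions, not the researchers' actual software.

```python
# Minimal sketch of the camera-to-grasp pipeline described above.
# The classifier and hand objects are hypothetical stand-ins.

import cv2  # OpenCV, used here for camera capture

GRASP_TYPES = ["palmar_neutral", "palmar_pronated", "tripod", "pinch"]  # assumed labels

def glance_and_grasp(classifier, hand):
    """Capture one frame, infer a grasp type, and command the hand."""
    cap = cv2.VideoCapture(0)          # camera mounted on the hand (assumed index 0)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        return None
    grasp = classifier.predict(frame)  # assumed to return one of GRASP_TYPES
    hand.execute(grasp)                # trigger the preset finger movements
    return grasp
```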
A small number of amputees have already trialled the new technology developed by researchers at Newcastle University in the UK.
"Using computer vision, we have developed a bionic hand which can respond automatically - in fact, just like a real hand, the user can reach out and pick up a cup or a biscuit with nothing more than a quick glance in the right direction," said Nazarpour.
"Responsiveness has been one of the main barriers to artificial limbs. For many amputees the reference point is their healthy arm or leg so prosthetics seem slow and cumbersome in comparison," he said.
Current prosthetic hands are controlled via myoelectric signals - that is, the electrical activity of the muscles recorded from the skin surface of the stump. Controlling them takes practice, concentration and time.
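For context, a conventional way such myoelectric control works is to rectify and smooth the recorded muscle signal and compare it against a threshold. The minimal sketch below illustrates that standard scheme only; the window length and threshold are arbitrary assumptions, and this is not the Newcastle team's code.

```python
# Illustrative sketch of conventional threshold-based myoelectric control.

import numpy as np

def emg_envelope(samples, window=200):
    """Rectify raw EMG samples and smooth them with a moving average."""
    rectified = np.abs(samples)
    return np.convolve(rectified, np.ones(window) / window, mode="valid")

def myoelectric_command(samples, threshold=0.3):
    """Return 'close' when smoothed muscle activity exceeds the threshold."""
    envelope = emg_envelope(np.asarray(samples, dtype=float))
    return "close" if envelope.size and envelope.max() > threshold else "idle"
```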
Using neural networks - the technique underpinning much of modern artificial intelligence - the researchers showed the computer numerous images of objects and taught it to recognise the 'grip' needed for different objects.
"We would show the computer a picture of, for example, a stick," said Ghazal Ghazaei, from at Newcastle University.
"But not just one picture, many images of the same stick from different angles and orientations, even in different light and against different backgrounds and eventually the computer learns what grasp it needs to pick that stick up," she said.
The study was published in the Journal of Neural Engineering.