"It seems a lot of people want to buy things they see in someone else's home or in a photo, but they do not know where to look," said Sean Bell from Cornell University in the US.
The system relies on "deep learning" - a multi-layer neural network that enables a computer to match a submitted photo against a vast database of "iconic images" drawn from manufacturers' catalogues and specialised websites devoted to home furnishings.
"Deep learning" combines several layers of neurons that represent different aspects of the data - earlier layers typically represent edges and lines, middle layers represent parts and shapes, and later layers represent entire objects and concepts, they said.
Researchers used crowdsourcing to assemble a collection of images for training the neural network: they showed crowd workers scene photos and asked them to draw boxes around the objects in them.
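In practice, each crowdsourced answer amounts to a scene photo paired with labelled boxes. The record below is a hypothetical sketch of one such annotation; the field names and values are invented here, as the article does not give the actual label format.

```python
# Hypothetical sketch of a single crowdsourced annotation (invented fields).
annotation = {
    "scene_photo": "living_room_0142.jpg",
    "objects": [
        {"label": "armchair",   "box": {"x": 310, "y": 220, "width": 180, "height": 240}},
        {"label": "floor lamp", "box": {"x": 45,  "y": 90,  "width": 70,  "height": 310}},
    ],
}

# Each box pairs a region of the scene photo with an object label,
# which is the training signal the network learns from.
for obj in annotation["objects"]:
    print(obj["label"], obj["box"])
```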
Rather than force the computer to go through the entire database looking for a match, the system begins by using the neural network to generate a "fingerprint" of a submitted image, based on very broad characteristics of how the pixels are arranged, researchers said.
Then the computer can search just a local area of the database, analogous to searching for a phone number in just one area code.
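One common way to realise this kind of coarse fingerprint-then-search scheme is locality-sensitive hashing with random hyperplanes. The NumPy sketch below uses that technique purely as a stand-in, since the article does not say how the Cornell system actually computes or indexes its fingerprints; the catalogue embeddings here are random placeholders.

```python
# Illustrative sketch only: a random-hyperplane "fingerprint" narrows the
# search to one bucket of the catalogue, like dialling within one area code.
import numpy as np

rng = np.random.default_rng(0)
DIM, BITS = 128, 12
planes = rng.standard_normal((BITS, DIM))  # random hyperplanes for coarse hashing

def fingerprint(embedding):
    """Coarse binary code: which side of each hyperplane the embedding falls on."""
    return tuple((planes @ embedding > 0).astype(int))

# Build an index over a fake catalogue of product embeddings.
catalogue = rng.standard_normal((10_000, DIM))
index = {}
for i, emb in enumerate(catalogue):
    index.setdefault(fingerprint(emb), []).append(i)

# Query: fingerprint the submitted image's embedding, then rank only
# the items that share that fingerprint, not the whole catalogue.
query = catalogue[42] + 0.01 * rng.standard_normal(DIM)  # a noisy copy of item 42
candidates = index.get(fingerprint(query), [])
best = min(candidates, key=lambda i: np.linalg.norm(catalogue[i] - query), default=None)
print(best)  # very likely 42, found without scanning all 10,000 items
```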
"I am excited by the importance of this for the design industry," said Kavita Bala, professor at Cornell.