Celebrated portrait photographers like Richard Avedon, Diane Arbus, and Martin Schoeller built their reputations on distinctive visual styles that at one time required careful control of lighting, possible only in the studio.
Now MIT researchers, and their colleagues at Adobe Systems and the University of Virginia, have developed an algorithm that could allow you to transfer those distinctive styles to your own cellphone photos.
YiChang Shih, an MIT graduate student in electrical engineering and computer science and lead author on the Siggraph paper, said "style transfer" is a thriving area of graphics research - and, with Instagram, the basis of at least one billion-dollar company. But standard style-transfer techniques tend not to work well with close-ups of faces.
They started with those filters, he said, but found that they didn't work well with human faces: our eyes are so sensitive to faces that we're intolerant of even minor errors.
Using off-the-shelf face recognition software, they first identify a portrait, in the desired style, that has characteristics similar to those of the photo to be modified. "We then find a dense correspondence - like eyes to eyes, beard to beard, skin to skin - and do this local transfer," Shih explains.
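The local-transfer idea can be sketched in a few lines. This is a hypothetical simplification, not the authors' actual method: it assumes a precomputed `correspondence` array mapping each input pixel to a matched pixel in the style example (the paper's face-matching step is not reproduced here), and it transfers only local mean and standard deviation rather than the paper's full multiscale statistics.

```python
# Hypothetical sketch of local style transfer via a dense correspondence.
# `correspondence[y, x]` gives the (row, col) of the matched style pixel;
# each pixel is shifted and scaled so its neighborhood's mean/std match
# the corresponding neighborhood in the style example.
import numpy as np

def local_style_transfer(input_img, style_img, correspondence, patch=15):
    """Match local mean/std of the input to corresponding style regions."""
    h, w = input_img.shape[:2]
    out = np.zeros_like(input_img, dtype=float)
    r = patch // 2
    for y in range(h):
        for x in range(w):
            sy, sx = correspondence[y, x]          # matched style pixel
            ipatch = input_img[max(0, y-r):y+r+1, max(0, x-r):x+r+1]
            spatch = style_img[max(0, sy-r):sy+r+1, max(0, sx-r):sx+r+1]
            # Shift/scale this pixel so local statistics match the style.
            mu_i, sd_i = ipatch.mean(), ipatch.std() + 1e-6
            mu_s, sd_s = spatch.mean(), spatch.std() + 1e-6
            out[y, x] = (input_img[y, x] - mu_i) * (sd_s / sd_i) + mu_s
    return np.clip(out, 0, 255)
```

Because each pixel is adjusted using only its matched region ("eyes to eyes, beard to beard"), the transfer respects facial structure instead of applying one correction to the whole frame.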
One consequence of local transfer, Shih says, is that the researchers' technique works much better with video than its predecessors, which used global parameters. Suppose, for instance, that a character on-screen is wearing glasses, and when she turns her head, light reflects briefly off the lenses.
That flash of light can significantly alter the global statistics of the image, and a global modification could overcompensate in the opposite direction. But with the researchers' new algorithm, the character's eyes are modified separately, so there's less variation in the rest of the image from frame to frame.
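A toy calculation (not from the paper) shows why a brief, localized highlight skews a global adjustment: a small bright patch shifts the whole-frame mean, so a global correction darkens every pixel, not just the reflection.

```python
# Toy illustration: a glasses reflection covering 1% of the frame shifts
# the global mean, and a mean-matching global correction overcompensates
# everywhere else in the image.
import numpy as np

frame = np.full((100, 100), 120.0)        # flat mid-grey frame
flash = frame.copy()
flash[40:50, 40:50] = 255.0               # reflection: 1% of pixels

target_mean = frame.mean()                # 120.0
global_correction = target_mean - flash.mean()
globally_fixed = flash + global_correction

# Every background pixel is darkened from 120 to roughly 118.65,
# even though only the small reflection patch changed.
```

A local correction, by contrast, would adjust only the region matched to the eyes, leaving the rest of the frame stable from one frame to the next.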