Did you know a motivated attacker could spy on you simply by reading the vibrations of a bag of chips or a lightbulb? Now, thanks to new research, they could also spy using just your phone’s ambient light sensor, in a very creative (and dedicated) type of attack.
The study, which recalls MIT’s bag-of-chips and lightbulb spying techniques, shows how it’s possible to use the light sensor to detect changes in the light reflected from your phone’s screen and deduce what your hands are doing.
From an IEEE report:
“In the scenario tested by Liu and his colleagues, light from the display is partially blocked by the user’s hand and reflects off their face. It is then picked up by the light sensor. From the dual photography perspective, though, the light can be imagined as traveling in the opposite direction from the sensor, with the hand casting a shadow on the display.
The team designed an inversion algorithm able to convert the readings from the light sensor into a 32-by-32-pixel image that captured the region just above the display. To test the approach, they took an off-the-shelf Samsung Galaxy View2 tablet with a 17.3-inch screen and placed a mannequin head and hand in front of it to simulate a person. They demonstrated that they were able to capture images of a variety of touch gestures, such as two-finger scrolling and three-finger pinches.
They also showed that they could capture a rough image of the user’s hand using a modified video of the cartoon characters Tom and Jerry, suggesting that illumination patterns could be concealed in videos.”
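The core idea in the quoted passage can be sketched numerically. This is a toy illustration, not the authors’ actual algorithm: assume the screen displays a sequence of known illumination patterns, and for each pattern the ambient light sensor returns one scalar reading. Stacking the readings gives a linear system whose unknown is the 32-by-32 occlusion map above the display, which can then be recovered by regularized least squares. All names and numbers below are illustrative assumptions.

```python
import numpy as np

# Toy sketch of the inversion idea (not the paper's algorithm):
# each row of A is one flattened, known illumination pattern shown on
# the screen; x is the unknown 32x32 occlusion/reflectance map of the
# hand above the display; y holds the single-pixel sensor readings.

rng = np.random.default_rng(0)
n = 32 * 32                      # one unknown per "pixel" above the display

# Hypothetical ground truth: a hand-like blob blocking part of the screen
x_true = np.zeros((32, 32))
x_true[8:24, 10:20] = 1.0
x_true = x_true.ravel()

num_patterns = 2048              # more readings than unknowns
A = rng.random((num_patterns, n))            # known illumination patterns
noise = 0.01 * rng.standard_normal(num_patterns)
y = A @ x_true + noise                       # simulated sensor readings

# Ridge-regularized least-squares inversion: x_hat = (A^T A + lam*I)^-1 A^T y
lam = 1e-2
x_hat = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)

err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative reconstruction error: {err:.3f}")
```

The reconstruction works because each reading mixes the whole scene with known weights; the slowness of the real attack comes from how many distinct patterns the screen must display (and the sensor must sample) to make the system invertible.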
Yes, it’s possible, but not easily done: both Liu and IEEE conclude that the attack is too slow and cumbersome to be practical. In the scenario they studied, the fastest they could resolve hand gestures was one every 3.3 minutes, while the Tom and Jerry video experiment took more than an hour.
Still, for a determined attacker (think state-sponsored espionage), this could be another viable vector.
Photo by Eren Li