A smart security camera. | Image: Texas A&M Engineering

Smart devices are integrating into our workspaces and homes, providing convenient features such as security cameras, temperature control, motion-sensing games and more. These devices often have cameras that transmit data to the cloud, where analytics recognize different actions and send results back to the device. Recently, smart cameras have come under public scrutiny for creating hacking opportunities that compromise the privacy of the home. A balance must therefore be struck between protecting the user's privacy and maintaining the full functionality of the device.

Texas A&M University researcher Zhangyang Wang, along with his doctoral students Zhenyu Wu and Haotao Wang, has partnered with Adobe Research scientists Dr. Zhaowen Wang and Dr. Hailin Jin to find new ways to protect user privacy from video-enabled in-home devices. The collaboration with Adobe began in September 2017.

“We must reach a balance that allows people to use cloud-based services without exposing personally identifiable information,” said Zhangyang Wang, assistant professor in the computer science and engineering department.

An example of the privacy filter. | Image: Texas A&M Engineering

Texas A&M and Adobe have developed cutting-edge techniques to address this problem. Their work builds on adversarial machine learning, a research field at the intersection of machine learning and cybersecurity.

“Traditional machine learning tries to preserve and extract information — to maximize it,” Zhaowen Wang said. “Our approach is different. For example, adversarial learning can help us minimize recognizing a person’s identity while still seeing and understanding their actions.”

The team formulated a unique adversarial training framework to improve privacy-preserving visual recognition. Adversarial machine learning pits two models against each other: one tries to protect the information, while the other tries to steal it. By competing, both models learn and advance their techniques, as sketched below.
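As a rough illustration of the idea (not the team's actual code), a minimal two-model setup might look like the following PyTorch sketch. The network shapes, loss weighting, 64x64 frame size and class counts are all placeholder assumptions: a filter network is trained so that an action classifier still succeeds on filtered frames while an identity classifier, playing the attacker, fails.

```python
# A minimal sketch of the two-model adversarial idea. All architectures and
# sizes here are illustrative placeholders, not the team's published models.
import torch
import torch.nn as nn

filter_net = nn.Sequential(                     # learns to strip identity cues
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, 3, padding=1), nn.Sigmoid())
action_net = nn.Sequential(                     # utility task: recognize the action
    nn.Flatten(), nn.Linear(3 * 64 * 64, 10))
identity_net = nn.Sequential(                   # attacker: re-identify the person
    nn.Flatten(), nn.Linear(3 * 64 * 64, 100))

ce = nn.CrossEntropyLoss()
opt_filter = torch.optim.Adam(
    list(filter_net.parameters()) + list(action_net.parameters()), lr=1e-4)
opt_attacker = torch.optim.Adam(identity_net.parameters(), lr=1e-4)

def train_step(frames, action_labels, identity_labels):
    # 1) Attacker step: the identity model tries to "steal" identity
    #    information from the filtered frames.
    attack_loss = ce(identity_net(filter_net(frames).detach()), identity_labels)
    opt_attacker.zero_grad(); attack_loss.backward(); opt_attacker.step()

    # 2) Filter step: keep action recognition accurate while *maximizing*
    #    the attacker's loss, i.e. minimizing recoverable identity information.
    filtered = filter_net(frames)
    filter_loss = (ce(action_net(filtered), action_labels)
                   - ce(identity_net(filtered), identity_labels))
    opt_filter.zero_grad(); filter_loss.backward(); opt_filter.step()

# Example usage with random placeholder data:
frames = torch.rand(8, 3, 64, 64)
train_step(frames, torch.randint(10, (8,)), torch.randint(100, (8,)))
```

Alternating the two steps is what drives the competition: each attacker update makes the identity model a harder opponent, which in turn pushes the filter to remove identity cues more thoroughly.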

Their adversarial training framework learns a smart “filtering” mechanism that automatically converts a raw image into a privacy-preserving version. The learned filter can be embedded in the camera front end, so that privacy information is removed from captured images at the very beginning, before any transmission, storage or analytics. In the team's experiments, the filtered output resisted other machine-learning attacker models, and only video content free of private information passed the filter's scrutiny.
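Conceptually, on-device deployment could be as simple as the following hedged sketch. The file name, upload function and tensor layout are hypothetical; the point is that the frozen filter runs before any frame leaves the camera.

```python
# Illustrative deployment sketch (file and function names are hypothetical):
# the trained, frozen filter runs on the camera itself, so only de-identified
# frames are ever transmitted.
import torch

filter_net = torch.jit.load("privacy_filter.pt")   # hypothetical exported filter
filter_net.eval()

def upload_to_cloud(frame):
    """Placeholder for the camera's real upload path."""

def on_frame_captured(frame):                      # frame: (3, H, W) tensor in [0, 1]
    with torch.no_grad():
        safe = filter_net(frame.unsqueeze(0))[0]   # strip private information first
    upload_to_cloud(safe)                          # the raw frame is never sent
```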
A video captures a man playing golf (left) while new video privacy technology shows a non-identifiable image of the golfer (right). | Image: Texas A&M Engineering

The filtering mechanism can also be used to protect privacy in human clinical trials. “Human research studies currently pixelate faces to retain privacy, but pixelating or downsampling is an oversimplified solution,” said Zhangyang Wang. “We all know someone we could recognize merely by their body shape, clothing or body language.”
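For contrast, the naive baseline the quote criticizes fits in a few lines: pixelation is just downsampling followed by re-upsampling, which blurs faces but leaves body shape, clothing and gait visible. The block size below is an arbitrary illustrative choice.

```python
# Naive pixelation baseline: shrink then re-enlarge with nearest-neighbor
# sampling. Faces become blocks, but coarse cues like body shape survive.
import torch
import torch.nn.functional as F

def pixelate(frame, factor=8):                     # frame: (1, 3, H, W) in [0, 1]
    small = F.interpolate(frame, scale_factor=1 / factor, mode="nearest")
    return F.interpolate(small, scale_factor=factor, mode="nearest")

blocky = pixelate(torch.rand(1, 3, 64, 64))        # 8x8 pixel blocks
```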

Failing to protect an individual’s data could lead to security breaches in these scenarios. The filter adds a crucial new layer of protection, and the team's models have undergone extensive empirical experiments designed specifically to test how hackable the transmitted images are. A large-scale video dataset and a benchmark study verifying the results will be released soon.

Their research is described further in the team’s paper “Towards Privacy-Preserving Visual Recognition via Adversarial Training: A Pilot Study,” presented at the 2018 European Conference on Computer Vision (ECCV).

The researchers look forward to the software being made available as an update users can download, built into future smart home cameras, or used as data-processing software for de-identifying privacy-sensitive data.