User Privacy a Hot Topic
CES is one of the world’s largest technology exhibitions. For manufacturers, it is the place to show what they are working on and which products they plan to launch. Tech giant Apple is usually absent, because it organizes its own events to unveil new products; the last time Apple attended CES was in 1992. This year, however, Apple was invited to a privacy roundtable alongside executives from Facebook and Procter & Gamble. Lately, privacy has taken center stage for many consumer businesses, including Apple. During last year’s CES, Apple ran a privacy campaign: on large billboards, people could read “What happens on your iPhone, stays on your iPhone”, a reference to the famous Las Vegas slogan.
Combating Child Exploitation
During the roundtable, Apple’s Senior Director of Global Privacy, Jane Horvath, discussed the company’s commitment to helping combat child sexual exploitation. Horvath also explained that there is a constant tug of war with law enforcement agencies, which want to be able to access and scan users’ data to detect crimes and head off potential threats. Although Apple did not reveal how long it has been scanning users’ images or what methods it uses, it is certainly not doing this in secret. In a statement on Apple’s legal website, the company is upfront (the statement has since been taken offline): “Apple is dedicated to protecting children throughout our ecosystem wherever our products are used […]. As part of this commitment, Apple uses image matching technology to help find and report child exploitation. Much like spam filters in email, our systems use electronic signatures to find suspected child exploitation. We validate each match with individual review. Accounts with child exploitation content violate our terms and conditions of service, and any accounts we find with this material will be disabled.”
Hashing Technology
Several other companies, including Facebook, Twitter and Google, use an image scanning tool called PhotoDNA. The software compares photos against databases of known abuse images. To do so, it computes a distinctive hash that represents a specific image and is resistant to alterations. The photo thus leaves a kind of fingerprint, even when “hidden” inside another image. The software can analyze still images as well as video and audio files, without actually decrypting the content itself. PhotoDNA cannot mistakenly flag an innocent image of a child, as it can only identify copies of known, existing material. Unfortunately, the software doesn’t catch everything: in the past, Instagram users have managed to get around scanning software by using photos with known hashes as profile pictures and in videos.
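PhotoDNA’s exact algorithm is proprietary, so the general idea of robust image hashing can only be illustrated with a much simpler stand-in. The sketch below uses a basic “difference hash”: small edits to an image barely change its hash, so near-copies can be matched against a database of known hashes. The function names and the Hamming-distance threshold are illustrative assumptions, not part of PhotoDNA.

# Illustrative sketch only: PhotoDNA itself is proprietary. A "difference
# hash" (dHash) captures the rough structure of an image, so resized or
# recompressed copies still produce a nearly identical hash.
from PIL import Image

HASH_SIZE = 8  # yields a 64-bit hash


def dhash(path: str) -> int:
    """Compute a perceptual difference hash for the image at `path`."""
    # Grayscale and shrink: discards color and fine detail, keeps structure.
    img = Image.open(path).convert("L").resize((HASH_SIZE + 1, HASH_SIZE))
    pixels = list(img.getdata())
    bits = 0
    for row in range(HASH_SIZE):
        for col in range(HASH_SIZE):
            left = pixels[row * (HASH_SIZE + 1) + col]
            right = pixels[row * (HASH_SIZE + 1) + col + 1]
            # One bit per adjacent-pixel brightness comparison.
            bits = (bits << 1) | (1 if left > right else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Number of bits in which two hashes differ."""
    return bin(a ^ b).count("1")


def matches_known(image_hash: int, known_hashes: set[int], threshold: int = 10) -> bool:
    """Flag an image whose hash is within `threshold` bits of a known hash."""
    return any(hamming_distance(image_hash, h) <= threshold for h in known_hashes)

This is also why such matching differs from a cryptographic hash like SHA-256: changing a single pixel flips a cryptographic hash completely, while a perceptual hash changes only a few bits, so a slightly altered copy still lands within the match threshold.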
Privacy Discussion Ongoing
So far, companies like Apple have been successful in refusing to hand over users’ data, although their own track record on privacy is somewhat mixed. The question is whether scanning techniques will further erode users’ privacy; after all, they could be applied to other types of images as well. So how secure are private images, now and in the future? What happens with controversial images that are not illegal today, but that users want to keep private anyway? Or that are illegal in other countries? No doubt data security and privacy will take center stage at many more roundtables to come.