By Jedidiah Bracy, CIPP
Representatives from Google, Facebook, Intel, face.com and others using facial detection and recognition technology say they are building privacy protections in from the start and are willing to work with others to ensure that basic fair information practices like consumer notice and choice are maintained in emerging products. Concern remains, however, that the sheer amount of personal information already available online—combined with facial recognition technology—will irreparably erode individual privacy.
Speaking at Thursday’s Federal Trade Commission (FTC) roundtable, Face Facts: A Forum on Facial Recognition Technology, Google Product Counsel Benjamin Petrosky said the company has baked privacy protections into its newest offering, Find My Face. Rolled out in conjunction with the event and available to all Google+ users by next Monday, the technology scans photos that are uploaded to its social networking site and prompts the user to tag the faces of friends in the photo.
Petrosky stressed several times that the company welcomes feedback and comments on user privacy.
Erin Egan, Facebook’s newly appointed chief privacy officer for policy, said that “as custodians of people’s photos,” the company takes individuals’ privacy “very seriously.” Egan noted, “it’s key that we educate users about the implications” of facial recognition, while adding that Facebook will take enforcement action against abusers. Egan also stressed several times that the company is embracing Privacy by Design in its products.
Yet, facial recognition is not just for tagging friends in photos.
Affective Interfaces Founder and CEO Jai Haissman discussed the company’s emotion-sensing technology, which uses a webcam to capture a user’s facial expressions and detect changes in emotion in response to a given product. As the company’s website puts it, “We help you understand how your customers are feeling about your brand, products and messaging.” For example, the software could read a user’s facial expressions while playing a video game and determine whether the user was bored; if so, the game could change in real time to reengage the user.
Discussions of emotion-sensing technology left some fearing harmful implications. Privacy Rights Clearinghouse Director Beth Givens, CIPP, said she was concerned the technology “could be used in a manipulative way.”
Haissman said the company embraces privacy, transparency and informed consent and the company would be willing to “seek counsel” to improve its privacy protections.
Additional concerns about facial recognition technology and privacy were raised by academics and privacy advocates.
Carnegie Mellon University Associate Prof. Alessandro Acquisti said that the combination of technologies—increased disclosure through social networks, facial recognition, cloud computing and statistical re-identification—makes it possible to identify not only personal information but, in some instances, sensitive information such as an individual’s Social Security number, political or sexual preferences, or even “inferable sensitive information” that could predict behavior.
Calling facial images “faceprints,” International Biometrics & Identification Association Vice Chairman Joseph Atick asserted that “a faceprint is biometric information and should be treated as personally identifiable information.”
FTC Bureau of Consumer Protection Deputy Director Jessica Rich summed up the day’s findings with optimism. She said that as facial recognition technology becomes more ingrained in products and services, more consumer awareness, control and strong built-in privacy protections are needed to ensure that privacy and innovation are cultivated.
“This workshop,” Rich said, “starts that process.”