
When I read about MasterCard's plan to do selfie security for purchases, I wondered what the first huge breach of biometric data is going to look like.

February 25, 2016

Only MasterCard knows I'm Manny the cat

In 2015, MasterCard ran its pilot program for Selfie Pay with Silicon Valley's First Tech Federal Credit Union. So I'm going to hazard a guess that the opportunities to troubleshoot for user skin color were few and far between. I say this because facial recognition technology has a well-documented problem "seeing" black people.

HP got a lot of bad press in 2009 for its cameras' inability to "see" black faces. Horrifyingly, Google's facial recognition software in 2015 tagged two African-Americans as gorillas. Google's Yonatan Zunger responded appropriately, and noted in a tweet that "until not long ago, Google Photos was confusing white faces with dogs and seals. Machine learning is complex."

Machine learning is indeed complex. So is security.

And don't let current headlines fool you — the whole selfie-security plan wasn't entirely a security-based decision.

"Selfie pay" was aimed at MasterCard's millennial customers when it was announced in July 2015. Ajay Bhalla, MasterCard's president of enterprise security solutions, told the press it was a way for the company to engage with young people. He added, "The new generation, which is into selfies … I think they'll find it cool. They'll embrace it."

Reassuringly, college students reacted to Mr. Bhalla's remarks with an appropriate amount of skepticism and mistrust. I just hope everyone in Bhalla's security chain "is into" encryption as much as selfies.

We may share your password with our advertisers

We can yell "encrypt or GTFO" at MasterCard all we want, and it won't change our other big problem with all of this: the breach that comes from inside. Meaning, when companies sell our very own data in backroom deals to greedy brokers, or let it get siphoned into government databases behind the scenes.

Did you ever think someone could sell your password to advertisers as marketable information about you? That's exactly the intersection we're approaching.

Welcome to the entirely messed-up, behind-the-scenes free-for-all of facial recognition technology in the private sector. There is nothing preventing private entities (businesses, app developers, data brokers or advertisers) from selling, trading, or otherwise profiting from an individual's biometric information. Distressingly, the U.S. government has only gotten as far as a working group to create rules around companies' use of facial recognition. Voluntary rules, that is.

This gets super-worrying when you consider there are companies, hell-bent on using every scrap of user data for profit, that are pouring money into making facial recognition both accurate and ubiquitous. Like Facebook, whose "DeepFace" project can most likely commingle with its billion-user-rich stash of tagged photos. Even though its name is a face-palm, DeepFace's ability to identify someone by photo alone is accurate up to an astonishing 97%.

Entities like Facebook are an excellent example of where facial recognition and data monetization are coming together in troubling ways. In fact, Facebook has been using facial recognition to increase the value of its data since at least 2011 — when the Electronic Privacy Information Center appealed to the FTC to "specifically prohibit the use of Facebook's biometric image database by any law enforcement agency in the world, absent a showing of adequate legal process, consistent with international human rights norms."

#NoFilter surveillance

EPIC isn't alone in its concerns about protecting consumers from facial recognition databases. At a Senate Judiciary subcommittee hearing in 2012, Sen. Al Franken remarked that "Facebook may have created the world's largest privately held database of face prints — without the explicit knowledge of its users."

Franken continued, linking the deficits in consumer protections with the FBI's then-new facial-recognition program designed to identify individuals of interest, called Next Generation Identification (NGI). "The FBI pilot could be used to not just identify protesters at political events and rallies, but to target them for selective jailing and prosecution, stifling their First Amendment rights," he said. NGI became fully operational in 2014.

MasterCard's Ajay Bhalla probably wasn't thinking about that when he was trying to get down with the kids. He also probably doesn't know that Selfie Pay could cross-match and compare quite well with commercial surveillance products like TrapWire, which is sold to and implemented by private entities, the U.S. government "and its allies overseas."

TrapWire combines various surveillance technologies with tracking and location data, individual profile histories from a variety of sources (data mining and social media) and image data analysis (such as facial recognition — TrapWire's video component) to monitor people under the guise of threat detection.

Upon the 2012 release of Wikileaks' Stratfor documents, news of TrapWire and sibling surveillance technologies (like Europe's INDECT) was met with surprise, fear, outrage, and protests. A significant number of TrapWire's and INDECT's opponents believe the surveillance systems to be direct threats to privacy and civil liberties, and that their implementation may constitute human rights violations.

MasterCard's Selfie Pay may well be opening the door to consumer-level biometric security, and — if done properly — that could be a very good thing. I just hope the methods of storing and protecting this data are as shrewd and clever as the people profiting off it by passing it around in the background.
