The reason why giving biometric data is dangerous

16th November 2018 / United Kingdom

Last week TruePublica published an article entitled “UK Government Goes Full Orwellian” with its first paragraph reading: “We’ve been warning about this moment since the first day TruePublica went online. We said that the government would eventually take the biometric data of every single citizen living in Britain and use it for nefarious reasons. DNA, fingerprint, face, and even voice data will be included. But that’s not all.”


We also wrote that – “Without any obstacles put in its way, the Home Office has essentially granted itself the right to end anonymity of any type to all the people of Britain.”

That article concluded that – “a new form of identity theft will develop with biometric data added to the armoury of criminals. At the very least, the government should restrict the collation of different types of biometric data into a single database. And it should certainly require that all biometric data be stored in the most secure manner possible. Currently, it is not proposing either as the database will be available to thousands of government workers and hundreds of technology contractors.”


Just weeks after the British government announced its plans to store biometric data comes the news that fingerprints can easily be ‘faked’ with a new piece of technology.

The Guardian reports that – “Researchers have used a neural network to generate fake fingerprints that could be used as a skeleton key to break into biometric security systems. The “DeepMasterPrints” created at New York University could mimic more than one in five fingerprints in a system that should only be wrong once in a thousand. It was possible because fingerprint scanners mostly check only a partial image, and there are many common features to fingerprints. The artificial fingerprints also look normal to the human eye. The researchers compared it to a “dictionary attack” where a hacker runs a list of common passwords against a security system. They may not be able to break into a specific account, but attacking a large number of accounts produces enough successes to make it worth the effort.”
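The “dictionary attack” comparison is easy to put into numbers. The short Python sketch below is a hypothetical back-of-the-envelope calculation, not anything taken from the researchers’ paper: it simply assumes the roughly one-in-five match rate quoted above and shows how quickly the odds of at least one successful match grow as more accounts are attacked.

```python
# Hypothetical illustration only: assumes a fake "master print" fools the
# scanner for roughly 1 in 5 enrolled users, as in the figure quoted above.

def chance_of_at_least_one_match(per_account_rate: float, accounts_attacked: int) -> float:
    """Probability that a fake print matches at least one of N independent accounts."""
    return 1 - (1 - per_account_rate) ** accounts_attacked

for n in (1, 5, 10, 25):
    odds = chance_of_at_least_one_match(0.2, n)
    print(f"attacking {n:>2} accounts -> {odds:.1%} chance of at least one match")

# Prints roughly: 20.0%, 67.2%, 89.3%, 99.6%
```

The individual odds against any single account stay modest, which is exactly why the attack works at scale rather than against one chosen victim.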


Based on those insights, the researchers used a common machine learning technique called a generative adversarial network to artificially create new fingerprints that matched as many partial fingerprints as possible. To make matters worse, the neural network not only allowed them to create multiple fingerprint images, it also produced fakes that look convincingly like real fingerprints to the human eye.
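For readers curious what that technique looks like in practice, here is a minimal sketch of a generative adversarial network in Python using PyTorch. It is emphatically not the DeepMasterPrints code: the image size, network shapes and placeholder training data are illustrative assumptions, and the point is only to show the general idea the researchers describe, a generator and a discriminator trained against each other until the generated images pass for real ones.

```python
# A minimal sketch of a generative adversarial network (GAN) in PyTorch.
# NOT the researchers' DeepMasterPrints code: the network sizes, the 32x32
# image patches and the random placeholder "training data" are illustrative
# assumptions, included only to show a generator learning to produce images
# a discriminator cannot distinguish from real fingerprint scans.
import torch
import torch.nn as nn

LATENT_DIM = 64          # size of the random noise vector fed to the generator
IMG_PIXELS = 32 * 32     # a flattened grayscale patch (illustrative size only)

# Generator: turns random noise into a candidate fingerprint image.
generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 256), nn.ReLU(),
    nn.Linear(256, IMG_PIXELS), nn.Tanh(),
)

# Discriminator: scores how "real" an image looks.
discriminator = nn.Sequential(
    nn.Linear(IMG_PIXELS, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

# Placeholder data: in the real work this would be a corpus of genuine
# partial fingerprint images, not random pixels.
real_batch = torch.rand(128, IMG_PIXELS) * 2 - 1

for step in range(200):  # a real run would train far longer
    # Train the discriminator to tell real images from generated ones.
    fake_batch = generator(torch.randn(128, LATENT_DIM)).detach()
    d_loss = (loss_fn(discriminator(real_batch), torch.ones(128, 1))
              + loss_fn(discriminator(fake_batch), torch.zeros(128, 1)))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Train the generator to fool the discriminator.
    g_loss = loss_fn(discriminator(generator(torch.randn(128, LATENT_DIM))),
                     torch.ones(128, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()

# Sampling different noise vectors now yields many candidate images; the
# researchers then searched for those matching as many enrolled partial
# fingerprints as possible ("master prints").
print("final generator loss:", float(g_loss))
```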

The technology to defeat what was once thought of as impenetrable biometric security is now out there. It won’t be long before criminals use these very tools to hack their way into such systems.



The government, its agencies and the contractors that work for them cannot, in any way, keep personal data safe.

