How AI helps facial recognition actually get to know your face

Royal Caribbean Cruises has begun using facial recognition systems to speed passengers on their way through security and ID checks.

You and your family are at the pier, giddy to board the massive cruise ship docked nearby. Ahead lies a week of sunny beaches, indulgent buffet feasts and lounging around doing absolutely nothing.

And then you see the long lines for security, baggage and ID checks. Check-in often takes 75 minutes, and the Pool Deck seems a lifetime away.

Royal Caribbean Cruises thinks it has the answer to getting passengers aboard faster: AI-powered facial recognition.

In December, passengers started taking part in a pilot program at a company embarkation point in Ft. Lauderdale, Florida. Passengers take selfies with the company's app; then, at the port, an AI-powered system matches their faces against those photos. After a quick double-check, Royal Caribbean's staff members direct guests to their cabins.

The result: all-time high customer satisfaction.

“We wanted to turn what was a cold transaction into a really welcoming moment,” said Jay Schneider, who runs the Miami company's digital operations. The goal is to get passengers “from car to bar in 10 minutes.”

Royal Caribbean Cruises is hardly alone. Facial recognition technology is used to spot friends on Facebook and unlock your iPhone. It's been rolled out in airports, at cash registers and on home security systems. It may soon be inescapable.

Propelling the spread of facial recognition systems are huge leaps in artificial intelligence, the technology that seeks to give computers some of the ability, versatility and even creativity of human thinking. The biggest improvements have come through a specific area of AI called neural networks, inspired by the actual workings of human brain cells. Hardware and software improvements enabled an approach called deep learning, which uses multiple layers of digital neurons to provide increasingly refined image analysis.

Overall, it's a profound change. Recognizing and interpreting human faces is so important to us that whole sections of our brains are devoted to it. As we teach computers those skills, our interactions with them become more convenient: less like submitting database commands and more like dealing with the natural world in which we evolved. On the flip side, facial recognition can undercut privacy as our anonymity evaporates.

How neural networks work

In a training phase, neural networks scrutinize vast numbers of images of faces, learning on their own what's important in the recognition process. That's more accurate than the old approach, in which programmers described by hand what eyes, noses and mouths look like.

“Some layers capture color and texture and gradients,” said Amit Roy-Chowdhury, chair of electrical and computer engineering at the University of California, Riverside. “As you go deeper, they capture the shape of different parts of the object and ultimately the shape of the object itself.”
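
To make the idea concrete, here's a minimal sketch in Python using PyTorch, with made-up layer sizes, of the kind of stacked structure Roy-Chowdhury describes. It isn't any production face-recognition model; it just shows how an image passes through successive layers that end in a compact numerical summary of the face.

```python
# A minimal sketch of a layered (deep) network for face images.
# Layer sizes and the 128-dimensional output are illustrative, not the
# architecture of any real system.
import torch
import torch.nn as nn

face_net = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),   # early layers: edges, color, texture
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),  # mid layers: gradients, small patterns
    nn.MaxPool2d(2),
    nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),  # deeper layers: parts and whole shapes
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(64, 128),                                      # compact description of the face
)

selfie = torch.randn(1, 3, 112, 112)  # a stand-in for one 112x112 color photo
print(face_net(selfie).shape)         # torch.Size([1, 128])
```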

After training, neural networks create a stripped-down mathematical representation of each face. That representation can be compared rapidly with those of other faces, letting a facial recognition system decide whether a person entering an office is on an authorized employee list or raise an alert when a potential shoplifter also appears in police arrest records.
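
Here's a rough sketch, in Python with NumPy and invented names, of what that comparison step can look like: each enrolled person is stored as a compact vector, and a new face is matched by measuring how close its vector falls to the stored ones. The 0.6 threshold and 128-dimensional vectors are arbitrary choices for illustration.

```python
# Illustrative only: compare one face's vector against an enrolled list.
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(probe, gallery, threshold=0.6):
    """Return the closest enrolled name if it clears the threshold,
    otherwise report no match (a face the system doesn't know)."""
    name, score = max(((n, cosine_similarity(probe, vec)) for n, vec in gallery.items()),
                      key=lambda item: item[1])
    return (name, score) if score >= threshold else (None, score)

rng = np.random.default_rng(1)
gallery = {"employee_042": rng.normal(size=128), "employee_107": rng.normal(size=128)}
probe = gallery["employee_042"] + 0.05 * rng.normal(size=128)  # a noisy re-capture
print(best_match(probe, gallery))
```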

To work well, facial recognition systems need images with well-illuminated, clear faces that give a neural network detailed, accurate data. That's why passport photos require even lighting, plain backgrounds, neutral expressions and subjects facing straight toward the camera. “You try to make your input as consistent as possible so your analysis can be easier,” said Raj Minhas, leader of Xerox's PARC Interaction and Analytics Lab.
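
In code, that kind of cleanup might look something like the sketch below, written in Python with the Pillow imaging library. The file name and crop box are hypothetical, and real systems typically detect facial landmarks to align the face automatically rather than cropping a fixed box.

```python
# A toy normalization step: grayscale, fixed crop, even out lighting, fixed size.
from PIL import Image, ImageOps

def normalize_face(path, box, size=(112, 112)):
    """Make the input as consistent as possible before analysis."""
    img = Image.open(path).convert("L")   # drop color, keep brightness
    face = img.crop(box).resize(size)     # same framing and resolution every time
    return ImageOps.equalize(face)        # spread the brightness range evenly

# Hypothetical usage; the path and crop coordinates are placeholders.
# face = normalize_face("passenger_selfie.jpg", box=(40, 30, 200, 190))
```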

Errors in the system

Facial recognition systems are getting better, but they can still return errors. A false positive occurs when the system matches a face it shouldn't have, such as when the person's image isn't in the database at all. A false negative occurs when the system misses a match it should have made.
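
The trade-off between the two error types usually comes down to a threshold on the match score. The toy numbers below are invented, but they show the pattern: raise the threshold and false positives fall while false negatives rise.

```python
# Made-up similarity scores for genuine pairs (same person) and impostor pairs.
pairs = [
    (0.91, True), (0.84, True), (0.55, True),    # genuine pairs
    (0.62, False), (0.30, False), (0.12, False), # impostor pairs
]

def error_counts(threshold):
    false_pos = sum(1 for score, same in pairs if score >= threshold and not same)
    false_neg = sum(1 for score, same in pairs if score < threshold and same)
    return false_pos, false_neg

for t in (0.4, 0.6, 0.8):
    fp, fn = error_counts(t)
    print(f"threshold {t}: {fp} false positive(s), {fn} false negative(s)")
```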

Top-notch facial recognition systems today are 99.7 percent accurate under good lighting conditions, a 2018 study from the National Institute of Standards and Technology found.

One way to reduce errors is to tune the system so the data representing different faces is pushed farther apart, making the distinctions clearer for the neural net and reducing the likelihood of a false positive, said Marios Savvides, director of the CyLab Biometrics Center at Carnegie Mellon University.
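
One common way to "push data apart" during training, shown below as a hedged sketch in PyTorch, is a margin-based triplet loss: embeddings of the same person are pulled together while embeddings of different people are pushed at least a margin apart. This illustrates the general idea, not Savvides' specific technique.

```python
# Illustrative triplet margin loss; the margin and sizes are arbitrary.
import torch
import torch.nn.functional as F

def triplet_loss(anchor, positive, negative, margin=0.5):
    """Loss is zero only once the different person (negative) is at least
    `margin` farther from the anchor than the same person (positive)."""
    d_pos = F.pairwise_distance(anchor, positive)
    d_neg = F.pairwise_distance(anchor, negative)
    return torch.clamp(d_pos - d_neg + margin, min=0).mean()

emb = torch.randn(3, 8, 128)  # stand-in anchor, positive and negative embeddings
print(triplet_loss(emb[0], emb[1], emb[2]).item())
```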

Savvides' team is also blending modern AI with an older approach called correlation filters, which helps neural networks recognize faces that are obscured, poorly lit or turned away from the camera. The team can reconstruct faces even when they're looking away or covered by breathing masks, he said. “We live in a time where AI can surpass the human brain's capability,” he said.
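
Correlation filters, at their simplest, slide a template over an image and look for the location where the response peaks. The NumPy sketch below shows that basic mechanism with synthetic data; it is far simpler than the filters Savvides' lab actually builds, and the 64x64 "scene" here is randomly generated.

```python
# A bare-bones correlation example: find where a face-sized patch best
# matches an image by computing the cross-correlation with FFTs.
import numpy as np

def correlation_peak(image, template):
    padded = np.zeros_like(image, dtype=float)
    padded[:template.shape[0], :template.shape[1]] = template - template.mean()
    # Correlation in the spatial domain = multiply by the conjugate in the frequency domain.
    corr = np.fft.ifft2(np.fft.fft2(image - image.mean()) * np.conj(np.fft.fft2(padded))).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return peak, corr[peak]

rng = np.random.default_rng(0)
scene = rng.normal(size=(64, 64))
face_patch = scene[20:36, 24:40].copy()     # pretend this 16x16 patch is the face
print(correlation_peak(scene, face_patch))  # the peak should land at or near (20, 24)
```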

Another way to improve facial recognition is to pair it with other attributes, such as fingerprints, voice prints and other biometric data, or factors such as passwords. That might not work well when a system is just scanning people walking into a store, but it's pretty common for controlled situations where people are logging into a network.
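
A simple way to picture that pairing is score-level fusion: combine the face-match score with a second factor before deciding. The weights, threshold and the idea of a voice score below are all invented for illustration.

```python
# Toy multi-factor check: face score + voice score, gated by a password.
def grant_access(face_score, voice_score, password_ok,
                 weights=(0.6, 0.4), threshold=0.75):
    fused = weights[0] * face_score + weights[1] * voice_score
    return password_ok and fused >= threshold

print(grant_access(face_score=0.82, voice_score=0.70, password_ok=True))   # True
print(grant_access(face_score=0.82, voice_score=0.70, password_ok=False))  # False
```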
