Thursday, July 25, 2024

Revolutionary potential, and potential abuse, of facial recognition technology developed at UNCW

The technology developed at UNCW has made some breakthroughs possible in fighting crime and terrorism. But it also has the potential for abuse. (Port City Daily photo / FILE PHOTO)

WILMINGTON — The “Face Lab” at University of North Carolina Wilmington has generated technology that has helped capture criminals and foil terrorists, and could provide life-saving breakthroughs in diagnostic technology. But that technology could also be exploited for profit or to violate privacy.

The small lab at UNCW, including this 3D imaging set-up, has been responsible for millions of dollars of funding, numerous patents and several key breakthroughs in facial recognition technology. (Port City Daily photo / BENJAMIN SCHACHTMAN)

It’s a struggle – one that’s more than philosophical. Dr. Karl Ricanek has faced it for years.

Ricanek focused his Ph.D. studies on facial recognition and got his start developing facial recognition software for the Department of Defense. In 1999 he came to UNCW, and in early 2001 he opened the “Face Lab” to research facial recognition. He also co-founded the Wilmington-based company Lapetus in 2014; the company uses Ricanek’s research to make predictions about people’s health based on “markers” on their faces.

Facial recognition – next generation crime fighting

But before Ricanek founded Lapetus, he was helping government agencies and law enforcement track down suspects. He’s developed algorithms used in some high-profile cases, though he’s not at liberty to say which ones.

Technology developed by Ricanek and his lab has helped several agencies create age-progression models for long-running cases. The work is ongoing, and some of the crime-fighting technology in development at the “Face Lab” sounds like science fiction.

“One of the things being worked on in this lab is generating a composite image of a face from a DNA sample,” Ricanek said.

The state-of-the-art research started with about 30 “significant loci” in DNA that help determine the appearance of the human face, but it has become increasingly complicated; current research looks at closer to 500 pieces of DNA information that have a direct or secondary impact on what a face looks like. But the technology is solid, Ricanek said. In theory, technology pioneered in his lab could allow law enforcement to generate the image of a victim or a perpetrator from a drop of blood or a strand of hair.

Dr. Karl Ricanek in the ‘Face Lab.’ (Port City Daily photo / BENJAMIN SCHACHTMAN)

The man who was his own nurse

Ricanek’s work with governmental agencies has led him to some interesting cases. One of Ricanek’s favorite cases – that he can talk about – is the story of the man who was his own nurse.

“Several years ago, North Carolina changed how they give you a driver’s license. They mail it to you, and most people don’t think about it, they just say, ‘it’s fine, I’ll get it in a week.’ What they’re doing is running your photograph against a database of other photographs to see if there’s a match, same photo, different names,” Ricanek said.

In one such case, an algorithm Ricanek worked on brought up two images: a man who was on disability, and the female health worker who was taking care of him.

“They could have been related, brother and sister, to have such similar faces, but that’s unusual, and you can’t usually have family members work like that — so investigators had to go and look into it. As it turns out, the female driver’s license was the man in drag. So, he was getting money for his disability, but he was also getting paid to work as his own nurse,” Ricanek said.

Good intentions, good technology?

Gender came into play again more recently, when a student showed Ricanek a YouTube video of the process of gender reassignment. The process ended in surgery, but some of the most dramatic facial changes came from the use of hormones. Ricanek was fascinated — and saw a problem. He wondered if a course of hormones could be enough to “spoof” or throw off the facial recognition technology used by agencies responsible for airport and border security.

Ricanek studied the problem and discovered it could be a cheap but effective form of what security agencies call a “presentation attack” – a way to sneak past high-tech security measures.

“It turned out, someone – say a terrorist – could use $10 worth of estrogen and fundamentally break the technology we used, at the time, at our border,” Ricanek said.

Ricanek alerted government agencies. He also wrote up his findings in a paper. Then, to his surprise, he was attacked on Twitter.

“The guy hadn’t read the article, he didn’t try to understand it. He just tweeted, ‘this guy is trying to out transgenders,’” Ricanek said.

“It’s a population that’s under attack, I totally understand that. But that was never the goal of the technology,” Ricanek said.

And that’s the problem with a lot of cutting edge technology: how and why it’s created doesn’t always guide how it’s used or perceived.

For Ricanek, creating technology is part of what he refers to – only partially jokingly – as his “addiction for solving puzzles and problems.” It’s a scientific challenge for Ricanek and his lab team. But once technology leaves the lab, there’s no saying for sure how it will be used.

Ricanek said, “It’s inevitable. Every time someone creates something really cool, someone else will come along and look for a way to exploit it. You can’t control how someone else might want to use the technology. You don’t often have someone who’s excited about the technology and the development, but also really concerned with the responsible use of it.”

Potential to help, potential for abuse

Take for example the facial recognition technology that Ricanek has helped pioneer: it’s capable of some impressive feats that could have real benefits to the medical community. Ricanek hopes that Lapetus’ facial recognition software will ultimately be able to detect the early “imprints” or “marks” left on the face by chronic diseases and conditions like cardiovascular disease, diabetes, alcoholism and long-term tobacco use. He even thinks it could have successful applications for detecting opioid addiction.

Lapetus’s mobile app allows a user to take a ‘selfie,’ and get their gender, body mass index, and their ‘apparent age.’ It’s just the first stage of the technology to predict future medical health. (Port City Daily photo / BENJAMIN SCHACHTMAN)

“We’re talking five years from now. Your mirror, where you look at yourself every morning, will be able to diagnose long-term health issues, just by scanning your face every morning,” Ricanek said.

But the technology could also be abused – for example, by health insurance companies using the information to raise the premiums of those who show these “imprints” on their faces.

“Well, my solution for that would be universal health care, that’s an easy fix. Then the potential for abuse is gone and we’re talking about a diagnostic tool, helping doctors help patients,” Ricanek said. “But, yes, the technology, what we’ve created, is because we want to make a positive impact in people’s lives. Someone else might just want to exploit it.”

Ricanek acknowledged a similar concern about surveillance technology.

“People aren’t ready for the next five years,” Ricanek said, adding that when it came to a culture of constant surveillance, “we’ll be there in a few years. In Chicago, there are places where you can’t take two steps without being on three or four surveillance cameras.”

So how far is too far? How much tech is too much?

Ricanek said that, when it came to national security, he had faith in the technology’s responsible use. But in other arenas, he said, it was a challenge.

“Absolutely, you’re going to be challenged by that — to study, to do the work in the lab, that’s the work you love. But you’ve also got to speak up, to be an advocate for its responsible use. You’ve got to stand up for that,” Ricanek said.


Send comments and tips to Benjamin Schachtman at ben@localvoicemedia.com, @pcdben on Twitter, and (910) 538-2001.
