
Are you a terrorist, an academic, or a poker player? This startup says it can tell from your face

Philip K. Dick’s science fiction short story, “The Minority Report,” foresaw a future in which prescient mutants can predict and prevent crimes, and help the Precrime division apprehend criminals before any crime is committed. Such a crime-free society may seem utopian, but Dick remained skeptical, contemplating eternal questions – What role does free will play in our decision making? What freedoms should we concede for security? And how precise must a prediction method be?

An Israeli startup may help us answer at least some of these questions with its facial personality profiling software, which it claims can identify terrorists, academics, professional poker players, and pedophiles by facial features alone. Built on computer vision and machine learning technology, Faception has catalogued various classifiers that its software uses to analyze character, behavior, and personality traits “with a high level of accuracy.”

How accurate? About 80 percent, according to the Washington Post.

Prior to a recent poker tournament sponsored by a startup affiliated with Faception, the profiling software analyzed photos of the tournament’s 50 amateur players and compared them to the system’s database of poker professionals. The software identified four players who would excel – two of whom made it to the final three.

Related: FindFace is a new facial recognition app that could end public privacy

Faception’s theory of facial personality profiling rests on two observations from genetic research: DNA affects personality, and DNA determines facial features. Linking these two observations led the company to infer that personality can be read in facial features, since both are products of genetic expression.

The accuracy of this inference has yet to be determined but, as University of Washington computer science professor Pedro Domingos pointed out to the Washington Post, machine learning has quirks of its own. Algorithms learn from the data they’re given – and that data can lead them to learn the wrong lesson entirely.

For example, when a computer system was told to differentiate between wolves and dogs, the system quickly perfected the task. Only after further inspection did the researchers realize the system wasn’t identifying the animals – it was identifying the surroundings. Each wolf photo had snow in the background, while the dog photos did not.
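The wolf-and-snow failure is easy to reproduce in miniature. The sketch below (a hypothetical illustration, not Faception's or the researchers' actual system) trains a plain logistic regression on synthetic "photos" with two features: a genuinely informative but noisy animal trait, and a snow-in-background flag that happens to match the label perfectly in the training set. The model latches onto the snow feature, then falls apart when snow no longer correlates with the animal.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, snow_matches_label):
    """Synthetic 'photos': two features per image.
    Feature 0 is a genuine but noisy animal trait;
    feature 1 is 1.0 if snow appears in the background."""
    y = rng.integers(0, 2, n)                # 0 = dog, 1 = wolf
    trait = y + rng.normal(0, 1.5, n)        # weak real signal
    if snow_matches_label:
        snow = y.astype(float)               # every wolf photo has snow
    else:
        snow = rng.integers(0, 2, n).astype(float)  # snow is random
    return np.column_stack([trait, snow]), y

def train_logreg(X, y, lr=0.5, steps=2000):
    """Plain logistic regression fit by gradient descent."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        p = 1 / (1 + np.exp(-(X @ w + b)))   # predicted P(wolf)
        w -= lr * X.T @ (p - y) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

def accuracy(w, b, X, y):
    return np.mean(((X @ w + b) > 0) == y)

# Train where snow perfectly correlates with "wolf"...
X_train, y_train = make_data(500, snow_matches_label=True)
w, b = train_logreg(X_train, y_train)

# ...then test where snow is unrelated to the animal.
X_test, y_test = make_data(500, snow_matches_label=False)

print("weight on animal trait:", round(w[0], 2))
print("weight on snow:        ", round(w[1], 2))
print("train accuracy:", accuracy(w, b, X_train, y_train))
print("test accuracy: ", accuracy(w, b, X_test, y_test))
```

Because the snow feature separates the training classes perfectly, it dominates the learned weights, and accuracy collapses once that shortcut disappears – the same trap that would face any system trained on photos of one group (say, convicted criminals) that share an incidental visual property.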

Faception said they’ve already signed a contract with an undisclosed homeland security agency in an effort to recognize terrorists.
