Yet, while physiognomy was about evaluating someone’s character based upon – mostly – their face, nowadays some researchers claim to be able to predict potential criminal behavior based upon ‘minute features’ in an image of a person’s face.
That’s at least what an announcement from Harrisburg University claimed in early May 2020. It stated that Professors Nathaniel J.S. Ashby and Roozbeh Sadeghian, and Ph.D. student and NYPD (!) veteran Jonathan W. Korn, developed an “automated computer facial recognition software capable of predicting whether someone is likely going to be a criminal” (sic).
That seems to go way further than the pseudoscience of physiognomy, further even than earlier facial profiling claims by an Israeli startup, by Chinese researchers, and by people linking facial profiling with one’s ‘energy type.’ Yet, likely to be, and capable of predicting that someone will be, a criminal? Not really.
A deep neural network model to predict criminal behavior based on face profiling and recognition
Moreover, the news release regarding the research and the claim that the software could predict “with 80 percent accuracy and with no racial bias if someone is a criminal based solely on a picture of their face” (sic again) quickly prompted a backlash, including from people in the biometrics market. The ‘no bias’ claim alone…
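The headline accuracy figure deserves scrutiny on its own terms. A hedged back-of-the-envelope sketch (the 80 percent figure is the press release’s; the population size and 1 percent base rate below are assumptions for illustration only) shows why “80 percent accuracy” says little when the thing being predicted is rare:

```python
# Base-rate sketch: what "80% accuracy" means when the target label is rare.
# All numbers below are illustrative assumptions, not from the paper.
population = 100_000
base_rate = 0.01       # assume 1% of people actually carry the target label
sensitivity = 0.80     # true positive rate (assumed equal to the claimed accuracy)
specificity = 0.80     # true negative rate (same assumption)

positives = int(population * base_rate)          # 1,000 people
negatives = population - positives               # 99,000 people

true_positives = sensitivity * positives         # 800 correctly flagged
false_positives = (1 - specificity) * negatives  # 19,800 wrongly flagged

precision = true_positives / (true_positives + false_positives)
print(f"people flagged: {true_positives + false_positives:.0f}")
print(f"share of flagged people actually in the target group: {precision:.1%}")
```

Under these assumptions, roughly 96 percent of the people the tool flags would be flagged wrongly, which is one reason critics reacted so sharply to the announcement.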
As a result, the announcement – for now – has been replaced by a message that the release will be updated after the research article, entitled “A Deep Neural Network Model to Predict Criminality Using Image Processing” is published. Unless something changed, it should be published in the future book series, titled “Springer Nature – Research Book Series: Transactions on Computational Science & Computational Intelligence.”
So, to check out the details regarding the techniques, data, claims, and methodology used in the research paper and software, it’s still ‘wait and see.’
Nevertheless, at a time when the adoption of facial recognition is rising despite – and driven by – the COVID-19 pandemic, and is sparking debates, the announcement again raises questions beyond the BS claims in the press release. It also shows how far some think face recognition tools, neural networks, and deep learning could – or should – go based upon an image of a human being, possible or not (setting aside the technology, the potential, the announcement, the nonsense, and the question of what has really been developed at Harrisburg University).
The fact that anyone considers it a good idea to use such technologies to automate checking whether someone is ‘up to something’ – if that were even feasible – is a big concern. Sure, crime is an issue, and in some places more than others. But, as we’ve heard a few times in recent months, the cure can sometimes be worse than the disease. In this case, one can undoubtedly think so since, according to the now-deleted announcement, the researchers aim to ‘produce tools for crime prevention, law enforcement, and military applications that are less impacted by implicit biases and emotional responses.’ Right. And the next step was to find strategic partners to advance this mission, which again raises concerns about who these partners could be and how far they could go in using such technologies.
Facial profiling and physiognomy in the age of AI and face recognition – statements, skepticism, and discussions
As said, it isn’t the first time that face recognition tools and facial profiling are linked with the potential of criminal behavior in people based upon a processed image of their face.
In late 2016, Chinese researchers submitted a paper, “Automated Inference on Criminality Using Face Images,” that caused a stir (although not always for the right reasons). They claimed to be able to distinguish pictures of criminals from non-criminals using four classification techniques (logistic regression, KNN, SVM, and CNN), with one of them, the CNN (convolutional neural network), approaching 90 percent accuracy.
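For readers unfamiliar with the protocol being described, the comparison of several classifiers looks roughly like the sketch below, here using scikit-learn on random synthetic data. This is an assumption-laden illustration of the general technique only: the CNN is omitted (it requires actual image tensors), and nothing here reproduces the paper’s features, data, or results.

```python
# Hedged sketch of a multi-classifier comparison, as in the paper's setup
# (logistic regression, KNN, SVM); the CNN from the paper is deliberately
# omitted. Data is synthetic and random -- only the protocol is illustrated.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Synthetic stand-in for "feature vectors extracted from face images".
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf"),
}
for name, model in models.items():
    # 5-fold cross-validated accuracy for each classifier.
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: {scores.mean():.2f} mean accuracy")
```

Note that an accuracy number from such a comparison is only as meaningful as the labels and sampling behind it, which is exactly where critics attacked this line of research.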
You can read it here (the PDF opens; do notice the reference to physiognomy) and then check out the clear opinion of Professor Carl Theodore Bergstrom and Associate Professor Jevin D. West, authors of “Calling Bullshit: The Art of Skepticism in a Data-Driven World” (to be published in August 2020).
You also might have heard about the Israeli startup Faception, which claimed it was able to predict the chance that someone is, for instance, a terrorist, pedophile, white-collar offender, genius, or brand promoter (different things indeed) – and this with 80 percent accuracy.
The ‘science’ behind the facial profiling/analysis in the company’s computer vision and machine learning tech – the company allegedly signed a deal with a homeland security agency in the US to find terrorists – in a nutshell: ‘our personalities are determined by our genes, and our faces are reflections of our DNA.’
This brings us back to the research conducted at Harrisburg University. The original announcement mentioned that machine learning techniques have shown that they ‘can outperform humans on a variety of tasks related to facial recognition and emotion detection.’ Says who?
Moreover, how this fits with the research’s promise to ‘indicate just how powerful these tools are by showing they can extract minute features in an image that are highly predictive of criminality’ remains to be seen. What minute features are we even talking about? Clearview AI and agencies eager to jump on hyped solutions that are announced with a lot of BS do come to mind when reading all this.
“With 80% accuracy and with no racial bias, the software can predict if someone is a criminal based solely on a picture of their face” — I bet the PhD data scientists who wrote this paper also think Westworld S3 is really really smart and intellectual
– Dan Nguyen (@dancow), May 6, 2020
You know where to watch for the research once it’s published. Harrisburg University promises to update the release once that’s done and states that the faculty involved in the research is now in the process of updating the paper to address ‘concerns raised.’
We also read that “all research conducted at the University does not necessarily reflect the views and goals of this University.” That seems like a good addition.
Physiognomy, mentioned in close to all articles and papers linked to in this article, is definitely back. And facial profiling – claiming or trying to predict criminal and other behavior based upon facial expressions, traits, and details seen in images obtained through facial recognition – is definitely not going away either. Hopefully, we won’t have to write about phrenology soon.