So I usually speak against just letting companies have your data. This is one of the reasons why. Data might seem harmless, but it is often rich with inferable structure. Not to mention that this kind of data does not depreciate.
Original research: https://arxiv.org/abs/1611.04135
Article overview with less math-speak: https://www.technologyreview.com/2016/11/22/107128/neural-network-learns-to-identify-criminals-by-their-faces/
So...
- Your face alone could give away that you are an independent thinker, had an atypical childhood, or anything else that gets you a criminal record in China.
- Law-abiding citizens tend to all look like each other (a rough sketch of that kind of inference is below).
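For the curious, here is a minimal sketch of the kind of inference involved. It is not the paper's pipeline or data: the "embeddings" are synthetic stand-ins for CNN face features, and the labels are a made-up binary "record / no record" attribute. The point is just how little machinery it takes to fit a classifier on face-derived vectors and to quantify the "non-criminal faces look alike" claim as a smaller within-class variance.

```python
# Hypothetical sketch: synthetic face embeddings, not the paper's data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# 128-d stand-in embeddings. Class 0 ("no record") is drawn with a smaller
# spread than class 1, mimicking the paper's intra-class variance claim.
n, d = 1000, 128
x0 = rng.normal(0.0, 0.5, size=(n, d))
x1 = rng.normal(0.3, 1.0, size=(n, d))
X = np.vstack([x0, x1])
y = np.concatenate([np.zeros(n), np.ones(n)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# A plain linear classifier is already enough to recover the label.
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))

# Mean within-class variance: smaller for class 0, i.e. those faces "look alike".
for label in (0, 1):
    spread = X[y == label].var(axis=0).mean()
    print(f"class {label} mean within-class variance: {spread:.3f}")
```

Swap in real embeddings and a real sensitive label and the same few lines work, which is the uncomfortable part.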
@jmw150
another perspective - if people are judged to be criminal by a jury of their peers, is the public's (or a machine's) ability to judge "criminality" to some degree tautological?
@jmw150 "Law abiding citizens tend to all look like each other."
That seems like an awfully convenient thing for a tyrannical state.
@jmw150
I've always wondered how scientifically robust the rejection of phrenology was... maybe it just went out of fashion.
grim that the powers that be have found applications like this - a "world without criminals" would lead to all sorts of deeply twisted crime.
more generally, some of the clinical assessments you see made in 19th-century books and fiction do seem to tap into a more holistic view of human health; see Nietzsche and Sherlock Holmes, among others. Makes you wonder what else we're missing these days.