“Research has shown that some emotional AIs disproportionately attribute negative emotions to the faces of black people, which would have clear and worrying implications if deployed in areas such as recruitment, performance evaluations, medical diagnostics or policing.”
Are you 80% angry and 2% sad? Why ‘emotional AI’ is fraught with problems
www.theguardian.com
AI that purports to read our feelings may enhance user experience, but concerns over misuse and bias mean the field is fraught with potential dangers.
this, plus it really fucks over autistic folks (some of whom are also Black, of course). what my face is doing at any specific moment does not necessarily correlate to my true emotions at that moment.
my only qualm with that is that it should read "...some emotional AIs are programmed to disproportionately attribute negative emotions..." — they aren't doing this of their own free will. this is the end result of how they are programmed to process input and generate output.
And the general terribleness of the inputs. Consistently labeled training data is expensive, and The Open Internet is not well labeled.
My blood ran cold when I read "policing".