Everything that's wrong with that study which used AI to 'identify sexual orientation'

Tech 11-9-2017 Mashable

Advancements in artificial intelligence can be deeply worrying, especially when serious questions of intimacy and privacy are at stake.

SEE ALSO: Elon Musk thinks AI will be the most likely cause of WW3

A study from Stanford University, first reported in the Economist, has sparked controversy after claiming that AI can deduce whether people are gay or straight by comparing images of a gay person and a straight person side by side.

LGBTQ advocacy groups and privacy organisations have slammed the report as "junk science" and called it "dangerous and flawed", citing its clear lack of representation, its racial bias, and its reduction of the sexuality spectrum to a binary.
