Drawing on photos from a popular dating website, their program correctly distinguished self-identified gay men from heterosexual men 81% of the time, and gay women from heterosexual women 71% of the time. When the algorithm was given five images of the same person to examine, accuracy rose to 91% for men and 83% for women.
Given that the individuals whose photos were analyzed had already self-identified as straight or gay, you might wonder whether the software poses a genuine threat to privacy. It could: imagine police in countries such as Saudi Arabia or Russia using the technology to investigate people suspected of homosexuality.
But that is just the tip of the iceberg when it comes to the privacy threat posed by face recognition and other biometric technologies. And everyone is vulnerable.