Living under the thumb of the metastasizing array of genders that demand to be taken seriously, those who want a return to a tech-lite world where the government doesn’t care what’s in your pants seem to be fighting a losing battle.
But in their quixotic struggle, they have found themselves allied with former nemeses: those who don’t want the government poking around in their medical records or learning the history of their transition just so the authorities can hand them free event tickets or something equally banal yet intrusive.
Os Keyes, a researcher who has spent the last several years writing academic papers about the need for facial recognition to treat gender non-binary individuals with kid gloves, seems unconcerned about where such a complex system might go, apparently believing that they can simply jump into bed with the EU government, get what they need in terms of ‘fixing’ the AI, and walk away once it’s done.
It’s not clear if other groups are out there pushing back in secret, believing that calling the power of wokeness to their cause’s side can’t possibly go wrong. However, it does open up the possibility that while the programmers are “operating” on the algorithmic “patient,” they might take the time to remove some of its more noxious police-state attributes. Several activist groups, such as the Electronic Frontier Foundation, have been loudly protesting the rise of the technology stateside, and San Francisco, California, as well as Somerville, Massachusetts, are working on banning it.
Indeed, walking into a cafe, music show, or restaurant without providing some form of identification seems, in the era of Covid-19, like an activity from a bygone era, with governments making no secret of the joy they feel at holding such control over their charges.
Given the way in which the typical censor-happy social justice warrior goes hand in hand with enthusiastic rule-obeying, it would certainly be a change of pace for Keyes’ gender-quest to – however briefly – save Europeans from the EU’s own growing enthusiasm for tracking and tracing their every move. Keyes didn’t miss the chance to mention that element in their petition, noting that the EU was going to propose a facial recognition framework this month anyway – and the effort could end with participant countries taking an EU-style understanding of gender and sexual orientation home with them, perhaps kicking off similar changes in their own nations.
Especially given the patchwork of surveillance states being cobbled together due to the Covid-19 pandemic, it’s difficult to imagine what kind of individual attributes could be so hypersensitive as to convince the powers-that-be to put down their temperature guns, call off the AI bots meticulously hyper-scrolling through photo after photo of similar-looking individuals, and stop – the ultimate privacy violation – delivering a match right in front of everyone, including customs officials.
However, there’s one promising option that – according to the pleas of Keyes – is better than just the lesser of two evils. Keyes has written several papers bemoaning the gender binary as it’s seemingly been set in stone on the internet, and earlier this week helped put together a petition to ban “automated recognition of gender and sexual orientation.”
Arguing that “automated recognition of gender and sexual orientation is scientifically flawed and puts LGBT+ lives at risk,” Keyes insisted that unless the EU breaks with tradition and begins offering a user-designated list of gender options, gender non-binary people could be threatened. Whether that means law enforcement going out of its way to harm a non-binary traveler or that traveler merely missing a flight due to confusion about their passport photo no longer looking like them, Keyes insisted that allowing laws to unfold along existing lines could “easily fuel discrimination” – apparently an issue that has never befallen other travelers.
“Many of these AI systems work by sorting people into two groups – male or female. The consequences for people whose gender does not match the sex they were assigned at birth can be severe. The algorithms are also programmed to reinforce outdated stereotypes about race and gender that are harmful to everyone,” they declared. How the visitor would have made it into their stereotype-soaked destination without some sort of functional ID papers is not explained.
With early AI tech’s facial recognition abilities catering primarily to the needs of the white and the cisgender while leaving transgender people in the dust, and with the government police state and the woke gestapo each demanding unquestioning obedience to their opposing versions of the ‘gender question’, the issue seems poised to explode.
The statements, views and opinions expressed in this column are solely those of the author and do not necessarily represent those of RT.