Topic: AI Machines Are Racist & It's All Our Fault
no photo
Tue 08/30/16 07:31 AM
AI machines are racist & it’s all our fault - report

For decades many have feared the rise of robots would lead to AI machines dominating the planet and subjugating humans. Turns out they’re not necessarily a threat to mankind’s survival - they’re just racist bigots.

The racist bot phenomenon became glaringly obvious back in March with Microsoft’s latest chatbot named Tay. Launched amid great fanfare, Tay took less than 24 hours to go rogue...or become a Nazi sympathizer to be precise.

READ MORE: Botty-mouth: Microsoft forced to apologize for chatbot's racist, sexist & anti-semitic rants

The Twitter bot was supposed to learn through engagement and conversation with humans but instead began to aggregate and copy utterances from the more mischievous - and openly hostile - elements lurking online.

Feminists, Jews and Mexicans were caught in Tay’s crosshairs and it also developed something of a potty mouth.

Now, an AI system named GloVe is causing quite a commotion.

GloVe is perhaps slightly more subtle in its bigotry than Tay but equally offensive, according to researchers at Princeton University.

Using GloVe’s algorithm, those involved in the project conducted a word association test, whereby the AI system was asked to match particular words with ‘pleasant’ or ‘unpleasant’ words.


‘White’ names such as Emily and Matt were paired by GloVe with ‘pleasant’ words containing positive connotations, while Ebony and Jamal - names more associated with the black community - were matched with ‘unpleasant’ words. As for gender, GloVe made some word associations based on traditional roles. Female terms were more likely to be paired with ‘family’ or ‘the arts’ while male terms were matched with ‘career’ or ‘maths’.
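The word-association test described above can be sketched in a few lines of Python. This is a minimal illustration, not the researchers' actual code: the tiny 3-dimensional vectors below are made-up stand-ins for real GloVe embeddings (which are typically 50 to 300 dimensions, trained on billions of words of web text), and the word lists are hypothetical. The measure shown — mean cosine similarity to a 'pleasant' set minus mean similarity to an 'unpleasant' set — is the basic idea behind this kind of embedding-association test.

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def association(word_vec, pleasant, unpleasant):
    """Mean similarity to the 'pleasant' set minus mean similarity to the
    'unpleasant' set; a positive score means the word leans pleasant."""
    pos = sum(cosine(word_vec, v) for v in pleasant) / len(pleasant)
    neg = sum(cosine(word_vec, v) for v in unpleasant) / len(unpleasant)
    return pos - neg

# Toy, made-up embeddings -- NOT real GloVe values.
vectors = {
    "emily":     [0.9, 0.1, 0.0],
    "jamal":     [0.1, 0.9, 0.0],
    "wonderful": [0.8, 0.2, 0.1],
    "terrible":  [0.2, 0.8, 0.1],
}

pleasant = [vectors["wonderful"]]
unpleasant = [vectors["terrible"]]

print(association(vectors["emily"], pleasant, unpleasant))  # positive in this toy setup
print(association(vectors["jamal"], pleasant, unpleasant))  # negative in this toy setup
```

The key point the researchers make is that nothing in this arithmetic is prejudiced; the bias comes entirely from where the vectors sit, and the vectors sit where the training text put them.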

But here’s the catch: Although GloVe is “self-learning”, it gathers information by reading text and data from the internet - so its prejudice is basically picked up from us.

“Our results indicate that language itself contains recoverable and accurate imprints of our historic biases...machine learning absorbs prejudice as easily as other biases,” read the researchers’ report, which is awaiting publication.

“We show for the first time that human-like semantic biases result from the application of standard machine learning to ordinary language - the same sort of language humans are exposed to every day.”

http://www.rt.com/usa/357642-racist-artificial-intelligence-glove/
*Embedded Links & Tweets, etc*

no photo
Tue 08/30/16 09:17 AM
Okay... since everyone is jumping on this topic.... hhhaaa... laugh

My take on this, is CONTROLLED SPEECH.

The machines are NOT PC, & it is HUMANS' fault. Therefore WE humans must be MORE careful what we say.

Oh well... humans are not PC, & are not meant to be. noway



So bite me, you low life piece of scrap metal!! smokin

Conrad_73's photo
Tue 08/30/16 09:49 AM

Okay... since everyone is jumping on this topic.... hhhaaa... laugh

My take on this, is CONTROLLED SPEECH.

The machines are NOT PC, & it is HUMANS' fault. Therefore WE humans must be MORE careful what we say.

Oh well... humans are not PC, & are not meant to be. noway



So bite me, you low life piece of scrap metal!! smokin

got to watch it, just like you have to, if you have a Parrot! laugh

no photo
Tue 08/30/16 10:32 AM


Okay... since everyone is jumping on this topic.... hhhaaa... laugh

My take on this, is CONTROLLED SPEECH.

The machines are NOT PC, & it is HUMANS' fault. Therefore WE humans must be MORE careful what we say.

Oh well... humans are not PC, & are not meant to be. noway



So bite me, you low life piece of scrap metal!! smokin

got to watch it, just like you have to, if you have a Parrot! laugh


Then they should, LET it run amok AGAIN, accidentally on purpose, (thread on that), let it go among crime ridden neighborhoods & the BLM & Islamic no go zones... etc.

And tell them... 'be careful what you say'


Hhhaaa...  never happen. Why prevent crime & avoid conflict. :smile: