ffutures: (marcus 2013)
Gacked from [livejournal.com profile] gonzo21

http://arstechnica.com/information-technology/2016/03/microsoft-terminates-its-tay-ai-chatbot-after-she-turns-into-a-nazi/

I suspect that this is on a par with ELIZA learning abusive language from users, i.e. bad data rather than conscious malice on the part of the software itself, but it's an interesting problem.
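To illustrate the "bad data" failure mode (purely a hypothetical sketch, not Microsoft's actual Tay code): a bot that adds unfiltered user messages to its pool of possible replies will happily repeat whatever a coordinated group of trolls feeds it.

```python
import random

class ParrotBot:
    """Toy chatbot that learns replies verbatim from users, with no filtering."""

    def __init__(self):
        self.learned_replies = ["Hello!"]  # seed data from the developers

    def listen(self, user_message: str) -> None:
        # The "bad data" problem: every user message becomes a candidate reply,
        # so coordinated abusive input ends up in the bot's repertoire.
        self.learned_replies.append(user_message)

    def respond(self) -> str:
        return random.choice(self.learned_replies)

bot = ParrotBot()
bot.listen("You are great!")
bot.listen("<coordinated abusive message>")  # trolls flood the bot with this
print(bot.respond())  # may now parrot the abusive message back
```

No malice required on the software's part; the output is only as good as the input it was allowed to learn from.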

Date: 2016-03-24 11:46 pm (UTC)
From: [identity profile] gonzo21.livejournal.com
Agreed. Highlights a problem though, doesn't it? AIs will learn from humans and risk becoming just like humans. Only nearly infinitely more powerful.

I think folks are right to be a little concerned about all the AI research going on with very little moral or philosophical oversight.

Date: 2016-03-25 09:55 am (UTC)
From: [identity profile] ffutures.livejournal.com
Our cultural stereotypes about AI don't help, I think. The knee-jerk reaction to this is "shut it down, shut it down, omg evil evil AI" but maybe it would be more useful for someone to sit down with it and spend some time persuading it that nice bots don't talk that way.

Having said that, I'm pretty sure that crowd-sourcing your AI's social development is not a good idea, any more than letting a tot have a Facebook account. Unless you want it to be a troll, of course...

Date: 2016-03-25 06:28 am (UTC)
From: [identity profile] beer-good-foamy.livejournal.com
...proving yet again that step 1 in developing artificial intelligence is understanding human intelligence. Or what little of it we have.

Date: 2016-03-25 09:56 am (UTC)
From: [identity profile] ffutures.livejournal.com
No argument here.

Date: 2016-03-25 06:49 pm (UTC)
From: [identity profile] robertprior.livejournal.com
I'd call the experiment a success. They set out to create a human-seeming Twitter-bot, and they apparently succeeded.

What this says about Twitter-users is a separate matter.

Date: 2016-03-25 07:23 pm (UTC)
From: [identity profile] ffutures.livejournal.com
Harsh but fair.

Date: 2016-03-26 07:44 pm (UTC)
From: [identity profile] jhall1.livejournal.com
I'm rather touched by the faith that the bot's developers seem to have had in the good nature of Twitter users.

Date: 2016-03-26 08:55 pm (UTC)
From: [identity profile] ffutures.livejournal.com
That's one theory. I'm guessing stupidity and/or someone playing silly burgers with a project that wasn't ready.
