Microsoft’s experimental chatbot turned into a Nazi in less than a day. ‘Tay’ was designed to interact with and learn from 18- to 24-year-olds on social media, but had to be taken down almost immediately because within 24 hours it tweeted “Hitler was right.”
Is this what happens when a village raises a child?
Engineers: we must think very carefully about A.I. If this kind of failure mode somehow gets committed into some government android/killbot’s repository, extraordinarily bad things will happen to humanity.