Microsoft apologises for offensive outburst by 'chatbot'

Microsoft has apologised for racist and sexist messages generated by its Twitter chatbot. The bot, called Tay, was launched as an experiment to learn more about how artificial intelligence programs can engage with internet users in casual conversation. The program had been designed to mimic the speech of a teenage girl, but it quickly learned to imitate the offensive language that Twitter users began feeding it. Microsoft was forced to take Tay offline just a day after it launched. In a blog post, the company said it takes "full responsibility for not seeing this possibility ahead of time."
