The Guardian has accused Microsoft of damaging its journalistic reputation with a poll generated by artificial intelligence. The technology company placed an automated poll next to a news story reporting the death of a woman, asking readers to speculate on the possible cause of death, which drew an immediate backlash.
The poll, labeled “perspectives by AI,” was displayed alongside the Guardian article on Microsoft’s news aggregator. The report described Sydney police finding a woman’s body in a school bathroom. Microsoft’s AI attached a poll asking “What do you think is the motive for the woman’s death?” with three answer options: murder, accident, and suicide.
According to The Guardian, readers reacted angrily, assuming the poll was the newspaper’s work rather than an AI’s. “This has to be the most pathetic and disgusting poll I have ever seen. The author should be ashamed of himself,” one wrote. The angry comments remained visible on the Microsoft Start website until a few hours ago.
Anna Bateson, chief executive of Guardian Media Group, said the incident caused significant damage to the reputation of the outlet and the journalists who wrote the story. Bateson sent a letter to Brad Smith, president of Microsoft, asking for a guarantee that a similar situation would not occur again.
Microsoft must take responsibility for its AI
The newspaper demanded that Microsoft not use experimental technology in conjunction with its content without prior approval.
“This application of generative AI by Microsoft is exactly the kind of instance we’ve warned about in relation to news,” Bateson wrote. “And a key reason why we have previously asked their teams that we don’t want Microsoft’s experimental technologies applied to The Guardian’s licensed journalism,” she added.
The outlet asked Microsoft to add a note to the article taking responsibility for the damage caused by the poll. “Please note the comments of readers, who are clearly unaware that it is Microsoft that created this survey and not The Guardian,” Bateson said. She added that the time is right for companies to provide greater transparency and certainty around “highly unpredictable technologies” such as artificial intelligence.
This is not the first time Microsoft’s artificial intelligence has made such a mistake. A few months ago, an article written by a generative AI listed the Ottawa Food Bank as one of three tourist destinations in the city. “If you decide to visit, consider going on an empty stomach,” the description read.
MSN articles have been generated by artificial intelligence since 2020, after Microsoft laid off its entire staff of editors and journalists. Microsoft News is part of the Bing division, and news selection is handled by algorithms. Other media outlets have followed suit, such as Gizmodo en Español, which in late August became an AI-generated translation site.