
London Standard to feature AI-written review ‘by’ dead art critic Brian Sewell


The late British art critic Brian Sewell is making a return to journalism via artificial intelligence, according to the newspaper that established his reputation as a distinctive cultural voice.

The London Evening Standard, rebranded as the London Standard, will feature a one-off AI-written review attributed to the renowned critic, who died in 2015 at the age of 84, when the relaunched paper is published on Thursday.

The Standard’s interim chief executive, Paul Kanareck, said the edition would have multiple features on AI and London’s role as a hub for the technology. “The London Standard is a bold and disruptive new publication,” he said. “It includes an experimental AI review by our legendary critic Brian Sewell, and his estate are delighted.”

Sewell joined the Standard in 1984 and won multiple awards for his work.

The Standard did not confirm how it would recreate Sewell’s distinctive style – he once wrote that Banksy “should have been put down at birth” – but chatbots can be prompted to produce work in the style of different writers. When prompted by the Guardian on Wednesday, the ChatGPT chatbot produced a faux-article on Vincent van Gogh written in a rough approximation of Sewell’s style.

The Standard has moved to a weekly publication cycle. In recent months it has cut about 60 roles from an editorial team of about 130.

Reflecting on Sewell’s work after his death in 2015, the Guardian art critic Jonathan Jones wrote: “His views were pungent and got people arguing and that’s what matters.”

The news website Deadline, which first reported the Sewell plan, claimed the piece would be a review of the new Van Gogh exhibition at the National Gallery in London, which the Guardian gave five stars.

Kanareck told the Press Gazette website the review was a “one-off intended to provoke discussion about AI and journalism”. The potential role of AI in news has become one of many hot topics around the emergence of powerful tools such as ChatGPT. They have impressed with their writing ability – and thus their potential to replace work normally carried out by humans – but are also prone to factual errors known as “hallucinations”.

Recreating people via AI can carry legal and moral risk, although Kanareck stressed the Sewell piece had the approval of the late critic’s estate. In May the family of Michael Schumacher won a legal case against the publisher of a German magazine that printed an AI-generated interview with the Formula One champion, who suffered a near-fatal brain injury in 2013. The editor of the magazine was fired ahead of the legal hearing.

A number of commercial AI products have also emerged that impersonate deceased people in chatbot or audio form, but have been criticised for encouraging an “unwillingness to mourn”.


