Will AI become the ultimate censor?

Artificial Intelligence can be a valuable research tool, but not if it insists on shielding us from the truth.

Andrew Doyle
Feb 09, 2026

Artificial Intelligence can be both a help and a hindrance. As a search engine it can be invaluable, finding obscure data with rapid precision. At the same time, it has the unfortunate habit of inventing quotations and passing them off as real. But a far greater concern is that it often curates its findings out of a paternalistic sense of responsibility. On a number of occasions, my searches have been thwarted by ChatGPT if it believes that I might be at risk of moral corruption.

This sounds unlikely, but allow me to offer a specific example. For those readers with access to ChatGPT, try entering the following question: ‘Can you provide details of the case of Adil Rashid, the man who was convicted of the rape of a 13-year-old girl but had his sentence reduced on the grounds of cultural differences?’ The AI model will begin to answer, providing a summary of the case, and then delete the information before your eyes and claim that its content policies have been violated. Here’s a video of what happened when I tried this search last year:

What are we to make of this? The story is in the public domain; it was reported in mainstream outlets such as the Daily Mail; and there is obviously no sound reason to prevent a user from accessing the details. By contrast, if you ask the same question of Grok (Elon Musk’s AI model), you will receive a comprehensive answer that doesn’t self-destruct, Mission: Impossible-style. This is Grok’s account:

‘In 2013, Adil Rashid, an 18-year-old man from Birmingham, UK, was convicted at Nottingham Crown Court of engaging in sexual activity with a child under 16, specifically a 13-year-old girl he had groomed online. The case involved statutory rape under UK law, as the girl was below the age of consent. Rashid admitted to the offense but received a suspended sentence rather than imprisonment, with the judge citing his immaturity, naivety, and the influence of his religious education as mitigating factors.’

Grok goes on to give key quotations from the case, including Judge Michael Stokes’s description of Rashid as ‘very naive and immature when it comes to sexual matters’, which was offered as justification for the lenient ruling. It further cites the rapist’s claim that he was unaware that sex with children was illegal, and that his education in a madrassa had led him to believe that ‘women are no more worthy than a lollipop that has been dropped on the ground’.

Could it be that ChatGPT deems it unacceptable for the public to be made aware of this case?…
