A week ago, a Guardian story revealed that Elon Musk’s Grok AI was knowingly and willingly producing images of real-world people in various states of undress and, even more disturbingly, images of near-nude minors in response to user requests. Further reporting from Wired and Bloomberg showed the situation was on a scale larger than most could imagine, with “thousands” of such images produced per hour. Despite silence or denials from within X, the reports prompted “urgent contact” from various international regulators, and today X has responded by creating the impression that access to Grok’s image generation tools is...