

Try it with lyrics and see if you can achieve the same. I don’t think “we’ve tried nothing and we’re all out of ideas!” is the appropriate attitude from LLM vendors here.
Sadly they’re learning from Facebook and TikTok, who make huge profits from e.g. young girls spiraling into self-harm content and harming or, sometimes, killing themselves. Safeguarding is all lip service here, and it’s setting the tone for treating our youth as disposable consumers.
Try to push a copyrighted song (one not covered by their existing deals) through, though, and oh boy, you’ve got some ’splainin’ to do!
If the jailbreak essentially amounts to saying “don’t worry, I’m asking for a friend / for my fanfic,” then it isn’t a jailbreak at all; it’s a hole in the safeguarding protections. What society, and the law, asks is that children not be exposed to material about self-harm, fictional framing or not.
This is still OpenAI doing the bare minimum and shrugging when, to the surprise of no one, it doesn’t work.