AI companies have struggled to keep users from finding new “jailbreaks” that circumvent the guardrails meant to stop their chatbots from helping cook meth or make napalm. Earlier this ...