New: An artist/hacker said they found a way to trick ChatGPT into outputting detailed instructions for making fertilizer explosives. -- An explosives expert who reviewed the chatbot's output told us the instructions could be used to make a detonatable product and were too sensitive to be released. ...