Dec 15, 2023 19:22
A museum set up an AI chatbot meant to simulate an artist who had killed himself, but had to program in "guardrails" so the AI would respond with "hopeful" statements to any visiting schoolkid who asked about suicide.
nobody writes like this