• gens@programming.dev · 9 hours ago

      Ah yes. Safety knives. Safety buildings. Safety sleeping pills. Safety rope.

      LLMs are stupid. A toy. A tool at best, but really a rubber ducky. And it definitely told him “don’t”.

    • peoplebeproblems@midwest.social · 10 hours ago

      We should, criminally.

      I like that a lawsuit is happening. I don’t like that the lawsuit initially sounded (to me) like they expected the software itself to do something about it.

      It turns out the software did do something about it, but OpenAI failed to take the necessary action. So maybe I am wrong about it getting thrown out.