CEO Sam Altman says OpenAI had made ChatGPT “pretty restrictive” to make sure it was being careful with mental health issues, though that made the chatbot “less useful/enjoyable to many users who had no mental health problems.”

  • yeehaw@lemmy.caOP · 1 day ago

    Just in case you wanted to give more of your personal data to an AI company…

    I feel like many people don’t consider the data these companies are collecting on you, the user.

    • jubilationtcornpone@sh.itjust.works · 1 day ago

      It’s pretty hard to find a kid that doesn’t have Snapchat anymore. They’re basically being trained to hand over all their personal info to random strangers as soon as they’re able to hold a cell phone.

      And of course they hide read messages from the end user to provide the illusion that they’re “deleted,” which I have to admit is brilliant. Extremely unethical, but brilliant.

      • yeehaw@lemmy.caOP · 1 day ago

        So many are also ignorant of the ToS on even basic things like TVs and baby monitors. There’s some next-level creep in these things. Convenience over privacy; it’s scary.

        • Rai@lemmy.dbzer0.com · 20 hours ago

          When your messages on Snapchat are no longer visible to you or the person you’re talking to, they’re not deleted. They’re still on their servers. You just don’t see them anymore.

    • breakingcups@lemmy.world · 1 day ago

      Not just that, it also seems like the new nicotine, the new gambling. Some people are going to get absolutely hooked on this and not be able to stop.

  • Assassassin@lemmy.dbzer0.com · 1 day ago

    Bro, Sam, we already have so much porn. This isn’t going to be the secret code that makes your mass theft machine finally profitable.

    • Boozilla@lemmy.world · 1 day ago

      Well, some of us have niche kinks, like dumping poo from a fighter jet. You won’t find that on any ol’ porn hub site.

    • Xaphanos@lemmy.world · 1 day ago

      It can be advertised as “cruelty free”. As in “No humans were coerced into doing these acts.”

      • BigFig@lemmy.world · 1 day ago

        No, it was just trained on existing images of people being coerced into doing these acts. Soooo much better.

        /s

        • PM_ME_YOUR_ZOD_RUNES@sh.itjust.works · 1 day ago

          The /s was a mistake, right? It actually is better.

          If it’s going to happen either way, isn’t it better that it’s not real? Can you explain how it’s not?

          • BigFig@lemmy.world · 1 day ago

            Why is CSAM being run through AI at all? Why isn’t that data destroyed rather than used as training data?

    • Euphoma@lemmy.ml · 19 hours ago

      It already is; they just had guardrails in place before to prevent it from making adult content.

  • handsoffmydata@lemmy.zip · 24 hours ago

    I think it’s already active for me. I asked it to write a Python script exposing a FastAPI app, and it chose 8008 as the port.
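    For context, a minimal sketch of the kind of script being described, assuming FastAPI served with uvicorn (the route and response here are illustrative, not from the comment):

    ```python
    # Minimal FastAPI app bound to port 8008, per the comment above.
    from fastapi import FastAPI
    import uvicorn

    app = FastAPI()

    @app.get("/")
    def read_root():
        # Placeholder endpoint; the port choice is the whole joke.
        return {"status": "ok"}

    if __name__ == "__main__":
        uvicorn.run(app, host="127.0.0.1", port=8008)
    ```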

      • BlackLaZoR@fedia.io · 21 hours ago

        The general-purpose toolkit is called LM Studio; it has a search engine for Hugging Face with a large collection of uncensored models. MoE Dark Champion is one of the best for naughty storytelling.

        If you specifically want to talk with a customized sexbot AI character, there’s Hammerai.com; it has both online and offline versions.

        Edit: local toolkits obviously require a good GPU, preferably with a large amount of VRAM (16 GB recommended), or at least a fast CPU with a lot of cores.
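        As a rough illustration of that hardware note: local runners offload as many model layers as fit in VRAM to the GPU and run the rest on the CPU. A minimal sketch using llama-cpp-python (a different tool from the ones named above; the model filename is a placeholder):

        ```python
        # Sketch: load a local GGUF model and generate a chat completion.
        from llama_cpp import Llama

        llm = Llama(
            model_path="./models/example.Q4_K_M.gguf",  # placeholder path
            n_ctx=4096,        # context window
            n_gpu_layers=-1,   # offload all layers to the GPU if VRAM allows; 0 = CPU only
        )

        out = llm.create_chat_completion(
            messages=[{"role": "user", "content": "Tell me a short story."}]
        )
        print(out["choices"][0]["message"]["content"])
        ```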

    • rozodru@pie.andmc.ca · 21 hours ago

      I wonder if they received backlash when they transitioned to GPT-5. It pretty much “killed” many people’s “partners”.

    • yeehaw@lemmy.caOP · 16 hours ago

      For the corporate elites and the US government, so you can be targeted by ads and jailed and stuff, surely.

    • Hasherm0n@lemmy.world · 20 hours ago

      A person I know (and don’t particularly like) already created a startup around this idea a couple of years ago. It’s creepy AF.

      • vane@lemmy.world · 19 hours ago

        They will go for everything that will increase their user base; that’s just my feeling. This shit is addictive, like drugs or gambling.

  • SaraTonin@lemmy.world · 19 hours ago

    If this means you can ask it to pretend to be a busty nurse with a limp, that already exists. If it means that you can say “what’s the name of that video with the busty nurse with a lisp?” and it’ll give you a link, then that’s potential, right there. I can imagine them right now torrenting every porn video they can and getting one LLM to transcribe it to create training data for another LLM while a third LLM does image/scene analysis.