• brucethemoose@lemmy.world · edit-2 · 3 days ago

    I think the metaphor is: fine-tuning an LLM for ‘safety’ is like trying to engineer blender blades to be “finger safe”, when the better approach is to guard against fingers getting into an active blender in the first place.

    Fine-tuning LLMs to be safe just isn’t going to work, but building stricter usage structures around them will. Treat them like tools.
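    To make the “usage structures around the model” idea concrete, here’s a minimal sketch of an external guardrail: policy checks live in ordinary code wrapping the model call, rather than in the model’s fine-tuned refusals. The `generate` callable, the pattern list, and the message strings are all hypothetical placeholders, not any real API.

    ```python
    import re

    # Hypothetical policy rules enforced outside the model itself.
    BLOCKED_PATTERNS = [
        re.compile(r"\bdisable the safety interlock\b", re.IGNORECASE),
    ]

    def guarded_generate(prompt: str, generate) -> str:
        """Wrap an arbitrary `generate` callable with external policy checks
        on both the input and the output."""
        if any(p.search(prompt) for p in BLOCKED_PATTERNS):
            return "Request blocked by usage policy."
        reply = generate(prompt)
        if any(p.search(reply) for p in BLOCKED_PATTERNS):
            return "Response withheld by usage policy."
        return reply

    # Stub "model" for demonstration: just echoes the prompt.
    echo = lambda prompt: f"echo: {prompt}"
    print(guarded_generate("what's the weather?", echo))
    ```

    The point of the design is that the guard is deterministic and auditable: you can test it, version it, and tighten it per deployment, none of which you can do with behavior baked into the weights.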

    This kind of goes against Altman’s assertion that they’re magic crystal balls (in progress), which would pop the bubble he’s holding up. But down in the weeds of LLM land, you see a lot more people calling for less censoring and more sensible, narrow usage.