• GreenKnight23@lemmy.world · +1 · 2 hours ago

    this is bullshit.

    the study was performed by Navinci Diagnostics, which has a vested interest in the use of technological diagnostic tools like AI.

    the only way to truly identify cancer is through physical examination and testing. instead of wasting resources on AI, we should improve early detection by making tests more efficient, so patients can test more often and more cheaply.

  • kalkulat@lemmy.world · +19 / −2 · 14 hours ago

    From the article: “All 232 men in the study were assessed as healthy when their biopsies were examined by pathologists. After less than two-and-a-half years, half of the men in the study had developed aggressive prostate cancer…”

    HALF? I’d suggest staying away from that study… either they don’t know what they’re doing, or some AI made up that article…

    • brendansimms@lemmy.world · +7 · 3 hours ago

      From the peer-reviewed paper: “This study examined if artificial intelligence (AI) could detect these morphological clues in benign biopsies from men with elevated prostate-specific antigen (PSA) levels to predict subsequent diagnosis of clinically significant PCa within 30 months”… so yes, these were men who all had high cancer risk.

    • Hawk@lemmy.dbzer0.com · +7 / −1 · 7 hours ago

      Maybe they specifically picked men with increased risk?

      Half sounds pretty nuts otherwise.

  • PmMeFrogMemes@lemmy.world · +80 · 1 day ago

    This is what machine learning is supposed to be: specialized models that solve a specific problem. Refreshing to read about some real AI research.

    • Phoenixz@lemmy.ca · +5 · 3 hours ago

      Yeah, this is exactly the kind of place where AI can actually shine, and we hear almost nothing about it because making fake porn videos of your daughter-in-law is somehow more important.

    • mintiefresh@piefed.ca · +17 · 1 day ago

      I feel like in an ideal world, people would use AI to improve the quality of their work, rather than being replaced by AI itself.

      • SugarCatDestroyer@lemmy.world · +4 · edited · 3 hours ago

        We live in a world where the strong take the last food from the weak in order to live even more luxuriously, because luxury, so to speak, is built on stolen or outright slave labor.

        In short, the rich are rich only because they exploit the poor, or outright slaves; otherwise they would be beggars or middle class.

    • brendansimms@lemmy.world · +6 · 3 hours ago

      After reading the actual published paper referenced in the article, I would downvote the article because the title is clickbait and does not reflect the paper’s conclusions. The title suggests that AI could replace pathologists, or that pathologists are inept. That is not the case. A better title would be: “Pathologists use AI to determine whether biopsied tissue samples contain markers of cancerous tissue outside the biopsied region.”

    • Devmapall@lemmy.zip · +10 · 23 hours ago

      There was also a study going around claiming that LLMs caused human cancer screenings to decrease in accuracy. I’m not a scientist, but I’m pretty sure the sample size was very small and limited to a single hospital?

      Anyway, maybe people are remembering that, in addition to the reflexive AI-hating downvotes.

      Not that I’m a fan of AI being shoved everywhere, but this isn’t that.

      • absentbird@lemmy.world · +2 · 3 hours ago

        Why would you use a large language model to examine a biopsy?

        These should be specialized models trained on structured data sets, not the unbridled chaos of an LLM. They’re both called “AI,” but they’re wildly different technologies.

        It’s like criticizing a doctor for relying on an air conditioner to keep samples cool when in fact they used a freezer, simply because the mechanism of refrigeration is similar.

        • SugarCatDestroyer@lemmy.world · +3 · edited · 3 hours ago

          Well, it would be logical to say that anonymity is a threat. Plus, it makes it easier to block thought-criminals if they become a threat… :3

          What anonymity, don’t joke with me here.

  • GraniteM@lemmy.world · +5 / −3 · 1 day ago

    I thought the article was telling an unmarried woman that AI can find the cancer pathologists she’s been looking for. Not sure why they’d be hiding.