• airrowM · 2 points · 14 days ago

    His mother, The New York Times reports, plans to file a lawsuit against Character.AI, alleging the platform’s “dangerous and untested” technology led to his death.

    To me this is a compounding problem in the whole situation. I feel like lawsuits like this can be extremely bad for society: they raise the cost of starting a business, which almost guarantees we end up subject to large corporations (a lawsuit like this could probably bankrupt a small business, whereas a large corporation can afford to pay).

    But as far as whether the AI contributed to the suicide, I’m not sure. Perhaps parents should prudently restrict access to things like this in general. Perhaps the person would have taken their life regardless of any interaction with the AI.

    • AliceOPMA · 2 points · 13 days ago

      Don’t most AI apps have a “we’re not a mental health support app” clause in their TOS? Either way, this is sad.