• yarr@feddit.nl · 3 months ago

    Are the platforms guilty or are the users that supplied the radicalized content guilty? Last I checked, most of the content on YouTube, Facebook and Reddit is not generated by the companies themselves.

    • KneeTitts@lemmy.world · 3 months ago

      most of the content on YouTube, Facebook and Reddit is not generated by the companies themselves

      It’s their job to block that content before it reaches an audience, but since that’s how they make their money, they don’t or won’t do it. The monetization of evil is the problem, and those platforms are the biggest perpetrators.

      • yarr@feddit.nl · 3 months ago

        It’s their job to block that content before it reaches an audience

        The problem is (or isn’t, depending on your perspective) that it is NOT their job. Facebook, YouTube, and Reddit are private companies that have the right to develop and enforce their own community guidelines or terms of service, which dictate what type of content can be posted on their platforms. This includes blocking or removing content they deem harmful, objectionable, or radicalizing. While these platforms are protected under Section 230 of the Communications Decency Act (CDA), which provides immunity from liability for user-generated content, this protection does not extend to knowingly facilitating or encouraging illegal activities.

        There isn’t specific U.S. legislation requiring social media platforms like Facebook, YouTube, and Reddit to block radicalizing content. However, many countries, including the United Kingdom and Australia, have enacted laws that hold platforms accountable if they fail to remove extremist content. In the United States, there have been proposals to amend or repeal Section 230 of the CDA to make tech companies more responsible for moderating the content on their sites.

        • hybrid havoc@lemmy.world · 3 months ago

          Repealing Section 230 would actually have the opposite effect and lead to less moderation, since it would incentivize platforms not to know about the content in the first place.

          • afraid_of_zombies@lemmy.world · 3 months ago

            I can’t see that. Not knowing about it would be an impossible position to maintain, since you would be getting reports. Now you might say they’d disable reports, which they might try, but they have to do business with other companies who will require that they don’t. Apple isn’t going to let your social media app on its store if people are yelling at Apple about the child porn and bomb threats on it, AWS will kick you off as well, and even Cloudflare might decide you’re not worth the legal risk. This has already happened multiple times, even with Section 230 providing a lot of immunity to these companies. Without that immunity they would be even more likely to block.

  • Minotaur@lemm.ee · 3 months ago

    I really don’t like cases like this, nor do I like how much the legal system seems to be pushing “guilty by proxy” rulings for a lot of school shooting cases.

    It just feels very dangerous, and likely to end badly, to set a precedent where when someone commits an atrocity, essentially every person and thing they interacted with can be held accountable with nearly the same weight as if they had committed the crime themselves.

    Obviously some basic civil responsibility is needed. If someone says “I am going to blow up XYZ school, here is how,” and you hear that, yeah, that’s on you to report. But it feels like we’re quickly slipping toward a point where you have to report a vast number of people to the police en masse if they say anything even vaguely questionable, simply to avoid the potential fallout of being associated with someone who commits a crime.

    It makes me really worried. I really think the internet has made it easy to be able to ‘justifiably’ accuse almost anyone or any business of a crime if a person with enough power / the state needs them put away for a time.

      • Minotaur@lemm.ee · 3 months ago

        Sure, and I get that for, like, healthcare. But ‘systemic solutions’ as they pertain to “what constitutes a crime” lead to police states really quickly, imo.

    • PhlubbaDubba@lemm.ee · 3 months ago

      I dunno about social media companies but I quite agree that the party who got the gunman the gun should share the punishment for the crime.

      Firearms should be titled and insured, the owner should have an imposed duty to secure them, and the owner ought to face criminal penalties if a firearm titled to them is used by someone else to commit a crime. Either they handed a killer a loaded gun, or they inadequately secured a firearm that was then stolen and used in a crime; either way, they failed their responsibility to society as a firearm owner and must face consequences for it.

      • Minotaur@lemm.ee · 3 months ago

        If you lend your brother, who you know is on antidepressants, a long extension cord he tells you is for his back patio, and he hangs himself with it, are you ready to be accused of being culpable for your brother’s death?

        • PhlubbaDubba@lemm.ee · 3 months ago

          Did he also use the cord as improvised ammunition to shoot up the local elementary school, to warrant it being considered a firearm?

          I’m more confused about where I got such a lengthy extension cord from! Am I an event manager? Do I have generators I’m running cable from? Do I get to meet famous people on the job? Do I specialize in fairground festivals?

          • Minotaur@lemm.ee · 3 months ago

            …. Aside from everything else, are you under the impression that a 10-15 ft extension cord is an odd thing to own…?

  • muntedcrocodile@lemmy.world · 3 months ago

    What an excellent precedent to set. Can’t possibly see how this is going to become authoritarian. Oh, you didn’t report someone? You’re also guilty. Can’t see any problems with this.