YouTube and Reddit are sued for allegedly enabling the racist mass shooting in Buffalo that left 10 dead::The complementary lawsuits claim that the massacre in 2022 was made possible by tech giants, a local gun shop, and the gunman’s parents.

  • 【J】【u】【s】【t】【Z】@lemmy.world · +92/−18 · 1 year ago

    Fantastic. I’ve been waiting to see these cases.

    Start with a normal person, get them all jacked up on far-right propaganda, and then they go kill someone. If the website knows people are being radicalized into violent ideologies and does nothing to stop it, that’s a viable claim for wrongful death. It’s about foreseeability and causation, not about who did the shooting. Really a lot of people coming in on this thread who obviously have no legal experience.

    • GreenBottles@lemmy.world · +48/−8 · 1 year ago

      I just don’t understand how hosting a platform that lets people talk would make you liable, since you’re not the one responsible for the speech itself.

      • theluddite@lemmy.ml · +47/−5 · 1 year ago

        Is that really all they do, though? That’s what they’ve convinced us they do, but everyone on these platforms knows how crucial it is to tweak your content to please the algorithm. They also do everything they can to become monopolies, without which it wouldn’t even be possible to start on DIY videos and end up at white supremacy or whatever.

        I wrote a longer version of this argument here, if you’re curious.

      • Pyr_Pressure@lemmy.ca · +16/−1 · 1 year ago

        I agree to a point, but I think that, depending on how things are structured on the platform side, they can bear some responsibility.

        Think of Facebook. They have algorithms that make sure you see what they think you want to see. It doesn’t matter if that content is hateful and dangerous; they will push more of it onto a damaged person and stoke the fires, simply because they think it will make them more advertising revenue.

        They should be screening that content and making it less likely for anyone to see it, let alone damaged people. And I guarantee you they know which of their users are damaged people just from comment and search histories.

        I’m not sure Reddit works this way; with the upvote and downvote system, it may be more the users who decide the content you see. But Reddit has communities, which it can keep a closer eye on to prevent hateful and dangerous content from being shared.
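        For illustration, here is a minimal sketch of the kind of engagement-first feed being described. Every class, feature, and weight below is invented, since no platform publishes its real recommender:

        ```python
        # Hypothetical engagement-optimized feed ranker (illustrative only;
        # all names, features, and weights are made up for this sketch).
        from dataclasses import dataclass

        @dataclass
        class Post:
            post_id: str
            predicted_watch_time: float  # model's guess, in minutes
            predicted_click_rate: float  # 0.0 to 1.0
            outrage_score: float         # 0.0 to 1.0, a proxy for "negative engagement"

        def engagement_score(post: Post) -> float:
            # Note what's missing: nothing here asks whether the content is
            # hateful or dangerous. If outrage keeps people watching, it ranks well.
            return (0.5 * post.predicted_watch_time
                    + 0.3 * post.predicted_click_rate
                    + 0.2 * post.outrage_score)

        def build_feed(candidates: list[Post], n: int = 10) -> list[Post]:
            # A pure engagement sort: "what they think you want to see," nothing else.
            return sorted(candidates, key=engagement_score, reverse=True)[:n]
        ```

        The telling part is the absent term: unless someone deliberately adds a penalty for harmful content, an objective like this is indifferent to it.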

      • CaptainAniki@lemmy.flight-crew.org · +16/−5 · 1 year ago

        Because you are responsible for hiring psychologists to tailor a platform to boost negative engagement, and now there will be a court case to determine culpability.

        • whatisallthis@lemm.ee · +6 · 1 year ago

          Reddit is going to have to make the argument that it just boosts “what people like” and it just so happens people like negative engagement.

          And I mean it’s been known for decades that people like bad news more than good news when it comes to attention and engagement.
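          For a concrete version of "it just boosts what people like": below is a lightly simplified copy of the "hot" ranking from Reddit’s formerly open-source codebase. The formula sees only net votes and post age; it is blind to what a post actually says, so any tilt toward negativity comes from what users upvote:

          ```python
          from datetime import datetime, timezone
          from math import log10

          # Simplified "hot" ranking from Reddit's old open-source codebase.
          EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

          def hot(ups: int, downs: int, date: datetime) -> float:
              s = ups - downs                # net score: "what people like"
              order = log10(max(abs(s), 1))  # diminishing returns on raw votes
              sign = 1 if s > 0 else -1 if s < 0 else 0
              # 1134028003 is Reddit's epoch constant; newer posts score higher.
              seconds = (date - EPOCH).total_seconds() - 1134028003
              return round(sign * order + seconds / 45000, 7)
          ```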

      • YeetPics@mander.xyz · +6/−1 · 1 year ago

        Tell that to the admins of lemmy.world defederating from communities because they may be held liable for what shows up on their website.

      • Anonymousllama@lemmy.world · +1/−1 · 1 year ago

        We should get the thought police in on this too and stop it before it has a chance to spread. For real though, people need to take accountability for their own actions and stop trying to deflect it onto others.

    • SCB@lemmy.world · +21/−5 · 1 year ago

      a viable claim for wrongful death

      Something tells me you’re not a lawyer.

    • Phoenixz@lemmy.ca · +6/−2 · 1 year ago

      Really a lot of people coming in on this thread who obviously have no legal experience.

      Like you

    • gowan@reddthat.com · +3/−3 · 1 year ago

      The catch is whether the site knows that a specific individual is being radicalized. If admins weren’t regularly punishing the account, I wonder how difficult it will be to prove Reddit/YT specifically pushed this guy.