• Veloxization@yiffit.net · 1 year ago

      Personal tl;dr:

      • Affects only large platforms for now
      • Selling illegal goods is not allowed and there must be a way to report that content
      • Ad targeting based on sexuality, religion, ethnicity or political beliefs is not allowed
      • Targeting ads at children is restricted
      • Transparency on how recommendation algorithms work
      • Option to opt out of recommendation algorithms
      • Requirements for data sharing with researchers and authorities
      • Cooperation with crisis response requirements
      • Requirement for external and independent auditing
      • TenderfootGungi@lemmy.world · 1 year ago

        Musk is actively trying to stop data sharing. He threatened to sue a researcher who was using the API to collect data, claiming a breach of the terms of service.

        • EnderofGames@kbin.social · 1 year ago

          …to stop data sharing for AI training (or other purposes) by other people. There is no way I believe Musk wants to stop the collection and use of data, especially since he is trying to make his own AI, even though he is trying to kill off Twitter at the same time.

    • boredtortoise@lemm.ee · 1 year ago

      Here are the key paragraphs:

      Online platforms must implement ways to prevent and remove posts containing illegal goods, services, or content while simultaneously giving users the means to report this type of content.

      The DSA bans targeted advertising based on a person’s sexual orientation, religion, ethnicity, or political beliefs and puts restrictions on targeting ads to children. It also requires online platforms to provide more transparency on how their algorithms work.

      The DSA carves out additional rules for what it considers “very large online platforms” (over 45 million European users/month), forcing them to give users the right to opt out of recommendation systems and profiling, share key data with researchers and authorities, cooperate with crisis response requirements, and perform external and independent auditing.