• towerful@programming.dev
    17 upvotes · 8 months ago

    Where does it say they have access to PII?
    I would imagine Reddit would be anonymising the data: hashes of usernames (and of any username matches within content), plus post/comment content with upvote/downvote counts. I would hope they are also screening content for PII.
    I don't think the deal is for PII, just for training data
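    As a rough sketch of what that anonymisation might look like (the key, scheme, and record fields here are my assumptions, nothing Reddit has confirmed), usernames could be replaced with keyed one-way hashes, so the same user stays consistently linkable within the dataset without being reversible:

    ```python
    import hashlib
    import hmac

    # Hypothetical one-way anonymisation: usernames become keyed hashes (HMAC),
    # so the same user maps to the same token across the dataset, but the token
    # cannot be reversed without the secret key.
    SECRET_KEY = b"held-by-data-owner-only"  # assumption: key never leaves Reddit

    def anonymise_username(username: str) -> str:
        """Return a stable, irreversible token for a username."""
        return hmac.new(SECRET_KEY, username.encode("utf-8"), hashlib.sha256).hexdigest()

    # Assumed shape of one anonymised training record:
    record = {
        "author": anonymise_username("towerful"),
        "body": "example comment text",
        "upvotes": 17,
        "downvotes": 0,
    }
    ```

    A plain unkeyed hash would be weaker here: usernames are guessable, so anyone could hash candidate names and match them against the dataset. The key is what makes it one-way in practice.
    
    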

    • just_change_it@lemmy.world
      3 upvotes, 1 downvote · 8 months ago

      > Where does it say they have access to PII?

      So technically they haven't sold any PII if all they provide is IP addresses. Legally, an IP address is not PII. Google knows all our IP addresses if we have an account with them or interact with them in certain ways. Sure, some people aren't trackable, but I'm just going to call it out: for all intents and purposes, basically everyone is tracked by Google.

      Only the most security paranoid individuals would be anonymous.

      • towerful@programming.dev
        4 upvotes · 8 months ago

        Depends where and how it's applied.
        Under GDPR, IP addresses are essential to the operation and security of websites, so logging/processing them can be suitably justified without requiring consent (just disclosure).
        Under CCPA, it seems like an IP address isn't PII if it can't be linked to a person/household.

        However, an IP address isn't needed as part of AI training data, and alongside comment/post data it could potentially identify a person/household. So it seems risky under both GDPR and CCPA.

        I think Reddit would be risking huge legal exposure if they included IP addresses in the data set.
        And I don't think Google would accept a data set that includes information like that, due to the legal exposure.

        • just_change_it@lemmy.world
          2 upvotes · 8 months ago

          ML can be applied in a great number of ways. One such way could be content moderation, especially detecting people who use alternate accounts to reply to their own content or manipulate votes, etc.

          By including IP addresses with the comments, they could correlate who said what where, and better learn how to detect similar posting styles despite deliberate attempts to appear to be someone else.

          It's a legitimate use case. Not sure about the legality… but I doubt Google or Reddit would ever acknowledge what data is included unless they believed liability was minimal. So far they haven't acknowledged anything beyond the deal existing, afaik.
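          The moderation signal described above could be sketched roughly like this (field names and the `flag_possible_alts` helper are illustrative assumptions, not Reddit's actual schema): group comments by (hashed) IP and flag any IP used by more than one account.

          ```python
          from collections import defaultdict

          # Hypothetical sketch: flag cases where multiple accounts post from the
          # same (hashed) IP address, which can indicate vote manipulation or
          # self-replies from alternate accounts.
          def flag_possible_alts(comments: list[dict]) -> dict[str, set[str]]:
              """Map each hashed IP to the set of accounts posting from it,
              keeping only IPs shared by more than one account."""
              accounts_by_ip = defaultdict(set)
              for c in comments:
                  accounts_by_ip[c["ip_hash"]].add(c["author"])
              return {ip: users for ip, users in accounts_by_ip.items() if len(users) > 1}

          comments = [
              {"author": "alice", "ip_hash": "h1"},
              {"author": "alice_alt", "ip_hash": "h1"},
              {"author": "bob", "ip_hash": "h2"},
          ]
          # flag_possible_alts(comments) → {"h1": {"alice", "alice_alt"}}
          ```

          Note that this works on hashed IPs just as well as raw ones, which matters for the anonymisation point below: same-IP correlation only needs a stable token, not the address itself.
          
          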

          • towerful@programming.dev
            1 upvote · 8 months ago

            Yeh, but it's such a grey area.
            If the result was for security only, it could potentially pass as "essential" processing.
            But considering the scope of content posted on Reddit (under-18s, details of medical, even criminal, content), it becomes significantly harder to justify processing that data alongside PII (or its equivalent).
            Especially since it's a change to the terms-of-service agreement (passing data to third-party processors).

            If security moderation is what they want in exchange for the data (and money), it's more likely that Reddit would include one-way anonymised PII (i.e. IP addresses that are hashed), so only Reddit can recover/confirm IP addresses against the model.
            Because if they aren't… then they (and Google) are gonna get FUCKED in EU courts
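            That "only Reddit can confirm" property could look something like this sketch (key name and scheme are my assumptions): IPs are stored only as keyed hashes, so the dataset holder can't recover them, while the key holder can still check whether a candidate IP matches a record.

            ```python
            import hashlib
            import hmac

            # Hypothetical keyed one-way hashing of IP addresses. Without the key,
            # a third party holding the dataset cannot recover the addresses; the
            # key holder can still confirm a candidate IP against a stored hash.
            PEPPER = b"secret-held-by-reddit"  # assumption for illustration

            def hash_ip(ip: str) -> str:
                return hmac.new(PEPPER, ip.encode("utf-8"), hashlib.sha256).hexdigest()

            def confirm_ip(candidate_ip: str, stored_hash: str) -> bool:
                # Constant-time comparison avoids leaking matches via timing.
                return hmac.compare_digest(hash_ip(candidate_ip), stored_hash)
            ```

            One caveat worth noting: the IPv4 space is small (~4 billion addresses), so if the key ever leaked, the whole dataset could be brute-forced. Hashed IPs are weaker anonymisation than they sound, which is part of why this stays a grey area under GDPR.
            
            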