Is anyone actually surprised by this?

    • PhilipTheBucket@ponder.cat · 2 days ago

      Yes. I also like how the alarming take on it is not “People are typing their passwords / medical histories / employer’s source code into ChatGPT, and from there it goes straight into the training data, not only to be stored forever in the corpus but also, sometimes, to be extracted later by any yahoo who knows how to tease it back out of ChatGPT with the right carefully crafted prompting!”

      But instead it is “When you type things, they can see what you type! The keystrokes!”

      • catloaf@lemm.ee · 2 days ago

        And they probably aren’t even doing that. More likely, it’s just bot prevention.
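
        For anyone curious what that would even look like, here’s a minimal sketch of client-side keystroke-timing capture of the kind bot-prevention systems use. This is hypothetical code with illustrative thresholds, not anything pulled from the actual app:

        ```typescript
        // Hypothetical sketch of keystroke-timing capture for bot detection.
        // Human typing shows irregular inter-key delays; scripted input
        // tends to be implausibly fast or uniform.
        const delays: number[] = [];
        let lastKeyTime: number | null = null;

        document.addEventListener("keydown", () => {
          const now = performance.now();
          if (lastKeyTime !== null) {
            delays.push(now - lastKeyTime); // ms since previous keystroke
          }
          lastKeyTime = now;
        });

        // Crude heuristic: near-zero variance or superhuman speed suggests
        // automation. Thresholds are illustrative, not from any real product.
        function looksAutomated(samples: number[]): boolean {
          if (samples.length < 20) return false; // too little data to judge
          const mean = samples.reduce((a, b) => a + b, 0) / samples.length;
          const variance =
            samples.reduce((a, b) => a + (b - mean) ** 2, 0) / samples.length;
          return variance < 25 || mean < 30;
        }
        ```

        Note that nothing in that sketch needs the keystroke contents, only the timing, which is exactly why “they log keystroke patterns” reads more like anti-bot telemetry than surveillance.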

        • melroy@kbin.melroy.org · 2 days ago

          They are potentially training on this data; that much is a fact. Only if you use some kind of special corporate license do they promise not to train on it (and you have to trust them on that).

        • PhilipTheBucket@ponder.cat · 2 days ago

          I wouldn’t be so sure. China is at the forefront of automated techniques for spying on and manipulating people through their own devices at massive scale. If they had some semi-workable technology to fingerprint individuals through their typing patterns, in conjunction with fingerprinting their devices through other means, that would make perfect sense to me.
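
          To illustrate the idea (a hypothetical sketch, not a claim about what any vendor actually runs): keystroke-dynamics fingerprinting reduces someone’s typing rhythm to a feature vector and matches it against stored profiles. Real systems use much richer per-digraph timing features; the profile shape and distance metric below are assumptions for illustration.

          ```typescript
          // Hypothetical keystroke-dynamics matching: summarize a user's
          // inter-key delays and find the closest stored typing profile.
          // The profile store and distance metric here are assumptions.
          interface TypingProfile {
            userId: string;
            meanDelayMs: number;
            stdDevMs: number;
          }

          function featurize(delays: number[]): { mean: number; stdDev: number } {
            if (delays.length === 0) return { mean: 0, stdDev: 0 };
            const mean = delays.reduce((a, b) => a + b, 0) / delays.length;
            const variance =
              delays.reduce((a, b) => a + (b - mean) ** 2, 0) / delays.length;
            return { mean, stdDev: Math.sqrt(variance) };
          }

          // Nearest-profile match by Euclidean distance in feature space.
          function closestProfile(
            delays: number[],
            profiles: TypingProfile[],
          ): TypingProfile | null {
            const { mean, stdDev } = featurize(delays);
            let best: TypingProfile | null = null;
            let bestDist = Infinity;
            for (const p of profiles) {
              const dist = Math.hypot(mean - p.meanDelayMs, stdDev - p.stdDevMs);
              if (dist < bestDist) {
                bestDist = dist;
                best = p;
              }
            }
            return best;
          }
          ```

          Even a weak signal like this, cross-referenced with device fingerprints gathered through other channels, narrows identity considerably at scale.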

          I don’t think it’s an especially pressing concern for DeepSeek specifically, for reasons discussed elsewhere in the comments. That one aspect of the privacy issue is probably being overblown when there are adjacent privacy and security concerns that are a lot more pressing. And honestly, the practice isn’t proven simply because it appears in the privacy policy; even if they are doing something like that, whether it shows up in this particular policy or this particular app isn’t the notable part.