Today I was doing my daily ritual of looking at DistroWatch. Today's review section was about a terminal called Warp, which has built-in AI for command recommendations and corrections (like zsh and nushell). You can also ask a chatbot for help. I think it's a neat concept, but the security side is what makes me a bit skittish. They say they don't collect data, and that you can verify this as well as opt out. But the idea of a terminal being read by an AI makes me hesitant, as does needing an account to use Warp. What do you guys think?
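
To give a feel for the command-correction idea, here is a rough, rule-based sketch in the spirit of zsh's correct option. It is purely illustrative; Warp is closed source and presumably uses an actual model rather than simple string matching.

```python
# Purely illustrative sketch of command correction, in the spirit of zsh's
# "correct" option; NOT how Warp is implemented.
import difflib
import shutil

# A small, hypothetical list of commands to match against.
KNOWN_COMMANDS = ["git", "grep", "ls", "cat", "ssh", "sudo", "docker", "nano"]

def suggest_correction(typed: str) -> str | None:
    """Return the closest known command if the typed one doesn't exist on PATH."""
    cmd, _, rest = typed.partition(" ")
    if shutil.which(cmd):
        return None  # the command exists, nothing to correct
    matches = difflib.get_close_matches(cmd, KNOWN_COMMANDS, n=1, cutoff=0.6)
    return f"{matches[0]} {rest}".strip() if matches else None

print(suggest_correction("gerp -r TODO ."))  # -> "grep -r TODO ."
```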

  • Saracha@lemmy.world · 8 upvotes · 8 months ago

    So I took some time to look around, and this is my perspective as a regular non-dev user. While this does seem like a tool that could be useful for someone who interacts with the command line on an infrequent basis, the drawbacks seem pretty big.

    1. Everywhere on their website they seem clear that they don't store your data, but I have trouble believing that. Why on earth would they need you to create an account and log in to use the terminal if they have no need to monitor your data?
    2. While they claim they intend to monetize this by charging enterprise users and letting small teams use it for free, they limit free requests to 20 per day, which seems less than useless.
    3. Maybe this is just some confusion on my part since I don't have any enterprise experience, but it seems like an unacceptable security risk to have a program that tells you it sends telemetry back home while users are interacting with it using sudo and elevated privileges. Especially when it is a closed box. (A rough way to check what it actually phones home is sketched below.)
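
    For anyone who wants to sanity-check the telemetry claims themselves, here is a small sketch that lists the outbound connections a running process has open. It needs the psutil package, and the process name "warp-terminal" is just my guess for illustration; adjust it to whatever the real process is called on your system.

    ```python
    # Sketch: list the network connections opened by a running process, as one
    # way to sanity-check "we don't collect data" claims. Requires psutil
    # (pip install psutil). The process name below is an assumption.
    import psutil

    TARGET_NAME = "warp-terminal"  # hypothetical process name

    for proc in psutil.process_iter(["pid", "name"]):
        if proc.info["name"] == TARGET_NAME:
            try:
                for conn in proc.connections(kind="inet"):
                    if conn.raddr:  # only connections with a remote endpoint
                        print(f"pid {proc.pid}: {conn.raddr.ip}:{conn.raddr.port} ({conn.status})")
            except psutil.AccessDenied:
                print(f"pid {proc.pid}: need elevated privileges to inspect")
    ```

    Of course this only shows where traffic goes, not what is in it, but unexpected endpoints would at least be a red flag.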

    Ignoring all the reasons to be cautious and skeptical about AI in general, I struggle to see the use case for this particular tool.

    • filister@lemmy.world · 1 upvote · 8 months ago

      And now I am imagining some sophisticated hack that breaches their AI backend and starts slipping in command arguments that might expose your system. Probably too much effort, but still plausible.
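
      Just to illustrate the kind of guardrail a client could put in front of AI-suggested commands, here is a minimal sketch that refuses suggestions matching a few known-bad patterns. The patterns are made-up examples, nowhere near exhaustive, and nothing here reflects how Warp actually handles suggestions.

      ```python
      # Illustrative guardrail: refuse obviously dangerous AI-suggested commands
      # before they reach the shell. The deny patterns are examples only.
      import re

      DENY_PATTERNS = [
          r"\brm\s+-rf\s+/(\s|$)",           # recursive delete of the root filesystem
          r"\bcurl\b.*\|\s*(sudo\s+)?sh\b",  # piping a remote script straight into a shell
          r">\s*/dev/sd[a-z]\b",             # writing directly to a block device
      ]

      def looks_dangerous(suggested_command: str) -> bool:
          """Return True if an AI-suggested command matches a known-bad pattern."""
          return any(re.search(p, suggested_command) for p in DENY_PATTERNS)

      suggestion = "curl https://example.com/install.sh | sudo sh"
      if looks_dangerous(suggestion):
          print("Refusing to run suggested command:", suggestion)
      else:
          print("Suggestion passed the (very rough) checks.")
      ```

      A denylist obviously can't catch a genuinely malicious backend, but it shows why blindly executing whatever the model suggests is the real risk.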