• QuazarOmega@lemmy.world
      1 year ago

      According to the blog post, it relies on the OpenAI API, which, despite the name, is anything but open, so you can say bye bye to your privacy when you use it. The same goes for any other hosted service, regardless of its openness; at most you can decide to put trust in its privacy policy.

      Until we get a way to interact with online solutions with decent performance via e.g. homomorphic encryption, the only actually private way to use an LLM is to self-host it. If they had instead implemented a locally run LLaMA-based assistant, maybe one of the more lightweight models, then I think it would have been an excellent addition with no downsides
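
      For what it's worth, self-hosting along those lines is already pretty easy today. A minimal sketch, assuming an Ollama server on its default port (11434) serving some local model; the model name `llama3` and the helpers `build_request`/`ask` are my own illustrative choices, not anything from the blog post:

      ```python
      import json
      import urllib.request
      from urllib.error import URLError

      # Default endpoint of a locally running Ollama server (assumption: it is installed and started)
      OLLAMA_URL = "http://localhost:11434/api/generate"

      def build_request(prompt, model="llama3"):
          """Build the JSON payload for a single, non-streaming completion."""
          return {"model": model, "prompt": prompt, "stream": False}

      def ask(prompt):
          """Send the prompt to the local server; the text never leaves your machine."""
          data = json.dumps(build_request(prompt)).encode("utf-8")
          req = urllib.request.Request(
              OLLAMA_URL, data=data,
              headers={"Content-Type": "application/json"},
          )
          try:
              with urllib.request.urlopen(req, timeout=60) as resp:
                  return json.loads(resp.read())["response"]
          except URLError:
              return None  # no local server running

      print(build_request("Summarize this page in one sentence."))
      ```

      An assistant built on something like this would keep every prompt on the user's own hardware, which is exactly the privacy property the OpenAI API can't offer.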