> Local LLMs' speed can't be generalized, as the speed of each instance is entirely determined by its particular runtime environment.
sure. today's on-device LLMs are either slower or less capable by orders of magnitude than most hosted services. they can sometimes be faster if you have your own fancy graphics cards.
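if you want to compare for yourself rather than generalize, tokens per second is the usual yardstick. a minimal sketch, assuming `generate` is a hypothetical wrapper around whatever local runtime you run (llama.cpp, Ollama, etc.), not any specific library's API:

```python
import time

def tokens_per_second(generate, prompt: str, n_tokens: int) -> float:
    """Rough throughput estimate: time how long it takes `generate`
    to produce n_tokens tokens for the given prompt."""
    start = time.perf_counter()
    generate(prompt, n_tokens)  # your local runtime call goes here
    elapsed = time.perf_counter() - start
    return n_tokens / elapsed

# example with a stand-in generator, just to show the shape of the call
def dummy_generate(prompt, n):
    time.sleep(0.01)  # pretend to decode

tps = tokens_per_second(dummy_generate, "hello", 128)
```

run the same prompt against your local box and a hosted API and the comparison stops being hand-waving.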
> There's no concrete guarantee that paying will preclude your data from being used.
usually there is for paid plans. sometimes you have to find and toggle a checkbox in the settings. obviously you should pay attention to that if it matters to you. it matters to a lot of people, and it's usually easy to figure out.
> Might as well reduce this to "don't use LLMs".
don't use LLMs for things you don't understand. that's the rule. they can be quite useful as long as you understand what you're doing with them, and quite dangerous if you use them to bullshit yourself out of your depth.
Local LLMs' speed can't be generalized, as the speed of each instance is entirely determined by its particular runtime environment.
> just pay for the service so they don't use your uploads.
There's no concrete guarantee that paying will preclude your data from being used.
> always read the outputs and don't ask for things you don't understand.
Might as well reduce this to "don't use LLMs".