I don't know what the fuss is about. Right below the shell-script curl one-liner (an approach I do think should die), the manual install is documented:
https://github.com/ollama/ollama/blob/main/docs/linux.md
It's all there: you can do it by hand, and the code is open as well.
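For reference, the manual route on that page boils down to roughly this (a sketch from memory, amd64 shown; double-check the linked doc for the exact tarball name for your architecture):

    # Grab the release tarball instead of piping a script into sh
    curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz

    # Unpack into /usr so the binary lands at /usr/bin/ollama
    sudo tar -C /usr -xzf ollama-linux-amd64.tgz

    # Run the server in the foreground; no systemd unit required
    ollama serve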
And if you don't feel like tweaking anything, you can simply run it in a Docker container: expose the port and mount a folder so the downloaded models persist.
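A minimal sketch of that, assuming the stock ollama/ollama image, its default port 11434, and /root/.ollama as the model directory inside the container:

    # Named volume "ollama" persists downloaded models across container restarts
    docker run -d \
      --name ollama \
      -p 11434:11434 \
      -v ollama:/root/.ollama \
      ollama/ollama

    # Pull and talk to a model inside the container (llama3 is just an example)
    docker exec -it ollama ollama run llama3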
That they don't advertise it is nothing unusual.
OpenWhisper, from back when OpenAI was still open, did the same thing.
And being Linux users, we have ways of finding those folders :D
My bad.
(sorry, I can't help it)
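In case anyone actually wants those folders: these are the default locations I remember, depending on how it was installed (treat them as a starting point, not gospel):

    # User-level install keeps models under the home directory
    ls ~/.ollama/models

    # The install script's systemd service runs as the "ollama" user
    ls /usr/share/ollama/.ollama/models

    # Or just go hunting for the store directly
    sudo find / -type d -name ".ollama" 2>/dev/null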
Goes to show though: if you want freedom and things set up the way you want them, you're just using the wrong OS :D