Where did you get those instructions from? Is creating a virtual environment necessary if I'm fine with it running on my real system?
I assume the environment part is what the "conda" commands on the GitHub repo readme are doing, but finding "conda" to install seems to be its own process. It's not on MacPorts, pip seems to only install a Python package instead of an executable, and getting a package from some other site feels sketchy.
What is it with ML and Python, anyway? Why is this amazing new technology being shrouded in an ecosystem and language which… well, I guess if I can't say anything nice…
Conda's actually a pretty well-respected Python package and environment manager from Anaconda (see e.g. https://en.wikipedia.org/wiki/Anaconda_(Python_distribution)). The full Anaconda distribution bundles a lot of the standard scientific Python packages on top of the package manager and virtual environments, or you can use the Miniconda version if you just want conda itself (package manager + virtualenv) without the extras.
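For anyone who hasn't used it, both jobs live in one tool. A minimal sketch, assuming Miniconda (or Anaconda) is already installed and on your PATH; the env name and package/version picks are just examples, not anything the repo requires:

    # "coreml-test" and the pins below are only example names/versions
    conda create -n coreml-test python=3.10
    conda activate coreml-test
    conda install numpy scipy   # conda doing the package-manager half
    conda list                  # only shows what's installed inside this env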
I think whether you need a virtualenv depends on your system Python version and whether the dependencies are all compatible with it, but it's also pretty nice to be able to spin up or blow away envs without bloating your main Python directory or worrying that you're overwriting dependencies for a different project.
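And the spin-up/blow-away part really is that cheap; removing an env just deletes its own directory and leaves your system Python alone (env name carried over from the sketch above):

    conda env list                    # see which envs exist and where they live on disk
    conda deactivate                  # drop back to the base/system environment
    conda env remove -n coreml-test   # blow the whole env away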
I’m still stubbornly using MacPorts. If it ain’t broke…
But given that the entire world of technical documentation assumes all technically-inclined people using Macs are using Homebrew, I’ll probably have to give up and switch over at some point. But not yet.
Grew up on BSD, so I feel you. I'd say it became time to give in after they cleaned up the need for sudo and moved everything under /opt.
In fact, if you used it before they cleaned all that up, or used it before the move from Intel to ARM and restored to the new arch instead of doing a fresh install, it's worth dumping your packages to a Brewfile, uninstalling ALL packages and brew itself, and reinstalling fresh on this side of the permissions and path cleanups (rough sketch below, after the link).
- Migrate Homebrew from Intel Macs to Apple Silicon Macs:
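Roughly what that reset looks like, for anyone curious. The brew bundle commands are standard; the install/uninstall one-liners are the ones from the Homebrew install repo, but double-check them against the current official docs before running anything:

    # snapshot everything currently installed (formulae, casks, taps) to ~/Brewfile
    brew bundle dump --file=~/Brewfile
    # remove Homebrew and all of its packages
    /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/uninstall.sh)"
    # reinstall fresh (lands in /opt/homebrew on Apple Silicon)
    /bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
    # restore everything from the snapshot
    brew bundle install --file=~/Brewfile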
They're basically conventions for Python, but the actual instructions I just found are sitting unexpanded in the README on the GitHub repo. You have to run one of the commands, which downloads the model and converts it to Core ML for you. If you've never used Hugging Face, you'll need to create an account, get a token, and use their CLI to log in with that token before you can download the model. Then you can run prompts from the CLI with the commands they give (rough sketch of the Hugging Face step below).
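In case it saves anyone a search, the Hugging Face part boils down to the following. The CLI comes from the huggingface_hub package; the last lines are only a placeholder, since the actual download/convert/generate commands are whatever the repo's README gives you:

    pip install huggingface_hub   # provides the huggingface-cli tool
    huggingface-cli login         # paste the access token from your account settings
    huggingface-cli whoami        # sanity check that the token took
    # then run the repo's own commands to download/convert the model and run prompts
    # (placeholder -- see the README for the real ones)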