Running Pinegrow AI with Locally Installed LLM AI Assistants

Oh, and this native Mac app will run against the locally hosted LLM server via Ollama, with whatever model you are running through it… but I can't try it, as my macOS is too old (12.x Monterey) and this Mac app requires macOS 14+.

Just start up the Ollama server and, on a compatible macOS, you should be good to go.
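
For anyone with a compatible setup who wants to sanity-check the server, here's a minimal sketch of talking to the local Ollama endpoint from Python. It assumes Ollama's default port (11434) and a model called llama3 that you've already pulled; swap in whatever model you're actually running.

```python
import requests

# Query the locally running Ollama server (default port 11434).
# "llama3" is just an example model name; use whatever you pulled.
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",
        "prompt": "Say hello in one sentence.",
        "stream": False,  # return the full reply as one JSON object
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])
```

If that prints a reply, the server is up and any app pointed at localhost:11434 should be able to use it.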

It also runs on iOS… running Ollama… wow!
I've no idea how that works.
