Running Pinegrow AI with Locally Installed LLM AI Assistants

Done it!

Now running Llama 3.2 locally,
via the Ollama app on the Mac.

AND… after fumbling around for some time,
I worked out the API endpoint
and the API key…

They are:

API key: Ollama
Endpoint: http://localhost:11434/v1/chat/completions
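For anyone else trying this, here's a minimal sketch of a request against that endpoint, using only the Python standard library. The model tag `llama3.2` is an assumption (check `ollama list` for the exact name on your machine), and Ollama doesn't actually validate the API key, which is why any value like "Ollama" works:

```python
import json
import urllib.request

# Ollama's OpenAI-compatible chat endpoint (from the settings above).
URL = "http://localhost:11434/v1/chat/completions"

# The payload follows the OpenAI chat-completions shape.
# "llama3.2" is assumed to be the model tag on your machine;
# run `ollama list` to confirm.
payload = {
    "model": "llama3.2",
    "messages": [
        {"role": "user", "content": "Create a 3-column web page using CSS grid."}
    ],
}

request = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        # Ollama ignores the key's value, but OpenAI-style clients
        # expect the header, so any string (e.g. "Ollama") works.
        "Authorization": "Bearer Ollama",
    },
)

# Uncomment to send the request (requires the Ollama app to be running):
# with urllib.request.urlopen(request) as response:
#     reply = json.loads(response.read())
#     print(reply["choices"][0]["message"]["content"])
```

Pinegrow just needs the same two values typed into its AI assistant settings; the snippet only shows what those settings translate to on the wire.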

This was after fooling around with the command line and checking out the text answers, etc.

I have to admit, I'm impressed. Really impressed.
But it did then make a few mistakes, like labelling media queries as a JavaScript file…
And when I asked it why, it admitted it had made a mistake and explained!

All on my local machine.

No data shared, no connection, no Internet required, and it codes far better than me.
I just asked it to create a 3-column web page.
Bang!
It used CSS Grid or Flexbox, whichever I preferred.

OK, now this could be a game changer.
A personal coding tutor.

This may get me back into it for the winter…

[Screenshot: Llama3.2_Ollama_Pinegrow_2_Dec-09-2024]
