Running a local LLM / AI (Ollama)

I’ve recently dived headfirst into running LLMs on my local hardware, and I wanted to share what I’ve learned and how I’ve set everything up so others who want to do the same can. If you’re on a Windows desktop machine and you just want to interact with LLMs on your desktop, then skip […]