Building a Local LLM Gateway on a Mac Mini
A small OpenAI-compatible proxy in front of Ollama — Bearer auth, model routing, streaming — so no cluster pod has to hold a vendor API key.
The first post on the new blog — a brief note on why this exists and what's coming.
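The gateway described in the subtitle — an OpenAI-compatible proxy in front of Ollama with Bearer auth, model routing, and streaming — can be sketched in a few dozen lines. This is a minimal illustration, not the post's actual implementation: the token, port, and model map below are assumptions, and Ollama is assumed to be serving its OpenAI-compatible endpoint at its default address.

```python
# Sketch of a local LLM gateway (hypothetical names throughout):
# checks a Bearer token, maps public model names to local Ollama models,
# and proxies /v1/chat/completions to Ollama, copying the response
# through chunk by chunk so streaming (SSE) responses pass untouched.
import json
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

GATEWAY_TOKEN = "local-secret"               # assumed shared secret for pods
OLLAMA_URL = "http://localhost:11434"        # Ollama's default address
MODEL_MAP = {"gpt-4o-mini": "llama3.1:8b"}   # hypothetical routing table

class Gateway(BaseHTTPRequestHandler):
    def do_POST(self):
        # Bearer auth: cluster pods hold this token, never a vendor API key.
        if self.headers.get("Authorization") != f"Bearer {GATEWAY_TOKEN}":
            self.send_response(401)
            self.end_headers()
            return
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length))
        # Model routing: rewrite the requested model to a local one.
        requested = body.get("model")
        body["model"] = MODEL_MAP.get(requested, requested)
        req = urllib.request.Request(
            OLLAMA_URL + "/v1/chat/completions",
            data=json.dumps(body).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as upstream:
            self.send_response(upstream.status)
            ctype = upstream.headers.get("Content-Type", "application/json")
            self.send_header("Content-Type", ctype)
            self.end_headers()
            # Streaming: relay upstream chunks as they arrive.
            for chunk in iter(lambda: upstream.read(8192), b""):
                self.wfile.write(chunk)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), Gateway).serve_forever()
```

Clients then point their OpenAI SDK's `base_url` at the gateway and use the gateway token as the API key; the proxy handles the rest.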