Hey, I’m Anthony.

Cloud engineering leader, tinkerer, and occasional writer. Field Notes is where I explore things in tech I find interesting — not always cloud, not always AI, just stuff worth digging into. Pull up a post.

The API in Front of the AI: Part 2

Filed under: Cloud Engineering · AI Infrastructure · Local Lab

Picking Up Where We Left Off

In Part 1, we got Bifrost running locally on your Mac, wired up Ollama with qwen3.5, and verified the whole stack was humming: requests flowing through the gateway, streaming working, tool calling confirmed. Now we’re going deeper. This post is about MCP — the Model Context Protocol. If Part 1 was about giving your AI a reliable phone line, Part 2 is about giving it hands. By the end you’ll have a local MCP server running on your machine that exposes real tools, connected through Bifrost, so qwen3.5 can actually do things on your behalf — check system info, run safe shell commands, do math — all without touching the cloud. ...

March 31, 2026 · 8 min · Anthony Mineer

The API in Front of the AI

Filed under: Cloud Engineering · AI Infrastructure · Local Lab

You’ve Got APIs. Now You’ve Got AI APIs. Now What?

Picture this: you grab an Ollama model, wire it into your app locally — done. Celebrate. But two months later? You’ve got five apps, a handful of models, and absolutely no visibility into what’s being called, how often, or what it’s costing you in compute. Sound familiar? Welcome to the reason LLM gateways exist. This is Part 1 of a two-part Field Notes series. Here we cover what an LLM gateway is, why you’d want one, and how to get Bifrost running locally on your Mac against Ollama with qwen3.5 — fully offline, fully free, fully yours. ...

March 24, 2026 · 8 min · Anthony Mineer