<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"><channel><title>Pipeline Monk</title><description>Practical guides on AI automation, local LLMs, and workflow optimization.</description><link>https://pipelinemonk.com/</link><item><title>The Complete Guide to Running Local LLMs with Ollama in 2026</title><link>https://pipelinemonk.com/local-llms/run-ollama-local-llm-complete-guide/</link><guid isPermaLink="true">https://pipelinemonk.com/local-llms/run-ollama-local-llm-complete-guide/</guid><description>Everything you need to set up Ollama, pick the right model, and start running AI locally on your own hardware — no cloud, no API costs, full privacy.</description><pubDate>Sun, 12 Apr 2026 00:00:00 GMT</pubDate></item><item><title>Build an AI Content Pipeline with N8N and Ollama (Step-by-Step)</title><link>https://pipelinemonk.com/n8n-workflows/n8n-ai-content-pipeline-ollama/</link><guid isPermaLink="true">https://pipelinemonk.com/n8n-workflows/n8n-ai-content-pipeline-ollama/</guid><description>How to wire up N8N workflows and a local Ollama model to research, write, and publish blog articles automatically — fully self-hosted, zero API costs.</description><pubDate>Sat, 11 Apr 2026 00:00:00 GMT</pubDate></item><item><title>Build Your First AI Agent with Ollama and a Tool-Calling Loop</title><link>https://pipelinemonk.com/ai-agents/build-your-first-ai-agent-with-ollama/</link><guid isPermaLink="true">https://pipelinemonk.com/ai-agents/build-your-first-ai-agent-with-ollama/</guid><description>A practical guide to building a working AI agent from scratch: a reasoning loop, tool definitions, and real tool execution — all running locally.</description><pubDate>Fri, 10 Apr 2026 00:00:00 GMT</pubDate></item><item><title>Open WebUI: The Best ChatGPT-Style Interface for Local LLMs</title><link>https://pipelinemonk.com/tools-reviews/open-webui-setup-guide/</link><guid isPermaLink="true">https://pipelinemonk.com/tools-reviews/open-webui-setup-guide/</guid><description>A hands-on review and setup guide for Open WebUI — the self-hosted chat interface that works with Ollama, LiteLLM, and any OpenAI-compatible endpoint.</description><pubDate>Thu, 09 Apr 2026 00:00:00 GMT</pubDate></item></channel></rss>