TL;DR — Install Open WebUI locally (via pip or Docker), connect any LLM backend (Ollama, OpenAI, Claude, and others), then supercharge the platform with Tools (skills for the model), Functions (features for the UI), and MCP (Model Context Protocol) tool servers. This post walks you through every step and includes ready-to-run code snippets.

Table of Contents

- Why Open WebUI?
- Installing Open WebUI
- First-Run Tour
- Adding & S..