# OpenWebUI Pipelines
Pipelines run alongside OpenWebUI to handle OpenAI-compatible traffic on port 9099. The Deployment mounts a persistent volume at /app/pipelines so installed pipeline modules and dependencies survive pod restarts.
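The shape of that Deployment can be sketched as below. This is a minimal illustration assuming the resource names used elsewhere in this document (`openwebui-pipelines`, `openwebui-pipelines-storage`, `app-openwebui-pipelines-api-key`); the image tag and labels are assumptions, not taken from the actual manifest:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: openwebui-pipelines
  namespace: open-webui
spec:
  replicas: 1
  selector:
    matchLabels:
      app: openwebui-pipelines
  template:
    metadata:
      labels:
        app: openwebui-pipelines
    spec:
      containers:
        - name: pipelines
          image: ghcr.io/open-webui/pipelines:main  # upstream image; tag is illustrative
          ports:
            - containerPort: 9099  # OpenAI-compatible API
          env:
            - name: PIPELINES_API_KEY
              valueFrom:
                secretKeyRef:
                  name: app-openwebui-pipelines-api-key
                  key: PIPELINES_API_KEY
          volumeMounts:
            - name: pipelines-data
              mountPath: /app/pipelines  # installed pipelines survive restarts via the PVC
      volumes:
        - name: pipelines-data
          persistentVolumeClaim:
            claimName: openwebui-pipelines-storage
```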
## Configuration
- Storage: `openwebui-pipelines-storage` is a 10Gi Longhorn PersistentVolumeClaim mounted at `/app/pipelines` in the `openwebui-pipelines` Deployment.
- Secrets: The ExternalSecret `app-openwebui-pipelines-api-key` sources `PIPELINES_API_KEY` from Bitwarden and injects it into the container.
- Service: The `pipelines` ClusterIP Service exposes port `9099` inside the `open-webui` namespace.
- Access: Add an OpenAI connection in the OpenWebUI admin panel with base URL `http://pipelines.open-webui.svc.cluster.local:9099` and the same API key stored in Bitwarden. Use that connection for models that should run through pipelines.
- Upstream provider: Configure pipeline valves in the OpenWebUI UI to call LiteLLM at `http://litellm.litellm.svc.cluster.local:4000/v1` with the existing LiteLLM API key.
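The storage claim above could look like the following sketch. The name, size, and storage class come from this setup; the access mode is an assumption (a single-replica Deployment only needs `ReadWriteOnce`):

```yaml
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: openwebui-pipelines-storage
  namespace: open-webui
spec:
  accessModes:
    - ReadWriteOnce  # assumption: one pod mounts the volume at a time
  storageClassName: longhorn
  resources:
    requests:
      storage: 10Gi
```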
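A sketch of the ExternalSecret, assuming the External Secrets Operator with a Bitwarden-backed secret store. The store name and the Bitwarden entry key are hypothetical placeholders; only the resource name and the `PIPELINES_API_KEY` key come from this setup:

```yaml
apiVersion: external-secrets.io/v1beta1
kind: ExternalSecret
metadata:
  name: app-openwebui-pipelines-api-key
  namespace: open-webui
spec:
  secretStoreRef:
    name: bitwarden  # hypothetical ClusterSecretStore name
    kind: ClusterSecretStore
  target:
    name: app-openwebui-pipelines-api-key  # Kubernetes Secret consumed by the Deployment
  data:
    - secretKey: PIPELINES_API_KEY
      remoteRef:
        key: openwebui-pipelines-api-key  # Bitwarden entry name is an assumption
```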
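The `pipelines` Service is a plain ClusterIP Service on port 9099; a minimal sketch, assuming the selector label `app: openwebui-pipelines` used in the Deployment sketch above:

```yaml
apiVersion: v1
kind: Service
metadata:
  name: pipelines
  namespace: open-webui
spec:
  type: ClusterIP
  selector:
    app: openwebui-pipelines  # assumed pod label
  ports:
    - port: 9099
      targetPort: 9099
```

With this Service in place, the in-cluster DNS name `pipelines.open-webui.svc.cluster.local:9099` resolves to the pipelines pod, which is the base URL used in the OpenWebUI connection above.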