A Nextcloud External Application (ExApp) that bundles Open WebUI (chat interface) and Ollama (LLM inference) into a single container managed by Nextcloud's AppAPI.
- **All-in-one AI Chat** - Open WebUI frontend + Ollama backend in a single ExApp
- **Beautiful Chat Interface** - Modern, intuitive UI for interacting with AI
- **Local LLM Inference** - Run models locally with Ollama (no external API needed)
- **Auto Model Pull** - Automatically downloads a default model on first start
- **Conversation History** - Persistent chat history with search
- **Document Upload** - RAG capabilities with file uploads
- **Ollama API** - Exposes the Ollama API to other apps (e.g., n8n workflows)
- **Mobile Friendly** - Responsive design works on all devices
- Nextcloud 30 or higher
- AppAPI installed and configured
- Docker with a configured Deploy Daemon (HaRP recommended)
- Sufficient RAM (4GB minimum, 8GB+ recommended for larger models)
- GPU support optional but recommended for performance
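As a quick sanity check before installing, the RAM and GPU points above can be verified on the Docker host. A rough sketch (the thresholds mirror the list above; `nvidia-smi` covers only NVIDIA hardware):

```shell
# Check available RAM against the 4 GB minimum / 8 GB recommendation
free -g | awk '/^Mem:/ { print ($2 >= 8) ? "RAM: good (8 GB+)" : ($2 >= 4) ? "RAM: minimum (4 GB)" : "RAM: below minimum" }'

# Optional: report NVIDIA GPUs if the driver tooling is present
command -v nvidia-smi >/dev/null 2>&1 \
  && nvidia-smi --query-gpu=name,memory.total --format=csv,noheader \
  || echo "No NVIDIA GPU tooling detected"
```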
- Install and enable the AppAPI app in Nextcloud
- Configure a Deploy Daemon
- Search for "Open WebUI" in the External Apps section
- Click Install
```bash
# Register the ExApp with Nextcloud
occ app_api:app:register \
  open_webui \
  <your-daemon-name> \
  --info-xml https://raw.githubusercontent.com/ConductionNL/open-webui-nextcloud/main/appinfo/info.xml \
  --force-scopes
```

| Environment Variable | Description | Default |
|---|---|---|
| `OLLAMA_DEFAULT_MODEL` | Model to auto-pull on first start | Not set |
| `OLLAMA_NUM_PARALLEL` | Number of parallel Ollama requests | `1` |
| `OLLAMA_KEEP_ALIVE` | How long to keep models loaded | `5m` |
| `OLLAMA_MAX_LOADED_MODELS` | Max models loaded simultaneously | `1` |
| `OLLAMA_FLASH_ATTENTION` | Enable flash attention | `false` |
| `OPENAI_API_BASE_URL` | Additional OpenAI-compatible API URL | Not set |
| `OPENAI_API_KEY` | API key for OpenAI endpoint | Not set |
| `ENABLE_SIGNUP` | Allow user registration | `false` |
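As an illustration of how these variables fit together, they can be passed to the container at deploy time; the exact mechanism depends on your Deploy Daemon. A sketch using a manual `docker run` (image tag matches the development example in this README; the values shown are examples, not defaults):

```shell
# Illustrative only: tune Ollama behaviour via environment variables
docker run -d \
  -e OLLAMA_DEFAULT_MODEL=llama3.2:3b \
  -e OLLAMA_KEEP_ALIVE=30m \
  -e OLLAMA_NUM_PARALLEL=2 \
  -e OLLAMA_MAX_LOADED_MODELS=2 \
  -e OLLAMA_FLASH_ATTENTION=true \
  -p 23000:23000 \
  open-webui-exapp:dev
```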
| Model | Size | Best For |
|---|---|---|
| `llama3.2:1b` | ~1.3 GB | Quick responses, low resource usage |
| `llama3.2:3b` | ~2.0 GB | Good balance of speed and quality |
| `mistral:7b` | ~4.1 GB | High quality general purpose |
| `gemma2:9b` | ~5.4 GB | Strong reasoning and coding |
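Models beyond the auto-pulled default can be fetched through the proxied Ollama API. A hedged sketch, assuming the AppAPI proxy route used elsewhere in this README and a Nextcloud reachable at `https://your-nextcloud` (authentication is handled by your deployment):

```shell
# Sketch: pull an additional model through the AppAPI proxy.
# The endpoint streams download progress as JSON lines.
curl -X POST \
  "https://your-nextcloud/index.php/apps/app_api/proxy/open_webui/ollama/api/pull" \
  -H "Content-Type: application/json" \
  -d '{"model": "mistral:7b"}'
```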
After installation, Open WebUI appears as a top menu item in Nextcloud.
Other ExApps (like n8n) can access the Ollama API through the proxy:
- `https://your-nextcloud/index.php/apps/app_api/proxy/open_webui/ollama/api/tags`
- `https://your-nextcloud/index.php/apps/app_api/proxy/open_webui/ollama/api/generate`
- `https://your-nextcloud/index.php/apps/app_api/proxy/open_webui/ollama/api/chat`
Or connect directly from other containers on the same Docker network:
`http://openregister-exapp-openwebui:11434`
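For instance, a minimal non-streaming request from a neighbouring container (the host name is the one given above; the model name is an example and must already be pulled):

```shell
# Sketch: one-shot completion against the in-container Ollama API
curl http://openregister-exapp-openwebui:11434/api/generate \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3.2:1b", "prompt": "Say hello in one sentence.", "stream": false}'
```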
```bash
# Build the image
docker build -t open-webui-exapp:dev .
```

```bash
# Run the container locally
docker run -it --rm \
  -e APP_ID=open_webui \
  -e APP_SECRET=dev-secret \
  -e APP_HOST=0.0.0.0 \
  -e APP_PORT=23000 \
  -e APP_PERSISTENT_STORAGE=/data \
  -e NEXTCLOUD_URL=http://localhost:8080 \
  -e OLLAMA_DEFAULT_MODEL=llama3.2:1b \
  -p 23000:23000 \
  open-webui-exapp:dev
```

```bash
# Health check
curl http://localhost:23000/heartbeat

# Ollama API (list models)
curl http://localhost:23000/ollama/api/tags
```

┌─────────────────────────────────────────────────┐
│ Nextcloud + AppAPI │
└──────────────────┬──────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────┐
│ Open WebUI ExApp Container │
│ ┌───────────────────────────────────────────┐ │
│ │ FastAPI Wrapper (port 23000) │ │
│ │ - /heartbeat, /init, /enabled │ │
│ │ - /ollama/* → Ollama API (port 11434) │ │
│ │ - /* → Open WebUI (port 8080) │ │
│ │ - AppAPIAuthMiddleware │ │
│ │ - Iframe loader JS │ │
│ └──────────┬──────────────┬─────────────────┘ │
│ │ │ │
│ ┌──────────▼──────┐ ┌───▼─────────────────┐ │
│ │ Ollama (11434) │ │ Open WebUI (8080) │ │
│ │ LLM inference │ │ Chat interface │ │
│ │ Model storage │ │ User management │ │
│ └─────────────────┘ │ Conversation history│ │
│ └─────────────────────┘ │
│ │
│ /data/ │
│ ├── ollama_models/ (downloaded LLM models) │
│ └── ... (WebUI data, secrets) │
└─────────────────────────────────────────────────┘
For the best experience, install both ExApps:
- **Open WebUI ExApp** - AI chat + Ollama inference (this app)
- **n8n ExApp** - Workflow automation with AI (can use Ollama from this app)
AGPL-3.0 - See LICENSE for details.