
# Ollama and Open-WebUI Docker Setup
This repository provides the necessary configuration files to deploy two Docker services, Ollama and Open-WebUI, using Docker Compose and Jenkins for automated management. These services are deployed on the `shark-wrangler` node, making use of NVIDIA GPU acceleration for Ollama.
## Prerequisites
To get started, ensure you have the following installed and configured:
- Docker and Docker Compose
- NVIDIA GPU drivers (compatible with the Ollama container)
- Jenkins with access to the `shark-wrangler` node
## Services Overview
### Ollama
- **Image**: `ollama/ollama:latest`
- **Container Name**: `ollama`
- **Ports**: `11434:11434`
- **Environment Variables**:
  - `OLLAMA_LLM_LIBRARY=cuda_v12` for GPU acceleration.
- **Volumes**: Mounts data at `/root/.ollama` to persist Ollama's state.
- **Networks**: Connected to `shared_net`.
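Put together, the `ollama` service in `docker-compose.yml` might look like the sketch below. The named volume `ollama` and the `deploy.resources` GPU reservation are assumptions not spelled out above; the GPU reservation syntax requires the NVIDIA Container Toolkit on the host:

```yaml
services:
  ollama:
    image: ollama/ollama:latest
    container_name: ollama
    ports:
      - "11434:11434"
    environment:
      - OLLAMA_LLM_LIBRARY=cuda_v12
    volumes:
      - ollama:/root/.ollama   # assumed volume name; mount path is from this README
    networks:
      - shared_net
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```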
### Open-WebUI
- **Image**: `ghcr.io/open-webui/open-webui:dev`
- **Container Name**: `open-webui`
- **Ports**: `3000:8080`
- **Environment Variables**:
  - **OAuth Configuration**: Enables login and OAuth features.
  - **OpenAI API Key**: Configured to enable integrations.
  - **OLLAMA_BASE_URL**: Points to the running Ollama instance for interoperability.
- **Volumes**: Mounts data at `/app/backend/data` to persist WebUI state.
- **Networks**: Connected to `shared_net`.
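The corresponding `open-webui` service might be sketched as follows. The `OLLAMA_BASE_URL` value assumes the Ollama container is reachable by its service name on `shared_net`, and `OPENAI_API_KEY` is read from the host environment; the exact OAuth variable names are not given in this README, so they are omitted here:

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:dev
    container_name: open-webui
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434   # assumed: service name resolves on shared_net
      - OPENAI_API_KEY=${OPENAI_API_KEY}      # supplied via the host environment or a .env file
    volumes:
      - open-webui:/app/backend/data          # assumed volume name; mount path is from this README
    networks:
      - shared_net
    depends_on:
      - ollama
```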
### Docker Compose Configuration
The Docker Compose configuration deploys the following:
- `ollama` and `open-webui` services with defined volumes and ports.
- `shared_net` network for container communication.
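At the top level, `docker-compose.yml` would declare the shared network and the named volumes. This is a sketch: the README only gives the container-side mount paths, so the volume names `ollama` and `open-webui` are assumptions:

```yaml
networks:
  shared_net:
    driver: bridge

volumes:
  ollama:
  open-webui:
```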
## Jenkinsfile for CI/CD
The `Jenkinsfile` automates the deployment of Ollama and Open-WebUI services using Docker Compose:
### Pipeline Stages
1. **Check NVIDIA Driver Version**:
- Ensures NVIDIA drivers are available and compatible.
2. **Check Ollama and Open-WebUI Versions (Before Deploy)**:
- Retrieves the current version of the Ollama and Open-WebUI containers.
3. **Deploy with Docker Compose**:
- Pulls the latest images and recreates the containers using Docker Compose.
4. **Check Ollama and Open-WebUI Versions (After Deploy)**:
- Ensures that the services are running and updated to the latest version after deployment.
The relevant Jenkinsfile snippet:
```groovy
pipeline {
    agent {
        node { label 'shark-wrangler' }
    }
    environment {
        DOCKER_HOST = 'unix:///var/run/docker.sock'
    }
    stages {
        stage('Check NVIDIA Driver Version') {
            steps {
                script {
                    catchError(buildResult: 'SUCCESS', stageResult: 'UNSTABLE') {
                        sh 'nvidia-smi --query-gpu=driver_version --format=csv,noheader'
                    }
                }
            }
        }
        stage('Check Ollama and Open-WebUI Versions (Before Deploy)') {
            steps {
                script {
                    catchError(buildResult: 'SUCCESS', stageResult: 'UNSTABLE') {
                        echo 'Checking Ollama version before deploy:'
                        sh 'docker exec -i ollama ollama -v || echo "Ollama check failed"'
                        echo 'Checking Open-WebUI version before deploy:'
                        sh 'docker exec -i open-webui jq -r .version /app/package.json || echo "Open-WebUI check failed"'
                    }
                }
            }
        }
        stage('Deploy with Docker Compose') {
            steps {
                script {
                    sh '''
                        docker pull ollama/ollama:latest
                        docker pull ghcr.io/open-webui/open-webui:dev
                        docker compose up -d --force-recreate
                    '''
                }
            }
        }
        stage('Check Ollama and Open-WebUI Versions (After Deploy)') {
            steps {
                script {
                    catchError(buildResult: 'SUCCESS', stageResult: 'UNSTABLE') {
                        echo 'Checking Ollama version after deploy:'
                        sh 'docker exec -i ollama ollama -v || echo "Ollama check failed"'
                        echo 'Checking Open-WebUI version after deploy:'
                        sh 'docker exec -i open-webui jq -r .version /app/package.json || echo "Open-WebUI check failed"'
                    }
                }
            }
        }
    }
    post {
        always {
            echo 'Pipeline finished.'
        }
    }
}
```
## Usage
To deploy the services manually, run Docker Compose from the repository root:
```sh
docker compose up -d
```
For automated deployments, you can use Jenkins with the provided `Jenkinsfile` to ensure the latest versions are deployed and tested.
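After either deployment path, you can sanity-check both services from the host. This is a hedged example: `GET /api/version` is Ollama's version endpoint, while the Open-WebUI `/health` path is an assumption worth verifying against its documentation:

```sh
# Query the Ollama API for its version on the published port
curl -s http://localhost:11434/api/version

# Check that Open-WebUI responds on its published port (path is an assumption)
curl -s http://localhost:3000/health
```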
## Notes
- Ensure all environment variables are correctly set, particularly the OAuth credentials and OpenAI API key.
- Update Docker images regularly to maintain security and access new features.
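For the first note, a common pattern is an untracked `.env` file next to `docker-compose.yml`, which Docker Compose reads automatically. The variable names below are illustrative assumptions, not confirmed by this README; check the Open-WebUI documentation for the exact keys your OAuth provider requires:

```sh
# .env (do not commit): illustrative variable names
OPENAI_API_KEY=<your-openai-api-key>
OAUTH_CLIENT_ID=<your-oauth-client-id>
OAUTH_CLIENT_SECRET=<your-oauth-client-secret>
```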
## License
This project is open-source and licensed under the [MIT License](LICENSE).