⚠️ At this moment ftutorials is still rolling-release documentation: things change often! ⚠️

Open WebUI#

All configurations presented here use docker-compose. Read the Docker instructions first.

Warning

The contents of this page have not been tested against all Open WebUI versions.

Open WebUI on Docker setup (CPU only)#

Variable name      Description

DATA_PATH          directory containing the model files for Ollama

OLLAMA_BASE_URL    the base URL the Ollama Docker instance listens on, usually http://${ADDR}:11434

Open WebUI on a server, Ollama on another#

This method reduces the load on the main server's CPU by delegating the AI workload to another computer. Three steps are necessary:

  1. configuring SSH port forwarding between the main server and the secondary one: the main server connects to the secondary server via autossh, using an SSH key

  2. setting up Ollama on the secondary server

  3. setting up Open WebUI on the main server
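
The resulting topology can be sketched as follows (addresses and names are examples):

```
main server (Open WebUI)                    secondary server (Ollama)
┌──────────────────────────┐  SSH tunnel   ┌───────────────────────┐
│ open-webui ──► :11434 ───┼──────────────►│ 127.0.0.1:11434       │
│ (autossh -L)             │               │ (Ollama container)    │
└──────────────────────────┘               └───────────────────────┘
```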

SSH setup#

See also

  • SSH keys - ArchWiki [1]

  • OpenSSH - ArchWiki - Forwarding other ports [2]

  • OpenSSH - ArchWiki - Run autossh automatically at boot via systemd [3]

SSH port forwarding is one possible method used by the server running Open WebUI to communicate with the server running Ollama. The advantages of this are:

  • encrypted communication

  • Ollama server can be placed on a remote, non-LAN host easily

  1. install the dependencies

    apt-get install autossh
    
  2. create the SSH key

    mkdir /root/.ssh
    chmod 700 /root/.ssh
    cd /root/.ssh
    ssh-keygen -t rsa -b 16384 -C "$(whoami)@$(uname -n)-$(date -I)"
    

    Use otherserver_root when prompted for the file name

    Important

    Do not set a passphrase when prompted, since the systemd service created later will need to use this key non-interactively.

  3. copy the local SSH pubkey to the remote server in /root/.ssh/authorized_keys

  4. update the local SSH configuration

    /root/.ssh/config#
    Match host otherserver user root
        IdentityFile=/root/.ssh/otherserver_root
    
  5. add this section to the OpenSSH configuration of the secondary server

    /etc/ssh/sshd_config#
    # [ ... ] rest of the file
    
    # Enable SSH port forwarding for port 11434 (Ollama).
    Match user root
        PasswordAuthentication no
        IPQoS throughput
        AllowTcpForwarding yes
        PermitTunnel no
        X11Forwarding no
        PermitOpen 127.0.0.1:11434
    
  6. on the main server, test the SSH connection: you should not be prompted for a password

    ssh root@otherserver
    
  7. create a Systemd unit file

    /home/jobs/services/by-user/root/autossh.otherserver.service#
    [Unit]
    Description=AutoSSH service for otherserver
    After=network.target
    
    [Service]
    User=root
    Group=root
    Environment="AUTOSSH_GATETIME=0"
    ExecStart=/usr/bin/autossh -M 0 -N -L 0.0.0.0:11434:127.0.0.1:11434 -o TCPKeepAlive=yes root@otherserver
    
    [Install]
    WantedBy=multi-user.target
    
  8. run the deploy script
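
Step 3 above can be done with ssh-copy-id, and once the autossh unit is running (and Ollama is up on the secondary server, see the next section) the tunnel can be checked from the main server. These commands are a sketch, assuming the host alias otherserver and the key file created earlier:

```shell
# Step 3: copy the public key to the secondary server
# (prompts for the remote root password once).
ssh-copy-id -i /root/.ssh/otherserver_root.pub root@otherserver

# After running the deploy script, verify the forwarded port on the
# main server: Ollama answers plain HTTP requests on the tunnelled port.
curl http://127.0.0.1:11434
```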

Ollama setup#

This setup is to be done on the secondary server.

See also

  • Ollama [4]

  • GitHub - ollama/ollama: Get up and running with Llama 3, Mistral, Gemma, and other large language models. [5]

  1. follow the Docker instructions

  2. create the jobs directories. See reference

    mkdir -p /home/jobs/scripts/by-user/root/docker/ollama
    cd /home/jobs/scripts/by-user/root/docker/ollama
    
  3. create a Docker compose file

    /home/jobs/scripts/by-user/root/docker/ollama/docker-compose.yml#
    version: '3'
    
    services:
      ollama:
        image: ollama/ollama:0.1.32
        volumes:
          - ${DATA_PATH}:/root/.ollama
        container_name: ollama
        tty: true
        restart: always
        hostname: ollama
        ports:
          - 11434:11434
    

    Note

    Replace these variables with the appropriate values

    • DATA_PATH

  4. create a Systemd unit file. See also the Docker compose services section

    /home/jobs/services/by-user/root/docker-compose.ollama.service#
    [Unit]
    Requires=docker.service
    Requires=network-online.target
    After=docker.service
    After=network-online.target
    
    [Service]
    Type=simple
    WorkingDirectory=/home/jobs/scripts/by-user/root/docker/ollama
    
    ExecStart=/usr/bin/docker-compose up --remove-orphans
    ExecStop=/usr/bin/docker-compose down --remove-orphans
    
    Restart=always
    
    [Install]
    WantedBy=multi-user.target
    
  5. run the deploy script
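
docker-compose substitutes ${DATA_PATH} from a .env file placed next to docker-compose.yml, so one way to set it is the following (the model directory path and model name are examples, not tested values):

```shell
cd /home/jobs/scripts/by-user/root/docker/ollama

# docker-compose reads variable substitutions from this file.
cat > .env <<'EOF'
DATA_PATH=/data/ollama
EOF

# Once the service is up, pull a model and list what is installed.
docker exec ollama ollama pull llama3
docker exec ollama ollama list
```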

Open WebUI setup#

This setup is to be done on the main server.

See also

  • Open WebUI [6]

  • GitHub - open-webui/open-webui: User-friendly WebUI for LLMs (Formerly Ollama WebUI) [7]

  1. follow the Docker instructions

  2. create the jobs directories. See reference

    mkdir -p /home/jobs/scripts/by-user/root/docker
    cd /home/jobs/scripts/by-user/root/docker
    
  3. clone the repository

    git clone https://github.com/open-webui/open-webui.git openwebui
    
  4. create a Docker compose file

    /home/jobs/scripts/by-user/root/docker/openwebui/docker-compose.yml#
    version: '3'
    
    services:
      open-webui:
        build:
          context: .
          args:
            OLLAMA_BASE_URL: '/ollama'
          dockerfile: Dockerfile
        image: ghcr.io/open-webui/open-webui:main
        container_name: open-webui
        volumes:
          - ./open-webui:/app/backend/data
        ports:
          - 4018:8080
        environment:
          - 'OLLAMA_BASE_URL=${OLLAMA_BASE_URL}'
          - 'OLLAMA_API_BASE_URL=${OLLAMA_BASE_URL}/api'
          - 'OLLAMA_ORIGINS=*'
          - 'WEBUI_SECRET_KEY='
        extra_hosts:
          - host.docker.internal:host-gateway
        healthcheck:
          test: ["CMD", "curl", "-f", "${OLLAMA_BASE_URL}"]
          interval: 1m30s
          timeout: 10s
          retries: 2
        restart: unless-stopped
    
    volumes:
      open-webui: {}
    

    Note

    Replace these variables with the appropriate values

  • OLLAMA_BASE_URL: since we are using SSH port forwarding, the address must be the inet or inet6 address of the main server's primary ethernet interface, as reported by $ ip a. For example, if your address is 192.168.0.2, OLLAMA_BASE_URL might be http://192.168.0.2:11434

  5. remove the original docker-compose file from the repository

    rm docker-compose.yaml
    
  6. build the Docker image

    docker-compose build
    
  7. create a Systemd unit file. See also the Docker compose services section

    /home/jobs/services/by-user/root/docker-compose.openwebui.service#
    [Unit]
    Requires=docker.service
    Requires=network-online.target
    After=docker.service
    After=network-online.target
    
    [Service]
    Type=simple
    WorkingDirectory=/home/jobs/scripts/by-user/root/docker/openwebui
    
    ExecStart=/usr/bin/docker-compose up --remove-orphans
    ExecStop=/usr/bin/docker-compose down --remove-orphans
    
    Restart=always
    
    [Install]
    WantedBy=multi-user.target
    
  8. run the deploy script

  9. update the reverse proxy port in your web server configuration to 4018
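
As an example, assuming an Nginx reverse proxy, the relevant location block might look like this (a sketch, not a tested configuration):

```nginx
location / {
    proxy_pass http://127.0.0.1:4018;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

    # Open WebUI streams responses over websockets.
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
}
```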

Footnotes