Ollama + Open-WebUI

  • VM specs for the install environment:
    • vCPU : 8 ([host] i7-11700, supports AVX2)
    • RAM : 8 GB
    • SSD : 64 GB
  • OS : Ubuntu 20.04
  • Edit docker-compose.yml
    services:
      ollama:
        image: ollama/ollama:latest
        ports:
          - 11434:11434
        volumes:
          - .:/code
          - ./ollama/ollama:/root/.ollama
        container_name: ollama
        environment:
          - OLLAMA_NUM_PARALLEL=4
          - OLLAMA_MAX_LOADED_MODELS=4
        pull_policy: always
        tty: true
        restart: always
        networks:
          - ollama-docker
        #deploy:
        #  resources:
        #    reservations:
        #      devices:
        #        - driver: nvidia
        #          count: 1
        #          capabilities: [gpu]
    
      ollama-webui:
        image: ghcr.io/open-webui/open-webui:main
        container_name: ollama-webui
        volumes:
          - ./ollama/ollama-webui:/app/backend/data
        depends_on:
          - ollama
        ports:
          - 8080:8080
        environment:
          - 'OLLAMA_BASE_URL=http://ollama:11434'
        extra_hosts:
          - host.docker.internal:host-gateway
        restart: unless-stopped
        networks:
          - ollama-docker
    
    networks:
      ollama-docker:
        external: false
  • Start the stack with docker compose

    docker compose up -d

  • Open-WebUI URL - http://Server-IP:8080
  • Ollama API URL - http://Server-IP:11434
  • Pull command, e.g. ycchen/breeze-7b-instruct-v1_0
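  • With the containers up, the Ollama REST API can also be called directly. A minimal stdlib-Python sketch, assuming the documented /api/generate endpoint and the port mapping above (the model name is just an example):

    import json
    import urllib.request

    OLLAMA_URL = "http://localhost:11434"  # replace localhost with Server-IP if remote

    def build_payload(model: str, prompt: str) -> bytes:
        # stream=False makes the server return one JSON object
        # instead of a stream of partial responses.
        return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

    def generate(model: str, prompt: str, base_url: str = OLLAMA_URL) -> str:
        req = urllib.request.Request(
            base_url + "/api/generate",
            data=build_payload(model, prompt),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())["response"]

    # Example (requires a running server with the model pulled):
    # print(generate("phi3", "hello"))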

    docker exec ollama ollama pull ycchen/breeze-7b-instruct-v1_0

  • List and run models, e.g. phi3

    pve-ollama-221:~# docker exec ollama ollama list
    NAME                                            ID              SIZE    MODIFIED
    phi3:14b                                        1e67dff39209    7.9 GB  5 days ago
    codestral:latest                                fcc0019dcee9    12 GB   5 days ago
    yabi/breeze-7b-32k-instruct-v1_0_q8_0:latest    ccc26fb14a68    8.0 GB  5 days ago
    phi3:latest                                     64c1188f2485    2.4 GB  5 days ago
    
    pve-ollama-221:~# docker exec -it ollama ollama run phi3 --verbose
    >>> hello
     Hello! How can I assist you today? Whether it's answering a question, providing information, or helping with a task, feel free to ask.
    
    total duration:       3.119956759s
    load duration:        905.796µs
    prompt eval count:    5 token(s)
    prompt eval duration: 101.53ms
    prompt eval rate:     49.25 tokens/s
    eval count:           32 token(s)
    eval duration:        2.889224s
    eval rate:            11.08 tokens/s
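    The verbose timings above are internally consistent: each rate is just the token count divided by its duration. A quick check:

    # Figures taken from the `ollama run phi3 --verbose` output above.
    prompt_tokens, prompt_secs = 5, 0.10153
    eval_tokens, eval_secs = 32, 2.889224

    prompt_rate = prompt_tokens / prompt_secs   # ≈ 49.25 tokens/s
    eval_rate = eval_tokens / eval_secs         # ≈ 11.08 tokens/s
    print(round(prompt_rate, 2), round(eval_rate, 2))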

  • Native installation (without Docker)

    curl -fsSL https://ollama.com/install.sh | sh

    • See the install log
  • Pull a model, e.g. llama3

    ollama pull llama3

    • See the run log
  • Run a quick test
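  • After pulling, the model should appear both in `ollama list` and in the REST API's /api/tags listing. A stdlib-Python sketch, assuming the same default port; `has_model` is a hypothetical helper for matching bare names against tagged ones:

    import json
    import urllib.request

    def list_models(base_url="http://localhost:11434"):
        """Return installed model names from Ollama's /api/tags endpoint."""
        with urllib.request.urlopen(base_url + "/api/tags") as resp:
            data = json.loads(resp.read())
        return [m["name"] for m in data.get("models", [])]

    def has_model(models, name):
        # Treat a bare name like "llama3" as matching "llama3:latest".
        return any(m == name or m.startswith(name + ":") for m in models)

    # Example (requires a running server):
    # print(has_model(list_models(), "llama3"))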

    ollama run llama3

    • See the run log
  • tech/ollama.txt
  • Last modified: 2024/06/12 13:47
  • jonathan