Ollama
Run Llama 3, Phi 3, Mistral, Gemma, and other models. Customize and create your own.
- https://ollama.com/
- GitHub: https://github.com/ollama/ollama
- Doc: https://github.com/ollama/ollama/tree/main/docs
- Video: Offline means no privacy leaks! Free open-source AI assistant Ollama, from installation to fine-tuning, all covered in one video! - YouTube
Installation
Ollama + Open WebUI
mkdir ollama-data download open-webui-data
docker-compose.yml:
services:
  ollama:
    image: ollama/ollama:latest
    ports:
      - 11434:11434
    volumes:
      - ./ollama-data:/root/.ollama
      - ./download:/download
    container_name: ollama
    pull_policy: always
    tty: true
    restart: always
    networks:
      - ollama-docker

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    volumes:
      - ./open-webui-data:/app/backend/data
    depends_on:
      - ollama
    ports:
      - 3000:8080
    environment:
      - 'OLLAMA_BASE_URL=http://ollama:11434'
    extra_hosts:
      - host.docker.internal:host-gateway
    restart: unless-stopped
    networks:
      - ollama-docker

networks:
  ollama-docker:
    external: false
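With the data directories created and the compose file in place, the stack can be started as sketched below; the model name llama3 is only an example:
docker compose up -d
docker exec -it ollama ollama pull llama3
Open WebUI is then reachable at http://localhost:3000 and talks to Ollama at http://ollama:11434 inside the compose network.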
Ollama only
mkdir ollama-data download
docker run --name ollama -d --rm \
  -v $PWD/ollama-data:/root/.ollama \
  -v $PWD/download:/download \
  -p 11434:11434 \
  ollama/ollama
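Once the container is running, you can chat with a model interactively or call the HTTP API exposed on port 11434. A minimal sketch, using llama3 as a placeholder model name:
docker exec -it ollama ollama run llama3
curl http://localhost:11434/api/generate -d '{"model": "llama3", "prompt": "Why is the sky blue?"}'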
Models
List installed models
ollama list
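Other common model-management subcommands, shown with llama3 as a placeholder model name:
ollama pull llama3   # download a model from the registry
ollama show llama3   # show model details
ollama rm llama3     # remove a local model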
Load a GGUF model manually
ollama create <my-model-name> -f <modelfile>
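A minimal Modelfile for this could look like the sketch below; the GGUF path, model name, and parameter value are placeholders:
# Modelfile (FROM points at your downloaded GGUF file)
FROM ./download/my-model.Q4_K_M.gguf
PARAMETER temperature 0.7
Build and run it with: ollama create my-model -f Modelfile, then ollama run my-model.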
Page Assist
Page Assist is an open-source Chrome extension that provides a sidebar and web UI for your local AI model.