# Ollama Manager
A terminal UI (TUI) for managing Ollama models.
## Why a TUI?

The command line is powerful but repetitive. Instead of typing:

```shell
ollama ps          # What's loaded?
ollama list        # What's available?
ollama run qwen3   # Load this one
ollama stop qwen3  # Unload it
```

Just run `ollama-manager.exe` and use keyboard shortcuts.
## Installation

### Pre-built Binary

Download from the repository:

```powershell
cd ollama-manager
.\ollama-manager.exe
```

### Build from Source

Requires Go 1.21+ or Docker/Podman:

```powershell
# Using a container (no local Go needed)
cd ollama-manager
.\build.ps1

# Using local Go
.\build.ps1 -Local
```
## Usage

### Starting the Manager

```powershell
.\ollama-manager.exe
```

You'll see a list of all installed models with their status:
```
┌──────────────────────────────────────────────────────────┐
│ Ollama Manager                                           │
├──────────────────────────────────────────────────────────┤
│ > qwen3:32b                                     [LOADED] │
│   deepseek-r1:32b                                        │
│   llama3.3:70b-instruct-q4_K_M                           │
│   llama3.1:8b                                            │
│   mistral:7b                                             │
│   phi-4:14b                                              │
│                                                          │
├──────────────────────────────────────────────────────────┤
│ r: run  s: stop  u: unload all  R: refresh  q: quit      │
└──────────────────────────────────────────────────────────┘
```
### Keyboard Controls

| Key | Action |
|---|---|
| ↑ / ↓ | Navigate models |
| r / Enter | Run selected model (interactive chat) |
| s | Stop selected model (unload from VRAM) |
| u | Unload ALL models |
| R | Refresh model list |
| q | Quit |
### Running a Model

- Navigate to a model with the arrow keys
- Press `r` or `Enter`; the TUI exits and opens an interactive chat session
- Type `/bye` to exit the chat
### Stopping Models

Models stay loaded in VRAM for fast reuse. To free memory:

- Navigate to a loaded model (shown as `[LOADED]`)
- Press `s` to stop it
- Or press `u` to unload ALL models
## How It Works

The manager is built with [Bubble Tea](https://github.com/charmbracelet/bubbletea) (TUI framework) and [Lipgloss](https://github.com/charmbracelet/lipgloss) (styling).

It calls Ollama CLI commands:

```go
// List models
exec.Command("ollama", "list")

// Check what's loaded
exec.Command("ollama", "ps")

// Run a model
exec.Command("ollama", "run", modelName)

// Stop a model
exec.Command("ollama", "stop", modelName)
```
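Knowing which models are loaded means parsing the table that `ollama ps` prints. A minimal sketch of such a parser (a hypothetical helper, not necessarily how `main.go` does it; it assumes the usual layout where the first line is a column header and each data row starts with the model name):

```go
package main

import (
	"fmt"
	"strings"
)

// parseLoadedModels extracts model names from `ollama ps` output.
// It skips the header row and blank lines, then takes the first
// whitespace-separated field of each remaining line as the name.
func parseLoadedModels(output string) []string {
	var names []string
	lines := strings.Split(strings.TrimSpace(output), "\n")
	for i, line := range lines {
		if i == 0 || strings.TrimSpace(line) == "" {
			continue // header or blank line
		}
		fields := strings.Fields(line)
		if len(fields) > 0 {
			names = append(names, fields[0])
		}
	}
	return names
}

func main() {
	// Illustrative sample of `ollama ps` output.
	out := "NAME         ID            SIZE    PROCESSOR  UNTIL\n" +
		"qwen3:32b    e121d7bbfd73  21 GB   100% GPU   4 minutes from now\n"
	fmt.Println(parseLoadedModels(out)) // prints [qwen3:32b]
}
```

The manager can then mark any model whose name appears in this slice as `[LOADED]`.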
## Source Code

The full source is in `ollama-manager/main.go`:

```go
package main

import (
	tea "github.com/charmbracelet/bubbletea"
	"github.com/charmbracelet/lipgloss"
)

type model struct {
	models   []modelInfo
	cursor   int
	selected string
}

// ... see full source in repository
```
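The ↑/↓ navigation shown in the key table reduces to clamping the `cursor` field to the bounds of the model list. A sketch of that logic as a standalone helper (`moveCursor` is a hypothetical name for illustration, not a function from the repository):

```go
package main

import "fmt"

// moveCursor shifts the cursor by delta and clamps the result to
// [0, n-1], mirroring what a Bubble Tea Update function would do
// when it receives ↑ (delta -1) or ↓ (delta +1) key messages.
func moveCursor(cursor, delta, n int) int {
	cursor += delta
	if cursor < 0 {
		cursor = 0
	}
	if cursor > n-1 {
		cursor = n - 1
	}
	return cursor
}

func main() {
	// With 6 models, the cursor never leaves the list.
	fmt.Println(moveCursor(0, -1, 6)) // prints 0 (already at top)
	fmt.Println(moveCursor(5, 1, 6))  // prints 5 (already at bottom)
	fmt.Println(moveCursor(2, 1, 6))  // prints 3
}
```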
## Building

### With Docker/Podman (Recommended)

No local Go installation required:

```powershell
cd ollama-manager
.\build.ps1
```

This:

- Builds a Docker image with Go
- Cross-compiles for Windows
- Extracts the `.exe` file

### With Local Go

If you have Go installed:

```powershell
cd ollama-manager
.\build.ps1 -Local
```

Or manually:

```shell
go mod tidy
go build -ldflags="-s -w" -o ollama-manager.exe .
```
### Build Output

```
Build complete! Binary: ollama-manager.exe (2.57 MB)
```
## Customization

### Adding Features

Fork the repository and modify `main.go`:

```go
// Example: add model deletion with the "d" key
case "d":
	exec.Command("ollama", "rm", m.models[m.cursor].name).Run()
```

### Changing Styles

Modify the Lipgloss styles:

```go
var (
	titleStyle = lipgloss.NewStyle().
		Bold(true).
		Foreground(lipgloss.Color("86")) // Change color
)
```
## Troubleshooting

### "ollama not found"

The manager calls the `ollama` CLI. Ensure it's in your PATH:

```shell
ollama --version
```

### Models not showing

Refresh the list with the `R` key.

### "Access denied" on Windows

Run PowerShell as Administrator if models are in a protected directory.

### Terminal rendering issues

Ensure your terminal supports ANSI colors; Windows Terminal is recommended.
## Alternatives

If a TUI isn't your style:

### CLI Commands

```shell
ollama list    # List models
ollama ps      # Show loaded models
ollama stop X  # Unload a model
```

### Open WebUI

Web interface at http://localhost:3000 with visual model selection.

### API

```shell
# List models via the API
curl http://localhost:11434/api/tags

# Load a model via the API
curl http://localhost:11434/api/generate -d '{"model":"qwen3:32b"}'
```