Mirror of https://github.com/ollama/ollama, synced 2026-04-23 08:45:14 +00:00
Replace binary low VRAM mode with tiered VRAM thresholds that set default context lengths for all models:

- < 24 GiB VRAM: 4,096 context
- 24-48 GiB VRAM: 32,768 context
- >= 48 GiB VRAM: 262,144 context
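The tiered scheme above can be sketched as a simple threshold lookup. This is a minimal illustration, not the actual ollama implementation; the function name `defaultContextLength` and its signature are assumptions for the example.

```go
package main

import "fmt"

// defaultContextLength maps total available VRAM to a default context
// window, following the tiered thresholds in the commit message.
// Hypothetical helper; the real logic lives in ollama's config.go.
func defaultContextLength(vramBytes uint64) uint64 {
	const GiB = uint64(1) << 30
	switch {
	case vramBytes < 24*GiB:
		return 4096 // low-VRAM tier
	case vramBytes < 48*GiB:
		return 32768 // mid tier
	default:
		return 262144 // >= 48 GiB tier
	}
}

func main() {
	fmt.Println(defaultContextLength(16 << 30)) // 16 GiB -> 4096
	fmt.Println(defaultContextLength(32 << 30)) // 32 GiB -> 32768
	fmt.Println(defaultContextLength(64 << 30)) // 64 GiB -> 262144
}
```

Unlike the old binary low-VRAM flag, boundaries are inclusive on the lower edge: exactly 24 GiB lands in the 32,768 tier and exactly 48 GiB in the 262,144 tier.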
| File |
|---|
| config.go |
| config_test.go |