ollama/fs
Latest commit bd6c1d6b49 by Daniel Hiltgen: flash attn: add auto mode for llama engine (#13052)
* flash attn: add auto mode for llama engine

  If the user does not specify flash attention (fa) in the environment, use auto mode.

* review comments

* ensure KV cache quantized types have flash attention explicitly enabled

  additional review comments

Committed 2025-12-12 13:27:19 -08:00
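The commit above describes two rules: when the user does not set the flash attention flag in the environment, the runner falls back to an auto mode, and quantized KV cache types force flash attention on. A minimal sketch of that resolution logic is below; the names `resolveFlashAttn`, `flashAttnMode`, and the set of quantized KV cache types are hypothetical illustrations, not ollama's actual identifiers.

```go
package main

import (
	"fmt"
	"strconv"
)

// flashAttnMode is a hypothetical tri-state for the flash attention setting.
type flashAttnMode int

const (
	faAuto flashAttnMode = iota // unset in the environment: let the runner decide
	faOn
	faOff
)

// quantizedKVTypes is an assumed example set of KV cache quantizations
// that require flash attention to be enabled.
var quantizedKVTypes = map[string]bool{
	"q8_0": true,
	"q4_0": true,
}

// resolveFlashAttn sketches the behavior described in the commit message:
// a quantized KV cache type forces flash attention on; otherwise an empty
// or unparsable environment value yields auto mode.
func resolveFlashAttn(envVal, kvCacheType string) flashAttnMode {
	if quantizedKVTypes[kvCacheType] {
		// quantized KV cache types need flash attention explicitly enabled
		return faOn
	}
	if envVal == "" {
		return faAuto
	}
	if on, err := strconv.ParseBool(envVal); err == nil {
		if on {
			return faOn
		}
		return faOff
	}
	return faAuto
}

func main() {
	fmt.Println(resolveFlashAttn("", "f16") == faAuto)   // env unset: auto mode
	fmt.Println(resolveFlashAttn("1", "f16") == faOn)    // explicit enable
	fmt.Println(resolveFlashAttn("", "q8_0") == faOn)    // quantized KV cache forces on
}
```

This is only a sketch of the stated policy under the assumptions above; the real implementation lives in the llama engine runner touched by #13052.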
Name            Last commit                                                          Date
ggml            flash attn: add auto mode for llama engine (#13052)                  2025-12-12 13:27:19 -08:00
gguf            Reapply "feat: incremental gguf parser (#10822)" (#11114) (#11119)   2025-06-20 11:11:40 -07:00
util/bufioutil  next ollama runner (#7913)                                           2025-02-13 16:31:21 -08:00
config.go       add new gemma model (#11204)                                         2025-06-25 21:47:09 -07:00