Mirror of https://github.com/ollama/ollama.git, synced 2025-11-11 13:17:33 +01:00
* fix mllama convert: transform attn_gate and ffn_gate; swap attention heads for vision models
* fix mllama: the mlp gate was being applied in the wrong place
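For context, "swapping attention heads" in a checkpoint converter usually means reordering the rows of the Q/K projection weights so the inference kernel sees the head layout it expects. The sketch below is a minimal, hypothetical Go illustration of one common such reorder (the interleaved-to-split-half permute used by Llama-family converters); it is not the actual transform in ollama's mllama converter, and `permuteQK` plus its layout assumptions are invented here for illustration.

```go
package main

import "fmt"

// permuteQK reorders the rows of a query/key projection weight within each
// attention head, converting from an interleaved-pair layout to a
// split-half layout (a transform Llama-family converters commonly apply
// before writing GGUF). w is row-major with shape [nHeads*headDim, hidden].
// This is an illustrative sketch, not ollama's mllama conversion code.
func permuteQK(w []float32, nHeads, headDim, hidden int) []float32 {
	out := make([]float32, len(w))
	half := headDim / 2
	for h := 0; h < nHeads; h++ {
		for i := 0; i < half; i++ {
			for j := 0; j < 2; j++ {
				src := (h*headDim + j*half + i) * hidden // row [h, j, i] in the source layout
				dst := (h*headDim + i*2 + j) * hidden    // row [h, i, j] in the target layout
				copy(out[dst:dst+hidden], w[src:src+hidden])
			}
		}
	}
	return out
}

func main() {
	// Toy example: 1 head, headDim=4, hidden=1, rows numbered 0..3.
	w := []float32{0, 1, 2, 3}
	fmt.Println(permuteQK(w, 1, 4, 1)) // [0 2 1 3]
}
```

The same idea applies to the vision-tower attention weights mentioned in the commit, though the exact permutation (and whatever transform attn_gate and ffn_gate need) depends on the source checkpoint layout.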