Ollama: Difference between revisions
* "rocm": supported by most modern AMD GPUs | * "rocm": supported by most modern AMD GPUs | ||
* "cuda": supported by most modern NVIDIA GPUs | * "cuda": supported by most modern NVIDIA GPUs | ||
* "vulkan": supported by most modern GPUs on Linux | |||
== Troubleshooting ==
=== AMD GPU with open source driver ===
Use the ollama-rocm nix package:
<syntaxhighlight lang="nix">
environment.systemPackages = [ pkgs.ollama-rocm ];
</syntaxhighlight>
And make sure the kernel loads the amdgpu driver:
<syntaxhighlight lang="nix">
boot.initrd.kernelModules = [ "amdgpu" ];
</syntaxhighlight>
In certain cases Ollama might not allow your system to use GPU acceleration if it cannot be sure your GPU/driver combination is compatible.
However, you can attempt to force-enable the usage of your GPU by overriding the LLVM target. <ref>https://github.com/ollama/ollama/blob/main/docs/gpu.mdx#overrides-on-linux</ref>
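A minimal sketch of such an override via the NixOS module, assuming its environmentVariables option and an RDNA2-class card whose gfx1030 target maps to "10.3.0" (adjust the value for your hardware):
<syntaxhighlight lang="nix">
services.ollama = {
  enable = true;
  acceleration = "rocm";
  environmentVariables = {
    # Force a compatible LLVM target; gfx1030 corresponds to "10.3.0".
    HSA_OVERRIDE_GFX_VERSION = "10.3.0";
  };
};
</syntaxhighlight>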
You can get the version for your GPU from the logs or like so: