| author | Ben Sima <ben@bsima.me> | 2024-05-14 09:35:45 -0400 |
|---|---|---|
| committer | Ben Sima <ben@bsima.me> | 2024-05-20 22:15:49 -0400 |
| commit | 2d33aa547ff6a516c90ca2b47b13e2add200583a (patch) | |
| tree | 8d4941699982c59c6430f4b9a629b8ea91245bb1 /Biz/Bild/Python.nix | |
| parent | cceefa62d147594d43478e398bbaa9c630670935 (diff) | |
Add simonw/llm as cli/library client for running LLMs
This is basically exactly the client library that I would write myself. Some
parts of it are still beta quality, but it's the sort of thing that I would
contribute to anyway.
Unfortunately I couldn't get the llm-llama-cpp plugin to work because it depends
on llama-cpp-python, which is not packaged for nix and is hard to package because
the upstream project vendors a patched version of llama.cpp. So I'm stuck with
ollama for now, but that's fine because it actually works.
Diffstat (limited to 'Biz/Bild/Python.nix')
-rw-r--r-- | Biz/Bild/Python.nix | 1 |
1 file changed, 1 insertion, 0 deletions
```diff
diff --git a/Biz/Bild/Python.nix b/Biz/Bild/Python.nix
index c559e42..2385987 100644
--- a/Biz/Bild/Python.nix
+++ b/Biz/Bild/Python.nix
@@ -7,6 +7,7 @@ _self: super: {
   exllama = callPackage ./Deps/exllama.nix { };
   exllamav2 = callPackage ./Deps/exllamav2.nix { };
   interegular = callPackage ./Deps/interegular.nix { };
+  llm-ollama = callPackage ./Deps/llm-ollama.nix { };
   mypy = dontCheck pysuper.mypy;
   outlines = callPackage ./Deps/outlines.nix { };
   perscache = callPackage ./Deps/perscache.nix { };
```
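The diff is limited to Biz/Bild/Python.nix, so the new ./Deps/llm-ollama.nix file itself is not shown here. As a rough illustration only, a minimal sketch of what such a derivation could look like follows, assuming the plugin is fetched from PyPI and that llm and ollama are already available in this Python package set; the version, hash, and dependency list are placeholders, not the repo's actual file.

```nix
# Hypothetical sketch of Biz/Bild/Deps/llm-ollama.nix -- not the actual file
# from this commit. Version, hash, and dependencies are assumptions.
{ lib, buildPythonPackage, fetchPypi, setuptools, llm, ollama }:

buildPythonPackage rec {
  pname = "llm-ollama";
  version = "0.3.0"; # placeholder version
  pyproject = true;

  src = fetchPypi {
    inherit pname version;
    hash = lib.fakeHash; # replace with the real hash on first build
  };

  # setuptools supplies the PEP 517 build backend
  nativeBuildInputs = [ setuptools ];

  # the plugin registers itself with llm and talks to a local Ollama server
  propagatedBuildInputs = [ llm ollama ];

  # tests would need a running Ollama daemon, which the build sandbox lacks
  doCheck = false;
}
```

With a file along these lines in place, the `llm-ollama = callPackage ./Deps/llm-ollama.nix { };` entry added above pulls the plugin into the same Python set as llm, so the llm CLI can list and prompt models served by a local ollama instance.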