From 2d33aa547ff6a516c90ca2b47b13e2add200583a Mon Sep 17 00:00:00 2001
From: Ben Sima
Date: Tue, 14 May 2024 09:35:45 -0400
Subject: Add simonw/llm as cli/library client for running LLMs

This is basically exactly the client library that I would write myself. Some
parts of it are still beta quality, but it's the sort of thing that I would
contribute to anyway.

Unfortunately I couldn't get the llm-llama-cpp plugin to work because it
depends on llama-cpp-python, which is not packaged for nix and is hard to
package because the upstream project vendors a patched version of llama.cpp.
So I'm stuck with ollama for now, but that's fine because it actually works.
---
 Biz/Bild/Python.nix | 1 +
 1 file changed, 1 insertion(+)

(limited to 'Biz/Bild/Python.nix')

diff --git a/Biz/Bild/Python.nix b/Biz/Bild/Python.nix
index c559e42..2385987 100644
--- a/Biz/Bild/Python.nix
+++ b/Biz/Bild/Python.nix
@@ -7,6 +7,7 @@ _self: super: {
   exllama = callPackage ./Deps/exllama.nix { };
   exllamav2 = callPackage ./Deps/exllamav2.nix { };
   interegular = callPackage ./Deps/interegular.nix { };
+  llm-ollama = callPackage ./Deps/llm-ollama.nix { };
   mypy = dontCheck pysuper.mypy;
   outlines = callPackage ./Deps/outlines.nix { };
   perscache = callPackage ./Deps/perscache.nix { };
--
cgit v1.2.3
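
The patch only registers `./Deps/llm-ollama.nix` in the overlay; the derivation itself is not shown. A minimal sketch of what such a file typically looks like in a `callPackage`-style Python overlay is below. Everything in it is an assumption (version, source location, dependency list, and the `hash` placeholder), not the contents of the actual repo file.

```nix
# Hypothetical sketch of Biz/Bild/Deps/llm-ollama.nix -- all values here are
# assumptions for illustration, not taken from the actual repository.
{ buildPythonPackage, fetchFromGitHub, llm, ollama, pydantic, setuptools }:

buildPythonPackage rec {
  pname = "llm-ollama";
  version = "0.3.0"; # assumed; pin to whatever release the overlay targets
  pyproject = true;

  src = fetchFromGitHub {
    owner = "taketwo";
    repo = "llm-ollama";
    rev = version;
    hash = ""; # fill in the real sha256 (e.g. via nix-prefetch or a TOFU build)
  };

  nativeBuildInputs = [ setuptools ];
  propagatedBuildInputs = [ llm ollama pydantic ];

  # The plugin's tests want a running ollama server, so skip them in the sandbox.
  doCheck = false;
}
```

Because the overlay calls it with `callPackage ./Deps/llm-ollama.nix { }`, all of the function arguments above would be filled in automatically from the Python package set.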