From 2d33aa547ff6a516c90ca2b47b13e2add200583a Mon Sep 17 00:00:00 2001
From: Ben Sima
Date: Tue, 14 May 2024 09:35:45 -0400
Subject: Add simonw/llm as cli/library client for running LLMs

This is basically exactly the client library that I would write myself.
Some parts of it are still beta quality, but it's the sort of thing that
I would contribute to anyway.

Unfortunately I couldn't get the llm-llama-cpp plugin to work because it
depends on llama-cpp-python, which is not packaged for Nix and is hard to
package because the upstream project vendors a patched version of
llama.cpp. So I'm stuck with ollama for now, but that's fine because it
actually works.
---
 Biz/Bild/Deps.nix | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/Biz/Bild/Deps.nix b/Biz/Bild/Deps.nix
index 8bf2272..46fa00f 100644
--- a/Biz/Bild/Deps.nix
+++ b/Biz/Bild/Deps.nix
@@ -30,6 +30,8 @@ _self: super:
     ];
   };
 
+  llm = super.overrideSrc super.llm super.sources.llm;
+
   nostr-rs-relay = super.callPackage ./Deps/nostr-rs-relay.nix { };
 
   ollama = super.callPackage ./Deps/ollama.nix { acceleration = "cuda"; };
-- 
cgit v1.2.3
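
For readers unfamiliar with the overlay pattern used above: `overrideSrc`
replaces a package's source while keeping its build recipe, and
`sources.llm` is a pinned copy of the upstream repository. Neither is
defined in this diff, so the shapes sketched below are assumptions about
how this repo likely wires them up, not code taken from it:

    # Sketch only: overrideSrc as used in the overlay is assumed to behave
    # like this helper, which swaps a derivation's src and nothing else.
    overrideSrc = pkg: src: pkg.overrideAttrs (_old: { inherit src; });

    # Sketch only: a niv-style pinned source that super.sources.llm could
    # resolve to. The URL branch and sha256 are placeholders, not values
    # from this repository.
    sources.llm = builtins.fetchTarball {
      url = "https://github.com/simonw/llm/archive/main.tar.gz";
      sha256 = "0000000000000000000000000000000000000000000000000000";
    };

With those two pieces in place, `super.overrideSrc super.llm
super.sources.llm` rebuilds the llm package from the pinned upstream
source instead of the version nixpkgs ships.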