diff --git a/example/llm/README.md b/example/llm/README.md
index fc4966a..5304da8 100644
--- a/example/llm/README.md
+++ b/example/llm/README.md
@@ -1,7 +1,15 @@
 # Running local LLM using ollama and open-webui
 
-While `services-flake` is generally used for running services in a *development* project, typically under a source code checkout, you can also write flakes to derive an end-user app which runs a group of services.
+While `services-flake` is generally used for running services in a *development* project, typically under a source code checkout, you can also write flakes to derive an end-user app which runs a group of services, which can then be run using `nix run` (or installed using `nix profile install`):
 
-`example/llm` runs two processes ollama and open-webui, while storing the ollama data under `$HOME/.services-flake/ollama`. You can change this path in `flake.nix`.
+```sh
+# You can also use `nix profile install` on this URL, and run `services-flake-llm`
+nix run github:juspay/services-flake?dir=example/llm
+```
 
-By default, a single model (`llama2-uncensored`) is downloaded. You can modify this in `flake.nix` as well.
+## Default configuration & models
+
+`example/llm` runs two processes: ollama and open-webui.
+
+- The ollama data is stored under `$HOME/.services-flake/ollama`. You can change this path in `flake.nix` by setting the `dataDir` option.
+- A single model (`llama2-uncensored`) is automatically downloaded. You can modify this in `flake.nix` as well by setting the `models` option. You can also download models in the open-webui UI.
diff --git a/example/llm/flake.nix b/example/llm/flake.nix
index ea87f03..69dab0b 100644
--- a/example/llm/flake.nix
+++ b/example/llm/flake.nix
@@ -13,7 +13,9 @@
         inputs.process-compose-flake.flakeModule
       ];
       perSystem = { self', pkgs, lib, ... }: {
-        process-compose."default" = pc: {
+        packages.default = self'.packages.services-flake-llm;
+
+        process-compose."services-flake-llm" = pc: {
           imports = [
             inputs.services-flake.processComposeModules.default
           ];
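For context on the `dataDir` and `models` options that the README changes above refer to, the snippet below is a minimal sketch of how they might be set inside the `process-compose."services-flake-llm"` module. It is an illustration, not the exact contents of `example/llm/flake.nix`: it assumes services-flake's ollama module exposes `dataDir` and `models` as described in the README, and the instance names (`ollama1`, `open-webui1`) are hypothetical.

```nix
{
  # Hypothetical instance name; services-flake services are keyed by instance.
  services.ollama."ollama1" = {
    enable = true;
    # Where ollama keeps its data; the README's default path.
    dataDir = "$HOME/.services-flake/ollama";
    # Models pulled automatically on first start.
    models = [ "llama2-uncensored" ];
  };
  # open-webui provides the browser UI on top of the ollama instance.
  services.open-webui."open-webui1".enable = true;
}
```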