chore(example/llm): Give it an exe name, and expand README re: nix run

Author: Sridhar Ratnakumar, 2024-06-14 12:49:37 -04:00 (committed by Sridhar Ratnakumar)
parent 9b88034e0a
commit 06650f4f2b
2 changed files with 14 additions and 4 deletions

example/llm/README.md

@@ -1,7 +1,15 @@
 # Running local LLM using ollama and open-webui
-While `services-flake` is generally used for running services in a *development* project, typically under a source code checkout, you can also write flakes to derive an end-user app which runs a group of services.
+While `services-flake` is generally used for running services in a *development* project, typically under a source code checkout, you can also write flakes to derive an end-user app which runs a group of services, which can then be run using `nix run` (or installed using `nix profile install`):
-`example/llm` runs two processes ollama and open-webui, while storing the ollama data under `$HOME/.services-flake/ollama`. You can change this path in `flake.nix`.
 ```sh
+# You can also use `nix profile install` on this URL, and run `services-flake-llm`
 nix run github:juspay/services-flake?dir=example/llm
 ```
-By default, a single model (`llama2-uncensored`) is downloaded. You can modify this in `flake.nix` as well.
+
+## Default configuration & models
+
+`example/llm` runs two processes: ollama and open-webui.
+
+- The ollama data is stored under `$HOME/.services-flake/ollama`. You can change this path in `flake.nix` by setting the `dataDir` option.
+- A single model (`llama2-uncensored`) is automatically downloaded. You can modify this in `flake.nix` as well by setting the `models` option. You can also download models in the open-webui UI.
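
The two options named in the README changes above could be set in `flake.nix` roughly as follows. This is a hypothetical sketch: the service instance name (`"ollama1"`) and the exact shape of the `services.ollama` module are assumptions based on the README's description, not verified against services-flake's module options.

```nix
{
  # Sketch only: option names (dataDir, models) are taken from the README
  # text above; check services-flake's ollama module for the real interface.
  services.ollama."ollama1" = {
    enable = true;
    # Where ollama keeps its state; the README's default path:
    dataDir = "$HOME/.services-flake/ollama";
    # Models to download automatically on first start:
    models = [ "llama2-uncensored" ];
  };
}
```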

example/llm/flake.nix

@@ -13,7 +13,9 @@
         inputs.process-compose-flake.flakeModule
       ];
       perSystem = { self', pkgs, lib, ... }: {
-        process-compose."default" = pc: {
+        packages.default = self'.packages.services-flake-llm;
+        process-compose."services-flake-llm" = pc: {
           imports = [
             inputs.services-flake.processComposeModules.default
           ];