# Running local LLM using ollama and open-webui

While services-flake is typically used to run services in a development project, usually under a source-code checkout, you can also write a flake that derives an end-user app running a group of services.
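As a rough sketch, such a flake wires services-flake's service modules into a `process-compose` package that `nix run` can launch. The structure below follows services-flake's process-compose-flake integration; the actual contents of this example's `flake.nix` may differ, and the service/attribute names here are illustrative:

```nix
{
  inputs = {
    nixpkgs.url = "github:NixOS/nixpkgs/nixpkgs-unstable";
    flake-parts.url = "github:hercules-ci/flake-parts";
    process-compose-flake.url = "github:Platonic-Systems/process-compose-flake";
    services-flake.url = "github:juspay/services-flake";
  };
  outputs = inputs:
    inputs.flake-parts.lib.mkFlake { inherit inputs; } {
      systems = [ "x86_64-linux" "aarch64-darwin" ];
      imports = [ inputs.process-compose-flake.flakeModule ];
      perSystem = { ... }: {
        # `nix run` launches this process group as an end-user app
        process-compose."default" = {
          imports = [ inputs.services-flake.processComposeModules.default ];
          services.ollama."ollama".enable = true;
          services.open-webui."open-webui".enable = true;
        };
      };
    };
}
```

With a flake along these lines, `nix run` in the flake's directory starts both processes under process-compose.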

`example/llm` runs two processes, `ollama` and `open-webui`, storing the Ollama data under `$HOME/.services-flake/ollama`. You can change this path in `flake.nix`.
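A minimal sketch of how that path might be set, assuming the ollama service exposes a `dataDir` option inside the process-compose module (check the example's actual `flake.nix` for the real option name and value):

```nix
services.ollama."ollama" = {
  enable = true;
  # Keep models and state outside the project checkout; path is illustrative
  dataDir = "$HOME/.services-flake/ollama";
};
```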