# Running local LLM using ollama and open-webui
While services-flake is generally used for running services in a development project, typically under a source code checkout, you can also write flakes to derive an end-user app that runs a group of services, which can then be run using `nix run` (or installed using `nix profile install`):
```sh
# You can also use `nix profile install` on this URL, and run `services-flake-llm`
nix run github:juspay/services-flake?dir=example/llm
```
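For reference, below is a minimal sketch of what such a flake can look like, assuming the usual flake-parts + process-compose-flake setup that services-flake builds on; the instance names (`ollama1`, `open-webui1`) are illustrative, and the package attribute `services-flake-llm` is assumed to match the runnable name mentioned above rather than copied from `example/llm` verbatim:

```nix
# Minimal sketch: a flake that bundles ollama and open-webui into one
# runnable process group. Assumes flake-parts + process-compose-flake;
# instance names are illustrative.
{
  inputs = {
    nixpkgs.url = "github:nixos/nixpkgs/nixpkgs-unstable";
    flake-parts.url = "github:hercules-ci/flake-parts";
    process-compose-flake.url = "github:Platonic-Systems/process-compose-flake";
    services-flake.url = "github:juspay/services-flake";
  };
  outputs = inputs:
    inputs.flake-parts.lib.mkFlake { inherit inputs; } {
      systems = [ "x86_64-linux" "aarch64-linux" "x86_64-darwin" "aarch64-darwin" ];
      imports = [ inputs.process-compose-flake.flakeModule ];
      perSystem = { ... }: {
        # Exposed as a package, so `nix run .#services-flake-llm` (or
        # `nix profile install`) works against this flake.
        process-compose."services-flake-llm" = {
          imports = [ inputs.services-flake.processComposeModules.default ];
          services.ollama."ollama1".enable = true;
          services.open-webui."open-webui1".enable = true;
        };
      };
    };
}
```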
## Default configuration & models
`example/llm` runs two processes, ollama and open-webui:
- The ollama data is stored under `$HOME/.services-flake/ollama`. You can change this path in `flake.nix` by setting the `dataDir` option.
- A single model (`llama2-uncensored`) is downloaded automatically. You can modify this in `flake.nix` as well by setting the `models` option; you can also download models from the open-webui UI. Both options are shown in the sketch after this list.
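Both options live on the ollama service module. A hedged sketch of the relevant `flake.nix` fragment, where the instance name `ollama1` is illustrative:

```nix
# Inside the process-compose module shown earlier; "ollama1" is an
# illustrative instance name.
services.ollama."ollama1" = {
  enable = true;
  # Where ollama stores its downloaded models and runtime state.
  dataDir = "$HOME/.services-flake/ollama";
  # Models pulled automatically on first start; add entries here to
  # pre-download more.
  models = [ "llama2-uncensored" ];
};
```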