Author | Commit | Message | Date
louistiti | 822e20f80b | fix(scripts): remove description property on LLM action classifier training | 2024-07-07 16:03:20 +08:00
louistiti | 0c775ba2e5 | feat: use VRAM as LLM unit requirements | 2024-07-03 23:41:23 +08:00
louistiti | 4019bd877b | chore: better comments on LLM action matching | 2024-07-03 09:11:25 +08:00
louistiti | 8dc725a697 | refactor(scripts): differentiate PyTorch info log on macOS | 2024-06-24 08:28:32 +08:00
louistiti | 1314217e00 | BREAKING: upgrade from Python 3.9 to Python 3.11 | 2024-06-23 23:29:09 +08:00
louistiti | 0661e9e04f | refactor(scripts): only install PyTorch when the targeted setup is the TCP server | 2024-06-23 22:39:39 +08:00
louistiti | a8b3f7c9bd | feat(python tcp server): upgrade PyTorch to 2.3.1 (better MPS compatibility) | 2024-06-20 09:54:16 +08:00
louistiti | 8d3bffb64d | fix: skill settings | 2024-06-18 09:07:17 +08:00
louistiti | 32dc3ced0b | feat: Apple Silicon support for voice models | 2024-06-18 09:02:09 +08:00
louistiti | 582b2aa572 | feat: kick off ASR engine support for Apple Silicon | 2024-06-12 00:31:47 +08:00
louistiti | 4ce470d3e7 | feat(python tcp server): force TTS BERT local files | 2024-06-06 08:43:06 +08:00
louistiti | 7c9487b62b | feat(scripts): download and install TTS BERT model files for the Python TCP server env setup | 2024-06-06 07:56:03 +08:00
louistiti | 6df34d48ee | feat(scripts): test comment | 2024-06-05 23:58:01 +08:00
Louis Grenard | d3f3c4e3a7 | feat(scripts): setup Python TCP server TTS language model files kick off | 2024-06-05 23:39:59 +08:00
louistiti | eb72065e5e | feat(scripts): auto delete .venv when Python packages failed to be installed on setup | 2024-06-01 23:23:00 +08:00
louistiti | e9f1aadc67 | feat(scripts): add Apple Metal developer tools error hint on Apple Silicon for llama.cpp setup | 2024-06-01 21:57:20 +08:00
louistiti | f1c6db6053 | fix(scripts): llama.cpp release tag not failing when null | 2024-06-01 21:33:07 +08:00
louistiti | 40fd5fac88 | fix(scripts): llama.cpp default tag number on LLM setup | 2024-06-01 21:04:46 +08:00
louistiti | 8eefe53473 | feat(web app): prevent from sending utterance while Leon is generating answer | 2024-05-27 19:25:57 +08:00
louistiti | 0ce1f62c0d | feat: support default conversations powered by LLM with action-first in mind | 2024-05-27 18:54:54 +08:00
louistiti | 1516b18a11 | refactor: unify Python TCP server warnings to ignore | 2024-05-26 09:55:48 +08:00
louistiti | fb5c258cf6 | refactor: unify NLP models paths | 2024-05-26 09:47:13 +08:00
louistiti | c172f2d6af | fix(server): handle exception on NLU | 2024-05-22 17:32:13 +08:00
louistiti | b02d510490 | feat(server): always load ASR model from local | 2024-05-22 15:22:30 +08:00
louistiti | 596b7552dd | feat: new ASR engine ready | 2024-05-21 23:57:36 +08:00
louistiti | 580289e05d | feat(scripts): add error hint about PortAudio for audio stream | 2024-05-20 00:39:33 +08:00
louistiti | f776913ca3 | feat(server): complete first version of the new TTS engine | 2024-05-18 21:45:46 +08:00
louistiti | a7cab344f8 | feat(python tcp server): embed new text-to-speech engine in TCP server binary | 2024-05-18 01:11:12 +08:00
louistiti | e455a9d96b | feat(scripts): TCP server setup add PyTorch nightly install | 2024-05-17 12:20:34 +08:00
louistiti | bdff917e04 | feat(scripts): avoid LLM setup when running in CI | 2024-05-08 19:21:52 +08:00
louistiti | aa862a6b5f | fix(scripts): TCP_SERVER_SRC_PATH to PYTHON_TCP_SERVER_SRC_PATH to set up Python env | 2024-05-08 19:04:15 +08:00
louistiti | 57923f83ee | fix(scripts): always update manifest on LLM setup | 2024-04-28 10:10:57 +08:00
louistiti | 4ca32b5070 | feat(scripts): fallback to mirror in case of error to download LLM | 2024-04-22 00:00:46 +08:00
louistiti | 2de95c4ef9 | feat(server): Gemma support (prototype) | 2024-04-16 20:18:18 +08:00
Louis Grenard | dba5d90b94 | chore: add LLM TCP server to Git commit message | 2024-02-14 09:45:26 +08:00
Louis Grenard | 0705659f3e | fix(scripts): Python TCP server setup | 2024-02-14 09:44:50 +08:00
louistiti | 49ac3218f7 | feat(server): preparing LLM TCP server loading | 2024-02-13 16:22:32 +08:00
louistiti | 1c0c8080bf | feat(scripts): download and compile llama.cpp | 2024-02-08 21:05:06 +08:00
louistiti | 46f6c09739 | scripts(setup-llm): tell when LLM is up-to-date | 2024-01-30 00:09:20 +08:00
louistiti | 93eb2d22b3 | scripts(setup-llm): set up LLM | 2024-01-29 23:57:23 +08:00
louistiti | c4d2524922 | scripts(setup-llm): kick off LLM setup and unify common functions | 2024-01-29 00:20:38 +08:00
louistiti | 84b84453a7 | Merge branch 'chore/clean-up' into develop | 2023-11-15 22:38:53 +08:00
louistiti | 977a822cc8 | fix(scripts): clone intent-object JSON files for bridges on check command | 2023-11-15 21:15:29 +08:00
Théo LUDWIG | 86de211fa8 | BREAKING: drop support for Docker | 2023-11-15 01:23:05 +01:00
louistiti | 2e6abef507 | chore: lint | 2023-05-27 08:06:11 +08:00
Divlo | 4f5b385fd7 | chore: rename config.json to settings.json | 2023-05-26 23:31:25 +02:00
Divlo | 5539130dd9 | fix: typo missing "GB" for Free RAM | 2023-05-15 23:42:27 +02:00
louistiti | a3b8c40714 | refactor(scripts): unify skills setup over one iteration only | 2023-05-14 12:10:55 +08:00
louistiti | 6d495cf43c | refactor(scripts): explicit Node.js bridge npm packages install | 2023-05-13 00:30:15 +08:00
louistiti | d6933a9e35 | refactor(bridge/nodejs): remove lowdb and tiny custom ORM in favor of Prisma | 2023-05-13 00:26:51 +08:00