louistiti | 764e815204 | 2024-05-03 23:23:46 +08:00 | feat(server): switch LLM duties back to using ChatWrapper instead of completions
louistiti | 03d25f2a48 | 2024-05-03 23:00:32 +08:00 | feat(server): Leon's personality done
louistiti | aa36c8cb2a | 2024-05-02 16:40:05 +08:00 | feat(server): personality support kick off; answer queue
louistiti | 2146637022 | 2024-05-01 01:04:29 +08:00 | feat(server): add paraphrase LLM duty to kick off personality attribution
louistiti | 0a4e0ed90d | 2024-04-30 11:48:09 +08:00 | feat(server): add confidence value in logs
louistiti | ceeb5bd70e | 2024-04-30 09:58:30 +08:00 | feat(server): use Lexi-Llama-3-8B-Uncensored-Q5_K_S as default LLM
louistiti | f77343918d | 2024-04-29 21:28:20 +08:00 | chore: upgrade socket.io-client to latest
louistiti | e2984ea87e | 2024-04-29 21:27:16 +08:00 | chore: upgrade socket.io to latest
louistiti | 694d398d7b | 2024-04-29 21:25:25 +08:00 | chore: upgrade fastify to latest
louistiti | a2bca433f6 | 2024-04-29 21:24:21 +08:00 | chore: upgrade dotenv to latest
louistiti | 60698704ca | 2024-04-29 21:15:47 +08:00 | chore: uninstall async npm package
louistiti | 7d4e64d575 | 2024-04-29 21:08:49 +08:00 | chore: upgrade tsc-watch to latest
louistiti | 2c03efc167 | 2024-04-29 19:33:30 +08:00 | refactor(server): warning message on explicit deactivation of LLM
louistiti | 539281110a | 2024-04-29 19:31:57 +08:00 | feat(server): allow explicit deactivation of LLM
louistiti | 3ed88544f8 | 2024-04-29 19:18:27 +08:00 | fix(server): specify correct minimum total/free RAM for LLM
louistiti | 9ee73ea5f9 | 2024-04-29 18:37:20 +08:00 | feat(server): map resolved slots to dialog answers
louistiti | c934d0e30d | 2024-04-29 18:02:41 +08:00 | feat(server): support dialog type action after slots filled
louistiti | d9f0144e62 | 2024-04-29 11:25:18 +08:00 | feat(server): action loop support
louistiti | 57923f83ee | 2024-04-28 10:10:57 +08:00 | fix(scripts): always update manifest on LLM setup
louistiti | a38eee71f0 | 2024-04-26 15:14:28 +08:00 | feat(server): set Phi-3 as default LLM
louistiti | c8e03ec401 | 2024-04-22 23:29:30 +08:00 | fix(server): utterance as expected_item loops indefinitely
louistiti | 4ca32b5070 | 2024-04-22 00:00:46 +08:00 | feat(scripts): fall back to mirror in case of error when downloading LLM
louistiti | d8660251af | 2024-04-20 14:33:37 +08:00 | feat: use mistral-7b-instruct-v0.2.Q4_K_S as final choice
louistiti | 26af271edc | 2024-04-20 14:29:00 +08:00 | feat(server): add utterance as expected_item type
louistiti | da49a4dd82 | 2024-04-18 20:38:02 +08:00 | feat(scripts): llama.cpp compatible build
louistiti | cf02f0f91d | 2024-04-17 00:10:41 +08:00 | feat(server): final LLM setup
louistiti | 2de95c4ef9 | 2024-04-16 20:18:18 +08:00 | feat(server): Gemma support (prototype)
louistiti | 2434d36564 | 2024-04-15 20:42:11 +08:00 | feat(skill/translator-poc): short translate action kick off
louistiti | 73d919868b | 2024-04-14 17:41:39 +08:00 | feat(server): LLM entities support
Théo LUDWIG | 39cbd7114e | 2024-02-25 20:57:02 +01:00 | fix(bridge/python): use dict instead of empty TypedDict to avoid type errors
    We might in the future automatically generate the Python types from the TypeScript types. For the moment, they are not typed, so we use a dict.
louistiti | a16ab4abfa | 2024-02-19 18:31:56 +08:00 | feat(server): add response data to LLM inference endpoint
louistiti | 6e12d2638d | 2024-02-19 18:30:10 +08:00 | feat(server): verify whether LLM is enabled on inference
louistiti | 0d3c42beb2 | 2024-02-19 18:01:41 +08:00 | feat(server): implement PoC skill to validate LLM execution
louistiti | 66f65b25b4 | 2024-02-19 17:00:14 +08:00 | feat(server): expose LLM duties behind HTTP endpoint
louistiti | 3f85a93a03 | 2024-02-18 23:43:25 +08:00 | feat(server): LLM duties architecture + custom NER, summarization and translation duties
Louis Grenard | e9e9155366 | 2024-02-15 00:13:16 +08:00 | feat(server): bootstrap LLM TCP server <> LLM duties <> TCP client (tmp)
Louis Grenard | 30c9d3bce5 | 2024-02-14 17:37:21 +08:00 | feat(server): create updater
Louis Grenard | dba5d90b94 | 2024-02-14 09:45:26 +08:00 | chore: add LLM TCP server to Git commit message
Louis Grenard | 0705659f3e | 2024-02-14 09:44:50 +08:00 | fix(scripts): Python TCP server setup
louistiti | 4815651a1d | 2024-02-14 09:27:22 +08:00 | feat(server): bootstrap LLM TCP server (tmp)
louistiti | 71ebdc5d80 | 2024-02-13 18:51:45 +08:00 | refactor(server): prepare LLM TCP client connection
louistiti | 49ac3218f7 | 2024-02-13 16:22:32 +08:00 | feat(server): preparing LLM TCP server loading
louistiti | 1c0c8080bf | 2024-02-08 21:05:06 +08:00 | feat(scripts): download and compile llama.cpp
louistiti | 46f6c09739 | 2024-01-30 00:09:20 +08:00 | scripts(setup-llm): tell when LLM is up-to-date
louistiti | 93eb2d22b3 | 2024-01-29 23:57:23 +08:00 | scripts(setup-llm): set up LLM
louistiti | c4d2524922 | 2024-01-29 00:20:38 +08:00 | scripts(setup-llm): kick off LLM setup and unify common functions
louistiti | 7482edde0a | 2024-01-28 22:22:48 +08:00 | Merge branch 'widget-backbone' into mistral-llm-ner
louistiti | 09b9a02284 | 2024-01-28 17:16:10 +08:00 | feat(skill/random_number): widget rendering
Théo LUDWIG | f895963ab9 | 2023-12-12 19:48:03 +01:00 | feat(bridge/python): add Text component
Théo LUDWIG | 637ab43626 | 2023-12-12 19:47:44 +01:00 | feat(bridge/python): add TextInput component