Commit Graph

145 Commits

Author SHA1 Message Date
Adam Treat
889d7d8563 Move settings dialog into own file. 2023-04-23 06:58:07 -04:00
Adam Treat
1f65e381ee New thumbs up/down support for gpt4all-datalake. 2023-04-22 22:09:14 -04:00
Adam Treat
993a43d33a Minor cleanup. 2023-04-22 16:40:34 -04:00
Adam Treat
cca2a88e47 Getting ready for next update. 2023-04-21 23:23:57 -04:00
Adam Treat
bec8072fe1 Fix logic. 2023-04-21 13:46:50 -04:00
eachadea
116f740fb5 Don't build test_hw on apple silicon 2023-04-21 11:25:03 -04:00
Adam Treat
3e7cf346d6 Restore basic functionality. 2023-04-21 09:56:06 -04:00
Adam Treat
670bbe4db5 Make the settings dialog persist across sessions. 2023-04-21 08:23:39 -04:00
Adam Treat
294f2d6041 Revamp hardware tester to print the result to stdout as a single word. 2023-04-21 07:36:05 -04:00
Adam Treat
e4d75cbfcd Remove this as clang does not support it. 2023-04-20 20:48:27 -04:00
AT
6f1fe51087 Update README.md 2023-04-20 19:43:16 -04:00
Adam Treat
14831cd1c0 Add a small program that tests hardware. 2023-04-20 19:34:56 -04:00
AT
2dc26cfd09 Update README.md 2023-04-20 18:56:38 -04:00
Adam Treat
4d26f5daeb Silence a warning now that we're forked. 2023-04-20 17:27:06 -04:00
Adam Treat
442ca09b32 Remove ggml submodule in favor of llama.cpp 2023-04-20 17:20:44 -04:00
Adam Treat
bb78ee0025 Back out the prompt/response finding in gptj since it doesn't seem to help. Guard against reaching the end of the context window, which we don't handle gracefully except for avoiding a crash. 2023-04-20 17:15:46 -04:00
Tom Jobbins
154f35ce53 Update HTTP link to model to point to the latest Jazzy model (in the CLI-only build section) (#78) 2023-04-20 14:15:07 -04:00
Adam Treat
65abaa19e5 Fix warning and update llama.cpp submodule to latest. 2023-04-20 13:27:11 -04:00
Adam Treat
51768bfbda Use default params unless we override them. 2023-04-20 12:07:43 -04:00
Adam Treat
b15feb5a4c Crop the filename. 2023-04-20 10:54:42 -04:00
Adam Treat
5a00c83139 Display filesize info in the model downloader. 2023-04-20 09:32:51 -04:00
Adam Treat
cd5f525950 Add multi-line prompt support. 2023-04-20 08:31:33 -04:00
Adam Treat
4c970fdc9c Pin the llama.cpp to a slightly older version. 2023-04-20 07:34:15 -04:00
Adam Treat
43e6d05d21 Don't crash starting with no model. 2023-04-20 07:17:07 -04:00
Adam Treat
d336db9fe9 Don't use versions for model downloader. 2023-04-20 06:48:13 -04:00
eachadea
b09ca009c5 Don't build a universal binary unless -DBUILD_UNIVERSAL=ON 2023-04-20 06:37:54 -04:00
Adam Treat
55084333a9 Add llama.cpp support for loading llama based models in the gui. We now support loading both gptj derived models and llama derived models. 2023-04-20 06:19:09 -04:00
Aaron Miller
f1b87d0b56 Add thread count setting 2023-04-19 08:33:13 -04:00
Adam Treat
e6cb6a2ae3 Add a new model download feature. 2023-04-18 21:10:06 -04:00
Adam Treat
1eda8f030e Allow unloading/loading/changing of models. 2023-04-18 11:42:38 -04:00
Aaron Miller
3a82a1d96c remove fill color for prompt template box 2023-04-18 08:47:37 -04:00
Adam Treat
a842f6c33f Fix link color to have consistency across platforms. 2023-04-18 08:45:21 -04:00
Adam Treat
0928c01ddb Make the gui accessible. 2023-04-18 08:40:04 -04:00
Pavol Rusnak
0e599e6b8a readme: GPL -> MIT license 2023-04-17 16:45:29 -04:00
Adam Treat
ef711b305b Changing to MIT license. 2023-04-17 16:37:50 -04:00
Adam Treat
bbf838354e Don't add version number to the installer or the install location. 2023-04-17 15:59:14 -04:00
Adam Treat
9f4e3cb7f4 Bump the version for the context bug fix. 2023-04-17 15:37:24 -04:00
Adam Treat
15ae0a4441 Fix the context. 2023-04-17 14:11:41 -04:00
Adam Treat
801107a12c Set a new default temp that is more conservative. 2023-04-17 09:49:59 -04:00
AT
ea7179e2e8 Update README.md 2023-04-17 09:02:26 -04:00
Adam Treat
7dbf81ed8f Update submodule. 2023-04-17 08:04:40 -04:00
Adam Treat
42fb215f61 Bump version to 2.1 as this has been referred to far and wide as GPT4All v2, so doing this to decrease confusion. Also, making the version number visible in the title bar. 2023-04-17 07:50:39 -04:00
Adam Treat
1dcd4dce58 Update the bundled model name. 2023-04-16 22:10:26 -04:00
Adam Treat
7ea548736b New version. 2023-04-16 19:20:43 -04:00
Adam Treat
659ab13665 Don't allow empty prompts. Context past is always equal to or greater than zero. 2023-04-16 14:57:58 -04:00
Adam Treat
7e9ca06366 Trim trailing whitespace at the end of generation. 2023-04-16 14:19:59 -04:00
Adam Treat
fdf7f20d90 Remove newlines too. 2023-04-16 14:04:25 -04:00
Adam Treat
f8b962d50a More conservative default params and trim leading whitespace from response. 2023-04-16 13:56:56 -04:00
TheBloke
7215b9f3fb Change the example CLI prompt to something more appropriate, as this is not a Llama model! :) 2023-04-16 12:52:23 -04:00
TheBloke
16f6b04a47 Fix repo name 2023-04-16 12:52:23 -04:00