Author | Commit | Date

Tekky | 644f33b85e | 2023-09-26 00:30:51 +01:00
  ~

abc | 927b8d85ad | 2023-09-26 00:24:58 +01:00
  ~ | g4f v-0.1.3.5

Tekky | af5cc4394b | 2023-09-26 00:22:32 +01:00
  ~

Tekky | fd5d28cf7b | 2023-09-26 00:20:27 +01:00
  ~ | Merge pull request #941 from hlohaus/myshell
  AItianhuSpace Provider with GPT 4 added

Heiner Lohaus | 72c3ff7a25 | 2023-09-26 01:02:02 +02:00
  AItianhuSpace Provider with GPT 4 added
  Reduced chunk size for better text completion

Heiner Lohaus | 348670fe35 | 2023-09-26 00:52:29 +02:00
  "create_async" support for BaseProvider, by using ThreadPoolExecutor
  Default Model for Vercel
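
The entry above describes adding async support by pushing a provider's blocking create_completion call onto a thread pool. A minimal sketch of that pattern, using illustrative class and method names rather than g4f's actual BaseProvider code:

```py
# Sketch of the pattern named in the commit above: give a provider whose
# create_completion() is synchronous an async entry point by running the
# blocking call in a ThreadPoolExecutor. Names are illustrative, not g4f's.
import asyncio
from concurrent.futures import ThreadPoolExecutor

_executor = ThreadPoolExecutor()

class SyncProvider:
    @staticmethod
    def create_completion(model: str, prompt: str) -> str:
        # Stand-in for a blocking HTTP request to the upstream service.
        return f"[{model}] completion for: {prompt}"

    @classmethod
    async def create_async(cls, model: str, prompt: str) -> str:
        loop = asyncio.get_running_loop()
        # Delegate the blocking call so the event loop stays responsive.
        return await loop.run_in_executor(
            _executor, cls.create_completion, model, prompt
        )

async def main():
    print(await SyncProvider.create_async("text-davinci-003", "Hello"))

asyncio.run(main())
```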

Tekky | 5bcc82ce1a | 2023-09-25 19:23:47 +01:00
  Merge pull request #940 from hlohaus/myshell
  Add Myshell Provider

Heiner Lohaus | f1b6880f7e | 2023-09-25 15:52:19 +02:00
  Add Myshell Provider
  Remove auto proxy prefix

Tekky | 355295bcd9 | 2023-09-24 20:37:09 +01:00
  ~ | Merge pull request #936 from r1di/patch-1
  #935 [Docker] ModuleNotFoundError: No module named 'transformers' and 'PyExecJS'

r1di | ad1aee266e | 2023-09-24 15:18:47 +02:00
  forgot some more modules

r1di | 45743583a8 | 2023-09-24 15:05:01 +02:00
  #935 [Docker] ModuleNotFoundError: No module named 'transformers' and PyExecJS
  fixing missing modules

abc | 21d27d9e62 | 2023-09-23 22:01:50 +01:00
  ~

Tekky | eaa8f712ab | 2023-09-23 16:32:21 +01:00
  ~ | Merge pull request #934 from hlohaus/fix
  Fix: Aivvm: KeyError: ''

Heiner Lohaus | fd5c33efb8 | 2023-09-23 17:10:25 +02:00
  TypeError: issubclass() arg 1 must be a class

Heiner Lohaus | 4edd7518de | 2023-09-23 15:35:17 +02:00
  Fix: Aivvm: KeyError: ''
  and TypeError: issubclass() arg 1 must be a class

abc | bf78b4d033 | 2023-09-23 11:33:44 +01:00
  ~ | improve Vercel & g4f.Completion.create

abc | 76bd483c1b | 2023-09-23 11:23:07 +01:00
  ~

Tekky | 4b9504dbcd | 2023-09-23 11:20:25 +01:00
  ~

abc | 6c2e3cc53c | 2023-09-23 11:16:19 +01:00
  ~ | improve Vercel & g4f.Completion.create
  added `.Completion.create` class.
  ```py
  import g4f

  response = g4f.Completion.create(
      model='text-davinci-003', prompt="Hello")
  print(response)
  ```

abc | af9fc19938 | 2023-09-23 10:58:13 +01:00
  ~

abc | f21bac74b8 | 2023-09-23 10:56:05 +01:00
  Merge branch 'main' of https://github.com/xtekky/gpt4free

abc | 041949d4c8 | 2023-09-23 10:55:59 +01:00
  ~

Tekky | ea57fbf727 | 2023-09-23 10:55:22 +01:00
  ~ | Merge pull request #933 from hlohaus/night
  Improve Vercel Provider:

Tekky | 1215d30bf4 | 2023-09-23 10:54:38 +01:00
  Delete g4f/Provider/Vercel.py

Heiner Lohaus | a3ecabb00e | 2023-09-23 11:42:30 +02:00
  Improve Vercel Provider:
  - Fix endless loop
  - Add proxy, async support
  - Add default model
  Fix HuggingChat Provider
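
As a rough usage sketch of the proxy and async support mentioned above: extra options are normally passed through the top-level g4f call, but the keyword names (`provider`, `proxy`) and the create_async signature below are assumptions, not verified against this exact release:

```py
# Hypothetical usage of the async/proxy support described in the commit above.
# The `provider` and `proxy` keyword arguments are assumptions drawn from the
# commit message, not a verified API of this particular g4f version.
import asyncio
import g4f

async def main():
    response = await g4f.ChatCompletion.create_async(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Hello"}],
        provider=g4f.Provider.Vercel,
        proxy="http://127.0.0.1:8080",  # hypothetical proxy URL
    )
    print(response)

asyncio.run(main())
```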

abc | d8fc799e2d | 2023-09-23 01:34:27 +01:00
  ~ | update models list

abc | d4acc23c0b | 2023-09-23 01:31:46 +01:00
  ~ | Update Vercel Provider

abc | 7875661332 | 2023-09-23 01:31:28 +01:00
  ~ | update models list

abc | 9ffed2fc33 | 2023-09-23 01:31:16 +01:00
  ~ | gpt-3.5-turbo-16k-0613

abc | 66e7660494 | 2023-09-23 01:30:45 +01:00
  ~

abc | 07063b0fd8 | 2023-09-23 01:29:55 +01:00
  ~ | Update models list

abc | 42a02c3d2d | 2023-09-23 00:44:09 +01:00
  ~ | new providers
  Somewhat fix Aivvm provider, which looks to have a working gpt-4
  kinda unstable

abc | 17b3eb1bba | 2023-09-23 00:43:25 +01:00
  ~

abc | 567e268446 | 2023-09-23 00:43:16 +01:00
  ~ | remove quickjs

abc | d320efd2ff | 2023-09-22 23:48:39 +01:00
  ~ | Fix DeepAi

Tekky | a92a5cb8ef | 2023-09-22 20:53:39 +01:00
  ~

Tekky | f07395f8e2 | 2023-09-22 20:52:52 +01:00
  ~

Tekky | 7f55c79422 | 2023-09-22 20:50:08 +01:00
  ~

abc | 029a799d5e | 2023-09-22 20:42:41 +01:00
  ~ | g4f v-0.0.3.1

Tekky | 17bed9c4d0 | 2023-09-22 20:41:21 +01:00
  ~ | Merge pull request #930 from thebigbone/main
  update dockerfile

Tekky | ba287e89b5 | 2023-09-22 20:40:59 +01:00
  ~ | Merge pull request #924 from hlohaus/vercel
  Fix async example in readme

abc | 4d4fc98533 | 2023-09-22 20:36:44 +01:00
  ~ | gpt-3.5-turbo-0613

Tekky | d0344a406f | 2023-09-22 20:35:00 +01:00
  ~ | Merge pull request #923 from fjteam/main
  added model config : gpt-3.5-turbo-0613

Tekky | 2cb59b4e10 | 2023-09-22 20:30:06 +01:00
  ~ | Merge pull request #920 from chatgpt-tricks/main
  Adding embedding support to the interference proxy
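
The interference proxy is meant to mimic the OpenAI REST API, so embedding support would be consumed through an OpenAI-style endpoint. A hypothetical client call, where the host, port, path, and payload shape are assumptions modelled on that API rather than details taken from PR #920:

```py
# Hypothetical request to the interference proxy's embeddings endpoint.
# Host, port, path, and payload are assumptions based on the OpenAI API shape,
# not taken from the pull request itself.
import requests

resp = requests.post(
    "http://localhost:1337/v1/embeddings",
    json={"model": "text-embedding-ada-002", "input": "Hello world"},
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```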

hcrypt | 01ae5c4280 | 2023-09-22 17:39:59 +05:30
  update dockerfile
  expose relevant port in dockerfile

Heiner Lohaus | e9f96ced9c | 2023-09-21 20:10:59 +02:00
  Add RetryProvider
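
A retry provider of this kind is typically a wrapper that walks a list of providers and falls back to the next one when a request fails. A rough usage sketch, where the constructor signature and the chosen providers are assumptions:

```py
# Rough usage sketch of a retry-style provider wrapper: try several backends
# until one succeeds. Constructor signature and provider choices are assumed.
import g4f
from g4f.Provider import RetryProvider, DeepAi, Vercel

response = g4f.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
    provider=RetryProvider([DeepAi, Vercel]),
)
print(response)
```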

Tekky | a76055652b | 2023-09-20 23:53:21 +01:00
  ~

Heiner Lohaus | 951a1332a7 | 2023-09-20 23:06:52 +02:00
  Fix create_event_loop function
  Add PerplexityAi Provider
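
A create_event_loop helper in this context usually means "reuse the running event loop if there is one, otherwise create and register a new one". A minimal sketch of that pattern, assuming this is roughly what the fixed function does:

```py
# Minimal sketch of a get-or-create event loop helper, the pattern a
# create_event_loop function usually implements; not the actual g4f fix.
import asyncio

def create_event_loop() -> asyncio.AbstractEventLoop:
    try:
        # Already inside async code: reuse the running loop.
        return asyncio.get_running_loop()
    except RuntimeError:
        # No running loop (plain synchronous call site): create one.
        loop = asyncio.new_event_loop()
        asyncio.set_event_loop(loop)
        return loop
```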

Heiner Lohaus | f90741c10b | 2023-09-20 17:31:25 +02:00
  Improve code style in async support

chatgpt-tricks | ce1bf62ac3 | 2023-09-20 15:31:37 +02:00
  import AutoTokenizer in app.py
  import AutoTokenizer in app.py, because I forgot to
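
The fix here is simply the missing import. For reference, the transformers import in question and a typical tokenizer call, with the checkpoint name being illustrative rather than the one app.py actually loads:

```py
# The missing import named in the commit above, plus a typical use.
# The "gpt2" checkpoint is illustrative, not necessarily what app.py loads.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
print(len(tokenizer.encode("Hello world")))
```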