Add help me coding guide
Add MissingAuthError in GeminiPro
This commit is contained in:
parent 6b3be02d63
commit 775a0c43a0

README.md (97 changes)
@@ -19,18 +19,20 @@ pip install -U g4f
docker pull hlohaus789/g4f
```

## 🆕 What's New 🚀
- How do I use my smartphone 📱 to run g4f? [/docs/guides/phone](/docs/guides/phone.md)
## 🆕 What's New
- Guide: How do I use my smartphone 📱 to run g4f?
  - [/docs/guides/phone](/docs/guides/phone.md)
- New: How can AI help me 💁 with writing code?
  - [/docs/guides/help_me](/docs/guides/help_me.md)
- Join our Telegram Channel: [t.me/g4f_channel](https://telegram.me/g4f_channel)
- Join our Discord Group: [discord.gg/XfybzPXPH5](https://discord.gg/XfybzPXPH5)

## Site Takedown
## 🔻 Site Takedown
Is your site listed in this repository and do you want to take it down? Email takedown@g4f.ai with proof that it is yours and it will be removed as fast as possible. To prevent reproduction, please secure your API. ;)

## Feedback
## 🚀 Feedback and Todo
You can always leave some feedback here: https://forms.gle/FeWV9RLEedfdkmFN6

## To do
As per the survey, here is a list of improvements to come:
- [x] update the repository to include the new OpenAI library syntax (e.g. the `OpenAI()` class) | completed, use `g4f.client.Client`
- [ ] golang implementation
@@ -51,13 +53,13 @@ As per the survey, here is a list of improvements to come
- [Quick start](#quick-start)
  + [Use python](#use-python)
    - [Prerequisites](#prerequisites)
    - [Install using PyPI package:](#install-using-pypi-package-)
    - [Install from source:](#install-from-source-)
    - [Install using Docker:](#install-using-docker-)
    - [Install using PyPI package:](#install-using-pypi-package)
    - [Install from source:](#install-from-source)
    - [Install using Docker:](#install-using-docker)
- [💡 Usage](#-usage)
  * [The Web UI](#the-web-ui)
  * [Text Generation](#text-generation)
  * [Image Generation](#image-generation)
  * [Web UI](#web-ui)
  * [Interference API](#interference-api)
  * [Configuration](#configuration)
- [🚀 Providers and Models](#-providers-and-models)
@@ -67,8 +69,8 @@ As per the survey, here is a list of improvements to come
  * [Models](#models)
- [🔗 Related GPT4Free Projects](#-related-gpt4free-projects)
- [🤝 Contribute](#-contribute)
  + [Create Provider with AI Tool](#create-provider-with-ai-tool)
  + [Create Provider](#create-provider)
  + [How do I create a new Provider?](#guide-how-do-i-create-a-new-provider)
  + [How can AI help me with writing code?](#guide-how-can-ai-help-me-with-writing-code)
- [🙌 Contributors](#-contributors)
- [©️ Copyright](#-copyright)
- [⭐ Star History](#-star-history)
@@ -158,15 +160,13 @@ response = client.images.generate(
image_url = response.data[0].url
```

**Result:**

[![Image with cat](/docs/cat.jpeg)](/docs/client.md)

**See also:**

- Documentation for the new Client API: [/docs/client](/docs/client.md)
- Documentation for the legacy API: [/docs/leagcy](/docs/leagcy.md)
**Full Documentation for Python API**

- New Client API like the OpenAI Python library: [/docs/client](/docs/client.md)
- Legacy API with Python modules: [/docs/leagcy](/docs/leagcy.md)
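
For orientation, here is a minimal, hedged sketch of the client interface referenced above. The `Client` import, the `chat.completions.create` and `images.generate` calls, and the `choices[0].message.content` / `data[0].url` accessors mirror snippets shown in this README; the model names and prompts are illustrative placeholders.

```py
# Minimal sketch of the new client API; model names and prompts are placeholders.
from g4f.client import Client

client = Client()

# Text generation (mirrors the chat completion example in this README).
chat_response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)
print(chat_response.choices[0].message.content)

# Image generation (mirrors the images.generate example above).
image_response = client.images.generate(
    model="dall-e-3",              # placeholder model name
    prompt="a white siamese cat",  # placeholder prompt
)
print(image_response.data[0].url)
```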

#### Web UI

@@ -425,72 +425,19 @@ set G4F_PROXY=http://host:port

## 🤝 Contribute

#### Create Provider with AI Tool
We welcome contributions from the community. Whether you're adding new providers or features, or simply fixing typos and making small improvements, your input is valued. Creating a pull request is all it takes: our co-pilot will handle the code review process. Once all changes have been addressed, we'll merge the pull request into the main branch and release the updates at a later time.

Run the `create_provider.py` script in your terminal:
```bash
python etc/tool/create_provider.py
```
1. Enter your name for the new provider.
2. Copy and paste the `cURL` command from your browser developer tools.
3. Let the AI create the provider for you.
4. Customize the provider according to your needs.
###### Guide: How do I create a new Provider?

#### Create Provider
- Read: [/docs/guides/create_provider](/docs/guides/create_provider.md)

1. Check out the current [list of potential providers](https://github.com/zukixa/cool-ai-stuff#ai-chat-websites), or find your own provider source!
2. Create a new file in [g4f/Provider](./g4f/Provider) with the name of the Provider.
3. Implement a class that extends [BaseProvider](./g4f/Provider/base_provider.py).
###### Guide: How can AI help me with writing code?

```py
from __future__ import annotations

from ..typing import AsyncResult, Messages
from .base_provider import AsyncGeneratorProvider

class HogeService(AsyncGeneratorProvider):
    url = "https://chat-gpt.com"
    working = True
    supports_gpt_35_turbo = True

    @classmethod
    async def create_async_generator(
        cls,
        model: str,
        messages: Messages,
        proxy: str = None,
        **kwargs
    ) -> AsyncResult:
        yield ""
```

4. Here you can adjust the settings; for example, if the website supports streaming, set `supports_stream` to `True`.
5. Write code to request the provider in `create_async_generator` and `yield` the response, even if it is a one-time response. Do not hesitate to look at other providers for inspiration.
6. Add the Provider Name in [`g4f/Provider/__init__.py`](./g4f/Provider/__init__.py)

```py
from .HogeService import HogeService

__all__ = [
    "HogeService",
]
```

7. You are done! Test the provider by calling it:

```py
import g4f

response = g4f.ChatCompletion.create(model='gpt-3.5-turbo', provider=g4f.Provider.PROVIDERNAME,
                                     messages=[{"role": "user", "content": "test"}], stream=g4f.Provider.PROVIDERNAME.supports_stream)

for message in response:
    print(message, flush=True, end='')
```
- Read: [/docs/guides/help_me](/docs/guides/help_me.md)

## 🙌 Contributors

A list of the contributors is available [here](https://github.com/xtekky/gpt4free/graphs/contributors)
A list of all contributors is available [here](https://github.com/xtekky/gpt4free/graphs/contributors)
The [`Vercel.py`](https://github.com/xtekky/gpt4free/blob/main/g4f/Provider/Vercel.py) file contains code from [vercel-llm-api](https://github.com/ading2210/vercel-llm-api) by [@ading2210](https://github.com/ading2210), which is licensed under the [GNU GPL v3](https://www.gnu.org/licenses/gpl-3.0.txt)
Top 1 Contributor: [@hlohaus](https://github.com/hlohaus)
@@ -154,10 +154,11 @@ response = client.chat.completions.create(
)
print(response.choices[0].message.content)
```
![Waterfall](/docs/waterfall.jpeg)
```
User: What are on this image?
```
![Waterfall](/docs/waterfall.jpeg)

```
Bot: There is a waterfall in the middle of a jungle. There is a rainbow over...
```
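
The opening of this vision request is cut off by the hunk; below is a hedged sketch of what the full call might look like. Only the `client.chat.completions.create(...)` call shape and the `response.choices[0].message.content` access come from the lines shown; the model choice and the `image=` keyword are assumptions.

```py
# Hedged sketch of the truncated vision example; the image parameter name is assumed.
import g4f
from g4f.client import Client

client = Client()
response = client.chat.completions.create(
    model=g4f.models.default,                 # assumed: any vision-capable model
    messages=[{"role": "user", "content": "What are on this image?"}],
    image=open("docs/waterfall.jpeg", "rb"),  # assumed keyword for attaching the image
)
print(response.choices[0].message.content)
```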

docs/guides/create_provider.md (new file, 62 lines)
@@ -0,0 +1,62 @@
#### Create Provider with AI Tool

Run the `create_provider` script in your terminal:
```bash
python -m etc.tool.create_provider
```
1. Enter your name for the new provider.
2. Copy and paste the `cURL` command from your browser developer tools.
3. Let the AI create the provider for you.
4. Customize the provider according to your needs.

#### Create Provider

1. Check out the current [list of potential providers](https://github.com/zukixa/cool-ai-stuff#ai-chat-websites), or find your own provider source!
2. Create a new file in [g4f/Provider](/g4f/Provider) with the name of the Provider.
3. Implement a class that extends [BaseProvider](/g4f/providers/base_provider.py).

```py
from __future__ import annotations

from ..typing import AsyncResult, Messages
from .base_provider import AsyncGeneratorProvider

class HogeService(AsyncGeneratorProvider):
    url = "https://chat-gpt.com"
    working = True
    supports_gpt_35_turbo = True

    @classmethod
    async def create_async_generator(
        cls,
        model: str,
        messages: Messages,
        proxy: str = None,
        **kwargs
    ) -> AsyncResult:
        yield ""
```

4. Here you can adjust the settings; for example, if the website supports streaming, set `supports_stream` to `True`.
5. Write code to request the provider in `create_async_generator` and `yield` the response, even if it is a one-time response (a request sketch follows after these steps). Do not hesitate to look at other providers for inspiration.
6. Add the Provider Import in [`g4f/Provider/__init__.py`](./g4f/Provider/__init__.py)

```py
from .HogeService import HogeService

__all__ = [
    "HogeService",
]
```

7. You are done! Test the provider by calling it:

```py
import g4f

response = g4f.ChatCompletion.create(model='gpt-3.5-turbo', provider=g4f.Provider.PROVIDERNAME,
                                     messages=[{"role": "user", "content": "test"}], stream=g4f.Provider.PROVIDERNAME.supports_stream)

for message in response:
    print(message, flush=True, end='')
```
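
To make step 5 above more concrete, here is a hedged sketch of a request implementation inside `create_async_generator`. The endpoint path, payload shape, and streaming format are invented for illustration; a real provider will differ, so adapt them from the `cURL` command you captured.

```py
# Hedged sketch for step 5: request the provider and yield the response.
# The endpoint, payload, and response format below are illustrative only.
from __future__ import annotations

from aiohttp import ClientSession

from ..typing import AsyncResult, Messages
from .base_provider import AsyncGeneratorProvider

class HogeService(AsyncGeneratorProvider):
    url = "https://chat-gpt.com"
    working = True
    supports_gpt_35_turbo = True

    @classmethod
    async def create_async_generator(
        cls,
        model: str,
        messages: Messages,
        proxy: str = None,
        **kwargs
    ) -> AsyncResult:
        headers = {"Content-Type": "application/json"}
        async with ClientSession(headers=headers) as session:
            data = {"model": model, "messages": messages}  # payload shape is assumed
            async with session.post(f"{cls.url}/api/chat", json=data, proxy=proxy) as response:
                response.raise_for_status()
                # Stream the answer chunk by chunk; a single yield would also be fine.
                async for chunk in response.content.iter_any():
                    if chunk:
                        yield chunk.decode(errors="ignore")
```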

docs/guides/help_me.md (new file, 106 lines)

@@ -0,0 +1,106 @@
### Guide: How can AI help me with writing code?

🤖 Ever dreamt of writing code at lightning speed, effortlessly crafting clean, bug-free functionalities? Welcome to the age of AI-powered coding, where your imagination merges seamlessly with the machine's precision. This guide unveils 4 powerful ways AI can become your secret weapon in the coding world, saving you time, resources, and frustration. 😎

- Language Agnostic? No Problem! Rewrite Code with Ease
- Improving Written Code: Polish Your Code Like a Pro
- Demystifying the Unreadable: Deobfuscating Unreadable Code
- Custom Functions: Your AI-Powered Genie in a Bottle

With these techniques you can streamline your coding process, improve code quality, and enhance productivity.

#### 1. Rewriting Code from One Language to Another
You can leverage AI to translate code from one programming language to another. For instance, you can easily convert JavaScript code to Python using the following approach:

**Prompt:**
```
Rewrite this JavaScript code to Python code. Use the aiohttp package:
```
```javascript
async function fetch_data(url) {
    const response = await fetch(url);
    const data = await response.json();
    return data;
}
```

**AI Response:**
```python
import aiohttp

async def fetch_data(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            data = await response.json()
            return data
```
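
As a quick sanity check, the translated coroutine above can be run with `asyncio`; the URL here is a placeholder.

```python
import asyncio

# Placeholder URL; reuses the fetch_data coroutine from the AI response above.
result = asyncio.run(fetch_data("https://api.example.com/data"))
print(result)
```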

#### 2. Improving Written Code
You can ask the AI to enhance your code by adding comments, docstrings, and type hints for better readability and maintainability.

**Example Usage:**
```
Improve this Python code, add Google docstrings and type hints.
Respond with the complete code and don't use placeholders.
Keep the comments:
```
```python
def calculate_area(radius):
    # Calculate the area of a circle
    return 3.14 * radius ** 2
```

**AI Response:**
```python
def calculate_area(radius: float) -> float:
    """Calculate the area of a circle.

    Args:
        radius (float): The radius of the circle.

    Returns:
        float: The area of the circle.
    """
    return 3.14 * radius ** 2
```

#### 3. Deobfuscating Unreadable Code
AI can help make obfuscated or unreadable code more understandable and maintainable.

**Example Usage:**
```
Deobfuscate this JavaScript code. Improve the readability:
```

```javascript
(function() {
    var a = 'Hello', b = 'World';
    console.log(a + ' ' + b);
})();
```

**AI Response:**
```javascript
(function() {
    var greeting = 'Hello';
    var subject = 'World';
    console.log(greeting + ' ' + subject);
})();
```

#### 4. Writing Custom Functions
You can also ask the AI to generate custom functions or help you with specific coding issues.

**Example Usage:**
```
Write a function that returns the sum of two numbers.
```

**AI Response:**
```python
def add_numbers(a, b):
    """Add two numbers together."""
    return a + b
```
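
Since this guide ships with g4f, the prompts above can also be sent through the library itself. A hedged sketch, assuming automatic provider selection and the `g4f.ChatCompletion.create` call documented in the project README:

```python
# Hedged sketch: run one of the prompts above through g4f's automatic provider selection.
import g4f

prompt = "Write a function that returns the sum of two numbers."
response = g4f.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}],
)
print(response)
```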

These are just a few ways AI can revolutionize your coding experience. As AI technology continues to evolve, the possibilities are endless. So, embrace the future, unlock the power of AI, and watch your coding potential soar! 👷‍♂️

Binary file not shown (before size: 13 KiB).
docs/guides/phone.md

@@ -21,9 +21,9 @@ Running Python applications on your smartphone is possible with specialized apps
   - In the app settings for Pydroid, disable power-saving mode and ensure that the option to pause when not in use is also disabled. This ensures uninterrupted operation of your Python scripts.

4. **Install Required Packages:**
   - Open Pip within the Pydroid app and install the necessary packages by executing the following commands:
   - Open Pip within the Pydroid app and install these necessary packages:
   ```
   pip install g4f flask pillow beautifulsoup4
   g4f flask pillow beautifulsoup4
   ```

5. **Create a New Python Script:**
g4f/Provider/GeminiPro.py

@@ -7,7 +7,7 @@ from aiohttp import ClientSession
from ..typing import AsyncResult, Messages, ImageType
from .base_provider import AsyncGeneratorProvider, ProviderModelMixin
from ..image import to_bytes, is_accepted_format

from ..errors import MissingAuthError

class GeminiPro(AsyncGeneratorProvider, ProviderModelMixin):
    url = "https://ai.google.dev"
@@ -29,7 +29,8 @@ class GeminiPro(AsyncGeneratorProvider, ProviderModelMixin):
    ) -> AsyncResult:
        model = "gemini-pro-vision" if not model and image else model
        model = cls.get_model(model)
        api_key = api_key if api_key else kwargs.get("access_token")
        if not api_key:
            raise MissingAuthError('Missing "api_key" for auth')
        headers = {
            "Content-Type": "application/json",
        }
@@ -53,13 +54,13 @@ class GeminiPro(AsyncGeneratorProvider, ProviderModelMixin):
            })
        data = {
            "contents": contents,
            # "generationConfig": {
            # "stopSequences": kwargs.get("stop"),
            # "temperature": kwargs.get("temperature"),
            # "maxOutputTokens": kwargs.get("max_tokens"),
            # "topP": kwargs.get("top_p"),
            # "topK": kwargs.get("top_k"),
            # }
            "generationConfig": {
                "stopSequences": kwargs.get("stop"),
                "temperature": kwargs.get("temperature"),
                "maxOutputTokens": kwargs.get("max_tokens"),
                "topP": kwargs.get("top_p"),
                "topK": kwargs.get("top_k"),
            }
        }
        async with session.post(url, params={"key": api_key}, json=data, proxy=proxy) as response:
            if not response.ok:
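
Given the new `MissingAuthError`, callers now have to supply a key. A hedged usage sketch, assuming the provider is exposed as `g4f.Provider.GeminiPro` and that extra keyword arguments such as `api_key` are forwarded to the provider (the model name is a placeholder):

```python
# Hedged sketch: supply an API key so GeminiPro does not raise MissingAuthError.
import g4f
from g4f.errors import MissingAuthError

try:
    response = g4f.ChatCompletion.create(
        model="gemini-pro",                   # placeholder model name
        provider=g4f.Provider.GeminiPro,
        messages=[{"role": "user", "content": "Hello"}],
        api_key="YOUR_GOOGLE_AI_STUDIO_KEY",  # or access_token=..., per the fallback above
    )
    print(response)
except MissingAuthError as error:
    print(f"Authentication required: {error}")
```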