-rw-r--r--  README.md                 | 2
-rw-r--r--  docs/interference-api.md  | 6
-rw-r--r--  g4f/Provider/GeminiPro.py | 6
3 files changed, 11 insertions, 3 deletions
diff --git a/README.md b/README.md
index ed8fd61d..2de3d318 100644
--- a/README.md
+++ b/README.md
@@ -288,7 +288,7 @@ To utilize the OpenaiChat provider, a .har file is required from https://chatgpt
##### Storing the .HAR File
-- Place the exported .har file in the `./har_and_cookies` directory if you are using Docker. Alternatively, you can store it in any preferred location within your current working directory.
+- Place the exported .har file in the `./har_and_cookies` directory if you are using Docker. Alternatively, if you are using Python from a terminal, you can store it in a `./har_and_cookies` directory within your current working directory.
Note: Ensure that your .har file is stored securely, as it may contain sensitive information.
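
For the non-Docker option above, a minimal Python sketch of the same step is shown below; the export path `~/Downloads/chatgpt.com.har` is only an assumed example of where a browser might have saved the file.

```python
# Sketch: copy an exported .har file into ./har_and_cookies under the
# current working directory (the non-Docker layout described above).
# The source path is an assumption - point it at your actual export.
from pathlib import Path
import shutil

har_dir = Path.cwd() / "har_and_cookies"
har_dir.mkdir(exist_ok=True)
shutil.copy(Path.home() / "Downloads" / "chatgpt.com.har", har_dir)
```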
diff --git a/docs/interference-api.md b/docs/interference-api.md
index 2e18e7b5..b8e38fae 100644
--- a/docs/interference-api.md
+++ b/docs/interference-api.md
@@ -46,7 +46,11 @@ python -m g4f.api.run
```
**Once running, the API will be accessible at:** `http://localhost:1337/v1`
-
+
+**(Advanced) Bind to custom port:**
+```bash
+python -m g4f.cli api --bind "0.0.0.0:2400"
+```
## Using the Interference API
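
Since the hunk above only changes the bind address, a quick sanity check against the new port can look like the sketch below. It assumes the Interference API exposes the OpenAI-compatible `/v1/chat/completions` route on that port; the model name `gpt-4o-mini` is a placeholder.

```python
# Sketch: query the Interference API started with
# `python -m g4f.cli api --bind "0.0.0.0:2400"`.
import requests

resp = requests.post(
    "http://localhost:2400/v1/chat/completions",
    json={
        "model": "gpt-4o-mini",  # placeholder model name
        "messages": [{"role": "user", "content": "Say hello"}],
    },
    timeout=60,
)
print(resp.json()["choices"][0]["message"]["content"])
```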
diff --git a/g4f/Provider/GeminiPro.py b/g4f/Provider/GeminiPro.py
index 06bf69ee..e7174c59 100644
--- a/g4f/Provider/GeminiPro.py
+++ b/g4f/Provider/GeminiPro.py
@@ -104,4 +104,8 @@ class GeminiPro(AsyncGeneratorProvider, ProviderModelMixin):
                            lines.append(chunk)
                else:
                    data = await response.json()
-                    yield data["candidates"][0]["content"]["parts"][0]["text"]
\ No newline at end of file
+                    candidate = data["candidates"][0]
+                    if candidate["finishReason"] == "STOP":
+                        yield candidate["content"]["parts"][0]["text"]
+                    else:
+                        yield candidate["finishReason"] + ' ' + str(candidate["safetyRatings"])
\ No newline at end of file
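
To make the new non-streaming branch easier to follow, here is a self-contained sketch of the two cases it distinguishes. The dictionaries are hand-written stand-ins shaped like Gemini `generateContent` responses, not captured output.

```python
# Hand-written stand-ins for the two response shapes the patch handles
# (field names follow the Gemini generateContent API; values are illustrative).
completed = {
    "candidates": [{
        "finishReason": "STOP",
        "content": {"parts": [{"text": "Hello from Gemini."}]},
    }]
}
blocked = {
    "candidates": [{
        "finishReason": "SAFETY",
        "safetyRatings": [{"category": "HARM_CATEGORY_HARASSMENT", "probability": "HIGH"}],
    }]
}

def extract(data: dict) -> str:
    """Mirror the patched branch: return the text on STOP, otherwise
    report why generation stopped along with the safety ratings."""
    candidate = data["candidates"][0]
    if candidate["finishReason"] == "STOP":
        return candidate["content"]["parts"][0]["text"]
    return candidate["finishReason"] + ' ' + str(candidate["safetyRatings"])

print(extract(completed))  # Hello from Gemini.
print(extract(blocked))    # SAFETY [{'category': 'HARM_CATEGORY_HARASSMENT', ...}]
```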