branch: externals/minuet
commit 83aa77aaa549b7a6c46a02e26acb3d9f91d2a0d3
Author: Milan Glacier <d...@milanglacier.com>
Commit: Milan Glacier <d...@milanglacier.com>

    doc: update README.
---
 README.md | 11 +++++++++++
 1 file changed, 11 insertions(+)

diff --git a/README.md b/README.md
index 248a8e71ee..edafde1f62 100644
--- a/README.md
+++ b/README.md
@@ -25,6 +25,7 @@
   - [Gemini](#gemini)
   - [OpenAI-compatible](#openai-compatible)
   - [OpenAI-FIM-Compatible](#openai-fim-compatible)
+    - [Non-OpenAI-FIM-Compatible APIs](#non-openai-fim-compatible-apis)
 - [Troubleshooting](#troubleshooting)
 - [Contributions](#contributions)
 - [Acknowledgement](#acknowledgement)
@@ -573,6 +574,10 @@ For example, you can set the `end_point` to
 
 <details>
 
+Refer to the
+[Completions Legacy](https://platform.openai.com/docs/api-reference/completions)
+section of the OpenAI documentation for details.
+
 Additionally, for Ollama users, it is essential to verify whether the model's
 template supports FIM completion. For example, qwen2.5-coder offers FIM support,
 as suggested in its
@@ -609,6 +614,12 @@ thereafter.
 
 </details>
 
+### Non-OpenAI-FIM-Compatible APIs
+
+For providers like **DeepInfra FIM**
+(`https://api.deepinfra.com/v1/inference/`), refer to [recipes.md](./recipes.md)
+for advanced configuration instructions.
+
 # Troubleshooting
 
 If your setup failed, there are two most likely reasons: