branch: externals/llm
commit 8fbb242585a3f06505f237fe597289fc5f1ca308
Author: Andrew Hyatt <ahy...@gmail.com>
Commit: GitHub <nore...@github.com>

    Default Open AI compatible models to "unset" (#177)
    
    This is because some providers don't need a model, but we should always
    pass one since the model field is required by the API.
    
    This will fix https://github.com/ahyatt/llm/issues/176.
---
 NEWS.org      | 2 ++
 README.org    | 4 +++-
 llm-openai.el | 7 ++++---
 3 files changed, 9 insertions(+), 4 deletions(-)

diff --git a/NEWS.org b/NEWS.org
index d02a7f7576..73908237fd 100644
--- a/NEWS.org
+++ b/NEWS.org
@@ -1,3 +1,5 @@
+* Version 0.24.2
+- Fix issue with some Open AI compatible providers needing models to be passed by giving a non-nil default.
 * Version 0.24.1
 - Fix issue with Ollama incorrect requests when passing non-standard params.
 * Version 0.24.0
diff --git a/README.org b/README.org
index a50f3c0a3b..2455d53338 100644
--- a/README.org
+++ b/README.org
@@ -48,7 +48,9 @@ You can set up with ~make-llm-openai~, with the following parameters:
 - ~:embedding-model~: A model name from [[https://platform.openai.com/docs/guides/embeddings/embedding-models][list of Open AI's embedding model names.]]  This is optional, and will default to a reasonable model.
 ** Open AI Compatible
 There are many Open AI compatible APIs and proxies of Open AI.  You can set up one with ~make-llm-openai-compatible~, with the following parameter:
-- ~:url~, the URL of leading up to the command ("embeddings" or "chat/completions").  So, for example, "https://api.openai.com/v1/" is the URL to use Open AI (although if you wanted to do that, just use ~make-llm-openai~ instead.
+1) ~:url~, the URL leading up to the command ("embeddings" or "chat/completions").  So, for example, "https://api.openai.com/v1/" is the URL to use Open AI (although if you wanted to do that, just use ~make-llm-openai~ instead).
+2) ~:chat-model~: The chat model that is supported by the provider.  Some providers don't need a model to be set, but still require it in the API, so we default to "unset".
+3) ~:embedding-model~: An embedding model name that is supported by the provider.  This also defaults to "unset".
 ** Azure's Open AI
 Microsoft Azure has an Open AI integration, although it doesn't support everything Open AI does, such as tool use.  You can set it up with ~make-llm-azure~, with the following parameter:
 - ~:url~, the endpoint URL, such as "https://docs-test-001.openai.azure.com/".
diff --git a/llm-openai.el b/llm-openai.el
index ccf2286a36..d08139e884 100644
--- a/llm-openai.el
+++ b/llm-openai.el
@@ -55,8 +55,8 @@ will use a reasonable default."
   key (chat-model "gpt-4o") (embedding-model "text-embedding-3-small"))
 
 (cl-defstruct (llm-openai-compatible (:include llm-openai
-                                               (chat-model nil)
-                                               (embedding-model nil)))
+                                               (chat-model "unset")
+                                               (embedding-model "unset")))
   "A structure for other APIs that use the Open AI's API.
 
 URL is the URL to use for the API, up to the command.  So, for
@@ -377,7 +377,8 @@ RESPONSE can be nil if the response is complete."
 
 (cl-defmethod llm-capabilities ((provider llm-openai-compatible))
   (append '(streaming model-list)
-          (when (llm-openai-embedding-model provider)
+          (when (and (llm-openai-embedding-model provider)
+                     (not (equal "unset" (llm-openai-embedding-model provider))))
             '(embeddings embeddings-batch))
           (when-let* ((model (llm-models-match (llm-openai-chat-model provider))))
             (llm-model-capabilities model))))
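
For context, a sketch of how a provider might be configured after this change.  The URL and model names below are illustrative placeholders, not real endpoints:

```emacs-lisp
(require 'llm-openai)

;; With no :chat-model given, the slot now defaults to "unset" rather
;; than nil, so the required "model" field is always sent to the API.
(defvar my-local-provider
  (make-llm-openai-compatible
   :url "http://localhost:8080/v1/"))

;; A provider whose backend does distinguish models can still name
;; them explicitly:
(defvar my-named-provider
  (make-llm-openai-compatible
   :url "http://localhost:8080/v1/"
   :chat-model "my-chat-model"
   :embedding-model "my-embedding-model"))
```

Note that because of the `llm-capabilities` change in this patch, a provider whose embedding model is still "unset" will not advertise the ~embeddings~ or ~embeddings-batch~ capabilities.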
