branch: externals/llm
commit 5bb3dadf25c323a93fcbc146a40887026c2a2020
Author: Roman Scherer <ro...@burningswell.com>
Commit: Roman Scherer <ro...@burningswell.com>

    Handle empty choices
    
    When streaming, `choices` can be a zero-length array. This check
    prevents `llm-openai--get-partial-chat-response` from signaling an
    args-out-of-range error in that case.
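    
    The guard can be exercised from `ielm` or `M-x eval-expression`;
    the payload below is an illustrative sketch, not an actual OpenAI
    stream chunk:
    
    ```elisp
    ;; A chunk whose `choices' vector is empty now yields nil instead of
    ;; signaling an args-out-of-range error on (aref choices 0).
    (let* ((response '((choices . [])))  ; empty choices, as seen mid-stream
           (choices (assoc-default 'choices response))
           (delta (when (> (length choices) 0)
                    (assoc-default 'delta (aref choices 0)))))
      delta)
    ;; => nil
    ```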
---
 llm-openai.el | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/llm-openai.el b/llm-openai.el
index 71280cacd2..5c831766e7 100644
--- a/llm-openai.el
+++ b/llm-openai.el
@@ -258,7 +258,9 @@ PROMPT is the prompt that needs to be updated with the response."
   "Return the text in the partial chat response from RESPONSE.
 RESPONSE can be nil if the response is complete."
   (when response
-    (let* ((delta (assoc-default 'delta (aref (assoc-default 'choices response) 0)))
+    (let* ((choices (assoc-default 'choices response))
+           (delta (when (> (length choices) 0)
+                    (assoc-default 'delta (aref choices 0))))
            (content-or-call (or (assoc-default 'content delta)
                                 (assoc-default 'tool_calls delta))))
       (when content-or-call
