[elpa] externals/xeft 811806d23f 1/3: ; * README.md: Fix typo.

2023-09-15 Thread ELPA Syncer
branch: externals/xeft
commit 811806d23f0ec142368188f001cc044f226653e5
Author: Levin Du 
Commit: Yuan Fu 

; * README.md: Fix typo.
---
 README.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index d91982a69b..3db8ea7663 100644
--- a/README.md
+++ b/README.md
@@ -127,9 +127,9 @@ to do it.
 
 ```emacs-lisp
 ;; Don't follow symlinks.
-(setq xeft-recusive t)
+(setq xeft-recursive t)
 ;; Follow symlinks.
-(setq xeft-recusive 'follow-symlinks)
+(setq xeft-recursive 'follow-symlinks)
 ```
 
 **How to make the preview pane to show up automatically?**



[elpa] externals/xeft ebaa0d493f 2/3: Scroll to the search phrase in preview window (issue#29)

2023-09-15 Thread ELPA Syncer
branch: externals/xeft
commit ebaa0d493ff47243dfb68190e156e2f62bb9ca0c
Author: Yuan Fu 
Commit: Yuan Fu 

Scroll to the search phrase in preview window (issue#29)

* xeft.el (xeft--preview-file): Scroll to the first search phrase.
---
 xeft.el | 5 -
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/xeft.el b/xeft.el
index 8783894f96..e6f151c6e5 100644
--- a/xeft.el
+++ b/xeft.el
@@ -500,7 +500,10 @@ If SELECT is non-nil, select the buffer after displaying 
it."
 (if (and (window-live-p xeft--preview-window)
  (not (eq xeft--preview-window (selected-window
 (with-selected-window xeft--preview-window
-  (switch-to-buffer buffer))
+  (switch-to-buffer buffer)
+  (when keyword-list
+(let ((case-fold-search t))
+  (search-forward (car keyword-list) nil t
   (setq xeft--preview-window
 (display-buffer
  buffer '((display-buffer-use-some-window
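For context, the `case-fold-search' binding in the new code makes the lookup case-insensitive. A standalone sketch of that behaviour (not part of the commit):

```emacs-lisp
;; `search-forward' honours `case-fold-search'; binding it to t lets a
;; lowercase query match mixed-case text, as in the preview buffer.
(with-temp-buffer
  (insert "Hello World")
  (goto-char (point-min))
  (let ((case-fold-search t))
    (search-forward "world" nil t)))   ; => 12, i.e. point after the match
```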



[elpa] externals/xeft updated (6de2d038e9 -> 32735a2a63)

2023-09-15 Thread ELPA Syncer
elpasync pushed a change to branch externals/xeft.

  from  6de2d038e9 ; * README.md: Mention search phrase threshold.
   new  811806d23f ; * README.md: Fix typo.
   new  ebaa0d493f Scroll to the search phrase in preview window (issue#29)
   new  32735a2a63 ; * xeft.el (xeft-default-title): Fix typo.


Summary of changes:
 README.md | 4 ++--
 xeft.el   | 7 +--
 2 files changed, 7 insertions(+), 4 deletions(-)



[elpa] externals/xeft 32735a2a63 3/3: ; * xeft.el (xeft-default-title): Fix typo.

2023-09-15 Thread ELPA Syncer
branch: externals/xeft
commit 32735a2a631fc2957b79cc65ad851546b7d572df
Author: Yuan Fu 
Commit: Yuan Fu 

; * xeft.el (xeft-default-title): Fix typo.
---
 xeft.el | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/xeft.el b/xeft.el
index e6f151c6e5..cc3f96685b 100644
--- a/xeft.el
+++ b/xeft.el
@@ -597,7 +597,7 @@ title."
   (let ((bol (point)) title)
 (end-of-line)
 (setq title (buffer-substring-no-properties bol (point)))
-(if (eq title "")
+(if (equal title "")
 (file-name-base file)
   title)))
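The fix matters because `eq' tests object identity, not string contents. A minimal illustration (not from the commit):

```emacs-lisp
;; `eq' compares object identity and is therefore unreliable for
;; strings; `equal' compares contents, which is what the title check needs.
(eq (concat "a" "b") "ab")      ; => nil, two distinct string objects
(equal (concat "a" "b") "ab")   ; => t
```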
 



[nongnu] elpa/eat 3fcf128840: ; * eat.el (eat-mode): Fix mode line

2023-09-15 Thread ELPA Syncer
branch: elpa/eat
commit 3fcf128840eeac844aed0962f03af1fdf14c7c3f
Author: Akib Azmain Turja 
Commit: Akib Azmain Turja 

; * eat.el (eat-mode): Fix mode line
---
 eat.el | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/eat.el b/eat.el
index 2b67ea9193..556b9e283f 100644
--- a/eat.el
+++ b/eat.el
@@ -6256,7 +6256,7 @@ mouse-3: Switch to char mode"
 '(" "
   (:propertize
(:eval
-(when-let* (((eat--terminal))
+(when-let* ((eat--terminal)
 (title (eat-term-title eat--terminal))
 ((not (string-empty-p title
   (format "(%s)" (string-replace "%" "%%"



[elpa] externals/denote dff73d46c5: Move "Denote" menu on menu-bar to the end after "Tools"

2023-09-15 Thread ELPA Syncer
branch: externals/denote
commit dff73d46c598926e44d477d77c8ceadbbb7c3ed9
Author: Noboru Ota 
Commit: Protesilaos Stavrou 

Move "Denote" menu on menu-bar to the end after "Tools"

Hi Prot and all in the list.

I'm happy to see that Prot has got his off-grid electricity cabin up.
Hope everything is working as you expect it to, Prot.

Here is a patch. I have got this method from Charles Choi's blog article, 
"Using
Bookmarks in Emacs like you do in Web Browsers" [1].


[1]:http://yummymelon.com/devnull/using-bookmarks-in-emacs-like-you-do-in-web-browsers.html

– nobiot

>From ec217494621d6d70f28c9c3a1eb1b0539e6e92b7 Mon Sep 17 00:00:00 2001
From: Noboru Ota 
Date: Fri, 15 Sep 2023 09:18:08 +0200
Subject: [PATCH] Move "Denote" menu on menu-bar to the end of global-map 
after
 Tools
---
 denote.el | 6 +-
 1 file changed, 5 insertions(+), 1 deletion(-)

diff --git a/denote.el b/denote.el
index 43e9007647..782737b94d 100644
--- a/denote.el
+++ b/denote.el
@@ -3489,10 +3489,14 @@ This command is meant to be used from a Dired buffer."
  :selected (bound-and-true-p denote-dired-mode)])
   "Contents of the Denote menu.")
 
-(easy-menu-define denote-global-menu global-map
+(easy-menu-define denote-global-menu nil
   "Menu with all Denote commands, each available in the right context."
   denote--menu-contents)
 
+;; Add Denote menu at the end of global-map after Tools
+(easy-menu-add-item global-map '(menu-bar)
+denote-global-menu)
+
 (defun denote-context-menu (menu _click)
   "Populate MENU with Denote commands at CLICK."
   (define-key menu [denote-separator] menu-bar-separator)
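The two-step pattern in the patch (define the menu without installing it, then append it to the menu bar) is plain easy-menu usage. A toy sketch with hypothetical names:

```emacs-lisp
(require 'easymenu)

;; Passing nil instead of a keymap only defines the menu variable...
(easy-menu-define my-demo-menu nil
  "A demo menu."
  '("Demo"
    ["Say hi" (message "hi") t]))

;; ...and `easy-menu-add-item' then installs it in the menu bar,
;; appended after the existing entries (i.e. after Tools).
(easy-menu-add-item global-map '(menu-bar) my-demo-menu)
```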



[elpa] externals/org 1d35ebd93c 2/5: test-org-clok/org-clock-update-time-maybe: Fix test for non-English LANG

2023-09-15 Thread ELPA Syncer
branch: externals/org
commit 1d35ebd93c26a511f42555e1eb54fc7f8028a124
Author: Ihor Radchenko 
Commit: Ihor Radchenko 

test-org-clok/org-clock-update-time-maybe: Fix test for non-English LANG

* testing/lisp/test-org-clock.el 
(test-org-clok/org-clock-update-time-maybe):
Do not assert English day names.

Reported-by: em...@supporter.mailer.me
---
 testing/lisp/test-org-clock.el | 5 -
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/testing/lisp/test-org-clock.el b/testing/lisp/test-org-clock.el
index ff547a0942..44c62e7bc8 100644
--- a/testing/lisp/test-org-clock.el
+++ b/testing/lisp/test-org-clock.el
@@ -122,7 +122,10 @@ the buffer."
   "Test `org-clock-update-time-maybe' specifications."
   (should
(equal
-"CLOCK: [2023-04-29 Sat 00:00]--[2023-05-04 Thu 01:00] => 121:00"
+(format
+ "CLOCK: [2023-04-29 %s 00:00]--[2023-05-04 %s 01:00] => 121:00"
+ (org-test-get-day-name "Sat")
+ (org-test-get-day-name "Thu"))
 (org-test-with-temp-text
 "CLOCK: [2023-04-29 Sat 00:00]--[2023-05-04 Thu 01:00]"
   (should (org-clock-update-time-maybe))



[elpa] externals/org d70c1200f7 3/5: ob-fortran.el: Fix name of caller in documentation

2023-09-15 Thread ELPA Syncer
branch: externals/org
commit d70c1200f70209db8fe6868f240492ce0e8ea190
Author: Gerard Vermeulen 
Commit: Ihor Radchenko 

ob-fortran.el: Fix name of caller in documentation

* lisp/ob-fortran.el (org-babel-execute:fortran): Fix name of caller
in documentation and conform to "rules" in sibling ob-XXX.el files.
---
 lisp/ob-fortran.el | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/lisp/ob-fortran.el b/lisp/ob-fortran.el
index 7075d8a9fe..fabc6a47a9 100644
--- a/lisp/ob-fortran.el
+++ b/lisp/ob-fortran.el
@@ -51,8 +51,8 @@
   :type  'string)
 
 (defun org-babel-execute:fortran (body params)
-  "Execute fortran BODY according to PARAMS.
-This function should only be called by `org-babel-execute:fortran'."
+  "Execute Fortran BODY according to PARAMS.
+This function is called by `org-babel-execute-src-block'."
   (let* ((tmp-src-file (org-babel-temp-file "fortran-src-" ".F90"))
  (tmp-bin-file (org-babel-temp-file "fortran-bin-" org-babel-exeext))
  (cmdline (cdr (assq :cmdline params)))



[elpa] externals/org fd1418dadd 4/5: * lisp/ob-gnuplot.el: Document all the function arguments

2023-09-15 Thread ELPA Syncer
branch: externals/org
commit fd1418dadd85a9884deea9a79f675de9d7d45937
Author: Ihor Radchenko 
Commit: Ihor Radchenko 

* lisp/ob-gnuplot.el: Document all the function arguments

(org-babel-execute:gnuplot):
(org-babel-variable-assignments:gnuplot):
---
 lisp/ob-gnuplot.el | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/lisp/ob-gnuplot.el b/lisp/ob-gnuplot.el
index 91e4bed89a..642ad6fa7c 100644
--- a/lisp/ob-gnuplot.el
+++ b/lisp/ob-gnuplot.el
@@ -197,7 +197,7 @@ code."
 body))
 
 (defun org-babel-execute:gnuplot (body params)
-  "Execute a block of Gnuplot code.
+  "Execute Gnuplot BODY according to PARAMS.
 This function is called by `org-babel-execute-src-block'."
   (org-require-package 'gnuplot)
   (let ((session (cdr (assq :session params)))
@@ -252,7 +252,8 @@ This function is called by `org-babel-execute-src-block'."
   buffer)))
 
 (defun org-babel-variable-assignments:gnuplot (params)
-  "Return list of gnuplot statements assigning the block's variables."
+  "Return list of gnuplot statements assigning the block's variables.
+PARAMS is src block parameters alist defining variable assignments."
   (mapcar
(lambda (pair) (format "%s = \"%s\"" (car pair) (cdr pair)))
(org-babel-gnuplot-process-vars params)))



[elpa] externals/org e90a8a69a7 5/5: org-element-cache: Log recovered persisted elements during loading

2023-09-15 Thread ELPA Syncer
branch: externals/org
commit e90a8a69a7fa2d83c995b5d32bc0b24a68218ed3
Author: Ihor Radchenko 
Commit: Ihor Radchenko 

org-element-cache: Log recovered persisted elements during loading

* lisp/org-element.el (org-element--cache-persist-before-read):
(org-element--cache-persist-after-read): Record diagnostics messages
when loading persistent cache.
---
 lisp/org-element.el | 11 +--
 1 file changed, 9 insertions(+), 2 deletions(-)

diff --git a/lisp/org-element.el b/lisp/org-element.el
index 40bb294795..37c2d201f3 100644
--- a/lisp/org-element.el
+++ b/lisp/org-element.el
@@ -7368,15 +7368,19 @@ The element is: %S\n The real element is: %S\n Cache 
around :begin:\n%S\n%S\n%S"
 (defun org-element--cache-persist-before-read (container &optional associated)
   "Avoid reading cache before Org mode is loaded."
   (when (equal container '(elisp org-element--cache))
+(org-element--cache-log-message "Loading persistent cache for %s" 
(plist-get associated :file))
 (if (not (and (plist-get associated :file)
 (get-file-buffer (plist-get associated :file
-'forbid
+(progn
+  (org-element--cache-log-message "%s does not have a buffer: not 
loading cache" (plist-get associated :file))
+  'forbid)
   (with-current-buffer (get-file-buffer (plist-get associated :file))
 (unless (and org-element-use-cache
  org-element-cache-persistent
  (derived-mode-p 'org-mode)
  (equal (secure-hash 'md5 (current-buffer))
 (plist-get associated :hash)))
+  (org-element--cache-log-message "Cache is not current (or 
persistence is disabled) in %s" (plist-get associated :file))
   'forbid)
 
 (defun org-element--cache-persist-after-read (container &optional associated)
@@ -7393,7 +7397,10 @@ The element is: %S\n The real element is: %S\n Cache 
around :begin:\n%S\n%S\n%S"
(lambda (el2)
  (unless (org-element-type-p el2 'plain-text)
(org-element-put-property el2 :buffer (current-buffer
-   nil nil nil 'with-affiliated 'no-undefer))
+   nil nil nil 'with-affiliated 'no-undefer)
+ (org-element--cache-log-message
+  "Recovering persistent cached element: %S"
+  (org-element--format-element el)))
org-element--cache)
   (setq-local org-element--cache-size (avl-tree-size 
org-element--cache)))
 (when (and (equal container '(elisp org-element--headline-cache)) 
org-element--headline-cache)



[elpa] externals/org updated (765a84ea25 -> e90a8a69a7)

2023-09-15 Thread ELPA Syncer
elpasync pushed a change to branch externals/org.

  from  765a84ea25 * lisp/ob-fortran.el: Document all the function arguments
   new  9eaca51c51 * lisp/org-clock.el (org-clock--translate): Clarify "L" 
and "ALL" terms
   new  1d35ebd93c test-org-clok/org-clock-update-time-maybe: Fix test for 
non-English LANG
   new  d70c1200f7 ob-fortran.el: Fix name of caller in documentation
   new  fd1418dadd * lisp/ob-gnuplot.el: Document all the function arguments
   new  e90a8a69a7 org-element-cache: Log recovered persisted elements 
during loading


Summary of changes:
 lisp/ob-fortran.el |  4 ++--
 lisp/ob-gnuplot.el |  5 +++--
 lisp/org-clock.el  |  4 
 lisp/org-element.el| 11 +--
 testing/lisp/test-org-clock.el |  5 -
 5 files changed, 22 insertions(+), 7 deletions(-)



[elpa] externals/org 9eaca51c51 1/5: * lisp/org-clock.el (org-clock--translate): Clarify "L" and "ALL" terms

2023-09-15 Thread ELPA Syncer
branch: externals/org
commit 9eaca51c510905e0cf4884aa52e4e7c2a4484d41
Author: Ihor Radchenko 
Commit: Ihor Radchenko 

* lisp/org-clock.el (org-clock--translate): Clarify "L" and "ALL" terms

Explain in a comment what "L" And "ALL" terms mean in the context of
clock table.

Reported-by: em...@supporter.mailer.me
---
 lisp/org-clock.el | 4 
 1 file changed, 4 insertions(+)

diff --git a/lisp/org-clock.el b/lisp/org-clock.el
index da19acef6e..ffd911c0a0 100644
--- a/lisp/org-clock.el
+++ b/lisp/org-clock.el
@@ -567,6 +567,10 @@ of a different task.")
 Assume S in the English term to translate.  Return S as-is if it
 cannot be translated."
   (or (nth (pcase s
+ ;; "L" stands for "Level"
+ ;; "ALL" stands for a line summarizing clock data across
+ ;; all the files, when the clocktable includes multiple
+ ;; files.
 ("File" 1) ("L" 2) ("Timestamp" 3) ("Headline" 4) ("Time" 5)
 ("ALL" 6) ("Total time" 7) ("File time" 8) ("Clock summary at" 9))
   (assoc-string language org-clock-clocktable-language-setup t))



[nongnu] elpa/cdlatex 42a2041df9: Fix bug when moving out of parenthesis with TAB

2023-09-15 Thread ELPA Syncer
branch: elpa/cdlatex
commit 42a2041df99d1d3da9e08d17ceb2eba111cc85ed
Author: Carsten Dominik 
Commit: Carsten Dominik 

Fix bug when moving out of parenthesis with TAB

Thanks to SnootierMoon for reporting the bug
---
 cdlatex.el | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/cdlatex.el b/cdlatex.el
index 205429ab33..f43dad1e71 100644
--- a/cdlatex.el
+++ b/cdlatex.el
@@ -3,7 +3,7 @@
 ;;
 ;; Author: Carsten Dominik 
 ;; Keywords: tex
-;; Version: 4.18
+;; Version: 4.18b
 ;;
 ;; This file is not part of GNU Emacs.
 ;;
@@ -1006,7 +1006,7 @@ Sounds strange?  Try it out!"
 (= (preceding-char) ?-))
 (throw 'stop t)
   (forward-char 1)
-  (if (looking-at "[^_\\^({\\[]")
+  (if (looking-at "[^_^({\\[]")
   ;; stop after closing bracket, unless ^_[{( follow
   (throw 'stop t
 



[nongnu] elpa/org-contrib 1ee7db2701: * lisp/ol-vm.el (org-vm-store-link): Do not use obsolete function names

2023-09-15 Thread ELPA Syncer
branch: elpa/org-contrib
commit 1ee7db27015e72202928925305dde9e5a2ee6a01
Author: Ihor Radchenko 
Commit: Ihor Radchenko 

* lisp/ol-vm.el (org-vm-store-link): Do not use obsolete function names
---
 lisp/ol-vm.el | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/lisp/ol-vm.el b/lisp/ol-vm.el
index bfd4a0b4ea..b42843d679 100644
--- a/lisp/ol-vm.el
+++ b/lisp/ol-vm.el
@@ -88,11 +88,11 @@
folder))
 (setq folder (replace-match "" t t folder)
 (setq message-id (org-unbracket-string "<" ">" message-id))
-   (org-store-link-props :type link-type :from from :to to :subject subject
+   (org-link-store-props :type link-type :from from :to to :subject subject
  :message-id message-id :date date)
-   (setq desc (org-email-link-description))
+   (setq desc (org-link-email-description))
(setq link (concat (concat link-type ":") folder "#" message-id))
-   (org-add-link-props :link link :description desc)
+   (org-link-add-props :link link :description desc)
link
 
 (defun org-vm-open (path _)



[elpa] externals/embark dbb1158ced 1/3: Some more commands which should mark their targets

2023-09-15 Thread ELPA Syncer
branch: externals/embark
commit dbb1158ced3c047e6b8daffec1f178d668af611a
Author: Omar Antolín 
Commit: Omar Antolín 

Some more commands which should mark their targets
---
 embark.el | 4 
 1 file changed, 4 insertions(+)

diff --git a/embark.el b/embark.el
index 03d92a506a..f2e0260a4f 100644
--- a/embark.el
+++ b/embark.el
@@ -480,6 +480,10 @@ arguments and more details."
 (shell-command-on-region embark--mark-target)
 (embark-eval-replace embark--mark-target)
 (delete-indentation embark--mark-target)
+(comment-dwim embark--mark-target)
+(insert-parentheses embark--mark-target)
+(insert-pair embark--mark-target)
+(org-emphasize embark--mark-target))
 ;; do the actual work of selecting & deselecting targets
 (embark-select embark--select))
   "Alist associating commands with post-action hooks.



[elpa] externals/embark cf1325929e 2/3: Tweak whitespace behavior of insert actions

2023-09-15 Thread ELPA Syncer
branch: externals/embark
commit cf1325929e804c71b6ad5313228173b54bf2348b
Author: Omar Antolín 
Commit: Omar Antolín 

Tweak whitespace behavior of insert actions
---
 embark.el | 16 +++-
 1 file changed, 11 insertions(+), 5 deletions(-)

diff --git a/embark.el b/embark.el
index f2e0260a4f..e0186c4528 100644
--- a/embark.el
+++ b/embark.el
@@ -3582,13 +3582,19 @@ constituent character next to an existing word 
constituent.
 2. For a multiline inserted string, newlines may be added before
 or after as needed to ensure the inserted string is on lines of
 its own."
-  (let ((multiline (seq-some (lambda (s) (string-match-p "\n" s)) strings))
-(separator (embark--separator strings)))
+  (let* ((separator (embark--separator strings))
+ (multiline
+  (or (and (cdr strings) (string-match-p "\n" separator))
+  (and (null (cdr strings))
+   (equal (buffer-substring (line-beginning-position)
+(line-end-position))
+  (car strings)))
+  (seq-some (lambda (s) (string-match-p "\n" s)) strings
 (cl-labels ((maybe-space ()
   (and (looking-at "\\w") (looking-back "\\w" 1)
(insert " ")))
 (maybe-newline ()
-  (or (looking-back "^[ \t]*" 40) (looking-at "\n\n")
+  (or (looking-back "^[ \t]*" 40) (looking-at "\n")
   (newline-and-indent)))
 (maybe-whitespace ()
   (if multiline (maybe-newline) (maybe-space)))
@@ -3712,7 +3718,7 @@ Returns the new name actually used."
 (defun embark-insert-variable-value (var)
   "Insert value of VAR."
   (interactive "SVariable: ")
-  (insert (string-trim (pp-to-string (symbol-value var)
+  (embark-insert (list (string-trim (pp-to-string (symbol-value var))
 
 (defun embark-toggle-variable (var &optional local)
   "Toggle value of boolean variable VAR.
@@ -3729,7 +3735,7 @@ If prefix LOCAL is non-nil make variable local."
   "Insert relative path to FILE.
 The insert path is relative to `default-directory'."
   (interactive "FFile: ")
-  (insert (file-relative-name (substitute-in-file-name file
+  (embark-insert (list (file-relative-name (substitute-in-file-name file)
 
 (defun embark-save-relative-path (file)
   "Save the relative path to FILE in the kill ring.



[elpa] externals/embark updated (8fbb20d189 -> 0d89add290)

2023-09-15 Thread ELPA Syncer
elpasync pushed a change to branch externals/embark.

  from  8fbb20d189 Add embark-org-refile-here action
   new  dbb1158ced Some more commands which should mark their targets
   new  cf1325929e Tweak whitespace behavior of insert actions
   new  0d89add290 Bad typo :(


Summary of changes:
 embark.el | 20 +++-
 1 file changed, 15 insertions(+), 5 deletions(-)



[elpa] externals/embark-consult updated (8fbb20d189 -> 0d89add290)

2023-09-15 Thread ELPA Syncer
elpasync pushed a change to branch externals/embark-consult.

  from  8fbb20d189 Add embark-org-refile-here action
  adds  dbb1158ced Some more commands which should mark their targets
  adds  cf1325929e Tweak whitespace behavior of insert actions
  adds  0d89add290 Bad typo :(

No new revisions were added by this update.

Summary of changes:
 embark.el | 20 +++-
 1 file changed, 15 insertions(+), 5 deletions(-)



[elpa] externals/greader 15e4d55277 2/3: Use of `greader-tts-stop' instead of `greader-stop' to stop reading

2023-09-15 Thread ELPA Syncer
branch: externals/greader
commit 15e4d552774f6851bfd274ff3fd163d3cd1d8f08
Author: Michelangelo Rodriguez 
Commit: Michelangelo Rodriguez 

Use of `greader-tts-stop' instead of `greader-stop' to stop reading
internally.
---
 greader.el | 12 ++--
 1 file changed, 6 insertions(+), 6 deletions(-)

diff --git a/greader.el b/greader.el
index 4f288a3ed1..82267f83e9 100644
--- a/greader.el
+++ b/greader.el
@@ -696,12 +696,12 @@ buffer, so if you want to set it globally, please use `m-x
   (interactive)
   (if (not (greader-call-backend 'punctuation))
   (progn
-   (greader-stop)
+   (greader-tts-stop)
(greader-set-punctuation 'yes)
(message "punctuation enabled in current buffer")
(greader-read))
 (progn
-  (greader-stop)
+  (greader-tts-stop)
   (greader-set-punctuation 'no)
   (message "punctuation disabled in current buffer")
   (greader-read
@@ -967,7 +967,7 @@ If prefix, it will be used to increment by that.  Default 
is N=10."
   (interactive "P")
   (if (not n)
   (setq n 10))
-  (greader-stop)
+  (greader-tts-stop)
   (greader-set-rate (+ (greader-call-backend 'rate 'value) n))
   (greader-read))
 
@@ -977,7 +977,7 @@ If prefix, it will be used to decrement  rate."
   (interactive "P")
   (if (not n)
   (setq n 10))
-  (greader-stop)
+  (greader-tts-stop)
   (greader-set-rate (- (greader-call-backend 'rate 'value) n))
   (greader-read))
 
@@ -1215,7 +1215,7 @@ So you can use this command like a player, if you press 
 you
   (when greader--timer-backward
 (cancel-timer greader--timer-backward)
 (setq greader--timer-backward nil))
-  (greader-stop)
+  (greader-tts-stop)
   (backward-sentence)
   (greader-set-register)
   (setq greader--marker-backward (point))
@@ -1227,7 +1227,7 @@ So you can use this command like a player, if you press 
 you
   (interactive)
   (when (eobp)
 (signal 'end-of-buffer nil))
-  (greader-stop)
+  (greader-tts-stop)
   (greader-forward-sentence)
   (greader-set-register)
   (greader-read))



[elpa] externals/greader a9066ce745 1/3: greader now manages regions.

2023-09-15 Thread ELPA Syncer
branch: externals/greader
commit a9066ce745b83aba491ea3709267f839fcf6c3b9
Author: Michelangelo Rodriguez 
Commit: Michelangelo Rodriguez 

greader now manages regions.
---
 greader.el | 59 +++
 1 file changed, 59 insertions(+)

diff --git a/greader.el b/greader.el
index 3bcc0ec880..4f288a3ed1 100644
--- a/greader.el
+++ b/greader.el
@@ -303,6 +303,61 @@ when the buffer is visiting a file."
   (if greader-auto-bookmark-mode
   (add-hook 'greader-after-stop-hook 'set-bookmark-for-greader)
 (remove-hook 'greader-after-stop-hook 'set-bookmark-for-greader)))
+;; greader-region-mode is a non-interactive minor mode that deals with
+;; read the active region instead of the entire buffer.
+;; The current implementation of greader probably dictates that the
+;; buffer needs to be temporarily narrowed when the region is
+;; active, so that the functions that deal with obtaining the sentences
+;; to read and move the point "believe" that that is all the
+;; buffer to read.
+(defvar greader-start-region nil
+  "start of region.")
+(defvar greader-end-region nil
+  "end of region.")
+
+(defun greader--active-region-p ()
+  "Return t if the region in the current buffer is active.
+Active in this context means that the variables
+  `greader-start-region' and `greader-end-region' are set appropriately."
+  (if (and greader-start-region greader-end-region)
+  t
+nil))
+
+(defun greader-narrow ()
+  "Narrow current buffer if region is active."
+  (unless (buffer-narrowed-p)
+(narrow-to-region greader-start-region greader-end-region)))
+
+;; This function widens the buffer, and is added to the
+;; `greader-after-stop-hook' hook by `greader-region-mode'.
+(defun greader-widen ()
+  "Widen buffer and set greader-region variables to nil."
+  (setq greader-start-region nil)
+  (setq greader-end-region nil)
+  (greader-region-mode -1)  
+  (widen))
+
+;; This function places the point at the beginning of the active region.
+(defun greader-set-point-to-start-of-region ()
+  "set the point to the beginning of the active region.
+This only happens if the variables `greader-start-region' and
+`greader-end-region' are set."
+  (when (and greader-start-region greader-end-region)
+(goto-char greader-start-region)))
+
+(define-minor-mode greader-region-mode
+  "This mode activates when the region is active."
+  :interactive nil
+  (if greader-region-mode
+  (progn
+   (setq greader-start-region (region-beginning))
+   (setq greader-end-region (region-end))
+   (greader-narrow)
+   (add-hook 'greader-after-stop-hook 'greader-widen)
+   (add-hook 'greader-before-finish-hook 'greader-widen)
+   (greader-set-point-to-start-of-region))
+(remove-hook 'greader-before-finish-hook 'greader-widen)
+(remove-hook 'greader-after-stop-hook 'greader-widen)))
 
 (defun greader-set-register ()
   "Set the `?G' register to the point in current buffer."
@@ -534,6 +589,10 @@ if `GOTO-MARKER' is t and if you pass a prefix to this
   (cond
((and (greader-timer-flag-p) (not (timerp greader-stop-timer)))
 (greader-setup-timers)))
+  (when (region-active-p)
+(cond
+ ((and (not greader-region-mode) (not (greader--active-region-p)))
+  (greader-region-mode 1
   (run-hooks greader-before-get-sentence-functions)
   (let ((chunk (funcall greader-read-chunk-of-text)))
 (if chunk
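The commentary in the patch describes the approach: temporarily narrow the buffer to the region so the sentence-fetching code only sees that text, then widen again from the stop/finish hooks. A generic illustration of narrowing (plain Emacs, not greader-specific):

```emacs-lisp
;; Between `narrow-to-region' and `widen', anything that walks "the
;; whole buffer" only sees the chosen region.
(defun my-count-words-in-region (beg end)
  "Count words between BEG and END by narrowing around them."
  (save-restriction
    (narrow-to-region beg end)
    (count-words (point-min) (point-max))))
```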



[elpa] externals/embark 0d89add290 3/3: Bad typo :(

2023-09-15 Thread ELPA Syncer
branch: externals/embark
commit 0d89add290f9176b77a2d7155a9935e30351d90f
Author: Omar Antolín 
Commit: Omar Antolín 

Bad typo :(
---
 embark.el | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/embark.el b/embark.el
index e0186c4528..d4dc4a6ade 100644
--- a/embark.el
+++ b/embark.el
@@ -483,7 +483,7 @@ arguments and more details."
 (comment-dwim embark--mark-target)
 (insert-parentheses embark--mark-target)
 (insert-pair embark--mark-target)
-(org-emphasize embark--mark-target))
+(org-emphasize embark--mark-target)
 ;; do the actual work of selecting & deselecting targets
 (embark-select embark--select))
   "Alist associating commands with post-action hooks.



[elpa] externals/greader 230c54be6c 3/3: version 0.3.0

2023-09-15 Thread ELPA Syncer
branch: externals/greader
commit 230c54be6c82c4c7de6afa8ce04ec57d8c26d468
Author: Michelangelo Rodriguez 
Commit: Michelangelo Rodriguez 

version 0.3.0
---
 greader.el | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/greader.el b/greader.el
index 82267f83e9..8da00bcb4f 100644
--- a/greader.el
+++ b/greader.el
@@ -6,7 +6,7 @@
 ;; Author: Michelangelo Rodriguez 
 ;; Keywords: tools, accessibility
 
-;; Version: 0.2.1
+;; Version: 0.3.0
 
 ;; This program is free software; you can redistribute it and/or modify
 ;; it under the terms of the GNU General Public License as published by



[elpa] externals/greader updated (dd528739b9 -> 230c54be6c)

2023-09-15 Thread ELPA Syncer
elpasync pushed a change to branch externals/greader.

  from  dd528739b9 More work on last commit.
   new  a9066ce745 greader now manages regions.
   new  15e4d55277 Use of `greader-tts-stop' instead of `greader-stop' to stop reading internally.
   new  230c54be6c version 0.3.0


Summary of changes:
 greader.el | 73 --
 1 file changed, 66 insertions(+), 7 deletions(-)



[elpa] scratch/greader 08158a459b 1/4: * greader.el: Fix hook naming convention

2023-09-15 Thread Stefan Monnier via
branch: scratch/greader
commit 08158a459b9a67c97ca9500abb015716f599029a
Author: Stefan Monnier 
Commit: Stefan Monnier 

* greader.el: Fix hook naming convention

Normal hooks are those run with plain `run-hooks` (i.e. take no args
and return no value).  They use the `-hook` suffix.
Abnormal hooks use the `-functions` suffix.

(greader-before-get-sentence-hook): Rename from
`greader-before-get-sentence-functions`.
(greader-before-finish-functions): Rename from
`greader-before-finish-hook`.
---
 greader.el | 20 
 1 file changed, 12 insertions(+), 8 deletions(-)

diff --git a/greader.el b/greader.el
index 8da00bcb4f..761b0ca8c5 100644
--- a/greader.el
+++ b/greader.el
@@ -62,7 +62,9 @@
 (defvar greader-synth-process nil)
 (require 'seq)
 
-(defvar greader-before-get-sentence-functions nil
+(define-obsolete-variable-alias 'greader-before-get-sentence-functions
+  'greader-before-get-sentence-hook "2023")
+(defvar greader-before-get-sentence-hook nil
   "List of functions to run before getting a sentence.
 Functions in this variable don't receive arguments.")
 
@@ -95,7 +97,9 @@ Return SENTENCE, eventually modified by the functions."
 (defvar greader-after-read-hook nil
   "Execute code just after reading a sentence.")
 
-(defvar greader-before-finish-hook nil
+(define-obsolete-variable-alias 'greader-before-finish-hook
+  'greader-before-finish-functions "2023")
+(defvar greader-before-finish-functions nil
   "Code executed just after finishing reading of buffer.
 Functions in this hook should return non -nil if at least one function
   returns non-nil, meaning that reading of buffer continues.
@@ -105,10 +109,10 @@ If all the functions called return nil, reading finishes 
normally.")
   "Return t if at least one of the function return t.
 If all the functions in the hook return nil, this function return
   nil."
-  (if greader-before-finish-hook
+  (if greader-before-finish-functions
   (progn
(let ((flag nil) (result nil))
- (dolist (func greader-before-finish-hook)
+ (dolist (func greader-before-finish-functions)
(setq result (funcall func))
(when result
  (setq flag t)))
@@ -354,9 +358,9 @@ This only happens if the variables `greader-start-region' 
and
(setq greader-end-region (region-end))
(greader-narrow)
(add-hook 'greader-after-stop-hook 'greader-widen)
-   (add-hook 'greader-before-finish-hook 'greader-widen)
+   (add-hook 'greader-before-finish-functions 'greader-widen)
(greader-set-point-to-start-of-region))
-(remove-hook 'greader-before-finish-hook 'greader-widen)
+(remove-hook 'greader-before-finish-functions 'greader-widen)
 (remove-hook 'greader-after-stop-hook 'greader-widen)))
 
 (defun greader-set-register ()
@@ -593,7 +597,7 @@ if `GOTO-MARKER' is t and if you pass a prefix to this
 (cond
  ((and (not greader-region-mode) (not (greader--active-region-p)))
   (greader-region-mode 1
-  (run-hooks greader-before-get-sentence-functions)
+  (run-hooks greader-before-get-sentence-hook)
   (let ((chunk (funcall greader-read-chunk-of-text)))
 (if chunk
(progn
@@ -644,7 +648,7 @@ Argument ARG is not used."
 (defun greader-get-sentence ()
   "Get current sentence.
 Before returning sentence, this function runs
-`greader-before-get-sentence-functions'
+`greader-before-get-sentence-hook'
 If at end of buffer, nil is returned."
   (let ((result (greader-call-backend 'get-text)))
 (if (stringp result)
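The convention the commit describes, as a self-contained sketch (hypothetical hook names):

```emacs-lisp
;; Normal hook: functions take no arguments and return no value;
;; run with plain `run-hooks' and named with a `-hook' suffix.
(defvar my-setup-hook nil
  "Normal hook run after setup.")
(run-hooks 'my-setup-hook)

;; Abnormal hook: functions receive arguments (and/or their return
;; values matter); named with a `-functions' suffix.
(defvar my-filter-functions nil
  "Abnormal hook run with the text as argument.")
(run-hook-with-args 'my-filter-functions "some text")
```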



[elpa] branch scratch/greader created (now 6fe3129a11)

2023-09-15 Thread Stefan Monnier via
monnier pushed a change to branch scratch/greader.

at  6fe3129a11 Miscellaneous simplifications and "tightening"

This branch includes the following new commits:

   new  08158a459b * greader.el: Fix hook naming convention
   new  7ae215c13f * greader.el: Prefer #' to quote function names
   new  dff6b60acb * greader.el: Improve some of the docstrings
   new  6fe3129a11 Miscellaneous simplifications and "tightening"




[elpa] scratch/greader 7ae215c13f 2/4: * greader.el: Prefer #' to quote function names

2023-09-15 Thread Stefan Monnier via
branch: scratch/greader
commit 7ae215c13fe441fe196a31699d710d7bb2a625d9
Author: Stefan Monnier 
Commit: Stefan Monnier 

* greader.el: Prefer #' to quote function names
---
 greader.el | 54 +-
 1 file changed, 29 insertions(+), 25 deletions(-)

diff --git a/greader.el b/greader.el
index 761b0ca8c5..022cba6362 100644
--- a/greader.el
+++ b/greader.el
@@ -140,7 +140,7 @@ If all the functions in the hook return nil, this function 
return
 
 (defcustom
   greader-current-backend
-  'greader-espeak
+  #'greader-espeak
   "Greader back-end to use."
   :tag "greader current back-end"
   :type
@@ -305,8 +305,8 @@ when the buffer is visiting a file."
   :lighter "bk"
   :global t
   (if greader-auto-bookmark-mode
-  (add-hook 'greader-after-stop-hook 'set-bookmark-for-greader)
-(remove-hook 'greader-after-stop-hook 'set-bookmark-for-greader)))
+  (add-hook 'greader-after-stop-hook #'set-bookmark-for-greader)
+(remove-hook 'greader-after-stop-hook #'set-bookmark-for-greader)))
 ;; greader-region-mode is a non-interactive minor mode that deals with
 ;; read the active region instead of the entire buffer.
 ;; The current implementation of greader probably dictates that the
@@ -357,11 +357,11 @@ This only happens if the variables `greader-start-region' 
and
(setq greader-start-region (region-beginning))
(setq greader-end-region (region-end))
(greader-narrow)
-   (add-hook 'greader-after-stop-hook 'greader-widen)
-   (add-hook 'greader-before-finish-functions 'greader-widen)
+   (add-hook 'greader-after-stop-hook #'greader-widen)
+   (add-hook 'greader-before-finish-functions #'greader-widen)
(greader-set-point-to-start-of-region))
-(remove-hook 'greader-before-finish-functions 'greader-widen)
-(remove-hook 'greader-after-stop-hook 'greader-widen)))
+(remove-hook 'greader-before-finish-functions #'greader-widen)
+(remove-hook 'greader-after-stop-hook #'greader-widen)))
 
 (defun greader-set-register ()
   "Set the `?G' register to the point in current buffer."
@@ -430,7 +430,7 @@ backends."
 
 (defun greader-load-backends ()
   "Load backends taken from `greader-backends'."
-  (mapcar 'require greader-backends))
+  (mapcar #'require greader-backends))
 
 (defun greader-read-asynchronous (txt)
   "Read the text given in TXT."
@@ -447,8 +447,8 @@ backends."
 (setq backend (append greader-backend `(,txt) backend))
 (and (stringp txt) (setq-local greader-synth-process (make-process
  :name "greader-backend"
- :sentinel 'greader-action
- :filter 'greader-process-filter
+ :sentinel #'greader-action
+ :filter #'greader-process-filter
  :command backend)))
 (if greader-debug
(progn
@@ -478,14 +478,14 @@ backends."
 
 (defun greader-tts-stop ()
   "Stop reading of current buffer."
-  (set-process-sentinel greader-synth-process 'greader--default-action)
+  (set-process-sentinel greader-synth-process #'greader--default-action)
   (if
   (not
(eq
(greader-call-backend 'stop) 'not-implemented))
   (greader-call-backend 'stop))
   (delete-process greader-synth-process)
-  (setq-local greader-backend-action 'greader--default-action))
+  (setq-local greader-backend-action #'greader--default-action))
 
 (defun greader--default-action (&optional _process event)
   "Internal use.
@@ -611,7 +611,7 @@ if `GOTO-MARKER' is t and if you pass a prefix to this
  (setq-local greader-backend-action #'greader-next-action)
  (greader-read-asynchronous chunk))
   (progn
-   (setq-local greader-backend-action 'greader--default-action)
+   (setq-local greader-backend-action #'greader--default-action)
(greader-set-greader-keymap)
(unless (greader--call-before-finish-functions)
  (greader-read-asynchronous ". end"))
@@ -763,8 +763,9 @@ Optional argument TIMER-IN-MINS timer in minutes (integer)."
   (catch 'timer-is-nil
 (cond
  ((greader-timer-flag-p)
-  (setq-local greader-stop-timer (run-at-time (- (greader-convert-mins-to-secs greader-timer) greader-elapsed-time) nil 'greader-stop-timer-callback))
-  (setq-local greader-elapsed-timer (run-at-time 1 1 'greader-elapsed-time)))
+  (setq-local greader-stop-timer (run-at-time (- (greader-convert-mins-to-secs greader-timer) greader-elapsed-time) nil #'greader-stop-timer-callback))
+  (setq-local greader-elapsed-timer
+  (run-at-time 1 1 #'greader-elapsed-time)))
  ((not (greader-timer-flag-p))
   (throw 'timer-is-nil nil
   t)
@@ -810,7 +811,7 @@ time elapsed before you stopped."
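Background on why `#'' is preferred (an illustration, not part of the commit): sharp-quoting marks the symbol as a function reference, so the byte-compiler can warn about misspelled, undefined or obsolete functions, whereas a plain quote is just a symbol.

```emacs-lisp
;; Both forms behave the same at run time, but only the first is
;; checked as a function reference by the byte-compiler.
;; `my-after-save-action' is a hypothetical function name.
(add-hook 'after-save-hook #'my-after-save-action)
(add-hook 'after-save-hook 'my-after-save-action)
```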
 

[elpa] scratch/greader 6fe3129a11 4/4: Miscellaneous simplifications and "tightening"

2023-09-15 Thread Stefan Monnier via
branch: scratch/greader
commit 6fe3129a11f32078f5622e1321e9e5021eb0f5cc
Author: Stefan Monnier 
Commit: Stefan Monnier 

Miscellaneous simplifications and "tightening"

Many of the simplifications result from hoisting `setq`
out of ifs or avoiding `setq` altogether.
Among the tightening, don't treat hooks as mere "variables holding
a list of functions", since they're a bit more complex than that
(e.g. can contain a function rather than a list, can have both
global and buffer-local functions at the same time, ...).

* greader.el (greader--call-functions-after-get-of-sentence):
Use `run-hook-wrapped` and simplify.
(greader--call-before-finish-functions):
Use `run-hook-with-args-until-success`.
(greader-change-backend): Consolidate the `seq-local`s outside of
the ifs.  Use (cadr (memq ...)) to find the "next" item instead of
going through index numbers.
(greader-read-asynchronous): Simplify the computation of `txt` and
`backend` by avoiding `setq`.
(greader-action): Assume `greader-backend-action` is non-nil.
Use `ignore` rather than `nil` if you want a backend that does
nothing :-)
(greader-set-language): Simplify by avoiding `setq`.
(greader-timer-flag-p, greader-sentence-needs-dehyphenation): η-reduce.
(greader-compile-mode): Don't trust the `member` test
since hooks aren't just "normal var holding a list".

* greader-speechd.el (greader-speechd-set-punctuation): Simplify.
Signal an error when `punct` is nil instead of returning
`greader-speechd-punctuation` without a preceding "-m".

* greader-mac.el: Add missing `Code:` header.
(greader-mac-set-voice): Simplify.
(greader-mac-forward-sentence): Use `move` arg of `re-search-forward`.
(greader-mac-get-sentence): Use `greader-mac-forward-sentence`
and eliminate dummy initialization of `sentence-start` that's
immediately overwritten by something else.
(greader--mac-get-voices): `beginning-of-buffer` is "interactive-only".
---
 greader-mac.el |  45 +++
 greader-speechd.el |  29 
 greader.el | 127 +++--
 3 files changed, 80 insertions(+), 121 deletions(-)

diff --git a/greader-mac.el b/greader-mac.el
index 067a19caa6..2577af3875 100644
--- a/greader-mac.el
+++ b/greader-mac.el
@@ -1,6 +1,10 @@
 ;;; greader.el --- gnamù reader, send buffer contents to a speech engine. -*- 
lexical-binding: t; -*-
+;; FIXME: The above line is not right for this file :-(
 
 ;; Copyright (C) 2017-2023  Free Software Foundation, Inc.
+
+;;; Code:
+
 (defgroup greader-mac
   nil
   "Back-end of mac for greader."
@@ -37,27 +41,14 @@ nil means to use the system voice."
 
 (defun greader-mac-set-voice (voice)
   "Set specified VOICE for `say'.
-When called interactively, this function reads a string from the minibuffer 
providing completion."
+When called interactively, this function reads a string from the minibuffer
+providing completion."
   (interactive
-   (list (read-string "voice: " nil nil (greader--mac-get-voices
-  (let (result)
-(if (called-interactively-p 'any)
-   (progn
- (if (string-equal "system" voice)
- (setq-local greader-mac-voice nil)
-   (setq-local greader-mac-voice voice)))
-  (when voice
-   (if (string-equal voice "system")
-   (progn
- (setq result nil)
- (setq-local greader-mac-voice nil))
- (setq result (concat "-v" voice))
- (setq-local greader-mac-voice voice)))
-  (unless voice
-   (if greader-mac-voice
-   (setq result (concat "-v" greader-mac-voice))
- (setq result nil)))
-  result)))
+   (list (read-string "Voice: " nil nil (greader--mac-get-voices
+  (when voice
+(setq-local greader-mac-voice
+(if (string-equal "system" voice) nil voice)))
+  (when greader-mac-voice (concat "-v" greader-mac-voice)))
 
 ;;;###autoload
 (defun greader-mac (command &optional arg &rest _)
@@ -87,25 +78,21 @@ COMMAND must be a string suitable for `make-process'."
 (put 'greader-mac 'greader-backend-name "greader-mac")
 
 (defun greader-mac-get-sentence ()
-  (let ((sentence-start (make-marker)))
-(setq sentence-start (point))
+  (let ((sentence-start (point)))
 (save-excursion
-  (when (not (eobp))
-   (if (not (re-search-forward greader-mac-end-of-sentence-regexp nil t))
-   (end-of-buffer))
+  (greader-mac-forward-sentence)
   (if (> (point) sentence-start)
  (string-trim (buffer-substring-no-properties sentence-start (point)) 
"[ \t\n\r]+")
-   nil)
+   nil
 
 (defun greader-mac-forward-sentence ()
-  (if (not (re-search-forward greader-mac-end-of-sentence-regexp nil t))
-  (end-of-buffer)))
+  (re-search-forward greader-mac-end-of-sentence-regexp nil 'move))
 
 (defun greader--mac-get-voices ()
   "Return a list which contains all vo

[elpa] scratch/greader dff6b60acb 3/4: * greader.el: Improve some of the docstrings

2023-09-15 Thread Stefan Monnier via
branch: scratch/greader
commit dff6b60acb7456422f5e32e2c9a36fda8f077267
Author: Stefan Monnier 
Commit: Stefan Monnier 

* greader.el: Improve some of the docstrings
---
 greader-speechd.el |  9 -
 greader.el | 36 +---
 2 files changed, 21 insertions(+), 24 deletions(-)

diff --git a/greader-speechd.el b/greader-speechd.el
index 196eed3d99..c928de5a55 100644
--- a/greader-speechd.el
+++ b/greader-speechd.el
@@ -58,8 +58,8 @@ using `greader-speechd-executable' as basename."
 
 (defun greader-speechd-set-language
 (&optional lang)
-  "Set language 'lang' for speech-dispatcher client.
-if lang is omitted, it looks in variable greader-speechd-language and
+  "Set language LANG for speech-dispatcher client.
+if LANG is omitted, it looks in variable `greader-speechd-language' and
 retrieves the appropriate string used by spd-say or another client
 compatible."
   (if (not lang)
@@ -71,8 +71,7 @@ compatible."
 (defun greader-speechd-set-rate
 (&optional rate)
   "Return parameter suitable for spd-say to set speech rate.
-for further documentation, see the documentation for
-greader-speechd-rate variable."
+for further documentation, see the `greader-speechd-rate' variable."
   (if (not rate)
   (concat "-r " (number-to-string greader-speechd-rate))
 (progn
@@ -81,7 +80,7 @@ greader-speechd-rate variable."
 
 (defun greader-speechd-set-punctuation (&optional punct)
   "Return a suitable parameter to pass to spd-say for setting punctuation 
level.
-punct must be a numeric value, 0 for no punctuation, 1 for some and 2
+PUNCT must be a numeric value, 0 for no punctuation, 1 for some and 2
 or >2 for all punctuation."
   (catch 'return
 (cond
diff --git a/greader.el b/greader.el
index 022cba6362..bfe2ea837f 100644
--- a/greader.el
+++ b/greader.el
@@ -65,11 +65,11 @@
 (define-obsolete-variable-alias 'greader-before-get-sentence-functions
   'greader-before-get-sentence-hook "2023")
 (defvar greader-before-get-sentence-hook nil
-  "List of functions to run before getting a sentence.
+  "Hook run before getting a sentence.
 Functions in this variable don't receive arguments.")
 
 (defvar greader-after-get-sentence-functions nil
-  "Hook after getting a sentence.
+  "Hook run after getting a sentence.
 Functions in this hook take a string as argument, and should modify
   that string that contains the sentence that will be read.
 the function should return modified sentence, or nil if no operation
@@ -101,14 +101,13 @@ Return SENTENCE, eventually modified by the functions."
   'greader-before-finish-functions "2023")
 (defvar greader-before-finish-functions nil
   "Code executed just after finishing reading of buffer.
-Functions in this hook should return non -nil if at least one function
+Functions in this hook should return non-nil if at least one function
   returns non-nil, meaning that reading of buffer continues.
 If all the functions called return nil, reading finishes normally.")
 
 (defun greader--call-before-finish-functions ()
   "Return t if at least one of the function return t.
-If all the functions in the hook return nil, this function return
-  nil."
+If all the functions in the hook return nil, this function return nil."
   (if greader-before-finish-functions
   (progn
(let ((flag nil) (result nil))
@@ -119,7 +118,7 @@ If all the functions in the hook return nil, this function 
return
  flag))
 nil))
 (defvar greader-after-stop-hook nil
-  "The functions in this variable are executed just after tts is stopped.")
+  "Hook run just after tts is stopped.")
 
 (defgroup
   greader
@@ -208,9 +207,9 @@ Instead, the sentence will be read completely."
   :tag "enable debug"
   :type 'boolean)
 
-(defcustom   greader-hook nil
-  "Hook ran after mode activation.
-through this hook you can
+(defcustom   greader-hook nil ;; FIXME: Can't see where it's run!
+  "Hook run after mode activation.
+Through this hook you can
 customize your key definitions for greader, for example."
   :tag "greader-mode hook"
   :type 'hook)
@@ -381,8 +380,8 @@ This only happens if the variables `greader-start-region' 
and
 b))
 
 (defun greader-call-backend (command &optional arg)
-  "Call BACKEND passing it COMMAND and ARG.
-\(internal use!\)."
+  "Call backend passing it COMMAND and ARG.
+\(internal use!\)." ;; FIXME: Use "--" in the name, then.
 
   (if arg
   (funcall greader-current-backend command arg)
@@ -395,10 +394,9 @@ This only happens if the variables `greader-start-region' 
and
 (defvar greader-dissoc-buffer "*Dissociation*")
 (defvar greader-temp-function nil)
 (defun greader-change-backend (&optional backend)
-  "Change BACKEND used for actually read the buffer.
-If backend is
-specified, it changes to backend, else it cycles throwgh available
-backends."
+  "Change backend used for actually read the buffer.
+If BACKEND is non-nil, it changes to BACKEND, else it cycles through
+available backends."
   (interactive
(l

[elpa] externals/transient b150b48b31 2/2: transient-quit-one: Cancel prefix-arg instead of exiting transient

2023-09-15 Thread Jonas Bernoulli via
branch: externals/transient
commit b150b48b310d06db87e673f5aef672b341bd001e
Author: Jonas Bernoulli 
Commit: Jonas Bernoulli 

transient-quit-one: Cancel prefix-arg instead of exiting transient
---
 lisp/transient.el | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/lisp/transient.el b/lisp/transient.el
index 1caec3e850..5a956df135 100644
--- a/lisp/transient.el
+++ b/lisp/transient.el
@@ -2476,6 +2476,8 @@ If there is no parent prefix, then just call the command."
  (setq transient--editp nil)
  (transient-setup)
  transient--stay)
+(prefix-arg
+ transient--stay)
 (t transient--exit)))
 
 (defun transient--do-quit-all ()



[elpa] externals/transient d11a1040b2 1/2: transient-update: Preserve universal argument

2023-09-15 Thread Jonas Bernoulli via
branch: externals/transient
commit d11a1040b21e8e21b27f87490cb6bcec4cc2c6da
Author: Jonas Bernoulli 
Commit: Jonas Bernoulli 

transient-update: Preserve universal argument

Since [1: ed2febd0] we have already done so, iff we remapped from
`negative-argument' to this command.

Now we always preserve the universal argument.  This is necessary
to pass such an argument to suffix commands, which are bound to key
sequences longer than one event.  In such cases there is an
additional transient keymap, which binds the prefix key itself to
`transient-update'.  Using that binding causes an update of the
transient buffer before the key is unread, so that it can be looked
up in the transient keymap with the real suffix command bindings.

Like any other command, `transient-update' consumes the universal
argument, and we have to set `prefix-arg' again, as if it were
itself a universal argument command.

1: 2022-04-24 ed2febd0056932689da00414af9db0260ea08ead
   Support use of an infix argument following a prefix argument
---
 lisp/transient.el | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/lisp/transient.el b/lisp/transient.el
index 67bcefd04b..1caec3e850 100644
--- a/lisp/transient.el
+++ b/lisp/transient.el
@@ -2583,8 +2583,7 @@ transient is active."
 (defun transient-update ()
   "Redraw the transient's state in the popup buffer."
   (interactive)
-  (when (equal this-original-command 'negative-argument)
-(setq prefix-arg current-prefix-arg)))
+  (setq prefix-arg current-prefix-arg))
 
 (defun transient-show ()
   "Show the transient's state in the popup buffer."



[elpa] externals/transient updated (dd970cd464 -> b150b48b31)

2023-09-15 Thread Jonas Bernoulli via
tarsius pushed a change to branch externals/transient.

  from  dd970cd464 Compile suffix commands that are defined inside prefix 
definitions
   new  d11a1040b2 transient-update: Preserve universal argument
   new  b150b48b31 transient-quit-one: Cancel prefix-arg instead of exiting 
transient


Summary of changes:
 lisp/transient.el | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)



[elpa] externals/expreg 9950c07ec9: * expreg.el (expreg--treesit): Add support for multi-language.

2023-09-15 Thread ELPA Syncer
branch: externals/expreg
commit 9950c07ec90293964baa33603f4a80e764b0a847
Author: Yuan Fu 
Commit: Yuan Fu 

* expreg.el (expreg--treesit): Add support for multi-language.
---
 expreg.el | 32 ++--
 1 file changed, 18 insertions(+), 14 deletions(-)

diff --git a/expreg.el b/expreg.el
index be49d8b53c..b03d69e11e 100644
--- a/expreg.el
+++ b/expreg.el
@@ -5,7 +5,7 @@
 ;; Author: Yuan Fu 
 ;; Maintainer: Yuan Fu 
 ;; URL: https://github.com/casouri/expreg
-;; Version: 1.2.1
+;; Version: 1.3.1
 ;; Keywords: text, editing
 ;; Package-Requires: ((emacs "29.1"))
 ;;
@@ -362,20 +362,24 @@ Only return something if ‘subword-mode’ is on, to keep 
consistency."
 (defun expreg--treesit ()
   "Return a list of regions according to tree-sitter."
   (when (treesit-parser-list)
-
-(let ((node (treesit-node-at
- (point) (treesit-language-at (point
-  (root (treesit-buffer-root-node
- (treesit-language-at (point
+(let ((parsers (append (treesit-parser-list)
+   (and (fboundp #'treesit-local-parsers-at)
+(treesit-local-parsers-at (point)
   result)
-
-  (while node
-(let ((beg (treesit-node-start node))
-  (end (treesit-node-end node)))
-  (when (not (treesit-node-eq node root))
-(push `(treesit . ,(cons beg end)) result)))
-
-(setq node (treesit-node-parent node)))
+  (dolist (parser parsers)
+(let ((node (treesit-node-at (point) parser))
+  (root (treesit-parser-root-node parser))
+  (lang (treesit-parser-language parser)))
+
+  (while node
+(let ((beg (treesit-node-start node))
+  (end (treesit-node-end node)))
+  (when (not (treesit-node-eq node root))
+(push (cons (intern (format "treesit--%s" lang))
+(cons beg end))
+  result)))
+
+(setq node (treesit-node-parent node)
   result)))
 
 (defun expreg--inside-list ()
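To see which parsers the new code iterates over, the same expression from the patch can be evaluated interactively in a buffer that already has tree-sitter parsers (Emacs 29+; the `fboundp' guard mirrors the patch, since `treesit-local-parsers-at' is not available in every build):

```emacs-lisp
;; Returns the buffer-wide parsers plus any local parsers at point.
(append (treesit-parser-list)
        (and (fboundp 'treesit-local-parsers-at)
             (treesit-local-parsers-at (point))))
```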



[elpa] externals/tempel 023001cece 2/2: README: Update template

2023-09-15 Thread ELPA Syncer
branch: externals/tempel
commit 023001cece278a8ab4a6904c05a1ee68c570efe9
Author: Daniel Mendler 
Commit: Daniel Mendler 

README: Update template
---
 README.org | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.org b/README.org
index 34fd59f0f9..80b4463e43 100644
--- a/README.org
+++ b/README.org
@@ -270,7 +270,7 @@ org-mode
 (comment "#+begin_comment" n> r> n "#+end_comment")
 (verse "#+begin_verse" n> r> n "#+end_verse")
 (src "#+begin_src " q n r n "#+end_src")
-(gnuplot "#+begin_src gnuplot :var data=" (p "table") " :file " (p "plot.png") 
n> r> n "#+end_src" :post (org-edit-src-code))
+(gnuplot "#+begin_src gnuplot :var data=" (p "table") " :file " (p "plot.png") 
n r n "#+end_src" :post (org-edit-src-code))
 (elisp "#+begin_src emacs-lisp" n r n "#+end_src" :post (org-edit-src-code))
 (inlsrc "src_" p "{" q "}")
 (title "#+title: " p n "#+author: Daniel Mendler" n "#+language: en")



[elpa] externals/tempel 809e4ad4ee 1/2: Update org-mode template examples (#113)

2023-09-15 Thread ELPA Syncer
branch: externals/tempel
commit 809e4ad4ee64160be25ba21bceff830510078283
Author: Ian S. Pringle 
Commit: GitHub 

Update org-mode template examples (#113)

Updated the elisp and src examples so that they do not use 
`indent-according-to-tab`. When the header of the source block exists but the 
footer does not yet exist, org-mode greedily looks for the next `#+end_src`, 
even if a `#+begin_src` precedes it. If it finds that `#+end_src` it then treats 
all text between tempel's freshly inserted src header and that existing footer 
as source code and attempts to indent and possibly format it accordingly. The 
results will vary by the source language [...]

This change introduces no degradation to the template examples, but it does 
work with org-mode as is.
---
 README.org | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.org b/README.org
index 320fbb23fd..34fd59f0f9 100644
--- a/README.org
+++ b/README.org
@@ -269,9 +269,9 @@ org-mode
 (latex "#+begin_export latex" n> r> n "#+end_export")
 (comment "#+begin_comment" n> r> n "#+end_comment")
 (verse "#+begin_verse" n> r> n "#+end_verse")
-(src "#+begin_src " q n> r> n "#+end_src")
+(src "#+begin_src " q n r n "#+end_src")
 (gnuplot "#+begin_src gnuplot :var data=" (p "table") " :file " (p "plot.png") 
n> r> n "#+end_src" :post (org-edit-src-code))
-(elisp "#+begin_src emacs-lisp" n> r> n "#+end_src" :post (org-edit-src-code))
+(elisp "#+begin_src emacs-lisp" n r n "#+end_src" :post (org-edit-src-code))
 (inlsrc "src_" p "{" q "}")
 (title "#+title: " p n "#+author: Daniel Mendler" n "#+language: en")
 



[elpa] externals/tempel updated (0101fd2abf -> 023001cece)

2023-09-15 Thread ELPA Syncer
elpasync pushed a change to branch externals/tempel.

  from  0101fd2abf README: Add recommendation regarding template naming 
(See #112)
   new  809e4ad4ee Update org-mode template examples (#113)
   new  023001cece README: Update template


Summary of changes:
 README.org | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)



[nongnu] elpa/apropospriate-theme 6618e26a83: adjust `parenthesis` face (for `paren-face-mode`)

2023-09-15 Thread ELPA Syncer
branch: elpa/apropospriate-theme
commit 6618e26a833fdd2fbddf32075f1953cc4f86cb03
Author: justin talbott 
Commit: justin talbott 

adjust `parenthesis` face (for `paren-face-mode`)
---
 apropospriate-theme.el | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/apropospriate-theme.el b/apropospriate-theme.el
index 9e281b655b..0f7baab10a 100644
--- a/apropospriate-theme.el
+++ b/apropospriate-theme.el
@@ -141,7 +141,7 @@ Set to `1.0' or nil to prevent font size manipulation."
  `(highlight-indent-guides-odd-face ((,class (:background ,base00+1
  `(highlight-indent-guides-even-face ((,class (:background ,base00
  `(highlight-indent-guides-character-face ((,class (:foreground 
,base00+2
- `(parenthesis ((,class (:foreground ,base00+3
+ `(parenthesis ((,class (:foreground ,base01
  `(font-lock-comment-face ((,class (:foreground ,base01
  `(font-lock-comment-delimiter-face ((,class (:foreground ,base01
  `(font-lock-builtin-face ((,class (:foreground ,cyan



[elpa] externals/greader 92a14825eb: Version 0.4.0

2023-09-15 Thread ELPA Syncer
branch: externals/greader
commit 92a14825eb2b2c8764af7817bf2700eba10971a8
Author: Michelangelo Rodriguez 
Commit: Michelangelo Rodriguez 

Version 0.4.0
---
 greader.el | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/greader.el b/greader.el
index 017505e782..321342b9ba 100644
--- a/greader.el
+++ b/greader.el
@@ -6,7 +6,7 @@
 ;; Author: Michelangelo Rodriguez 
 ;; Keywords: tools, accessibility
 
-;; Version: 0.3.0
+;; Version: 0.4.0
 
 ;; This program is free software; you can redistribute it and/or modify
 ;; it under the terms of the GNU General Public License as published by



[elpa] externals/greader updated (230c54be6c -> 92a14825eb)

2023-09-15 Thread ELPA Syncer
elpasync pushed a change to branch externals/greader.

  from  230c54be6c version 0.3.0
  adds  08158a459b * greader.el: Fix hook naming convention
  adds  7ae215c13f * greader.el: Prefer #' to quote function names
  adds  dff6b60acb * greader.el: Improve some of the docstrings
  adds  6fe3129a11 Miscellaneous simplifications and "tightening"
   new  92a14825eb Version 0.4.0


Summary of changes:
 greader-mac.el |  45 ---
 greader-speechd.el |  38 +++--
 greader.el | 225 +
 3 files changed, 136 insertions(+), 172 deletions(-)



[elpa] externals/llm 48ae59d149 14/34: Fix llm-chat-prompt-to-text, which was unusable

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit 48ae59d14977aae60c6f2405fc9d8bbcf2182a3f
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Fix llm-chat-prompt-to-text, which was unusable
---
 llm.el | 12 
 1 file changed, 8 insertions(+), 4 deletions(-)

diff --git a/llm.el b/llm.el
index 29e907a093..f83233eaaf 100644
--- a/llm.el
+++ b/llm.el
@@ -142,10 +142,10 @@ ways."
 (defun llm-chat-prompt-to-text (prompt)
   "Convert PROMPT `llm-chat-prompt' to a simple text.
 This should only be used for logging or debugging."
-  (format "Context: %s\nExamples: %s\nInteractions: %s\nTemperature: %f\nMax 
tokens: %d\n"
+  (format "Context: %s\nExamples: %s\nInteractions: %s\n%s%s\n"
   (llm-chat-prompt-context prompt)
   (mapconcat (lambda (e) (format "User: %s\nResponse: %s" (car e) (cdr 
e)))
- (llm-chat-prompt-interactions prompt) "\n")
+ (llm-chat-prompt-examples prompt) "\n")
   (mapconcat (lambda (i)
(format "%s: %s"
(pcase (llm-chat-prompt-interaction-role i)
@@ -154,8 +154,12 @@ This should only be used for logging or debugging."
  ('assistant "Assistant"))
(llm-chat-prompt-interaction-content i)))
  (llm-chat-prompt-interactions prompt) "\n")
-  (llm-chat-prompt-temperature prompt)
-  (llm-chat-prompt-max-tokens prompt)))
+  (if (llm-chat-prompt-temperature prompt)
+  (format "Temperature: %s\n" (llm-chat-prompt-temperature prompt))
+"")
+  (if (llm-chat-prompt-max-tokens prompt)
+  (format "Max tokens: %s\n" (llm-chat-prompt-max-tokens prompt))
+"")))
 
 (provide 'llm)
 



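For context, a minimal sketch (not taken from the patch) of how the repaired function is driven, using the prompt constructors that appear elsewhere in this series; the prompt contents are placeholders:

(require 'llm)

;; Build a small prompt and render it as plain text for logging.
(let ((prompt (make-llm-chat-prompt
               :context "You are a helpful assistant."
               :examples '(("Hi." . "Hello there."))
               :interactions (list (make-llm-chat-prompt-interaction
                                    :role 'user
                                    :content "Tell me about Emacs."))
               :temperature 0.5)))
  (message "%s" (llm-chat-prompt-to-text prompt)))
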
[elpa] externals/llm 8f30feb5c1 32/34: README improvements, including noting the nonfree llm warning

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit 8f30feb5c1a209f7280fd468a2fe4030434a0e81
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

README improvements, including noting the nonfree llm warning

Also, remove the somewhat duplicated section about different providers.

Require the right provider in the example setup.
---
 README.org | 28 
 1 file changed, 20 insertions(+), 8 deletions(-)

diff --git a/README.org b/README.org
index b9047e8103..a4f1b1a6da 100644
--- a/README.org
+++ b/README.org
@@ -13,7 +13,9 @@ Users who use an application that uses this package should 
not need to install i
 
 #+begin_src emacs-lisp
 (use-package llm-refactoring
-  :init (setq llm-refactoring-provider (make-llm-openai :key my-openai-key))
+  :init
+  (require 'llm-openai)
+  (setq llm-refactoring-provider (make-llm-openai :key my-openai-key))
 #+end_src
 
 Here ~my-openai-key~ would be a variable you set up before with your Open AI 
key.  Or, just substitute the key itself as a string.  It's important that you 
remember never to check your key into a public repository such as github, 
because your key must be kept private.  Anyone with your key can use the API, 
and you will be charged.
@@ -31,8 +33,24 @@ You can set up with ~make-llm-vertex~, with the following 
parameters:
 In addition to the provider, which you may want multiple of (for example, to 
charge against different projects), there are customizable variables:
 - ~llm-vertex-gcloud-binary~: The binary to use for generating the API key.
 - ~llm-vertex-gcloud-region~: The gcloud region to use.  It's good to set this 
to a region near where you are for best latency.  Defaults to "us-central1".
+** Fake
+This is a client that makes no call, but it just there for testing and 
debugging.  Mostly this is of use to programmatic clients of the llm package, 
but end users can also use it to understand what will be sent to the LLMs.  It 
has the following parameters:
+- ~:output-to-buffer~: if non-nil, the buffer or buffer name to append the 
request sent to the LLM to.
+- ~:chat-action-func~: a function that will be called to provide a string or 
symbol and message cons which are used to raise an error.
+- ~:embedding-action-func~: a function that will be called to provide a vector 
or symbol and message cons which are used to raise an error.
+* =llm= and the use of non-free LLMs
+The =llm= package is part of GNU Emacs by being part of GNU ELPA.  
Unfortunately, the most popular LLMs in use are non-free, which is not what GNU 
software should be promoting by inclusion.  On the other hand, by use of the 
=llm= package, the user can make sure that any client that codes against it 
will work with free models that come along.  It's likely that sophisticated 
free LLMs will, emerge, although it's unclear right now what free software 
means with respsect to LLMs.  Because of  [...]
+
+To build upon the example from before:
+#+begin_src emacs-lisp
+(use-package llm-refactoring
+  :init
+  (require 'llm-openai)
+  (setq llm-refactoring-provider (make-llm-openai :key my-openai-key)
+llm-warn-on-nonfree nil)
+#+end_src
 * Programmatic use
-Client applications should require the module, =llm=, and code against it.  
Most functions are generic, and take a struct representing a provider as the 
first argument. The client code, or the user themselves can then require the 
specific module, such as =llm-openai=, and create a provider with a function 
such as ~(make-llm-openai :key user-api-key)~.  The client application will use 
this provider to call all the generic functions.
+Client applications should require the =llm= package, and code against it.  
Most functions are generic, and take a struct representing a provider as the 
first argument. The client code, or the user themselves can then require the 
specific module, such as =llm-openai=, and create a provider with a function 
such as ~(make-llm-openai :key user-api-key)~.  The client application will use 
this provider to call all the generic functions.
 
 A list of all the functions:
 
@@ -40,11 +58,5 @@ A list of all the functions:
 - ~llm-chat-async provider prompt response-callback error-callback~: Same as 
~llm-chat~, but executes in the background.  Takes a ~response-callback~ which 
will be called with the text response.  The ~error-callback~ will be called in 
case of error, with the error symbol and an error message.
 - ~llm-embedding provider string~: With the user-chosen ~provider~, send a 
string and get an embedding, which is a large vector of floating point values.  
The embedding represents the semantic meaning of the string, and the vector can 
be compared against other vectors, where smaller distances between the vectors 
represent greater semantic similarity.
 - ~llm-embedding-async provider string vector-callback error-callback~: Same 
as ~llm-embedding~ but this is processed asynchronously. ~vector-callback~ is 
called with the vector embedding, and, in case of error, ~err

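A minimal sketch of the `llm-chat-async' call listed above, assuming a configured provider bound to the placeholder variable `my-provider':

;; `my-provider' could be, e.g., (make-llm-fake) for local experimentation.
(llm-chat-async
 my-provider
 (make-llm-chat-prompt
  :interactions (list (make-llm-chat-prompt-interaction
                       :role 'user :content "Say hello.")))
 (lambda (response) (message "LLM said: %s" response))
 (lambda (err msg) (message "LLM error %s: %s" err msg)))
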
[elpa] externals/llm cff5db8ad5 16/34: Add unit tests and fix all brokenness detected by them

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit cff5db8ad5185ac623759f737fc2554948b62c6a
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Add unit tests and fix all brokenness detected by them
---
 llm-fake.el | 32 +---
 llm-test.el | 57 +
 llm.el  |  2 +-
 3 files changed, 75 insertions(+), 16 deletions(-)

diff --git a/llm-fake.el b/llm-fake.el
index f74f868c1f..8a72ccebd1 100644
--- a/llm-fake.el
+++ b/llm-fake.el
@@ -51,25 +51,27 @@ message cons. If nil, the response will be a simple vector."
 (with-current-buffer (get-buffer-create (llm-fake-output-to-buffer 
provider))
   (goto-char (point-max))
   (insert "\nCall to llm-chat-response\n"  (llm-chat-prompt-to-text 
prompt) "\n")))
-  (or (when-let (f (llm-fake-chat-action-func provider))
-(let ((result (funcall f)))
-  (pcase (type-of result)
-('string (funcall response-callback result))
-('cons (funcall error-callback (car result) (cdr result)))
-(_ (error "Incorrect type found in `chat-action-func': %s" 
(type-of-result))
-  (funcall response-callback "Sample response from 
`llm-chat-response-async'")))
+  (if (llm-fake-chat-action-func provider)
+  (let* ((f (llm-fake-chat-action-func provider))
+ (result (funcall f)))
+(pcase (type-of result)
+('string (funcall response-callback result))
+('cons (funcall error-callback (car result) (cdr result)))
+(_ (error "Incorrect type found in `chat-action-func': %s" 
(type-of-result)
+(funcall response-callback "Sample response from 
`llm-chat-response-async'")))
 
-(cl-defmethod llm-embedding-async ((provider llm-openai) string 
vector-callback error-callback)
+(cl-defmethod llm-embedding-async ((provider llm-fake) string vector-callback 
error-callback)
   (when (llm-fake-output-to-buffer provider)
 (with-current-buffer (get-buffer-create (llm-fake-output-to-buffer 
provider))
   (goto-char (point-max))
   (insert "\nCall to llm-embedding with text: " string "\n")))
-  (or (when-let (f (llm-fake-chat-action-func provider))
-(let ((result (funcall f)))
-  (pcase (type-of result)
-('vector (funcall vector-callback result))
-('cons (funcall error-callback (car result) (cdr result)))
-(_ (error "Incorrect type found in `chat-embedding-func': %s" 
(type-of-result))
-  (funcall response-callback [0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9])))
+  (if (llm-fake-embedding-action-func provider)
+  (let* ((f (llm-fake-embedding-action-func provider))
+ (result (funcall f)))
+(pcase (type-of result)
+('vector (funcall vector-callback result))
+('cons (funcall error-callback (car result) (cdr result)))
+(_ (error "Incorrect type found in `chat-embedding-func': %s" 
(type-of-result)
+(funcall vector-callback [0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9])))
 
 (provide 'llm-fake)
diff --git a/llm-test.el b/llm-test.el
new file mode 100644
index 00..5470db0c8a
--- /dev/null
+++ b/llm-test.el
@@ -0,0 +1,57 @@
+;;; llm-test.el --- Unit tests for the llm module -*- lexical-binding: t -*-
+
+;; Copyright (c) 2023  Andrew Hyatt 
+
+;; Author: Andrew Hyatt 
+;; SPDX-License-Identifier: GPL-3.0-or-later
+;;
+;; This program is free software; you can redistribute it and/or
+;; modify it under the terms of the GNU General Public License as
+;; published by the Free Software Foundation; either version 3 of the
+;; License, or (at your option) any later version.
+;;
+;; This program is distributed in the hope that it will be useful, but
+;; WITHOUT ANY WARRANTY; without even the implied warranty of
+;; MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+;; General Public License for more details.
+;;
+;; You should have received a copy of the GNU General Public License
+;; along with GNU Emacs.  If not, see .
+
+;;; Commentary:
+;; This tests just the code in the `llm' module, although it uses `llm-fake' as
+;; well to do so. All individual providers are probably best tested in the
+;; `llm-tester' module.
+
+;;; Code:
+
+(require 'llm)
+(require 'llm-fake)
+(require 'ert)
+
+(ert-deftest llm-test-embedding ()
+  (should-error (llm-embedding nil "Test"))
+  (should-error (llm-embedding-async nil "Test"))
+  ;; TODO: Test signals that are not errors, which ert doesn't seem to catch.
+  (should-error (llm-embedding (make-llm-fake
+:embedding-action-func
+(lambda () (cons 'error "my message")))
+   "Test"))
+  (should (equal
+   [0.1 0.2 0.3]
+   (llm-embedding (make-llm-fake :embedding-action-func (lambda () 
[0.1 0.2 0.3]))
+  "Test"
+
+(ert-deftest llm-test-chat ()
+  (should-error (llm-chat-

[elpa] externals/llm 40151757de 26/34: Switch to a method of nonfree warnings easier for provider modules

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit 40151757ded8fc8a8c1312da8f80d56968e21c22
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Switch to a method of nonfree warnings easier for provider modules

Also, change how warnings are logged, since list types do not currently 
work.
---
 llm-openai.el |  7 +++
 llm-vertex.el |  7 +++
 llm.el| 33 -
 3 files changed, 38 insertions(+), 9 deletions(-)

diff --git a/llm-openai.el b/llm-openai.el
index 76d6ab45cd..70d0836e89 100644
--- a/llm-openai.el
+++ b/llm-openai.el
@@ -50,15 +50,15 @@ EMBEDDING-MODEL is the model to use for embeddings.  If 
unset, it
 will use a reasonable default."
   key chat-model embedding-model)
 
-(defun llm-openai--maybe-warn ()
-  (llm--warn-on-nonfree "Open AI" "https://openai.com/policies/terms-of-use";))
+(cl-defmethod llm-nonfree-message-info ((provider llm-openai))
+  (ignore provider)
+  (cons "Open AI" "https://openai.com/policies/terms-of-use";))
 
 (defun llm-openai--embedding-make-request (provider string vector-callback 
error-callback sync)
   "Make a request to Open AI to get an embedding for STRING.
 PROVIDER, VECTOR-CALLBACK and ERROR-CALLBACK are as in the
 `llm-embedding-async' call. SYNC is non-nil when the request
 should wait until the response is received."
-  (llm-openai--maybe-warn)
   (unless (llm-openai-key provider)
 (error "To call Open AI API, add a key to the `llm-openai' provider."))
   (request "https://api.openai.com/v1/embeddings";
@@ -102,7 +102,6 @@ ERROR-CALLBACK is called if there is an error, with the 
error
 signal and message.
 
 SYNC is non-nil when the request should wait until the response is received."
-  (llm-openai--maybe-warn)
   (unless (llm-openai-key provider)
 (error "To call Open AI API, the key must have been set"))
   (let (request-alist system-prompt)
diff --git a/llm-vertex.el b/llm-vertex.el
index 3fdd50c245..11543a4ca5 100644
--- a/llm-vertex.el
+++ b/llm-vertex.el
@@ -69,8 +69,9 @@ KEY-GENTIME keeps track of when the key was generated, 
because the key must be r
   (setf (llm-vertex-key provider) result))
 (setf (llm-vertex-key-gentime provider) (current-time
 
-(defun llm-vertex-maybe-warn ()
-  (llm--warn-on-nonfree "Google Cloud Vertex" 
"https://policies.google.com/terms/generative-ai";))
+(cl-defmethod llm-nonfree-message-info ((provider llm-vertex))
+  (ignore provider)
+  (cons "Google Cloud Vertex" 
"https://policies.google.com/terms/generative-ai";))
 
 (defun llm-vertex--embedding (provider string vector-callback error-callback 
sync)
   "Get the embedding for STRING.
@@ -78,7 +79,6 @@ PROVIDER, VECTOR-CALLBACK, ERROR-CALLBACK are all the same as
 `llm-embedding-async'. SYNC, when non-nil, will wait until the
 response is available to return."
   (llm-vertex-refresh-key provider)
-  (llm-vertex-maybe-warn)
   (request (format 
"https://%s-aiplatform.googleapis.com/v1/projects/%s/locations/%s/publishers/google/models/%s:predict";
llm-vertex-gcloud-region
(llm-vertex-project provider)
@@ -116,7 +116,6 @@ PROVIDER, RESPONSE-CALLBACK, ERROR-CALLBACK are all the 
same as
 `llm-chat-async'. SYNC, when non-nil, will wait until
 the response is available to return."
   (llm-vertex-refresh-key provider)
-  (llm-vertex-maybe-warn)
   (let ((request-alist))
 (when (llm-chat-prompt-context prompt)
   (push `("context" . ,(llm-chat-prompt-context prompt)) request-alist))
diff --git a/llm.el b/llm.el
index 5d6202b18b..4e78678383 100644
--- a/llm.el
+++ b/llm.el
@@ -57,7 +57,7 @@ TOS is the URL of the terms of service for the LLM.
 All non-free LLMs should call this function on each llm function
 invocation."
   (when llm-warn-on-nonfree
-(lwarn '(llm nonfree) :warning "%s API is not free software, and your 
freedom to use it is restricted.
+(lwarn 'llm :warning "%s API is not free software, and your freedom to use 
it is restricted.
 See %s for the details on the restrictions on use." name tos)))
 
 (cl-defstruct llm-chat-prompt
@@ -96,6 +96,17 @@ an LLM, and don't need the more advanced features that the
 `llm-chat-prompt' struct makes available."
   (make-llm-chat-prompt :interactions (list (make-llm-chat-prompt-interaction 
:role 'user :content text
 
+(cl-defgeneric llm-nonfree-message-info (provider)
+  "If PROVIDER is non-free, return info for a warning.
+This should be a cons of the name of the LLM, and the URL of the
+terms of service.
+
+If the LLM is free and has no restrictions on use, this should
+return nil. Since this function already returns nil, there is no
+need to override it."
+  (ignore provider)
+  nil)
+
 (cl-defgeneric llm-chat (provider prompt)
   "Return a response to PROMPT from PROVIDER.
 PROMPT is a `llm-chat-prompt'. The response is a string."
@@ -105,6 +116,11 @@ PROMPT is a `llm-chat-prompt'. The response is a string."
 (cl-defmethod llm-chat ((_ (eql nil)) _)
   (error "LLM provider was nil.  Please set the

[elpa] externals/llm 16ee85fd11 05/34: Add async options, and made the sync options just use those and wait

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit 16ee85fd11451ab2f8b2db01a6d5f22d12913020
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Add async options, and made the sync options just use those and wait
---
 llm-openai.el | 36 +++--
 llm-tester.el | 65 +++
 llm.el| 37 --
 3 files changed, 90 insertions(+), 48 deletions(-)

diff --git a/llm-openai.el b/llm-openai.el
index 4e91f9c52d..3bc8a06f17 100644
--- a/llm-openai.el
+++ b/llm-openai.el
@@ -50,28 +50,34 @@ EMBEDDING-MODEL is the model to use for embeddings.  If 
unset, it
 will use a reasonable default."
   key chat-model embedding-model)
 
-(cl-defmethod llm-embedding ((provider llm-openai) string)
+(cl-defmethod llm-embedding-async ((provider llm-openai) string 
vector-callback error-callback)
   (unless (llm-openai-key provider)
 (error "To call Open AI API, provide the ekg-embedding-api-key"))
-  (let ((resp (request "https://api.openai.com/v1/embeddings";
+  (request "https://api.openai.com/v1/embeddings";
 :type "POST"
 :headers `(("Authorization" . ,(format "Bearer %s" 
ekg-embedding-api-key))
("Content-Type" . "application/json"))
 :data (json-encode `(("input" . ,string) ("model" . ,(or 
(llm-openai-embedding-model provider) "text-embedding-ada-002"
 :parser 'json-read
+:success (cl-function (lambda (&key data &allow-other-keys)
+(funcall vector-callback
+ (cdr (assoc 'embedding (aref 
(cdr (assoc 'data data)) 0))
 :error (cl-function (lambda (&key error-thrown data 
&allow-other-keys)
-  (error (format "Problem calling Open AI: 
%s, type: %s message: %s"
+  (funcall error-callback 'error
+   (format "Problem calling Open 
AI: %s, type: %s message: %s"
  (cdr error-thrown)
  (assoc-default 'type 
(cdar data))
- (assoc-default 'message 
(cdar data))
-:timeout 2
-:sync t)))
-(cdr (assoc 'embedding (aref (cdr (assoc 'data (request-response-data 
resp))) 0)
+ (assoc-default 'message 
(cdar data
 
-(defun llm-openai--chat-response (prompt &optional return-json-spec)
+(defun llm-openai--chat-response (prompt response-callback error-callback 
&optional return-json-spec)
   "Main method to send a PROMPT as a chat prompt to Open AI.
 RETURN-JSON-SPEC, if specified, is a JSON spec to return from the
-Open AI API."
+Open AI API.
+
+RESPONSE-CALLBACK is a function to call with the LLM response.
+
+ERROR-CALLBACK is called if there is an error, with the error
+signal and message."
   (unless (llm-openai-key provider)
 (error "To call Open AI API, the key must have been set"))
   (let (request-alist system-prompt)
@@ -116,14 +122,14 @@ Open AI API."
   :data (json-encode request-alist)
   :parser 'json-read
   :error (cl-function (lambda (&key error-thrown data 
&allow-other-keys)
-(error (format "Problem calling Open 
AI: %s, type: %s message: %s"
-   (cdr error-thrown)
-   (assoc-default 'type 
(cdar data))
-   (assoc-default 'message 
(cdar data))
-  :sync t)))
+(funcall error-callback
+ (format "Problem calling Open 
AI: %s, type: %s message: %s"
+ (cdr error-thrown)
+ (assoc-default 'type 
(cdar data))
+ (assoc-default 
'message (cdar data)
   (let ((result (cdr (assoc 'content (cdr (assoc 'message (aref (cdr 
(assoc 'choices (request-response-data resp))) 0))
 (func-result (cdr (assoc 'arguments (cdr (assoc 'function_call 
(cdr (assoc 'message (aref (cdr (assoc 'choices (request-response-data resp))) 
0)
-(or func-result result)
+(funcall result-callback (or func-result result))
 
 (cl-defmethod llm-chat-response ((provider llm-openai) prompt)
   (llm-openai--chat-response prompt nil))
diff --git a/llm-tester.el b/llm-tester.el
index 53938ae721..089e5cd5de 100644
--- a/llm-tester.el
+++ b/llm-tester.el
@@ -1,4 +1,4 @@
-;;; llm-tester.el --- Helpers for testing LLM implementati

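A minimal sketch of the asynchronous embedding entry point added above, again with a placeholder provider `my-provider':

(llm-embedding-async
 my-provider
 "The quick brown fox"
 (lambda (vec) (message "Got an embedding of length %d" (length vec)))
 (lambda (err msg) (message "Embedding failed (%s): %s" err msg)))
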
[elpa] externals/llm b2f1605514 33/34: Delete some trailing whitespace

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit b2f160551488c9d16ecf1b64f7b70576a3ed6775
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Delete some trailing whitespace
---
 llm-openai.el | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/llm-openai.el b/llm-openai.el
index edb10f7862..ba79e748af 100644
--- a/llm-openai.el
+++ b/llm-openai.el
@@ -139,7 +139,7 @@ SYNC is non-nil when the request should wait until the 
response is received."
   ("parameters" . ,return-json-spec
 request-alist)
   (push '("function_call" . (("name" . "output"))) request-alist))
-
+
 (request "https://api.openai.com/v1/chat/completions";
   :type "POST"
   :sync sync
@@ -150,7 +150,7 @@ SYNC is non-nil when the request should wait until the 
response is received."
   :success (cl-function
 (lambda (&key data &allow-other-keys)
   (let ((result (cdr (assoc 'content (cdr (assoc 'message 
(aref (cdr (assoc 'choices data)) 0))
-(func-result (cdr (assoc 'arguments (cdr (assoc 
'function_call (cdr (assoc 'message (aref (cdr (assoc 'choices data)) 
0)
+(func-result (cdr (assoc 'arguments (cdr (assoc 
'function_call (cdr (assoc 'message (aref (cdr (assoc 'choices data)) 0)
 (funcall response-callback (or func-result result)
   :error (cl-function (lambda (&key error-thrown data &allow-other-keys)
 (funcall error-callback



[elpa] externals/llm ad230d9d6b 10/34: Add methods for nil provider, to throw more meaningful errors

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit ad230d9d6bf895d46b82b9b24dfdbb9e511c0e96
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Add methods for nil provider, to throw more meaningful errors
---
 llm.el | 12 
 1 file changed, 12 insertions(+)

diff --git a/llm.el b/llm.el
index e527dd9273..f513489f09 100644
--- a/llm.el
+++ b/llm.el
@@ -93,6 +93,9 @@ The return value will be the value passed into the success 
callback."
 PROMPT is a `llm-chat-prompt'. The response is a string."
   (llm--run-async-as-sync #'llm-chat-response-async provider prompt))
 
+(cl-defmethod llm-chat-response ((_ (eql nil)) _)
+  (error "LLM provider was nil.  Please set the provider in the application 
you are using."))
+
 (cl-defgeneric llm-chat-response-async (provider prompt response-callback 
error-callback)
   "Return a response to PROMPT from PROVIDER.
 PROMPT is a `llm-chat-prompt'.
@@ -101,10 +104,16 @@ ERROR-CALLBACK receives the error response."
   (ignore provider prompt response-callback error-callback)
   (signal 'not-implemented nil))
 
+(cl-defmethod llm-chat-response-async ((_ (eql nil)) _ _ _)
+  (error "LLM provider was nil.  Please set the provider in the application 
you are using."))
+
 (cl-defgeneric llm-embedding (provider string)
   "Return a vector embedding of STRING from PROVIDER."
   (llm--run-async-as-sync #'llm-embedding-async provider string))
 
+(cl-defmethod llm-chat-embedding ((_ (eql nil)) _)
+  (error "LLM provider was nil.  Please set the provider in the application 
you are using."))
+
 (cl-defgeneric llm-embedding-async (provider string vector-callback 
error-callback)
   "Calculate a vector embedding of STRING from PROVIDER.
 VECTOR-CALLBACK will be called with the vector embedding.
@@ -113,6 +122,9 @@ error signal and a string message."
   (ignore provider string vector-callback error-callback)
   (signal 'not-implemented nil))
 
+(cl-defmethod llm-embedding-async ((_ (eql nil)) _ _ _)
+  (error "LLM provider was nil.  Please set the provider in the application 
you are using."))
+
 (cl-defgeneric llm-count-tokens (provider string)
   "Return the number of tokens in STRING from PROVIDER.
 This may be an estimate if the LLM does not provide an exact



[elpa] externals/llm e94bc937c7 27/34: Fix issue with llm-chat before method having too many arguments

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit e94bc937c711f871adf8446dee0e75c97b4bfbf7
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Fix issue with llm-chat before method having too many arguments
---
 llm.el | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/llm.el b/llm.el
index 4e78678383..034bedb797 100644
--- a/llm.el
+++ b/llm.el
@@ -116,7 +116,7 @@ PROMPT is a `llm-chat-prompt'. The response is a string."
 (cl-defmethod llm-chat ((_ (eql nil)) _)
   (error "LLM provider was nil.  Please set the provider in the application 
you are using."))
 
-(cl-defmethod llm-chat :before (provider _ _ _)
+(cl-defmethod llm-chat :before (provider _)
   "Issue a warning if the LLM is non-free."
   (when-let (info (llm-nonfree-message-info provider))
 (llm--warn-on-nonfree (car info) (cdr info



[elpa] externals/llm 7edd36b2dc 28/34: Fix obsolete or incorrect function calls in llm-fake

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit 7edd36b2dc1e8986adc191b0a30b31afc9dfa6bb
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Fix obsolete or incorrect function calls in llm-fake
---
 llm-fake.el | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/llm-fake.el b/llm-fake.el
index 95aea76400..76ac01d6bd 100644
--- a/llm-fake.el
+++ b/llm-fake.el
@@ -46,9 +46,9 @@ either a vector response for the chat, or a signal symbol and
 message cons. If nil, the response will be a simple vector."
  output-to-buffer chat-action-func embedding-action-func)
 
-(cl-defmethod llm-chat-response-async ((provider llm-fake) prompt 
response-callback error-callback)
+(cl-defmethod llm-chat-async ((provider llm-fake) prompt response-callback 
error-callback)
   (condition-case err
-  (funcall response-callback (llm-chat-response provider prompt))
+  (funcall response-callback (llm-chat provider prompt))
 (t (funcall error-callback (car err) (cdr err
   nil)
 
@@ -77,7 +77,7 @@ message cons. If nil, the response will be a simple vector."
 (pcase (type-of result)
 ('vector result)
 ('cons (signal (car result) (cdr result)))
-(_ (error "Incorrect type found in `chat-embedding-func': %s" 
(type-of-result)
+(_ (error "Incorrect type found in `chat-embedding-func': %s" 
(type-of result)
 [0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9]))
 
 (cl-defmethod llm-embedding-async ((provider llm-fake) string vector-callback 
error-callback)



[elpa] externals/llm ba65755326 30/34: Improve the README with information on providers for end-users

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit ba6575532680a27ced25a48f25e2425106a5eabd
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Improve the README with information on providers for end-users
---
 README.org | 34 --
 1 file changed, 28 insertions(+), 6 deletions(-)

diff --git a/README.org b/README.org
index 7856b6ef49..d5ef7ead39 100644
--- a/README.org
+++ b/README.org
@@ -1,5 +1,6 @@
 #+TITLE: llm package for emacs
 
+* Introduction
 This is a library for interfacing with Large Language Models.  It allows elisp 
code to use LLMs, but allows gives the end-user an option to choose which LLM 
they would prefer.  This is especially useful for LLMs, since there are various 
high-quality ones that in which API access costs money, as well as  locally 
installed ones that are free, but of medium quality.  Applications using LLMs 
can use this library to make sure their application works regardless of whether 
the user has a local  [...]
 
 The functionality supported by LLMs is not completely consistent, nor are 
their APIs.  In this library we attempt to abstract functionality to a higher 
level, because sometimes those higher level concepts are supported by an API, 
and othertimes they must be put in more low-level concepts.  One such 
higher-level concept is "examples" where the client can show example 
interactions to demonstrate a pattern for the LLM.  The GCloud Vertex API has 
an explicit API for examples, but for Open AI [...]
@@ -7,8 +8,31 @@ The functionality supported by LLMs is not completely 
consistent, nor are their
 Some functionality may not be supported by LLMs.  Any unsupported 
functionality with throw a ='not-implemented= signal.
 
 This package is simple at the moment, but will grow as both LLMs and 
functionality is added.
-
-Clients should require the module, =llm=, and code against it.  Most functions 
are generic, and take a struct representing a provider as the first argument. 
The client code, or the user themselves can then require the specific module, 
such as =llm-openai=, and create a provider with a function such as 
~(make-llm-openai :key user-api-key)~.  The client application will use this 
provider to call all the generic functions.
+* Setting up providers
+Users who use an application that uses this package should not need to install 
it.  The llm module should be installed as a dependency when you install the 
package that uses it.  You do need to make sure to both require and set up the 
provider you will be using.  Typically, applications will have a variable you 
can set.  For example, let's say there's a package called "llm-refactoring", 
which has a variable ~llm-refactoring-provider~.  You would set it up like so:
+
+#+begin_src emacs-lisp
+(use-package llm-refactoring
+  :init (setq llm-refactoring-provider (make-llm-openai :key my-openai-key))
+#+end_src
+
+Here ~my-openai-key~ would be a variable you set up before with your Open AI 
key.  Or, just substitute the key itself as a string.  It's important that you 
remember never to check your key into a public repository such as github, 
because your key must be kept private.  Anyone with your key can use the API, 
and you will be charged.
+** Open AI
+You can set up with ~make-llm-openai~, with the following parameters:
+- ~:key~, the Open AI key that you get when you sign up to use Open AI's APIs. 
 Remember to keep this private.  This is non-optional.
+- ~:chat-model~: A model name from the 
[[https://platform.openai.com/docs/models/gpt-4][list of Open AI's model 
names.]]  Keep in mind some of these are not available to everyone.  This is 
optional, and will default to a reasonable 3.5 model.
+- ~:embedding-model~: A model name from 
[[https://platform.openai.com/docs/guides/embeddings/embedding-models][list of 
Open AI's embedding model names.]]  This is optional, and will default to a 
reasonable model.
+** Vertex
+You can set up with ~make-llm-vertex~, with the following parameters:
+- ~:project~: Your project number from Google Cloud that has Vertex API 
enabled.
+- ~:chat-model~: A model name from the 
[[https://cloud.google.com/vertex-ai/docs/generative-ai/chat/chat-prompts#supported_model][list
 of Vertex's model names.]]  This is optional, and will default to a reasonable 
model.
+- ~:embedding-model~: A model name from the 
[[https://cloud.google.com/vertex-ai/docs/generative-ai/embeddings/get-text-embeddings#supported_models][list
 of Vertex's embedding model names.]]  This is optional, and will default to a 
reasonable model.
+
+In addition to the provider, which you may want multiple of (for example, to 
charge against different projects), there are customizable variables:
+- ~llm-vertex-gcloud-binary~: The binary to use for generating the API key.
+- ~llm-vertex-gcloud-region~: The gcloud region to use.  It's good to set this 
to a region near where you are for best latency.  Defaults to "us-central1".
+* Programmatic use
+Client applications should require the module, =llm=, an

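A minimal sketch of constructing the two providers described above; `my-openai-key' follows the README's own convention, and the Vertex project value is a placeholder:

(require 'llm-openai)
(require 'llm-vertex)

;; Keep real keys out of public repositories.
(setq my-chat-provider (make-llm-openai :key my-openai-key))
(setq my-vertex-provider (make-llm-vertex :project "my-gcp-project"))
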
[elpa] externals/llm 414d25a625 09/34: Removed various unused things, and format fixes

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit 414d25a625201acc0f7b87f6fdb8eca2b48d5bc8
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Removed various unused things, and format fixes

This fixes all byte-compile warnings, and notably fixes incorrect error-message
formatting in the vertex provider.
---
 llm-openai.el |  4 ++--
 llm-vertex.el | 16 
 2 files changed, 10 insertions(+), 10 deletions(-)

diff --git a/llm-openai.el b/llm-openai.el
index 45dee5fc4d..9478878322 100644
--- a/llm-openai.el
+++ b/llm-openai.el
@@ -117,7 +117,7 @@ signal and message."
 request-alist)
   (push '("function_call" . (("name" . "output"))) request-alist))
 
-(let* ((resp (request "https://api.openai.com/v1/chat/completions";
+(request "https://api.openai.com/v1/chat/completions";
   :type "POST"
   :headers `(("Authorization" . ,(format "Bearer %s" 
(llm-openai-key provider)))
  ("Content-Type" . "application/json"))
@@ -133,7 +133,7 @@ signal and message."
  (format "Problem calling Open 
AI: %s, type: %s message: %s"
  (cdr error-thrown)
  (assoc-default 'type 
(cdar data))
- (assoc-default 
'message (cdar data
+ (assoc-default 
'message (cdar data)
 
 (cl-defmethod llm-chat-response-async ((provider llm-openai) prompt 
response-callback error-callback)
   (llm-openai--chat-response provider prompt response-callback error-callback))
diff --git a/llm-vertex.el b/llm-vertex.el
index cbbf165e18..41fd97d1e9 100644
--- a/llm-vertex.el
+++ b/llm-vertex.el
@@ -71,7 +71,7 @@ KEY-GENTIME keeps track of when the key was generated, 
because the key must be r
 
 (cl-defmethod llm-embedding-async ((provider llm-vertex) string 
vector-callback error-callback)
   (llm-vertex-refresh-key provider)
-  (let ((resp (request (format 
"https://%s-aiplatform.googleapis.com/v1/projects/%s/locations/%s/publishers/google/models/%s:predict";
+  (request (format 
"https://%s-aiplatform.googleapis.com/v1/projects/%s/locations/%s/publishers/google/models/%s:predict";
llm-vertex-gcloud-region
(llm-vertex-project provider)
llm-vertex-gcloud-region
@@ -87,8 +87,8 @@ KEY-GENTIME keeps track of when the key was generated, 
because the key must be r
  (cdr (assoc 'values (cdr (assoc 
'embeddings (aref (cdr (assoc 'predictions data)) 0
 :error (cl-function (lambda (&key error-thrown data 
&allow-other-keys)
   (funcall error-callback
-   (error (format "Problem calling 
GCloud AI: %s"
- (cdr 
error-thrown)))
+   (error (format "Problem calling 
GCloud AI: %s (%S)"
+ (cdr error-thrown) 
data)))
 
 (cl-defmethod llm-chat-response-async ((provider llm-vertex) prompt 
response-callback error-callback)
   (llm-vertex-refresh-key provider)
@@ -116,7 +116,7 @@ KEY-GENTIME keeps track of when the key was generated, 
because the key must be r
 request-alist))
 (when (llm-chat-prompt-max-tokens prompt)
   (push `("max_tokens" . ,(llm-chat-prompt-max-tokens prompt)) 
request-alist))
-(let ((resp (request (format 
"https://%s-aiplatform.googleapis.com/v1/projects/%s/locations/%s/publishers/google/models/%s:predict";
+(request (format 
"https://%s-aiplatform.googleapis.com/v1/projects/%s/locations/%s/publishers/google/models/%s:predict";
llm-vertex-gcloud-region
(llm-vertex-project provider)
llm-vertex-gcloud-region
@@ -132,10 +132,10 @@ KEY-GENTIME keeps track of when the key was generated, 
because the key must be r
   :error (cl-function (lambda (&key error-thrown data 
&allow-other-keys)
 (funcall error-callback 'error
  (error (format "Problem 
calling GCloud AI: %s, status: %s message: %s (%s)"
-'error(cdr 
error-thrown)
-   (assoc-default 
'status (assoc-default 'error data))
-   (assoc-default 
'message (assoc-default 'error data))
-   data)))
+

[elpa] externals/llm 723c0b3786 31/34: Minor README whitespace and formatting fixes

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit 723c0b378645e0ba779dc93e43fae7b92dcb907f
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Minor README whitespace and formatting fixes
---
 README.org | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.org b/README.org
index d5ef7ead39..b9047e8103 100644
--- a/README.org
+++ b/README.org
@@ -1,11 +1,11 @@
 #+TITLE: llm package for emacs
 
 * Introduction
-This is a library for interfacing with Large Language Models.  It allows elisp 
code to use LLMs, but allows gives the end-user an option to choose which LLM 
they would prefer.  This is especially useful for LLMs, since there are various 
high-quality ones that in which API access costs money, as well as  locally 
installed ones that are free, but of medium quality.  Applications using LLMs 
can use this library to make sure their application works regardless of whether 
the user has a local  [...]
+This is a library for interfacing with Large Language Models.  It allows elisp 
code to use LLMs, but allows gives the end-user an option to choose which LLM 
they would prefer.  This is especially useful for LLMs, since there are various 
high-quality ones that in which API access costs money, as well as locally 
installed ones that are free, but of medium quality.  Applications using LLMs 
can use this library to make sure their application works regardless of whether 
the user has a local L [...]
 
 The functionality supported by LLMs is not completely consistent, nor are 
their APIs.  In this library we attempt to abstract functionality to a higher 
level, because sometimes those higher level concepts are supported by an API, 
and othertimes they must be put in more low-level concepts.  One such 
higher-level concept is "examples" where the client can show example 
interactions to demonstrate a pattern for the LLM.  The GCloud Vertex API has 
an explicit API for examples, but for Open AI [...]
 
-Some functionality may not be supported by LLMs.  Any unsupported 
functionality with throw a ='not-implemented= signal.
+Some functionality may not be supported by LLMs.  Any unsupported 
functionality with throw a ~'not-implemented~ signal.
 
 This package is simple at the moment, but will grow as both LLMs and 
functionality is added.
 * Setting up providers



[elpa] externals/llm dd20d6353c 21/34: Fix bug on llm-fake's error response to chat-response

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit dd20d6353c5bb5e02b38095c58a50bb86a2bea53
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Fix bug on llm-fake's error response to chat-response
---
 llm-fake.el | 7 +--
 1 file changed, 1 insertion(+), 6 deletions(-)

diff --git a/llm-fake.el b/llm-fake.el
index 172f7866d2..93b0b210d0 100644
--- a/llm-fake.el
+++ b/llm-fake.el
@@ -46,11 +46,6 @@ either a vector response for the chat, or a signal symbol and
 message cons. If nil, the response will be a simple vector."
  output-to-buffer chat-action-func embedding-action-func)
 
-(defun llm-fake--chat-response (provider prompt)
-  "Produce a fake chat response.
-PROVIDER, PROMPT are as in `llm-chat-response.'"
-  )
-
 (cl-defmethod llm-chat-response-async ((provider llm-fake) prompt 
response-callback error-callback)
   (condition-case err
   (funcall response-callback (llm-chat-response provider prompt))
@@ -68,7 +63,7 @@ PROVIDER, PROMPT are as in `llm-chat-response.'"
 (pcase (type-of result)
 ('string result)
 ('cons (signal (car result) (cdr result)))
-(_ (error "Incorrect type found in `chat-action-func': %s" 
(type-of-result)
+(_ (error "Incorrect type found in `chat-action-func': %s" 
(type-of result)
 "Sample response from `llm-chat-response-async'"))
 
 (cl-defmethod llm-embedding ((provider llm-fake) string)



[elpa] externals/llm 9057a50df4 11/34: Fix indenting in llm--run-async-as-sync

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit 9057a50df4b92eacebae1620cf06404b97367d3f
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Fix indenting in llm--run-async-as-sync
---
 llm.el | 10 +-
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/llm.el b/llm.el
index f513489f09..d9468ceaba 100644
--- a/llm.el
+++ b/llm.el
@@ -81,11 +81,11 @@ The return value will be the value passed into the success 
callback."
 (apply f (append args
  (list
   (lambda (result)
-   (setq response result)
-   (condition-notify cv))
- (lambda (type msg)
-   (signal type msg)
-   (condition-notify cv)
+(setq response result)
+(condition-notify cv))
+  (lambda (type msg)
+(signal type msg)
+(condition-notify cv)
 response))
 
 (cl-defgeneric llm-chat-response (provider prompt)



[elpa] externals/llm 444850a981 24/34: Fix missing word in non-free warning message

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit 444850a981bf312d01b0af677f007939a509ef5f
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Fix missing word in non-free warning message
---
 llm.el | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/llm.el b/llm.el
index e7c05c18a4..6c96aba4a2 100644
--- a/llm.el
+++ b/llm.el
@@ -51,7 +51,7 @@
 (defun llm--warn-on-nonfree (name tos)
   "Issue a warning if `llm-warn-on-nonfree' is non-nil."
   (when llm-warn-on-nonfree
-(lwarn '(llm nonfree) :warning "%s API is not free software, and your 
freedom to use it restricted.
+(lwarn '(llm nonfree) :warning "%s API is not free software, and your 
freedom to use it is restricted.
 See %s for the details on the restrictions on use." name tos)))
 
 (cl-defstruct llm-chat-prompt



[elpa] externals/llm 39ae6fc794 34/34: Assign copyright to FSF, in preparation of inclusion to GNU ELPA

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit 39ae6fc79450fa08f0a7505033ae497d7dcae976
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Assign copyright to FSF, in preparation of inclusion to GNU ELPA
---
 llm-fake.el   | 2 +-
 llm-openai.el | 2 +-
 llm-test.el   | 2 +-
 llm-tester.el | 2 +-
 llm-vertex.el | 2 +-
 llm.el| 2 +-
 6 files changed, 6 insertions(+), 6 deletions(-)

diff --git a/llm-fake.el b/llm-fake.el
index 76ac01d6bd..9ffdcdd936 100644
--- a/llm-fake.el
+++ b/llm-fake.el
@@ -1,6 +1,6 @@
 ;;; llm-fake.el --- Use for developers looking at llm calls. -*- 
lexical-binding: t -*-
 
-;; Copyright (c) 2023  Andrew Hyatt 
+;; Copyright (c) 2023  Free Software Foundation, Inc.
 
 ;; Author: Andrew Hyatt 
 ;; Homepage: https://github.com/ahyatt/llm
diff --git a/llm-openai.el b/llm-openai.el
index ba79e748af..11dcf2912c 100644
--- a/llm-openai.el
+++ b/llm-openai.el
@@ -1,6 +1,6 @@
 ;;; llm-openai.el --- llm module for integrating with Open AI -*- 
lexical-binding: t -*-
 
-;; Copyright (c) 2023  Andrew Hyatt 
+;; Copyright (c) 2023  Free Software Foundation, Inc.
 
 ;; Author: Andrew Hyatt 
 ;; Homepage: https://github.com/ahyatt/llm
diff --git a/llm-test.el b/llm-test.el
index e394c5895c..4179439fdf 100644
--- a/llm-test.el
+++ b/llm-test.el
@@ -1,6 +1,6 @@
 ;;; llm-test.el --- Unit tests for the llm module -*- lexical-binding: t -*-
 
-;; Copyright (c) 2023  Andrew Hyatt 
+;; Copyright (c) 2023  Free Software Foundation, Inc.
 
 ;; Author: Andrew Hyatt 
 ;; SPDX-License-Identifier: GPL-3.0-or-later
diff --git a/llm-tester.el b/llm-tester.el
index 0bc9723649..839b1e4627 100644
--- a/llm-tester.el
+++ b/llm-tester.el
@@ -1,6 +1,6 @@
 ;;; llm-tester.el --- Helpers for testing LLM implementation -*- 
lexical-binding: t -*-
 
-;; Copyright (c) 2023  Andrew Hyatt 
+;; Copyright (c) 2023  Free Software Foundation, Inc.
 
 ;; Author: Andrew Hyatt 
 ;; SPDX-License-Identifier: GPL-3.0-or-later
diff --git a/llm-vertex.el b/llm-vertex.el
index 3642b8bfce..3c465421c8 100644
--- a/llm-vertex.el
+++ b/llm-vertex.el
@@ -1,6 +1,6 @@
 ;;; llm-vertex.el --- LLM implementation of Google Cloud Vertex AI -*- 
lexical-binding: t -*-
 
-;; Copyright (c) 2023  Andrew Hyatt 
+;; Copyright (c) 2023  Free Software Foundation, Inc.
 
 ;; Author: Andrew Hyatt 
 ;; Homepage: https://github.com/ahyatt/llm
diff --git a/llm.el b/llm.el
index 034bedb797..11b508cb36 100644
--- a/llm.el
+++ b/llm.el
@@ -1,6 +1,6 @@
 ;;; llm.el --- Interface to pluggable llm backends -*- lexical-binding: t -*-
 
-;; Copyright (c) 2023  Andrew Hyatt 
+;; Copyright (c) 2023  Free Software Foundation, Inc.
 
 ;; Author: Andrew Hyatt 
 ;; Homepage: https://github.com/ahyatt/llm



[elpa] externals/llm c55ccf157a 03/34: Clean up package specifications in elisp files

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit c55ccf157ab42eb0fef9b3a13f369e9b2e0376a3
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Clean up package specifications in elisp files

Only llm.el should hold the requires, version, etc. Also, the keywords were 
not
correct.
---
 llm-openai.el | 3 ---
 llm-vertex.el | 3 ---
 llm.el| 1 -
 3 files changed, 7 deletions(-)

diff --git a/llm-openai.el b/llm-openai.el
index ad1a2e20c4..4e91f9c52d 100644
--- a/llm-openai.el
+++ b/llm-openai.el
@@ -4,9 +4,6 @@
 
 ;; Author: Andrew Hyatt 
 ;; Homepage: https://github.com/ahyatt/llm
-;; Package-Requires: ((request "0.3.3") (emacs "28.1"))
-;; Package-Version: 0.1
-;; Keywords: outlines, hypermedia
 ;; SPDX-License-Identifier: GPL-3.0-or-later
 ;;
 ;; This program is free software; you can redistribute it and/or
diff --git a/llm-vertex.el b/llm-vertex.el
index 22043a97ab..6c14f45cd0 100644
--- a/llm-vertex.el
+++ b/llm-vertex.el
@@ -4,9 +4,6 @@
 
 ;; Author: Andrew Hyatt 
 ;; Homepage: https://github.com/ahyatt/llm
-;; Package-Requires: ((request "0.3.3") (emacs "28.1"))
-;; Package-Version: 0.1
-;; Keywords: outlines, hypermedia
 ;; SPDX-License-Identifier: GPL-3.0-or-later
 ;;
 ;; This program is free software; you can redistribute it and/or
diff --git a/llm.el b/llm.el
index a88090306e..880fa3a0e7 100644
--- a/llm.el
+++ b/llm.el
@@ -6,7 +6,6 @@
 ;; Homepage: https://github.com/ahyatt/llm
 ;; Package-Requires: ((request "0.3.3") (emacs "28.1"))
 ;; Package-Version: 0.1
-;; Keywords: outlines, hypermedia
 ;; SPDX-License-Identifier: GPL-3.0-or-later
 ;;
 ;; This program is free software; you can redistribute it and/or



[elpa] externals/llm c8b14b4d9c 19/34: Fix fake provider embedding func and remove async unit tests

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit c8b14b4d9c87b2b3ac5004017825941f4bfe3461
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Fix fake provider embedding func and remove async unit tests
---
 llm-fake.el | 2 +-
 llm-test.el | 1 -
 2 files changed, 1 insertion(+), 2 deletions(-)

diff --git a/llm-fake.el b/llm-fake.el
index f6142c0dec..172f7866d2 100644
--- a/llm-fake.el
+++ b/llm-fake.el
@@ -80,7 +80,7 @@ PROVIDER, PROMPT are as in `llm-chat-response.'"
   (let* ((f (llm-fake-embedding-action-func provider))
  (result (funcall f)))
 (pcase (type-of result)
-('vector (funcall vector-callback result))
+('vector result)
 ('cons (signal (car result) (cdr result)))
 (_ (error "Incorrect type found in `chat-embedding-func': %s" 
(type-of-result)
 [0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9]))
diff --git a/llm-test.el b/llm-test.el
index 5470db0c8a..e7a87676ad 100644
--- a/llm-test.el
+++ b/llm-test.el
@@ -31,7 +31,6 @@
 
 (ert-deftest llm-test-embedding ()
   (should-error (llm-embedding nil "Test"))
-  (should-error (llm-embedding-async nil "Test"))
   ;; TODO: Test signals that are not errors, which ert doesn't seem to catch.
   (should-error (llm-embedding (make-llm-fake
 :embedding-action-func



[elpa] externals/llm eba797b295 04/34: Implement error handling for gcloud auth issues

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit eba797b295d320b3158fa4a491bbcf292417d0ac
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Implement error handling for gcloud auth issues
---
 llm-vertex.el | 7 ---
 1 file changed, 4 insertions(+), 3 deletions(-)

diff --git a/llm-vertex.el b/llm-vertex.el
index 6c14f45cd0..6bcc949079 100644
--- a/llm-vertex.el
+++ b/llm-vertex.el
@@ -63,9 +63,10 @@ KEY-GENTIME keeps track of when the key was generated, 
because the key must be r
   (unless (and (llm-vertex-key provider)
(> (* 60 60)
   (float-time (time-subtract (current-time) (or 
(llm-vertex-key-gentime provider) 0)
-(setf (llm-vertex-key provider)
-  (string-trim
-   (shell-command-to-string (concat llm-vertex-gcloud-binary " auth 
print-access-token"
+(let ((result (string-trim (shell-command-to-string (concat 
llm-vertex-gcloud-binary " auth print-access-token")
+  (when (string-match-p "ERROR" result)
+(error "Could not refresh gcloud access token, received the following 
error: %s" result))
+  (setf (llm-vertex-key provider) result))
 (setf (llm-vertex-key-gentime provider) (current-time
 
 (cl-defmethod llm-embedding ((provider llm-vertex) string)



[elpa] externals/llm c322577b9b 13/34: Test both sync and async commands

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit c322577b9b31f3b17e4540812fddad0156965144
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Test both sync and async commands
---
 llm-tester.el | 46 +-
 1 file changed, 41 insertions(+), 5 deletions(-)

diff --git a/llm-tester.el b/llm-tester.el
index 78a578d8a1..c6e09c9e09 100644
--- a/llm-tester.el
+++ b/llm-tester.el
@@ -34,8 +34,8 @@
 
 (require 'llm)
 
-(defun llm-tester-embedding (provider)
-  "Test that PROVIDER can provide embeddings."
+(defun llm-tester-embedding-async (provider)
+  "Test that PROVIDER can provide embeddings in an async call."
   (message "Testing provider %s for embeddings" (type-of provider))
   (llm-embedding-async provider "This is a test."
(lambda (embedding)
@@ -50,7 +50,20 @@
(lambda (type message)
  (message "ERROR: Provider %s returned an error of 
type %s with message %s" (type-of provider) type message
 
-(defun llm-tester-chat (provider)
+(defun llm-tester-embedding-sync (provider)
+  "Test that PROVIDER can provide embeddings in a sync call."
+  (message "Testing provider %s for embeddings" (type-of provider))
+  (let ((embedding (llm-embedding provider "This is a test.")))
+(if embedding
+(if (eq (type-of embedding) 'vector)
+(if (> (length embedding) 0)
+(message "SUCCESS: Provider %s provided an embedding of length 
%d.  First 10 values: %S" (type-of provider)
+ (length embedding)
+ (seq-subseq embedding 0 (min 10 (length embedding
+  (message "ERROR: Provider %s returned an empty embedding" 
(type-of provider
+  (message "ERROR: Provider %s did not return any embedding" (type-of 
provider)
+
+(defun llm-tester-chat-async (provider)
   "Test that PROVIDER can interact with the LLM chat."
   (message "Testing provider %s for chat" (type-of provider))
   (llm-chat-response-async
@@ -74,10 +87,33 @@
(lambda (type message)
  (message "ERROR: Provider %s returned an error of type %s with message 
%s" (type-of provider) type message
 
+(defun llm-tester-chat-sync (provider)
+  "Test that PROVIDER can interact with the LLM chat."
+  (message "Testing provider %s for chat" (type-of provider))
+  (let ((response (llm-chat-response
+   provider
+   (make-llm-chat-prompt
+:interactions (list
+   (make-llm-chat-prompt-interaction
+:role 'user
+:content "Tell me a random cool feature of 
emacs."))
+:context "You must answer all questions as if you were the 
butler Jeeves from Jeeves and Wooster.  Start all interactions with the phrase, 
'Very good, sir.'"
+:examples '(("Tell me the capital of France." . "Very 
good, sir.  The capital of France is Paris, which I expect you to be familiar 
with, since you were just there last week with your Aunt Agatha.")
+("Could you take me to my favorite place?" . 
"Very good, sir.  I believe you are referring to the Drone's Club, which I will 
take you to after you put on your evening attire."))
+:temperature 0.5
+:max-tokens 100
+(if response
+(if (> (length response) 0)
+(message "SUCCESS: Provider %s provided a response %s" (type-of 
provider) response)
+  (message "ERROR: Provider %s returned an empty response" (type-of 
provider)))
+  (message "ERROR: Provider %s did not return any response" (type-of 
provider)
+
 (defun llm-tester-all (provider)
   "Test all llm functionality for PROVIDER."
-  (llm-tester-embedding provider)
-  (llm-tester-chat provider))
+  (llm-tester-embedding-sync provider)
+  (llm-tester-chat-sync provider)
+  (llm-tester-embedding-async provider)
+  (llm-tester-chat-async provider))
 
 (provide 'llm-tester)
 



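A minimal sketch of driving these testers, assuming the fake provider introduced elsewhere in this series (any configured provider works the same way):

(require 'llm-tester)
(require 'llm-fake)

;; Runs the sync and async embedding and chat tests, reporting via `message'.
(llm-tester-all (make-llm-fake))
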
[elpa] externals/llm b52958757a 18/34: Fix docstring wider than 80 characters in llm-vertex

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit b52958757aefd1f1aa17f34adb2b79ccf9407afa
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Fix docstring wider than 80 characters in llm-vertex
---
 llm-vertex.el | 10 ++
 1 file changed, 6 insertions(+), 4 deletions(-)

diff --git a/llm-vertex.el b/llm-vertex.el
index e51e9c8d3b..25f0be4259 100644
--- a/llm-vertex.el
+++ b/llm-vertex.el
@@ -71,8 +71,9 @@ KEY-GENTIME keeps track of when the key was generated, 
because the key must be r
 
 (defun llm-vertex--embedding (provider string vector-callback error-callback 
sync)
   "Get the embedding for STRING.
-PROVIDER, VECTOR-CALLBACK, ERROR-CALLBACK are all the same as 
`llm-embedding-async'.
-SYNC, when non-nil, will wait until the response is available to return."
+PROVIDER, VECTOR-CALLBACK, ERROR-CALLBACK are all the same as
+`llm-embedding-async'. SYNC, when non-nil, will wait until the
+response is available to return."
   (llm-vertex-refresh-key provider)
   (request (format 
"https://%s-aiplatform.googleapis.com/v1/projects/%s/locations/%s/publishers/google/models/%s:predict";
llm-vertex-gcloud-region
@@ -107,8 +108,9 @@ SYNC, when non-nil, will wait until the response is 
available to return."
 
 (defun llm-vertex--chat-response (provider prompt response-callback 
error-callback sync)
   "Get the chat response for PROMPT.
-PROVIDER, RESPONSE-CALLBACK, ERROR-CALLBACK are all the same as 
`llm-chat-response-async'.
-SYNC, when non-nil, will wait until the response is available to return."
+PROVIDER, RESPONSE-CALLBACK, ERROR-CALLBACK are all the same as
+`llm-chat-response-async'. SYNC, when non-nil, will wait until
+the response is available to return."
   (llm-vertex-refresh-key provider)
   (let ((request-alist))
 (when (llm-chat-prompt-context prompt)



[elpa] externals/llm 4e9be8183d 07/34: Merge branch 'async'

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit 4e9be8183d11e7bf652328769e6be2ad3d46d1a3
Merge: 3919b77383 16ee85fd11
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Merge branch 'async'
---
 llm-openai.el | 36 +++--
 llm-tester.el | 65 +++
 llm.el| 37 --
 3 files changed, 90 insertions(+), 48 deletions(-)

diff --git a/llm-openai.el b/llm-openai.el
index 4e91f9c52d..3bc8a06f17 100644
--- a/llm-openai.el
+++ b/llm-openai.el
@@ -50,28 +50,34 @@ EMBEDDING-MODEL is the model to use for embeddings.  If 
unset, it
 will use a reasonable default."
   key chat-model embedding-model)
 
-(cl-defmethod llm-embedding ((provider llm-openai) string)
+(cl-defmethod llm-embedding-async ((provider llm-openai) string 
vector-callback error-callback)
   (unless (llm-openai-key provider)
 (error "To call Open AI API, provide the ekg-embedding-api-key"))
-  (let ((resp (request "https://api.openai.com/v1/embeddings";
+  (request "https://api.openai.com/v1/embeddings";
 :type "POST"
 :headers `(("Authorization" . ,(format "Bearer %s" 
ekg-embedding-api-key))
("Content-Type" . "application/json"))
 :data (json-encode `(("input" . ,string) ("model" . ,(or 
(llm-openai-embedding-model provider) "text-embedding-ada-002"
 :parser 'json-read
+:success (cl-function (lambda (&key data &allow-other-keys)
+(funcall vector-callback
+ (cdr (assoc 'embedding (aref 
(cdr (assoc 'data data)) 0))
 :error (cl-function (lambda (&key error-thrown data 
&allow-other-keys)
-  (error (format "Problem calling Open AI: 
%s, type: %s message: %s"
+  (funcall error-callback 'error
+   (format "Problem calling Open 
AI: %s, type: %s message: %s"
  (cdr error-thrown)
  (assoc-default 'type 
(cdar data))
- (assoc-default 'message 
(cdar data))
-:timeout 2
-:sync t)))
-(cdr (assoc 'embedding (aref (cdr (assoc 'data (request-response-data 
resp))) 0)
+ (assoc-default 'message 
(cdar data
 
-(defun llm-openai--chat-response (prompt &optional return-json-spec)
+(defun llm-openai--chat-response (prompt response-callback error-callback 
&optional return-json-spec)
   "Main method to send a PROMPT as a chat prompt to Open AI.
 RETURN-JSON-SPEC, if specified, is a JSON spec to return from the
-Open AI API."
+Open AI API.
+
+RESPONSE-CALLBACK is a function to call with the LLM response.
+
+ERROR-CALLBACK is called if there is an error, with the error
+signal and message."
   (unless (llm-openai-key provider)
 (error "To call Open AI API, the key must have been set"))
   (let (request-alist system-prompt)
@@ -116,14 +122,14 @@ Open AI API."
   :data (json-encode request-alist)
   :parser 'json-read
   :error (cl-function (lambda (&key error-thrown data 
&allow-other-keys)
-(error (format "Problem calling Open 
AI: %s, type: %s message: %s"
-   (cdr error-thrown)
-   (assoc-default 'type 
(cdar data))
-   (assoc-default 'message 
(cdar data))
-  :sync t)))
+(funcall error-callback
+ (format "Problem calling Open 
AI: %s, type: %s message: %s"
+ (cdr error-thrown)
+ (assoc-default 'type 
(cdar data))
+ (assoc-default 
'message (cdar data)
   (let ((result (cdr (assoc 'content (cdr (assoc 'message (aref (cdr 
(assoc 'choices (request-response-data resp))) 0))
 (func-result (cdr (assoc 'arguments (cdr (assoc 'function_call 
(cdr (assoc 'message (aref (cdr (assoc 'choices (request-response-data resp))) 
0)
-(or func-result result)
+(funcall result-callback (or func-result result))
 
 (cl-defmethod llm-chat-response ((provider llm-openai) prompt)
   (llm-openai--chat-response prompt nil))
diff --git a/llm-tester.el b/llm-tester.el
index 53938ae721..089e5cd5de 100644
--- a/llm-tester.el
+++ b/llm-tester.el
@@ -1,4 +1,4 @@
-;;; llm-tester.el --- Helpers for testing LLM implementation
+;;; llm-tester.

[elpa] externals/llm 0ed280c208 15/34: Add llm-fake, useful for developer testing using the llm methods

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit 0ed280c208efee3124eaf022accf47d493036de7
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Add llm-fake, useful for developer testing using the llm methods
---
 llm-fake.el | 75 +
 1 file changed, 75 insertions(+)

diff --git a/llm-fake.el b/llm-fake.el
new file mode 100644
index 00..f74f868c1f
--- /dev/null
+++ b/llm-fake.el
@@ -0,0 +1,75 @@
+;;; llm-fake.el --- Use for developers looking at llm calls. -*- 
lexical-binding: t -*-
+
+;; Copyright (c) 2023  Andrew Hyatt 
+
+;; Author: Andrew Hyatt 
+;; Homepage: https://github.com/ahyatt/llm
+;; SPDX-License-Identifier: GPL-3.0-or-later
+;;
+;; This program is free software; you can redistribute it and/or
+;; modify it under the terms of the GNU General Public License as
+;; published by the Free Software Foundation; either version 3 of the
+;; License, or (at your option) any later version.
+;;
+;; This program is distributed in the hope that it will be useful, but
+;; WITHOUT ANY WARRANTY; without even the implied warranty of
+;; MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+;; General Public License for more details.
+;;
+;; You should have received a copy of the GNU General Public License
+;; along with GNU Emacs.  If not, see .
+
+;;; Commentary:
+;; This file implements the llm functionality defined in llm.el, for developers
+;; who want to just understand what llm calls are made, and with what data. Or,
+;; to test out various functionality they have. The functions return something,
+;; or throw errors, depending on how the `llm-fake' provider is configured.
+
+(require 'cl-lib)
+(require 'llm)
+
+;;; Code:
+
+(cl-defstruct llm-fake
+ "A provider for the fake LLM provider.
+
+OUTPUT-TO-BUFFER can be nil, in which case, nothing will be
+output. If a string or a buffer, it will append the request as
+text to that buffer.
+
+CHAT-ACTION-FUNC will be called with no arguments to produce
+either a string response for the chat, or a signal symbol and
+message cons. If nil, the response will be a short text string.
+
+EMBEDDING-ACTION-FUNC will be called with no arguments to produce
+either a vector response for the chat, or a signal symbol and
+message cons. If nil, the response will be a simple vector."
+ output-to-buffer chat-action-func embedding-action-func)
+
+(cl-defmethod llm-chat-response-async ((provider llm-fake) prompt 
response-callback error-callback)
+  (when (llm-fake-output-to-buffer provider)
+(with-current-buffer (get-buffer-create (llm-fake-output-to-buffer 
provider))
+  (goto-char (point-max))
+  (insert "\nCall to llm-chat-response\n"  (llm-chat-prompt-to-text 
prompt) "\n")))
+  (or (when-let (f (llm-fake-chat-action-func provider))
+(let ((result (funcall f)))
+  (pcase (type-of result)
+('string (funcall response-callback result))
+('cons (funcall error-callback (car result) (cdr result)))
+(_ (error "Incorrect type found in `chat-action-func': %s" 
(type-of-result))
+  (funcall response-callback "Sample response from 
`llm-chat-response-async'")))
+
+(cl-defmethod llm-embedding-async ((provider llm-openai) string 
vector-callback error-callback)
+  (when (llm-fake-output-to-buffer provider)
+(with-current-buffer (get-buffer-create (llm-fake-output-to-buffer 
provider))
+  (goto-char (point-max))
+  (insert "\nCall to llm-embedding with text: " string "\n")))
+  (or (when-let (f (llm-fake-chat-action-func provider))
+(let ((result (funcall f)))
+  (pcase (type-of result)
+('vector (funcall vector-callback result))
+('cons (funcall error-callback (car result) (cdr result)))
+(_ (error "Incorrect type found in `chat-embedding-func': %s" 
(type-of-result))
+  (funcall response-callback [0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9])))
+
+(provide 'llm-fake)
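
As a usage note (illustrative, not part of the patch): a developer might wire
up the fake provider roughly like this; the buffer name, canned reply, and
prompt text are invented for the example:

    (require 'llm)
    (require 'llm-fake)

    (let ((provider (make-llm-fake
                     :output-to-buffer "*llm-fake-log*"
                     :chat-action-func (lambda () "A canned reply."))))
      (llm-chat-response-async
       provider
       (make-llm-chat-prompt
        :interactions (list (make-llm-chat-prompt-interaction
                             :role 'user :content "Hello?")))
       (lambda (text) (message "fake chat said: %s" text))
       (lambda (type message) (message "fake chat error (%s): %s" type message))))

With :output-to-buffer set, each call is also appended to that buffer, so the
developer can inspect the prompt text that would have been sent.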



[elpa] externals/llm cff9ab8f3c 22/34: Centralize nonfree llm warnings, and warn with a targeted type

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit cff9ab8f3c65f7ad92f0f0cb133df980cbcd4d6e
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Centralize nonfree llm warnings, and warn with a targeted type
---
 llm-openai.el | 4 +---
 llm-vertex.el | 4 +---
 llm.el| 6 ++
 3 files changed, 8 insertions(+), 6 deletions(-)

diff --git a/llm-openai.el b/llm-openai.el
index ec20d34875..a0e0c4fe56 100644
--- a/llm-openai.el
+++ b/llm-openai.el
@@ -51,9 +51,7 @@ will use a reasonable default."
   key chat-model embedding-model)
 
 (defun llm-openai--maybe-warn ()
-  (when llm-warn-on-nonfree
-(warn "Open AI's API is not free software, and your freedom to use it is 
restricted by Open AI's terms of service.
-See https://openai.com/policies/terms-of-use for the restrictions on use.")))
+  (llm--warn-on-nonfree "Open AI" "https://openai.com/policies/terms-of-use";))
 
 (defun llm-openai--embedding-make-request (provider string vector-callback 
error-callback sync)
   "Make a request to Open AI to get an embedding for STRING.
diff --git a/llm-vertex.el b/llm-vertex.el
index 551403a59e..e072ee533a 100644
--- a/llm-vertex.el
+++ b/llm-vertex.el
@@ -70,9 +70,7 @@ KEY-GENTIME keeps track of when the key was generated, 
because the key must be r
 (setf (llm-vertex-key-gentime provider) (current-time
 
 (defun llm-vertex-maybe-warn ()
-  (when llm-warn-on-nonfree
-(warn "Google Cloud's Vertex AI is not free software, and your freedom to 
use it is restricted by Google's terms of service.
-See https://policies.google.com/terms/generative-ai for more information.")))
+  (llm--warn-on-nonfree "Google Cloud Vertex" 
"https://policies.google.com/terms/generative-ai";))
 
 (defun llm-vertex--embedding (provider string vector-callback error-callback 
sync)
   "Get the embedding for STRING.
diff --git a/llm.el b/llm.el
index de2e05bbe3..408a5ab17b 100644
--- a/llm.el
+++ b/llm.el
@@ -48,6 +48,12 @@
   "Whether to issue a warning when using a non-free LLM."
   :type 'boolean)
 
+(defun llm--warn-on-nonfree (name tos)
+  "Issue a warning if `llm-warn-on-nonfree' is non-nil."
+  (when llm-warn-on-nonfree
+(lwarn '(llm nonfree) :warning "%s API is not free software, and your 
freedom to use it restricted.
+See %s for the details on the restrictions on use." name tos)))
+
 (cl-defstruct llm-chat-prompt
   "This stores all the information needed for a structured chat prompt.
 



[elpa] externals/llm d4bbe9d84c 29/34: Fix incorrect requires in openai and vertex implementations

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit d4bbe9d84caf2bec9d608c058beb7b986ecf2437
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Fix incorrect requires in openai and vertex implementations
---
 llm-openai.el | 1 +
 llm-vertex.el | 2 ++
 2 files changed, 3 insertions(+)

diff --git a/llm-openai.el b/llm-openai.el
index 70d0836e89..edb10f7862 100644
--- a/llm-openai.el
+++ b/llm-openai.el
@@ -26,6 +26,7 @@
 ;;; Code:
 
 (require 'cl-lib)
+(require 'llm)
 (require 'request)
 (require 'json)
 
diff --git a/llm-vertex.el b/llm-vertex.el
index 11543a4ca5..3642b8bfce 100644
--- a/llm-vertex.el
+++ b/llm-vertex.el
@@ -25,6 +25,8 @@
 
 (require 'cl-lib)
 (require 'llm)
+(require 'request)
+(require 'json)
 
 (defgroup llm-vertex nil
   "LLM implementation for Google Cloud Vertex AI."



[elpa] externals/llm 650bba65d5 25/34: Improve the docstring for llm--warn-on-nonfree

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit 650bba65d5c25d66be9fb932c0818f3a8d65ef12
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Improve the docstring for llm--warn-on-nonfree
---
 llm.el | 8 +++-
 1 file changed, 7 insertions(+), 1 deletion(-)

diff --git a/llm.el b/llm.el
index 6c96aba4a2..5d6202b18b 100644
--- a/llm.el
+++ b/llm.el
@@ -49,7 +49,13 @@
   :type 'boolean)
 
 (defun llm--warn-on-nonfree (name tos)
-  "Issue a warning if `llm-warn-on-nonfree' is non-nil."
+  "Issue a warning if `llm-warn-on-nonfree' is non-nil.
+NAME is the human readable name of the LLM (e.g 'Open AI').
+
+TOS is the URL of the terms of service for the LLM.
+
+All non-free LLMs should call this function on each llm function
+invocation."
   (when llm-warn-on-nonfree
 (lwarn '(llm nonfree) :warning "%s API is not free software, and your 
freedom to use it is restricted.
 See %s for the details on the restrictions on use." name tos)))
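
For concreteness, a provider module outside this repository would follow the
same pattern as llm-openai.el and llm-vertex.el above; the provider name and
terms URL here are hypothetical:

    ;; Hypothetical third-party provider; name and URL are placeholders.
    (defun my-llm-provider--maybe-warn ()
      (llm--warn-on-nonfree "Example Hosted LLM"
                            "https://example.com/terms-of-service"))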



[elpa] branch externals/llm created (now 39ae6fc794)

2023-09-15 Thread Andrew Hyatt
ahyatt pushed a change to branch externals/llm.

at  39ae6fc794 Assign copyright to FSF, in preparation of inclusion to 
GNU ELPA

This branch includes the following new commits:

   new  ad76cff80b Initial checkin of the llm package for emacs.
   new  3b761aca25 Add README.org
   new  c55ccf157a Clean up package specifications in elisp files
   new  eba797b295 Implement error handling for gcloud auth issues
   new  16ee85fd11 Add async options, and made the sync options just use 
those and wait
   new  3919b77383 Implement confusion and typos in README.org
   new  4e9be8183d Merge branch 'async'
   new  636014bf64 Make all remaining code async-friendly
   new  414d25a625 Removed various unused things, and format fixes
   new  ad230d9d6b Add methods for nil provider, to throw more meaningful 
errors
   new  9057a50df4 Fix indenting in llm--run-async-as-sync
   new  131a7ee5d3 Solve flaky errors when using sync llm commands
   new  c322577b9b Test both sync and async commands
   new  48ae59d149 Fix llm-chat-prompt-to-text, which was unusable
   new  0ed280c208 Add llm-fake, useful for developer testing using the llm 
methods
   new  cff5db8ad5 Add unit tests and fix all brokenness detected by them
   new  9a3fc01cac Switch from generic to per-provider sync solution
   new  b52958757a Fix docstring wider than 80 characters in llm-vertex
   new  c8b14b4d9c Fix fake provider embedding func and remove async unit 
tests
   new  9e3040bad2 Add warnings requested by GNU about nonfree software
   new  dd20d6353c Fix bug on llm-fake's error response to chat-response
   new  cff9ab8f3c Centralize nonfree llm warnings, and warn with a 
targeted type
   new  abbff2aa9d Change method name to llm-chat (without "-response"), 
update README
   new  444850a981 Fix missing word in non-free warning message
   new  650bba65d5 Improve the docstring for llm--warn-on-nonfree
   new  40151757de Switch to a method of nonfree warnings easier for 
provider modules
   new  e94bc937c7 Fix issue with llm-chat before method having too many 
arguments
   new  7edd36b2dc Fix obsolete or incorrect function calls in llm-fake
   new  d4bbe9d84c Fix incorrect requires in openai and vertex 
implementations
   new  ba65755326 Improve the README with information on providers for 
end-users
   new  723c0b3786 Minor README whitespace and formatting fixes
   new  8f30feb5c1 README improvements, including noting the nonfree llm 
warning
   new  b2f1605514 Delete some trailing whitespace
   new  39ae6fc794 Assign copyright to FSF, in preparation of inclusion to 
GNU ELPA




[elpa] externals/llm ad76cff80b 01/34: Initial checkin of the llm package for emacs.

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit ad76cff80b56ddf1c310eb2ec78f6547e39117de
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Initial checkin of the llm package for emacs.
---
 .gitignore|   2 +
 COPYING   | 674 ++
 llm-openai.el | 140 
 llm-tester.el |  80 +++
 llm-vertex.el | 139 
 llm.el| 114 ++
 6 files changed, 1149 insertions(+)

diff --git a/.gitignore b/.gitignore
new file mode 100644
index 00..257c9f7d33
--- /dev/null
+++ b/.gitignore
@@ -0,0 +1,2 @@
+*.elc
+*-autoloads.el
diff --git a/COPYING b/COPYING
new file mode 100644
index 00..f288702d2f
--- /dev/null
+++ b/COPYING
@@ -0,0 +1,674 @@
+GNU GENERAL PUBLIC LICENSE
+   Version 3, 29 June 2007
+
+ Copyright (C) 2007 Free Software Foundation, Inc. 
+ Everyone is permitted to copy and distribute verbatim copies
+ of this license document, but changing it is not allowed.
+
+Preamble
+
+  The GNU General Public License is a free, copyleft license for
+software and other kinds of works.
+
+  The licenses for most software and other practical works are designed
+to take away your freedom to share and change the works.  By contrast,
+the GNU General Public License is intended to guarantee your freedom to
+share and change all versions of a program--to make sure it remains free
+software for all its users.  We, the Free Software Foundation, use the
+GNU General Public License for most of our software; it applies also to
+any other work released this way by its authors.  You can apply it to
+your programs, too.
+
+  When we speak of free software, we are referring to freedom, not
+price.  Our General Public Licenses are designed to make sure that you
+have the freedom to distribute copies of free software (and charge for
+them if you wish), that you receive source code or can get it if you
+want it, that you can change the software or use pieces of it in new
+free programs, and that you know you can do these things.
+
+  To protect your rights, we need to prevent others from denying you
+these rights or asking you to surrender the rights.  Therefore, you have
+certain responsibilities if you distribute copies of the software, or if
+you modify it: responsibilities to respect the freedom of others.
+
+  For example, if you distribute copies of such a program, whether
+gratis or for a fee, you must pass on to the recipients the same
+freedoms that you received.  You must make sure that they, too, receive
+or can get the source code.  And you must show them these terms so they
+know their rights.
+
+  Developers that use the GNU GPL protect your rights with two steps:
+(1) assert copyright on the software, and (2) offer you this License
+giving you legal permission to copy, distribute and/or modify it.
+
+  For the developers' and authors' protection, the GPL clearly explains
+that there is no warranty for this free software.  For both users' and
+authors' sake, the GPL requires that modified versions be marked as
+changed, so that their problems will not be attributed erroneously to
+authors of previous versions.
+
+  Some devices are designed to deny users access to install or run
+modified versions of the software inside them, although the manufacturer
+can do so.  This is fundamentally incompatible with the aim of
+protecting users' freedom to change the software.  The systematic
+pattern of such abuse occurs in the area of products for individuals to
+use, which is precisely where it is most unacceptable.  Therefore, we
+have designed this version of the GPL to prohibit the practice for those
+products.  If such problems arise substantially in other domains, we
+stand ready to extend this provision to those domains in future versions
+of the GPL, as needed to protect the freedom of users.
+
+  Finally, every program is threatened constantly by software patents.
+States should not allow patents to restrict development and use of
+software on general-purpose computers, but in those that do, we wish to
+avoid the special danger that patents applied to a free program could
+make it effectively proprietary.  To prevent this, the GPL assures that
+patents cannot be used to render the program non-free.
+
+  The precise terms and conditions for copying, distribution and
+modification follow.
+
+   TERMS AND CONDITIONS
+
+  0. Definitions.
+
+  "This License" refers to version 3 of the GNU General Public License.
+
+  "Copyright" also means copyright-like laws that apply to other kinds of
+works, such as semiconductor masks.
+
+  "The Program" refers to any copyrightable work licensed under this
+License.  Each licensee is addressed as "you".  "Licensees" and
+"recipients" may be individuals or organizations.
+
+  To "modify" a work means to copy from or adapt all or part of the work
+in a fashion requiring copyright permission, other than the

[elpa] externals/llm 3b761aca25 02/34: Add README.org

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit 3b761aca251eea164003743ed02f173f5fab888d
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Add README.org
---
 README.org | 25 +
 1 file changed, 25 insertions(+)

diff --git a/README.org b/README.org
new file mode 100644
index 00..ee764e5d80
--- /dev/null
+++ b/README.org
@@ -0,0 +1,25 @@
+#+TITLE: llm package for emacs
+
+This is a library for interfacing with Large Language Models.  It allows elisp 
code to use LLMs, but gives the user an option to choose which LLM they would 
prefer.  This is especially useful for LLMs, since there are various 
high-quality ones that in which API access costs money, as well as  locally 
installed ones that are free, but of medium quality.  Applications using LLMs 
can use this library to make sure their application works regardless of whether 
the user has a local LLM or is p [...]
+
+The functionality supported by LLMs is not completely consistent, nor are 
their APIs.  In this library we attempt to abstract functionality to a higher 
level, because sometimes those higher level concepts are supported by an API, 
and othertimes they must be put in more low-level concepts.  Examples are an 
example of this; the GCloud Vertex API has an explicit API for examples, but 
for Open AI's API, examples must be specified by modifying the sytem prompt.  
And Open AI has the concept of [...]
+
+Some functionality may not be supported by LLMs.  Any unsupported 
functionality with throw a ='not-implemented= signal.
+
+This package is simple at the moment, but will grow as both LLMs and 
functionality is added.
+
+Clients should require the module, =llm=, and code against it.  Most functions 
are generic, and take a struct representing a provider as the first argument. 
The client code, or the user themselves can then require the specific module, 
such as =llm-openai=, and create a provider with a function such as 
~(make-llm-openai :key user-api-key)~.  The client application will use this 
provider to call all the generic functions.
+
+A list of all the functions:
+
+- ~llm-chat-response provider prompt~:  With user-chosen ~provider~ , and a 
~llm-chat-prompt~ structure (containing context, examples, interactions, and 
parameters such as temperature and max tokens), send that prompt to the LLM and 
wait for the string output.
+- ~llm-embedding provider string~: With the user-chosen ~provider~, send a 
string and get an embedding, which is a large vector of floating point values.  
The embedding represents the semantic meaning of the string, and the vector can 
be compared against other vectors, where smaller distances between the vectors 
represent greater semantic similarity.
+
+All of the providers currently implemented.
+
+- =llm-openai=.  This is the interface to Open AI's Chat GPT.  The user must 
set their key, and select their preferred chat and embedding model.
+- =llm-vertex=.  This is the interface to Google Cloud's Vertex API.  The user 
needs to set their project number.  In addition, to get authenticated, the user 
must have logged in initially, and have a valid path in 
~llm-vertex-gcloud-binary~.  Users can also configure 
~llm-vertex-gcloud-region~ for using a region closer to their location.  It 
defaults to ="us-central1"=  The provider can also contain the user's chosen 
embedding and chat model.
+
+If you are interested in creating a provider, please send a pull request, or 
open a bug.
+
+This library is not yet part of any package archive.
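
A minimal sketch of the client usage the README describes; the key string,
prompt text, and the name `my-llm-provider' are placeholders:

    (require 'llm)
    (require 'llm-openai)

    ;; "my-api-key" stands in for the user's real key.
    (defvar my-llm-provider (make-llm-openai :key "my-api-key"))

    ;; Chat: build a prompt structure and send it synchronously.
    (llm-chat-response my-llm-provider
                       (make-llm-chat-prompt
                        :interactions (list (make-llm-chat-prompt-interaction
                                             :role 'user
                                             :content "What is a minor mode?"))))

    ;; Embedding: returns a vector of floats for the given string.
    (llm-embedding my-llm-provider "What is a minor mode?")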



[elpa] externals/llm abbff2aa9d 23/34: Change method name to llm-chat (without "-response"), update README

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit abbff2aa9d8c1df46c9b3e44d6b2e96861f3fd50
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Change method name to llm-chat (without "-response"), update README
---
 README.org|  5 -
 llm-fake.el   |  6 +++---
 llm-openai.el | 10 +-
 llm-test.el   |  8 
 llm-tester.el |  4 ++--
 llm-vertex.el | 12 ++--
 llm.el|  8 
 7 files changed, 28 insertions(+), 25 deletions(-)

diff --git a/README.org b/README.org
index dea73f1a66..7856b6ef49 100644
--- a/README.org
+++ b/README.org
@@ -12,13 +12,16 @@ Clients should require the module, =llm=, and code against 
it.  Most functions a
 
 A list of all the functions:
 
-- ~llm-chat-response provider prompt~:  With user-chosen ~provider~ , and a 
~llm-chat-prompt~ structure (containing context, examples, interactions, and 
parameters such as temperature and max tokens), send that prompt to the LLM and 
wait for the string output.
+- ~llm-chat provider prompt~:  With user-chosen ~provider~ , and a 
~llm-chat-prompt~ structure (containing context, examples, interactions, and 
parameters such as temperature and max tokens), send that prompt to the LLM and 
wait for the string output.
+- ~llm-chat-async provider prompt response-callback error-callback~: Same as 
~llm-chat~, but executes in the background.  Takes a ~response-callback~ which 
will be called with the text response.  The ~error-callback~ will be called in 
case of error, with the error symbol and an error message.
 - ~llm-embedding provider string~: With the user-chosen ~provider~, send a 
string and get an embedding, which is a large vector of floating point values.  
The embedding represents the semantic meaning of the string, and the vector can 
be compared against other vectors, where smaller distances between the vectors 
represent greater semantic similarity.
+- ~llm-embedding-async provider string vector-callback error-callback~: Same 
as ~llm-embedding~ but this is processed asynchronously. ~vector-callback~ is 
called with the vector embedding, and, in case of error, ~error-callback~ is 
called with the same arguments as in ~llm-chat-async~.
 
 All of the providers currently implemented.
 
 - =llm-openai=.  This is the interface to Open AI's Chat GPT.  The user must 
set their key, and select their preferred chat and embedding model.
 - =llm-vertex=.  This is the interface to Google Cloud's Vertex API.  The user 
needs to set their project number.  In addition, to get authenticated, the user 
must have logged in initially, and have a valid path in 
~llm-vertex-gcloud-binary~.  Users can also configure 
~llm-vertex-gcloud-region~ for using a region closer to their location.  It 
defaults to ="us-central1"=  The provider can also contain the user's chosen 
embedding and chat model.
+- =llm-fake=.  This is a provider that is useful for developers using this 
library, to be able to understand what is being sent to the =llm= library 
without actually sending anything over the wire.
 
 If you are interested in creating a provider, please send a pull request, or 
open a bug.
 
diff --git a/llm-fake.el b/llm-fake.el
index 93b0b210d0..95aea76400 100644
--- a/llm-fake.el
+++ b/llm-fake.el
@@ -52,11 +52,11 @@ message cons. If nil, the response will be a simple vector."
 (t (funcall error-callback (car err) (cdr err
   nil)
 
-(cl-defmethod llm-chat-response ((provider llm-fake) prompt)
+(cl-defmethod llm-chat ((provider llm-fake) prompt)
   (when (llm-fake-output-to-buffer provider)
 (with-current-buffer (get-buffer-create (llm-fake-output-to-buffer 
provider))
   (goto-char (point-max))
-  (insert "\nCall to llm-chat-response\n"  (llm-chat-prompt-to-text 
prompt) "\n")))
+  (insert "\nCall to llm-chat\n"  (llm-chat-prompt-to-text prompt) "\n")))
   (if (llm-fake-chat-action-func provider)
   (let* ((f (llm-fake-chat-action-func provider))
  (result (funcall f)))
@@ -64,7 +64,7 @@ message cons. If nil, the response will be a simple vector."
 ('string result)
 ('cons (signal (car result) (cdr result)))
 (_ (error "Incorrect type found in `chat-action-func': %s" 
(type-of result)
-"Sample response from `llm-chat-response-async'"))
+"Sample response from `llm-chat-async'"))
 
 (cl-defmethod llm-embedding ((provider llm-fake) string)
   (when (llm-fake-output-to-buffer provider)
diff --git a/llm-openai.el b/llm-openai.el
index a0e0c4fe56..76d6ab45cd 100644
--- a/llm-openai.el
+++ b/llm-openai.el
@@ -89,7 +89,7 @@ should wait until the response is received."
 (lambda (_ error-message) (error 
error-message)) t)
 response))
 
-(defun llm-openai--chat-response (provider prompt response-callback 
error-callback &optional return-json-spec sync)
+(defun llm-openai--chat (provider prompt response-callback error-callback 
&optional return-json-spec sync)
   "Main method to send a PROMPT as a chat prompt to Open 

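A short sketch of the renamed entry points the README above now lists;
`provider' and `my-prompt' are assumed to exist already (for example from
`make-llm-openai' and `make-llm-chat-prompt'):

    ;; Synchronous: blocks and returns the response text.
    (llm-chat provider my-prompt)

    ;; Asynchronous: returns immediately and calls one of the two callbacks.
    (llm-chat-async provider my-prompt
                    (lambda (text) (message "LLM: %s" text))
                    (lambda (type message)
                      (message "LLM error (%s): %s" type message)))
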
[elpa] externals/llm 636014bf64 08/34: Make all remaining code async-friendly

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit 636014bf64a91d3ddbe3ba14e585e332f2b9820a
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Make all remaining code async-friendly

This finalizes (I hope) async changes to both openai and vertex providers, 
as
well as the tester.
---
 llm-openai.el | 22 +++---
 llm-tester.el | 39 ---
 llm-vertex.el | 27 ---
 3 files changed, 47 insertions(+), 41 deletions(-)

diff --git a/llm-openai.el b/llm-openai.el
index 3bc8a06f17..45dee5fc4d 100644
--- a/llm-openai.el
+++ b/llm-openai.el
@@ -69,11 +69,13 @@ will use a reasonable default."
  (assoc-default 'type 
(cdar data))
  (assoc-default 'message 
(cdar data
 
-(defun llm-openai--chat-response (prompt response-callback error-callback 
&optional return-json-spec)
+(defun llm-openai--chat-response (provider prompt response-callback 
error-callback &optional return-json-spec)
   "Main method to send a PROMPT as a chat prompt to Open AI.
 RETURN-JSON-SPEC, if specified, is a JSON spec to return from the
 Open AI API.
 
+PROVIDER is a `llm-openai' struct which holds the key and other options.
+
 RESPONSE-CALLBACK is a function to call with the LLM response.
 
 ERROR-CALLBACK is called if there is an error, with the error
@@ -121,22 +123,20 @@ signal and message."
  ("Content-Type" . "application/json"))
   :data (json-encode request-alist)
   :parser 'json-read
+  :success (cl-function
+(lambda (&key data &allow-other-keys)
+  (let ((result (cdr (assoc 'content (cdr (assoc 
'message (aref (cdr (assoc 'choices data)) 0))
+(func-result (cdr (assoc 'arguments (cdr 
(assoc 'function_call (cdr (assoc 'message (aref (cdr (assoc 'choices data)) 
0)
+(funcall response-callback (or func-result 
result)
   :error (cl-function (lambda (&key error-thrown data 
&allow-other-keys)
 (funcall error-callback
  (format "Problem calling Open 
AI: %s, type: %s message: %s"
  (cdr error-thrown)
  (assoc-default 'type 
(cdar data))
- (assoc-default 
'message (cdar data)
-  (let ((result (cdr (assoc 'content (cdr (assoc 'message (aref (cdr 
(assoc 'choices (request-response-data resp))) 0))
-(func-result (cdr (assoc 'arguments (cdr (assoc 'function_call 
(cdr (assoc 'message (aref (cdr (assoc 'choices (request-response-data resp))) 
0)
-(funcall result-callback (or func-result result))
-
-(cl-defmethod llm-chat-response ((provider llm-openai) prompt)
-  (llm-openai--chat-response prompt nil))
-
-(cl-defmethod llm-chat-structured-response ((provider llm-openai) prompt spec)
-  (llm-openai--chat-response prompt spec))
+ (assoc-default 
'message (cdar data
 
+(cl-defmethod llm-chat-response-async ((provider llm-openai) prompt 
response-callback error-callback)
+  (llm-openai--chat-response provider prompt response-callback error-callback))
 
 (provide 'llm-openai)
 
diff --git a/llm-tester.el b/llm-tester.el
index 089e5cd5de..78a578d8a1 100644
--- a/llm-tester.el
+++ b/llm-tester.el
@@ -53,25 +53,26 @@
 (defun llm-tester-chat (provider)
   "Test that PROVIDER can interact with the LLM chat."
   (message "Testing provider %s for chat" (type-of provider))
-  (llm-chat-response provider
- (make-llm-chat-prompt
-  :interactions (list
- (make-llm-chat-prompt-interaction
-  :role 'user
-  :content "Tell me a random cool feature 
of emacs."))
-  :context "You must answer all questions as if you were 
the butler Jeeves from Jeeves and Wooster.  Start all interactions with the 
phrase, 'Very good, sir.'"
-  :examples '(("Tell me the capital of France." . "Very 
good, sir.  The capital of France is Paris, which I expect you to be familiar 
with, since you were just there last week with your Aunt Agatha.")
-  ("Could you take me to my favorite place?" . 
"Very good, sir.  I believe you are referring to the Drone's Club, which I will 
take you to after you put on your evening attire."))
-  :temperature 0.5
-  :max-tokens 100)
- (lambda (response)
-   (if response
-

[elpa] externals/llm 131a7ee5d3 12/34: Solve flaky errors when using sync llm commands

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit 131a7ee5d304d52ae5641017883dc88bc055a39a
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Solve flaky errors when using sync llm commands

The solution was to hold the mutexes every time we're changing the closure 
state.
---
 llm.el | 16 ++--
 1 file changed, 10 insertions(+), 6 deletions(-)

diff --git a/llm.el b/llm.el
index d9468ceaba..29e907a093 100644
--- a/llm.el
+++ b/llm.el
@@ -76,16 +76,20 @@ error callback. This will block until the async function 
calls
 one of the callbacks.
 
 The return value will be the value passed into the success callback."
-  (let ((cv (make-condition-variable (make-mutex "llm-chat-response")))
-(response))
+  (let* ((mutex (make-mutex "llm-chat-response"))
+ (cv (make-condition-variable mutex))
+ (response))
 (apply f (append args
  (list
   (lambda (result)
-(setq response result)
-(condition-notify cv))
+(with-mutex mutex
+  (setq response result)
+  (condition-notify cv)))
   (lambda (type msg)
-(signal type msg)
-(condition-notify cv)
+(with-mutex mutex
+  (message "async to sync, got error")
+  (signal type msg)
+  (condition-notify cv))
 response))
 
 (cl-defgeneric llm-chat-response (provider prompt)
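
For readers unfamiliar with Emacs condition variables, a minimal,
self-contained illustration of the locking pattern this fix relies on (hold
the mutex both when mutating shared state and when notifying or waiting); it
is not part of the patch and assumes lexical binding:

    (let* ((mutex (make-mutex "demo"))
           (cv (make-condition-variable mutex "demo"))
           result)
      ;; Worker thread: change shared state and notify while holding the mutex.
      (make-thread (lambda ()
                     (with-mutex mutex
                       (setq result 42)
                       (condition-notify cv))))
      ;; Caller: wait on the condition variable under the same mutex.
      (with-mutex mutex
        (while (not result)
          (condition-wait cv)))
      result)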



[elpa] externals/llm 3919b77383 06/34: Implement confusion and typos in README.org

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit 3919b77383324173dcff352c506112fee903a646
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Implement confusion and typos in README.org

This fixes the problems noted in https://github.com/ahyatt/llm/pull/1 by
https://github.com/tvraman.
---
 README.org | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/README.org b/README.org
index ee764e5d80..dea73f1a66 100644
--- a/README.org
+++ b/README.org
@@ -1,8 +1,8 @@
 #+TITLE: llm package for emacs
 
-This is a library for interfacing with Large Language Models.  It allows elisp 
code to use LLMs, but gives the user an option to choose which LLM they would 
prefer.  This is especially useful for LLMs, since there are various 
high-quality ones that in which API access costs money, as well as  locally 
installed ones that are free, but of medium quality.  Applications using LLMs 
can use this library to make sure their application works regardless of whether 
the user has a local LLM or is p [...]
+This is a library for interfacing with Large Language Models.  It allows elisp 
code to use LLMs, but allows gives the end-user an option to choose which LLM 
they would prefer.  This is especially useful for LLMs, since there are various 
high-quality ones that in which API access costs money, as well as  locally 
installed ones that are free, but of medium quality.  Applications using LLMs 
can use this library to make sure their application works regardless of whether 
the user has a local  [...]
 
-The functionality supported by LLMs is not completely consistent, nor are 
their APIs.  In this library we attempt to abstract functionality to a higher 
level, because sometimes those higher level concepts are supported by an API, 
and othertimes they must be put in more low-level concepts.  Examples are an 
example of this; the GCloud Vertex API has an explicit API for examples, but 
for Open AI's API, examples must be specified by modifying the sytem prompt.  
And Open AI has the concept of [...]
+The functionality supported by LLMs is not completely consistent, nor are 
their APIs.  In this library we attempt to abstract functionality to a higher 
level, because sometimes those higher level concepts are supported by an API, 
and othertimes they must be put in more low-level concepts.  One such 
higher-level concept is "examples" where the client can show example 
interactions to demonstrate a pattern for the LLM.  The GCloud Vertex API has 
an explicit API for examples, but for Open AI [...]
 
 Some functionality may not be supported by LLMs.  Any unsupported 
functionality with throw a ='not-implemented= signal.
 



[elpa] externals/llm 9a3fc01cac 17/34: Switch from generic to per-provider sync solution

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit 9a3fc01cac06c17e00d36a48990a638217692238
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Switch from generic to per-provider sync solution

The previous method of converting async calls to sync had issues with
threading, even after some basic fixes to the method. It's more reliable to
handle this on a per-provider basis, by having all the providers actually
implement their own sync calls.
---
 llm-fake.el   | 29 ++
 llm-openai.el | 98 ++-
 llm-vertex.el | 89 +++--
 llm.el| 36 ++
 4 files changed, 155 insertions(+), 97 deletions(-)

diff --git a/llm-fake.el b/llm-fake.el
index 8a72ccebd1..f6142c0dec 100644
--- a/llm-fake.el
+++ b/llm-fake.el
@@ -46,7 +46,18 @@ either a vector response for the chat, or a signal symbol and
 message cons. If nil, the response will be a simple vector."
  output-to-buffer chat-action-func embedding-action-func)
 
+(defun llm-fake--chat-response (provider prompt)
+  "Produce a fake chat response.
+PROVIDER, PROMPT are as in `llm-chat-response.'"
+  )
+
 (cl-defmethod llm-chat-response-async ((provider llm-fake) prompt 
response-callback error-callback)
+  (condition-case err
+  (funcall response-callback (llm-chat-response provider prompt))
+(t (funcall error-callback (car err) (cdr err
+  nil)
+
+(cl-defmethod llm-chat-response ((provider llm-fake) prompt)
   (when (llm-fake-output-to-buffer provider)
 (with-current-buffer (get-buffer-create (llm-fake-output-to-buffer 
provider))
   (goto-char (point-max))
@@ -55,12 +66,12 @@ message cons. If nil, the response will be a simple vector."
   (let* ((f (llm-fake-chat-action-func provider))
  (result (funcall f)))
 (pcase (type-of result)
-('string (funcall response-callback result))
-('cons (funcall error-callback (car result) (cdr result)))
+('string result)
+('cons (signal (car result) (cdr result)))
 (_ (error "Incorrect type found in `chat-action-func': %s" 
(type-of-result)
-(funcall response-callback "Sample response from 
`llm-chat-response-async'")))
+"Sample response from `llm-chat-response-async'"))
 
-(cl-defmethod llm-embedding-async ((provider llm-fake) string vector-callback 
error-callback)
+(cl-defmethod llm-embedding ((provider llm-fake) string)
   (when (llm-fake-output-to-buffer provider)
 (with-current-buffer (get-buffer-create (llm-fake-output-to-buffer 
provider))
   (goto-char (point-max))
@@ -70,8 +81,14 @@ message cons. If nil, the response will be a simple vector."
  (result (funcall f)))
 (pcase (type-of result)
 ('vector (funcall vector-callback result))
-('cons (funcall error-callback (car result) (cdr result)))
+('cons (signal (car result) (cdr result)))
 (_ (error "Incorrect type found in `chat-embedding-func': %s" 
(type-of-result)
-(funcall vector-callback [0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9])))
+[0 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9]))
+
+(cl-defmethod llm-embedding-async ((provider llm-fake) string vector-callback 
error-callback)
+  (condition-case err
+  (funcall vector-callback (llm-embedding provider string))
+(t (funcall error-callback (car err) (cdr err
+  nil)
 
 (provide 'llm-fake)
diff --git a/llm-openai.el b/llm-openai.el
index 9478878322..199ee86f14 100644
--- a/llm-openai.el
+++ b/llm-openai.el
@@ -50,26 +50,42 @@ EMBEDDING-MODEL is the model to use for embeddings.  If 
unset, it
 will use a reasonable default."
   key chat-model embedding-model)
 
-(cl-defmethod llm-embedding-async ((provider llm-openai) string 
vector-callback error-callback)
+(defun llm-openai--embedding-make-request (provider string vector-callback 
error-callback sync)
+  "Make a request to Open AI to get an embedding for STRING.
+PROVIDER, VECTOR-CALLBACK and ERROR-CALLBACK are as in the
+`llm-embedding-async' call. SYNC is non-nil when the request
+should wait until the response is received."
   (unless (llm-openai-key provider)
-(error "To call Open AI API, provide the ekg-embedding-api-key"))
+(error "To call Open AI API, add a key to the `llm-openai' provider."))
   (request "https://api.openai.com/v1/embeddings";
-:type "POST"
-:headers `(("Authorization" . ,(format "Bearer %s" 
ekg-embedding-api-key))
-   ("Content-Type" . "application/json"))
-:data (json-encode `(("input" . ,string) ("model" . ,(or 
(llm-openai-embedding-model provider) "text-embedding-ada-002"
-:parser 'json-read
-:success (cl-function (lambda (&key data &allow-other-keys)
-(funcall vector-callback
-

[elpa] externals/llm 9e3040bad2 20/34: Add warnings requested by GNU about nonfree software

2023-09-15 Thread Andrew Hyatt
branch: externals/llm
commit 9e3040bad27b8d73c2292127ccfc2c612eed1e8e
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

Add warnings requested by GNU about nonfree software

Also this correctly sets up the LLM customization group.
---
 llm-openai.el | 7 +++
 llm-vertex.el | 7 +++
 llm.el| 8 
 3 files changed, 22 insertions(+)

diff --git a/llm-openai.el b/llm-openai.el
index 199ee86f14..ec20d34875 100644
--- a/llm-openai.el
+++ b/llm-openai.el
@@ -50,11 +50,17 @@ EMBEDDING-MODEL is the model to use for embeddings.  If 
unset, it
 will use a reasonable default."
   key chat-model embedding-model)
 
+(defun llm-openai--maybe-warn ()
+  (when llm-warn-on-nonfree
+(warn "Open AI's API is not free software, and your freedom to use it is 
restricted by Open AI's terms of service.
+See https://openai.com/policies/terms-of-use for the restrictions on use.")))
+
 (defun llm-openai--embedding-make-request (provider string vector-callback 
error-callback sync)
   "Make a request to Open AI to get an embedding for STRING.
 PROVIDER, VECTOR-CALLBACK and ERROR-CALLBACK are as in the
 `llm-embedding-async' call. SYNC is non-nil when the request
 should wait until the response is received."
+  (llm-openai--maybe-warn)
   (unless (llm-openai-key provider)
 (error "To call Open AI API, add a key to the `llm-openai' provider."))
   (request "https://api.openai.com/v1/embeddings";
@@ -98,6 +104,7 @@ ERROR-CALLBACK is called if there is an error, with the error
 signal and message.
 
 SYNC is non-nil when the request should wait until the response is received."
+  (llm-openai--maybe-warn)
   (unless (llm-openai-key provider)
 (error "To call Open AI API, the key must have been set"))
   (let (request-alist system-prompt)
diff --git a/llm-vertex.el b/llm-vertex.el
index 25f0be4259..551403a59e 100644
--- a/llm-vertex.el
+++ b/llm-vertex.el
@@ -69,12 +69,18 @@ KEY-GENTIME keeps track of when the key was generated, 
because the key must be r
   (setf (llm-vertex-key provider) result))
 (setf (llm-vertex-key-gentime provider) (current-time
 
+(defun llm-vertex-maybe-warn ()
+  (when llm-warn-on-nonfree
+(warn "Google Cloud's Vertex AI is not free software, and your freedom to 
use it is restricted by Google's terms of service.
+See https://policies.google.com/terms/generative-ai for more information.")))
+
 (defun llm-vertex--embedding (provider string vector-callback error-callback 
sync)
   "Get the embedding for STRING.
 PROVIDER, VECTOR-CALLBACK, ERROR-CALLBACK are all the same as
 `llm-embedding-async'. SYNC, when non-nil, will wait until the
 response is available to return."
   (llm-vertex-refresh-key provider)
+  (llm-vertex-maybe-warn)
   (request (format 
"https://%s-aiplatform.googleapis.com/v1/projects/%s/locations/%s/publishers/google/models/%s:predict";
llm-vertex-gcloud-region
(llm-vertex-project provider)
@@ -112,6 +118,7 @@ PROVIDER, RESPONSE-CALLBACK, ERROR-CALLBACK are all the 
same as
 `llm-chat-response-async'. SYNC, when non-nil, will wait until
 the response is available to return."
   (llm-vertex-refresh-key provider)
+  (llm-vertex-maybe-warn)
   (let ((request-alist))
 (when (llm-chat-prompt-context prompt)
   (push `("context" . ,(llm-chat-prompt-context prompt)) request-alist))
diff --git a/llm.el b/llm.el
index f01a130cf8..de2e05bbe3 100644
--- a/llm.el
+++ b/llm.el
@@ -40,6 +40,14 @@
 
 (require 'cl-lib)
 
+(defgroup llm nil
+  "Interface to pluggable llm backends."
+  :group 'external)
+
+(defcustom llm-warn-on-nonfree t
+  "Whether to issue a warning when using a non-free LLM."
+  :type 'boolean)
+
 (cl-defstruct llm-chat-prompt
   "This stores all the information needed for a structured chat prompt.
 
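A user who has read the linked terms and does not want the warning can turn it
off; this is ordinary customization, shown here only for completeness:

    (setq llm-warn-on-nonfree nil)
    ;; or interactively: M-x customize-group RET llm RET
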



[elpa] main 884336d14e: elpa-packages(llm): New package

2023-09-15 Thread Andrew Hyatt
branch: main
commit 884336d14e8b62e4da80bdb3a16817013220039d
Author: Andrew Hyatt 
Commit: Andrew Hyatt 

elpa-packages(llm): New package
---
 elpa-packages | 1 +
 1 file changed, 1 insertion(+)

diff --git a/elpa-packages b/elpa-packages
index 45a534ce6e..ae4e9804d9 100644
--- a/elpa-packages
+++ b/elpa-packages
@@ -415,6 +415,7 @@
   :doc "README.org"
   :news "CHANGELOG.org"
   :ignored-files ("COPYING" "doclicense.texi"))
+ (llm  :url "https://github.com/ahyatt/llm";)
  (lmc  :url nil)
  (load-dir :url nil)
  (load-relative:url 
"https://github.com/rocky/emacs-load-relative";)