branch: externals/ellama
commit d9de354092ae1267f5c0f5a91177b402bccf7294
Author: Sergey Kostyaev <[email protected]>
Commit: Sergey Kostyaev <[email protected]>
Add simple configuration example to readme
---
README.org | 15 ++++++++++++++-
1 file changed, 14 insertions(+), 1 deletion(-)
diff --git a/README.org b/README.org
index 205216a3dc..f33a4524e4 100644
--- a/README.org
+++ b/README.org
@@ -31,6 +31,19 @@ You can use ~ellama~ with other model or other llm provider.
Without any configuration, the first available ollama model will be used.
You can customize ellama configuration like this:
+#+BEGIN_SRC emacs-lisp
+ (use-package ellama
+ :ensure t
+ :bind ("C-c e" . ellama-transient-main-menu)
+ :config
+ ;; show ellama context in header line in all buffers
+ (ellama-context-header-line-global-mode +1)
+ ;; send last message in chat buffer with C-c C-c
+ (add-hook 'org-ctrl-c-ctrl-c-final-hook #'ellama-chat-send-last-message))
+#+END_SRC
+
+More sophisticated configuration example:
+
#+BEGIN_SRC emacs-lisp
(use-package ellama
:ensure t
@@ -99,7 +112,7 @@ You can customize ellama configuration like this:
;; show ellama context in header line in all buffers
(ellama-context-header-line-global-mode +1)
;; send last message in chat buffer with C-c C-c
- (add-hook 'org-ctrl-c-ctrl-c-hook #'ellama-chat-send-last-message))
+ (add-hook 'org-ctrl-c-ctrl-c-final-hook #'ellama-chat-send-last-message))
#+END_SRC
** Commands