branch: elpa/gptel
commit cdf27ce064cb77e124a8a0ef1222bf4f935217e9
Author: Dmitry <tumas...@yandex.ru>
Commit: GitHub <nore...@github.com>
    gptel: Add AI/ML API Integration (#930)

    README: Add instructions for AI/ML API integration.
    gptel.el (Commentary): Mention support for AI/ML API.
---
 README.org | 37 +++++++++++++++++++++++++++++++++++++
 gptel.el   |  4 ++--
 2 files changed, 39 insertions(+), 2 deletions(-)

diff --git a/README.org b/README.org
index 5cc6c78a46..d8ce6a474c 100644
--- a/README.org
+++ b/README.org
@@ -24,6 +24,7 @@ gptel is a simple Large Language Model chat client for Emacs, with support for m
 | Mistral Le Chat | ✓ | [[https://console.mistral.ai/api-keys][API key]] |
 | Perplexity      | ✓ | [[https://docs.perplexity.ai/docs/getting-started][API key]] |
 | OpenRouter      | ✓ | [[https://openrouter.ai/keys][API key]] |
+| AI/ML API       | ✓ | [[https://aimlapi.com/app/?utm_source=gptel&utm_medium=github&utm_campaign=integration][API key]] |
 | together.ai     | ✓ | [[https://api.together.xyz/settings/api-keys][API key]] |
 | Anyscale        | ✓ | [[https://docs.endpoints.anyscale.com/][API key]] |
 | PrivateGPT      | ✓ | [[https://github.com/zylon-ai/private-gpt#-documentation][PrivateGPT running locally]] |
@@ -111,6 +112,7 @@ gptel uses Curl if available, but falls back to the built-in url-retrieve to wor
     - [[#groq][Groq]]
     - [[#mistral-le-chat][Mistral Le Chat]]
     - [[#openrouter][OpenRouter]]
+    - [[#aiml-api][AI/ML API]]
     - [[#privategpt][PrivateGPT]]
     - [[#deepseek][DeepSeek]]
     - [[#sambanova-deepseek][Sambanova (Deepseek)]]
@@ -924,6 +926,41 @@ The above code makes the backend available to select. If you want it to be the
 
 #+html: </details>
 
+#+html: <details><summary>
+**** AI/ML API
+#+html: </summary>
+
+AI/ML API provides 300+ AI models including Deepseek, Gemini, ChatGPT. The models run at enterprise-grade rate limits and uptimes.
+
+Register a backend with
+#+begin_src emacs-lisp
+;; AI/ML API offers an OpenAI compatible API
+(gptel-make-openai "AI/ML API"            ;Any name you want
+  :host "api.aimlapi.com"
+  :endpoint "/v1/chat/completions"
+  :stream t
+  :key "your-api-key"                     ;can be a function that returns the key
+  :models '(deepseek-chat gemini-pro gpt-4o))
+#+end_src
+
+You can pick this backend from the menu when using gptel (see [[#usage][Usage]]).
+
+***** (Optional) Set as the default gptel backend
+
+The above code makes the backend available to select.  If you want it to be the default backend for gptel, you can set this as the value of =gptel-backend=.  Use this instead of the above.
+#+begin_src emacs-lisp
+;; OPTIONAL configuration
+(setq gptel-model 'gpt-4o
+      gptel-backend
+      (gptel-make-openai "AI/ML API"
+        :host "api.aimlapi.com"
+        :endpoint "/v1/chat/completions"
+        :stream t
+        :key "your-api-key"
+        :models '(deepseek-chat gemini-pro gpt-4o)))
+#+end_src
+
+#+html: </details>
+
 #+html: <details><summary>
 **** Github CopilotChat
 #+html: </summary>
diff --git a/gptel.el b/gptel.el
index fc8dbf5951..8e4bdee750 100644
--- a/gptel.el
+++ b/gptel.el
@@ -35,7 +35,7 @@
 ;; gptel supports:
 ;;
 ;; - The services ChatGPT, Azure, Gemini, Anthropic AI, Together.ai, Perplexity,
-;;   Anyscale, OpenRouter, Groq, PrivateGPT, DeepSeek, Cerebras, Github Models,
+;;   AI/ML API, Anyscale, OpenRouter, Groq, PrivateGPT, DeepSeek, Cerebras, Github Models,
 ;;   GitHub Copilot chat, AWS Bedrock, Novita AI, xAI, Sambanova, Mistral Le
 ;;   Chat and Kagi (FastGPT & Summarizer).
 ;; - Local models via Ollama, Llama.cpp, Llamafiles or GPT4All
@@ -73,7 +73,7 @@
 ;; - For Gemini: define a gptel-backend with `gptel-make-gemini', which see.
 ;; - For Anthropic (Claude): define a gptel-backend with `gptel-make-anthropic',
 ;;   which see.
-;; - For Together.ai, Anyscale, Groq, OpenRouter, DeepSeek, Cerebras or
+;; - For AI/ML API, Together.ai, Anyscale, Groq, OpenRouter, DeepSeek, Cerebras or
 ;;   Github Models: define a gptel-backend with `gptel-make-openai', which see.
 ;; - For PrivateGPT: define a backend with `gptel-make-privategpt', which see.
 ;; - For Perplexity: define a backend with `gptel-make-perplexity', which see.
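
For reference, a minimal sketch (not part of the commit above) of how the backend added in the README diff might be registered with the API key supplied by a function, as the diff's comment suggests, rather than hard-coded. The host, endpoint and model list are taken from the diff; the use of auth-source and the ~/.authinfo entry shown in the comments are assumptions.

#+begin_src emacs-lisp
;; Sketch only: register the AI/ML API backend with the key read from
;; auth-source instead of a literal string.  Assumes ~/.authinfo (or
;; ~/.authinfo.gpg) contains a line like:
;;   machine api.aimlapi.com login apikey password <your-key>
(require 'auth-source)

(gptel-make-openai "AI/ML API"
  :host "api.aimlapi.com"
  :endpoint "/v1/chat/completions"
  :stream t
  ;; :key accepts a function that returns the key at request time
  :key (lambda () (auth-source-pick-first-password :host "api.aimlapi.com"))
  :models '(deepseek-chat gemini-pro gpt-4o))
#+end_src

After evaluating this, "AI/ML API" appears in gptel's backend menu as described in the README section added by this commit.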