branch: externals/minuet
commit aa0920b5c914d197536c7c8af784d70bc560f299
Author: Milan Glacier <d...@milanglacier.com>
Commit: Milan Glacier <d...@milanglacier.com>

    docs: add server demand note for deepseek model
---
 README.md | 5 +++++
 1 file changed, 5 insertions(+)

diff --git a/README.md b/README.md
index 93a6f74f04..c747b7db79 100644
--- a/README.md
+++ b/README.md
@@ -176,6 +176,11 @@ consider using the `deepseek-chat` model, which is compatible with both
 inference, you can deploy either `qwen-2.5-coder` or `deepseek-coder-v2`
 through Ollama using the `openai-fim-compatible` provider.
 
+Note: as of January 27, 2025, the high server demand from deepseek may
+significantly slow down the default provider used by Minuet
+(`openai-fim-compatible` with deepseek). We recommend trying
+alternative providers instead.
+
 # Prompt
 
 See [prompt](./prompt.md) for the default prompt used by `minuet` and
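For context, switching away from the default deepseek-backed provider is done in Minuet's setup call. Below is a minimal sketch of pointing the `openai-fim-compatible` provider at a locally served `qwen-2.5-coder` model via Ollama; the option names (`provider_options`, `end_point`, `model`, `api_key`) are assumptions about the plugin's configuration keys and are not part of this diff, so check the README for the exact spelling.

```lua
-- Hypothetical sketch: use a local Ollama endpoint instead of the default
-- deepseek-backed `openai-fim-compatible` provider. Option names are assumed,
-- not taken from this commit.
require('minuet').setup {
  provider = 'openai_fim_compatible',
  provider_options = {
    openai_fim_compatible = {
      name = 'Ollama',
      end_point = 'http://localhost:11434/v1/completions', -- local Ollama server
      model = 'qwen2.5-coder',                              -- alternative to deepseek
      api_key = 'TERM', -- dummy environment variable; a local server needs no real key
    },
  },
}
```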