This is an automated email from the ASF dual-hosted git repository.

orpiske pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/camel.git

commit 2f19294b4d9270d6664ee4af6e7528ce3ba3bd6c
Author: Otavio Rodolfo Piske <angusyo...@gmail.com>
AuthorDate: Thu Aug 1 17:03:43 2024 +0200

    CAMEL-21040: fix documentation in camel-langchain4j-chat
---
 .../src/main/docs/langchain4j-chat-component.adoc                 | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/components/camel-ai/camel-langchain4j-chat/src/main/docs/langchain4j-chat-component.adoc b/components/camel-ai/camel-langchain4j-chat/src/main/docs/langchain4j-chat-component.adoc
index 6735cb1852c..afc906d5586 100644
--- a/components/camel-ai/camel-langchain4j-chat/src/main/docs/langchain4j-chat-component.adoc
+++ b/components/camel-ai/camel-langchain4j-chat/src/main/docs/langchain4j-chat-component.adoc
@@ -76,7 +76,7 @@ Add the dependency for LangChain4j OpenAI support:
 </dependency>
 ----
 
-Init the OpenAI Chat Language Model, add add it to the Camel Registry:
+Init the OpenAI Chat Language Model, and add it to the Camel Registry:
 [source, java]
 ----
 ChatLanguageModel model = OpenAiChatModel.builder()
@@ -98,7 +98,7 @@ Use the model in the Camel LangChain4j Chat Producer
 
 [NOTE]
 ====
-To switch to another Large Language Model and its corresponding dependency, simply replace the `langchain4j-open-ai` dependency with the appropriate dependency for the desired model. Update the initialization parameters accordingly in the code snippet provided above.
+To switch to another Large Language Model and its corresponding dependency, replace the `langchain4j-open-ai` dependency with the appropriate dependency for the desired model. Update the initialization parameters accordingly in the code snippet provided above.
 ====
 
 == Send a prompt with variables
@@ -150,8 +150,8 @@ String response = template.requestBody("direct:send-multiple", messages, String.
 ----
 
 == Chat with Tool
-Camel langchain4j-chat component as a consumer can be used to implement a LangChain Tool,
-right now Tools are supported only via the OpenAiChatModel backed by OpenAI APIs.
+Camel langchain4j-chat component as a consumer can be used to implement a LangChain tool.
+Right now tools are supported only via the OpenAiChatModel backed by OpenAI APIs.
 
 Tool Input parameter can be defined as an Endpoint multiValue option in the form of `parameter.<name>=<type>`,
 or via the endpoint option camelToolParameter for a programmatic approach.
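For context, the first hunk above cuts the doc's OpenAI snippet off at its opening line. A minimal sketch of the kind of model initialization and Camel Registry binding that section describes could look like the following; the builder option values, the environment-variable lookup, and the bean name "chatModel" are illustrative assumptions, not content taken from this commit:

[source, java]
----
import static java.time.Duration.ofSeconds;

import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import org.apache.camel.CamelContext;

public final class ChatModelSetup {

    // Builds an OpenAI chat model and binds it to the Camel Registry.
    // Option values and the bean name "chatModel" are illustrative assumptions.
    public static void bindChatModel(CamelContext camelContext) {
        ChatLanguageModel model = OpenAiChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY")) // assumption: key taken from the environment
                .modelName("gpt-3.5-turbo")
                .temperature(0.3)
                .timeout(ofSeconds(30))
                .build();

        camelContext.getRegistry().bind("chatModel", model);
    }
}
----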
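Likewise, the last hunk's context lines describe declaring tool input parameters on a consumer endpoint as `parameter.<name>=<type>` multiValue options. A hedged sketch of such a consumer route is below; the chat id, the `description` option value, the parameter name, and reading the tool argument back as a message header are assumptions for illustration only:

[source, java]
----
import org.apache.camel.builder.RouteBuilder;

public class UserToolRoute extends RouteBuilder {

    @Override
    public void configure() {
        // Consumer route acting as a tool. The chat id "userTool" and the
        // description text are illustrative; "parameter.userId=integer"
        // follows the parameter.<name>=<type> form described in the doc.
        from("langchain4j-chat:userTool"
                + "?description=Look up a user by id"
                + "&parameter.userId=integer")
            .process(exchange -> {
                // Assumption: the tool argument is delivered as a message header
                // named after the declared parameter.
                Integer userId = exchange.getMessage().getHeader("userId", Integer.class);
                // The body set here is returned to the language model as the tool result.
                exchange.getMessage().setBody("User " + userId + " is active");
            });
    }
}
----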
