orpiske commented on code in PR #13341:
URL: https://github.com/apache/camel/pull/13341#discussion_r1521651406


##########
components/camel-ai/camel-langchain-chat/src/main/java/docs/langchain-chat-component.adoc:
##########
@@ -0,0 +1,146 @@
+= Langchain4j Chat Component
+:doctitle: Langchain4j Chat
+:shortname: langchain-chat
+:artifactid: camel-langchain-chat
+:description: Langchain4j Chat
+:since: 4.5
+:supportlevel: Preview
+:tabs-sync-option:
+:component-header: Only producer is supported
+//Manually maintained attributes
+:camel-spring-boot-name: langchain-chat
+
+*Since Camel {since}*
+
+*{component-header}*
+
+The Langchain Chat Component allows you to integrate with any LLM supported by https://github.com/langchain4j/langchain4j[Langchain4j].
+
+Maven users will need to add the following dependency to their `pom.xml`
+for this component:
+
+[source,xml]
+----
+<dependency>
+    <groupId>org.apache.camel</groupId>
+    <artifactId>camel-langchain-chat</artifactId>
+    <version>x.x.x</version>
+    <!-- use the same version as your Camel core version -->
+</dependency>
+----
+
+== URI format
+
+[source]
+----
+langchain-chat:chatId[?options]
+----
+
+Where *chatId* can be any string to uniquely identify the endpoint.
+
+
+// component-configure options: START
+
+// component-configure options: END
+
+// component options: START
+include::partial$component-configure-options.adoc[]
+include::partial$component-endpoint-options.adoc[]
+// component options: END
+
+// endpoint options: START
+
+// endpoint options: END
+
+include::spring-boot:partial$starter.adoc[]
+
+== Using a specific Chat Model
+The Camel Langchain chat component provides an abstraction for interacting with various types of Large Language Models supported by https://github.com/langchain4j/langchain4j[Langchain4j].
+
+To integrate with a specific Large Language Model, users should follow these steps:
+
+=== Example of Integrating with OpenAI
+Add the dependency for Langchain4j OpenAI support:
+
+[source,xml]
+----
+<dependency>
+    <groupId>dev.langchain4j</groupId>
+    <artifactId>langchain4j-open-ai</artifactId>
+    <version>x.x.x</version>
+</dependency>
+----
+
+Initialize the OpenAI Chat Language Model, and add it to the Camel Registry:
+[source, java]
+----
+ChatLanguageModel model = OpenAiChatModel.builder()
+                .apiKey(openApiKey)
+                .modelName(GPT_3_5_TURBO)
+                .temperature(0.3)
+                .timeout(ofSeconds(3000))
+                .build();
+context.getRegistry().bind("chatModel", model);
+----
+
+Use the model in the Camel Langchain Chat Producer:
+[source, java]
+----
+from("direct:chat")
+    .to("langchain-chat:test?chatModel=#chatModel");
+----
+
+_NOTE:_ To switch to another Large Language Model and its corresponding dependency, simply replace the `langchain4j-open-ai` dependency with the appropriate dependency for the desired model. Update the initialization parameters accordingly in the code snippet provided above.

Review Comment:
   Use this: ```NOTE: To switch to another Large Language Model and its corresponding dependency...``` so it's properly rendered on the website as a note paragraph. Alternatively:
   
   ```
   [NOTE]
   ====
   text goes here
   ====
   ```
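   
   For instance, applied to the paragraph above, the inline form would read something like:
   
   ```
   NOTE: To switch to another Large Language Model and its corresponding dependency, simply replace the `langchain4j-open-ai` dependency with the appropriate dependency for the desired model.
   ```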



##########
components/camel-ai/camel-langchain-chat/src/main/java/docs/langchain-chat-component.adoc:
##########
@@ -0,0 +1,146 @@
+= Langchain4j Chat Component
+:doctitle: Langchain4j Chat
+:shortname: langchain-chat
+:artifactid: camel-langchain-chat
+:description: Langchain4j Chat
+:since: 4.5
+:supportlevel: Preview
+:tabs-sync-option:
+:component-header: Only producer is supported
+//Manually maintained attributes
+:camel-spring-boot-name: langchain-chat
+
+*Since Camel {since}*
+
+*{component-header}*
+
+The Langchain Chat Component allows you to integrate with any LLM supported by https://github.com/langchain4j/langchain4j[Langchain4j].
+
+Maven users will need to add the following dependency to their `pom.xml`
+for this component:
+
+[source,xml]
+----
+<dependency>
+    <groupId>org.apache.camel</groupId>
+    <artifactId>camel-langchain-chat</artifactId>
+    <version>x.x.x</version>
+    <!-- use the same version as your Camel core version -->
+</dependency>
+----
+
+== URI format
+
+[source]
+----
+langchain-chat:chatId[?options]
+----
+
+Where *chatId* can be any string to uniquely identify the endpoint.
+
+
+// component-configure options: START
+
+// component-configure options: END
+
+// component options: START
+include::partial$component-configure-options.adoc[]
+include::partial$component-endpoint-options.adoc[]
+// component options: END
+
+// endpoint options: START
+
+// endpoint options: END
+
+include::spring-boot:partial$starter.adoc[]
+
+== Using a specific Chat Model
+The Camel Langchain chat component provides an abstraction for interacting with various types of Large Language Models supported by https://github.com/langchain4j/langchain4j[Langchain4j].
+
+To integrate with a specific Large Language Model, users should follow these steps:
+
+=== Example of Integrating with OpenAI
+Add the dependency for Langchain4j OpenAI support:
+
+[source,xml]
+----
+<dependency>
+    <groupId>dev.langchain4j</groupId>
+    <artifactId>langchain4j-open-ai</artifactId>
+    <version>x.x.x</version>
+</dependency>
+----
+
+Initialize the OpenAI Chat Language Model, and add it to the Camel Registry:
+[source, java]
+----
+ChatLanguageModel model = OpenAiChatModel.builder()
+                .apiKey(openApiKey)
+                .modelName(GPT_3_5_TURBO)
+                .temperature(0.3)
+                .timeout(ofSeconds(3000))
+                .build();
+context.getRegistry().bind("chatModel", model);
+----
+
+Use the model in the Camel Langchain Chat Producer:
+[source, java]
+----
+from("direct:chat")
+    .to("langchain-chat:test?chatModel=#chatModel");
+----
+
+_NOTE:_ To switch to another Large Language Model and its corresponding dependency, simply replace the `langchain4j-open-ai` dependency with the appropriate dependency for the desired model. Update the initialization parameters accordingly in the code snippet provided above.
+
+== Send a prompt with variables
+To send a prompt with variables, use the Operation type `LangchainChatOperations.CHAT_SINGLE_MESSAGE_WITH_PROMPT`.
+This operation allows you to send a single prompt message with dynamic variables, which will be replaced with values provided in the request.
+
+Example of Route :

Review Comment:
   Nitpick: `Example of route:`



##########
components/camel-ai/camel-langchain-chat/src/main/java/docs/langchain-chat-component.adoc:
##########
@@ -0,0 +1,146 @@
+= Langchain4j Chat Component
+:doctitle: Langchain4j Chat
+:shortname: langchain-chat
+:artifactid: camel-langchain-chat
+:description: Langchain4j Chat
+:since: 4.5
+:supportlevel: Preview
+:tabs-sync-option:
+:component-header: Only producer is supported
+//Manually maintained attributes
+:camel-spring-boot-name: langchain-chat
+
+*Since Camel {since}*
+
+*{component-header}*
+
+The Langchain Chat Component allows you to integrate with any LLM supported by https://github.com/langchain4j/langchain4j[Langchain4j].
+
+Maven users will need to add the following dependency to their `pom.xml`
+for this component:
+
+[source,xml]
+----
+<dependency>
+    <groupId>org.apache.camel</groupId>
+    <artifactId>camel-langchain-chat</artifactId>
+    <version>x.x.x</version>
+    <!-- use the same version as your Camel core version -->
+</dependency>
+----
+
+== URI format
+
+[source]
+----
+langchain-chat:chatId[?options]
+----
+
+Where *chatId* can be any string to uniquely identify the endpoint.
+
+
+// component-configure options: START
+
+// component-configure options: END
+
+// component options: START
+include::partial$component-configure-options.adoc[]
+include::partial$component-endpoint-options.adoc[]
+// component options: END
+
+// endpoint options: START
+
+// endpoint options: END
+
+include::spring-boot:partial$starter.adoc[]
+
+== Using a specific Chat Model
+The Camel Langchain chat component provides an abstraction for interacting with various types of Large Language Models supported by https://github.com/langchain4j/langchain4j[Langchain4j].

Review Comment:
   Use `... Large Language Models (LLMs) ...`. 



##########
components/camel-ai/camel-langchain-chat/src/main/java/docs/langchain-chat-component.adoc:
##########
@@ -0,0 +1,146 @@
+= Langchain4j Chat Component
+:doctitle: Langchain4j Chat
+:shortname: langchain-chat
+:artifactid: camel-langchain-chat
+:description: Langchain4j Chat
+:since: 4.5
+:supportlevel: Preview
+:tabs-sync-option:
+:component-header: Only producer is supported
+//Manually maintained attributes
+:camel-spring-boot-name: langchain-chat
+
+*Since Camel {since}*
+
+*{component-header}*
+
+The Langchain Chat Component allows you to integrate with any LLM supported by https://github.com/langchain4j/langchain4j[Langchain4j].
+
+Maven users will need to add the following dependency to their `pom.xml`
+for this component:
+
+[source,xml]
+----
+<dependency>
+    <groupId>org.apache.camel</groupId>
+    <artifactId>camel-langchain-chat</artifactId>
+    <version>x.x.x</version>
+    <!-- use the same version as your Camel core version -->
+</dependency>
+----
+
+== URI format
+
+[source]
+----
+langchain-chat:chatId[?options]
+----
+
+Where *chatId* can be any string to uniquely identify the endpoint.
+
+
+// component-configure options: START
+
+// component-configure options: END
+
+// component options: START
+include::partial$component-configure-options.adoc[]
+include::partial$component-endpoint-options.adoc[]
+// component options: END
+
+// endpoint options: START
+
+// endpoint options: END
+
+include::spring-boot:partial$starter.adoc[]
+
+== Using a specific Chat Model
+The Camel Langchain chat component provides an abstraction for interacting with various types of Large Language Models supported by https://github.com/langchain4j/langchain4j[Langchain4j].
+
+To integrate with a specific Large Language Model, users should follow these steps:
+
+=== Example of Integrating with OpenAI
+Add the dependency for Langchain4j OpenAI support:
+
+[source,xml]
+----
+<dependency>
+    <groupId>dev.langchain4j</groupId>
+    <artifactId>langchain4j-open-ai</artifactId>
+    <version>x.x.x</version>
+</dependency>
+----
+
+Initialize the OpenAI Chat Language Model, and add it to the Camel Registry:
+[source, java]
+----
+ChatLanguageModel model = OpenAiChatModel.builder()
+                .apiKey(openApiKey)
+                .modelName(GPT_3_5_TURBO)
+                .temperature(0.3)
+                .timeout(ofSeconds(3000))
+                .build();
+context.getRegistry().bind("chatModel", model);
+----
+
+Use the model in the Camel Langchain Chat Producer:
+[source, java]
+----
+from("direct:chat")
+    .to("langchain-chat:test?chatModel=#chatModel");
+----
+
+_NOTE:_ To switch to another Large Language Model and its corresponding dependency, simply replace the `langchain4j-open-ai` dependency with the appropriate dependency for the desired model. Update the initialization parameters accordingly in the code snippet provided above.
+
+== Send a prompt with variables
+To send a prompt with variables, use the Operation type `LangchainChatOperations.CHAT_SINGLE_MESSAGE_WITH_PROMPT`.
+This operation allows you to send a single prompt message with dynamic variables, which will be replaced with values provided in the request.
+
+Example of Route :
+[source, java]
+----
+from("direct:chat")
+    .to("langchain-chat:test?chatModel=#chatModel&chatOperation=CHAT_SINGLE_MESSAGE_WITH_PROMPT");
+----
+
+Example of usage:
+[source, java]
+----
+var promptTemplate = "Create a recipe for a {{dishType}} with the following ingredients: {{ingredients}}";
+
+Map<String, Object> variables = new HashMap<>();
+variables.put("dishType", "oven dish");
+variables.put("ingredients", "potato, tomato, feta, olive oil");
+
+String response = template.requestBodyAndHeader("direct:chat", variables,
+                LangchainChat.Headers.PROMPT_TEMPLATE, promptTemplate, String.class);
+----
+
+== Chat with history
+You can send a new prompt along with the chat message history by passing all messages in a List of Type `dev.langchain4j.data.message.ChatMessage`.

Review Comment:
   Capitalization consistency: ``` ... messages in a list of type ... ```



##########
components/camel-ai/camel-langchain-embeddings/src/main/docs/langchain-embeddings-component.adoc:
##########
@@ -0,0 +1,42 @@
+= Langchain4j Embeddings Component
+:doctitle: Langchain4j Embeddings
+:shortname: langchain-embeddings
+:artifactid: camel-langchain-embeddings
+:description: Langchain4j Embeddings
+:since: 4.5
+:supportlevel: Preview
+:tabs-sync-option:
+:component-header: Only producer is supported
+//Manually maintained attributes
+:camel-spring-boot-name: langchain-embeddings
+
+*Since Camel {since}*
+
+*{component-header}*
+
+The Langchain Embeddings Component provides support to compute Embeddings using Langchain4j Embeddings.

Review Comment:
   Capitalization consistency.



##########
components/camel-ai/camel-langchain-chat/pom.xml:
##########
@@ -0,0 +1,72 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!--
+
+    Licensed to the Apache Software Foundation (ASF) under one or more
+    contributor license agreements.  See the NOTICE file distributed with
+    this work for additional information regarding copyright ownership.
+    The ASF licenses this file to You under the Apache License, Version 2.0
+    (the "License"); you may not use this file except in compliance with
+    the License.  You may obtain a copy of the License at
+
+         http://www.apache.org/licenses/LICENSE-2.0
+
+    Unless required by applicable law or agreed to in writing, software
+    distributed under the License is distributed on an "AS IS" BASIS,
+    WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+    See the License for the specific language governing permissions and
+    limitations under the License.
+
+-->
+<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
+  <modelVersion>4.0.0</modelVersion>
+  <parent>
+    <artifactId>camel-ai-parent</artifactId>
+    <groupId>org.apache.camel</groupId>
+    <version>4.5.0-SNAPSHOT</version>
+  </parent>
+
+  <artifactId>camel-langchain-chat</artifactId>
+  <packaging>jar</packaging>
+
+  <name>Camel Langchain</name>
+  <description>Langchain4j Chat component</description>
+
+  <properties>
+    <firstVersion>4.5.0</firstVersion>
+    <supportLevel>Preview</supportLevel>
+    <label>ai</label>
+  </properties>
+
+  <dependencies>
+    <dependency>
+      <groupId>org.apache.camel</groupId>
+      <artifactId>camel-support</artifactId>
+    </dependency>
+    <!-- for testing -->
+    <dependency>
+      <groupId>org.apache.camel</groupId>
+      <artifactId>camel-test-spring-junit5</artifactId>
+      <scope>test</scope>
+    </dependency>
+
+    <dependency>
+      <groupId>dev.langchain4j</groupId>
+      <artifactId>langchain4j-embeddings-all-minilm-l6-v2</artifactId>
+      <version>${langchain4j.version}</version>
+      <scope>test</scope>
+    </dependency>
+    <dependency>
+      <groupId>dev.langchain4j</groupId>
+      <artifactId>langchain4j-ollama</artifactId>
+      <version>${langchain4j.version}</version>
+    </dependency>

Review Comment:
   Should this one be declared at test scope?
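   
   If it is only exercised by the tests, something like this would do (a sketch, assuming no main-code usage of Ollama):
   
   ```
   <dependency>
     <groupId>dev.langchain4j</groupId>
     <artifactId>langchain4j-ollama</artifactId>
     <version>${langchain4j.version}</version>
     <scope>test</scope>
   </dependency>
   ```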



##########
components/camel-ai/camel-langchain-chat/src/main/java/docs/langchain-chat-component.adoc:
##########
@@ -0,0 +1,146 @@
+= Langchain4j Chat Component
+:doctitle: Langchain4j Chat
+:shortname: langchain-chat
+:artifactid: camel-langchain-chat
+:description: Langchain4j Chat
+:since: 4.5
+:supportlevel: Preview
+:tabs-sync-option:
+:component-header: Only producer is supported
+//Manually maintained attributes
+:camel-spring-boot-name: langchain-chat
+
+*Since Camel {since}*
+
+*{component-header}*
+
+The Langchain Chat Component allows you to integrate with any LLM supported by https://github.com/langchain4j/langchain4j[Langchain4j].
+
+Maven users will need to add the following dependency to their `pom.xml`
+for this component:
+
+[source,xml]
+----
+<dependency>
+    <groupId>org.apache.camel</groupId>
+    <artifactId>camel-langchain-chat</artifactId>
+    <version>x.x.x</version>
+    <!-- use the same version as your Camel core version -->
+</dependency>
+----
+
+== URI format
+
+[source]
+----
+langchain-chat:chatId[?options]
+----
+
+Where *chatId* can be any string to uniquely identify the endpoint.
+
+
+// component-configure options: START
+
+// component-configure options: END
+
+// component options: START
+include::partial$component-configure-options.adoc[]
+include::partial$component-endpoint-options.adoc[]
+// component options: END
+
+// endpoint options: START
+
+// endpoint options: END
+
+include::spring-boot:partial$starter.adoc[]
+
+== Using a specific Chat Model
+The Camel Langchain chat component provides an abstraction for interacting with various types of Large Language Models supported by https://github.com/langchain4j/langchain4j[Langchain4j].
+
+To integrate with a specific Large Language Model, users should follow these steps:
+
+=== Example of Integrating with OpenAI
+Add the dependency for Langchain4j OpenAI support:
+
+[source,xml]
+----
+<dependency>
+    <groupId>dev.langchain4j</groupId>
+    <artifactId>langchain4j-open-ai</artifactId>
+    <version>x.x.x</version>
+</dependency>
+----
+
+Initialize the OpenAI Chat Language Model, and add it to the Camel Registry:
+[source, java]
+----
+ChatLanguageModel model = OpenAiChatModel.builder()
+                .apiKey(openApiKey)
+                .modelName(GPT_3_5_TURBO)
+                .temperature(0.3)
+                .timeout(ofSeconds(3000))
+                .build();
+context.getRegistry().bind("chatModel", model);
+----
+
+Use the model in the Camel Langchain Chat Producer:
+[source, java]
+----
+from("direct:chat")
+    .to("langchain-chat:test?chatModel=#chatModel");
+----
+
+_NOTE:_ To switch to another Large Language Model and its corresponding dependency, simply replace the `langchain4j-open-ai` dependency with the appropriate dependency for the desired model. Update the initialization parameters accordingly in the code snippet provided above.
+
+== Send a prompt with variables
+To send a prompt with variables, use the Operation type `LangchainChatOperations.CHAT_SINGLE_MESSAGE_WITH_PROMPT`.
+This operation allows you to send a single prompt message with dynamic variables, which will be replaced with values provided in the request.
+
+Example of Route :
+[source, java]
+----
+from("direct:chat")
+    .to("langchain-chat:test?chatModel=#chatModel&chatOperation=CHAT_SINGLE_MESSAGE_WITH_PROMPT");
+----
+
+Example of usage:
+[source, java]
+----
+var promptTemplate = "Create a recipe for a {{dishType}} with the following ingredients: {{ingredients}}";
+
+Map<String, Object> variables = new HashMap<>();
+variables.put("dishType", "oven dish");
+variables.put("ingredients", "potato, tomato, feta, olive oil");
+
+String response = template.requestBodyAndHeader("direct:chat", variables,
+                LangchainChat.Headers.PROMPT_TEMPLATE, promptTemplate, String.class);
+----
+
+== Chat with history
+You can send a new prompt along with the chat message history by passing all messages in a List of Type `dev.langchain4j.data.message.ChatMessage`.
+Use the Operation type `LangchainChatOperations.CHAT_MULTIPLE_MESSAGES`.
+This operation allows you to continue the conversation with the context of previous messages.
+
+Example of Route :

Review Comment:
   `Example of route:` 



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@camel.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
