This is an automated email from the ASF dual-hosted git repository.

aldettinger pushed a commit to branch camel-quarkus-main
in repository https://gitbox.apache.org/repos/asf/camel-quarkus-examples.git


The following commit(s) were added to refs/heads/camel-quarkus-main by this push:
     new 8a380d0  data-extraction-langchain4j: Add an illustration schema
8a380d0 is described below

commit 8a380d0b62f6a08a3eb4aa1cbbc32277f688d253
Author: aldettinger <aldettin...@gmail.com>
AuthorDate: Wed Sep 4 16:49:29 2024 +0200

    data-extraction-langchain4j: Add an illustration schema
---
 data-extract-langchain4j/README.adoc        |   7 +++++--
 data-extract-langchain4j/schema.png         | Bin 0 -> 96255 bytes
 data-extract-langchain4j/schemas-source.odp | Bin 0 -> 36144 bytes
 3 files changed, 5 insertions(+), 2 deletions(-)

diff --git a/data-extract-langchain4j/README.adoc b/data-extract-langchain4j/README.adoc
index 5260f5a..b05f777 100644
--- a/data-extract-langchain4j/README.adoc
+++ b/data-extract-langchain4j/README.adoc
@@ -12,10 +12,13 @@ For instance, let's imagine an insurance company that would record the transcrip
 There is probably a lot of valuable information that could be extracted from those conversation transcripts.
 In this example, we'll convert those text conversations into Java Objects that could then be used in the rest of the Camel route.
 
+image::schema.png[]
+
 In order to achieve this extraction, we'll need a https://en.wikipedia.org/wiki/Large_language_model[Large Language Model (LLM)] that natively supports JSON output.
 Here, we arbitrarily choose https://ollama.com/library/codellama[codellama] served through https://ollama.com/[ollama].
-In order to invoke the served model, we'll use the high-level LangChain4j APIs like https://docs.langchain4j.dev/tutorials/ai-services[AiServices].
-As we are using the Quarkus runtime, we can leverage all the advantages of the https://docs.quarkiverse.io/quarkus-langchain4j/dev/index.html[Quarkus LangChain4j extension].
+In order to request inference from the served model, we'll use the high-level LangChain4j APIs like https://docs.langchain4j.dev/tutorials/ai-services[AiServices].
+More precisely, we'll set up the https://docs.quarkiverse.io/quarkus-langchain4j/dev/index.html[Quarkus LangChain4j extension] to register an AiService bean.
+Finally, we'll invoke the AiService extraction method via the https://camel.apache.org/camel-quarkus/latest/reference/extensions/bean.html[Camel Quarkus bean extension].
 
 === Start the Large Language Model
 
diff --git a/data-extract-langchain4j/schema.png b/data-extract-langchain4j/schema.png
new file mode 100644
index 0000000..4a8b105
Binary files /dev/null and b/data-extract-langchain4j/schema.png differ
diff --git a/data-extract-langchain4j/schemas-source.odp b/data-extract-langchain4j/schemas-source.odp
new file mode 100644
index 0000000..ef42139
Binary files /dev/null and b/data-extract-langchain4j/schemas-source.odp differ
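For context, the flow the new README paragraphs describe — the Quarkus LangChain4j extension registering an AiService bean, and a Camel route invoking its extraction method through the bean extension — could look roughly like the sketch below. This is not code from the example project: the type names (`ConversationExtractor`, `ExtractedData`, `ExtractionRoute`), the prompt text, and the `file:transcripts` endpoint are all illustrative assumptions, and the sketch presumes the `quarkus-langchain4j-ollama` and `camel-quarkus-bean` dependencies are configured.

```java
import dev.langchain4j.service.UserMessage;
import io.quarkiverse.langchain4j.RegisterAiService;
import jakarta.enterprise.context.ApplicationScoped;
import org.apache.camel.builder.RouteBuilder;

// The Quarkus LangChain4j extension turns this interface into a CDI bean
// backed by the configured model (here, codellama served through ollama).
@RegisterAiService
public interface ConversationExtractor {

    // The model's JSON output is bound to the method's return type.
    // {transcript} is a template variable filled from the parameter.
    @UserMessage("Extract the customer name and whether they sound satisfied "
            + "from this transcript: {transcript}")
    ExtractedData extract(String transcript);
}

// A plain Java object the extracted JSON is mapped onto (hypothetical fields).
record ExtractedData(String customerName, boolean customerSatisfied) {}

// The Camel route hands each transcript to the AiService
// via the Camel Quarkus bean extension.
@ApplicationScoped
class ExtractionRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("file:transcripts")
            .bean(ConversationExtractor.class, "extract");
    }
}
```

Because the bean is registered by the extension, the route only needs to name the interface and method; no manual `AiServices.builder(...)` wiring is required. This fragment only compiles inside a Quarkus application with those extensions on the classpath.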
