jamesnetherton opened a new issue, #7440:
URL: https://github.com/apache/camel-quarkus/issues/7440

   I am a bit confused about this. With Quarkus Langchain 1.0.0, I find that 
the `langchain4j-chat` native tests are failing because the various model 
types cannot be marshalled / unmarshalled in native mode.
   
   In the case of Ollama, I had to register a bunch of classes for reflection:
   
   ```
   dev.langchain4j.model.ollama.FormatSerializer
   dev.langchain4j.model.ollama.Message
   dev.langchain4j.model.ollama.OllamaChatRequest
   dev.langchain4j.model.ollama.OllamaChatResponse
   dev.langchain4j.model.ollama.Options
   dev.langchain4j.model.ollama.Role
   dev.langchain4j.model.ollama.Tool
   dev.langchain4j.model.ollama.ToolCall
   ```
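   
   To illustrate why these registrations matter, here is a minimal sketch of the kind of reflective field access that JSON binding performs on such model classes. The `Message` class below is a hypothetical stand-in, not the real `dev.langchain4j.model.ollama.Message`. This works on the JVM but fails in a GraalVM native image unless the class is registered for reflection at build time:
   
   ```java
   import java.lang.reflect.Field;
   
   public class ReflectionDemo {
   
       // Hypothetical stand-in for a model class such as
       // dev.langchain4j.model.ollama.Message (not the real class).
       static class Message {
           private final String role = "user";
           private final String content = "hello";
       }
   
       public static void main(String[] args) throws Exception {
           // JSON (de)serializers typically discover fields like this at
           // runtime. On the JVM this just works; in a native image the
           // reflective metadata is stripped unless the class was
           // registered for reflection at build time.
           Message msg = new Message();
           for (Field field : Message.class.getDeclaredFields()) {
               field.setAccessible(true);
               System.out.println(field.getName() + "=" + field.get(msg));
           }
       }
   }
   ```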
   
   Previously this was not required. We should dig into why it is now needed, 
because ideally we don't want to manage this in CQ, especially across all of 
the different Langchain4j-supported models users may want to work with.
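   
   As a possible stop-gap while the root cause is investigated, applications can register the affected classes themselves. A sketch, assuming the Camel Quarkus `quarkus.camel.native.reflection.include-patterns` property covers these classes:
   
   ```
   # application.properties (assumption: this pattern reaches the failing
   # Ollama model classes listed above)
   quarkus.camel.native.reflection.include-patterns = dev.langchain4j.model.ollama.*
   ```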


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@camel.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
