Re: LambdaMART XML model to JSON

2017-07-24 Thread Ryan Yacyshyn

Re: LambdaMART XML model to JSON

2017-07-24 Thread alessandro.benedetti
the way to go. --- Alessandro Benedetti, Search Consultant, R&D Software Engineer, Director, Sease Ltd. - www.sease.io

Re: LambdaMART XML model to JSON

2017-07-24 Thread Ryan Yacyshyn
Here's something that'll create a JSON model that can be uploaded directly into Solr: https://github.com/ryac/lambdamart-xml-to-json It'll also map the feature IDs to the names found in the feature store. I had this error when uploading the model: Model type does not exist org.apache.solr.ltr.mo
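For context, the "Model type does not exist" error usually points at the `class` field of the uploaded JSON, which must name a model class Solr's LTR plugin can load. A LambdaMART-style model is uploaded as a `MultipleAdditiveTreesModel`; a minimal sketch of the expected upload format follows (feature names and values here are illustrative, not from Ryan's tool):

```json
{
  "class": "org.apache.solr.ltr.model.MultipleAdditiveTreesModel",
  "name": "myLambdaMARTModel",
  "features": [
    { "name": "originalScore" },
    { "name": "documentRecency" }
  ],
  "params": {
    "trees": [
      {
        "weight": "0.1",
        "root": {
          "feature": "originalScore",
          "threshold": "10.0",
          "left": { "value": "-1.0" },
          "right": {
            "feature": "documentRecency",
            "threshold": "0.5",
            "left": { "value": "50.0" },
            "right": { "value": "75.0" }
          }
        }
      }
    ]
  }
}
```

Each tree node is either an inner split (`feature`, `threshold`, `left`, `right`) or a leaf (`value`), and the feature names must match entries in the feature store the model references.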

Re: LambdaMART XML model to JSON

2017-07-23 Thread Ryan Yacyshyn
Thanks Doug, this is helpful. I also started something last night to output to JSON for Solr, I'll post it up as well. Ryan On Sun, 23 Jul 2017 at 23:48 Doug Turnbull < dturnb...@opensourceconnections.com> wrote: > Yes you're correct that the feature is the 1-based identifier from your > tra

Re: LambdaMART XML model to JSON

2017-07-23 Thread Doug Turnbull
Yes, you're correct that the feature is the 1-based identifier from your training data. As for a script: not one for Solr exactly, but when developing the Elasticsearch plugin, I started to work on a JSON serialization format, and as part of that built a Python script for reading the RankLib XML and ou
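A minimal sketch of such a conversion for Solr, assuming RankLib's usual `<ensemble>`/`<tree>`/`<split>` XML layout (leaf splits carry an `<output>`, inner splits carry `<feature>`/`<threshold>` and `pos="left"/"right"` children; element names may differ across RankLib versions, and RankLib model files often begin with `##` comment lines that must be stripped before parsing):

```python
import json
import xml.etree.ElementTree as ET

def split_to_node(split, feature_names):
    """Convert a RankLib <split> element into a Solr LTR tree node."""
    output = split.find("output")
    if output is not None:
        # Leaf node: only a value.
        return {"value": output.text.strip()}
    node = {
        # Map RankLib's 1-based feature id to its feature-store name.
        "feature": feature_names[int(split.find("feature").text)],
        "threshold": split.find("threshold").text.strip(),
    }
    for child in split.findall("split"):
        side = "left" if child.get("pos") == "left" else "right"
        node[side] = split_to_node(child, feature_names)
    return node

def ranklib_xml_to_solr_json(xml_text, feature_names, model_name):
    """feature_names: dict mapping 1-based feature ids to feature-store names."""
    ensemble = ET.fromstring(xml_text)
    trees = [
        {"weight": tree.get("weight"),
         "root": split_to_node(tree.find("split"), feature_names)}
        for tree in ensemble.findall("tree")
    ]
    return {
        "class": "org.apache.solr.ltr.model.MultipleAdditiveTreesModel",
        "name": model_name,
        "features": [{"name": n} for n in feature_names.values()],
        "params": {"trees": trees},
    }
```

The resulting dict can be serialized with `json.dumps` and PUT to the model store endpoint; treat it as a starting point and compare the output against a hand-written model before uploading.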

LambdaMART XML model to JSON

2017-07-23 Thread Ryan Yacyshyn
Hi everyone, I'm trying out the LTR plugin and have a couple of questions when it comes to converting the LambdaMART XML to JSON. Below is a snippet of the model generated from RankLib: 10 0.28156844 11 7.11 7 2.
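The snippet's XML tags were stripped by the archive; in RankLib's usual layout the surviving numbers read as alternating feature ids and thresholds, roughly like the following (the nesting below is RankLib's standard structure, not recovered from the original message):

```xml
<ensemble>
  <tree id="1" weight="0.1">
    <split>
      <feature> 10 </feature>
      <threshold> 0.28156844 </threshold>
      <split pos="left">
        <feature> 11 </feature>
        <threshold> 7.11 </threshold>
        ...
      </split>
      ...
    </split>
  </tree>
</ensemble>
```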