I have a very similar setup and that's precisely what we do - except with JSON.

1) Request comes into PHP
2) PHP runs the search against several different cores (in a multicore setup) - ours are a little more than "slightly" different
3) PHP constructs a new object with the responseHeader and response objects joined together (basically add the record counts together in the header and then concatenate the arrays of documents)
4) PHP encodes the combined data into JSON and returns it (see the sketch after this list)
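For what it's worth, here's a minimal sketch of what steps 2-4 look like in PHP. The core URLs, the function name (searchAndMerge), and the query handling are assumptions for illustration, not our actual code:

<?php
// Hypothetical endpoints for the cores being combined.
$cores = [
    'http://localhost:8983/solr/core_a/select',
    'http://localhost:8983/solr/core_b/select',
];

function searchAndMerge(array $cores, string $q): string
{
    // Skeleton of the combined response we hand back to the client.
    $merged = [
        'responseHeader' => ['status' => 0, 'QTime' => 0],
        'response'       => ['numFound' => 0, 'start' => 0, 'docs' => []],
    ];

    foreach ($cores as $url) {
        // Ask each core for JSON output and decode it.
        $raw  = file_get_contents($url . '?wt=json&q=' . urlencode($q));
        $data = json_decode($raw, true);

        // Add the record counts together in the header/response...
        $merged['responseHeader']['QTime'] += $data['responseHeader']['QTime'];
        $merged['response']['numFound']    += $data['response']['numFound'];

        // ...and concatenate the arrays of documents.
        $merged['response']['docs'] = array_merge(
            $merged['response']['docs'],
            $data['response']['docs']
        );
    }

    // Re-encode the combined data as JSON and return it.
    return json_encode($merged);
}

header('Content-Type: application/json');
echo searchAndMerge($cores, $_GET['q'] ?? '*:*');

The key point is just that numFound is summed and the docs arrays are appended, so the client sees a single Solr-shaped response per request.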

It sounds clunky but it all manages to happen very quickly (< 200 ms round trip). The only problem you might hit is with paging, but from the way you describe your situation it doesn't sound like that will be a problem. It's more of an issue if you're trying to make them seamlessly flow into each other, but it sounds like you plan on presenting them separately (as we do).

--
Steve


it could be a custom request handler, but it doesn't have to be -- you
could implement it in whatever way is easiest for you (there's no reason why it has to run in the same JVM or on the same physical machine as Solr
... it could be a PHP script on another server if you want)




-Hoss

