Well, a lot depends on the query analysis. Are you using the *exact* same analysis chains in both? Look at the admin/analysis page and see how your term evaluates. I'm guessing that WordDelimiterFilterFactory is being used in the 3.5 case and not in the 1.4.1 case, so the 3.5 query is matching everything in your index and trying to return a huge number of rows.
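To illustrate the kind of mismatch I mean, here is a hypothetical schema.xml fieldType (not from your actual schema) where the index-time and query-time chains differ, which can make the same term analyze differently between releases:

```xml
<!-- Hypothetical example: WordDelimiterFilterFactory applied only at
     query time. A term like "54ba3e8fd3d5d8371f0e01c403085a0c" may be
     split into sub-tokens on one side and not the other, so the query
     no longer matches what was indexed (or matches far too much). -->
<fieldType name="sig_text" class="solr.TextField">
  <analyzer type="index">
    <tokenizer class="solr.WhitespaceTokenizerFactory"/>
  </analyzer>
  <analyzer type="query">
    <tokenizer class="solr.WhitespaceTokenizerFactory"/>
    <filter class="solr.WordDelimiterFilterFactory"
            generateWordParts="1" generateNumberParts="1"/>
  </analyzer>
</fieldType>
```

The admin/analysis page shows exactly how each chain tokenizes a given input, so comparing the two installations there should make any divergence obvious.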
How many documents are found in the 1.4.1 case as opposed to the 3.5 case?

Best
Erick

On Thu, Mar 15, 2012 at 12:22 PM, Frederico Azeiteiro
<frederico.azeite...@cision.com> wrote:
> Hi all,
>
> Just testing Solr 3.5.0, and I noticed a different behavior in this new
> version:
>
> select?rows=1000000000&q=sig%3a("54ba3e8fd3d5d8371f0e01c403085a0c")&?
>
> This query returns no results on my indexes. It works on Solr 1.4.0,
> but returns "java.lang.OutOfMemoryError: Java heap space" on Solr 3.5.0.
>
> Is this normal? As there are no results, why the OutOfMemoryError?
> Is some memory allocated based on the rows number?
>
> Regards,
> Frederico
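On the rows question: as I understand it, the hit collector pre-allocates a priority queue sized by the requested number of hits, so rows=1000000000 can demand a huge allocation up front, before a single result is found. This is a rough back-of-the-envelope sketch (not Solr's actual code; the per-entry byte cost is an assumption) showing why clamping rows to the index size is the safer request:

```java
// Hypothetical sketch of why a huge rows value can blow the heap.
// BYTES_PER_ENTRY is an assumed rough cost per queued hit (doc id +
// score + object overhead), not a figure from the Solr source.
public class RowsSketch {
    static final long BYTES_PER_ENTRY = 28;

    // Approximate memory demanded by a hit queue sized to `rows`.
    static long estimatedQueueBytes(long rows) {
        return rows * BYTES_PER_ENTRY;
    }

    // A safer request never asks for more hits than documents exist.
    static long clampedRows(long requestedRows, long maxDoc) {
        return Math.min(requestedRows, maxDoc);
    }

    public static void main(String[] args) {
        long requested = 1_000_000_000L;
        // rows=1000000000 implies an allocation in the tens of gigabytes.
        System.out.println("approx bytes = " + estimatedQueueBytes(requested));
        // Clamped to a 5M-doc index, the request becomes harmless.
        System.out.println("clamped rows = " + clampedRows(requested, 5_000_000L));
    }
}
```

So even with zero matching documents, the allocation driven by rows alone can trigger the OutOfMemoryError; asking for rows no larger than you can actually consume (or paging with start/rows) avoids it.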