Hi,

You can raise your heap size via mapred.child.java.opts (or mapred.reduce.child.java.opts for reducers alone), and further raise the virtual-memory limit via mapred.child.ulimit (try setting it to 2x or 3x the heap size, in KB, or higher). I think it's the latter you're running out of, since there's a subprocess involved.
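For example, these properties can be passed on the job command line with -D (this is a sketch, assuming your driver uses ToolRunner/GenericOptionsParser; the jar name, class name, and memory values below are illustrative, not from your job):

```shell
# 2 GB reducer heap, ulimit at 3x the heap (value is in KB: 3 * 2 GB = 6291456 KB)
hadoop jar myjob.jar MyJob \
  -D mapred.reduce.child.java.opts=-Xmx2048m \
  -D mapred.child.ulimit=6291456 \
  input/ output/
```

The same keys can instead be set via conf.set(...) in the driver before job submission.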
Let us know if that helps.

On Sun, Jul 29, 2012 at 1:47 PM, Mapred Learn <[email protected]> wrote:
> hi,
> One of my programs creates a huge Python dictionary and the reducers fail
> with a MemoryError every time.
>
> Is there a way to specify a bigger memory value for the reducers to
> succeed?
>
> I know we should not have this requirement in the first place and not
> create this kind of dictionary, but can I still finish this job by giving
> more memory in the jar command?
>
> Thanks,
> JJ

--
Harsh J
