Hi Marco Gallotta,

Yes, there are some properties available for this, though I haven't tried them myself. Let me know if they don't work.

If you are using YARN, the property for ApplicationMaster retries is
"yarn.resourcemanager.am.max-retries", and on the node manager side there is
a per-node property you can set: "job.maxtaskfailures.per.tracker".

If it's Hadoop 1.0, the properties are "mapred.map.max.attempts" and
"mapred.reduce.max.attempts".
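
For example, to disable retries entirely you could set both attempt counts
to 1, either in mapred-site.xml or in the job configuration. A minimal
sketch of the mapred-site.xml fragment (untested, using the Hadoop 1.0
property names above):

```xml
<!-- mapred-site.xml fragment: an attempt count of 1 means each task
     runs at most once and is not retried after a failure. -->
<property>
  <name>mapred.map.max.attempts</name>
  <value>1</value>
</property>
<property>
  <name>mapred.reduce.max.attempts</name>
  <value>1</value>
</property>
```

With these set, a failed map or reduce task fails the job immediately, so
you could catch that and restore from your backup.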

Syed Abdul Kather
Sent from Samsung S3
On Aug 3, 2012 5:22 AM, "Marco Gallotta" <[email protected]> wrote:

> Hi there
>
> Is there a way to disable retries when a mapper/reducer fails? I'm writing
> data in my mapper and I'd rather catch the failure, recover from a backup
> (fairly lightweight in this case, as the output tables aren't big) and
> restart.
>
>
>
> --
> Marco Gallotta | Mountain View, California
> Software Engineer, Infrastructure | Loki Studios
> fb.me/marco.gallotta | twitter.com/marcog
> [email protected] | +1 (650) 417-3313
>
> Sent with Sparrow (http://www.sparrowmailapp.com/?sig)
>
>
