Dear all,

I have noticed some strange behaviour in the "jitter" function in R.
On the help page for jitter it is stated that

"The result, say r, is r <- x + runif(n, -a, a) where n <- length(x) and a is
the amount argument (if specified)."

and

"If amount is NULL (default), we set a <- factor * d/5 where d is the smallest
difference between adjacent unique (apart from fuzz) x values."

This works fine as long as there is no (very) large outlier:

> jitter(c(1,2,10^4))  # desired behaviour
[1]    1.083243    1.851571 9999.942716

But for very large outliers the added noise suddenly 'jumps' to a much larger
scale:

> jitter(c(1,2,10^5)) # bad behaviour
[1] -19535.649   9578.702 115693.854
# Noise should be of order (2-1)/5 = 0.2 but is several orders of magnitude larger.
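If I read the source of stats::jitter correctly, the culprit seems to be a rounding step applied before the adjacent differences are computed: x is rounded to 3 - floor(log10(z)) digits, where z is the range of x. A large outlier inflates z, so the small values get rounded onto each other and the minimal difference d is taken over the remaining, widely spaced values. A rough sketch of that computation (my reconstruction, not the exact source):

```r
# Sketch of how jitter() appears to derive 'amount' when amount = NULL
# (based on my reading of the stats::jitter source; names are mine):
x <- c(1, 2, 10^5)
z <- diff(range(x))                  # ~ 1e5, dominated by the outlier
digits <- 3 - floor(log10(z))        # = -1 here
xr <- round(x, digits)               # c(0, 0, 1e5): 1 and 2 collapse to 0
d <- min(diff(sort(unique(xr))))     # = 1e5, not the expected 1
amount <- 1/5 * abs(d)               # = 20000, matching the observed noise scale
```

This would explain why the noise for c(1, 2, 10^5) is of order 2e4 rather than 0.2.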

This probably does not matter much when jitter is used for plotting, but it can 
cause problems when jitter is used to break ties.
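For the tie-breaking use case, a possible workaround (my suggestion, not from the help page) is to pass 'amount' explicitly, so the noise scale is fixed and independent of any outliers:

```r
# Workaround: fix the noise scale explicitly instead of relying on the
# automatic choice, which is sensitive to outliers.
x <- c(1, 2, 10^5)
r <- jitter(x, amount = 0.2)
all(abs(r - x) <= 0.2)   # noise now stays within +/- 0.2 for every element
```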

best regards,
Martin

--------------------------------
Martin Keller-Ressel
Professor of Stochastic Analysis and Financial Mathematics
Technische Universität Dresden
Institut für Mathematische Stochastik
Willersbau B 316, Zellescher Weg 12-14
01062 Dresden
--------------------------------



______________________________________________
R-help@r-project.org mailing list -- To UNSUBSCRIBE and more, see
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.
