Hi,

Say I have the range [0, 100].

Now I need to simulate 1000 sets of 10 mid-points within this range,
accurate to the second decimal place.

Say one simulated set is

X1, X2, ..., X10

Of course,

X1 < X2 < ... < X10

I have one more constraint: the difference between any two
consecutive mid-points must be at least 5.00.

I wonder whether there is any statistical theory available to support
this kind of simulation.

Alternatively, is there any way to implement this in R?
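
For what it's worth, here is one possible sketch (assuming plain
uniform draws are acceptable): subtracting 5*(i - 1) from the i-th
ordered point turns the minimum-gap constraint into a simple ordering
constraint on [0, 100 - 9*5] = [0, 55], so the points can be drawn as
sorted uniforms there and shifted back.

set.seed(1)                              # for reproducibility
n_sets  <- 1000                          # number of simulated sets
n_pts   <- 10                            # points per set
min_gap <- 5                             # minimum gap between consecutive points
upper   <- 100 - (n_pts - 1) * min_gap   # = 55, the shrunken upper bound

sims <- t(replicate(n_sets, {
  y <- sort(round(runif(n_pts, 0, upper), 2))   # ordered 2-decimal points on [0, 55]
  y + (seq_len(n_pts) - 1) * min_gap            # shift the minimum gaps back in
}))

## sanity check (small tolerance for floating-point representation)
all(apply(sims, 1, function(x) all(diff(x) >= min_gap - 1e-9)))
range(sims)    # stays within [0, 100]

Under this shift the points are just uniform order statistics on
[0, 55], which may be where the standard order-statistics theory
applies to the question above.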
