Hi.
Yes, I looked at the text before submitting the patch.
I contacted Devroye and he confirmed that another reader had also
pointed out this bug, but not the solution. I sent him my proposed
patch; he will look into it (though I have no idea when).
I would state that "the comparison function for x = 1 is e^(1/78)"
(which becomes 1/78, since the algorithm works with log-probabilities).
I think the change is needed because otherwise, for that particular bin,
the rejection probability is lower than it should be, resulting in more
samples landing in that bin than expected.
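To illustrate the log-probability point, here is a minimal sketch (not the libstdc++ code; `accept`, `log_f`, and `log_g` are hypothetical names): a multiplicative factor of e^(1/78) on the comparison function becomes an additive term of 1/78 once the rejection test is carried out on logarithms.

```python
import math

# Log of the e^(1/78) factor on the comparison function: multiplying
# the bound by e^(1/78) is the same as adding 1/78 to its logarithm.
CORRECTION = 1.0 / 78.0

def accept(log_f, log_g, u, correction=CORRECTION):
    # Hypothetical rejection test in log space: accept the candidate
    # when log(u) <= log f(x) - log g(x) + correction.  Without the
    # correction the bound is too tight for the affected bin, so its
    # rejection probability comes out lower than it should be.
    return math.log(u) <= log_f - log_g + correction
```

With `log_f = log_g = 0`, a uniform draw `u = 1.0` is accepted, while `u = e^0.05` (above the e^(1/78) bound) is rejected; dropping the correction shrinks the acceptance region accordingly.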
Regarding your second point, I understand what you mean and I would be
glad to help.
On 12/12/2017 01:51 AM, Paolo Carlini wrote:
Hi,
On 11/12/2017 23:16, Michele Pezzutti wrote:
I lowered to N = 2500000 and it still fails by a good margin.
Good. At the moment, however, I think we need a bit of rationale for
the change you are proposing: what would you put in a comment in the
code? It's been a while since I last looked into these algorithms; is
there a simple way to explain why the change is needed within the basic
rejection method proposed by Devroye? Devroye's book is freely
available (http://www.nrbook.com/devroye/); have you been able to study
the relevant bits already? He is also very approachable by private
email, if I remember correctly.
Eventually, we could also agree on a good way to extend the coverage
of the testing: for gcc8, maybe simply add the testcase, but then for
gcc9 I think we could extend it quite a bit in a consistent way,
something like a grid of means from 1.0 to 50 in steps of 1.0, with an
increased N. Better still if we figure out something that looks generic
but would also have caught 83237 anyway, if you see what I mean. I can
take care of that; for the other discrete distributions too, of course.
Thanks a lot for your help!
Paolo.
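The grid-style coverage proposed above could look roughly like the following sketch (illustrative Python, not the actual libstdc++ testsuite; the Knuth sampler, the 5-sigma threshold, and the sparse-bin cutoff are all assumptions). For each mean on the grid it draws N samples and flags any well-populated bin whose observed count strays far from the Poisson expectation, which is the kind of per-bin check that a single-bin bug like 83237 could trip:

```python
import math
import random

def poisson_knuth(mean, rng):
    # Knuth's multiplication method; slow but fine for moderate means,
    # and independent of the sampler under test.
    threshold = math.exp(-mean)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def pmf(mean, k):
    # Poisson probability mass function P(X = k), computed in log space.
    return math.exp(-mean + k * math.log(mean) - math.lgamma(k + 1))

def grid_check(means, n=20000, seed=1):
    # For each mean in the grid, draw n samples and report (mean, bin)
    # pairs where a sufficiently populated bin deviates from its
    # expected count by more than 5 binomial standard deviations.
    rng = random.Random(seed)
    failures = []
    for mean in means:
        counts = {}
        for _ in range(n):
            k = poisson_knuth(mean, rng)
            counts[k] = counts.get(k, 0) + 1
        for k, c in counts.items():
            p = pmf(mean, k)
            if p * n < 20:      # skip sparsely populated bins
                continue
            sigma = math.sqrt(n * p * (1 - p))
            if abs(c - n * p) > 5 * sigma:
                failures.append((mean, k))
                break
    return failures
```

A real testsuite run would use the library's own `std::poisson_distribution` as the sampler and a proper chi-squared statistic; the 5-sigma-per-bin rule here is just a deterministic stand-in under a fixed seed.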