I should have mentioned that when I said that most people/codes don't
care about super-accurate range reduction, I was speaking from my
various previous lives as a radio astronomer, numerical
hydrodynamicist (is that even a word?), and I think it's true for the
weather and climate codes that I've worked with.
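
(To make "range reduction" concrete, here's a minimal C sketch of the
naive approach; how far off it lands depends on your libm, but the digits
lost to the 53-bit approximation of 2*pi are gone either way:)

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        double x = 1.0e10;                        /* large trig argument */
        double naive = sin(fmod(x, 2.0 * M_PI)); /* reduce with ~53-bit 2*pi */
        double good  = sin(x);                   /* libm reduces carefully */
        printf("naive: %.17g\nlibm:  %.17g\n", naive, good);
        return 0;
    }

For x = 1e10 the two answers agree to only six or seven digits, which is
exactly the kind of error most of those codes never noticed.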
To give just one example: a physical oceanographer might need very
different precision for a 3-day forecast of currents than for a one-day
forecast. I suspect that precision needs will vary even within a lab, much
less within a field, much less between fields. In the cryptography example,
precision is proportional to the key size.
I assume, possibly incorrectly, that scientists know what level of
tolerance/accuracy they need. While we can all dream about the future or
the past, the present-day needs are useful to know. In general, one field
may need A while another field will tolerate B...
I guess this is a NaN... if (it's a signaling one)?
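
(Since NaNs came up: a minimal C sketch of the usual tests, assuming
that's where the line above was going. isnan() and the self-comparison
trick are portable; the quiet-vs-signaling distinction lives in the top
fraction bit on the common IEEE-754 platforms, and is_quiet_nan below is
a made-up helper name:)

    #include <math.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* NaN is the only value that compares unequal to itself. */
    static int is_nan(double x) { return x != x; }

    /* On common IEEE-754 platforms (x86, ARM), a quiet NaN has the
       top fraction bit set; a signaling NaN has it clear. */
    static int is_quiet_nan(double x)
    {
        uint64_t bits;
        memcpy(&bits, &x, sizeof bits);            /* safe type-pun */
        uint64_t expo = (bits >> 52) & 0x7FF;      /* 11 exponent bits */
        uint64_t frac = bits & ((1ULL << 52) - 1); /* 52 fraction bits */
        return expo == 0x7FF && frac != 0 && (frac >> 51) != 0;
    }

    int main(void)
    {
        double q = nan("");                        /* a quiet NaN */
        printf("isnan=%d  x!=x=%d  quiet=%d\n",
               isnan(q), is_nan(q), is_quiet_nan(q));
        return 0;
    }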
I'm betting the answer to that will be "any" (i.e. "it depends").
In cryptography, we used to think of 128 bits for a PGP key as a lot, but
some folks have started using 4096 bits. Of course, in exact arithmetic
it's much easier to deal with arbitrary precision than in the quantitative
analysis of measurements.
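
(On the exact-arithmetic point: bignum libraries like GMP just grow the
integer limb by limb, so nothing ever rounds, which is part of why
4096-bit keys are routine. A minimal sketch, assuming GMP is installed;
build with -lgmp:)

    #include <stdio.h>
    #include <gmp.h>

    int main(void)
    {
        mpz_t big;
        mpz_init(big);
        mpz_ui_pow_ui(big, 2, 4096);   /* big = 2^4096, held exactly */
        /* sizeinbase is exact or one too high; fine for a digit count */
        printf("2^4096 has about %zu decimal digits\n",
               mpz_sizeinbase(big, 10));
        mpz_clear(big);
        return 0;
    }

Measured quantities never get that luxury: the error bars come from the
instrument, not from how many bits you store the value in.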
I was hoping for feedback from scientists about what level of accuracy
their codes or fields of study typically require. Maybe the weekend
wasn't the best time to post... hmm.
On Sun, May 1, 2016 at 1:31 AM, Peter St. John wrote:
A bit off the wall, and not much help for what you are doing now, but
sooner or later we won't be hand-crafting ruthlessly optimal code; we'll be
training neural nets. You could do this now if you wanted: the objective
function is just accurate answers (which you get from sub-optimal but
mathematically sound code).
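
(A toy version of that idea, in case anyone wants to play: fit sin(x) on
[0, pi] with a one-hidden-layer net and plain per-sample gradient descent,
where the "sub-optimal but mathematically sound" oracle supplying the
targets is simply libm's sin(). The layer size, point count, and learning
rate are arbitrary picks:)

    #include <math.h>
    #include <stdio.h>
    #include <stdlib.h>

    #define H 16                    /* hidden units */
    #define N 64                    /* training points */

    int main(void)
    {
        double w1[H], b1[H], w2[H], b2 = 0.0, lr = 0.05;
        for (int j = 0; j < H; j++) {            /* small random init */
            w1[j] = (double)rand() / RAND_MAX - 0.5;
            b1[j] = (double)rand() / RAND_MAX - 0.5;
            w2[j] = (double)rand() / RAND_MAX - 0.5;
        }
        for (int epoch = 0; epoch <= 20000; epoch++) {
            double loss = 0.0;
            for (int i = 0; i < N; i++) {
                double x = M_PI * i / (N - 1);
                double t = sin(x);               /* the "oracle" answer */
                double h[H], y = b2;
                for (int j = 0; j < H; j++) {    /* forward pass */
                    h[j] = tanh(w1[j] * x + b1[j]);
                    y += w2[j] * h[j];
                }
                double e = y - t;                /* gradient of 0.5*e^2 */
                loss += 0.5 * e * e;
                b2 -= lr * e;                    /* backprop + SGD step */
                for (int j = 0; j < H; j++) {
                    double gh = e * w2[j] * (1.0 - h[j] * h[j]);
                    w2[j] -= lr * e * h[j];
                    w1[j] -= lr * gh * x;
                    b1[j] -= lr * gh;
                }
            }
            if (epoch % 5000 == 0)
                printf("epoch %5d  mean loss %.6f\n", epoch, loss / N);
        }
        return 0;
    }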