Hi all,
I noticed IDL uses at least 400% CPU (i.e. four processors or cores) out
of the box, even for simple things like reading and processing files,
calculating the mean, etc.
I have never seen this happen with numpy, except for the linear algebra
routines (e.g. LAPACK).
Any comments?
Thanks,
Siegfried
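(A rough illustration of the difference: numpy's plain reductions such as mean run on a single core, but many numpy operations release the GIL internally, so you can approximate IDL-style parallelism yourself with a thread pool over chunks. This is a minimal sketch, not anything numpy does automatically; the function name `threaded_mean` and the worker count are made up for the example.)

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def threaded_mean(a, workers=4):
    """Mean computed as chunk sums in a thread pool.

    numpy releases the GIL inside np.sum, so the chunk sums can
    actually run concurrently on multiple cores.
    """
    chunks = np.array_split(a, workers)
    with ThreadPoolExecutor(workers) as ex:
        sums = list(ex.map(np.sum, chunks))
    return sum(sums) / a.size

a = np.arange(10_000_000, dtype=np.float64)
print(np.isclose(threaded_mean(a), a.mean()))  # True
```

For heavier per-element work a process pool (or numexpr, or the multithreaded BLAS that already backs the linear-algebra calls) would be the more usual route.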
I recently tried diff and gradient on some medical time-domain data,
and the result looked almost like pure noise.
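(That noisy result is expected: differencing amplifies measurement noise by roughly the noise amplitude divided by the sample spacing. A common fix is to smooth before differentiating. The sketch below uses a synthetic sine with added noise and a simple boxcar smoother via np.convolve; the signal, noise level, and window length are all invented for illustration.)

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 1000)
dt = t[1] - t[0]
signal = np.sin(t) + 0.05 * rng.standard_normal(t.size)  # true derivative: cos(t)

raw = np.gradient(signal, dt)            # derivative of the raw data: dominated by noise

kernel = np.ones(51) / 51                # 51-point moving average
smoothed = np.convolve(signal, kernel, mode="same")
smooth_deriv = np.gradient(smoothed, dt) # derivative of the smoothed data

# Error vs the true derivative, ignoring the convolution edge region:
raw_err = np.std(raw - np.cos(t))
smooth_err = np.std(smooth_deriv[50:-50] - np.cos(t[50:-50]))
print(raw_err > smooth_err)  # True
```

A Savitzky-Golay filter (scipy.signal.savgol_filter) does the smoothing and differentiation in one step and usually distorts the signal less than a boxcar.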
I just found this after seeing John Agosta's post:
https://gist.github.com/mblondel/487187
"""
Find the solution of the second-order differential equation
u'' = -u
with u(0) = 1
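(The excerpt cuts off here, so the gist's remaining conditions and method aren't shown. As a self-contained sketch of one way to integrate this ODE numerically, here is a classical RK4 stepper on the first-order system y = [u, u']. Note that u'(0) = 0 is an assumption I'm adding, not something stated in the excerpt; it selects the solution u(t) = cos(t).)

```python
import numpy as np

def rhs(t, y):
    # y = [u, u']; the ODE u'' = -u as a first-order system
    return np.array([y[1], -y[0]])

def rk4(rhs, y0, t):
    """Classical fourth-order Runge-Kutta over the grid t."""
    ys = [np.asarray(y0, dtype=float)]
    for i in range(len(t) - 1):
        h = t[i + 1] - t[i]
        y = ys[-1]
        k1 = rhs(t[i], y)
        k2 = rhs(t[i] + h / 2, y + h / 2 * k1)
        k3 = rhs(t[i] + h / 2, y + h / 2 * k2)
        k4 = rhs(t[i] + h, y + h * k3)
        ys.append(y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4))
    return np.array(ys)

t = np.linspace(0, 2 * np.pi, 200)
# u(0) = 1 from the excerpt; u'(0) = 0 is an added assumption
u = rk4(rhs, [1.0, 0.0], t)[:, 0]
print(np.allclose(u, np.cos(t), atol=1e-4))  # True
```

In practice scipy.integrate.solve_ivp would be the standard tool for this; the hand-rolled RK4 is just to keep the example dependency-free.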