• Pietro Berkes, NAGRA Kudelski, Lausanne, Switzerland
• Rike-Benjamin Schuppner, Institute for Theoretical Biology,
Humboldt-Universität zu Berlin, Germany
• Tiziano Zito, innoCampus, Technische Universität Berlin, Germany
• Verjinia Metodieva, NeuroCure, Charité – Universitätsmedizin Berlin, Germany
CNRS, Paris, France
• Stéfan van der Walt, Berkeley Institute for Data Science, UC Berkeley, CA USA
• Nelle Varoquaux, Berkeley Institute for Data Science, UC Berkeley, CA USA
• Tiziano Zito, freelance consultant, Berlin, Germany
Organizers
==
For the German Neuroinformatics Node of the INCF
On Sun 08 Jul, 22:35 -0400, Sandro Tosi wrote:
The Python versions supported by this release are 2.7, 3.4-3.6. The wheels are
linked with OpenBLAS v0.3.0, which should fix some of the linalg problems
reported for NumPy 1.14, and the source archives were created using Cython
0.28.2 and should work
Neural Reckoning, Imperial College London, UK
• Pietro Berkes, NAGRA Kudelski, Lausanne, Switzerland
• Rike-Benjamin Schuppner, Institute for Theoretical Biology,
Humboldt-Universität zu Berlin, Germany
• Stéfan van der Walt, Berkeley Institute for Data Science, UC Berkeley, CA, USA
• Tiziano
Institute for Theoretical Biology,
Humboldt-Universität zu Berlin, Germany
• Tiziano Zito, Department of Psychology, Humboldt-Universität zu Berlin, Germany
• Zbigniew Jędrzejewski-Szmek, Red Hat Inc., Warsaw, Poland
Organizers
==
Head of the organization for ASPP and responsible for the scientific program
Would a "complex default" mode ever make it into numpy, to behave more like
Matlab and other packages with respect to complex number handling? Sure, it
would make it marginally slower if enabled, but it might open the door to
better compatibility when porting code to Python.
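Concretely, the difference the question is about can be sketched like this (the variable names are illustrative; `np.emath` is NumPy's existing opt-in, Matlab-style automatic domain switch):

```python
import numpy as np
import warnings

# Real input stays in the real domain: numpy returns nan (with a
# RuntimeWarning) instead of silently switching to complex output.
with warnings.catch_warnings():
    warnings.simplefilter("ignore", RuntimeWarning)
    r = np.sqrt(-1.0)          # nan

# Opting in explicitly, via a complex dtype or np.emath, gives 1j:
c1 = np.sqrt(-1.0 + 0j)        # 1j
c2 = np.emath.sqrt(-1.0)       # 1j, automatic domain switch
```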
numpy already has
Hi,
there's no integer inf, but you can use np.iinfo(np.int64).max or
np.iinfo(np.int64).min if you need an upper/lower bound for integers in
np.max/np.min
Ciao!
Tiziano
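A short sketch of the suggestion above (the scan loop is just an illustrative use of the bound, not from the thread):

```python
import numpy as np

# No integer inf exists, but the dtype's representable extremes
# serve as upper/lower bounds:
upper = np.iinfo(np.int64).max   # 9223372036854775807 == 2**63 - 1
lower = np.iinfo(np.int64).min   # -9223372036854775808 == -2**63

# Illustrative use: sentinel start value when scanning for a minimum.
values = np.array([17, 3, 42], dtype=np.int64)
best = upper
for v in values:
    best = min(best, int(v))     # best ends up as 3
```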
On Tue 22 Nov, 06:27 -, 2601536...@qq.com wrote:
Hi, I am a student and I have very little knowledge about python and
logy, Universität Potsdam, Germany
• Pamela Hathway, Orange Business, Berlin/Nürnberg, Germany
• Pietro Berkes, NAGRA Kudelski, Lausanne, Switzerland
• Rike-Benjamin Schuppner, Institute for Theoretical Biology,
Humboldt-Universität zu Berlin, Germany
• Tiziano Zito, innoCampus, Technische Universität Berlin, Germany
Hi George,
what you see is due to the memory layout of numpy arrays. If you switch your
array to F-order you'll see that the two functions have the same timings, i.e.
both are fast (on my machine 25 times faster for the 1_000_000 points case).
Try:
vertices = np.array(np.random.random((n, 2)), order='F')
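A sketch of why F-order helps here, assuming the slow path was per-column access (`vertices_c` and `vertices_f` are illustrative names):

```python
import numpy as np

n = 1_000_000
# C-order (the default, row-major): the two coordinate columns are
# interleaved in memory, so reading one column strides through it.
vertices_c = np.random.random((n, 2))
# F-order (column-major): each column is one contiguous block.
vertices_f = np.asfortranarray(vertices_c)

x_c = vertices_c[:, 0]   # strided view (16-byte steps)
x_f = vertices_f[:, 0]   # contiguous view (8-byte steps)
```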
Laboratory of Psychophysics, EPFL, Lausanne, Switzerland
• Pamela Hathway, YPOG, Berlin/Nürnberg, Germany
• Pietro Berkes, NAGRA Kudelski, Lausanne, Switzerland
• Rike-Benjamin Schuppner, Institute for Theoretical Biology,
Humboldt-Universität zu Berlin, Germany
• Tiziano Zito, innoCampus, Technische Universität Berlin, Germany