Re: [Numpy-discussion] Multi-distribution Linux wheels - please test

2016-02-07 Thread Nadav Horesh
The test results of numpy 1.10.4 installed from source:

OK (KNOWNFAIL=4, SKIP=6)


I think I use openblas, as it is installed instead of the normal blas/cblas.

  Nadav,

From: NumPy-Discussion  on behalf of Nadav 
Horesh 
Sent: 07 February 2016 07:28
To: Discussion of Numerical Python; SciPy Developers List
Subject: Re: [Numpy-discussion] Multi-distribution Linux wheels - please test

Test platform: python 3.4.1 on archlinux x86_64

scipy test: OK

OK (KNOWNFAIL=97, SKIP=1626)


numpy tests: Failed on long double and int128 tests, and got one error:

Traceback (most recent call last):
  File "/usr/lib/python3.5/site-packages/nose/case.py", line 198, in runTest
self.test(*self.arg)
  File "/usr/lib/python3.5/site-packages/numpy/core/tests/test_longdouble.py", 
line 108, in test_fromstring_missing
np.array([1]))
  File "/usr/lib/python3.5/site-packages/numpy/testing/utils.py", line 296, in 
assert_equal
return assert_array_equal(actual, desired, err_msg, verbose)
  File "/usr/lib/python3.5/site-packages/numpy/testing/utils.py", line 787, in 
assert_array_equal
verbose=verbose, header='Arrays are not equal')
  File "/usr/lib/python3.5/site-packages/numpy/testing/utils.py", line 668, in 
assert_array_compare
raise AssertionError(msg)
AssertionError:
Arrays are not equal

(shapes (6,), (1,) mismatch)
 x: array([ 1., -1.,  3.,  4.,  5.,  6.])
 y: array([1])

--
Ran 6019 tests in 28.029s

FAILED (KNOWNFAIL=13, SKIP=12, errors=1, failures=18)




From: NumPy-Discussion  on behalf of 
Matthew Brett 
Sent: 06 February 2016 22:26
To: Discussion of Numerical Python; SciPy Developers List
Subject: [Numpy-discussion] Multi-distribution Linux wheels - please test

Hi,

As some of you may have seen, Robert McGibbon and Nathaniel have just
guided a PEP for multi-distribution Linux wheels past the approval
process over on distutils-sig:

https://www.python.org/dev/peps/pep-0513/

The PEP includes a docker image on which y'all can build wheels which
match the PEP:

https://quay.io/repository/manylinux/manylinux

Now we're at the stage where we need stress-testing of the built
wheels to find any problems we hadn't thought of.

I've built numpy and scipy wheels here:

https://nipy.bic.berkeley.edu/manylinux/

So, if you have a Linux distribution handy, we would love to hear from
you about the results of testing these guys, maybe along the lines of:

pip install -f https://nipy.bic.berkeley.edu/manylinux numpy scipy
python -c 'import numpy; numpy.test()'
python -c 'import scipy; scipy.test()'
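
If you are testing in an existing environment rather than a fresh one, a quick
sanity check first (just a sketch) that the wheel is the numpy actually being
imported might be:

```
# Sketch only: confirm which numpy the interpreter picks up before testing.
import numpy
print(numpy.__version__)   # should report the wheel's version
print(numpy.__file__)      # should live in site-packages, not a local source tree
numpy.test()               # needs nose installed
```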

These manylinux wheels should soon be available on pypi, and soon
after, installable with the latest pip, so we would like to fix as many
problems as possible before going live.

Cheers,

Matthew
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Multi-distribution Linux wheels - please test

2016-02-07 Thread Nathaniel Smith
On Feb 6, 2016 12:27 PM, "Matthew Brett"  wrote:
>
> Hi,
>
> As some of you may have seen, Robert McGibbon and Nathaniel have just
> guided a PEP for multi-distribution Linux wheels past the approval
> process over on distutils-sig:
>
> https://www.python.org/dev/peps/pep-0513/
>
> The PEP includes a docker image on which y'all can build wheels which
> match the PEP:
>
> https://quay.io/repository/manylinux/manylinux

This is the wrong repository :-) It moved, and there are two now:

quay.io/pypa/manylinux1_x86_64
quay.io/pypa/manylinux1_i686

-n
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] resizeable arrays using shared memory?

2016-02-07 Thread Elliot Hallmark
That makes sense.  I could either send a signal to the child process
letting it know to re-instantiate the numpy array using the same (but now
resized) buffer, or I could have it check to see if the buffer has been
resized when it might need it and re-instantiate then.  That's actually not
too bad.  It would be nice if the array could be resized, but it's probably
unstable to do so and there isn't much demand for it.
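
A minimal sketch of that re-wrapping idea (hypothetical file path and sizes;
it assumes a file-backed mmap so the buffer can actually grow, unlike the
anonymous case discussed below):

```
import mmap
import numpy as np

def rewrap(mm, dtype=np.float64):
    # Build a fresh ndarray view over the current (possibly resized) buffer.
    return np.frombuffer(mm, dtype=dtype)

# Parent side: file-backed so the mapping can grow.
with open("/tmp/shared.dat", "w+b") as f:
    f.truncate(8 * 8)                 # room for 8 float64 values
    mm = mmap.mmap(f.fileno(), 0)     # length 0 maps the whole file
    arr = rewrap(mm)
    # To grow later: drop the old view (del arr), then f.truncate(new_nbytes),
    # mm.resize(new_nbytes), and arr = rewrap(mm).
    # The child, on the parent's signal, likewise drops its old views and
    # calls mm.resize(new_nbytes) and rewrap(mm) on its own copy of the mapping.
```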

Thanks,
  Elliot

On Sat, Feb 6, 2016 at 8:01 PM, Sebastian Berg 
wrote:

> On Sa, 2016-02-06 at 16:56 -0600, Elliot Hallmark wrote:
> > Hi all,
> >
> > I have a program that uses resizeable arrays.  I already
> > over-provision the arrays and use slices, but every now and then the data
> > outgrows that array and it needs to be resized.
> >
> > Now, I would like to have these arrays shared between processes
> > spawned via multiprocessing (for fast interprocess communication
> > purposes, not for parallelizing work on an array).  I don't care
> > about mapping to a file on disk, and I don't want disk I/O happening.
> >   I don't care (really) about data being copied in memory on resize.
> > I *do* want the array to be resized "in place", so that the child
> > processes can still access the arrays from the object they were
> > initialized with.
> >
> >
> > I can share arrays easily using arrays that are backed by memmap.
> > Ie:
> >
> > ```
> > #Source: http://github.com/rainwoodman/sharedmem
> > import mmap
> > import numpy
> >
> >
> > class anonymousmemmap(numpy.memmap):
> >     def __new__(subtype, shape, dtype=numpy.uint8, order='C'):
> >         descr = numpy.dtype(dtype)
> >         _dbytes = descr.itemsize
> >
> >         shape = numpy.atleast_1d(shape)
> >         size = 1
> >         for k in shape:
> >             size *= k
> >
> >         bytes = int(size*_dbytes)
> >
> >         if bytes > 0:
> >             mm = mmap.mmap(-1, bytes)
> >         else:
> >             mm = numpy.empty(0, dtype=descr)
> >         self = numpy.ndarray.__new__(subtype, shape, dtype=descr,
> >                                      buffer=mm, order=order)
> >         self._mmap = mm
> >         return self
> >
> >     def __array_wrap__(self, outarr, context=None):
> >         return numpy.ndarray.__array_wrap__(self.view(numpy.ndarray),
> >                                             outarr, context)
> > ```
> >
> > This cannot be resized because it does not own its own data
> > (ValueError: cannot resize this array: it does not own its data).
> > (numpy.memmap has this same issue [0], even if I set refcheck to
> > False and even though the docs say otherwise [1]).
> >
> > arr._mmap.resize(x) fails because it is anonymous (error: [Errno 9]
> > Bad file descriptor).  If I create a file and use that fileno to
> > create the memmap, then I can resize `arr._mmap` but the array itself
> > is not resized.
> >
> > Is there a way to accomplish what I want?  Or, do I just need to
> > figure out a way to communicate new arrays to the child processes?
> >
>
> I guess the answer is no, but the first question should be whether you
> can create a new array viewing the same data that is just larger? Since
> you have the mmap, that would be creating a new view into it.
>
> I.e. your "array" would be the memmap, and to use it, you always rewrap
> it into a new numpy array.
>
> Other than that, you would have to mess with the internal ndarray
> structure, since these kinds of operations appear rather unsafe.
>
> - Sebastian
>
>
> > Thanks,
> >   Elliot
> >
> > [0] https://github.com/numpy/numpy/issues/4198.
> >
> > [1] http://docs.scipy.org/doc/numpy/reference/generated/numpy.memmap.
> > resize.html
> >
> >
> > ___
> > NumPy-Discussion mailing list
> > NumPy-Discussion@scipy.org
> > https://mail.scipy.org/mailman/listinfo/numpy-discussion
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>
>
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Multi-distribution Linux wheels - please test

2016-02-07 Thread Matthew Brett
Hi,

On Sun, Feb 7, 2016 at 2:06 AM, Nadav Horesh  wrote:
> The test results of numpy 1.10.4 installed from source:
>
> OK (KNOWNFAIL=4, SKIP=6)
>
>
> I think I use openblas, as it is installed instead of the normal blas/cblas.

Thanks again for the further tests.

What do you get for:

python -c 'import numpy; print(numpy.__config__.show())'

Matthew
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] [SciPy-Dev] Multi-distribution Linux wheels - please test

2016-02-07 Thread Nathaniel Smith
On Feb 7, 2016 15:27, "Charles R Harris"  wrote:
>
>
>
> On Sun, Feb 7, 2016 at 2:16 PM, Nathaniel Smith  wrote:
>>
>> On Sun, Feb 7, 2016 at 9:49 AM, Charles R Harris
>>  wrote:
>> >
>> >
>> > On Sun, Feb 7, 2016 at 3:40 AM, Nathaniel Smith  wrote:
>> >>
>> >> On Feb 6, 2016 12:27 PM, "Matthew Brett"  wrote:
>> >> >
>> >> > Hi,
>> >> >
>> >> > As some of you may have seen, Robert McGibbon and Nathaniel have just
>> >> > guided a PEP for multi-distribution Linux wheels past the approval
>> >> > process over on distutils-sig:
>> >> >
>> >> > https://www.python.org/dev/peps/pep-0513/
>> >> >
>> >> > The PEP includes a docker image on which y'all can build wheels which
>> >> > match the PEP:
>> >> >
>> >> > https://quay.io/repository/manylinux/manylinux
>> >>
>> >> This is the wrong repository :-) It moved, and there are two now:
>> >>
>> >> quay.io/pypa/manylinux1_x86_64
>> >> quay.io/pypa/manylinux1_i686
>> >
>> >
>> > I'm going to put out 1.11.0b3 today. What would be the best thing to do
>> > for testing?
>>
>> I'd say, don't worry about building linux wheels as part of the
>> release cycle yet -- it'll still be a bit before they're allowed on
>> pypi or pip will recognize the new special tag. So for now you can
>> leave it to Matthew or someone to build test images and stick them up
>> on a server somewhere, same as before :-)
>
>
> Should I try putting the sources up on pypi?

+1

-n
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion


[Numpy-discussion] Linking other libm-Implementation

2016-02-07 Thread Nils Becker
Hi all,

I wanted to know if there is any sane way to build numpy while linking to a
different implementation of libm.
A drop-in replacement for libm (e.g. openlibm) should in principle work, I
guess, but I did not manage to actually make it work. As far as I understand
the build code, setting MATHLIB=openlibm should suffice, but it did not. The
build works fine, but in the end numpy apparently uses the functions of the
system libm.so at runtime. I could not verify this directly (as I do not know
how), but I noticed that there is no performance difference between the
builds, while there is one for pure C programs linked against libm versus
openlibm.
Using amdlibm would require some work, as the functions are prefixed with
"_amd", I guess? Using Intel's libimf should work when using Intel's
compiler, but I did not try this. With gcc I did not get it to work.

A quite general question: at the moment the performance and the accuracy of
the base mathematical functions depend on the platform and the system's libm
implementation. Although there are functions defined in npy_math, they are
only used as fallbacks if they are not provided by a library (correct me if I
am wrong here).
Is there some plan to change this in the future and provide defined
behaviour (specified accuracy and/or speed) across platforms? As I understand
it, Julia started openlibm for this reason (it is based on fdlibm/msun, the
same as npy_math).

Cheers
Nils
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Linking other libm-Implementation

2016-02-07 Thread Nathaniel Smith
On Sun, Feb 7, 2016 at 4:39 PM, Nils Becker  wrote:
> Hi all,
>
> I wanted to know if there is any sane way to build numpy while linking to a
> different implementation of libm.
> A drop-in replacement for libm (e.g. openlibm) should in principle work, I
> guess, but I did not manage to actually make it work. As far as I understand
> the build code, setting MATHLIB=openlibm should suffice, but it did not. The
> build works fine, but in the end numpy apparently uses the functions of the
> system libm.so at runtime. I could not verify this directly (as I do not
> know how), but I noticed that there is no performance difference between
> the builds, while there is one for pure C programs linked against libm
> versus openlibm.
> Using amdlibm would require some work, as the functions are prefixed with
> "_amd", I guess? Using Intel's libimf should work when using Intel's
> compiler, but I did not try this. With gcc I did not get it to work.
>
> A quite general question: at the moment the performance and the accuracy of
> the base mathematical functions depend on the platform and the system's
> libm implementation. Although there are functions defined in npy_math, they
> are only used as fallbacks if they are not provided by a library (correct
> me if I am wrong here).
> Is there some plan to change this in the future and provide defined
> behaviour (specified accuracy and/or speed) across platforms? As I
> understand it, Julia started openlibm for this reason (it is based on
> fdlibm/msun, the same as npy_math).

The npy_math functions are used if otherwise unavailable OR if someone
has at some point noticed that say glibc 2.4-2.10 has a bad quality
tan (or whatever) and added a special case hack that checks for those
particular library versions and uses our built-in version instead.
It's not the most convenient setup to maintain, so there's been some
discussion of trying openlibm instead [1], but AFAIK you're the first
person to find the time to actually sit down and try doing it :-).

You should be able to tell what math library you're linked to by
running ldd (on Linux) or otool (on OS X) against the .so / .dylib
files inside your built copy of numpy -- e.g.

  ldd numpy/core/umath.cpython-34m.so

(exact filename and command will vary depending on python version and platform).
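
If it helps, a small wrapper around the same idea (a sketch, Linux-only; the
glob pattern is just an assumption about where the extension modules live):

```
# Sketch: report which libm each numpy core extension module links against.
import glob
import os
import subprocess

import numpy

core_dir = os.path.join(os.path.dirname(numpy.__file__), "core")
for so in glob.glob(os.path.join(core_dir, "*.so")):
    out = subprocess.check_output(["ldd", so]).decode()
    libm = [line.strip() for line in out.splitlines() if "libm" in line]
    print(os.path.basename(so), "->", libm or "no libm dependency listed")
```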

-n

[1] https://github.com/numpy/numpy/search?q=openlibm&type=Issues&utf8=%E2%9C%93

-- 
Nathaniel J. Smith -- https://vorpus.org
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Multi-distribution Linux wheels - please test

2016-02-07 Thread Nadav Horesh
Thank you for reminding me, it is OK now:
$ python -c 'import numpy; print(numpy.__config__.show())'

lapack_opt_info:
library_dirs = ['/usr/local/lib']
language = c
libraries = ['openblas']
define_macros = [('HAVE_CBLAS', None)]
blas_mkl_info:
  NOT AVAILABLE
openblas_info:
library_dirs = ['/usr/local/lib']
language = c
libraries = ['openblas']
define_macros = [('HAVE_CBLAS', None)]
openblas_lapack_info:
library_dirs = ['/usr/local/lib']
language = c
libraries = ['openblas']
define_macros = [('HAVE_CBLAS', None)]
blas_opt_info:
library_dirs = ['/usr/local/lib']
language = c
libraries = ['openblas']
define_macros = [('HAVE_CBLAS', None)]
None

I updated openblas to the latest version (0.2.15) and it passes the tests

  Nadav.

From: NumPy-Discussion  on behalf of 
Matthew Brett 
Sent: 08 February 2016 01:33
To: Discussion of Numerical Python
Subject: Re: [Numpy-discussion] Multi-distribution Linux wheels - please test

Hi,

On Sun, Feb 7, 2016 at 2:06 AM, Nadav Horesh  wrote:
> The test results of numpy 1.10.4 installed from source:
>
> OK (KNOWNFAIL=4, SKIP=6)
>
>
> I think I use openblas, as it is installed instead of the normal blas/cblas.

Thanks again for the further tests.

What do you get for:

python -c 'import numpy; print(numpy.__config__.show())'

Matthew
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Multi-distribution Linux wheels - please test

2016-02-07 Thread Matthew Brett
On Sun, Feb 7, 2016 at 10:09 PM, Nadav Horesh  wrote:
> Thank you for reminding me, it is OK now:
> $ python -c 'import numpy; print(numpy.__config__.show())'
>
> lapack_opt_info:
> library_dirs = ['/usr/local/lib']
> language = c
> libraries = ['openblas']
> define_macros = [('HAVE_CBLAS', None)]
> blas_mkl_info:
>   NOT AVAILABLE
> openblas_info:
> library_dirs = ['/usr/local/lib']
> language = c
> libraries = ['openblas']
> define_macros = [('HAVE_CBLAS', None)]
> openblas_lapack_info:
> library_dirs = ['/usr/local/lib']
> language = c
> libraries = ['openblas']
> define_macros = [('HAVE_CBLAS', None)]
> blas_opt_info:
> library_dirs = ['/usr/local/lib']
> language = c
> libraries = ['openblas']
> define_macros = [('HAVE_CBLAS', None)]
> None
>
> I updated openblas to the latest version (0.2.15) and it passes the tests

Oh dear - now I'm confused.  So you installed the wheel, and tested
it, and it gave a test failure.  Then you updated openblas using
pacman, and then reran the tests against the wheel numpy, and they
passed?  That's a bit frightening - the wheel should only see its own
copy of openblas...
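
One way to check that (a sketch, Linux-only; interpreting the paths relies on
an assumption about how the manylinux wheel bundles its own openblas):

```
# Sketch: after importing the wheel's numpy, list the openblas shared objects
# that the current process has actually mapped.
import numpy  # importing numpy loads whichever BLAS it was built against

with open("/proc/self/maps") as f:
    libs = sorted({line.split()[-1] for line in f if "openblas" in line.lower()})
print(libs)  # a path inside the numpy package tree => the wheel's own copy
```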

Thanks for persisting,

Matthew
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Multi-distribution Linux wheels - please test

2016-02-07 Thread Nathaniel Smith
On Sat, Feb 6, 2016 at 9:28 PM, Nadav Horesh  wrote:
> Test platform: python 3.4.1 on archlinux x86_64
>
> scipy test: OK
>
> OK (KNOWNFAIL=97, SKIP=1626)
>
>
> numpy tests: Failed on long double and int128 tests, and got one error:

Could you post the complete output from the test suite somewhere?
(Maybe gist.github.com)

-n

-- 
Nathaniel J. Smith -- https://vorpus.org
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Multi-distribution Linux wheels - please test

2016-02-07 Thread Nadav Horesh
I have atlas-lapack-base installed via pacman (required by sagemath). Since the
numpy installation insisted on openblas in /usr/local, I got the openblas
source code and installed it in /usr/local.
BTW, I use 1.11b rather than 1.10.x since 1.10 is very slow in handling
recarrays. For the tests I am erasing the 1.11 installation and installing the
1.10.4 wheel. I do verify that I have the right version before running the
tests, but I am not sure that there are no unnoticed side effects.

Would it help if I put aside the openblas installation and rerun the test?

  Nadav

From: NumPy-Discussion  on behalf of 
Matthew Brett 
Sent: 08 February 2016 08:13
To: Discussion of Numerical Python
Subject: Re: [Numpy-discussion] Multi-distribution Linux wheels - please test

On Sun, Feb 7, 2016 at 10:09 PM, Nadav Horesh  wrote:
> Thank you for reminding me, it is OK now:
> $ python -c 'import numpy; print(numpy.__config__.show())'
>
> lapack_opt_info:
> library_dirs = ['/usr/local/lib']
> language = c
> libraries = ['openblas']
> define_macros = [('HAVE_CBLAS', None)]
> blas_mkl_info:
>   NOT AVAILABLE
> openblas_info:
> library_dirs = ['/usr/local/lib']
> language = c
> libraries = ['openblas']
> define_macros = [('HAVE_CBLAS', None)]
> openblas_lapack_info:
> library_dirs = ['/usr/local/lib']
> language = c
> libraries = ['openblas']
> define_macros = [('HAVE_CBLAS', None)]
> blas_opt_info:
> library_dirs = ['/usr/local/lib']
> language = c
> libraries = ['openblas']
> define_macros = [('HAVE_CBLAS', None)]
> None
>
> I updated openblas to the latest version (0.2.15) and it passes the tests

Oh dear - now I'm confused.  So you installed the wheel, and tested
it, and it gave a test failure.  Then you updated openblas using
pacman, and then reran the tests against the wheel numpy, and they
passed?  That's a bit frightening - the wheel should only see its own
copy of openblas...

Thanks for persisting,

Matthew
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Multi-distribution Linux wheels - please test

2016-02-07 Thread Nathaniel Smith
(This is not relevant to the main topic of the thread, but FYI I think the
recarray issues are fixed in 1.10.4.)
On Feb 7, 2016 11:10 PM, "Nadav Horesh"  wrote:

> I have atlas-lapack-base installed via pacman (required by sagemath).
> Since the numpy installation insisted on openblas in /usr/local, I got the
> openblas source code and installed it in /usr/local.
> BTW, I use 1.11b rather than 1.10.x since 1.10 is very slow in
> handling recarrays. For the tests I am erasing the 1.11 installation and
> installing the 1.10.4 wheel. I do verify that I have the right version
> before running the tests, but I am not sure that there are no unnoticed
> side effects.
>
> Would it help if I put aside the openblas installation and rerun the test?
>
>   Nadav
> 
> From: NumPy-Discussion  on behalf of
> Matthew Brett 
> Sent: 08 February 2016 08:13
> To: Discussion of Numerical Python
> Subject: Re: [Numpy-discussion] Multi-distribution Linux wheels - please
> test
>
> On Sun, Feb 7, 2016 at 10:09 PM, Nadav Horesh 
> wrote:
> > Thank you for reminding me, it is OK now:
> > $ python -c 'import numpy; print(numpy.__config__.show())'
> >
> > lapack_opt_info:
> > library_dirs = ['/usr/local/lib']
> > language = c
> > libraries = ['openblas']
> > define_macros = [('HAVE_CBLAS', None)]
> > blas_mkl_info:
> >   NOT AVAILABLE
> > openblas_info:
> > library_dirs = ['/usr/local/lib']
> > language = c
> > libraries = ['openblas']
> > define_macros = [('HAVE_CBLAS', None)]
> > openblas_lapack_info:
> > library_dirs = ['/usr/local/lib']
> > language = c
> > libraries = ['openblas']
> > define_macros = [('HAVE_CBLAS', None)]
> > blas_opt_info:
> > library_dirs = ['/usr/local/lib']
> > language = c
> > libraries = ['openblas']
> > define_macros = [('HAVE_CBLAS', None)]
> > None
> >
> > I updated openblas to the latest version (0.2.15) and it passes the tests
>
> Oh dear - now I'm confused.  So you installed the wheel, and tested
> it, and it gave a test failure.  Then you updated openblas using
> pacman, and then reran the tests against the wheel numpy, and they
> passed?  That's a bit frightening - the wheel should only see its own
> copy of openblas...
>
> Thanks for persisting,
>
> Matthew
> ___
> NumPy-Discussion mailing list
> NumPy-Discussion@scipy.org
> https://mail.scipy.org/mailman/listinfo/numpy-discussion
>
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion


Re: [Numpy-discussion] Multi-distribution Linux wheels - please test

2016-02-07 Thread Matthew Brett
Hi Nadav,

On Sun, Feb 7, 2016 at 11:13 PM, Nathaniel Smith  wrote:
> (This is not relevant to the main topic of the thread, but FYI I think the
> recarray issues are fixed in 1.10.4.)
>
> On Feb 7, 2016 11:10 PM, "Nadav Horesh"  wrote:
>>
>> I have atlas-lapack-base installed via pacman (required by sagemath).
>> Since the numpy installation insisted on openblas in /usr/local, I got the
>> openblas source code and installed it in /usr/local.
>> BTW, I use 1.11b rather than 1.10.x since 1.10 is very slow in
>> handling recarrays. For the tests I am erasing the 1.11 installation and
>> installing the 1.10.4 wheel. I do verify that I have the right version
>> before running the tests, but I am not sure that there are no unnoticed
>> side effects.
>>
>> Would it help if I put aside the openblas installation and rerun the
>> test?

Would you mind doing something like this, and posting the output?:

virtualenv test-manylinux
source test-manylinux/bin/activate
pip install -f https://nipy.bic.berkeley.edu/manylinux numpy==1.10.4 nose
python -c 'import numpy; numpy.test()'
python -c 'import numpy; print(numpy.__config__.show())'
deactivate

virtualenv test-from-source
source test-from-source/bin/activate
pip install numpy==1.10.4 nose
python -c 'import numpy; numpy.test()'
python -c 'import numpy; print(numpy.__config__.show())'
deactivate

I'm puzzled that the wheel gives a test error when the source install
does not, and my best guess was an openblas problem, but this is just to
make sure we have the output from the exact same numpy version, at
least.

Thanks again,

Matthew
___
NumPy-Discussion mailing list
NumPy-Discussion@scipy.org
https://mail.scipy.org/mailman/listinfo/numpy-discussion