It seems that any `Tree` attribute that uses `self._get_node_ndarray()` to get a *floating-point* ndarray exhibits this weird behavior when negated. Examples include `weighted_n_node_samples` and `threshold`. However, attributes that use the same function to get *integer* ndarrays, such as `n_node_samples` and `feature`, don't.
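
For concreteness, a minimal sketch of that comparison (the dataset and tree parameters below are placeholders, not the original reproducer; it only assumes scikit-learn's public `DecisionTreeClassifier` / `tree_` API). On an unaffected platform every attribute should print True:

    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)
    clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

    # Floating-point node arrays backed by _get_node_ndarray(); on the
    # failing i386 build these are the ones where negation goes wrong.
    for name in ("impurity", "threshold", "weighted_n_node_samples"):
        arr = getattr(clf.tree_, name)
        print(name, np.array_equal(-arr, arr * -1.0, equal_nan=True))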
Moreover, *only negation* seems to be problematic. Multiplying by `-1.0` seems fine:

> (Pdb) clf.tree_.impurity.__neg__()
> array([-4.69135802e-001, -1.59149684e-314, -1.50000000e+000,
>        -2.12199579e-314, nan])
> (Pdb) clf.tree_.impurity.__mul__(-1.0)
> array([-0.4691358 , -0. , -0.22222222, -0. , -0.])

Some further observations:

* On i386:

      (Pdb) clf.tree_.impurity.__array_interface__
      {'data': (161778564, False), 'strides': (44,), 'descr': [('', '<f8')], 'typestr': '<f8', 'shape': (5,), 'version': 3}

* On amd64:

      (Pdb) clf.tree_.impurity.__array_interface__
      {'data': (54432144, False), 'strides': (64,), 'descr': [('', '<f8')], 'typestr': '<f8', 'shape': (5,), 'version': 3}

* Note that 161778564 (the i386 data pointer) is not divisible by 8, while 54432144 (the amd64 data pointer) is.
* CI started failing between NumPy 1.24 and 1.26.
* Between NumPy 1.24 and 1.26, there seems to have been some work on SIMD-ifying array negation [1].

I haven't been able to dig deeper, but does this perhaps smell of an alignment issue?

[1] https://github.com/numpy/numpy/commit/490b1e45ce16ca91d1c6a1e644f844179b5410eb
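
For what it's worth, here is a rough way to poke at that hypothesis (a sketch only; `is_itemsize_aligned` and the deliberately misaligned buffer are mine, not part of scikit-learn or NumPy):

    import numpy as np

    def is_itemsize_aligned(arr):
        """True if the array's data pointer is a multiple of its item size."""
        return arr.ctypes.data % arr.itemsize == 0

    # The pointers quoted above, checked directly:
    for label, ptr in (("i386", 161778564), ("amd64", 54432144)):
        print(label, "data pointer % 8 =", ptr % 8)   # 4 on i386, 0 on amd64

    # A deliberately misaligned float64 view, to exercise negation on an
    # array whose data pointer is not 8-byte aligned:
    buf = np.zeros(5 * 8 + 4, dtype=np.uint8)
    misaligned = buf[4:].view(np.float64)
    print(is_itemsize_aligned(misaligned), -misaligned)

Note that `ndarray.flags.aligned` might not catch this on i386, since the C alignment requirement of `double` there is presumably only 4 bytes; hence the explicit modulo-itemsize check.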