On Fri, Mar 4, 2011 at 12:59 PM, eat <e.antero.ta...@gmail.com> wrote:
> Hi,
>
> On Fri, Mar 4, 2011 at 8:19 AM, Daniel Hyams <dhy...@gmail.com> wrote:
>
>> This is probably so easy, I'm embarrassed to ask it... but I've been
>> casting around trying things to no avail for the last hour and a half,
>> so here goes...
>>
>> I have a lot of dot products to take. The length-3 vectors that I want
>> to dot are stacked in a 2D array like this:
>>
>> U = [u1 u2 u3 ...]
>>
>> and
>>
>> V = [v1 v2 v3 ...]
>>
>> So both of these arrays are, say, 3x100 each. I just want to take the
>> dot product of each of the corresponding vectors, so that the result is
>>
>> [u1.v1 u2.v2 u3.v3 ...]
>>
>> which would be a 1x100 array in this case.
>>
>> Which function do I need to use? I thought tensordot() was the one,
>> but I couldn't make it work... pure user error, I'm sure.

Hello Daniel Hyams and group,

I guess the code below serves the purpose, but I think something needs to
be highlighted: eat seems to have forgotten to paste the imports here. You
can use 'rand' (and 'dot') this way only after importing them, e.g.

In []: from numpy.random import rand
In []: from numpy import dot

> No function needed for this case, just:
>
> In []: x = rand(3, 7)
> In []: y = rand(3, 7)
> In []: d = (x * y).sum(0)
> In [490]: d
> Out[490]:
> array([ 1.25404683,  0.19113117,  1.37267133,  0.74219888,  1.55296562,
>         0.15264303,  0.72039922])
> In [493]: dot(x[:, 0].T, y[:, 0])
> Out[493]: 1.2540468282421895
>
> Regards,
> eat

--
-Regards
Hector

Whenever you think you can or you can't, in either way you are right.
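For completeness, here is a minimal, self-contained sketch of the approach
discussed in the thread, with explicit imports so it runs outside a pylab
session; the names U and V and the 3x100 shape are only illustrative:

import numpy as np

U = np.random.rand(3, 100)   # 100 stacked length-3 vectors
V = np.random.rand(3, 100)   # 100 stacked length-3 vectors

# Multiply elementwise and sum over the length-3 axis (axis 0);
# d[i] is then the dot product of column i of U with column i of V.
d = (U * V).sum(axis=0)      # shape (100,)

# Check one column against an explicit dot product.
assert np.allclose(d[0], np.dot(U[:, 0], V[:, 0]))

In NumPy versions that provide einsum, the same result can also be written
as np.einsum('ij,ij->j', U, V), but the elementwise multiply-and-sum above
is the construction used in the thread.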