> Hi,
>
> I second James here; Theano does many of those optimizations. Only an
> advanced coder can do better than Theano in most cases, and even then it
> will take much more time. If you find an optimization that you do and
> Theano doesn't, tell us. We want to add it :)
>
> Fred
I am sure Theano does
> Of course, maybe you were pointing out that if your derivative
> calculation depends in some intrinsic way on the topology of some
> graph, then your best bet is to have an automatic way to recompute it
> from scratch for each new graph you see. In that case, fair enough!
That is indeed what I h
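A minimal sketch of the kind of graph optimization Fred describes, using Theano's documented API (theano.tensor.grad and theano.function); the cost expression and variable names are illustrative, not taken from the thread:

    import theano
    import theano.tensor as T

    w = T.dvector('w')
    x = T.dvector('x')
    s = T.dot(w, x)                    # subexpression reused twice below
    cost = T.tanh(s) + T.tanh(s) ** 2  # scalar cost

    g = T.grad(cost, w)             # symbolic gradient derived from the graph
    f = theano.function([w, x], g)  # compiled; Theano's optimizer merges the
                                    # duplicated T.tanh(s) nodes and applies
                                    # its other rewrites (constant folding,
                                    # elementwise fusion) to the gradient graph

    print(f([0.1, 0.2], [1.0, -1.0]))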
On Thu, Jun 14, 2012 at 5:53 PM, Nathaniel Smith wrote:
> On Thu, Jun 14, 2012 at 9:22 PM, srean wrote:
> No, I'm saying I totally see the advantages. Here's the code I'm talking
> about:
>
> def _loglik(self, params):
>     alpha, beta = self.used_alpha_beta(params)
>     if np.any(alp
On Thu, Jun 14, 2012 at 9:22 PM, srean wrote:
>>
>> For example, I wrote a library routine for doing log-linear
>> regression. Doing this required computing the derivative of the
>> likelihood function, which was a huge nitpicky hassle; took me a few
>> hours to work out and debug. But it's still
On Jun 14, 2012, at 1:53 PM, James Bergstra wrote:
> On Thu, Jun 14, 2012 at 11:01 AM, Nathaniel Smith wrote:
>
>>> Indeed that would be great, as sympy already has excellent math
>>> expression rendering.
>>>
>>> An alternative would be to output mathml or something similar that
>>> co
Hi,
On Thu, Jun 14, 2012 at 4:49 PM, James Bergstra
wrote:
> You're right - there is definitely a difference between a correct
> gradient and a gradient that is both correct and fast to compute.
>
> The current quick implementation of pyautodiff is naive in this
> regard. However, it is delegating th
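Whichever way the gradient is produced, correctness is usually verified the same way: compare the analytic gradient against central finite differences before worrying about speed. A generic numpy sketch (the helper name here is hypothetical, not part of pyautodiff):

    import numpy as np

    def check_grad(f, grad_f, x0, eps=1e-6):
        """Max abs difference between grad_f and central finite differences."""
        g_analytic = grad_f(x0)
        g_numeric = np.empty_like(x0)
        for i in range(x0.size):
            e = np.zeros_like(x0)
            e[i] = eps
            g_numeric[i] = (f(x0 + e) - f(x0 - e)) / (2 * eps)
        return np.max(np.abs(g_analytic - g_numeric))

    # f(x) = sum(x**2) has gradient 2*x, so the error should be close to zero.
    err = check_grad(lambda x: np.sum(x ** 2), lambda x: 2 * x,
                     np.array([1.0, -2.0, 3.0]))
    print(err)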
>
> You're right - there is definitely a difference between a correct
> gradient and a gradient that is both correct and fast to compute.
>
> The current quick implementation of pyautodiff is naive in this
> regard.
Oh and by no means was I criticizing your implementation. It is a very
hard problem to
On Thu, Jun 14, 2012 at 4:22 PM, srean wrote:
>>
>> For example, I wrote a library routine for doing log-linear
>> regression. Doing this required computing the derivative of the
>> likelihood function, which was a huge nitpicky hassle; took me a few
>> hours to work out and debug. But it's still
On Thu, Jun 14, 2012 at 3:38 PM, Nathaniel Smith wrote:
> On Thu, Jun 14, 2012 at 7:53 PM, James Bergstra
> wrote:
>> On Thu, Jun 14, 2012 at 11:01 AM, Nathaniel Smith wrote:
>>
Indeed that would be great, as sympy already has excellent math
expression rendering.
An al
>
> For example, I wrote a library routine for doing log-linear
> regression. Doing this required computing the derivative of the
> likelihood function, which was a huge nitpicky hassle; took me a few
> hours to work out and debug. But it's still just 10 lines of Python
> code that I needed to figu
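For reference, one common reading of "log-linear regression" is a Poisson model with a log link; under that assumption, the hand-derived likelihood and gradient Nathaniel describes are only a few lines of numpy. The functions below are an illustrative sketch, not his actual routine:

    import numpy as np

    def loglik(beta, X, y):
        """Log-likelihood of y_i ~ Poisson(exp(x_i . beta)),
        up to the constant term -sum(log(y_i!))."""
        eta = np.dot(X, beta)
        return np.sum(y * eta - np.exp(eta))

    def grad_loglik(beta, X, y):
        """Analytic gradient: X^T (y - exp(X beta))."""
        return np.dot(X.T, y - np.exp(np.dot(X, beta)))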
On Thu, Jun 14, 2012 at 7:53 PM, James Bergstra
wrote:
> On Thu, Jun 14, 2012 at 11:01 AM, Nathaniel Smith wrote:
>
>>> Indeed that would be great, as sympy already has excellent math
>>> expression rendering.
>>>
>>> An alternative would be to output mathml or something similar that
>>> c
On Thu, Jun 14, 2012 at 11:01 AM, Nathaniel Smith wrote:
>> Indeed that would be great, as sympy already has excellent math
>> expression rendering.
>>
>> An alternative would be to output mathml or something similar that
>> could be understood by the mathjax rendering module of the IPytho
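Both output paths mentioned here are already available in sympy. A small sketch (the Gaussian density is just an example expression) showing LaTeX output for MathJax, MathML output, and the IPython display hook:

    import sympy as sp
    from sympy.printing.mathml import mathml

    x, mu, sigma = sp.symbols('x mu sigma', positive=True)
    expr = sp.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * sp.sqrt(2 * sp.pi))

    print(sp.latex(expr))   # LaTeX source, suitable for MathJax
    print(mathml(expr))     # MathML string

    # In the IPython notebook, MathJax renders the LaTeX form directly:
    #   from IPython.display import Math, display
    #   display(Math(sp.latex(expr)))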
On Thu, Jun 14, 2012 at 3:42 PM, Olivier Grisel
wrote:
> 2012/6/14 James Bergstra :
>> On Thu, Jun 14, 2012 at 4:00 AM, Olivier Grisel
>> wrote:
>>> 2012/6/13 James Bergstra :
Further to the recent discussion on lazy evaluation & numba, I moved
what I was doing into a new project:
2012/6/14 James Bergstra :
> On Thu, Jun 14, 2012 at 4:00 AM, Olivier Grisel
> wrote:
>> 2012/6/13 James Bergstra :
>>> Further to the recent discussion on lazy evaluation & numba, I moved
>>> what I was doing into a new project:
>>>
>>> PyAutoDiff:
>>> https://github.com/jaberg/pyautodiff
>>>
>>>
On Thu, Jun 14, 2012 at 4:00 AM, Olivier Grisel
wrote:
> 2012/6/13 James Bergstra :
>> Further to the recent discussion on lazy evaluation & numba, I moved
>> what I was doing into a new project:
>>
>> PyAutoDiff:
>> https://github.com/jaberg/pyautodiff
>>
>> It currently works by executing CPytho
2012/6/13 James Bergstra :
> Further to the recent discussion on lazy evaluation & numba, I moved
> what I was doing into a new project:
>
> PyAutoDiff:
> https://github.com/jaberg/pyautodiff
>
> It currently works by executing CPython bytecode with a numpy-aware
> engine that builds a symbolic exp
Further to the recent discussion on lazy evaluation & numba, I moved
what I was doing into a new project:
PyAutoDiff:
https://github.com/jaberg/pyautodiff
It currently works by executing CPython bytecode with a numpy-aware
engine that builds a symbolic expression graph with Theano... so you
can d
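To make "executing CPython bytecode" concrete: the standard-library dis module shows the opcode stream that such a numpy-aware engine has to interpret while substituting symbolic variables for the real arrays. This only illustrates the bytecode level, not pyautodiff's own API:

    import dis
    import numpy as np

    def cost(w, X):
        return np.tanh(np.dot(X, w)).sum()

    # Prints opcodes such as LOAD_FAST, LOAD_GLOBAL/LOAD_ATTR and the call
    # instructions; a bytecode-level tracer replays these, mapping each numpy
    # call onto the corresponding symbolic operation to build the graph.
    dis.dis(cost)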