On Thursday, September 8, 2016 at 7:27:36 PM UTC-4, Steven G. Johnson wrote:
>
> On Thursday, September 8, 2016 at 4:18:42 PM UTC-4, Mathieu 
> Taschereau-Dumouchel wrote:
>>
>> Not quite because the x_i on the right-hand side also depend on a_j
>
>
> If you take the partial derivative of both sides with respect to aₖ 
> (for k = 1..N), you will see that (given the solution vector x) you get 
> a *linear* system of equations for the gradient:
>
>    ∂xⱼ/∂aₖ - b aⱼ ∑ᵢ (xᵢ)ᵇ⁻¹ ∂xᵢ/∂aₖ = δⱼₖ ∑ᵢ (xᵢ)ᵇ
>
> Notice that the right-hand side is proportional to the identity matrix (δ 
> denotes the Kronecker delta).   So, just form the matrix corresponding to 
> the linear operator multiplying ∇ₐx on the left-hand side, and your 
> Jacobian matrix ∇ₐx will then be the inverse of this matrix multiplied by 
> ∑ᵢ (xᵢ)ᵇ.
>

More explicitly, if a and x are Julia 1d arrays, then your Jacobian matrix 
is (modulo algebra errors on my part):

 ∇ₐx = inv(I - b * a * (x.^(b-1))') * sum(x.^b)

(In Julia 1.0 and later, the old x.' transpose syntax is written x', and I 
comes from "using LinearAlgebra".)
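As a quick sanity check of this formula, one can compare a column of ∇ₐx 
against a finite difference. This is only a sketch: the toy a, the closed-form 
solve for the fixed point x = a ∑ᵢ xᵢᵇ, and the variable names here are my 
illustration, not part of the original post.

```julia
using LinearAlgebra

b = 2.0
a = [0.1, 0.2, 0.3]

# For this illustration the nontrivial fixed point of x = a * sum(x.^b)
# has a closed form: x = a*s with s = sum(a.^b)^(-1/(b-1)).
solve(a) = a * sum(a.^b)^(-1/(b-1))
x = solve(a)

# Jacobian from the linear system above (x' is the modern spelling of x.')
J = inv(I - b * a * (x.^(b-1))') * sum(x.^b)

# finite-difference check of column k
k, h = 1, 1e-7
a2 = copy(a); a2[k] += h
fd = (solve(a2) - x) / h
maximum(abs.(J[:, k] - fd))  # should be small
```

The same check works column by column for any k, so it catches index or sign 
errors in the formula without needing a second derivation.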


Note also that the matrix you are inverting is actually a rank-1 update to 
the identity matrix I, so you can compute ∇ₐx *much* more quickly (if 
needed) using the Woodbury 
<https://en.wikipedia.org/wiki/Woodbury_matrix_identity> formula. 
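For a rank-1 update, the Woodbury identity reduces to the Sherman–Morrison 
formula, inv(I - u·vᵀ) = I + u·vᵀ / (1 - vᵀu), which avoids the explicit 
matrix inverse entirely. A sketch of what this looks like here (the names u 
and v and the toy data are mine, for illustration only):

```julia
using LinearAlgebra

b = 2.0
a = [0.1, 0.2, 0.3]
x = a * sum(a.^b)^(-1/(b-1))  # illustrative fixed point of x = a*sum(x.^b)

# The matrix being inverted is I - u*v', a rank-1 update of the identity:
u = b * a
v = x.^(b-1)

# Sherman–Morrison (rank-1 Woodbury): inv(I - u*v') = I + u*v'/(1 - dot(v,u))
J_fast = (I + u * v' / (1 - dot(v, u))) * sum(x.^b)

# agrees with forming and inverting the matrix directly
J_slow = inv(I - u * v') * sum(x.^b)
maximum(abs.(J_fast - J_slow))  # ≈ 0 up to roundoff
```

This replaces an O(N³) inverse with O(N²) work (a dot product plus one outer 
product), which is where the "much more quickly" comes from.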
