Hi Oyvind,
I believe this is possible to implement. There is already some ongoing work on
using the GPU from R, and it uses the CUDA toolkit, as the reference you
supplied does.
http://brainarray.mbni.med.umich.edu/Brainarray/rgpgpu/
Thanks,
Rob
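[A hypothetical sketch, not from the thread: the kind of per-observation parallelism Oyvind describes can be tried on the CPU first with the base `parallel` package; a GPU version would exploit the same map-reduce structure. The data, model, and chunking scheme below are illustrative assumptions, not anything the posters supplied.]

```r
# Illustrative only: parallelize a normal negative log-likelihood across cores.
# Note: mclapply forks, so mc.cores > 1 requires a Unix-alike (not Windows).
library(parallel)

set.seed(1)
x <- rnorm(1000, mean = 2, sd = 3)   # simulated data for the sketch

# Per-observation log-likelihood terms are independent, so each chunk
# can be evaluated in parallel and the partial sums combined.
negll <- function(par, data, cores = 2L) {
  mu <- par[1]
  sigma <- exp(par[2])               # log-parameterize sd to keep it positive
  chunks <- split(data, cut(seq_along(data), cores, labels = FALSE))
  parts <- mclapply(chunks, function(ch)
    -sum(dnorm(ch, mean = mu, sd = sigma, log = TRUE)),
    mc.cores = cores)
  Reduce(`+`, parts)
}

fit <- optim(c(0, 0), negll, data = x)
fit$par                              # estimates of mu and log(sigma)
```

The same chunk-evaluate-reduce pattern is what a CUDA kernel would implement, with each thread computing one observation's log-density.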
On 18 May 2011, at 10:07, oyvfos wrote:
> Dear all
Thanks Rob,
I have notified the maintainer about the suggestion.
Oyvind
From: Robert Lowe
To: oyvfos
Cc: r-devel@r-project.org
Sent: Wed, May 18, 2011 2:27:15 PM
Subject: Re: [Rd] Max likelihood using GPU
Hi Oyvind,
I believe this is possible to implement
Dear all,
Probably many of you experience long computation times when estimating a large
number of parameters using maximum likelihood with functions that require
numerical methods such as integration or root-finding. Maximum likelihood is
an example of parallelization that could successfully utilize