Re: [Rd] Max likelihood using GPU

2011-05-18 Thread Robert Lowe
Hi Oyvind, I believe this is possible to implement. There is already some ongoing work on using the GPU in R, and it uses the CUDA toolkit, as does the reference you supplied. http://brainarray.mbni.med.umich.edu/Brainarray/rgpgpu/ Thanks, Rob

Re: [Rd] Max likelihood using GPU

2011-05-18 Thread Øyvind Foshaug
Thanks Rob, I have notified the maintainer about the suggestion. Oyvind

[Rd] Max likelihood using GPU

2011-05-18 Thread oyvfos
Dear all, Probably many of you experience long computation times when estimating a large number of parameters using maximum likelihood with functions that require numerical methods such as integration or root-finding. Maximum likelihood is an example of parallelization that could successfully utilize the GPU.
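The parallelism described above comes from the fact that the log-likelihood is a sum of independent per-observation terms, so each term can be evaluated concurrently. A minimal sketch of this idea, using the CPU-based `parallel` package as a stand-in for GPU offloading (a GPU implementation would instead dispatch the per-observation step to CUDA kernels; the data, model, and function names below are illustrative assumptions, not from the thread):

```r
library(parallel)

# Simulated data from a normal model with known true parameters.
set.seed(1)
x <- rnorm(1000, mean = 2, sd = 1.5)

# Per-observation log-likelihood contribution. In realistic models this
# step could involve numerical integration or root-finding, which is
# where parallel evaluation pays off.
obs_loglik <- function(xi, theta) {
  dnorm(xi, mean = theta[1], sd = theta[2], log = TRUE)
}

# Negative log-likelihood: each observation's contribution is computed
# independently, then summed. Note mclapply forks processes and so runs
# serially on Windows when mc.cores > 1.
neg_loglik <- function(theta, data, cores = 2) {
  if (theta[2] <= 0) return(Inf)  # keep the optimizer in the valid region
  contribs <- unlist(mclapply(data, obs_loglik, theta = theta,
                              mc.cores = cores))
  -sum(contribs)
}

# Maximize the likelihood (minimize its negative) over (mean, sd).
fit <- optim(c(0, 1), neg_loglik, data = x)
fit$par  # estimates should be close to c(2, 1.5)
```

For such a small per-observation cost the forking overhead outweighs the gain; the pattern only helps when each `obs_loglik` call is expensive, which is exactly the integration/root-finding case raised in the message.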