Forgot to cc the help list.

On 18-12-2012, at 19:40, Lu, James T wrote:
> I am attempting to use optim to solve a neural network problem. I would
> like to optimize coefficients that are currently stored in matrices:
>
> Y  = 270 x 1
> X  = 270 x 14
> b1 = 10 x 14
> b2 = 11 x 1
> V  = 10 x 14 set of prior variances
>
> I have the following function:
>
> posterior.mode1 = function(y, X, b_0, b2, V) {
>
>   log.like = function(b1) {
>     a_g = compute(b1)
>     z_g = tanh(a_g)
>     z_g = cbind(1, z_g)
>     p = softmax(z_g %*% b2)
>     a = sum(y*log(p) + (1-y)*log(1-p))
>     return(a)
>   }
>
>   compute = function(b1) {
>     a_g = NULL
>     for (i in 1:nrow(b1)) {
>       a_g = cbind(a_g, X %*% b1[i,])
>     }
>     return(a_g)
>   }
>
>   log.posterior = function(b1) {
>     -log.like(b1) + 1/2 * t(as.vector(b1)) %*% diag(as.vector(V)) %*% as.vector(b1)
>   }
>
>   a = optim(b_0, log.posterior, method = "CG", hessian = TRUE)
>   return(a)
> }
>
> When I run
>
> posterior.mode1(y, X, b1, b2, b1)
>
> I get the following error:
>
> Error in 1:nrow(b1) : argument of length 0

optim treats the parameter b_0 as a vector and passes it as such to the
function it is trying to optimize. So in log.like you should convert b1
back to a matrix of the correct dimensions, like so:

    b1 <- matrix(b1, nrow = 10)

Then it should work.

Note: you haven't provided the function softmax.

Berend
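For readers landing on this thread, here is a minimal, self-contained sketch of the function with the reshaping fix applied. Two assumptions are baked in: softmax, which was not included in the original post, is taken here to be the logistic (sigmoid) link, since y is binary and the likelihood is Bernoulli; and compute is written as a single matrix product, which gives the same 270 x 10 result as the original column-by-column cbind loop.

## Assumed definition: logistic link for a single binary output
## (the original post did not supply softmax).
softmax <- function(x) 1 / (1 + exp(-x))

posterior.mode1 <- function(y, X, b_0, b2, V) {

  ## One matrix product per call; equivalent to cbind-ing X %*% b1[i, ]
  ## over the rows of b1 as in the original compute().
  compute <- function(b1) X %*% t(b1)          # 270 x 10 hidden activations

  log.like <- function(b1) {
    b1  <- matrix(b1, nrow = 10)               # the fix: optim hands b1 over as
                                               # a plain vector; restore 10 x 14
    a_g <- compute(b1)
    z_g <- cbind(1, tanh(a_g))                 # add intercept column -> 270 x 11
    p   <- softmax(z_g %*% b2)                 # fitted probabilities, 270 x 1
    sum(y * log(p) + (1 - y) * log(1 - p))     # Bernoulli log-likelihood
  }

  log.posterior <- function(b1) {
    ## Quadratic prior term on the flattened coefficients; drop() returns
    ## a plain scalar rather than a 1 x 1 matrix.
    drop(-log.like(b1) +
           0.5 * t(as.vector(b1)) %*% diag(as.vector(V)) %*% as.vector(b1))
  }

  optim(b_0, log.posterior, method = "CG", hessian = TRUE)
}

## Example call on simulated data of the stated sizes (hypothetical inputs;
## the 140 x 140 numerical Hessian makes this slow, but it runs):
## set.seed(1)
## X   <- matrix(rnorm(270 * 14), 270, 14)
## y   <- rbinom(270, 1, 0.5)
## b_0 <- matrix(rnorm(10 * 14, sd = 0.1), 10, 14)
## b2  <- matrix(rnorm(11, sd = 0.1), 11, 1)
## V   <- matrix(1, 10, 14)                    # prior variances, used as in the post
## fit <- posterior.mode1(y, X, b_0, b2, V)

Using nrow = 10 matches the poster's stated b1 dimensions; nrow(b_0) would be the more general choice if the hidden-layer size changes.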