Hi Georg,

The documentation (?nnet) says that y should be a matrix or data frame of 
target values, but in your case it is a vector. That is most likely the 
problem, assuming there are no other issues with your data. Convert y to a 
matrix (or data frame) with 'as.matrix' and see whether that solves it; the 
'nnet' package can do both classification and regression. I was able to 
reproduce your problem using an example from Modern Applied Statistics with S 
(Venables and Ripley, pages 246-247): when y is turned into a vector, all the 
predicted values come out the same, whereas they do not when y is part of a 
data frame. You can see this by running the code below. I have tried about 
four neural network packages in the past, including AMORE, and found 'nnet' 
to be the best for my needs.
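
In your own call that would be something along these lines (just a sketch 
reusing the object names from your message; I have not run it on your data):

# pass the response as a one-column matrix instead of a vector
trainednet <- nnet(x = traindata, y = as.matrix(trainresponse),
                   size = 30, linout = TRUE, maxit = 1000)
head(trainednet$fitted.values)   # should no longer all be identical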

Hope this helps.

Jude

# Neural network model from Modern Applied Statistics with S,
# Venables and Ripley, pages 246-247
library(nnet)
attach(rock)
dim(rock)
# rescale area and peri so the inputs are on comparable scales
area1 <- area/10000; peri1 <- peri/10000
rock1 <- data.frame(perm, area = area1, peri = peri1, shape)
dim(rock1)
head(rock1, 15)
# formula interface; skip = TRUE adds skip-layer connections from input to output
rock.nn <- nnet(log(perm) ~ area + peri + shape, rock1, size = 3, decay = 1e-3,
                linout = TRUE, skip = TRUE, maxit = 1000, Hess = TRUE)
rock1$actual <- log(perm)
rock1$predicted <- predict(rock.nn)
head(rock1, 15)                          # fitted values differ across rows
summary(rock.nn)
sum((log(perm) - predict(rock.nn))^2)    # residual sum of squares

# now pass y as a plain vector: all the predicted values come out the same
y <- as.vector(log(rock1$perm))
head(rock1[, 2:4])
test.nn <- nnet(x = rock1[, 2:4], y = y, size = 3, linout = TRUE, maxit = 1000)
head(predict(test.nn))
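
And the same fix applied to the example above: if the vector y really is the 
culprit, passing it as a one-column matrix should give fitted values that 
vary across rows again (the names y.mat and test.nn2 are just illustrative):

y.mat <- as.matrix(log(rock1$perm))   # one-column matrix instead of a vector
test.nn2 <- nnet(x = rock1[, 2:4], y = y.mat, size = 3, linout = TRUE, maxit = 1000)
head(predict(test.nn2))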

Georg wrote:

Hi,

I'm currently trying desperately to get the nnet function for training a
neural network (with one hidden layer) to perform a regression task.

So I run it like the following:

trainednet <- nnet(x=traindata, y=trainresponse, size = 30, linout = TRUE,
maxit=1000)

(where x is a matrix and y a numerical vector consisting of the target
values for one variable)

To see whether the network learnt anything at all, I checked the network
weights and those have definitely changed. However, when examining
trainednet$fitted.values, those are all the same, so it rather looks as if
the network is doing a classification. I can even set linout=FALSE and
then it outputs "1" (the class?) for each training example. The
trainednet$residuals are correct (the difference between the
predicted/fitted value and the actual response), but rather useless.

The same happens if I run nnet with the formula/data.frame interface, btw.

Going by the note on the ?nnet page, "If the response is not a factor,
it is passed on unchanged to 'nnet.default'", I assume that the network is
doing regression, since my trainresponse variable is a numerical vector and
_not_ a factor.

I'm currently lost, and I can't see that the AMORE/neuralnet packages are
any better (moreover, they don't implement the formula/data.frame/predict
machinery). I've read the man pages of nnet and predict.nnet a gazillion
times, but I can't really find an answer there. I don't want to do
classification, but regression.

Thanks for any help.

Georg.

--
Research Assistant
Otto-von-Guericke-Universität Magdeburg
resea...@georgruss.de
http://research.georgruss.de


Jude Ryan
MarketShare Partners
1270 Avenue of the Americas, Suite # 2702
New York, NY 10020
http://www.marketsharepartners.com
Work: (646)-745-9916 ext: 222
Cell: (973)-943-2029

