Greetings,
I know this is a recurring question, but I would like to use my own loss
function with either an MLPRegressor or an SVR. For the MLPRegressor,
my conclusion so far is that it is not possible without modifying the
source code. For the SVR, on the other hand, I was looking at setting
Hi Thomas,
> For the MLPRegressor, my conclusion so far is that it is not possible
> without modifying the source code.
Also, I suspect that this would be non-trivial. I haven't looked too closely at
how MLPClassifier/MLPRegressor are implemented, but since you perform the
weight update
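For what it's worth, when the loss is hard-wired into an estimator, one
workaround is to run the optimization yourself for a model simple enough to
write out by hand. Below is a minimal sketch of that idea (my own, not
anything in scikit-learn): plain gradient descent on a linear model, with a
pseudo-Huber loss standing in for the "custom" loss. The whole point is that
the loss only enters through its derivative in the update step, which is
exactly the part scikit-learn does not let you swap out.

```python
import numpy as np

# Hypothetical sketch: gradient descent on a linear model with a custom
# (pseudo-Huber) loss, the kind of loop you'd write when the estimator's
# loss cannot be replaced.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
true_w = np.array([2.0, -1.0])
y = X @ true_w + 0.05 * rng.normal(size=200)

w = np.zeros(2)
delta = 1.0   # pseudo-Huber scale
lr = 0.1

for _ in range(2000):
    r = X @ w - y
    # Derivative of the pseudo-Huber loss w.r.t. the residual:
    # d/dr [delta^2 (sqrt(1 + (r/delta)^2) - 1)] = r / sqrt(1 + (r/delta)^2).
    # Swapping the loss means swapping only this line.
    g = X.T @ (r / np.sqrt(1.0 + (r / delta) ** 2)) / len(y)
    w -= lr * g

print(w)  # close to [2.0, -1.0]
```

Since the true relationship is linear with small noise, the recovered weights
should land near `true_w`; the same loop structure works for any loss whose
derivative you can write down.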