Thanks so much, Greg!

Regarding demo(bernoulli), I found the following information: it is used for
logistic regression.


My question is: when I fit a decision tree, can I still use the formula
Y~X1+X2+X3 (the "# formula" argument shown in the demo below), even though I
don't know the detailed functional form of the decision tree? See the sketch
just below for what I mean.
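
A minimal sketch of what I mean, assuming the data frame "data" from the demo
below has already been created (the rpart call is only there for comparison
and is not part of the gbm demo):

    # the same formula interface works for a single tree (rpart) and for
    # boosted trees (gbm); the formula only names the response and the
    # predictors, while the functional form comes from the fitting method
    library(rpart)
    library(gbm)
    tree1 <- rpart(factor(Y) ~ X1 + X2 + X3, data = data, method = "class")
    gbm0  <- gbm(Y ~ X1 + X2 + X3, data = data,
                 distribution = "bernoulli", n.trees = 100)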

Thanks!

    demo(bernoulli)
    ---- ~~~~~~~~~

Type  <Return>     to start :

> # LOGISTIC REGRESSION EXAMPLE
>
> cat("Running logistic regression example.\n")
Running logistic regression example.

> # create some data
> N <- 1000

> X1 <- runif(N)

> X2 <- runif(N)

> X3 <- factor(sample(letters[1:4],N,replace=T))

> mu <- c(-1,0,1,2)[as.numeric(X3)]

> p <- 1/(1+exp(-(sin(3*X1) - 4*X2 + mu)))

> Y <- rbinom(N,1,p)

> # random weights if you want to experiment with them
> w <- rexp(N)

> w <- N*w/sum(w)

> data <- data.frame(Y=Y,X1=X1,X2=X2,X3=X3)

> # fit initial model
> gbm1 <- gbm(Y~X1+X2+X3,                # formula
+             data=data,                 # dataset
+             weights=w,
+             var.monotone=c(0,0,0),     # -1: monotone decrease, +1: monotone increase, 0: no monotone restrictions
+             distribution="bernoulli",
+             n.trees=3000,              # number of trees
+             shrinkage=0.001,           # shrinkage or learning rate, 0.001 to 0.1 usually work
+             interaction.depth=3,       # 1: additive model, 2: two-way interactions, etc.
+             bag.fraction = 0.5,        # subsampling fraction, 0.5 is probably best
+             train.fraction = 0.5,      # fraction of data for training, first train.fraction*N used for training
+             cv.folds=5,                # do 5-fold cross-validation
+             n.minobsinnode = 10)       # minimum total weight needed in each node

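Not from the demo output above, just a rough sketch of what I plan to try
next, assuming gbm1 was fit as shown (gbm.perf and predict are standard gbm
functions):

    # sketch only: pick the iteration with the lowest 5-fold CV error
    best.iter <- gbm.perf(gbm1, method = "cv")
    # predicted probabilities, using only the first best.iter trees
    p.hat <- predict(gbm1, newdata = data, n.trees = best.iter,
                     type = "response")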

On Mon, Apr 26, 2010 at 9:50 AM, Ridgeway, Greg <gr...@rand.org> wrote:

>  GBM implements boosted trees. It works for 0/1 outcomes, count outcomes,
> continuous outcomes and a few others. You do not need to combine rpart and
> gbm. Your best bet is to just load the package and run a demo:
> >demo(bernoulli)
>
>  ------------------------------
> *From:* Changbin Du [mailto:changb...@gmail.com]
> *Sent:* Monday, April 26, 2010 9:48 AM
> *To:* r-help@r-project.org
> *Cc:* Ridgeway, Greg
> *Subject:* R.GBM package
>
> Hi Greg,
>
> I am new to the GBM package. Can boosted decision trees be implemented with
> the 'gbm' package? Or can 'gbm' only be used for regression?
>
> If so, do I need to combine the rpart and gbm commands?
>
> Thanks so much!
>
>
>
> --
> Sincerely,
> Changbin
> --
>
>
>

