Hi

Does anyone have experience using the Bayesian network package bnlearn
with very large datasets? In my experience, R doesn't handle very large
datasets well.

Is there a way to divide the dataset into pieces and learn the network
incrementally, one piece at a time? This would also help in case R crashes,
because I could save the network after learning each piece.

Thank you.
