r - How to reduce the size of a randomForest object?


I am trying to predict a huge raster layer (34 million cells, 120+ layers) with a randomForest object. To do this, I use the clusterR function from the raster package. However, when I start predicting with the calculated randomForest object, it is loaded onto every parallel worker, so the processes combined need a lot of memory.
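For reference, the prediction step looks roughly like this (a minimal sketch only; the file name predictors.tif and the worker count are placeholders, not my actual setup, and rfo is the fitted model shown further below):

library(raster)
library(randomForest)

# Placeholder: RasterStack holding the 120+ predictor layers
s <- stack("predictors.tif")

beginCluster(4)   # start parallel workers; rfo gets copied to each of them
pred <- clusterR(s, raster::predict, args = list(model = rfo))
endCluster()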

Is it possible to reduce the size of the randomForest object without losing the model? Does anyone have experience with this?

I create the model like this:

library(randomForest)

set.seed(42)
df <- data.frame(class = sample(x = 1:3, size = 10000, replace = TRUE))
str(df)

# add 100 random predictor columns
for (i in 1:100){
  df <- cbind(df, runif(10000))
}

colnames(df) <- c("class", 1:100)

df$class <- as.factor(df$class)

rfo <- randomForest(x = df[, 2:ncol(df)],
                    y = df$class,
                    ntree = 500,
                    do.trace = 10)

object.size(rfo)
# 57110816 bytes
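One approach I am considering (a sketch of a commonly suggested idea, not something I have verified on the full model): predict.randomForest() mainly needs the $forest component, so the per-sample outputs stored alongside it could be set to NULL before the object is sent to the workers. Whether this is safe should be checked against the original predictions:

rfo_small <- rfo

# Drop components that scale with the training data and are not needed
# for predicting new data; the forest itself is kept untouched.
rfo_small$predicted <- NULL
rfo_small$votes     <- NULL
rfo_small$oob.times <- NULL
rfo_small$err.rate  <- NULL
rfo_small$confusion <- NULL
rfo_small$y         <- NULL

object.size(rfo_small)   # should be smaller than the original object

# Sanity check: the aggregated votes on new data should be unchanged.
stopifnot(identical(predict(rfo,       df[, 2:ncol(df)], type = "vote"),
                    predict(rfo_small, df[, 2:ncol(df)], type = "vote")))

For this toy example the savings would be modest because $forest dominates the size, but on large training sets the removed per-sample components can be substantial.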

