Exercise: Hyperparameter tuning

Models in machine learning usually have multiple parameters that determine what the fitted model looks like; a simple example is the slope and intercept of a linear model. These parameters are estimated from the data. In addition, most learning algorithms have hyperparameters that are not estimated from the data but chosen by the user before training, such as the number of trees in a random forest. Finding the set of hyperparameters that works best for a particular dataset is called hyperparameter tuning.
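To make the distinction concrete, here is a minimal R sketch using the built-in `mtcars` dataset (chosen purely for illustration): the slope and intercept of a linear model are learned by the fitting procedure, while the number of trees of a random forest is set by the user.

```r
# Parameters: estimated from the data by the fitting procedure
fit <- lm(mpg ~ wt, data = mtcars)
coef(fit)  # intercept and slope, learned from the data

# Hyperparameters: chosen before fitting, e.g. for a random forest
# (requires the ranger package; num.trees and mtry are set, not learned)
rf <- ranger::ranger(mpg ~ ., data = mtcars, num.trees = 500, mtry = 3)
```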

In R, the caret package provides a convenient framework for hyperparameter tuning of various kinds of models.

Tasks:

  • Train your random forest model again, but this time use caret::train(). Make sure you set the parameter importance = "impurity" (why?).
  • Plot the variable importance of the model with caret::varImp.
  • Read up on the hyperparameters of random forest in the publication by Probst et al. (2019) and on the help page of ranger.
  • Tune one hyperparameter of your choice by creating a tuning grid with expand.grid() (a base R function, not part of caret) and passing it to the tuneGrid parameter of caret::train().
  • Plot the resulting model object. What do you see?
  • Now train a model where you tune all the hyperparameters.
  • Save the model as an RDS file (e.g. with saveRDS()).
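The steps above can be sketched roughly as follows. This is a non-authoritative outline: the response variable `target` and the data frame `df` are placeholder names you should replace with your own. Note that for method = "ranger", caret expects the tuning grid to contain exactly the columns mtry, splitrule and min.node.size.

```r
library(caret)

# Train a random forest via caret; importance = "impurity" is passed
# through to ranger so that variable importance can be extracted later
rf_fit <- train(target ~ ., data = df,
                method = "ranger",
                importance = "impurity")

# Variable importance of the fitted model
plot(varImp(rf_fit))

# Tuning grid covering all three ranger hyperparameters caret exposes
grid <- expand.grid(mtry = c(2, 4, 6),
                    splitrule = c("variance", "extratrees"),
                    min.node.size = c(5, 10, 20))

rf_tuned <- train(target ~ ., data = df,
                  method = "ranger",
                  importance = "impurity",
                  tuneGrid = grid)

# Plotting the train object shows the performance metric
# across the hyperparameter combinations that were tried
plot(rf_tuned)

# Save the final model object as an RDS file
saveRDS(rf_tuned, "rf_tuned.rds")
```

To tune only a single hyperparameter, fix the other two grid columns at a single value each.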
