This analysis is based on scores and statistics from individual rounds across the five Bermuda Championships, 1,993 rounds in total.
Random Forest Regressor is an ensemble learning method that constructs many decision trees during training and outputs the average of their individual predictions. Combining many trees improves accuracy and robustness over a single decision tree.
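As an illustration, here is a minimal sketch of fitting a scikit-learn `RandomForestRegressor` to round-level data. The file name and the stat columns used as features are assumptions for the example, not the actual inputs used in this analysis.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Hypothetical file of the 1,993 rounds; column names are assumed for illustration.
rounds = pd.read_csv("bermuda_rounds.csv")
features = ["driving_distance", "driving_accuracy",
            "greens_in_regulation", "putts_per_round"]
X, y = rounds[features], rounds["score"]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Each tree is trained on a bootstrap sample; the forest averages their predictions.
model = RandomForestRegressor(n_estimators=500, random_state=42)
model.fit(X_train, y_train)
print("R^2 on held-out rounds:", model.score(X_test, y_test))
```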
Feature importance is a technique for interpreting a machine learning model: it assigns each feature a score that quantifies how much that feature contributes to the model's predictions.
In a Random Forest, the importance of a feature is computed by looking at how much the feature decreases the impurity (e.g., variance for regression tasks) across all the trees in the forest. The more a feature decreases the impurity, the more important it is considered.
The importance scores of all features are then normalized and expressed as percentages, showing each feature's relative contribution to the prediction task.
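With scikit-learn, those percentages can be read straight off a fitted forest, as in the sketch below; it reuses the assumed `model` and `features` objects from the previous example. The impurity-based `feature_importances_` already sum to 1, so scaling by 100 gives the relative percentages.

```python
import pandas as pd

# `model` and `features` are the assumed objects from the fitting sketch above.
importance = pd.Series(model.feature_importances_, index=features)

# Impurity-based importances sum to 1; scale to percentages and rank them.
relative_importance = (importance * 100).sort_values(ascending=False)
print(relative_importance.round(1))
```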
Features with high relative importance percentages have a strong impact on the model's predictions. They are crucial for accurate predictions and indicate key areas where performance matters most.
Features with low relative importance have a minimal impact on the model's predictions. While they can still contribute, they are less critical.
The table below shows the top-5 ranked players and their average predicted scores across the different Random Forest models above (a sketch of how such averages could be assembled appears after the table).
| Rank | Surname | First name | Average Predicted Score (strokes per round) |
|---|---|---|---|
| 1 | Hughes | Mackenzie | 68.74 |
| 2 | Springer | Hayden | 69.00 |
| 3 | Noh | Seung-Yul | 69.03 |
| 4 | Haas | Bill | 69.16 |
| 5 | Norlander | Henrik | 69.25 |
Predicted scores for all players can be found here.
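For illustration only, the sketch below shows one way such per-player averages over several fitted models could be assembled; `models`, `X_all`, and `players` are hypothetical objects (a list of fitted forests, the feature matrix for every round, and the player name for each round) and do not correspond to the actual code behind the table.

```python
import pandas as pd

# `models`, `X_all`, and `players` are hypothetical inputs (see note above).
predictions = pd.DataFrame(
    {f"model_{i}": m.predict(X_all) for i, m in enumerate(models)}
)
predictions["player"] = players.values

# Average each round's prediction over the models, then over each player's
# rounds, and rank from the lowest (best) predicted score upwards.
round_mean = predictions.set_index("player").mean(axis=1)
average_score = round_mean.groupby(level="player").mean().sort_values()
print(average_score.head(5).round(2))
```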