Can Gradient Boosting Learn Simple Arithmetic?

A rule of thumb I heard from a fellow Kaggle Grandmaster years ago: GBMs can approximate these interactions, but if they are very strong, we should add them explicitly as another column in our input matrix. It's time to put this rule to the test, run an experiment, and shed some light on how effective this class of models really is when dealing with arithmetic interactions. Let's use simple two-variable interactions for this experiment and visually inspect how the model does.

Gaussian noise: mean 0, standard deviation 0.5

With this amount of noise we see that the model can approximate the interactions, even division and multiplication. Now, if you really don’t want to create the interactions, here are some ideas that may help the model approximate it better:

Adding more examples
Tuning the hyperparameters
Using a model better suited to the known form of the interaction

Source: mariofilho.com