Kaggle Days in Paris CPMP talk
https://www.youtube.com/watch?v=VC8Jc9_lNoY&feature=youtu.be&t=1018
Don't overtune your parameters: do it once, maybe twice per competition, no more.
For XGBoost / LightGBM
Start with subsample=0.7 and leave the other values at their defaults.
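A minimal sketch of that starting configuration (parameter names follow the XGBoost/LightGBM sklearn-style API; everything besides `subsample` is deliberately left out so the library defaults apply):

```python
# Starting point from the talk: only subsample is set, nothing else.
start_params = {
    "subsample": 0.7,  # each tree sees 70% of the rows
    # all other hyperparameters intentionally left at library defaults
}

print(start_params)
```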
Play with min_child_weight: increase it if the train/validation gap is large.
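The decision rule above can be sketched as a small helper. The function name, the gap tolerance, and the doubling factor are all hypothetical choices for illustration, not from the talk:

```python
def next_min_child_weight(train_score, val_score, current,
                          gap_tol=0.05, factor=2):
    """Hypothetical heuristic: if the train/validation score gap exceeds
    a tolerance, the model is likely overfitting, so increase
    min_child_weight to make leaves harder to split off."""
    gap = train_score - val_score
    return current * factor if gap > gap_tol else current

# Large gap (0.95 vs 0.80): bump the parameter.
print(next_min_child_weight(0.95, 0.80, current=1))
# Small gap (0.82 vs 0.80): leave it alone.
print(next_min_child_weight(0.82, 0.80, current=1))
```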
Then tune max_depth or num_leaves.
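These two parameters are linked: a tree of depth d has at most 2**d leaves, so a common rule of thumb (an assumption added here, not stated in the talk) is to keep num_leaves below that bound when tuning both:

```python
def max_leaves_for_depth(max_depth):
    # A full binary tree of depth d has 2**d leaves; keeping
    # num_leaves below this bound caps model complexity.
    return 2 ** max_depth

# Hypothetical candidate setting, checked against the bound.
candidate = {"max_depth": 7, "num_leaves": 100}
print(candidate["num_leaves"] < max_leaves_for_depth(candidate["max_depth"]))
```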
Add regularization (L1/L2) if the LB score is well below CV.
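As a sketch of that last step: compare leaderboard and CV scores, and if the gap is large, strengthen L1/L2 regularization (XGBoost names: `reg_alpha`/`reg_lambda`). The helper name, tolerance, and the specific values 1.0 are illustrative assumptions:

```python
def add_regularization(params, lb_score, cv_score, tol=0.01):
    """Hypothetical sketch: if the leaderboard score falls well below CV,
    the model may be overfitting the CV folds, so add L1/L2 penalties."""
    if cv_score - lb_score > tol:
        # Illustrative values only; tune these rather than trusting 1.0.
        params = {**params, "reg_alpha": 1.0, "reg_lambda": 1.0}
    return params

# CV 0.82 vs LB 0.78: gap of 0.04 exceeds the tolerance, so regularize.
tuned = add_regularization({"subsample": 0.7}, lb_score=0.78, cv_score=0.82)
print(tuned)
```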