gbm
Summary
LightGBM supports aliases for most parameters. For example, you can use either bagging_fraction or sub_row to mean "fraction of rows sampled during bagging".
The new dask API added in #3515 should be changed to support such aliases for the parameters it references.
See these comments for details on a few parameters that need to be updated to use aliases.
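As a rough illustration of what alias support involves, here is a minimal sketch of alias resolution in plain Python. The alias table and helper below are hypothetical (LightGBM keeps its real alias list in its parameter documentation and internal config code); this only shows the lookup pattern the dask API would need:

```python
# Hypothetical alias table; the authoritative list lives in LightGBM's parameter docs.
PARAM_ALIASES = {
    "bagging_fraction": {"sub_row", "subsample", "bagging"},
    "num_iterations": {"num_iteration", "n_iter", "num_round", "n_estimators"},
}

def resolve_param(params, main_name):
    """Return the value for main_name, accepting any known alias of it."""
    if main_name in params:
        return params[main_name]
    for alias in PARAM_ALIASES.get(main_name, ()):
        if alias in params:
            return params[alias]
    return None

# A user passing the alias gets the same behavior as the main parameter name.
print(resolve_param({"sub_row": 0.8}, "bagging_fraction"))
print(resolve_param({"bagging_fraction": 0.5}, "bagging_fraction"))
```

With a lookup like this, code that hard-codes one spelling (e.g. only checking `params["bagging_fraction"]`) would no longer silently ignore a user who passed `sub_row`.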
Hi!
There is currently no way to get the fitted models after cross-validation, because catboost.cv returns only evaluation metric scores. Other popular ML libraries offer such an option in some form.
For LightGBM there is an optional argument return_cvbooster:

cv_results = lgb.cv(params, train_set, show_stdv=False, stratified=True, return_cvbooster=True)
boosters = cv_results["cvbooster"].boosters  # the models fitted on each fold
Many Python projects, such as dask and optuna, now use Python type hints. With the Python package of xgboost gaining more and more features, we should also adopt mypy as a safeguard against some type errors and for better code documentation.
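To illustrate the kind of error mypy catches, here is a small annotated helper. The function is illustrative only, not part of xgboost's actual API:

```python
from typing import Dict, List, Optional, Union

ParamValue = Union[float, int, List[int]]

def make_params(
    learning_rate: float,
    max_depth: int,
    monotone_constraints: Optional[List[int]] = None,
) -> Dict[str, ParamValue]:
    """Build a parameter dict; mypy verifies callers pass the right types."""
    params: Dict[str, ParamValue] = {
        "learning_rate": learning_rate,
        "max_depth": max_depth,
    }
    if monotone_constraints is not None:
        params["monotone_constraints"] = monotone_constraints
    return params

# Running `mypy` on this file would reject, before runtime:
#     make_params(learning_rate="0.1", max_depth=6)
# because "0.1" is a str, not a float.
print(make_params(0.1, 6))
```

The annotations also double as documentation: a reader sees the accepted types without digging through the function body.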