hyperparameter-optimization
Here are 603 public repositories matching this topic...
Motivation
optuna/optuna#3651 introduced chunking to split overly long tags/metrics/params and stay under mlflow's API limit. mlflow itself is expected to handle this case via mlflow/mlflow#6052, so with the next mlflow release Optuna will no longer need the logic introduced by #3651.
Description
Revert all changes introduced by #3651 after the next mlflow release.
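To make the motivation concrete, here is an illustrative sketch of the chunking idea from #3651: splitting an over-long value across several numbered keys so each piece stays under a per-value length limit. The names `chunk_value` and `MAX_LEN` are hypothetical, and this is not Optuna's actual implementation.

```python
MAX_LEN = 500  # assumed per-value limit, for illustration only

def chunk_value(key: str, value: str, max_len: int = MAX_LEN) -> dict:
    """Split an over-long value into numbered keys under the limit.

    Hypothetical helper sketching the workaround pattern: if the
    value fits, keep it as-is; otherwise emit key_0, key_1, ...
    """
    if len(value) <= max_len:
        return {key: value}
    n_chunks = (len(value) + max_len - 1) // max_len
    return {
        f"{key}_{i}": value[i * max_len:(i + 1) * max_len]
        for i in range(n_chunks)
    }
```

Once the backend accepts long values natively (the point of mlflow/mlflow#6052), this splitting step becomes unnecessary and can be reverted.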
The program throws an error at runtime; how can this error be solved?
[ERROR] [2022-06-27 14:13:51,391:asyncio.events]
Traceback (most recent call last):
  File "/home/xumaoyuan/.virtualenvs/lib/python3.8/site-packages/distributed/utils.py", line 761, in wrapper
    return await func(*args, **kwargs)
  File "/home/xumaoyuan/.virtualenvs/lib/python3.8/site-packages/distributed/clie

Add PECOS model to TabularPredictor.
If you'd like to work on this, please respond to this GitHub issue. It is recommended to follow the custom model tutorial when implementing the model in AutoGluon.
I trained models on Windows, then tried to use them on Linux; however, I could not load them due to incorrect path joining. During model loading, I got `learner_path` in the format `experiments_dir/model_1/100_LightGBM\\learner_fold_0.lightgbm`: the trailing Windows-style backslash separator was incorrectly concatenated with the rest of the path. In this regard, I would suggest adding something like `l
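One way to make such paths portable is to re-join any Windows-style separators using the current platform's convention. Below is a minimal sketch; `normalize_model_path` is a hypothetical helper name, not part of the library in question.

```python
from pathlib import PurePosixPath, PureWindowsPath

def normalize_model_path(path: str) -> str:
    # Hypothetical helper: PureWindowsPath treats both "/" and "\\"
    # as separators, so re-joining its parts as a POSIX path removes
    # any backslashes left over from a model trained on Windows.
    return str(PurePosixPath(*PureWindowsPath(path).parts))

print(normalize_model_path(
    "experiments_dir/model_1/100_LightGBM\\learner_fold_0.lightgbm"
))
# experiments_dir/model_1/100_LightGBM/learner_fold_0.lightgbm
```

Saving paths with `os.path.join` (or `pathlib.Path`) rather than concatenating separator strings avoids the problem at the source.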
`ray.init(n_cpus=16)` should be `ray.init(num_cpus=16)` on the documentation page
https://microsoft.github.io/FLAML/docs/Use-Cases/Task-Oriented-AutoML#parallel-tuning
Describe the bug
Sometimes we initialize using a class that is not our parent, i.e. calling another class's `__init__` directly instead of `super().__init__()`. We should not do this.
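To illustrate the anti-pattern in a minimal, library-independent sketch (the class names here are invented for the example):

```python
class Base:
    def __init__(self):
        self.ready = True

class Unrelated:
    def __init__(self):
        self.other = True

class Child(Base):
    def __init__(self):
        # Anti-pattern: Unrelated.__init__(self) would run, but it
        # couples Child to a class outside its inheritance chain and
        # skips Base's setup entirely.
        super().__init__()  # correct: delegate to the actual parent
```

Using `super()` keeps initialization tied to the class's real MRO, so refactoring the hierarchy does not silently break setup.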
What happened + What you expected to happen
When initializing a Ray Trainer, we provide a `logdir` argument, and the `__init__` method of the Trainer stores it as a `logdir` class variable. Then, when creating a Trainable with `Trainer.to_tune_trainable()`, it in turn calls `_create_tune_trainable()`, which does not use `self.logdir`. So when `tune_function` is defined inside `_create_tune_trainable()`, it never sees the provided `logdir`.
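The general pattern behind this bug is a method that builds a closure while ignoring an instance attribute. The following is a minimal sketch of the reported pattern and one fix idea (capture the attribute explicitly), not Ray's actual code:

```python
class Trainer:
    # Minimal stand-in for the class described in the issue.
    def __init__(self, logdir: str):
        self.logdir = logdir  # stored here ...

    def to_tune_trainable(self):
        # ... and must be captured here, otherwise the inner
        # function silently ignores the user-supplied directory.
        logdir = self.logdir
        def tune_function(config):
            return {"logdir": logdir, **config}
        return tune_function
```

Capturing the value at closure-creation time makes the dependency explicit and keeps the Trainable consistent with the Trainer it came from.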