
hyperparameter-optimization

Here are 603 public repositories matching this topic...

VishDev12 commented Jun 4, 2022

What happened + What you expected to happen

When initializing a Ray Trainer, we provide a logdir argument, and the __init__ method of the Trainer stores it as a logdir class variable.

Then, when creating a Trainable with Trainer.to_tune_trainable(), it in turn calls _create_tune_trainable(), which does not use self.logdir. So when tune_function is defined inside `_create_tu

bug good first issue P3 triage
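The bug described above is a closure-capture pattern: a factory function builds an inner trainable, and the trainer's logdir must be captured explicitly or the inner function never sees it. A minimal generic sketch, with illustrative names only (this is not Ray's actual implementation):

```python
# Hypothetical sketch of the pattern in the issue above: the factory must
# pull self.logdir into a local so the nested function can use it.
class Trainer:
    def __init__(self, logdir):
        self.logdir = logdir  # stored on the instance, easy to forget downstream

    def to_tune_trainable(self):
        logdir = self.logdir  # explicit capture; omitting this is the described bug

        def tune_function(config):
            # without the capture above, logdir would not be visible here
            return f"writing results under {logdir}"

        return tune_function

trainable = Trainer("/tmp/exp1").to_tune_trainable()
print(trainable({}))  # → writing results under /tmp/exp1
```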
nni
nzw0301 commented Jun 22, 2022

Motivation

optuna/optuna#3651 introduced chunking for overly long tags/metrics/params to stay within mlflow's API limits. Hopefully mlflow itself will handle this case via mlflow/mlflow#6052, so with the next mlflow version Optuna will no longer need the logic introduced by #3651.

Description

Revert all changes introduced by #3651 after the n

feature contribution-welcome good first issue optuna.integration
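The chunking idea referenced in this issue can be sketched as follows. This is an illustrative standalone example, not Optuna or MLflow code; `MAX_ENTITIES_PER_BATCH` and `log_chunked` are assumed names, and the limit of 100 is only a stand-in for whatever cap the backend enforces per request:

```python
# Hedged sketch: split a long parameter dict into fixed-size chunks so each
# logging call stays under an assumed per-request entity limit.
MAX_ENTITIES_PER_BATCH = 100  # illustrative limit, not MLflow's actual value

def chunked(items, size):
    """Yield successive fixed-size chunks from a list."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def log_chunked(params, log_batch):
    """Send `params` to `log_batch` in pieces that respect the limit."""
    for chunk in chunked(list(params.items()), MAX_ENTITIES_PER_BATCH):
        log_batch(dict(chunk))

# Example: 250 parameters arrive as batches of 100, 100, and 50.
batches = []
log_chunked({f"p{i}": i for i in range(250)}, batches.append)
print([len(b) for b in batches])  # → [100, 100, 50]
```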
xmyhhh commented Jun 27, 2022

The program throws an error at runtime; how can this error be solved?

[ERROR] [2022-06-27 14:13:51,391:asyncio.events] 
Traceback (most recent call last):
  File "/home/xumaoyuan/.virtualenvs/lib/python3.8/site-packages/distributed/utils.py", line 761, in wrapper
    return await func(*args, **kwargs)
  File "/home/xumaoyuan/.virtualenvs/lib/python3.8/site-packages/distributed/clie
mljar-supervised
ViacheslavDanilov commented May 19, 2022

I trained models on Windows, then tried to use them on Linux, but I could not load them due to incorrect path joining. During model loading, I got a learner_path in the format experiments_dir/model_1/100_LightGBM\\learner_fold_0.lightgbm: the backslashes were incorrectly concatenated with the rest of the path. In this regard, I would suggest adding something like `l

bug help wanted good first issue
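The cross-platform problem above comes from Windows-style backslash separators surviving inside a saved path string, which Linux treats as literal filename characters. One way to normalize such a path is via `pathlib`; `fix_separators` below is an illustrative helper, not mljar-supervised API:

```python
# Sketch: convert a path string saved on Windows into POSIX form so it can
# be joined and opened on Linux. PureWindowsPath accepts both / and \ as
# separators; as_posix() re-emits the path with forward slashes only.
from pathlib import PureWindowsPath

def fix_separators(path: str) -> str:
    """Return `path` with all separators normalized to forward slashes."""
    return PureWindowsPath(path).as_posix()

learner_path = "experiments_dir/model_1/100_LightGBM\\learner_fold_0.lightgbm"
print(fix_separators(learner_path))
# → experiments_dir/model_1/100_LightGBM/learner_fold_0.lightgbm
```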

Notes, programming assignments and quizzes from all courses within the Coursera Deep Learning specialization offered by deeplearning.ai: (i) Neural Networks and Deep Learning; (ii) Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization; (iii) Structuring Machine Learning Projects; (iv) Convolutional Neural Networks; (v) Sequence Models

  • Updated May 28, 2022
  • Jupyter Notebook
Gradient-Free-Optimizers

A list of high-quality (newest) AutoML works and lightweight models, including: (1) Neural Architecture Search; (2) Lightweight Structures; (3) Model Compression, Quantization and Acceleration; (4) Hyperparameter Optimization; (5) Automated Feature Engineering.

  • Updated Jun 19, 2021

Robyn is an experimental, automated, open-source Marketing Mix Modeling (MMM) package from Facebook Marketing Science. It uses various machine learning techniques (Ridge regression, a multi-objective evolutionary algorithm for hyperparameter optimisation, gradient-based optimisation for budget allocation, etc.) to quantify media channel efficiency and effectiveness and to explore adstock rates and saturation curves. It is built for granular datasets with many independent variables and is therefore especially suitable for digital and direct-response advertisers with rich datasets.

  • Updated Jul 8, 2022
  • R
Neuraxle
