
hyperparameter-optimization

Here are 613 public repositories matching this topic...

nni
not522 commented Jul 27, 2022

Motivation

Optuna's unit tests raise many warnings, which makes it hard to recognize unexpected warning messages.

Description

We can filter expected warnings using @pytest.mark.filterwarnings or warnings.simplefilter("ignore"). Some warnings may be unexpected or unsuitable; in such cases we should fix the tests or Optuna's code.

This issue is contribution-welcome. We welcome any contributions.

test contribution-welcome good first issue
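The two filtering approaches mentioned in the issue can be sketched as follows (a minimal illustration, not Optuna's actual test code; the test names here are hypothetical):

```python
import warnings

import pytest


# Per-test filter via a pytest marker, as the issue suggests:
@pytest.mark.filterwarnings("ignore::DeprecationWarning")
def test_expected_deprecation():
    warnings.warn("old API", DeprecationWarning)  # suppressed by the marker


# Programmatic filter inside the test body:
def test_simplefilter():
    with warnings.catch_warnings():
        warnings.simplefilter("ignore")
        warnings.warn("noisy", UserWarning)  # ignored, and does not leak out
```

The marker scopes the filter to a single test, which keeps genuinely unexpected warnings visible elsewhere in the suite.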
mljar-supervised
ViacheslavDanilov commented May 19, 2022

I trained models on Windows and then tried to use them on Linux, but I could not load them due to incorrect path joining. During model loading I got learner_path in the following format: experiments_dir/model_1/100_LightGBM\\learner_fold_0.lightgbm. The backslashes were incorrectly concatenated with the rest of the path. In this regard, I would suggest adding something like `l

bug help wanted good first issue
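One portable way to handle such mixed-separator paths is to round-trip them through pathlib's pure path classes (a minimal sketch, assuming the stored path may contain Windows backslashes; normalize_learner_path is a hypothetical helper, not part of mljar-supervised's API):

```python
from pathlib import PurePosixPath, PureWindowsPath


def normalize_learner_path(stored_path: str) -> str:
    """Convert a path that may have been written on Windows
    (backslash separators) into a POSIX-style path."""
    # PureWindowsPath accepts both "/" and "\\" as separators,
    # so round-tripping through it normalizes either form.
    return str(PurePosixPath(*PureWindowsPath(stored_path).parts))


p = normalize_learner_path("experiments_dir/model_1/100_LightGBM\\learner_fold_0.lightgbm")
# -> "experiments_dir/model_1/100_LightGBM/learner_fold_0.lightgbm"
```

Writing paths with os.path.join produces OS-specific separators, which is why a model saved on Windows can break on Linux; normalizing at load time sidesteps that.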
MichalChromcak commented Apr 1, 2022

I published a new v0.1.12 release of HCrystalBall, which updates some package dependencies and fixes some bugs in cross-validation.

Should the original pin for 0.1.10 be updated? Unfortunately I won't have time soon to submit a PR for this.

good first issue dependencies
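Updating such a pin is typically a one-line change in the consuming project's dependency specification (a hypothetical requirements-style fragment; the actual file and pin location are not shown in this excerpt):

```
# before: hcrystalball==0.1.10
hcrystalball==0.1.12
```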

Notes, programming assignments and quizzes from all courses within the Coursera Deep Learning specialization offered by deeplearning.ai: (i) Neural Networks and Deep Learning; (ii) Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization; (iii) Structuring Machine Learning Projects; (iv) Convolutional Neural Networks; (v) Sequence Models

  • Updated May 28, 2022
  • Jupyter Notebook
Gradient-Free-Optimizers

A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.

  • Updated Jun 19, 2021

Robyn is an experimental, automated and open-source Marketing Mix Modeling (MMM) package from Facebook Marketing Science. It uses various machine learning techniques (ridge regression, a multi-objective evolutionary algorithm for hyperparameter optimisation, gradient-based optimisation for budget allocation, etc.) to measure media channel efficiency and effectiveness and to explore adstock rates and saturation curves. It's built for granular datasets with many independent variables and is therefore especially suitable for digital and direct-response advertisers with rich datasets.

  • Updated Aug 3, 2022
  • R
Neuraxle
