
hyperparameter-optimization

Here are 331 public repositories matching this topic...

A list of high-quality, recent AutoML works and lightweight models, covering (1) neural architecture search, (2) lightweight structures, (3) model compression, quantization, and acceleration, (4) hyperparameter optimization, and (5) automated feature engineering.

  • Updated Apr 16, 2020
bcyphers commented Jan 31, 2018

If enter_data() is called with the same train_path twice in a row and the data itself hasn't changed, a new Dataset does not need to be created.

We should add a column that stores a hash of the actual data. When a Dataset is about to be created, if its metadata and data hash exactly match an existing Dataset, nothing should be added to the ModelHub database and the existing
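The deduplication idea above could be sketched as follows. This is a minimal illustration, not the project's actual schema: `file_hash` and `get_or_create_dataset` are hypothetical helpers, and the dict stands in for a hash column and lookup in the ModelHub database.

```python
import hashlib

def file_hash(path, chunk_size=65536):
    """Hash the raw bytes of a data file so unchanged data can be detected."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def get_or_create_dataset(train_path, existing_datasets):
    """Return an existing dataset record if its data hash matches; else create one.

    `existing_datasets` maps data-hash -> dataset record, standing in for a
    database query against a new hash column.
    """
    digest = file_hash(train_path)
    if digest in existing_datasets:
        return existing_datasets[digest]  # reuse: add nothing to the database
    record = {"train_path": train_path, "data_hash": digest}
    existing_datasets[digest] = record    # stand-in for an INSERT
    return record
```

Hashing the file contents rather than the path means a re-run with identical data reuses the stored record even if the metadata lookup alone would have passed.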

mljar-supervised
spamz23 commented Aug 26, 2020

I think it would be interesting to add a feature that exports the best model behind a scikit-learn wrapper. That would allow integrating the best AutoML model into a scikit-learn workflow.
Most of the models AutoML uses already come from scikit-learn, and those that don't do provide scikit-learn wrappers, so I think this would be easy to implement.
Is there anything that makes this feature
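The wrapper proposed above might look something like this minimal sketch. `AutoMLClassifier` and `automl_factory` are hypothetical names; the only assumption is that the wrapped AutoML object exposes `fit` and `predict`, which mljar-supervised's AutoML class does.

```python
from sklearn.base import BaseEstimator, ClassifierMixin

class AutoMLClassifier(BaseEstimator, ClassifierMixin):
    """Hypothetical scikit-learn-compatible wrapper around an AutoML object.

    `automl_factory` is any zero-argument callable returning an object with
    fit(X, y) and predict(X), e.g. a configured mljar-supervised AutoML.
    """

    def __init__(self, automl_factory=None):
        # Stored as-is so get_params()/set_params() and cloning work.
        self.automl_factory = automl_factory

    def fit(self, X, y):
        self.automl_ = self.automl_factory()  # fitted state gets a trailing underscore
        self.automl_.fit(X, y)
        return self

    def predict(self, X):
        return self.automl_.predict(X)
```

Because the wrapper follows the estimator conventions (parameters set in `__init__`, fitted attributes suffixed with `_`), it should drop into scikit-learn pipelines, `cross_val_score`, and grid searches like any other classifier.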
