AdaNet


AdaNet is a lightweight TensorFlow-based framework for automatically learning high-quality models with minimal expert intervention. AdaNet builds on recent AutoML efforts to be fast and flexible while providing learning guarantees. Importantly, AdaNet provides a general framework for not only learning a neural network architecture, but also for learning to ensemble to obtain even better models.

This project is based on the AdaNet algorithm, presented in “AdaNet: Adaptive Structural Learning of Artificial Neural Networks” at ICML 2017, for learning the structure of a neural network as an ensemble of subnetworks.

AdaNet has the following goals:

  • Ease of use: Provide familiar APIs (e.g. Keras, Estimator) for training, evaluating, and serving models.
  • Speed: Scale with available compute and quickly produce high quality models.
  • Flexibility: Allow researchers and practitioners to extend AdaNet to novel subnetwork architectures, search spaces, and tasks.
  • Learning guarantees: Optimize an objective that offers theoretical learning guarantees.
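The "learning guarantees" goal refers to the objective optimized by the AdaNet algorithm. Roughly following the notation of the ICML 2017 paper, it trades off empirical loss against the complexity of the subnetworks used in the ensemble:

```latex
F(w) = \frac{1}{m}\sum_{i=1}^{m}\Phi\Big(1 - y_i \sum_{k} w_k\, h_k(x_i)\Big)
     + \sum_{k} \big(\lambda\, r_k + \beta\big)\, w_k
```

where the $h_k$ are candidate subnetworks, $w_k \ge 0$ are their mixture weights, $r_k$ is a complexity measure of the family that $h_k$ was drawn from (e.g. its Rademacher complexity), and $\lambda, \beta$ are hyperparameters. Penalizing complex subnetworks more heavily is what yields the theoretical generalization bounds; see the paper for the precise statement.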

The following animation shows AdaNet adaptively growing an ensemble of neural networks. At each iteration, it measures the ensemble loss for each candidate, and selects the best one to move on to the next iteration. At subsequent iterations, the blue subnetworks are frozen, and only yellow subnetworks are trained:
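The loop described above can be sketched in a few lines of pure Python. This is illustrative only, not the real API: each "candidate" is just a number, training is elided, and the ensemble loss is the squared distance of the ensemble's averaged output from a target, stand-ins chosen so the greedy selection logic is visible.

```python
TARGET = 10.0  # hypothetical label the toy ensemble tries to match

def ensemble_loss(ensemble):
    """Loss of a toy ensemble: squared error of its averaged output."""
    mean_output = sum(ensemble) / len(ensemble)
    return (mean_output - TARGET) ** 2

def grow_ensemble(candidates, num_iterations):
    """At each iteration, evaluate every candidate appended to the frozen
    ensemble and keep the one yielding the lowest ensemble loss."""
    ensemble = []  # previously selected ("frozen") subnetworks
    for _ in range(num_iterations):
        best = min(candidates, key=lambda c: ensemble_loss(ensemble + [c]))
        ensemble.append(best)  # freeze the winner; later rounds build on it
    return ensemble
```

For example, with candidates `[4.0, 9.0, 12.0]` the first pick is `9.0` (its mean output is closest to the target), and the second is `12.0`, since the averaged ensemble `[9.0, 12.0]` beats any other pair.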

(Animation: AdaNet adaptively growing an ensemble of subnetworks.)

AdaNet was first announced on the Google AI research blog: "Introducing AdaNet: Fast and Flexible AutoML with Learning Guarantees".

This is not an official Google product.

Features

AdaNet provides the following AutoML features:

Example

A simple example of learning to ensemble linear and neural network models:

import adanet
import tensorflow as tf

# Define the model head for computing loss and evaluation metrics.
head = tf.contrib.estimator.multi_class_head(n_classes=10)

# Feature columns define how to process examples.
feature_columns = ...

# Learn to ensemble linear and neural network models.
estimator = adanet.AutoEnsembleEstimator(
    head=head,
    candidate_pool={
        "linear":
            tf.estimator.LinearEstimator(
                head=head,
                feature_columns=feature_columns,
                optimizer=tf.train.FtrlOptimizer(...)),
        "dnn":
            tf.estimator.DNNEstimator(
                head=head,
                feature_columns=feature_columns,
                optimizer=tf.train.ProximalAdagradOptimizer(...),
                hidden_units=[1000, 500, 100])},
    max_iteration_steps=50)

estimator.train(input_fn=train_input_fn, steps=100)
metrics = estimator.evaluate(input_fn=eval_input_fn)
predictions = estimator.predict(input_fn=predict_input_fn)

Getting Started

To get you started:

Requirements

Requires Python 2.7, 3.4, 3.5, or 3.6.

adanet depends on bug fixes and enhancements not present in TensorFlow releases prior to 1.9. You must install or upgrade your TensorFlow package to at least 1.9:

$ pip install "tensorflow>=1.9.0"

Installing with Pip

You can use the pip package manager to install the official adanet package from PyPI:

$ pip install adanet

Installing from Source

To install from source, first install Bazel by following its installation instructions.

Next, clone the adanet repository:

$ git clone https://github.com/tensorflow/adanet
$ cd adanet

From the adanet root directory run the tests:

$ bazel test -c opt //...

Once you have verified that the tests pass, install adanet from source as a pip package.

You are now ready to experiment with adanet.

import adanet

Citing this Work

If you use this AdaNet library for academic research, you are encouraged to cite the following:

@misc{weill2018adanet,
  author    = {Charles Weill and Javier Gonzalvo and Vitaly Kuznetsov and
    Scott Yang and Scott Yak and Hanna Mazzawi and Eugen Hotaj and
    Ghassen Jerfel and Vladimir Macko and Mehryar Mohri and Corinna Cortes},
  title     = {AdaNet: Fast and flexible AutoML with learning guarantees},
  year      = {2018},
  publisher = {GitHub},
  journal   = {GitHub repository},
  howpublished = {\url{https://github.com/tensorflow/adanet}},
}

License

AdaNet is released under the Apache License 2.0.
