Repositories
- `dex-lang`: Research language for array processing in the Haskell/ML family
- `text-to-text-transfer-transformer`: Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer"
- `open-covid-19-data`: Open-source aggregation pipeline for public COVID-19 data, including hospitalization/ICU/ventilator numbers for many countries
- `noisystudent`: Code for Noisy Student Training. https://arxiv.org/abs/1911.04252
- `simclr`: SimCLRv2: Big Self-Supervised Models are Strong Semi-Supervised Learners
- `xtreme`: XTREME is a benchmark for evaluating the cross-lingual generalization ability of pre-trained multilingual models; it covers 40 typologically diverse languages and includes nine tasks
- `computation-thru-dynamics`: Understanding computation in artificial and biological recurrent networks through the lens of dynamical systems
- `electra`: ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators
- `tiny-differentiable-simulator`: Tiny Differentiable Simulator is a header-only C++ physics library with zero dependencies
- `bert`: TensorFlow code and pre-trained models for BERT
- `tensor2robot`: Distributed machine learning infrastructure for large-scale robotics research
- `reassessed-imagenet`: Labels and other data for the paper "Are we done with ImageNet?"
- `fast-soft-sort`: Fast Differentiable Sorting and Ranking
- `tapas`: End-to-end neural table-text understanding models
- `albert`: ALBERT: A Lite BERT for Self-supervised Learning of Language Representations
- `arxiv-latex-cleaner`: arXiv LaTeX Cleaner: easily clean the LaTeX code of your paper for submission to arXiv
- `batch_rl`: Offline Reinforcement Learning (a.k.a. Batch Reinforcement Learning) on Atari 2600 games
- `seed_rl`: SEED RL: Scalable and Efficient Deep-RL with Accelerated Central Inference. Implements the IMPALA and R2D2 algorithms in TF2 with SEED's architecture
- `morph-net`: Fast & Simple Resource-Constrained Learning of Deep Network Structure
- `language`: Shared repository for open-sourced projects from the Google AI Language team
- `big_transfer`: Official repository for the paper "Big Transfer (BiT): General Visual Representation Learning"
- `neural-structural-optimization`: Neural reparameterization improves structural optimization