HUAWEI Noah's Ark Lab
Repositories
-
AdderNet
Code for the paper "AdderNet: Do We Really Need Multiplications in Deep Learning?"
-
ghostnet
[CVPR2020] Surpassing MobileNetV3: "GhostNet: More Features from Cheap Operations"
-
vega
AutoML toolchain
-
bolt
Bolt is a deep learning library with high performance and heterogeneous flexibility.
-
noah-research
Noah Research
-
multi_hyp_cc
[CVPR2020] A Multi-Hypothesis Approach to Color Constancy
-
trustworthyAI
Projects related to trustworthy AI
-
Disout
Code for the AAAI 2020 paper "Beyond Dropout: Feature Map Distortion to Regularize Deep Neural Networks" (Disout).
-
Data-Efficient-Model-Compression
Data-efficient model compression
-
Pretrained-Language-Model
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
-
CARS
[CVPR2020] CARS: Continuous Evolution for Efficient Neural Architecture Search
-
BHT-ARIMA
Code for the paper "Block Hankel Tensor ARIMA for Multiple Short Time Series Forecasting" (AAAI 2020)
-
GAN-pruning
A PyTorch implementation of "Co-Evolutionary Compression for Unpaired Image Translation" (ICCV 2019).
-
Full-Stack-Filters
PyTorch code for the paper "Full-Stack Filters to Build Minimum Viable CNNs"
-
Versatile-Filters
PyTorch code for the paper "Learning Versatile Filters for Efficient Convolutional Neural Networks" (NeurIPS 2018)
-
BGCN
A TensorFlow implementation of "Bayesian Graph Convolutional Neural Networks" (AAAI 2019).
-
LegoNet
A PyTorch implementation of "LegoNet: Efficient Convolutional Neural Networks with Lego Filters" (ICML 2019).
-
streamDM
Stream Data Mining Library for Spark Streaming
-
streamDM-Cpp
Stream machine learning in C++
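The AdderNet entry above refers to the idea of replacing the multiply-accumulate in convolution with an additions-only similarity measure (negative L1 distance between each input patch and the filter). The following is a minimal single-channel NumPy sketch of that idea, not the lab's implementation; the function name `adder_conv2d` and the toy 2-D shapes are illustrative assumptions.

```python
import numpy as np

def adder_conv2d(x, w):
    """Toy "adder" convolution: each output scores a patch by its
    negative L1 distance to the filter, using no multiplications,
    instead of the usual multiply-accumulate.
    x: (H, W) input, w: (k, k) filter -> (H-k+1, W-k+1) output."""
    k = w.shape[0]
    H, W = x.shape
    out = np.empty((H - k + 1, W - k + 1))
    for i in range(H - k + 1):
        for j in range(W - k + 1):
            patch = x[i:i + k, j:j + k]
            # Sum of absolute differences; 0 means a perfect match.
            out[i, j] = -np.abs(patch - w).sum()
    return out
```

Because every output is a non-positive distance score, a patch identical to the filter scores exactly 0, which is the maximum attainable response.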