Here are 70 public repositories matching this topic.
- Awesome Knowledge Distillation
- NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego · Updated Mar 27, 2020 · Python
- Improving Convolutional Networks via Attention Transfer (ICLR 2017) · Updated Jul 11, 2018 · Jupyter Notebook
- Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab. · Updated Apr 26, 2020 · Python
- A PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments with flexibility · Updated Jan 28, 2020 · Python
- PaddleSlim is an open-source library for deep model compression and architecture search. · Updated Jun 23, 2020 · Python
- Knowledge distillation papers
- Data Efficient Model Compression · Updated May 8, 2020 · Python
- (No description provided) · Updated Dec 15, 2019 · Python
- A treasure chest for image classification powered by PaddlePaddle · Updated Jun 26, 2020 · Python
- (No description provided) · Updated Mar 27, 2020 · Cuda
- Knowledge Distillation: Revisiting Knowledge Distillation via Label Smoothing Regularization (CVPR 2020) · Updated May 11, 2020 · Python
- Official PyTorch implementation of Relational Knowledge Distillation (CVPR 2019) · Updated Dec 9, 2019 · Python
- Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods, with more to come) · Updated Nov 21, 2019 · Python
- Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019) · Updated Jun 23, 2020 · Python
- Infrastructures™ for Machine Learning Training/Inference in Production. · Updated Oct 3, 2019 · Python
- An Acceleration System for Large-scale Outlier Detection (Anomaly Detection) · Updated Jun 15, 2020 · Python
- A large-scale study of Knowledge Distillation. · Updated Apr 19, 2020 · Python
- MicroExpNet: An Extremely Small and Fast Model for Expression Recognition from Frontal Face Images · Updated Jan 16, 2020 · Python
- Knowledge Distillation using TensorFlow · Updated Aug 12, 2019 · Python
- Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019) · Updated Sep 9, 2019 · Python
- [CVPR 2020] Dynamic Hierarchical Mimicking Towards Consistent Optimization Objectives · Updated Mar 25, 2020 · Python
- PyTorch code for the CVPR 2020 paper "Collaborative Distillation for Ultra-Resolution Universal Style Transfer" · Updated May 10, 2020 · Python
- (No description provided) · Updated Jul 5, 2018 · Jupyter Notebook
- Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019) · Updated Sep 9, 2019 · Python
- Zero-Shot Knowledge Distillation in Deep Networks (ICML 2019) · Updated Jun 20, 2019 · Python
- Multi-Label Image Classification via Knowledge Distillation from Weakly-Supervised Detection (ACM MM 2018) · Updated Aug 8, 2019 · Python
- Code for recent knowledge distillation algorithms and benchmark results, built on the TF 2.0 low-level API · Updated Nov 28, 2019 · Python
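Most of the PyTorch and TensorFlow repositories above implement some variant of the classic distillation objective of Hinton et al. (2015): the student network is trained to match the teacher's temperature-softened output distribution. A minimal pure-Python sketch of that loss follows; the function names are illustrative and not taken from any repository listed here.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: a higher temperature softens the
    # distribution, exposing the teacher's "dark knowledge" about
    # similarities between classes.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across
    # temperatures, as suggested by Hinton et al.
    p = softmax(teacher_logits, temperature)  # soft targets (teacher)
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl
```

In practice this soft-target term is combined with the ordinary cross-entropy loss on the hard labels, weighted by a mixing coefficient; the repositories above differ mainly in what additional signal (attention maps, relations, activation boundaries) they distill alongside it.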