# distillation
Here are 90 public repositories matching this topic...
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
deep-neural-networks
jupyter-notebook
pytorch
regularization
pruning
quantization
group-lasso
distillation
onnx
truncated-svd
network-compression
pruning-structures
early-exit
automl-for-compression
Awesome Knowledge Distillation
deep-learning
knowledge-distillation
teacher-student
knowledge-transfer
co-training
model-compression
distillation
kd
knowldge-distillation
distillation-model
model-distillation
-
Updated
Mar 12, 2022
Awesome Knowledge-Distillation: a categorized collection of knowledge distillation papers (2014-2021).
-
Updated
May 23, 2022
A PyTorch-based knowledge distillation toolkit for natural language processing
-
Updated
Apr 25, 2022 - Python
PyTorch implementation of various knowledge distillation (KD) methods.
knowledge-distillation
teacher-student
knowledge-transfer
model-compression
distillation
kd
kd-methods
-
Updated
Nov 25, 2021 - Python
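Most of the KD methods these toolkits implement build on the classic soft-target loss of Hinton et al. (2015): the student is trained to match the teacher's temperature-softened output distribution. A minimal pure-Python sketch (function names and the example logits are illustrative, not any listed repo's API):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    """KL(teacher || student) on softened distributions, scaled by T^2
    so gradients keep a comparable magnitude across temperatures."""
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# A student that matches the teacher exactly incurs zero loss;
# a mismatched one is penalized.
print(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # 0.0
print(kd_loss([0.1, 1.0, 2.0], [2.0, 1.0, 0.1]))  # > 0
```

In practice this term is combined with the ordinary cross-entropy on the hard labels, weighted by a hyperparameter alpha.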
PyTorch implementation of various methods for continual learning (XdG, EWC, online EWC, SI, LwF, GR, GR+distill, RtF, ER, A-GEM, iCaRL).
deep-learning
artificial-neural-networks
replay
incremental-learning
variational-autoencoder
generative-models
lifelong-learning
distillation
continual-learning
elastic-weight-consolidation
replay-through-feedback
icarl
gradient-episodic-memory
-
Updated
Jul 15, 2021 - Python
A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and models specialized for semantic similarity.
text-classification
corpus
dataset
chinese
semantic-similarity
pretrained-models
sentence-classification
albert
bert
sentence-analysis
distillation
sentence-pairs
roberta
-
Updated
Jul 8, 2020 - Python
MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks
pytorch
imagenet
model-architecture
compression-algorithm
pre-trained
meal
imagenet-dataset
distillation
resnet50
mobilenetv3
efficientnet
distillation-model
-
Updated
Dec 24, 2021 - Python
MobileNetV2-YOLOv5s pruning and distillation, with ncnn and TensorRT deployment support. Ultra-light but better performance!
-
Updated
Jul 10, 2021 - Jupyter Notebook
A Python library for adversarial machine learning focusing on benchmarking adversarial robustness.
nes
pca
bim
benchmark-framework
evolutionary
spsa
boundary
adversarial-machine-learning
distillation
fgsm
adversarial-attacks
deepfool
adversarial-robustness
mi-fgsm
mmlda
hgd
-
Updated
Jun 22, 2022 - Python
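FGSM, one of the attacks listed above, perturbs an input by a single step in the sign of the loss gradient with respect to that input. A toy sketch on a hand-written logistic model, using the analytic gradient (the weights and helper names are illustrative, not this library's API):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fgsm_attack(x, w, y, eps=0.4):
    """One-step FGSM on a logistic model p(y=+1|x) = sigmoid(w . x).
    Moves x in the direction that increases the loss for true label y (+1/-1)."""
    margin = y * sum(wi * xi for wi, xi in zip(w, x))
    # Analytic gradient of -log sigmoid(y * w.x) with respect to x.
    coeff = -y * (1.0 - sigmoid(margin))
    grad = [coeff * wi for wi in w]
    sign = lambda g: (g > 0) - (g < 0)
    return [xi + eps * sign(gi) for xi, gi in zip(x, grad)]

w = [1.5, -2.0]   # toy model weights (hypothetical)
x = [1.0, -0.5]   # clean input, confidently classified as y = +1
x_adv = fgsm_attack(x, w, y=+1)
clean_score = sum(wi * xi for wi, xi in zip(w, x))    # 2.5
adv_score = sum(wi * xi for wi, xi in zip(w, x_adv))  # pushed toward the boundary
```

Iterated variants such as BIM and MI-FGSM, also tagged above, repeat this step with a smaller step size and clip back into an epsilon-ball.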
SimpleAICV: PyTorch training examples on the ImageNet (ILSVRC2012), COCO2017, and VOC2007+2012 datasets. Includes ResNet, DarkNet, RetinaNet, FCOS, CenterNet, TTFNet, YOLOv3, YOLOv4, YOLOv5, and YOLOX.
pytorch
classification
imagenet
coco
resnet
object-detection
darknet
cifar100
voc
distillation
retinanet
yolov3
fcos
centernet
yolov4
yolov5
ttfnet
ilsvrc2012
yolox
mosaic-augment
-
Updated
May 4, 2022 - Python
Quantization library for PyTorch. Supports low-precision and mixed-precision quantization, with hardware implementation through TVM.
pytorch
quantization
hessian
8-bit
model-compression
distillation
tvm
4-bit
mixed-precision
tensorcore
quantized-neural-networks
hardware-aware
efficient-neural-networks
-
Updated
May 8, 2021 - Python
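The low-precision modes such libraries offer typically reduce to an affine (asymmetric) mapping between floats and 8-bit integers: a real value x is stored as round(x / scale) + zero_point. A minimal sketch of that mapping (illustrative only, not this library's API):

```python
def quantize_uint8(values):
    """Affine 8-bit quantization: maps floats onto 0..255 so that
    real zero is exactly representable (important for zero-padding)."""
    lo, hi = min(values), max(values)
    lo, hi = min(lo, 0.0), max(hi, 0.0)  # force the range to contain 0
    scale = (hi - lo) / 255.0 or 1.0     # guard against an all-zero input
    zero_point = round(-lo / scale)
    q = [max(0, min(255, round(v / scale) + zero_point)) for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate float values from quantized integers."""
    return [(qi - zero_point) * scale for qi in q]

weights = [-1.0, -0.25, 0.0, 0.5, 1.5]
q, s, zp = quantize_uint8(weights)
recovered = dequantize(q, s, zp)  # each entry within one scale step of the input
```

Mixed-precision schemes like the one this repo describes assign different bit-widths per layer (e.g. 4-bit vs 8-bit), often guided by Hessian sensitivity, but each layer's mapping has this same affine form.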
(CVPR 2022) A minimalist, mapless, end-to-end self-driving stack for joint perception, prediction, planning and control.
prediction
planning
perception
autonomous-driving
imitation-learning
distillation
carla-simulator
cvpr2022
-
Updated
Jun 20, 2022 - Python
A brain-inspired version of generative replay for continual learning with deep neural networks (e.g., class-incremental learning on CIFAR-100; PyTorch code).
deep-learning
artificial-neural-networks
replay
incremental-learning
variational-autoencoder
lifelong-learning
distillation
brain-inspired
continual-learning
elastic-weight-consolidation
replay-through-feedback
split-mnist
generative-replay
permuted-mnist
split-cifar100
internal-replay
synaptic-intelligence
-
Updated
Jun 22, 2022 - Python
Distillation of KoBERT from SKTBrain (Lightweight KoBERT)
-
Updated
Jun 16, 2021 - Python
Filter Grafting for Deep Neural Networks (CVPR 2020)
-
Updated
Feb 4, 2022 - Python
Insightface Keras implementation
tensorflow
keras
tf
triplet
distillation
arcface
insightface
tensorflow2
efficientnet
ghostnet
curricularface
subcenter-arcface
magface
vargface
-
Updated
Jun 28, 2022 - Python
Knowledge distillation for Chinese text classification with PyTorch: BERT and XLNet teacher models, BiLSTM student model.
-
Updated
Jan 26, 2022 - Python
(ICCV 2021, Oral) RL and distillation in CARLA using a factorized world model
-
Updated
Feb 17, 2022 - Python
PyTorch implementation of the ACCV 2018 paper "Revisiting Distillation and Incremental Classifier Learning."
machine-learning
pytorch
convolutional-neural-networks
incremental-learning
paper-implementations
distillation
-
Updated
Jun 21, 2022 - Python
A deep-learning NLP framework modeled on Scikit-Learn's design, supporting about 40 model classes covering language models, text classification, NER, MRC, machine translation, and more.
-
Updated
May 27, 2022 - Python
PyTorch implementation of the CVPR 2021 paper "SuperMix: Supervising the Mixing Data Augmentation."
-
Updated
Jan 6, 2022 - Python
YOLOv5 knowledge distillation training, with support for training on your own data.
-
Updated
Jul 10, 2021 - Python
A RoBERTa-wwm base-size model distilled from RoBERTa-wwm using RoBERTa-wwm-large as the teacher.
-
Updated
Mar 30, 2020 - Python
Distillation of a BERT model with the Catalyst framework
-
Updated
Apr 18, 2021 - Python
PyTorch implementation of the CVPR 2021 paper "Distilling Audio-Visual Knowledge by Compositional Contrastive Learning."
pytorch
video-recognition
distillation
audio-visual-learning
contrastive-learning
cvpr2021
compositional-contrastive-learning
audio-teacher-models
multi-modal-distillation
-
Updated
Jul 7, 2021 - Python
CVPR 2021: Zero-shot Adversarial Quantization (ZAQ)
-
Updated
Oct 3, 2021 - Python