
sparsity

Here are 88 public repositories matching this topic...

sparseml

Intel® Neural Compressor (formerly known as the Intel® Low Precision Optimization Tool), which provides unified APIs for network compression techniques such as low-precision quantization, sparsity, pruning, and knowledge distillation across different deep learning frameworks, in pursuit of optimal inference performance.

  • Updated Jul 8, 2022
  • Python
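As a rough illustration of one of the compression techniques listed above (this is a generic NumPy sketch, not Neural Compressor's actual API), unstructured magnitude pruning zeroes out the smallest-magnitude weights until a target sparsity level is reached:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries of `weights` until the
    requested fraction of zeros (`sparsity`) is reached.
    Hypothetical helper for illustration only."""
    k = int(weights.size * sparsity)  # number of weights to remove
    if k == 0:
        return weights.copy()
    magnitudes = np.abs(weights).ravel()
    # k-th smallest magnitude becomes the pruning threshold
    threshold = np.partition(magnitudes, k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

w = np.array([[0.5, -0.01],
              [0.003, -2.0]])
print(magnitude_prune(w, sparsity=0.5))  # the two smallest weights are zeroed
```

Real toolkits typically combine such pruning with fine-tuning so the remaining weights can compensate for the removed ones.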

Reference ImageNet implementation of the SelecSLS CNN architecture proposed in the SIGGRAPH 2020 paper "XNect: Real-time Multi-Person 3D Motion Capture with a Single RGB Camera". The repository also includes code for pruning the model based on the implicit sparsity that emerges from adaptive gradient descent methods, as detailed in the CVPR 2019 paper "On implicit filter level sparsity in Convolutional Neural Networks".

  • Updated Jul 23, 2020
  • Python
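The idea of filter-level sparsity is that entire convolutional filters can end up with near-zero weights during training, and those filters can then be removed outright. A minimal NumPy sketch of such structured pruning (the `rel_threshold` criterion here is a hypothetical stand-in, not the paper's exact method):

```python
import numpy as np

def prune_low_norm_filters(conv_weight, rel_threshold=1e-2):
    """Remove convolutional filters whose mean absolute weight is tiny
    relative to the strongest filter.

    conv_weight: array of shape (out_channels, in_channels, kH, kW).
    Returns the reduced weight tensor and a boolean keep-mask over
    output channels. Illustrative criterion, not the CVPR 2019 recipe."""
    norms = np.abs(conv_weight).mean(axis=(1, 2, 3))  # one score per filter
    keep = norms > rel_threshold * norms.max()
    return conv_weight[keep], keep
```

Because whole output channels are dropped, the following layer's input channels must be sliced with the same mask to keep the network consistent.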

Always sparse. Never dense. But never say never. A sparse-training repository for the Adaptive Sparse Connectivity concept and its algorithmic instantiation, Sparse Evolutionary Training, which boosts the scalability of deep learning in several respects (e.g. memory and computation-time efficiency, representational and generalization power).

  • Updated Jul 21, 2021
  • Python
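Sparse Evolutionary Training keeps the network sparse throughout training by periodically rewiring it: the smallest-magnitude active connections are dropped and the same number of new connections are grown at random inactive positions. A simplified NumPy sketch of one such rewiring step (the function name and initialization scale are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

def set_rewire_step(weights, mask, prune_frac=0.3):
    """One simplified Sparse Evolutionary Training rewiring step:
    drop the smallest-magnitude active weights, then regrow the same
    number of connections at random currently-inactive positions."""
    flat_w = weights.ravel()
    flat_m = mask.ravel().copy()
    active = np.flatnonzero(flat_m)
    n_drop = int(len(active) * prune_frac)
    if n_drop == 0:
        return weights, mask
    # active connections sorted by magnitude; drop the weakest ones
    order = active[np.argsort(np.abs(flat_w[active]))]
    drop = order[:n_drop]
    flat_m[drop] = False
    # regrow the same number of connections at random inactive positions
    inactive = np.flatnonzero(~flat_m)
    grow = rng.choice(inactive, size=n_drop, replace=False)
    flat_m[grow] = True
    new_w = flat_w.copy()
    new_w[drop] = 0.0
    new_w[grow] = rng.normal(scale=0.01, size=n_drop)  # fresh small init
    return new_w.reshape(weights.shape), flat_m.reshape(mask.shape)
```

The total number of connections stays constant, so the memory footprint is fixed while the connectivity pattern evolves toward useful weights.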

A wide-ranging collection of compressed sensing and feature selection algorithms, including matching pursuit, forward and backward stepwise regression, sparse Bayesian learning, and basis pursuit.

  • Updated Mar 28, 2022
  • Julia
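To make the flavor of these greedy sparse-recovery methods concrete, here is a minimal matching pursuit sketch in NumPy (the repository itself is in Julia; this is an illustrative Python version assuming unit-norm dictionary atoms):

```python
import numpy as np

def matching_pursuit(D, y, n_iters=10):
    """Greedy matching pursuit: at each step, pick the dictionary atom
    (column of D, assumed unit-norm) most correlated with the current
    residual, and add its contribution to the sparse coefficient vector."""
    residual = y.astype(float).copy()
    coeffs = np.zeros(D.shape[1])
    for _ in range(n_iters):
        corr = D.T @ residual          # correlation of each atom with residual
        k = np.argmax(np.abs(corr))    # best-matching atom
        coeffs[k] += corr[k]
        residual -= corr[k] * D[:, k]  # remove that atom's contribution
    return coeffs, residual
```

Basis pursuit instead solves a convex L1-minimization problem, trading the greedy loop for stronger recovery guarantees at higher computational cost.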
