# autograd

Here are 86 public repositories matching this topic...
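The libraries listed below all implement some form of reverse-mode automatic differentiation: record the operations of a forward computation, then replay them backwards with the chain rule. As a minimal illustrative sketch (not any particular library's API; `Var` and its methods are invented here for illustration), the core idea fits in a few lines of Python:

```python
# Minimal scalar reverse-mode autodiff sketch (illustrative only;
# real autograd libraries operate on tensors and support many more ops).
class Var:
    def __init__(self, value, parents=()):
        self.value = value      # forward value
        self.grad = 0.0         # accumulated d(output)/d(self)
        self.parents = parents  # pairs of (parent Var, local derivative)

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        # Chain rule: accumulate the incoming gradient, then pass
        # seed * (local derivative) down to each parent.
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x = Var(3.0)
y = Var(4.0)
z = x * y + x      # z = x*y + x
z.backward()
print(x.grad)      # dz/dx = y + 1 = 5.0
print(y.grad)      # dz/dy = x     = 3.0
```

Gradients accumulate per path, so a variable used twice (like `x` above) receives contributions from both uses without any explicit graph traversal order.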
PyTorch tutorials and fun projects including neural talk, neural style, poem writing, and anime generation (companion to the book 《深度学习框架PyTorch:入门与实战》, "Deep Learning Framework PyTorch: Introduction and Practice").
Topics: deep-learning, jupyter-notebook, nn, pytorch, autograd, caption, gan, image-classification, tensorboard, tensor, neural-style, visdom, pytorch-tutorials, pytorch-tutorials-cn, charrnn, neuraltalk
Updated Dec 22, 2020 - Jupyter Notebook
Open issue "Yolo Model" (zachgk, commented Apr 8, 2020): Implement a YOLO model and add it to the DJL model zoo.
A standalone C++ library for machine learning.
Updated Jan 23, 2021 - C++
Owl - OCaml Scientific and Engineering Computing @ http://ocaml.xyz
Topics: machine-learning, statistics, neural-network, optimization, matrix, linear-algebra, automatic-differentiation, regression, autograd, numerical-calculations, scientific-computing, topic-modeling, ndarray, plotting, gsl, maths, sparse-linear-systems, statistical-functions, mcmc, algorithmic-differentation
Updated Jan 6, 2021 - OCaml
A fast, ergonomic, and portable tensor library in Nim with a deep learning focus, targeting CPU, GPU, and embedded devices via OpenMP, CUDA, and OpenCL backends.
Topics: iot, machine-learning, nim, deep-learning, opencl, linear-algebra, automatic-differentiation, openmp, parallel-computing, cuda, autograd, gpgpu, neural-networks, high-performance-computing, ndarray, tensor, gpu-computing, multidimensional-arrays, cudnn, matrix-library
Updated Jan 20, 2021 - Nim
PennyLane is a cross-platform Python library for differentiable programming of quantum computers. Train a quantum computer the same way as a neural network.
Topics: machine-learning, deep-learning, neural-network, tensorflow, optimization, quantum, differentiable-computing, forest, automatic-differentiation, pytorch, autograd, quantum-computing, quantum-chemistry, quantum-machine-learning, qiskit, qsharp, cirq, strawberryfields
Updated Jan 23, 2021 - Python
Notes, examples, and Python demos for the textbook "Machine Learning Refined" (published by Cambridge University Press).
Topics: data-science, machine-learning, deep-learning, neural-network, numpy, slides, machine-learning-algorithms, jupyter-notebook, autograd, artificial-intelligence, lecture-notes
Updated Jan 19, 2021 - Python
『ゼロから作る Deep Learning ❸』 ("Deep Learning from Scratch 3", O'Reilly Japan, 2020).
Updated Dec 8, 2020 - Python
PyTorch extension library of optimized autograd sparse matrix operations.
Updated Jan 19, 2021 - Python
Tensors and differentiable operations (like TensorFlow) in Rust.
Updated Jan 16, 2021 - Rust
Image registration laboratory for 2D and 3D image data.
Updated Oct 8, 2020 - Python
Open issue (Jegp, commented Nov 20, 2020): It would be helpful to have visualisation tools to plot/debug information in the SNNs. I also think we should be slightly careful to use correct/adequate abstractions, since visualisation is 1) not the primary purpose of Norse, and 2) we don't want to maintain something that will change a lot in the future.
Here are a few suggestions for visualisations:
- Layer parameters
- Weights / tuning curves
Introductions to key concepts in quantum machine learning, as well as tutorials and implementations from cutting-edge QML research.
Topics: demo, qml, tensorflow, automatic-differentiation, tutorials, pytorch, autograd, quantum-computing, neural-networks, quantum-chemistry, key-concepts, quantum-machine-learning
Updated Jan 22, 2021 - Python
JAX - a curated list of resources for https://github.com/google/jax
Updated Jan 23, 2021
Julia port of the Python autograd package.
Updated Jan 20, 2021 - Julia
scorch is a deep learning framework in Scala inspired by PyTorch.
Updated Apr 19, 2020 - Scala
Tiny and elegant deep learning library.
Updated Jul 1, 2019 - Python
Torch containers simplified in PyTorch.
Updated Apr 28, 2017 - Lua
Scientific computing in pure Crystal.
Updated Jan 4, 2021 - Crystal
Open issue (rsokl, commented Jan 26, 2020):
- `asarray`: converts all array-likes to a NumPy array; no copy for arrays
- `astensor`: converts all array-likes to a MyGrad tensor; returns tensors unchanged
- `assert_almost_equal`: with various options to include checks of gradients and to include checks ac
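The coercion helpers described above can be sketched with NumPy. This is an assumption-laden illustration: `Tensor` below is a hypothetical stand-in class, not MyGrad's actual API, and only the pass-through behaviour the comment describes is shown.

```python
import numpy as np

class Tensor:
    # Hypothetical stand-in for a MyGrad-style tensor (not the real API).
    def __init__(self, data):
        self.data = np.asarray(data)

def asarray(x):
    # Convert any array-like to an ndarray; no copy when x already is one.
    return np.asarray(x)

def astensor(x):
    # Convert any array-like to a Tensor; return Tensor inputs unchanged.
    return x if isinstance(x, Tensor) else Tensor(x)

a = np.array([1.0, 2.0])
assert asarray(a) is a                     # arrays pass through uncopied
t = Tensor([1, 2, 3])
assert astensor(t) is t                    # tensors returned unchanged
assert isinstance(astensor([1, 2]), Tensor)
```

The point of such helpers is that library functions can accept lists, tuples, scalars, or arrays uniformly, while avoiding needless copies on the hot path.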
Accelerated tensor operations and dynamic neural networks based on reverse-mode automatic differentiation, for every device that can run Swift - from watchOS to Linux.
Topics: swift, machine-learning, deep-neural-networks, deep-learning, automatic-differentiation, autograd, recurrent-neural-networks, recurrent-networks, neural-networks, derivatives, convolutional-neural-networks, tensor, gradient-descent, swift-machine-learning, optimizers
Updated Nov 27, 2020 - Swift
Deep learning framework from scratch.
Updated Oct 13, 2018 - Python
Learning PyTorch 1.0 with examples (Chinese translation and study notes for "Learning PyTorch with Examples").
Updated Mar 11, 2019 - Jupyter Notebook
Qualia is a deep learning framework deeply integrated with automatic differentiation and dynamic graphing, with CUDA acceleration. Qualia was built from scratch.
Topics: reinforcement-learning, deep-learning, graph, gpu, automatic-differentiation, cuda, autograd, gan, neural-networks, openpose
Updated Jul 15, 2020 - Python
Updated Nov 8, 2020 - Python
Google AI Princeton control framework.
Updated Nov 2, 2020 - Jupyter Notebook
European Distributed Deep Learning (EDDL) library: a general-purpose library initially developed to cover deep learning needs in healthcare use cases within the DeepHealth project.
Updated Jan 22, 2021 - C++
Open feature request: Add support for `torch.max` with CPU float16 and with bfloat16.
Motivation: Currently, `torch.max` supports CUDA float16, but the other three combinations of CPU/CUDA and float16/bfloat16 are not supported.
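Until native kernels exist for a reduced-precision dtype, a common workaround is to upcast, reduce, then downcast. The sketch below illustrates that pattern with NumPy rather than torch (an assumption made so the example is self-contained; `max_half` is a hypothetical helper, not a PyTorch API):

```python
import numpy as np

def max_half(values):
    # Workaround sketch for a missing half-precision max kernel:
    # upcast float16 inputs to float32 for the reduction, then cast
    # the result back to float16.
    a = np.asarray(values, dtype=np.float16)
    return a.astype(np.float32).max().astype(np.float16)

result = max_half([1.5, 3.25, 2.0])
print(result)  # 3.25 (exactly representable in float16)
```

The upcast is lossless (every float16 value is exactly representable in float32), so only the final downcast can round, and a max is order-preserving either way.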