# autograd

Here are 89 public repositories matching this topic...
PyTorch tutorials and fun projects, including neural talk, neural style, poem writing, and anime generation (companion code for the book 《深度学习框架PyTorch:入门与实战》, "Deep Learning Framework PyTorch: Introduction and Practice").
Topics: deep-learning, jupyter-notebook, nn, pytorch, autograd, caption, gan, image-classification, tensorboard, tensor, neural-style, visdom, pytorch-tutorials, pytorch-tutorials-cn, charrnn, neuraltalk
Updated Feb 10, 2021 - Jupyter Notebook
Open issue: Yolo Model (zachgk commented Apr 8, 2020)
Description: Implement a YOLO model and add it to the DJL model zoo.
A C++ standalone library for machine learning.
Updated Feb 19, 2021 - C++
prabhuomkar commented Sep 7, 2020
Owl - OCaml Scientific and Engineering Computing @ http://ocaml.xyz
Topics: machine-learning, statistics, neural-network, optimization, matrix, linear-algebra, automatic-differentiation, regression, autograd, numerical-calculations, scientific-computing, topic-modeling, ndarray, plotting, gsl, maths, sparse-linear-systems, statistical-functions, mcmc, algorithmic-differentation
Updated Feb 13, 2021 - OCaml
A fast, ergonomic and portable tensor library in Nim with a deep learning focus, for CPU, GPU and embedded devices via OpenMP, CUDA and OpenCL backends.
Topics: iot, machine-learning, nim, deep-learning, opencl, linear-algebra, automatic-differentiation, openmp, parallel-computing, cuda, autograd, gpgpu, neural-networks, high-performance-computing, ndarray, tensor, gpu-computing, multidimensional-arrays, cudnn, matrix-library
Updated Jan 20, 2021 - Nim
PennyLane is a cross-platform Python library for differentiable programming of quantum computers. Train a quantum computer the same way as a neural network.
Topics: machine-learning, deep-learning, neural-network, tensorflow, optimization, quantum, differentiable-computing, forest, automatic-differentiation, pytorch, autograd, quantum-computing, quantum-chemistry, quantum-machine-learning, qiskit, qsharp, cirq, strawberryfields
Updated Feb 19, 2021 - Python
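The "train like a neural network" idea rests on gradients of circuit outputs. For many quantum gates the exact gradient of an expectation value is given by the parameter-shift rule: evaluate the same circuit at two shifted parameter values and take a scaled difference. A minimal sketch in plain Python, using the closed-form expectation ⟨Z⟩ = cos(θ) for an RY(θ) rotation as a stand-in for a real circuit evaluation (illustrative only, not PennyLane's actual API):

```python
import math

def expval(theta):
    # <Z> after RY(theta) on |0> is cos(theta); this stands in for
    # running a real circuit on a simulator or quantum device.
    return math.cos(theta)

def parameter_shift_grad(f, theta, shift=math.pi / 2):
    # Exact gradient for gates generated by a Pauli operator:
    # f'(theta) = [f(theta + s) - f(theta - s)] / (2 sin s)
    return (f(theta + shift) - f(theta - shift)) / (2 * math.sin(shift))

theta = 0.7
grad = parameter_shift_grad(expval, theta)
# The analytic derivative of cos(theta) is -sin(theta); the two agree.
print(grad, -math.sin(theta))
```

Because the rule needs only circuit evaluations, it works even on hardware where backpropagating through the quantum state is impossible.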
Notes, examples, and Python demos for the textbook "Machine Learning Refined" (published by Cambridge University Press).
Topics: data-science, machine-learning, deep-learning, neural-network, numpy, slides, machine-learning-algorithms, jupyter-notebook, autograd, artificial-intelligence, lecture-notes
Updated Feb 1, 2021 - Python
『ゼロから作る Deep Learning ❸』 ("Deep Learning from Scratch 3", O'Reilly Japan, 2020).
Updated Feb 3, 2021 - Python
PyTorch Extension Library of Optimized Autograd Sparse Matrix Operations.
Updated Feb 16, 2021 - Python
Tensors and differentiable operations (like TensorFlow) in Rust.
Updated Feb 6, 2021 - Rust
Image registration laboratory for 2D and 3D image data.
Updated Oct 8, 2020 - Python
cpehle opened Jan 29, 2021
JAX - A curated list of resources: https://github.com/google/jax
Updated Feb 19, 2021
Introductions to key concepts in quantum machine learning, as well as tutorials and implementations from cutting-edge QML research.
Topics: demo, qml, tensorflow, automatic-differentiation, tutorials, pytorch, autograd, quantum-computing, neural-networks, quantum-chemistry, key-concepts, quantum-machine-learning
Updated Feb 13, 2021 - Python
Julia port of the Python autograd package.
Updated Feb 6, 2021 - Julia
scorch is a deep learning framework in Scala inspired by PyTorch.
Updated Apr 19, 2020 - Scala
Tiny and elegant deep learning library.
Updated Jul 1, 2019 - Python
Torch Containers simplified in PyTorch.
Updated Apr 28, 2017 - Lua
Scientific computing in pure Crystal.
Updated Jan 4, 2021 - Crystal
The templated deep learning framework, enabling framework-agnostic functions, layers and libraries.
Topics: python, template, machine-learning, deep-learning, neural-network, mxnet, tensorflow, gpu, numpy, pytorch, autograd, abstraction, ivy, jax
Updated Feb 16, 2021 - Python
rsokl commented Jan 22, 2019
Okay, so this might not exactly be a "good first issue": it is a little more advanced, but it is still very much accessible to newcomers.
Similar to the mygrad.nnet.max_pool function, I would like there to be a mean-pooling layer. That is, a convolution-style window is strided over the input, an…
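The requested operation can be sketched in plain NumPy: slide a pooling window over the input with a given stride and average each window. A standalone sketch of the forward pass only, not MyGrad's actual `mygrad.nnet` API:

```python
import numpy as np

def mean_pool_2d(x, pool=2, stride=2):
    """Slide a pool x pool window over a 2D array with the given stride
    and replace each window by the mean of its entries."""
    h, w = x.shape
    out_h = (h - pool) // stride + 1
    out_w = (w - pool) // stride + 1
    out = np.empty((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            window = x[i * stride:i * stride + pool,
                       j * stride:j * stride + pool]
            out[i, j] = window.mean()
    return out

x = np.arange(16, dtype=float).reshape(4, 4)
print(mean_pool_2d(x))  # 2x2 output; each entry averages one 2x2 window
```

Backpropagation through this op distributes each output gradient uniformly (scaled by 1/pool²) over the entries of its window, which is what an autograd layer would record.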
Accelerated tensor operations and dynamic neural networks based on reverse-mode automatic differentiation, for every device that can run Swift: from watchOS to Linux.
Topics: swift, machine-learning, deep-neural-networks, deep-learning, automatic-differentiation, autograd, recurrent-neural-networks, recurrent-networks, neural-networks, derivatives, convolutional-neural-networks, tensor, gradient-descent, swift-machine-learning, optimizers
Updated Feb 15, 2021 - Swift
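Reverse-mode automatic differentiation, which autograd-style libraries like this one build on, records each operation's inputs and local derivatives during the forward pass, then sweeps backwards accumulating gradients via the chain rule. A toy scalar sketch in Python (illustrative only, not this library's Swift API):

```python
class Var:
    """Toy scalar for reverse-mode autodiff: each Var remembers which
    Vars produced it and the local derivative with respect to each."""

    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self._parents = parents  # pairs of (parent Var, local derivative)

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def backward(self):
        # Seed the output with gradient 1, then push gradients back
        # through the recorded graph. (This naive walk assumes each
        # intermediate Var feeds into at most one later operation;
        # real libraries topologically sort the graph instead.)
        self.grad = 1.0
        stack = [self]
        while stack:
            node = stack.pop()
            for parent, local in node._parents:
                parent.grad += local * node.grad
                stack.append(parent)

x, y = Var(3.0), Var(4.0)
z = x * y + x          # dz/dx = y + 1 = 5, dz/dy = x = 3
z.backward()
print(x.grad, y.grad)  # 5.0 3.0
```

The "dynamic" in dynamic neural networks refers to building this graph fresh on every forward pass, so ordinary control flow (loops, branches) is differentiated for free.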
Deep learning framework from scratch.
Updated Oct 13, 2018 - Python
Learning PyTorch 1.0 with Examples (Chinese translation and study notes for "Learning PyTorch with Examples").
Updated Mar 11, 2019 - Jupyter Notebook
Qualia is a deep learning framework deeply integrated with automatic differentiation and dynamic graphing, with CUDA acceleration. Qualia was built from scratch.
Topics: reinforcement-learning, deep-learning, graph, gpu, automatic-differentiation, cuda, autograd, gan, neural-networks, openpose
Updated Jul 15, 2020 - Python
Google AI Princeton control framework.
Updated Nov 2, 2020 - Jupyter Notebook
Both `torch.dot` and `torch.vdot` require the input tensors to be of the same dtype, so we don't need to pass the derivative to `handle_r_to_c` here. cc @ezyang @anjali411 @dylanbespalko @mruberry
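For context on the distinction involved: `dot` is a plain sum of products, while `vdot` conjugates its first argument, making it the complex inner product. NumPy's functions of the same names illustrate the difference (note that, unlike the torch functions discussed above, NumPy promotes mixed dtypes rather than rejecting them):

```python
import numpy as np

a = np.array([1 + 2j, 3 - 1j])
b = np.array([2 - 1j, 1 + 1j])

# dot: plain sum of products, no conjugation
plain = np.dot(a, b)   # (1+2j)(2-1j) + (3-1j)(1+1j) = 8+5j
# vdot: conjugates the first argument -> the complex inner product
inner = np.vdot(a, b)  # (1-2j)(2-1j) + (3+1j)(1+1j) = 2-1j
print(plain, inner)
```

For real-valued inputs the two coincide, which is why the conjugation question only arises for complex dtypes.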