automatic-differentiation
Here are 226 public repositories matching this topic...
I'm using TF 2.0, and I get an error when I import tangent, because its list of non-differentiable functions includes tf.to_float (line 60), which is deprecated:
https://www.tensorflow.org/versions/r1.14/api_docs/python/tf/to_float
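For reference, the deprecated tf.to_float(x) is documented as equivalent to tf.cast(x, tf.float32). A minimal NumPy sketch of the same cast semantics (illustration only, not TensorFlow code):

```python
import numpy as np

# tf.to_float(x) was deprecated in favor of tf.cast(x, tf.float32).
# The same operation, sketched with NumPy for illustration:
def to_float(x):
    """Cast the input to 32-bit float, like the deprecated tf.to_float."""
    return np.asarray(x).astype(np.float32)

print(to_float([1, 2, 3]).dtype)  # float32
```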
I found that the function mod2pi is not implemented yet, but mod works. Is there a list of implemented functions? A minimal working example:
using Zygote
# This is working
gradient(x -> mod(x, 2pi), 1.)
# This is not
gradient(x -> mod2pi(x), 1.)
Summary:
The functions for the categorical distribution only accept a column vector; it would be great if they also accepted row vectors.
Description:
I use the categorical distribution to go over an N_obs x N_probabilities matrix, so it's more natural for me to use row vectors than column vectors.
Current function:
real categorical_lpmf(ints y | vector theta)
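The requested row-wise behavior, sketched in NumPy for illustration (a hypothetical helper, not Stan code): evaluate the categorical log-PMF once per row of an N_obs x N_probabilities matrix.

```python
import numpy as np

def categorical_lpmf(y, theta):
    """Log-PMF of a categorical outcome y (1-based, as in Stan)
    given a probability vector theta."""
    return np.log(theta[y - 1])

# One probability row per observation (N_obs x N_probabilities):
theta = np.array([[0.2, 0.8],
                  [0.5, 0.5]])
y = np.array([2, 1])

# Row-wise evaluation, which the issue asks to support directly:
lp = np.array([categorical_lpmf(yi, row) for yi, row in zip(y, theta)])
```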
Debugging Kotlin∇ code within IntelliJ IDEA can be somewhat cumbersome due to the functional API structure (lots of deeply-nested stack traces and context switching). To facilitate more user-friendly debugging, we should add support for visual debugging by exposing Kaliningraph’s built-in graph visualization capabilities. For example, the use
The init module has been deprecated, and the recommended approach for generating initial weights is to use the Template.shape method:
>>> from pennylane.templates import StronglyEntanglingLayers
>>> qml.init.strong_ent_layers_normal(n_layers=3, n_wires=2)  # deprecated
>>> np.random.random(StronglyEntanglingLayers.shape(n_layers=3, n_wires=2))  # new approach
We should upd
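A standalone sketch of the new approach using only NumPy, assuming StronglyEntanglingLayers.shape(n_layers, n_wires) returns (n_layers, n_wires, 3) (one rotation-angle triple per wire per layer):

```python
import numpy as np

# Assumption: StronglyEntanglingLayers.shape(n_layers=3, n_wires=2)
# returns (3, 2, 3) -- one (phi, theta, omega) rotation triple per
# wire per layer. Initial weights are then just a random array of
# that shape:
shape = (3, 2, 3)  # stand-in for StronglyEntanglingLayers.shape(3, 2)
weights = np.random.random(shape)
print(weights.shape)  # (3, 2, 3)
```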
The function can be easily implemented, but for the sake of getting closer to the NumPy API we should add it to aesara.tensor:
https://numpy.org/doc/stable/reference/generated/numpy.logaddexp.html
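numpy.logaddexp computes log(exp(x1) + exp(x2)) without overflowing for large inputs. A minimal sketch of the underlying trick (factor out the larger argument before exponentiating) — an illustration, not Aesara's implementation:

```python
import numpy as np

def logaddexp(a, b):
    """log(exp(a) + exp(b)) computed stably: factoring out the
    larger argument keeps the exponent non-positive."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    hi = np.maximum(a, b)
    lo = np.minimum(a, b)
    return hi + np.log1p(np.exp(lo - hi))

# Naive log(exp(1000) + exp(1000)) overflows; this stays finite:
print(logaddexp(1000.0, 1000.0))  # 1000 + log(2)
```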
profiles.h updates
At the moment profiles.h (in pkg/profiles) lacks many (any?) comments, and a lot of variables are declared far from where they are associated with heap storage.
Both of these make it hard to read.
It would also be nicer if it were called PROFILES.h.
Some of them can be ported over from Zygote.
cf. FluxML/Zygote.jl#906
https://github.com/FluxML/Zygote.jl/blob/956cbcf3c572c0eb09c146189bb38b1b434634ff/src/lib/array.jl#L130
In operations_broadcast_test.go there are some tests that have not yet been filled in. The point is to test that broadcasting works for different shapes. The semantics of broadcasting probably aren't clear, so please do send me a message with any questions.
This is a good first issue for anyone looking to get started.
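For context, the shape rules being tested follow the same spirit as NumPy broadcasting: dimensions are compared right to left, and a size-1 dimension is stretched to match its partner. A quick Python illustration (the actual tests are Go):

```python
import numpy as np

# NumPy-style broadcasting: shapes (2, 3) and (3,) are compatible
# because the trailing dimension matches; b is virtually repeated
# along the leading axis.
a = np.arange(6).reshape(2, 3)   # [[0, 1, 2], [3, 4, 5]]
b = np.array([10, 20, 30])       # shape (3,) -> broadcast to (2, 3)
c = a + b

print(c.shape)  # (2, 3)
print(c[0])     # [10 21 32]
```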