automatic-differentiation
Here are 279 public repositories matching this topic...
I'm using TF 2.0, and I get an error when I import tangent because its list of non-differentiable functions includes tf.to_float (line 60), which is deprecated:
https://www.tensorflow.org/versions/r1.14/api_docs/python/tf/to_float
Feature details
It seems that the utility function qml.utils.expand is no longer used anywhere as of #2609.
It therefore could be removed. A differentiable, more up-to-date version is available in qml.operation.expand_matrix.
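To illustrate the kind of operation involved, here is a minimal NumPy sketch of the idea behind matrix expansion: embedding a single-wire operator into a larger wire register by tensoring with identities. The helper name and wire conventions below are hypothetical, not PennyLane's actual `expand_matrix` signature.

```python
import numpy as np

# Hypothetical sketch: embed a one-wire operator into a 2-wire register
# by tensoring with the identity on the remaining wire (wire 0 acted on,
# lexicographic wire order assumed).
X = np.array([[0.0, 1.0], [1.0, 0.0]])  # Pauli-X on a single wire
I2 = np.eye(2)

expanded = np.kron(X, I2)  # X on wire 0, identity on wire 1
assert expanded.shape == (4, 4)
```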
Implementation
No response
How important would you say this feature is?
1: Not important. Would be nice to have.
Additional information
No response
As mentioned in FluxML/Zygote.jl#1212.
https://llvm.org/docs/NewPassManager.html
The tricky part is to keep our custom command line options working.
Currently we are not running CI on macOS, so verifying continued correct behavior there is more difficult. GitHub Actions provides access to Intel-based macOS machines, so we may want to start running these jobs in CI.
Description
Add adjoint-Jacobian specialization for reverse mode for the fast Fourier transform (FFT) and its inverse.
Example
FFT case
If y = fft(x), then the adjoint-Jacobian is just the inverse FFT applied to the adjoint of the result,
adjoint(x) += ifft(adjoint(y))
Inverse FFT case
If y = ifft(x), then by symmetry the adjoint-Jacobian update rule is the forward FFT applied to the adjoint of the result,
adjoint(x) += fft(adjoint(y))
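These update rules can be checked numerically with the standard adjoint (inner-product) test. Note that the rules as stated assume a unitary FFT convention; NumPy's default FFT is unnormalized, so under the default convention the adjoint of fft picks up a factor of n:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
ybar = rng.standard_normal(n) + 1j * rng.standard_normal(n)  # adjoint seed

# Unitary ("ortho") convention: adjoint(fft) = ifft, so
# <ybar, fft(x)> == <ifft(ybar), x> exactly.
lhs = np.vdot(ybar, np.fft.fft(x, norm="ortho"))
rhs = np.vdot(np.fft.ifft(ybar, norm="ortho"), x)
assert np.allclose(lhs, rhs)

# NumPy's default (unnormalized) convention carries a factor of n:
# adjoint(fft) = n * ifft.
lhs = np.vdot(ybar, np.fft.fft(x))
rhs = np.vdot(n * np.fft.ifft(ybar), x)
assert np.allclose(lhs, rhs)
```

The same test with fft and ifft swapped verifies the inverse-FFT rule.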
Debugging Kotlin∇ code within IntelliJ IDEA can be somewhat cumbersome due to the functional API structure (lots of deeply-nested stack traces and context switching). To facilitate more user-friendly debugging, we should add support for visual debugging by exposing Kaliningraph’s built-in graph visualization capabilities. For example, the use
The init module has been deprecated, and the recommended approach for generating initial weights is to use the Template.shape method:
>>> from pennylane.templates import StronglyEntanglingLayers
>>> qml.init.strong_ent_layers_normal(n_layers=3, n_wires=2)  # deprecated
>>> np.random.random(StronglyEntanglingLayers.shape(n_layers=3, n_wires=2))  # new approach
We should upd
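As a self-contained sketch of the new approach (without importing PennyLane), the template's shape can be stood in for by a plain function: StronglyEntanglingLayers uses three rotation angles per wire per layer, so its weight shape is (n_layers, n_wires, 3). The helper below is a hypothetical stand-in, not PennyLane's API.

```python
import numpy as np

# Hypothetical stand-in for StronglyEntanglingLayers.shape: the template
# takes three rotation angles per wire per layer, giving weights of shape
# (n_layers, n_wires, 3).
def strongly_entangling_shape(n_layers, n_wires):
    return (n_layers, n_wires, 3)

weights = np.random.random(strongly_entangling_shape(n_layers=3, n_wires=2))
print(weights.shape)  # (3, 2, 3)
```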
Changes to Docs
A lot has changed since the docs were first written. #152 addresses a number of things, but there are a few more we might want to consider:
- changing all references to autodiff / automatic differentiation to AD / algorithmic differentiation, with a terminology box in the docs somewhere, explaining what we're on about.
- In the "On writing good rrule and frule" section, we should consi
profiles.h updates
At the moment profiles.h (in pkg/profiles) lacks many (any?) comments, and many variables are declared far from where they are associated with heap storage.
Both of these make it hard to read.
It would also be nicer if it were called PROFILES.h.
In operations_broadcast_test.go there are some tests that are not yet filled in. The point is to test that broadcasting works for different shapes. The semantics of broadcast may not be clear, so please send me a message with any questions.
This is a good first issue for anyone looking to get started.
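For reference, the broadcasting semantics being tested mirror the NumPy rules: trailing dimensions are compared right-to-left, and a dimension of 1 (or a missing dimension) is stretched to match. A short NumPy illustration of the intended behavior (Gorgonia's own API is Go and is not shown here):

```python
import numpy as np

a = np.arange(6).reshape(2, 3)   # shape (2, 3)
col = np.array([[10], [20]])     # shape (2, 1) -> stretched along axis 1
row = np.array([1, 2, 3])        # shape (3,)   -> stretched along axis 0

print((a + col).shape)  # (2, 3)
print((a + row).shape)  # (2, 3)
```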