Lightweight, Portable, Flexible Distributed/Mobile Deep Learning with Dynamic, Mutation-aware Dataflow Dep Scheduler; for Python, R, Julia, Scala, Go, Javascript and more
Apache MXNet (incubating) for Deep Learning

Apache MXNet (incubating) is a deep learning framework designed for both efficiency and flexibility. It allows you to mix symbolic and imperative programming to maximize efficiency and productivity. At its core, MXNet contains a dynamic dependency scheduler that automatically parallelizes both symbolic and imperative operations on the fly. A graph optimization layer on top of that makes symbolic execution fast and memory efficient. MXNet is portable and lightweight, scaling effectively to multiple GPUs and multiple machines.
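The scheduler idea can be sketched in a few lines of plain Python (a hypothetical toy, not MXNet's actual engine, which is written in C++): each operation declares the variables it reads and writes, conflicting operations are serialized into successive "waves", and the mutually independent operations within a wave run in parallel.

```python
# Toy mutation-aware dependency scheduler (illustration only).
from concurrent.futures import ThreadPoolExecutor

class Op:
    def __init__(self, name, fn, reads, writes):
        self.name, self.fn = name, fn
        self.reads, self.writes = set(reads), set(writes)

def conflicts(a, b):
    # Two ops conflict when one writes a variable the other reads or writes.
    return bool((a.writes & (b.reads | b.writes)) or (b.writes & a.reads))

def schedule(ops):
    # Place each op in the earliest "wave" after the last wave it conflicts
    # with; ops within a wave are independent and may run concurrently.
    waves = []
    for op in ops:
        last = -1
        for i, wave in enumerate(waves):
            if any(conflicts(op, other) for other in wave):
                last = i
        if last + 1 == len(waves):
            waves.append([op])
        else:
            waves[last + 1].append(op)
    return waves

def run(waves, state):
    with ThreadPoolExecutor() as pool:
        for wave in waves:
            list(pool.map(lambda op: op.fn(state), wave))

state = {"a": 1, "b": 2}
ops = [
    Op("c = a + b", lambda s: s.update(c=s["a"] + s["b"]), ["a", "b"], ["c"]),
    Op("d = a * 2", lambda s: s.update(d=s["a"] * 2),      ["a"],      ["d"]),
    Op("e = c + d", lambda s: s.update(e=s["c"] + s["d"]), ["c", "d"], ["e"]),
]
waves = schedule(ops)  # the first two ops are independent -> same wave
run(waves, state)      # e ends up 1 + 2 + 1 * 2 = 5
```

The key point the sketch shares with MXNet's engine is that scheduling is driven by declared read/write dependencies rather than by program order alone, which is what lets symbolic and imperative operations be parallelized uniformly.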

MXNet is more than a deep learning project. It is also a collection of blueprints and guidelines for building deep learning systems, and a source of interesting insights into deep learning systems for hackers.

Ask Questions

How to Contribute

What's New

Contents

Features

  • Design notes providing useful insights that can be reused by other DL projects
  • Flexible configuration for arbitrary computation graphs
  • Mix-and-match imperative and symbolic programming to maximize flexibility and efficiency
  • Lightweight, memory-efficient, and portable to smart devices
  • Scales up to multiple GPUs and distributed settings with automatic parallelism
  • Support for Python, Scala, C++, Java, Clojure, R, Go, JavaScript, Perl, MATLAB, and Julia
  • Cloud-friendly and directly compatible with S3, HDFS, and Azure
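The imperative/symbolic mix in the list above can be illustrated with a plain-Python sketch (hypothetical, not MXNet's API): imperative code computes values the moment each line runs, while symbolic code first builds an expression graph and only evaluates it later, once inputs are bound.

```python
# Imperative style: every statement executes immediately.
a, b = 2, 3
c = a * b + 1                    # c == 7 as soon as this line runs

# Symbolic style: build a graph of deferred operations, execute later.
class Sym:
    def __init__(self, eval_fn):
        self.eval_fn = eval_fn   # runs only when inputs are bound
    def __add__(self, other):
        return Sym(lambda env: self.eval_fn(env) + other.eval_fn(env))
    def __mul__(self, other):
        return Sym(lambda env: self.eval_fn(env) * other.eval_fn(env))

def var(name):
    return Sym(lambda env: env[name])

def const(v):
    return Sym(lambda env: v)

x, y = var("x"), var("y")
graph = x * y + const(1)                  # nothing computed yet
result = graph.eval_fn({"x": 2, "y": 3})  # bind inputs and run -> 7
```

The deferred graph is what makes whole-program optimizations (memory planning, operator fusion) possible in the symbolic style, while the imperative style keeps debugging and control flow natural; MXNet's design goal is to let the two interoperate.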

License

Licensed under the Apache License, Version 2.0.

Reference Paper

Tianqi Chen, Mu Li, Yutian Li, Min Lin, Naiyan Wang, Minjie Wang, Tianjun Xiao, Bing Xu, Chiyuan Zhang, and Zheng Zhang. MXNet: A Flexible and Efficient Machine Learning Library for Heterogeneous Distributed Systems. In Neural Information Processing Systems, Workshop on Machine Learning Systems, 2015.

History

MXNet emerged from a collaboration by the authors of cxxnet, minerva, and purine2, and reflects what we learned from those projects. It combines aspects of each to achieve flexibility, speed, and memory efficiency.
