# distributed-learning

Here are 11 public repositories matching this topic.
- BurrMill core (Shell, updated Dec 10, 2020)
- Docker CLI package for the vantage6 infrastructure (Python, updated Dec 17, 2020)
- (no description) (Python, updated May 12, 2019)
- GGM structure learning using 1 bit (Python, updated May 11, 2020)
- People's smartphones probably carry the most valuable, but also most private, data. Since using data promises to be one of the best ways to fight back against COVID-19, access to it is highly desirable. With a federated learning approach using PySyft, it is possible to learn from private data right on the smartphone, with the data never leaving the device.
  Topics: machine-learning, deep-learning, health-data, federated-learning, distributed-learning, coronavirus, wirvsvirus
  (Jupyter Notebook, updated Mar 22, 2020)
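The idea behind that entry can be illustrated without PySyft: in federated averaging, each client trains on its own data and only model weights travel to the server. A minimal sketch (plain NumPy logistic regression; all names and hyperparameters here are illustrative, not from the repository):

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's training pass on its private data (logistic regression).
    The raw data (X, y) never leaves this function; only weights are returned."""
    w = weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-X @ w))   # sigmoid predictions
        grad = X.T @ (preds - y) / len(y)      # gradient of the log-loss
        w -= lr * grad
    return w

def federated_average(global_w, client_datasets, rounds=10):
    """Server loop: broadcast weights, collect local updates, average them."""
    for _ in range(rounds):
        updates = [local_update(global_w, X, y) for X, y in client_datasets]
        global_w = np.mean(updates, axis=0)
    return global_w
```

In a real deployment the `local_update` step runs on the phone and only the weight vectors cross the network, which is the privacy property the description refers to.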
- (no description) (Jupyter Notebook, updated Dec 24, 2017)
- Sample PHT implementation efforts from the German PHT team (Java, updated Jan 28, 2019)
- Code and experiments for the paper "Max-Discrepancy Distributed Learning: Fast Risk Bounds and Algorithms" (C++, updated Jun 9, 2019)
- Topics: online-learning, incremental-learning, multi-class-classification, distributed-learning, continue-learning (Python, updated Jan 7, 2019)
- [WIP] elastic training implemented with MXNet (Python, updated Dec 20, 2019)
When using the ESC50 dataset, we currently preprocess the data at each instantiation of a dataset, and this preprocessing is quite slow. It would be great to cache the preprocessed inputs in the local_data/esc50 folder; this would also save time during unit tests.
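One way to implement the caching suggested above is a small wrapper that stores each preprocessed result on disk keyed by its source path. A minimal sketch, assuming the preprocessing step is available as a callable and produces a NumPy array (the function and file names here are hypothetical, not from the project):

```python
import hashlib
from pathlib import Path

import numpy as np

# Cache folder suggested in the issue.
CACHE_DIR = Path("local_data/esc50")

def load_preprocessed(wav_path, preprocess_fn):
    """Return the preprocessed version of wav_path, computing it at most once.

    preprocess_fn stands in for the project's slow preprocessing step; its
    result is saved as a .npy file named after a hash of the source path,
    so later instantiations (and unit tests) load it from disk instead.
    """
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    key = hashlib.md5(str(wav_path).encode()).hexdigest()
    cache_file = CACHE_DIR / f"{key}.npy"
    if cache_file.exists():
        return np.load(cache_file)
    result = preprocess_fn(wav_path)
    np.save(cache_file, result)
    return result
```

If the preprocessing parameters can change, the hash key should also include those parameters so stale cache entries are not reused.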