A Flexible and Powerful Parameter Server for large-scale machine learning (Java; updated Nov 24, 2022)
A lightweight and scalable framework that combines mainstream Click-Through-Rate prediction algorithms on a computational DAG with the Parameter Server philosophy and Ring-AllReduce collective communication.
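The Ring-AllReduce pattern named in the description above can be illustrated with a minimal single-process simulation. This is a sketch of the general algorithm, not code from any of the listed projects; the function name and chunking scheme are illustrative.

```python
def ring_allreduce(grads):
    """Single-process simulation of Ring-AllReduce.

    grads: one equal-length list of floats per simulated worker.
    Returns each worker's buffer; all end up holding the element-wise sum.
    """
    n = len(grads)
    k = len(grads[0])
    assert k % n == 0, "vector length must split into n equal chunks"
    c = k // n
    buf = [list(g) for g in grads]  # each worker's private buffer

    def chunk(j):
        j %= n
        return range(j * c, j * c + c)

    # Phase 1 (reduce-scatter): after n-1 ring steps, worker r holds
    # the fully summed chunk (r + 1) % n.
    for step in range(n - 1):
        for r in range(n):
            nxt = (r + 1) % n
            for i in chunk(r - step):       # chunk worker r forwards this step
                buf[nxt][i] += buf[r][i]

    # Phase 2 (all-gather): circulate the reduced chunks so every
    # worker ends with the complete summed vector.
    for step in range(n - 1):
        for r in range(n):
            nxt = (r + 1) % n
            for i in chunk(r + 1 - step):
                buf[nxt][i] = buf[r][i]

    return buf


# e.g. ring_allreduce([[1.0, 2.0], [10.0, 20.0]])
# → [[11.0, 22.0], [11.0, 22.0]]
```

Each of the 2(n-1) steps moves only 1/n of the vector per worker, which is why the pattern's bandwidth cost stays constant as workers are added.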
Extremely distributed machine learning
A self-implemented deep learning training framework, written in pure Java with few third-party dependencies; supports distributed training.
OpenEmbedding is an open-source framework for accelerating TensorFlow distributed training.
Serverless ML Framework
A fully adaptive, zero-tuning parameter manager that enables efficient distributed machine learning training
WIP: Veloce is a low-code, Ray-based parallelization library for efficient, heterogeneous machine learning computation.
PetPS: Supporting Huge Embedding Models with Tiered Memory
Distributed Field-aware Factorization Machines based on a Parameter Server
Serving layer for large machine learning models on Apache Flink
Machine Learning models for large datasets
ROS utility package for build-time configuration file generation and dumping/restoring contents of ROS parameter server to/from ROS bags.
a simple machine learning library
Distributed training with Multi-worker & Parameter Server in TensorFlow 2
A lightweight community-aware heterogeneous parameter server paradigm.
A demonstration app of the parameter server implementation for gSMFRETda.
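Common to the projects listed here is the parameter-server protocol itself: workers pull current weights, compute gradients locally, and push them back to a server that applies the update. A minimal sketch, assuming a toy in-process server with SGD updates (the class and method names are illustrative, not taken from any listed project):

```python
class ParameterServer:
    """Toy parameter server holding model weights and applying pushed gradients."""

    def __init__(self, dims, lr=0.1):
        # dims maps a parameter name to its length; weights start at zero.
        self.params = {name: [0.0] * d for name, d in dims.items()}
        self.lr = lr

    def pull(self, name):
        # Workers fetch a copy of the latest values before computing gradients.
        return list(self.params[name])

    def push(self, name, grad):
        # Workers send gradients; the server applies an SGD step in place.
        p = self.params[name]
        for i, g in enumerate(grad):
            p[i] -= self.lr * g


# One worker's training step under this protocol:
ps = ParameterServer({"w": 2}, lr=0.1)
w = ps.pull("w")         # 1. pull current weights
grad = [1.0, -1.0]       # 2. compute a (stand-in) local gradient
ps.push("w", grad)       # 3. push it; the server updates w
```

Real systems shard `params` across many server nodes and handle asynchronous pushes, but the pull/compute/push loop is the same.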