The Triton Inference Server provides an optimized cloud and edge inferencing solution.
FireSim: Fast and Effortless FPGA-accelerated Hardware Simulation with On-Prem and Cloud Flexibility
Using network observability to operate and design healthier networks
Disseminated, Distributed OS for Hardware Resource Disaggregation. USENIX OSDI 2018 Best Paper.
Automated, multi-region container deployment
Advanced Linux RAM drive and caching kernel modules. Dynamically allocate RAM as block devices, use them as standalone drives or map them as caching nodes in front of slower local disks, access those volumes locally or export them across an NVMe Target network, and manage it all from a web API.
CloudSimPy: Datacenter job scheduling simulation framework
Toolkit to accelerate Azure adoption for enterprise customers
AMD OpenNIC Project Overview
API to automate IP networking management, resource allocation, and provisioning.
Lists of locations & IP addresses of Valve servers
AMD OpenNIC Shell includes the HDL source files
A platform to test reinforcement learning policies in the datacenter setting.
Collaborative Datacenter Simulation and Exploration for Everybody
Run speed tests for all DigitalOcean datacenters faster than ever.
AMD OpenNIC driver includes the Linux kernel driver
MemLiner is a remote-memory-friendly runtime system.
Extension for iTop: Easily manage & visualize your racks, enclosures and datacenter devices.
https://blog.koehntopp.info, previously named https://isotopp.github.io
The official open source ns-3 simulation framework for datacenter network architectures