💭
I may be slow to respond.

Organizations

@budatascienceandanalytics

Pinned

  1. TensorRT is a C++ library for high-performance inference on NVIDIA GPUs and deep learning accelerators.

     C++ · 5.7k stars · 1.4k forks

  2. The Triton Inference Server provides an optimized cloud and edge inferencing solution.

     Python · 3.9k stars · 922 forks

  3. Google Sheets Python API v4

     Python · 1.3k stars · 191 forks

  4. Useful scripts for working with TensorRT

     Python · 210 stars · 52 forks

  5. 🔬 Personal research code for analyzing CNNs, starting with a thorough exploration of Stanford's Tiny-ImageNet-200 dataset.

     Python · 68 stars · 24 forks

  6. NumberPhile

     A repository for simulating some of the interesting mathematics problems discussed on the popular YouTube channel Numberphile. One implementation done so far is a visualization of the golden ratio…

     Python · 6 stars

282 contributions in the last year


Contribution activity

July 2022

Created a pull request in triton-inference-server/core that received 5 comments

Verify startup_models (--load-model args) exist

Detect non-existent startup models and raise an error when they fail to load. Server QA test update: triton-inference-server/server#4681

+8 −6 · 5 comments
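A minimal sketch, in Python, of the check this pull request describes: before startup proceeds, each model requested via `--load-model` is verified against the model repository, and an error is raised for any that cannot be found. The actual implementation lives in Triton's C++ core; the function and error message below are hypothetical illustrations, not the real API.

```python
from pathlib import Path


def verify_startup_models(model_repository: str, startup_models: list[str]) -> None:
    """Raise an error for any requested startup model that has no
    corresponding directory in the model repository.

    Hypothetical sketch of the validation described in the PR; Triton's
    real check is implemented in C++ inside the server core.
    """
    repo = Path(model_repository)
    # Each model in a Triton-style repository is a top-level directory.
    available = {p.name for p in repo.iterdir() if p.is_dir()}
    missing = [m for m in startup_models if m not in available]
    if missing:
        raise RuntimeError(
            f"failed to load startup model(s) not found in repository: {missing}"
        )
```

The point of failing fast here is that a typo in a `--load-model` argument surfaces as an immediate startup error instead of a server that silently comes up without the requested model.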
Reviewed 25 pull requests in 6 repositories:

  - triton-inference-server/server — 17 pull requests
  - triton-inference-server/core — 3 pull requests
  - triton-inference-server/client — 2 pull requests
  - triton-inference-server/tensorflow_backend — 1 pull request
  - triton-inference-server/tensorrt_backend — 1 pull request
  - triton-inference-server/backend — 1 pull request
