# document-parser

Here are 13 public repositories matching this topic...

gregbugaj
gregbugaj commented Mar 21, 2022

To improve model performance during CPU inference, we can convert the models to ONNX and then use onnxruntime, if it is available, at inference time.

The following script, check_onnx_runtime.py, can be used to test the performance of the models.

Inference time results (2400×2400 input, ResNet-50): PyTorch 3.616 s vs ONNX 2.131 s.
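The "use onnxruntime if available" idea can be sketched as a runtime check plus a small timing harness. This is a minimal sketch, not the check_onnx_runtime.py script itself: `onnx_available`, `mean_inference_time`, and `pick_backend` are hypothetical helper names, and the actual model export (which in PyTorch would go through `torch.onnx.export`) is omitted.

```python
import importlib.util
import time


def onnx_available() -> bool:
    """Return True if onnxruntime can be imported in this environment.

    find_spec checks for the package without importing it eagerly.
    """
    return importlib.util.find_spec("onnxruntime") is not None


def mean_inference_time(fn, *args, warmup=2, runs=10) -> float:
    """Average wall-clock seconds per call, after a few warmup runs."""
    for _ in range(warmup):
        fn(*args)
    start = time.perf_counter()
    for _ in range(runs):
        fn(*args)
    return (time.perf_counter() - start) / runs


def pick_backend() -> str:
    """Hypothetical selection logic: prefer ONNX when the runtime is installed,
    otherwise fall back to plain PyTorch inference."""
    return "onnx" if onnx_available() else "pytorch"
```

A benchmark like the one reported above would then time the same input through both backends with `mean_inference_time` and compare the averages.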

Labels: enhancement, good first issue
