
Polygraphy Trtexec: Extension to run on trtexec backend

Project description

Extending polygraphy run to support trtexec

Introduction

polygraphy run allows you to run inference with multiple backends, including TensorRT and ONNX-Runtime, and compare outputs. This extension adds support for running inference with trtexec.
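For example, once the extension is installed (see the steps below), the trtexec backend can be compared against ONNX-Runtime in a single invocation. The sketch below assumes a model named sample.onnx and uses polygraphy run's existing --onnxrt flag:

    polygraphy run sample.onnx --trtexec --onnxrt

polygraphy run then runs inference with both backends and reports whether their outputs match within tolerance.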

Installation

Follow the steps below to install the extension module. After installation, you should see the trtexec options in the help output of polygraphy run:

  1. Build using setup.py:

    python3 setup.py bdist_wheel
    
  2. Install the wheel: The built wheel is placed in the dist directory. Install it by running the following command:

    python3 -m pip install dist/polygraphy_trtexec-*.whl \
        --extra-index-url https://pypi.ngc.nvidia.com
    

    NOTE: You may have to update the above command to match the version of the wheel that was built.

  3. After the installation, you can run inference with the trtexec backend by passing the --trtexec flag as follows:

    polygraphy run sample.onnx --trtexec
    
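As a quick sanity check after installation, the trtexec options mentioned above should appear in the help output of polygraphy run. One way to confirm this (assuming a Unix-like shell with grep available):

    polygraphy run --help | grep -i trtexec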
