# DeepFoldProtein/openfold3-trt

TensorRT BF16 engines for OpenFold3 (pairformer + token transformer), built with Bio-TRT.

TensorRT engines are GPU-architecture specific: an engine serialized for one compute capability will not deserialize on a different one. Download the directory matching your GPU's compute capability.
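A minimal sketch of picking the right directory programmatically. The tag format (`<gpu>_SM<major>.<minor>_TRT<version>`) is inferred from the single directory published so far; `engine_tag` and `detect_compute_cap` are illustrative helpers, not part of this repo.

```python
def engine_tag(gpu_name: str, compute_cap: tuple, trt_version: str) -> str:
    """Build a directory tag like 'B200_SM10.0_TRT10.15' (format assumed
    from the published directory name)."""
    major, minor = compute_cap
    return f"{gpu_name}_SM{major}.{minor}_TRT{trt_version}"


def detect_compute_cap() -> tuple:
    """Query the local GPU's compute capability via PyTorch
    (assumes torch with CUDA support is installed)."""
    import torch
    return torch.cuda.get_device_capability(0)


if __name__ == "__main__":
    # On a B200 this should reproduce the directory name in the table below.
    print(engine_tag("B200", (10, 0), "10.15"))
```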

## Available Engines

| Directory | GPU | SM | TRT Version | CUDA | Built |
|---|---|---|---|---|---|
| `B200_SM10.0_TRT10.15/` | NVIDIA B200 | 10.0 | 10.15.1.29 | 13.0 | 2026-04-07 |

## Contents (per GPU directory)

```
<gpu_tag>/
β”œβ”€β”€ pairformer_engine/trt/
β”‚   β”œβ”€β”€ rank0.engine    (~340 MB, BF16)
β”‚   └── config.json
└── token_transformer_engine/trt/
    β”œβ”€β”€ rank0.engine    (~387 MB, BF16)
    └── config.json
```
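After downloading, it can be worth verifying the layout above before launching inference. A small sketch, assuming only the directory structure shown (the `check_engine_dir` helper is hypothetical, not part of this repo):

```python
import json
from pathlib import Path


def check_engine_dir(root: str) -> dict:
    """Verify a downloaded <gpu_tag>/ directory has both engines and
    return the parsed config.json for each, keyed by engine name."""
    base = Path(root)
    configs = {}
    for sub in ("pairformer_engine", "token_transformer_engine"):
        trt = base / sub / "trt"
        engine = trt / "rank0.engine"
        if not engine.is_file():
            raise FileNotFoundError(f"missing {engine}")
        configs[sub] = json.loads((trt / "config.json").read_text())
    return configs
```

Run it against the directory you pass to `--engines-dir`; a `FileNotFoundError` here means an incomplete download.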

## Usage

```shell
# Download engines for your GPU
pip install huggingface-hub
huggingface-cli download DeepFoldProtein/openfold3-trt --local-dir engines \
    --include "B200_SM10.0_TRT10.15/**"

# Run inference
bash openfold3/run_inference.sh \
    --query examples/openfold3/ubiquitin.json \
    --checkpoint /path/to/of3-p2-155k.pt \
    --engines-dir engines/B200_SM10.0_TRT10.15 \
    --output-dir results/ubiquitin
```
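The CLI download can also be done from Python: `huggingface_hub.snapshot_download` takes an `allow_patterns` filter that mirrors the CLI's `--include` flag. A sketch (the `download_engines` wrapper is illustrative; it needs `huggingface_hub` installed and network access):

```python
def engine_pattern(gpu_tag: str) -> str:
    """Glob pattern selecting everything under one GPU directory,
    e.g. 'B200_SM10.0_TRT10.15/**'."""
    return f"{gpu_tag}/**"


def download_engines(gpu_tag: str, local_dir: str = "engines") -> str:
    """Fetch only the engine files for one GPU tag; returns the local
    snapshot path."""
    from huggingface_hub import snapshot_download  # pip install huggingface-hub
    return snapshot_download(
        repo_id="DeepFoldProtein/openfold3-trt",
        local_dir=local_dir,
        allow_patterns=[engine_pattern(gpu_tag)],
    )
```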