text-generation-inference
cc212154 - Bump TensorRT-LLM backend dependency to v0.16.0 (#2931)

Committed 1 year ago
Bump TensorRT-LLM backend dependency to v0.16.0 (#2931)

* backend(trtllm): update to 0.16.0
* backend(trtllm): do not use shallow clone
* backend(trtllm): use tag instead
* backend(trtllm): move to nvidia remote instead of hf
* backend(trtllm): reenable shallow clone
* backend(trtllm): attempt to use ADD instead of RUN for openmpi
* backend(trtllm): make sure we are using correct path for openmpi ADD in dockerfile
* backend(trtllm): correctly untar it
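The OpenMPI steps in this commit hinge on Dockerfile `ADD` semantics: `ADD` with a local tar archive in the build context auto-extracts it at the destination, while `ADD` with a remote URL only downloads the file, so an explicit untar step is still required. A minimal sketch of the distinction (the paths and OpenMPI version here are illustrative, not taken from the actual text-generation-inference Dockerfile):

```dockerfile
# ADD with a *local* tarball in the build context unpacks it automatically
# at the destination directory -- no RUN tar step needed.
ADD openmpi-4.1.6.tar.bz2 /opt/openmpi-src/

# ADD with a *remote* URL does NOT auto-extract: it only downloads the file,
# so the archive must still be untarred explicitly in a RUN step.
ADD https://download.open-mpi.org/release/open-mpi/v4.1/openmpi-4.1.6.tar.bz2 /tmp/openmpi.tar.bz2
RUN tar -xjf /tmp/openmpi.tar.bz2 -C /opt/openmpi-src --strip-components=1
```

This is consistent with the trailing fix-ups in the sub-commit list: the `ADD` destination path had to be corrected, and the downloaded archive still had to be untarred.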