d7f32e25 - Fix CUDA minimal build requiring CUDNN_HOME (#27308)

When building with `onnxruntime_CUDA_MINIMAL=ON`, cuDNN is not needed, since minimal CUDA builds only use the basic CUDA runtime (e.g., for TensorRT-only builds). Skip the `cuDNN.cmake` inclusion and `CUDNN_INCLUDE_DIR` references when `CUDA_MINIMAL` is enabled.

Fixes #24361

### Description

Make the cuDNN dependency conditional on `NOT onnxruntime_CUDA_MINIMAL`:

- `cmake/external/onnxruntime_external_deps.cmake`: skip `include(cuDNN)`
- `cmake/onnxruntime_python.cmake`: skip `CUDNN_INCLUDE_DIR` in include paths
- `cmake/onnxruntime_unittests.cmake`: skip `CUDNN_INCLUDE_DIR` in two places

### Motivation and Context

When building with `onnxruntime_CUDA_MINIMAL=ON` (e.g., TensorRT-only builds), the build fails with:

```
CMake Error at external/cuDNN.cmake:3 (find_path):
  Could not find CUDNN_INCLUDE_DIR using the following files: cudnn.h
```

cuDNN is not needed for minimal CUDA builds, since TensorRT ships its own optimized kernels. This fix allows building without setting `CUDNN_HOME`.
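The guard described above can be sketched roughly as follows. This is a minimal illustration, not the exact diff: the surrounding conditions and target names (e.g. which targets receive `CUDNN_INCLUDE_DIR`) are assumptions about the build scripts.

```cmake
# Sketch for cmake/external/onnxruntime_external_deps.cmake:
# pull in cuDNN only when the full CUDA EP is built, not for minimal builds.
if (onnxruntime_USE_CUDA AND NOT onnxruntime_CUDA_MINIMAL)
  include(cuDNN)
endif()

# Sketch for the include-path sites (onnxruntime_python.cmake /
# onnxruntime_unittests.cmake); the target name here is illustrative.
if (NOT onnxruntime_CUDA_MINIMAL)
  target_include_directories(some_cuda_target PRIVATE ${CUDNN_INCLUDE_DIR})
endif()
```

With the guard in place, `find_path` in `cuDNN.cmake` is never reached for minimal builds, so `CUDNN_HOME` no longer needs to be set.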