To solve the error "ERROR: Preprocessor transform input data failed, nvinfer error: NVDSINFER_CUDA_ERROR", follow the methods below.
ERROR: [TRT]: ../rtSafe/cuda/caskConvolutionRunner.cpp (317) - Cuda Error in allocateContextResources: 700 (an illegal memory access was encountered)
ERROR: [TRT]: FAILED_EXECUTION: std::exception
ERROR: Failed to enqueue inference batch
ERROR: Infer context enqueue buffer failed, nvinfer error:NVDSINFER_TENSORRT_ERROR
0:00:14.231465240 15632 0x3d67d8f0 WARN nvinfer gstnvinfer.cpp:1225:gst_nvinfer_input_queue_loop:<primary-inference> error: Failed to queue input batch for inferencing
Error: gst-stream-error-quark: Failed to queue input batch for inferencing (1): /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvinfer/gstnvinfer.cpp(1225): gst_nvinfer_input_queue_loop (): /GstPipeline:pipeline0/GstNvInfer:primary-inference
ERROR: Failed to make stream wait on event, cuda err_no:77, err_str:cudaErrorIllegalAddress
ERROR: Preprocessor transform input data failed., nvinfer error:NVDSINFER_CUDA_ERROR
How to solve "ERROR: Preprocessor transform input data failed, nvinfer error: NVDSINFER_CUDA_ERROR"?
It is most likely related to the JetPack prebuilt libraries (CUDA/TensorRT). This error generally occurs when there are compatibility issues between packages such as TensorRT, the JetPack version, the DeepStream version, and cuDNN. To resolve it, use the configurations mentioned below.
If you are using DeepStream 5.1, it must run on JetPack 4.5.1, so please make sure your JetPack version is 4.5.1.
If you are using DeepStream 6.0, it should run on JetPack 4.6 with TensorRT 8.0 or above, so please make sure your JetPack version is 4.6.
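Before re-flashing anything, it helps to confirm which versions are actually installed. The sketch below checks the usual Jetson locations; on a non-Jetson host each check simply falls back to a "not found" message, and the exact paths/tools may differ on your image.

```shell
# Sketch: verify installed JetPack (L4T), DeepStream, TensorRT and CUDA versions.
# Standard Jetson locations are assumed; each check degrades gracefully if absent.

# L4T release string; the R/REVISION pair maps to a JetPack release
cat /etc/nv_tegra_release 2>/dev/null || echo "L4T release file not found"

# DeepStream version (deepstream-app ships with the SDK)
deepstream-app --version-all 2>/dev/null || echo "deepstream-app not found"

# TensorRT and cuDNN packages known to dpkg
dpkg -l 2>/dev/null | grep -E 'tensorrt|libcudnn|libnvinfer' \
  || echo "no TensorRT/cuDNN packages found"

# CUDA toolkit version
nvcc --version 2>/dev/null || echo "nvcc not found"
```

If the reported versions do not match the pairings above, that mismatch is almost certainly the cause of the CUDA error.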
DeepStream 5.0 and below
DeepStream 5.0 GA is intended to be used with JetPack 4.4, which includes CUDA 10.2, TensorRT 7.1, and cuDNN 8.0. A cuDNN version mismatch notice points to the same cause. After removing your present CUDA configuration, try installing the correct environment using NVIDIA SDK Manager.
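As a hedged sketch of that cleanup step: the package globs below are the usual JetPack package names and may differ on your image, so the script defaults to a dry run that only prints the commands instead of executing them.

```shell
# Hypothetical cleanup before re-flashing with NVIDIA SDK Manager.
# DRY_RUN=1 prints each command; set DRY_RUN=0 on the actual Jetson
# only after confirming the package names with "dpkg -l".
DRY_RUN=1
run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "+ $*"    # dry run: show the command only
  else
    "$@"           # real run: execute it
  fi
}

# Remove the mismatched CUDA / cuDNN / TensorRT packages
run sudo apt-get purge -y 'cuda-*' 'libcudnn*' 'tensorrt' 'libnvinfer*'
run sudo apt-get autoremove -y

echo "Now flash the matching JetPack release with NVIDIA SDK Manager."
```

Re-flashing with SDK Manager then reinstalls CUDA, cuDNN and TensorRT at the versions that match the chosen JetPack release, which avoids reintroducing the mismatch.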
Hope the above solution works.