[07/24/25 10:24:27] INFO Loading textual model 'ViT-B-16-SigLIP-384__webli' to memory
[07/24/25 10:24:27] INFO Setting execution providers to ['CUDAExecutionProvider', 'CPUExecutionProvider'], in descending order of preference
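
For context, the provider list above is the standard ONNX Runtime preference order: the session tries CUDAExecutionProvider first and falls back to CPUExecutionProvider for anything it cannot place on the GPU. A minimal, hypothetical sketch of how such a session is created (the model path and input name are placeholders, not Immich's actual values):

import onnxruntime as ort

# Sketch only: create a session with CUDA preferred and CPU as fallback,
# mirroring the provider order reported in the log above. The model path
# is a placeholder; Immich resolves the real path from its model cache.
session = ort.InferenceSession(
    "textual/model.onnx",
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)

# The call that later fails in the traceback is the equivalent of
# session.run(...) on tokenized text; names/shapes here are illustrative.
# outputs = session.run(None, {"input_ids": token_ids})
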
2025-07-24 10:24:30.816710599 [E:onnxruntime:Default, cuda_call.cc:118 CudaCall] CUDNN failure 5000: CUDNN_STATUS_EXECUTION_FAILED ; GPU=0 ; hostname=dbbeeeb0e665 ; file=/onnxruntime_src/onnxruntime/core/providers/cuda/reduction/reduction_ops.cc ; line=571 ; expr=cudnnReduceTensor( CudaKernel::GetCudnnHandle(cuda_stream), reduce_desc, indices_cuda.get(), indices_bytes, workspace_cuda.get(), workspace_bytes, &one, input_tensor, reinterpret_cast<const CudaT*>(input.Data<T>()), &zero, output_tensor, p_output);
2025-07-24 10:24:30.816818329 [E:onnxruntime:, sequential_executor.cc:516 ExecuteKernel] Non-zero status code returned while running ReduceL2 node. Name:'ReduceL2_1624' Status Message: CUDNN failure 5000: CUDNN_STATUS_EXECUTION_FAILED ; GPU=0 ; hostname=dbbeeeb0e665 ; file=/onnxruntime_src/onnxruntime/core/providers/cuda/reduction/reduction_ops.cc ; line=571 ; expr=cudnnReduceTensor( CudaKernel::GetCudnnHandle(cuda_stream), reduce_desc, indices_cuda.get(), indices_bytes, workspace_cuda.get(), workspace_bytes, &one, input_tensor, reinterpret_cast<const CudaT*>(input.Data<T>()), &zero, output_tensor, p_output);
[07/24/25 10:24:30] ERROR Exception in ASGI application
Traceback (most recent call last)

/app/immich/machine-learning/immich_ml/main.py:177 in predict
  174         inputs = text
  175     else:
  176         raise HTTPException(400, "Either
❱ 177     response = await run_inference(inputs
  178     return ORJSONResponse(response)
  179
  180

/app/immich/machine-learning/immich_ml/main.py:200 in run_inference
  197         response[entry["task"]] = output
  198
  199     without_deps, with_deps = entries
❱ 200     await asyncio.gather(*[_run_inference
  201     if with_deps:
  202         await asyncio.gather(*[_run_infer
  203     if isinstance(payload, Image):

/app/immich/machine-learning/immich_ml/main.py:195 in _run_inference
  192                 message = f"Task {entry['
                      output of {dep}"
  193                 raise HTTPException(400,
  194         model = await load(model)
❱ 195         output = await run(model.predict,
  196         outputs[model.identity] = output
  197         response[entry["task"]] = output
  198

/app/immich/machine-learning/immich_ml/main.py:213 in run
  210     if thread_pool is None:
  211         return func(*args, **kwargs)
  212     partial_func = partial(func, *args, *
❱ 213     return await asyncio.get_running_loop
  214
  215
  216 async def load(model: InferenceModel) ->

/usr/lib/python3.11/concurrent/futures/thread.py:58 in run

/app/immich/machine-learning/immich_ml/models/base.py:61 in predict
  58         self.load()
  59         if model_kwargs:
  60             self.configure(**model_kwargs
❱ 61         return self._predict(*inputs, **m
  62
  63     @abstractmethod
  64     def _predict(self, *inputs: Any, **mo

/app/immich/machine-learning/immich_ml/models/clip/textual.py:24 in _predict
  21
  22     def _predict(self, inputs: str, langu
  23         tokens = self.tokenize(inputs, la
❱ 24         res: NDArray[np.float32] = self.s
  25         return serialize_np_array(res)
  26
  27     def _load(self) -> ModelSession:

/app/immich/machine-learning/immich_ml/sessions/ort.py:49 in run
  46         input_feed: dict[str, NDArray[np.
  47         run_options: Any = None,
  48     ) -> list[NDArray[np.float32]]:
❱ 49         outputs: list[NDArray[np.float32]
             run_options)
  50         return outputs
  51
  52     @property

/lsiopy/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py:220 in run
  217         if not output_names:
  218             output_names = [output.name
  219         try:
❱ 220             return self._sess.run(output
  221         except C.EPFail as err:
  222             if self._enable_fallback:
  223                 print(f"EP Error: {err!s
Fail: [ONNXRuntimeError] : 1 : FAIL : Non-zero status code returned while running ReduceL2 node. Name:'ReduceL2_1624' Status Message: CUDNN failure 5000: CUDNN_STATUS_EXECUTION_FAILED ; GPU=0 ; hostname=dbbeeeb0e665 ; file=/onnxruntime_src/onnxruntime/core/providers/cuda/reduction/reduction_ops.cc ; line=571 ; expr=cudnnReduceTensor( CudaKernel::GetCudnnHandle(cuda_stream), reduce_desc, indices_cuda.get(), indices_bytes, workspace_cuda.get(), workspace_bytes, &one, input_tensor, reinterpret_cast<const CudaT*>(input.Data<T>()), &zero, output_tensor, p_output);
[Nest] 678 - 07/24/2025, 10:24:31 AM WARN [Api:MachineLearningRepository~iszohiy5] Machine learning request to "http://127.0.0.1:3003" failed with status 500: Internal Server Error
[Nest] 678 - 07/24/2025, 10:24:31 AM ERROR [Api:ErrorInterceptor~iszohiy5] Unknown error: Error: Machine learning request '{"clip":{"textual":{"modelName":"ViT-B-16-SigLIP-384__webli","options":{"language":"nl-NL"}}}}' failed for all URLs
Error: Machine learning request '{"clip":{"textual":{"modelName":"ViT-B-16-SigLIP-384__webli","options":{"language":"nl-NL"}}}}' failed for all URLs
    at MachineLearningRepository.predict (/app/immich/server/dist/repositories/machine-learning.repository.js:98:15)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async MachineLearningRepository.encodeText (/app/immich/server/dist/repositories/machine-learning.repository.js:121:26)
    at async SearchService.searchSmart (/app/immich/server/dist/services/search.service.js:84:25)
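
One way to check whether the ReduceL2 failure is specific to the CUDA/cuDNN path (an assumption, not an official Immich fix) is to load the same textual model with only the CPU provider and run the same query; if that succeeds, the problem lies in the GPU stack (driver, cuDNN, onnxruntime-gpu combination) rather than in the model or the request. A minimal sketch, with a placeholder model path:

import onnxruntime as ort

# Diagnostic sketch: force CPU-only execution for the same ONNX model.
# If inference succeeds here, the CUDNN_STATUS_EXECUTION_FAILED above is
# coming from the CUDA execution provider, not from the model or inputs.
cpu_session = ort.InferenceSession(
    "textual/model.onnx",  # placeholder path
    providers=["CPUExecutionProvider"],
)
print(cpu_session.get_providers())  # expect: ['CPUExecutionProvider']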