runtime error

Exit code: 1. Reason: ead(2220632308 bytes read, 258963596 more expected)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/requests/models.py", line 820, in generate
    yield from self.raw.stream(chunk_size, decode_content=True)
  File "/usr/local/lib/python3.10/site-packages/urllib3/response.py", line 1066, in stream
    data = self.read(amt=amt, decode_content=decode_content)
  File "/usr/local/lib/python3.10/site-packages/urllib3/response.py", line 983, in read
    data = self._raw_read(amt)
  File "/usr/local/lib/python3.10/site-packages/urllib3/response.py", line 878, in _raw_read
    with self._error_catcher():
  File "/usr/local/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/local/lib/python3.10/site-packages/urllib3/response.py", line 778, in _error_catcher
    raise ProtocolError(arg, e) from e
urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(2220632308 bytes read, 258963596 more expected)', IncompleteRead(2220632308 bytes read, 258963596 more expected))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 4, in <module>
    from apiServer import app
  File "/home/user/app/apiServer.py", line 10, in <module>
    model_handler = ModelHandler()
  File "/home/user/app/model_handler.py", line 35, in __init__
    self._load_model()
  File "/home/user/app/model_handler.py", line 83, in _load_model
    self._download_model()
  File "/home/user/app/model_handler.py", line 57, in _download_model
    for chunk in response.iter_content(chunk_size=8192):
  File "/usr/local/lib/python3.10/site-packages/requests/models.py", line 822, in generate
    raise ChunkedEncodingError(e)
requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(2220632308 bytes read, 258963596 more expected)', IncompleteRead(2220632308 bytes read, 258963596 more expected))
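
What the trace shows: the HTTP connection to the model host dropped after 2,220,632,308 bytes of an approximately 2.5 GB file had been streamed, so urllib3 raised IncompleteRead, requests surfaced it as ChunkedEncodingError inside ModelHandler._download_model (model_handler.py, line 57), and because ModelHandler() is constructed at import time in apiServer.py, the whole app exits with code 1.

One way to make the download survive a dropped connection is to retry with an HTTP Range header so the transfer resumes from the bytes already on disk instead of restarting. The sketch below is an illustration under assumptions, not the app's actual code: MODEL_URL, MODEL_PATH, and download_model are hypothetical names, and the resume logic only helps if the server honors Range requests.

```python
# Minimal sketch of a resumable download, assuming a plain HTTP(S) model URL.
# MODEL_URL and MODEL_PATH are hypothetical placeholders, not taken from the app.
import os
import time

import requests

MODEL_URL = "https://example.com/model.bin"   # hypothetical
MODEL_PATH = "/home/user/app/model.bin"       # hypothetical


def download_model(url: str = MODEL_URL, dest: str = MODEL_PATH,
                   max_attempts: int = 5, chunk_size: int = 8192) -> None:
    """Stream the file to disk, retrying and resuming after broken connections."""
    for attempt in range(1, max_attempts + 1):
        # Resume from whatever is already on disk (0 on the first attempt).
        offset = os.path.getsize(dest) if os.path.exists(dest) else 0
        headers = {"Range": f"bytes={offset}-"} if offset else {}
        try:
            with requests.get(url, headers=headers, stream=True,
                              timeout=(10, 60)) as response:
                response.raise_for_status()
                # 206 means the server honored the Range header; a plain 200
                # means it ignored it, so start the file over from scratch.
                mode = "ab" if response.status_code == 206 else "wb"
                with open(dest, mode) as f:
                    for chunk in response.iter_content(chunk_size=chunk_size):
                        f.write(chunk)
            return  # finished without the connection breaking
        except (requests.exceptions.ChunkedEncodingError,
                requests.exceptions.ConnectionError,
                requests.exceptions.Timeout):
            if attempt == max_attempts:
                raise
            # Back off briefly, then retry from the current file size.
            time.sleep(2 ** attempt)
```

If the file is hosted on the Hugging Face Hub, huggingface_hub.hf_hub_download is usually the simpler option, since it already retries, resumes, and caches large files.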
