PS C:\Users\varun> WARNING: sun.reflect.Reflection.getCallerClass is not supported. This will impact performance.
2022-12-17T12:06:10,522 [INFO ] main org.pytorch.serve.servingsdk.impl.PluginsManager - Loading snapshot serializer plugin...
2022-12-17T12:06:10,549 [INFO ] main org.pytorch.serve.servingsdk.impl.PluginsManager - Initializing plugins manager...
2022-12-17T12:06:10,778 [INFO ] main org.pytorch.serve.ModelServer -
Torchserve version: 0.7.0
TS Home: C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages
Current directory: C:\Users\varun
Temp directory: C:\Users\varun\AppData\Local\Temp
Metrics config path: C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml
Number of GPUs: 0
Number of CPUs: 12
Max heap size: 4018 M
Python executable: C:\Users\varun\AppData\Local\Programs\Python\Python310\python.exe
Config file: logs\config\20221217115823053-startup.cfg
Inference address: http://127.0.0.1:8080
Management address: http://127.0.0.1:8081
Metrics address: http://127.0.0.1:8082
Model Store: C:\Users\varun\model_store
Initial Models: bert=bert.mar
Log dir: C:\Users\varun\logs
Metrics dir: C:\Users\varun\logs
Netty threads: 0
Netty client threads: 0
Default workers per model: 12
Blacklist Regex: N/A
Maximum Response Size: 6553500
Maximum Request Size: 6553500
Limit Maximum Image Pixels: true
Prefer direct buffer: false
Allowed Urls: [file://.*|http(s)?://.*]
Custom python dependency for model allowed: false
Metrics report format: prometheus
Enable metrics API: true
Workflow Store: C:\Users\varun\model_store
Model config: N/A
2022-12-17T12:06:10,783 [INFO ] main org.pytorch.serve.snapshot.SnapshotManager - Started restoring models from snapshot {
  "name": "20221217115823053-startup.cfg",
  "modelCount": 1,
  "created": 1671258503053,
  "models": {
    "bert": {
      "1.0": {
        "defaultVersion": true,
        "marName": "bert.mar",
        "minWorkers": 12,
        "maxWorkers": 12,
        "batchSize": 1,
        "maxBatchDelay": 100,
        "responseTimeout": 120
      }
    }
  }
}
2022-12-17T12:06:10,790 [INFO ] main org.pytorch.serve.snapshot.SnapshotManager - Validating snapshot 20221217115823053-startup.cfg
2022-12-17T12:06:10,791 [INFO ] main org.pytorch.serve.snapshot.SnapshotManager - Snapshot 20221217115823053-startup.cfg validated successfully
2022-12-17T12:06:15,993 [DEBUG] main org.pytorch.serve.wlm.ModelVersionedRefs - Adding new version 1.0 for model bert
2022-12-17T12:06:15,993 [DEBUG] main org.pytorch.serve.wlm.ModelVersionedRefs - Setting default version to 1.0 for model bert
2022-12-17T12:06:15,993 [DEBUG] main org.pytorch.serve.wlm.ModelVersionedRefs - Setting default version to 1.0 for model bert
2022-12-17T12:06:15,994 [INFO ] main org.pytorch.serve.wlm.ModelManager - Model bert loaded.
2022-12-17T12:06:15,994 [DEBUG] main org.pytorch.serve.wlm.ModelManager - updateModel: bert, count: 12
2022-12-17T12:06:16,004 [INFO ] main org.pytorch.serve.ModelServer - Initialize Inference server with: NioServerSocketChannel.
2022-12-17T12:06:16,008 [DEBUG] W-9008-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [C:\Users\varun\AppData\Local\Programs\Python\Python310\python.exe, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py, --sock-type, tcp, --port, 9008, --metrics-config, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml]
2022-12-17T12:06:16,008 [DEBUG] W-9010-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [C:\Users\varun\AppData\Local\Programs\Python\Python310\python.exe, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py, --sock-type, tcp, --port, 9010, --metrics-config, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml]
2022-12-17T12:06:16,008 [DEBUG] W-9001-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [C:\Users\varun\AppData\Local\Programs\Python\Python310\python.exe, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py, --sock-type, tcp, --port, 9001, --metrics-config, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml]
2022-12-17T12:06:16,009 [DEBUG] W-9006-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [C:\Users\varun\AppData\Local\Programs\Python\Python310\python.exe, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py, --sock-type, tcp, --port, 9006, --metrics-config, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml]
2022-12-17T12:06:16,008 [DEBUG] W-9000-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [C:\Users\varun\AppData\Local\Programs\Python\Python310\python.exe, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py, --sock-type, tcp, --port, 9000, --metrics-config, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml]
2022-12-17T12:06:16,008 [DEBUG] W-9004-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [C:\Users\varun\AppData\Local\Programs\Python\Python310\python.exe, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py, --sock-type, tcp, --port, 9004, --metrics-config, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml]
2022-12-17T12:06:16,008 [DEBUG] W-9003-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [C:\Users\varun\AppData\Local\Programs\Python\Python310\python.exe, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py, --sock-type, tcp, --port, 9003, --metrics-config, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml]
2022-12-17T12:06:16,008 [DEBUG] W-9009-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [C:\Users\varun\AppData\Local\Programs\Python\Python310\python.exe, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py, --sock-type, tcp, --port, 9009, --metrics-config, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml]
2022-12-17T12:06:16,008 [DEBUG] W-9005-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [C:\Users\varun\AppData\Local\Programs\Python\Python310\python.exe, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py, --sock-type, tcp, --port, 9005, --metrics-config, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml]
2022-12-17T12:06:16,008 [DEBUG] W-9007-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [C:\Users\varun\AppData\Local\Programs\Python\Python310\python.exe, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py, --sock-type, tcp, --port, 9007, --metrics-config, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml]
2022-12-17T12:06:16,008 [DEBUG] W-9011-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [C:\Users\varun\AppData\Local\Programs\Python\Python310\python.exe, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py, --sock-type, tcp, --port, 9011, --metrics-config, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml]
2022-12-17T12:06:16,008 [DEBUG] W-9002-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - Worker cmdline: [C:\Users\varun\AppData\Local\Programs\Python\Python310\python.exe, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py, --sock-type, tcp, --port, 9002, --metrics-config, C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml]
2022-12-17T12:06:16,298 [INFO ] W-9008-bert_1.0-stdout MODEL_LOG - Listening on port: None
2022-12-17T12:06:16,303 [INFO ] W-9008-bert_1.0-stdout MODEL_LOG - Successfully loaded C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml.
2022-12-17T12:06:16,304 [INFO ] W-9008-bert_1.0-stdout MODEL_LOG - [PID]17944
2022-12-17T12:06:16,305 [INFO ] W-9008-bert_1.0-stdout MODEL_LOG - Torch worker started.
2022-12-17T12:06:16,305 [DEBUG] W-9008-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9008-bert_1.0 State change null -> WORKER_STARTED
2022-12-17T12:06:16,306 [INFO ] W-9008-bert_1.0-stdout MODEL_LOG - Python runtime: 3.10.8
2022-12-17T12:06:16,313 [INFO ] W-9008-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /127.0.0.1:9008
2022-12-17T12:06:16,318 [INFO ] W-9002-bert_1.0-stdout MODEL_LOG - Listening on port: None
2022-12-17T12:06:16,320 [INFO ] W-9011-bert_1.0-stdout MODEL_LOG - Listening on port: None
2022-12-17T12:06:16,324 [INFO ] W-9002-bert_1.0-stdout MODEL_LOG - Successfully loaded C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml.
2022-12-17T12:06:16,324 [INFO ] W-9002-bert_1.0-stdout MODEL_LOG - [PID]18564
2022-12-17T12:06:16,324 [INFO ] W-9011-bert_1.0-stdout MODEL_LOG - Successfully loaded C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml.
2022-12-17T12:06:16,326 [DEBUG] W-9002-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9002-bert_1.0 State change null -> WORKER_STARTED
2022-12-17T12:06:16,326 [INFO ] W-9002-bert_1.0-stdout MODEL_LOG - Torch worker started.
2022-12-17T12:06:16,327 [INFO ] W-9011-bert_1.0-stdout MODEL_LOG - [PID]9012
2022-12-17T12:06:16,327 [INFO ] W-9002-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /127.0.0.1:9002
2022-12-17T12:06:16,326 [INFO ] W-9002-bert_1.0-stdout MODEL_LOG - Python runtime: 3.10.8
2022-12-17T12:06:16,329 [DEBUG] W-9011-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9011-bert_1.0 State change null -> WORKER_STARTED
2022-12-17T12:06:16,330 [INFO ] W-9011-bert_1.0-stdout MODEL_LOG - Torch worker started.
2022-12-17T12:06:16,330 [INFO ] W-9011-bert_1.0-stdout MODEL_LOG - Python runtime: 3.10.8
2022-12-17T12:06:16,330 [INFO ] W-9011-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /127.0.0.1:9011
2022-12-17T12:06:16,332 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - Listening on port: None
2022-12-17T12:06:16,338 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - Successfully loaded C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml.
2022-12-17T12:06:16,339 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - [PID]19204
2022-12-17T12:06:16,340 [INFO ] W-9007-bert_1.0-stdout MODEL_LOG - Listening on port: None
2022-12-17T12:06:16,339 [INFO ] W-9003-bert_1.0-stdout MODEL_LOG - Listening on port: None
2022-12-17T12:06:16,340 [DEBUG] W-9009-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9009-bert_1.0 State change null -> WORKER_STARTED
2022-12-17T12:06:16,340 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - Torch worker started.
2022-12-17T12:06:16,341 [INFO ] W-9009-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /127.0.0.1:9009
2022-12-17T12:06:16,341 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - Python runtime: 3.10.8
2022-12-17T12:06:16,345 [INFO ] W-9007-bert_1.0-stdout MODEL_LOG - Successfully loaded C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml.
2022-12-17T12:06:16,345 [INFO ] W-9003-bert_1.0-stdout MODEL_LOG - Successfully loaded C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml.
2022-12-17T12:06:16,347 [INFO ] W-9007-bert_1.0-stdout MODEL_LOG - [PID]13056
2022-12-17T12:06:16,347 [INFO ] W-9003-bert_1.0-stdout MODEL_LOG - [PID]18316
2022-12-17T12:06:16,348 [DEBUG] W-9007-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9007-bert_1.0 State change null -> WORKER_STARTED
2022-12-17T12:06:16,348 [INFO ] W-9007-bert_1.0-stdout MODEL_LOG - Torch worker started.
2022-12-17T12:06:16,348 [DEBUG] W-9003-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9003-bert_1.0 State change null -> WORKER_STARTED
2022-12-17T12:06:16,348 [INFO ] W-9003-bert_1.0-stdout MODEL_LOG - Torch worker started.
2022-12-17T12:06:16,348 [INFO ] W-9007-bert_1.0-stdout MODEL_LOG - Python runtime: 3.10.8
2022-12-17T12:06:16,348 [INFO ] W-9007-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /127.0.0.1:9007
2022-12-17T12:06:16,348 [INFO ] W-9003-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /127.0.0.1:9003
2022-12-17T12:06:16,349 [INFO ] W-9003-bert_1.0-stdout MODEL_LOG - Python runtime: 3.10.8
2022-12-17T12:06:16,351 [INFO ] W-9006-bert_1.0-stdout MODEL_LOG - Listening on port: None
2022-12-17T12:06:16,352 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - Listening on port: None
2022-12-17T12:06:16,356 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - Successfully loaded C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml.
2022-12-17T12:06:16,356 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - [PID]19908
2022-12-17T12:06:16,356 [INFO ] W-9006-bert_1.0-stdout MODEL_LOG - Successfully loaded C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml.
2022-12-17T12:06:16,357 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - Torch worker started.
2022-12-17T12:06:16,357 [DEBUG] W-9005-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9005-bert_1.0 State change null -> WORKER_STARTED
2022-12-17T12:06:16,357 [INFO ] W-9006-bert_1.0-stdout MODEL_LOG - [PID]20220
2022-12-17T12:06:16,357 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - Python runtime: 3.10.8
2022-12-17T12:06:16,357 [INFO ] W-9005-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /127.0.0.1:9005
2022-12-17T12:06:16,357 [INFO ] W-9006-bert_1.0-stdout MODEL_LOG - Torch worker started.
2022-12-17T12:06:16,357 [DEBUG] W-9006-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9006-bert_1.0 State change null -> WORKER_STARTED
2022-12-17T12:06:16,358 [INFO ] W-9006-bert_1.0-stdout MODEL_LOG - Python runtime: 3.10.8
2022-12-17T12:06:16,358 [INFO ] W-9006-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /127.0.0.1:9006
2022-12-17T12:06:16,358 [INFO ] W-9010-bert_1.0-stdout MODEL_LOG - Listening on port: None
2022-12-17T12:06:16,358 [INFO ] W-9001-bert_1.0-stdout MODEL_LOG - Listening on port: None
2022-12-17T12:06:16,363 [INFO ] W-9001-bert_1.0-stdout MODEL_LOG - Successfully loaded C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml.
2022-12-17T12:06:16,364 [INFO ] W-9010-bert_1.0-stdout MODEL_LOG - Successfully loaded C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml.
2022-12-17T12:06:16,364 [INFO ] W-9001-bert_1.0-stdout MODEL_LOG - [PID]5712
2022-12-17T12:06:16,365 [INFO ] W-9001-bert_1.0-stdout MODEL_LOG - Torch worker started.
2022-12-17T12:06:16,365 [DEBUG] W-9001-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-bert_1.0 State change null -> WORKER_STARTED
2022-12-17T12:06:16,365 [INFO ] W-9010-bert_1.0-stdout MODEL_LOG - [PID]14684
2022-12-17T12:06:16,365 [INFO ] W-9001-bert_1.0-stdout MODEL_LOG - Python runtime: 3.10.8
2022-12-17T12:06:16,365 [INFO ] W-9001-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /127.0.0.1:9001
2022-12-17T12:06:16,365 [INFO ] W-9010-bert_1.0-stdout MODEL_LOG - Torch worker started.
2022-12-17T12:06:16,365 [DEBUG] W-9010-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9010-bert_1.0 State change null -> WORKER_STARTED
2022-12-17T12:06:16,365 [INFO ] W-9010-bert_1.0-stdout MODEL_LOG - Python runtime: 3.10.8
2022-12-17T12:06:16,365 [INFO ] W-9010-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /127.0.0.1:9010
2022-12-17T12:06:16,371 [INFO ] W-9000-bert_1.0-stdout MODEL_LOG - Listening on port: None
2022-12-17T12:06:16,375 [INFO ] W-9000-bert_1.0-stdout MODEL_LOG - Successfully loaded C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml.
2022-12-17T12:06:16,375 [INFO ] W-9000-bert_1.0-stdout MODEL_LOG - [PID]924
2022-12-17T12:06:16,375 [INFO ] W-9000-bert_1.0-stdout MODEL_LOG - Torch worker started.
2022-12-17T12:06:16,375 [DEBUG] W-9000-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-bert_1.0 State change null -> WORKER_STARTED
2022-12-17T12:06:16,376 [INFO ] W-9000-bert_1.0-stdout MODEL_LOG - Python runtime: 3.10.8
2022-12-17T12:06:16,376 [INFO ] W-9000-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /127.0.0.1:9000
2022-12-17T12:06:16,376 [INFO ] W-9004-bert_1.0-stdout MODEL_LOG - Listening on port: None
2022-12-17T12:06:16,379 [INFO ] W-9004-bert_1.0-stdout MODEL_LOG - Successfully loaded C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages/ts/configs/metrics.yaml.
2022-12-17T12:06:16,379 [INFO ] W-9004-bert_1.0-stdout MODEL_LOG - [PID]8360
2022-12-17T12:06:16,379 [INFO ] W-9004-bert_1.0-stdout MODEL_LOG - Torch worker started.
2022-12-17T12:06:16,379 [DEBUG] W-9004-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9004-bert_1.0 State change null -> WORKER_STARTED
2022-12-17T12:06:16,381 [INFO ] W-9004-bert_1.0-stdout MODEL_LOG - Python runtime: 3.10.8
2022-12-17T12:06:16,381 [INFO ] W-9004-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Connecting to: /127.0.0.1:9004
2022-12-17T12:06:16,593 [INFO ] main org.pytorch.serve.ModelServer - Inference API bind to: http://127.0.0.1:8080
2022-12-17T12:06:16,593 [INFO ] main org.pytorch.serve.ModelServer - Initialize Management server with: NioServerSocketChannel.
2022-12-17T12:06:16,596 [INFO ] W-9004-bert_1.0-stdout MODEL_LOG - Connection accepted: ('127.0.0.1', 9004).
2022-12-17T12:06:16,596 [INFO ] W-9011-bert_1.0-stdout MODEL_LOG - Connection accepted: ('127.0.0.1', 9011).
2022-12-17T12:06:16,596 [INFO ] main org.pytorch.serve.ModelServer - Management API bind to: http://127.0.0.1:8081
2022-12-17T12:06:16,596 [INFO ] W-9006-bert_1.0-stdout MODEL_LOG - Connection accepted: ('127.0.0.1', 9006).
2022-12-17T12:06:16,596 [INFO ] W-9000-bert_1.0-stdout MODEL_LOG - Connection accepted: ('127.0.0.1', 9000).
2022-12-17T12:06:16,596 [INFO ] W-9007-bert_1.0-stdout MODEL_LOG - Connection accepted: ('127.0.0.1', 9007).
2022-12-17T12:06:16,596 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - Connection accepted: ('127.0.0.1', 9005).
2022-12-17T12:06:16,598 [INFO ] W-9010-bert_1.0-stdout MODEL_LOG - Connection accepted: ('127.0.0.1', 9010).
2022-12-17T12:06:16,596 [INFO ] W-9002-bert_1.0-stdout MODEL_LOG - Connection accepted: ('127.0.0.1', 9002).
2022-12-17T12:06:16,596 [INFO ] W-9001-bert_1.0-stdout MODEL_LOG - Connection accepted: ('127.0.0.1', 9001).
2022-12-17T12:06:16,598 [INFO ] main org.pytorch.serve.ModelServer - Initialize Metrics server with: NioServerSocketChannel.
2022-12-17T12:06:16,596 [INFO ] W-9008-bert_1.0-stdout MODEL_LOG - Connection accepted: ('127.0.0.1', 9008).
2022-12-17T12:06:16,596 [INFO ] W-9003-bert_1.0-stdout MODEL_LOG - Connection accepted: ('127.0.0.1', 9003).
2022-12-17T12:06:16,596 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - Connection accepted: ('127.0.0.1', 9009).
2022-12-17T12:06:16,600 [INFO ] main org.pytorch.serve.ModelServer - Metrics API bind to: http://127.0.0.1:8082
2022-12-17T12:06:16,602 [INFO ] W-9006-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req. to backend at: 1671258976602
2022-12-17T12:06:16,603 [INFO ] W-9001-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req. to backend at: 1671258976603
2022-12-17T12:06:16,602 [INFO ] W-9000-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req. to backend at: 1671258976602
2022-12-17T12:06:16,603 [INFO ] W-9003-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req. to backend at: 1671258976603
2022-12-17T12:06:16,603 [INFO ] W-9008-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req. to backend at: 1671258976603
2022-12-17T12:06:16,603 [INFO ] W-9009-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req. to backend at: 1671258976603
2022-12-17T12:06:16,603 [INFO ] W-9004-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req. to backend at: 1671258976603
2022-12-17T12:06:16,602 [INFO ] W-9010-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req. to backend at: 1671258976602
2022-12-17T12:06:16,603 [INFO ] W-9002-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req. to backend at: 1671258976603
2022-12-17T12:06:16,602 [INFO ] W-9011-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req. to backend at: 1671258976602
2022-12-17T12:06:16,603 [INFO ] W-9005-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req. to backend at: 1671258976603
2022-12-17T12:06:16,603 [INFO ] W-9007-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Flushing req. to backend at: 1671258976603
2022-12-17T12:06:16,684 [INFO ] W-9003-bert_1.0-stdout MODEL_LOG - model_name: bert, batchSize: 1
2022-12-17T12:06:16,687 [INFO ] W-9010-bert_1.0-stdout MODEL_LOG - model_name: bert, batchSize: 1
2022-12-17T12:06:16,686 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - model_name: bert, batchSize: 1
2022-12-17T12:06:16,688 [INFO ] W-9002-bert_1.0-stdout MODEL_LOG - model_name: bert, batchSize: 1
2022-12-17T12:06:16,686 [INFO ] W-9008-bert_1.0-stdout MODEL_LOG - model_name: bert, batchSize: 1
2022-12-17T12:06:16,684 [INFO ] W-9011-bert_1.0-stdout MODEL_LOG - model_name: bert, batchSize: 1
2022-12-17T12:06:16,684 [INFO ] W-9004-bert_1.0-stdout MODEL_LOG - model_name: bert, batchSize: 1
2022-12-17T12:06:16,684 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - model_name: bert, batchSize: 1
2022-12-17T12:06:16,688 [INFO ] W-9006-bert_1.0-stdout MODEL_LOG - model_name: bert, batchSize: 1
2022-12-17T12:06:16,690 [INFO ] W-9000-bert_1.0-stdout MODEL_LOG - model_name: bert, batchSize: 1
2022-12-17T12:06:16,694 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - Backend worker process died.
2022-12-17T12:06:16,694 [INFO ] W-9010-bert_1.0-stdout MODEL_LOG - Backend worker process died.
2022-12-17T12:06:16,695 [INFO ] W-9000-bert_1.0-stdout MODEL_LOG - Backend worker process died.
2022-12-17T12:06:16,695 [INFO ] W-9001-bert_1.0-stdout MODEL_LOG - model_name: bert, batchSize: 1
2022-12-17T12:06:16,695 [INFO ] W-9000-bert_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-12-17T12:06:16,694 [INFO ] W-9003-bert_1.0-stdout MODEL_LOG - Backend worker process died.
2022-12-17T12:06:16,701 [INFO ] W-9002-bert_1.0-stdout MODEL_LOG - Backend worker process died.
2022-12-17T12:06:16,728 [INFO ] W-9002-bert_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-12-17T12:06:16,695 [INFO ] W-9004-bert_1.0-stdout MODEL_LOG - Backend worker process died.
2022-12-17T12:06:16,695 [INFO ] W-9010-bert_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-12-17T12:06:16,695 [INFO ] W-9006-bert_1.0-stdout MODEL_LOG - Backend worker process died.
2022-12-17T12:06:16,695 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - Backend worker process died.
2022-12-17T12:06:16,694 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-12-17T12:06:16,730 [INFO ] W-9004-bert_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-12-17T12:06:16,731 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-12-17T12:06:16,729 [INFO ] W-9002-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 221, in <module>
2022-12-17T12:06:16,725 [INFO ] nioEventLoopGroup-5-2 org.pytorch.serve.wlm.WorkerThread - 9001 Worker disconnected. WORKER_STARTED
2022-12-17T12:06:16,726 [INFO ] nioEventLoopGroup-5-8 org.pytorch.serve.wlm.WorkerThread - 9007 Worker disconnected. WORKER_STARTED
2022-12-17T12:06:16,725 [INFO ] nioEventLoopGroup-5-6 org.pytorch.serve.wlm.WorkerThread - 9000 Worker disconnected. WORKER_STARTED
2022-12-17T12:06:16,725 [INFO ] nioEventLoopGroup-5-11 org.pytorch.serve.wlm.WorkerThread - 9011 Worker disconnected. WORKER_STARTED
2022-12-17T12:06:16,725 [INFO ] nioEventLoopGroup-5-7 org.pytorch.serve.wlm.WorkerThread - 9009 Worker disconnected. WORKER_STARTED
2022-12-17T12:06:16,725 [INFO ] nioEventLoopGroup-5-4 org.pytorch.serve.wlm.WorkerThread - 9003 Worker disconnected. WORKER_STARTED
2022-12-17T12:06:16,725 [INFO ] nioEventLoopGroup-5-5 org.pytorch.serve.wlm.WorkerThread - 9010 Worker disconnected. WORKER_STARTED
2022-12-17T12:06:16,725 [INFO ] nioEventLoopGroup-5-9 org.pytorch.serve.wlm.WorkerThread - 9006 Worker disconnected. WORKER_STARTED
2022-12-17T12:06:16,725 [INFO ] nioEventLoopGroup-5-10 org.pytorch.serve.wlm.WorkerThread - 9002 Worker disconnected. WORKER_STARTED
2022-12-17T12:06:16,726 [INFO ] W-9003-bert_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-12-17T12:06:16,725 [INFO ] nioEventLoopGroup-5-1 org.pytorch.serve.wlm.WorkerThread - 9005 Worker disconnected. WORKER_STARTED
2022-12-17T12:06:16,725 [INFO ] nioEventLoopGroup-5-3 org.pytorch.serve.wlm.WorkerThread - 9004 Worker disconnected. WORKER_STARTED
2022-12-17T12:06:16,725 [INFO ] W-9011-bert_1.0-stdout MODEL_LOG - Backend worker process died.
2022-12-17T12:06:16,725 [INFO ] nioEventLoopGroup-5-12 org.pytorch.serve.wlm.WorkerThread - 9008 Worker disconnected. WORKER_STARTED
2022-12-17T12:06:16,702 [INFO ] W-9007-bert_1.0-stdout MODEL_LOG - model_name: bert, batchSize: 1
2022-12-17T12:06:16,701 [INFO ] W-9001-bert_1.0-stdout MODEL_LOG - Backend worker process died.
2022-12-17T12:06:16,701 [INFO ] W-9008-bert_1.0-stdout MODEL_LOG - Backend worker process died.
2022-12-17T12:06:16,696 [INFO ] W-9000-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 221, in <module>
2022-12-17T12:06:16,740 [INFO ] W-9007-bert_1.0-stdout MODEL_LOG - Backend worker process died.
2022-12-17T12:06:16,739 [INFO ] W-9011-bert_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-12-17T12:06:16,739 [DEBUG] W-9008-bert_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2022-12-17T12:06:16,738 [DEBUG] W-9004-bert_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2022-12-17T12:06:16,738 [DEBUG] W-9005-bert_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2022-12-17T12:06:16,738 [DEBUG] W-9002-bert_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2022-12-17T12:06:16,737 [DEBUG] W-9006-bert_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2022-12-17T12:06:16,736 [INFO ] W-9010-bert_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9010-bert_1.0-stderr
2022-12-17T12:06:16,746 [INFO ] W-9009-bert_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9009-bert_1.0-stderr
2022-12-17T12:06:16,736 [INFO ] W-9006-bert_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9006-bert_1.0-stderr
2022-12-17T12:06:16,734 [INFO ] W-9005-bert_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9005-bert_1.0-stderr
2022-12-17T12:06:16,734 [DEBUG] W-9003-bert_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2022-12-17T12:06:16,733 [DEBUG] W-9011-bert_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2022-12-17T12:06:16,733 [DEBUG] W-9000-bert_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2022-12-17T12:06:16,732 [DEBUG] W-9001-bert_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2022-12-17T12:06:16,731 [INFO ] W-9010-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 221, in <module>
2022-12-17T12:06:16,731 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 221, in <module>
2022-12-17T12:06:16,731 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 221, in <module>
2022-12-17T12:06:16,730 [INFO ] W-9006-bert_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-12-17T12:06:16,751 [INFO ] W-9010-bert_1.0-stdout MODEL_LOG - worker.run_server()
2022-12-17T12:06:16,751 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - worker.run_server()
2022-12-17T12:06:16,751 [DEBUG] W-9007-bert_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2022-12-17T12:06:16,751 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 189, in run_server
2022-12-17T12:06:16,750 [DEBUG] W-9009-bert_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2022-12-17T12:06:16,750 [DEBUG] W-9010-bert_1.0 org.pytorch.serve.wlm.WorkerThread - System state is : WORKER_STARTED
2022-12-17T12:06:16,747 [INFO ] W-9003-bert_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9003-bert_1.0-stderr
2022-12-17T12:06:16,746 [INFO ] W-9003-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 221, in <module>
2022-12-17T12:06:16,745 [INFO ] W-9000-bert_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-bert_1.0-stderr
2022-12-17T12:06:16,745 [INFO ] W-9011-bert_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9011-bert_1.0-stderr
2022-12-17T12:06:16,742 [INFO ] W-9011-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 221, in <module>
2022-12-17T12:06:16,742 [INFO ] W-9008-bert_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-12-17T12:06:16,742 [INFO ] W-9007-bert_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-12-17T12:06:16,741 [INFO ] W-9001-bert_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-bert_1.0-stderr
2022-12-17T12:06:16,741 [INFO ] W-9004-bert_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9004-bert_1.0-stderr
2022-12-17T12:06:16,741 [INFO ] W-9001-bert_1.0-stdout MODEL_LOG - Traceback (most recent call last):
2022-12-17T12:06:16,755 [INFO ] W-9007-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 221, in <module>
2022-12-17T12:06:16,755 [INFO ] W-9008-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 221, in <module>
2022-12-17T12:06:16,755 [INFO ] W-9011-bert_1.0-stdout MODEL_LOG - worker.run_server()
2022-12-17T12:06:16,754 [INFO ] W-9002-bert_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9002-bert_1.0-stderr
2022-12-17T12:06:16,754 [INFO ] W-9008-bert_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9008-bert_1.0-stderr
2022-12-17T12:06:16,753 [INFO ] W-9007-bert_1.0-stderr org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9007-bert_1.0-stderr
2022-12-17T12:06:16,752 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - self.handle_connection(cl_socket)
2022-12-17T12:06:16,757 [INFO ] W-9003-bert_1.0-stdout MODEL_LOG - worker.run_server()
2022-12-17T12:06:16,752 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - worker.run_server()
2022-12-17T12:06:16,751 [INFO ] W-9006-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 221, in <module>
2022-12-17T12:06:16,751 [INFO ] W-9002-bert_1.0-stdout MODEL_LOG - worker.run_server()
2022-12-17T12:06:16,751 [INFO ] W-9004-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 221, in <module>
2022-12-17T12:06:16,758 [INFO ] W-9002-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 189, in run_server
2022-12-17T12:06:16,758 [INFO ] W-9004-bert_1.0-stdout MODEL_LOG - worker.run_server()
2022-12-17T12:06:16,757 [INFO ] W-9006-bert_1.0-stdout MODEL_LOG - worker.run_server()
2022-12-17T12:06:16,757 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 189, in run_server
2022-12-17T12:06:16,758 [INFO ] W-9006-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 189, in run_server
2022-12-17T12:06:16,757 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 154, in handle_connection
2022-12-17T12:06:16,758 [INFO ] W-9006-bert_1.0-stdout MODEL_LOG - self.handle_connection(cl_socket)
2022-12-17T12:06:16,756 [INFO ] W-9011-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 189, in run_server
2022-12-17T12:06:16,756 [INFO ] W-9007-bert_1.0-stdout MODEL_LOG - worker.run_server()
- 2022-12-17T12:06:16,756 [INFO ] W-9001-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 221, in <module>
- 2022-12-17T12:06:16,756 [INFO ] W-9000-bert_1.0-stdout MODEL_LOG - worker.run_server()
- 2022-12-17T12:06:16,760 [INFO ] W-9001-bert_1.0-stdout MODEL_LOG - worker.run_server()
- 2022-12-17T12:06:16,758 [INFO ] W-9007-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 189, in run_server
- 2022-12-17T12:06:16,758 [INFO ] W-9011-bert_1.0-stdout MODEL_LOG - self.handle_connection(cl_socket)
- 2022-12-17T12:06:16,758 [INFO ] W-9006-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 154, in handle_connection
- 2022-12-17T12:06:16,758 [INFO ] W-9003-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 189, in run_server
- 2022-12-17T12:06:16,758 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - service, result, code = self.load_model(msg)
- 2022-12-17T12:06:16,758 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - self.handle_connection(cl_socket)
- 2022-12-17T12:06:16,758 [INFO ] W-9004-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 189, in run_server
- 2022-12-17T12:06:16,762 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 118, in load_model
- 2022-12-17T12:06:16,758 [INFO ] W-9010-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 189, in run_server
- 2022-12-17T12:06:16,753 [DEBUG] W-9009-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
- java.lang.InterruptedException: null
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056) ~[?:?]
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133) ~[?:?]
- at java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432) ~[?:?]
- at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:191) [model-server.jar:?]
- at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
- at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
- at java.lang.Thread.run(Thread.java:834) [?:?]
- 2022-12-17T12:06:16,743 [DEBUG] W-9004-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
- java.lang.InterruptedException: null
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056) ~[?:?]
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133) ~[?:?]
- at java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432) ~[?:?]
- at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:191) [model-server.jar:?]
- at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
- at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
- at java.lang.Thread.run(Thread.java:834) [?:?]
- 2022-12-17T12:06:16,758 [INFO ] W-9002-bert_1.0-stdout MODEL_LOG - self.handle_connection(cl_socket)
- 2022-12-17T12:06:16,742 [DEBUG] W-9008-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
- java.lang.InterruptedException: null
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056) ~[?:?]
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133) ~[?:?]
- at java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432) ~[?:?]
- at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:191) [model-server.jar:?]
- at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
- at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
- at java.lang.Thread.run(Thread.java:834) [?:?]
- 2022-12-17T12:06:16,763 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - service = model_loader.load(
- 2022-12-17T12:06:16,764 [INFO ] W-9002-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 154, in handle_connection
- 2022-12-17T12:06:16,752 [DEBUG] W-9007-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
- java.lang.InterruptedException: null
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056) ~[?:?]
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133) ~[?:?]
- at java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432) ~[?:?]
- at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:191) [model-server.jar:?]
- at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
- at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
- at java.lang.Thread.run(Thread.java:834) [?:?]
- 2022-12-17T12:06:16,746 [DEBUG] W-9002-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
- java.lang.InterruptedException: null
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056) ~[?:?]
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133) ~[?:?]
- at java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432) ~[?:?]
- at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:191) [model-server.jar:?]
- at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
- at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
- at java.lang.Thread.run(Thread.java:834) [?:?]
- 2022-12-17T12:06:16,750 [DEBUG] W-9000-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
- java.lang.InterruptedException: null
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056) ~[?:?]
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133) ~[?:?]
- at java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432) ~[?:?]
- at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:191) [model-server.jar:?]
- at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
- at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
- at java.lang.Thread.run(Thread.java:834) [?:?]
- 2022-12-17T12:06:16,753 [DEBUG] W-9010-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
- java.lang.InterruptedException: null
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056) ~[?:?]
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133) ~[?:?]
- at java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432) ~[?:?]
- at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:191) [model-server.jar:?]
- at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
- at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
- at java.lang.Thread.run(Thread.java:834) [?:?]
- 2022-12-17T12:06:16,765 [WARN ] W-9000-bert_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: bert, error: Worker died.
- 2022-12-17T12:06:16,750 [DEBUG] W-9011-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
- java.lang.InterruptedException: null
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056) ~[?:?]
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133) ~[?:?]
- at java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432) ~[?:?]
- at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:191) [model-server.jar:?]
- at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
- at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
- at java.lang.Thread.run(Thread.java:834) [?:?]
- 2022-12-17T12:06:16,751 [DEBUG] W-9001-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
- java.lang.InterruptedException: null
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056) ~[?:?]
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133) ~[?:?]
- at java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432) ~[?:?]
- at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:191) [model-server.jar:?]
- at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
- at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
- at java.lang.Thread.run(Thread.java:834) [?:?]
- 2022-12-17T12:06:16,745 [DEBUG] W-9005-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
- java.lang.InterruptedException: null
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056) ~[?:?]
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133) ~[?:?]
- at java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432) ~[?:?]
- at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:191) [model-server.jar:?]
- at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
- at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
- at java.lang.Thread.run(Thread.java:834) [?:?]
- 2022-12-17T12:06:16,765 [WARN ] W-9011-bert_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: bert, error: Worker died.
- 2022-12-17T12:06:16,747 [DEBUG] W-9006-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
- java.lang.InterruptedException: null
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056) ~[?:?]
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133) ~[?:?]
- at java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432) ~[?:?]
- at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:191) [model-server.jar:?]
- at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
- at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
- at java.lang.Thread.run(Thread.java:834) [?:?]
- 2022-12-17T12:06:16,750 [DEBUG] W-9003-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Backend worker monitoring thread interrupted or backend worker process died.
- java.lang.InterruptedException: null
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2056) ~[?:?]
- at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.awaitNanos(AbstractQueuedSynchronizer.java:2133) ~[?:?]
- at java.util.concurrent.ArrayBlockingQueue.poll(ArrayBlockingQueue.java:432) ~[?:?]
- at org.pytorch.serve.wlm.WorkerThread.run(WorkerThread.java:191) [model-server.jar:?]
- at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) [?:?]
- at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
- at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
- at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
- at java.lang.Thread.run(Thread.java:834) [?:?]
- 2022-12-17T12:06:16,762 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 154, in handle_connection
- 2022-12-17T12:06:16,762 [INFO ] W-9004-bert_1.0-stdout MODEL_LOG - self.handle_connection(cl_socket)
- 2022-12-17T12:06:16,761 [INFO ] W-9003-bert_1.0-stdout MODEL_LOG - self.handle_connection(cl_socket)
- 2022-12-17T12:06:16,766 [WARN ] W-9003-bert_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: bert, error: Worker died.
- 2022-12-17T12:06:16,761 [INFO ] W-9006-bert_1.0-stdout MODEL_LOG - service, result, code = self.load_model(msg)
- 2022-12-17T12:06:16,761 [INFO ] W-9011-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 154, in handle_connection
- 2022-12-17T12:06:16,761 [INFO ] W-9007-bert_1.0-stdout MODEL_LOG - self.handle_connection(cl_socket)
- 2022-12-17T12:06:16,761 [INFO ] W-9001-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 189, in run_server
- 2022-12-17T12:06:16,767 [INFO ] W-9006-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 118, in load_model
- 2022-12-17T12:06:16,760 [INFO ] W-9008-bert_1.0-stdout MODEL_LOG - worker.run_server()
- 2022-12-17T12:06:16,767 [INFO ] W-9007-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 154, in handle_connection
- 2022-12-17T12:06:16,760 [INFO ] W-9000-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 189, in run_server
- 2022-12-17T12:06:16,767 [INFO ] W-9011-bert_1.0-stdout MODEL_LOG - service, result, code = self.load_model(msg)
- 2022-12-17T12:06:16,767 [INFO ] W-9000-bert_1.0-stdout MODEL_LOG - self.handle_connection(cl_socket)
- 2022-12-17T12:06:16,767 [DEBUG] W-9003-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9003-bert_1.0 State change WORKER_STARTED -> WORKER_STOPPED
- 2022-12-17T12:06:16,766 [INFO ] W-9003-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 154, in handle_connection
- 2022-12-17T12:06:16,766 [INFO ] W-9004-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 154, in handle_connection
- 2022-12-17T12:06:16,766 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - service, result, code = self.load_model(msg)
- 2022-12-17T12:06:16,766 [WARN ] W-9006-bert_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: bert, error: Worker died.
- 2022-12-17T12:06:16,766 [WARN ] W-9005-bert_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: bert, error: Worker died.
- 2022-12-17T12:06:16,768 [INFO ] W-9004-bert_1.0-stdout MODEL_LOG - service, result, code = self.load_model(msg)
- 2022-12-17T12:06:16,766 [WARN ] W-9001-bert_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: bert, error: Worker died.
- 2022-12-17T12:06:16,765 [DEBUG] W-9000-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9000-bert_1.0 State change WORKER_STARTED -> WORKER_STOPPED
- 2022-12-17T12:06:16,765 [WARN ] W-9010-bert_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: bert, error: Worker died.
- 2022-12-17T12:06:16,768 [DEBUG] W-9001-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9001-bert_1.0 State change WORKER_STARTED -> WORKER_STOPPED
- 2022-12-17T12:06:16,764 [WARN ] W-9002-bert_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: bert, error: Worker died.
- 2022-12-17T12:06:16,764 [INFO ] W-9010-bert_1.0-stdout MODEL_LOG - self.handle_connection(cl_socket)
- 2022-12-17T12:06:16,768 [DEBUG] W-9002-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9002-bert_1.0 State change WORKER_STARTED -> WORKER_STOPPED
- 2022-12-17T12:06:16,764 [WARN ] W-9007-bert_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: bert, error: Worker died.
- 2022-12-17T12:06:16,764 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\site-packages\ts\model_loader.py", line 100, in load
- 2022-12-17T12:06:16,764 [WARN ] W-9008-bert_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: bert, error: Worker died.
- 2022-12-17T12:06:16,764 [WARN ] W-9004-bert_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: bert, error: Worker died.
- 2022-12-17T12:06:16,764 [WARN ] W-9009-bert_1.0 org.pytorch.serve.wlm.BatchAggregator - Load model failed: bert, error: Worker died.
- 2022-12-17T12:06:16,768 [DEBUG] W-9004-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9004-bert_1.0 State change WORKER_STARTED -> WORKER_STOPPED
- 2022-12-17T12:06:16,768 [DEBUG] W-9008-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9008-bert_1.0 State change WORKER_STARTED -> WORKER_STOPPED
- 2022-12-17T12:06:16,768 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - module, function_name = self._load_handler_file(handler)
- 2022-12-17T12:06:16,768 [DEBUG] W-9007-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9007-bert_1.0 State change WORKER_STARTED -> WORKER_STOPPED
- 2022-12-17T12:06:16,768 [WARN ] W-9002-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9002-bert_1.0-stderr
- 2022-12-17T12:06:16,768 [INFO ] W-9010-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 154, in handle_connection
- 2022-12-17T12:06:16,768 [WARN ] W-9001-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9001-bert_1.0-stderr
- 2022-12-17T12:06:16,768 [WARN ] W-9000-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-bert_1.0-stderr
- 2022-12-17T12:06:16,769 [WARN ] W-9002-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9002-bert_1.0-stdout
- 2022-12-17T12:06:16,768 [DEBUG] W-9011-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9011-bert_1.0 State change WORKER_STARTED -> WORKER_STOPPED
- 2022-12-17T12:06:16,768 [DEBUG] W-9005-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9005-bert_1.0 State change WORKER_STARTED -> WORKER_STOPPED
- 2022-12-17T12:06:16,768 [DEBUG] W-9006-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9006-bert_1.0 State change WORKER_STARTED -> WORKER_STOPPED
- 2022-12-17T12:06:16,770 [WARN ] W-9005-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9005-bert_1.0-stderr
- 2022-12-17T12:06:16,768 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 118, in load_model
- 2022-12-17T12:06:16,770 [WARN ] W-9005-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9005-bert_1.0-stdout
- 2022-12-17T12:06:16,767 [INFO ] W-9003-bert_1.0-stdout MODEL_LOG - service, result, code = self.load_model(msg)
- 2022-12-17T12:06:16,770 [INFO ] W-9004-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 118, in load_model
- 2022-12-17T12:06:16,767 [WARN ] W-9003-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9003-bert_1.0-stderr
- 2022-12-17T12:06:16,770 [INFO ] W-9005-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9005 in 1 seconds.
- 2022-12-17T12:06:16,767 [INFO ] W-9000-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 154, in handle_connection
- 2022-12-17T12:06:16,767 [INFO ] W-9001-bert_1.0-stdout MODEL_LOG - self.handle_connection(cl_socket)
- 2022-12-17T12:06:16,767 [INFO ] W-9011-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 118, in load_model
- 2022-12-17T12:06:16,767 [INFO ] W-9008-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 189, in run_server
- 2022-12-17T12:06:16,770 [INFO ] W-9001-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 154, in handle_connection
- 2022-12-17T12:06:16,767 [INFO ] W-9007-bert_1.0-stdout MODEL_LOG - service, result, code = self.load_model(msg)
- 2022-12-17T12:06:16,767 [INFO ] W-9006-bert_1.0-stdout MODEL_LOG - service = model_loader.load(
- 2022-12-17T12:06:16,771 [INFO ] W-9001-bert_1.0-stdout MODEL_LOG - service, result, code = self.load_model(msg)
- 2022-12-17T12:06:16,771 [INFO ] W-9011-bert_1.0-stdout MODEL_LOG - service = model_loader.load(
- 2022-12-17T12:06:16,770 [INFO ] W-9000-bert_1.0-stdout MODEL_LOG - service, result, code = self.load_model(msg)
- 2022-12-17T12:06:16,770 [INFO ] W-9002-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9002 in 1 seconds.
- 2022-12-17T12:06:16,770 [INFO ] W-9004-bert_1.0-stdout MODEL_LOG - service = model_loader.load(
- 2022-12-17T12:06:16,770 [WARN ] W-9003-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9003-bert_1.0-stdout
- 2022-12-17T12:06:16,770 [INFO ] W-9003-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 118, in load_model
- 2022-12-17T12:06:16,770 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - service = model_loader.load(
- 2022-12-17T12:06:16,770 [WARN ] W-9006-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9006-bert_1.0-stderr
- 2022-12-17T12:06:16,772 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\site-packages\ts\model_loader.py", line 100, in load
- 2022-12-17T12:06:16,769 [DEBUG] W-9010-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9010-bert_1.0 State change WORKER_STARTED -> WORKER_STOPPED
- 2022-12-17T12:06:16,769 [WARN ] W-9011-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9011-bert_1.0-stderr
- 2022-12-17T12:06:16,772 [WARN ] W-9010-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9010-bert_1.0-stderr
- 2022-12-17T12:06:16,769 [WARN ] W-9001-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9001-bert_1.0-stdout
- 2022-12-17T12:06:16,769 [WARN ] W-9007-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9007-bert_1.0-stderr
- 2022-12-17T12:06:16,769 [INFO ] W-9010-bert_1.0-stdout MODEL_LOG - service, result, code = self.load_model(msg)
- 2022-12-17T12:06:16,772 [WARN ] W-9007-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9007-bert_1.0-stdout
- 2022-12-17T12:06:16,769 [INFO ] W-9005-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\site-packages\ts\model_loader.py", line 145, in _load_handler_file
- 2022-12-17T12:06:16,769 [WARN ] W-9008-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9008-bert_1.0-stderr
- 2022-12-17T12:06:16,769 [WARN ] W-9004-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9004-bert_1.0-stderr
- 2022-12-17T12:06:16,768 [INFO ] W-9002-bert_1.0-stdout MODEL_LOG - service, result, code = self.load_model(msg)
- 2022-12-17T12:06:16,773 [WARN ] W-9004-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9004-bert_1.0-stdout
- 2022-12-17T12:06:16,768 [DEBUG] W-9009-bert_1.0 org.pytorch.serve.wlm.WorkerThread - W-9009-bert_1.0 State change WORKER_STARTED -> WORKER_STOPPED
- 2022-12-17T12:06:16,773 [INFO ] W-9005-bert_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9005-bert_1.0-stdout
- 2022-12-17T12:06:16,773 [WARN ] W-9009-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9009-bert_1.0-stderr
- 2022-12-17T12:06:16,773 [WARN ] W-9008-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9008-bert_1.0-stdout
- 2022-12-17T12:06:16,772 [INFO ] W-9007-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9007 in 1 seconds.
- 2022-12-17T12:06:16,774 [INFO ] W-9008-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9008 in 1 seconds.
- 2022-12-17T12:06:16,772 [INFO ] W-9010-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 118, in load_model
- 2022-12-17T12:06:16,772 [WARN ] W-9000-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9000-bert_1.0-stdout
- 2022-12-17T12:06:16,772 [INFO ] W-9001-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9001 in 1 seconds.
- 2022-12-17T12:06:16,774 [INFO ] W-9000-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9000 in 1 seconds.
- 2022-12-17T12:06:16,772 [WARN ] W-9010-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9010-bert_1.0-stdout
- 2022-12-17T12:06:16,772 [WARN ] W-9011-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9011-bert_1.0-stdout
- 2022-12-17T12:06:16,774 [INFO ] W-9010-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9010 in 1 seconds.
- 2022-12-17T12:06:16,772 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - module, function_name = self._load_handler_file(handler)
- 2022-12-17T12:06:16,774 [INFO ] W-9011-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9011 in 1 seconds.
- 2022-12-17T12:06:16,772 [WARN ] W-9006-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9006-bert_1.0-stdout
- 2022-12-17T12:06:16,772 [INFO ] W-9003-bert_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9003-bert_1.0-stdout
- 2022-12-17T12:06:16,771 [INFO ] W-9003-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9003 in 1 seconds.
- 2022-12-17T12:06:16,771 [INFO ] W-9004-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\site-packages\ts\model_loader.py", line 100, in load
- 2022-12-17T12:06:16,771 [INFO ] W-9000-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 118, in load_model
- 2022-12-17T12:06:16,771 [INFO ] W-9008-bert_1.0-stdout MODEL_LOG - self.handle_connection(cl_socket)
- 2022-12-17T12:06:16,771 [INFO ] W-9011-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\site-packages\ts\model_loader.py", line 100, in load
- 2022-12-17T12:06:16,771 [INFO ] W-9001-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 118, in load_model
- 2022-12-17T12:06:16,771 [INFO ] W-9006-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\site-packages\ts\model_loader.py", line 100, in load
- 2022-12-17T12:06:16,771 [INFO ] W-9007-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\model_service_worker.py", line 118, in load_model
- 2022-12-17T12:06:16,776 [INFO ] W-9011-bert_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9011-bert_1.0-stdout
- 2022-12-17T12:06:16,775 [INFO ] W-9008-bert_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9008-bert_1.0-stdout
- 2022-12-17T12:06:16,775 [INFO ] W-9000-bert_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9000-bert_1.0-stdout
- 2022-12-17T12:06:16,775 [INFO ] W-9004-bert_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9004-bert_1.0-stdout
- 2022-12-17T12:06:16,774 [INFO ] W-9006-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9006 in 1 seconds.
- 2022-12-17T12:06:16,774 [INFO ] W-9010-bert_1.0-stdout MODEL_LOG - service = model_loader.load(
- 2022-12-17T12:06:16,774 [WARN ] W-9009-bert_1.0 org.pytorch.serve.wlm.WorkerLifeCycle - terminateIOStreams() threadName=W-9009-bert_1.0-stdout
- 2022-12-17T12:06:16,777 [INFO ] W-9007-bert_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9007-bert_1.0-stdout
- 2022-12-17T12:06:16,773 [INFO ] W-9002-bert_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9002-bert_1.0-stdout
- 2022-12-17T12:06:16,773 [INFO ] W-9004-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9004 in 1 seconds.
- 2022-12-17T12:06:16,778 [INFO ] W-9010-bert_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9010-bert_1.0-stdout
- 2022-12-17T12:06:16,777 [INFO ] W-9006-bert_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9006-bert_1.0-stdout
- 2022-12-17T12:06:16,777 [INFO ] W-9009-bert_1.0-stdout MODEL_LOG - File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\site-packages\ts\model_loader.py", line 145, in _load_handler_file
- 2022-12-17T12:06:16,777 [INFO ] W-9001-bert_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9001-bert_1.0-stdout
- 2022-12-17T12:06:16,778 [INFO ] W-9009-bert_1.0 org.pytorch.serve.wlm.WorkerThread - Retry worker: 9009 in 1 seconds.
- 2022-12-17T12:06:16,779 [INFO ] W-9009-bert_1.0-stdout org.pytorch.serve.wlm.WorkerLifeCycle - Stopped Scanner - W-9009-bert_1.0-stdout
- Model server started.
- 2022-12-17T12:06:17,308 [INFO ] pool-3-thread-1 TS_METRICS - CPUUtilization.Percent:0.0|#Level:Host|#hostname:JadeGnome,timestamp:1671258977
- 2022-12-17T12:06:17,309 [INFO ] pool-3-thread-1 TS_METRICS - DiskAvailable.Gigabytes:104.52069854736328|#Level:Host|#hostname:JadeGnome,timestamp:1671258977
- 2022-12-17T12:06:17,310 [INFO ] pool-3-thread-1 TS_METRICS - DiskUsage.Gigabytes:305.57988357543945|#Level:Host|#hostname:JadeGnome,timestamp:1671258977
- 2022-12-17T12:06:17,310 [INFO ] pool-3-thread-1 TS_METRICS - DiskUtilization.Percent:74.5|#Level:Host|#hostname:JadeGnome,timestamp:1671258977
- 2022-12-17T12:06:17,310 [INFO ] pool-3-thread-1 TS_METRICS - MemoryAvailable.Megabytes:7060.0078125|#Level:Host|#hostname:JadeGnome,timestamp:1671258977
- 2022-12-17T12:06:17,310 [INFO ] pool-3-thread-1 TS_METRICS - MemoryUsed.Megabytes:9008.73046875|#Level:Host|#hostname:JadeGnome,timestamp:1671258977
- 2022-12-17T12:06:17,311 [INFO ] pool-3-thread-1 TS_METRICS - MemoryUtilization.Percent:56.1|#Level:Host|#hostname:JadeGnome,timestamp:1671258977
- 2022-12-17T12:06:17,370 [ERROR] Thread-1 org.pytorch.serve.metrics.MetricCollector - --- Logging error ---
- Traceback (most recent call last):
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\_pswindows.py", line 688, in wrapper
- return fun(self, *args, **kwargs)
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\_pswindows.py", line 942, in create_time
- user, system, created = cext.proc_times(self.pid)
- ProcessLookupError: [Errno 3] assume no such process (originated from GetExitCodeProcess != STILL_ACTIVE)
- During handling of the above exception, another exception occurred:
- Traceback (most recent call last):
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\__init__.py", line 361, in _init
- self.create_time()
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\__init__.py", line 714, in create_time
- self._create_time = self._proc.create_time()
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\_pswindows.py", line 690, in wrapper
- raise convert_oserror(err, pid=self.pid, name=self._name)
- psutil.NoSuchProcess: process no longer exists (pid=13056)
- During handling of the above exception, another exception occurred:
- Traceback (most recent call last):
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\site-packages\ts\metrics\process_memory_metric.py", line 20, in get_cpu_usage
- process = psutil.Process(int(pid))
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\__init__.py", line 332, in __init__
- self._init(pid)
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\__init__.py", line 373, in _init
- raise NoSuchProcess(pid, msg='process PID not found')
- psutil.NoSuchProcess: process PID not found (pid=13056)
- During handling of the above exception, another exception occurred:
- Traceback (most recent call last):
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\logging\__init__.py", line 1104, in emit
- self.flush()
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\logging\__init__.py", line 1084, in flush
- self.stream.flush()
- OSError: [Errno 22] Invalid argument
- Call stack:
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\metrics\metric_collector.py", line 29, in <module>
- check_process_mem_usage(sys.stdin)
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\site-packages\ts\metrics\process_memory_metric.py", line 40, in check_process_mem_usage
- logging.info("%s:%d", process, get_cpu_usage(process))
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\site-packages\ts\metrics\process_memory_metric.py", line 22, in get_cpu_usage
- logging.error("Failed get process for pid: %s", pid, exc_info=True)
- Message: 'Failed get process for pid: %s'
- Arguments: ('13056',)
- --- Logging error ---
- Traceback (most recent call last):
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\logging\__init__.py", line 1104, in emit
- self.flush()
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\logging\__init__.py", line 1084, in flush
- self.stream.flush()
- OSError: [Errno 22] Invalid argument
- Call stack:
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\metrics\metric_collector.py", line 29, in <module>
- check_process_mem_usage(sys.stdin)
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\site-packages\ts\metrics\process_memory_metric.py", line 40, in check_process_mem_usage
- logging.info("%s:%d", process, get_cpu_usage(process))
- Message: '%s:%d'
- Arguments: ('13056', 0)
- --- Logging error ---
- Traceback (most recent call last):
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\_pswindows.py", line 688, in wrapper
- return fun(self, *args, **kwargs)
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\_pswindows.py", line 942, in create_time
- user, system, created = cext.proc_times(self.pid)
- ProcessLookupError: [Errno 3] assume no such process (originated from GetExitCodeProcess != STILL_ACTIVE)
- During handling of the above exception, another exception occurred:
- Traceback (most recent call last):
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\__init__.py", line 361, in _init
- self.create_time()
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\__init__.py", line 714, in create_time
- self._create_time = self._proc.create_time()
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\_pswindows.py", line 690, in wrapper
- raise convert_oserror(err, pid=self.pid, name=self._name)
- psutil.NoSuchProcess: process no longer exists (pid=18564)
- During handling of the above exception, another exception occurred:
- Traceback (most recent call last):
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\site-packages\ts\metrics\process_memory_metric.py", line 20, in get_cpu_usage
- process = psutil.Process(int(pid))
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\__init__.py", line 332, in __init__
- self._init(pid)
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\__init__.py", line 373, in _init
- raise NoSuchProcess(pid, msg='process PID not found')
- psutil.NoSuchProcess: process PID not found (pid=18564)
- During handling of the above exception, another exception occurred:
- Traceback (most recent call last):
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\logging\__init__.py", line 1104, in emit
- self.flush()
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\logging\__init__.py", line 1084, in flush
- self.stream.flush()
- OSError: [Errno 22] Invalid argument
- Call stack:
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\metrics\metric_collector.py", line 29, in <module>
- check_process_mem_usage(sys.stdin)
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\site-packages\ts\metrics\process_memory_metric.py", line 40, in check_process_mem_usage
- logging.info("%s:%d", process, get_cpu_usage(process))
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\site-packages\ts\metrics\process_memory_metric.py", line 22, in get_cpu_usage
- logging.error("Failed get process for pid: %s", pid, exc_info=True)
- Message: 'Failed get process for pid: %s'
- Arguments: ('18564',)
- --- Logging error ---
- Traceback (most recent call last):
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\logging\__init__.py", line 1104, in emit
- self.flush()
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\logging\__init__.py", line 1084, in flush
- self.stream.flush()
- OSError: [Errno 22] Invalid argument
- Call stack:
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\Lib\site-packages\ts\metrics\metric_collector.py", line 29, in <module>
- check_process_mem_usage(sys.stdin)
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\site-packages\ts\metrics\process_memory_metric.py", line 40, in check_process_mem_usage
- logging.info("%s:%d", process, get_cpu_usage(process))
- Message: '%s:%d'
- Arguments: ('18564', 0)
- --- Logging error ---
- Traceback (most recent call last):
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\_pswindows.py", line 688, in wrapper
- return fun(self, *args, **kwargs)
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\_pswindows.py", line 942, in create_time
- user, system, created = cext.proc_times(self.pid)
- ProcessLookupError: [Errno 3] assume no such process (originated from GetExitCodeProcess != STILL_ACTIVE)
- During handling of the above exception, another exception occurred:
- Traceback (most recent call last):
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\__init__.py", line 361, in _init
- self.create_time()
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\__init__.py", line 714, in create_time
- self._create_time = self._proc.create_time()
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\_pswindows.py", line 690, in wrapper
- raise convert_oserror(err, pid=self.pid, name=self._name)
- psutil.NoSuchProcess: process no longer exists (pid=19908)
- During handling of the above exception, another exception occurred:
- Traceback (most recent call last):
- File "C:\Users\varun\AppData\Local\Programs\Python\Python310\lib\site-packages\ts\metrics\process_memory_metric.py", line 20, in get_cpu_usage
- process = psutil.Process(int(pid))
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\__init__.py", line 332, in __init__
- self._init(pid)
- File "C:\Users\varun\AppData\Roaming\Python\Python310\site-packages\psutil\__init__.py", line 373, in _init
- raise NoSuchProcess(pid, msg='process PID not found')
- psutil.NoSuchProcess: process PID not found (pid=19908)
- During handling of the above exception, another exception occurred:
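The recurring `psutil.NoSuchProcess` tracebacks above all follow the same pattern: the TorchServe metrics collector polls a worker PID after that worker process has already exited (the workers are in a crash/retry loop, so their PIDs die between polls). A defensive lookup along these lines avoids the noisy traceback; this is a hypothetical sketch, not TorchServe's actual `get_cpu_usage` implementation:

```python
import os

import psutil


def get_cpu_usage_safe(pid):
    """Return CPU percent for pid, or None if the process is gone.

    psutil.Process() raises NoSuchProcess when the PID no longer exists,
    which is exactly what the tracebacks above show for the dead workers.
    Catching it (and AccessDenied) turns a logged stack trace into a
    quiet skip of the vanished process.
    """
    try:
        process = psutil.Process(int(pid))
        # interval=None returns immediately using cached CPU times
        return process.cpu_percent(interval=None)
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        return None
```

The secondary `OSError: [Errno 22] Invalid argument` raised from `self.stream.flush()` inside the logging handler is a separate Windows-specific symptom: the collector's stdout pipe was closed by the time the error was logged, so even the error handler's own flush failed.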