*** Starting uWSGI 2.0.19.1 (64bit) on [Fri Apr 15 01:39:41 2022] ***
compiled with version: 9.3.0 on 29 July 2021 10:04:23
os: Linux-4.19.0-17-cloud-amd64 #1 SMP Debian 4.19.194-3 (2021-07-18)
nodename: onap-oof-has-api-6cd96d9485-qvptn
machine: x86_64
clock source: unix
pcre jit disabled
detected number of CPU cores: 16
current working directory: /app
writing pidfile to /run/conductor/conductor-uwsgi.pid
detected binary path: /usr/local/bin/uwsgi
your memory page size is 4096 bytes
detected max file descriptor number: 1048576
lock engine: pthread robust mutexes
thunder lock: disabled (you can enable it with --thunder-lock)
uwsgi socket 0 bound to UNIX address /run/conductor/uwsgi.sock fd 3
uwsgi socket 1 bound to TCP address 0.0.0.0:8080 fd 4
Python version: 3.9.1 (default, Dec 11 2020, 14:29:41) [GCC 9.3.0]
Python main interpreter initialized at 0x55939c2b2d80
python threads support enabled
your server socket listen backlog is limited to 100 connections
your mercy for graceful operations on workers is 60 seconds
mapped 510440 bytes (498 KB) for 6 cores
*** Operational MODE: preforking ***
2022-04-15 01:39:46,233||140653869100360|INFO|app|conductor.api.app: [-] Full WSGI config used: /usr/local/etc/conductor/api_paste.ini
WSGI app 0 (mountpoint='') ready in 15 seconds on interpreter 0x55939c2b2d80 pid: 1 (default app)
spawned uWSGI master process (pid: 1)
spawned uWSGI worker 1 (pid: 148, cores: 1)
spawned uWSGI worker 2 (pid: 149, cores: 1)
spawned uWSGI worker 3 (pid: 150, cores: 1)
spawned uWSGI worker 4 (pid: 151, cores: 1)
spawned uWSGI worker 5 (pid: 152, cores: 1)
spawned uWSGI worker 6 (pid: 153, cores: 1)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 3 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
worker 5 lifetime reached, it was running for 301 second(s)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 154)
Respawned uWSGI worker 2 (new pid: 155)
Respawned uWSGI worker 3 (new pid: 156)
Respawned uWSGI worker 4 (new pid: 157)
Respawned uWSGI worker 5 (new pid: 158)
Respawned uWSGI worker 6 (new pid: 159)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 3 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
worker 5 lifetime reached, it was running for 301 second(s)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 160)
Respawned uWSGI worker 2 (new pid: 161)
Respawned uWSGI worker 3 (new pid: 162)
Respawned uWSGI worker 4 (new pid: 163)
Respawned uWSGI worker 5 (new pid: 164)
Respawned uWSGI worker 6 (new pid: 165)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 3 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
worker 5 lifetime reached, it was running for 301 second(s)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 166)
Respawned uWSGI worker 2 (new pid: 167)
Respawned uWSGI worker 3 (new pid: 168)
Respawned uWSGI worker 4 (new pid: 169)
Respawned uWSGI worker 5 (new pid: 170)
Respawned uWSGI worker 6 (new pid: 171)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 3 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
worker 5 lifetime reached, it was running for 301 second(s)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 172)
Respawned uWSGI worker 2 (new pid: 173)
Respawned uWSGI worker 3 (new pid: 174)
Respawned uWSGI worker 4 (new pid: 175)
Respawned uWSGI worker 5 (new pid: 176)
Respawned uWSGI worker 6 (new pid: 177)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 3 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
worker 5 lifetime reached, it was running for 301 second(s)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 178)
Respawned uWSGI worker 2 (new pid: 179)
Respawned uWSGI worker 3 (new pid: 180)
Respawned uWSGI worker 4 (new pid: 181)
Respawned uWSGI worker 5 (new pid: 182)
Respawned uWSGI worker 6 (new pid: 183)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 3 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
worker 5 lifetime reached, it was running for 301 second(s)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 184)
Respawned uWSGI worker 2 (new pid: 185)
Respawned uWSGI worker 3 (new pid: 186)
Respawned uWSGI worker 4 (new pid: 187)
Respawned uWSGI worker 5 (new pid: 188)
Respawned uWSGI worker 6 (new pid: 189)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 3 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
worker 5 lifetime reached, it was running for 301 second(s)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 190)
Respawned uWSGI worker 2 (new pid: 191)
Respawned uWSGI worker 3 (new pid: 192)
Respawned uWSGI worker 4 (new pid: 193)
Respawned uWSGI worker 5 (new pid: 194)
Respawned uWSGI worker 6 (new pid: 195)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 3 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
worker 5 lifetime reached, it was running for 301 second(s)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 196)
Respawned uWSGI worker 2 (new pid: 197)
Respawned uWSGI worker 3 (new pid: 198)
Respawned uWSGI worker 4 (new pid: 199)
Respawned uWSGI worker 5 (new pid: 200)
Respawned uWSGI worker 6 (new pid: 201)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 3 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
worker 5 lifetime reached, it was running for 301 second(s)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 202)
Respawned uWSGI worker 2 (new pid: 203)
Respawned uWSGI worker 3 (new pid: 204)
Respawned uWSGI worker 4 (new pid: 205)
Respawned uWSGI worker 5 (new pid: 206)
Respawned uWSGI worker 6 (new pid: 207)
[pid: 204|app: 0|req: 1/1] 10.233.66.155 () {42 vars in 581 bytes} [Fri Apr 15 02:29:11 2022] GET /nmaplowercheck1649989751 => generated 77 bytes in 11 msecs (HTTP/1.1 404) 8 headers in 202 bytes (1 switches on core 0)
[pid: 207|app: 0|req: 1/2] 10.233.66.155 () {42 vars in 543 bytes} [Fri Apr 15 02:29:11 2022] GET /HNAP1 => generated 77 bytes in 9 msecs (HTTP/1.1 404) 8 headers in 202 bytes (2 switches on core 0)
[pid: 204|app: 0|req: 2/3] 10.233.66.155 () {44 vars in 569 bytes} [Fri Apr 15 02:29:11 2022] POST /sdk => generated 77 bytes in 81 msecs (HTTP/1.1 404) 8 headers in 202 bytes (1 switches on core 0)
[pid: 204|app: 0|req: 3/4] 10.233.66.155 () {42 vars in 553 bytes} [Fri Apr 15 02:29:11 2022] GET /evox/about => generated 77 bytes in 1 msecs (HTTP/1.1 404) 8 headers in 202 bytes (1 switches on core 0)
[pid: 205|app: 0|req: 1/5] 10.233.66.155 () {36 vars in 377 bytes} [Fri Apr 15 02:29:19 2022] GET / => generated 750 bytes in 9 msecs (HTTP/1.0 200) 5 headers in 134 bytes (2 switches on core 0)
[pid: 202|app: 0|req: 1/6] 10.233.66.155 () {38 vars in 405 bytes} [Fri Apr 15 02:29:19 2022] GET / => generated 752 bytes in 11 msecs (HTTP/1.1 200) 5 headers in 134 bytes (2 switches on core 0)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 3 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
worker 5 lifetime reached, it was running for 301 second(s)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 208)
Respawned uWSGI worker 2 (new pid: 209)
Respawned uWSGI worker 3 (new pid: 210)
Respawned uWSGI worker 4 (new pid: 211)
Respawned uWSGI worker 5 (new pid: 212)
Respawned uWSGI worker 6 (new pid: 213)
[pid: 213|app: 0|req: 2/7] 10.233.66.155 () {42 vars in 581 bytes} [Fri Apr 15 02:32:14 2022] GET /nmaplowercheck1649989934 => generated 77 bytes in 10 msecs (HTTP/1.1 404) 8 headers in 202 bytes (1 switches on core 0)
[pid: 211|app: 0|req: 2/8] 10.233.66.155 () {42 vars in 553 bytes} [Fri Apr 15 02:32:14 2022] GET /evox/about => generated 77 bytes in 6 msecs (HTTP/1.1 404) 8 headers in 202 bytes (1 switches on core 0)
[pid: 212|app: 0|req: 1/9] 10.233.66.155 () {44 vars in 569 bytes} [Fri Apr 15 02:32:15 2022] POST /sdk => generated 77 bytes in 11 msecs (HTTP/1.1 404) 8 headers in 202 bytes (2 switches on core 0)
[pid: 209|app: 0|req: 1/10] 10.233.66.155 () {42 vars in 543 bytes} [Fri Apr 15 02:32:15 2022] GET /HNAP1 => generated 77 bytes in 6 msecs (HTTP/1.1 404) 8 headers in 202 bytes (2 switches on core 0)
[pid: 211|app: 0|req: 3/11] 10.233.66.155 () {36 vars in 377 bytes} [Fri Apr 15 02:32:21 2022] GET / => generated 750 bytes in 66 msecs (HTTP/1.0 200) 5 headers in 134 bytes (1 switches on core 0)
[pid: 213|app: 0|req: 3/12] 10.233.66.155 () {38 vars in 405 bytes} [Fri Apr 15 02:32:21 2022] GET / => generated 752 bytes in 71 msecs (HTTP/1.1 200) 5 headers in 134 bytes (2 switches on core 0)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 3 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
worker 5 lifetime reached, it was running for 301 second(s)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 219)
Respawned uWSGI worker 2 (new pid: 220)
Respawned uWSGI worker 3 (new pid: 221)
Respawned uWSGI worker 4 (new pid: 222)
Respawned uWSGI worker 5 (new pid: 223)
Respawned uWSGI worker 6 (new pid: 224)
2022-04-15 02:36:03,464||140653869100360|INFO|component|conductor.common.music.messaging.component: [-] Message 1e68dd11-d4a1-4496-8930-d65cb23b041f on topic controller enqueued.
[pid: 223|app: 0|req: 2/13] 10.233.71.22 () {46 vars in 586 bytes} [Fri Apr 15 02:35:59 2022] GET /v1/plans/healthcheck => generated 159 bytes in 56978 msecs (HTTP/1.1 200) 5 headers in 134 bytes (1 switches on core 0)
2022-04-15 02:37:56,212||140653869100360|INFO|component|conductor.common.music.messaging.component: [-] Message 9f58f811-912e-4c9b-bbe9-48e6255c7f0e on topic controller enqueued.
[pid: 219|app: 0|req: 2/14] 10.233.72.242 () {46 vars in 589 bytes} [Fri Apr 15 02:37:52 2022] GET /v1/plans/healthcheck => generated 159 bytes in 52179 msecs (HTTP/1.1 200) 5 headers in 134 bytes (1 switches on core 0)
2022-04-15 02:39:53,160||140653869100360|INFO|component|conductor.common.music.messaging.component: [-] Message 08c0d04f-d7c1-4835-9e45-b888340225f0 on topic controller enqueued.
2022-04-15 02:39:59,285||140653869100360|INFO|component|conductor.common.music.messaging.component: [-] Message aba53457-28b6-470b-a97e-1132cd526c5b on topic controller enqueued.
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 3 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
worker 5 lifetime reached, it was running for 301 second(s)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 348)
Respawned uWSGI worker 2 (new pid: 349)
Respawned uWSGI worker 3 (new pid: 350)
Respawned uWSGI worker 4 (new pid: 351)
2022-04-15 02:40:36,858||140653869100360|ERROR|component|conductor.common.music.messaging.component: [-] Message 08c0d04f-d7c1-4835-9e45-b888340225f0 on topic controller returned an error
2022-04-15 02:40:38,760||140653869100360|CRITICAL|log|conductor: [-] Unhandled error
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/pecan/routing.py", line 141, in lookup_controller
    obj, remainder = find_object(obj, remainder, notfound_handlers,
  File "/usr/local/lib/python3.9/site-packages/pecan/routing.py", line 197, in find_object
    raise PecanNotFound
pecan.routing.PecanNotFound

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/webob/dec.py", line 129, in __call__
    resp = self.call_func(req, *args, **kw)
  File "/usr/local/lib/python3.9/site-packages/webob/dec.py", line 193, in call_func
    return self.func(req, *args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/oslo_middleware/base.py", line 124, in __call__
    response = req.get_response(self.application)
  File "/usr/local/lib/python3.9/site-packages/webob/request.py", line 1313, in send
    status, headers, app_iter = self.call_application(
  File "/usr/local/lib/python3.9/site-packages/webob/request.py", line 1278, in call_application
    app_iter = application(self.environ, start_response)
  File "/usr/local/lib/python3.9/site-packages/webob/dec.py", line 129, in __call__
    resp = self.call_func(req, *args, **kw)
  File "/usr/local/lib/python3.9/site-packages/webob/dec.py", line 193, in call_func
    return self.func(req, *args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/oslo_middleware/base.py", line 124, in __call__
    response = req.get_response(self.application)
  File "/usr/local/lib/python3.9/site-packages/webob/request.py", line 1313, in send
    status, headers, app_iter = self.call_application(
  File "/usr/local/lib/python3.9/site-packages/webob/request.py", line 1278, in call_application
    app_iter = application(self.environ, start_response)
  File "/usr/local/lib/python3.9/site-packages/pecan/middleware/recursive.py", line 56, in __call__
    return self.application(environ, start_response)
  File "/opt/has/conductor/conductor/api/middleware.py", line 130, in __call__
    app_iter = self.app(environ, replacement_start_response)
  File "/usr/local/lib/python3.9/site-packages/pecan/core.py", line 852, in __call__
    return super(Pecan, self).__call__(environ, start_response)
  File "/usr/local/lib/python3.9/site-packages/pecan/core.py", line 692, in __call__
    controller, args, kwargs = self.find_controller(state)
  File "/usr/local/lib/python3.9/site-packages/pecan/core.py", line 870, in find_controller
    controller, args, kw = super(Pecan, self).find_controller(_state)
  File "/usr/local/lib/python3.9/site-packages/pecan/core.py", line 460, in find_controller
    controller, remainder = self.route(req, self.root, path)
  File "/usr/local/lib/python3.9/site-packages/pecan/core.py", line 293, in route
    node, remainder = lookup_controller(node, path, req)
  File "/usr/local/lib/python3.9/site-packages/pecan/routing.py", line 158, in lookup_controller
    result = handle_lookup_traversal(obj, remainder)
  File "/usr/local/lib/python3.9/site-packages/pecan/routing.py", line 175, in handle_lookup_traversal
    result = obj(*args)
  File "/opt/has/conductor/conductor/api/controllers/v1/plans.py", line 311, in _lookup
    return PlansItemController(uuid4), remainder
  File "/opt/has/conductor/conductor/api/controllers/v1/plans.py", line 190, in __init__
    self.plan = self.plans_get(plan_id=self.uuid)
  File "/opt/has/conductor/conductor/api/controllers/v1/plans.py", line 95, in plans_get
    return self.plan_getid(plan_id)
  File "/opt/has/conductor/conductor/api/controllers/v1/plans.py", line 110, in plan_getid
    result = client.call(ctx, method, args)
  File "/opt/has/conductor/conductor/common/music/messaging/component.py", line 239, in call
    raise rpc_common.deserialize_remote_exception(failure, allowed)
oslo_messaging.rpc.client.RemoteError: Remote error: EtcdClientException Failed to establish connection with ETCD. GRPC StatusCode.UNAVAILABLE
Traceback (most recent call last):
  File "/opt/has/conductor/conductor/common/etcd/api.py", line 60, in get_client
    return etcd3.client(host=self.host, port=self.port,
  File "/usr/local/lib/python3.9/site-packages/etcd3/client.py", line 1178, in client
    return Etcd3Client(host=host,
  File "/usr/local/lib/python3.9/site-packages/etcd3/client.py", line 148, in __init__
    resp = self.auth_stub.Authenticate(auth_request, self.timeout)
  File "/usr/local/lib/python3.9/site-packages/grpc/_channel.py", line 946, in __call__
    return _end_unary_response_blocking(state, call, False, None)
  File "/usr/local/lib/python3.9/site-packages/grpc/_channel.py", line 849, in _end_unary_response_blocking
    raise _InactiveRpcError(state)
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "etcdserver: request timed out"
	debug_error_string = "{"created":"@1649990416.459616152","description":"Error received from peer ipv4:10.233.76.227:2379","file":"src/core/lib/surface/call.cc","file_line":1069,"grpc_message":"etcdserver: request timed out","grpc_status":14}"
>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/has/conductor/conductor/common/music/messaging/component.py", line 413, in _do
    result = method(msg.ctxt, msg.args)
  File "/opt/has/conductor/conductor/controller/rpc.py", line 89, in plans_get
    plans = self.Plan.query.get_plan_by_col('id', plan_id)
  File "/opt/has/conductor/conductor/common/music/model/search.py", line 87, in get_plan_by_col
    rows = db_backend.DB_API.row_read(
  File "/opt/has/conductor/conductor/common/etcd/api.py", line 130, in row_read
    schema = self.get_value(f'{keyspace}/{table}')
  File "/opt/has/conductor/conductor/common/etcd/api.py", line 75, in get_value
    raw_value = self.get_raw_value(key)
  File "/opt/has/conductor/conductor/common/etcd/api.py", line 72, in get_raw_value
    return self.get_client().get(key)[0]
  File "/opt/has/conductor/conductor/common/etcd/api.py", line 69, in get_client
    raise EtcdClientException("Failed to establish connection with ETCD. GRPC {}".format(rpc_error.code()))
conductor.common.etcd.utils.EtcdClientException: Failed to establish connection with ETCD. GRPC StatusCode.UNAVAILABLE
[pid: 224|app: 0|req: 4/15] 10.233.71.217 () {46 vars in 589 bytes} [Fri Apr 15 02:39:49 2022] GET /v1/plans/healthcheck => generated 0 bytes in 49622 msecs (HTTP/1.1 500) 0 headers in 0 bytes (1 switches on core 0)
Respawned uWSGI worker 6 (new pid: 378)
[pid: 223|app: 0|req: 3/16] 10.233.72.219 () {46 vars in 589 bytes} [Fri Apr 15 02:39:43 2022] GET /v1/plans/healthcheck => generated 159 bytes in 86821 msecs (HTTP/1.1 200) 5 headers in 134 bytes (1 switches on core 0)
Respawned uWSGI worker 5 (new pid: 400)
2022-04-15 02:41:31,348||140653869100360|INFO|component|conductor.common.music.messaging.component: [-] Message 3170c050-6ec9-4efc-87c5-0bc173d628c3 on topic controller enqueued.
[pid: 400|app: 0|req: 4/17] 10.233.71.137 () {46 vars in 589 bytes} [Fri Apr 15 02:41:24 2022] GET /v1/plans/healthcheck => generated 159 bytes in 45929 msecs (HTTP/1.1 200) 5 headers in 134 bytes (2 switches on core 0)
2022-04-15 02:43:03,188||140653869100360|INFO|component|conductor.common.music.messaging.component: [-] Message e053340f-48e0-4060-9fe7-ba53d8809603 on topic controller enqueued.
[pid: 350|app: 0|req: 4/18] 10.233.72.87 () {46 vars in 586 bytes} [Fri Apr 15 02:42:57 2022] GET /v1/plans/healthcheck => generated 159 bytes in 57499 msecs (HTTP/1.1 200) 5 headers in 134 bytes (1 switches on core 0)
2022-04-15 02:45:07,659||140653869100360|INFO|component|conductor.common.music.messaging.component: [-] Message 60e7a801-61be-4cd8-b352-2c3a528003fe on topic controller enqueued.
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 3 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 511)
Respawned uWSGI worker 2 (new pid: 512)
Respawned uWSGI worker 3 (new pid: 513)
Respawned uWSGI worker 4 (new pid: 514)
worker 6 lifetime reached, it was running for 301 second(s)
[pid: 378|app: 0|req: 5/19] 10.233.72.67 () {46 vars in 586 bytes} [Fri Apr 15 02:44:53 2022] GET /v1/plans/healthcheck => generated 159 bytes in 64875 msecs (HTTP/1.1 200) 5 headers in 134 bytes (1 switches on core 0)
Respawned uWSGI worker 6 (new pid: 557)
worker 5 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 5 (new pid: 558)
2022-04-15 02:47:23,886||140653869100360|INFO|component|conductor.common.music.messaging.component: [-] Message 20ab69ec-6edd-4117-a5a8-86d1b19f611c on topic controller enqueued.
[pid: 511|app: 0|req: 3/20] 10.233.71.55 () {46 vars in 586 bytes} [Fri Apr 15 02:47:14 2022] GET /v1/plans/healthcheck => generated 159 bytes in 49536 msecs (HTTP/1.1 200) 5 headers in 134 bytes (1 switches on core 0)
2022-04-15 02:50:07,660||140653869100360|INFO|component|conductor.common.music.messaging.component: [-] Message 8a613b52-f951-41fc-a94b-88b04fbbff6c on topic controller enqueued.
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 3 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 611)
Respawned uWSGI worker 2 (new pid: 612)
Respawned uWSGI worker 4 (new pid: 613)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 6 (new pid: 656)
worker 5 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 5 (new pid: 668)
[pid: 513|app: 0|req: 5/21] 10.233.72.108 () {46 vars in 589 bytes} [Fri Apr 15 02:49:57 2022] GET /v1/plans/healthcheck => generated 159 bytes in 77979 msecs (HTTP/1.1 200) 5 headers in 134 bytes (1 switches on core 0)
Respawned uWSGI worker 3 (new pid: 669)
2022-04-15 02:52:53,286||140653869100360|INFO|component|conductor.common.music.messaging.component: [-] Message be244e87-a726-49f8-9e08-a92dbfeded63 on topic controller enqueued.
[pid: 668|app: 0|req: 5/22] 10.233.72.225 () {46 vars in 589 bytes} [Fri Apr 15 02:52:43 2022] GET /v1/plans/healthcheck => generated 159 bytes in 54478 msecs (HTTP/1.1 200) 5 headers in 134 bytes (1 switches on core 0)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 721)
Respawned uWSGI worker 2 (new pid: 722)
Respawned uWSGI worker 4 (new pid: 723)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 6 (new pid: 724)
worker 5 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 5 (new pid: 725)
worker 3 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 3 (new pid: 726)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 727)
Respawned uWSGI worker 2 (new pid: 728)
Respawned uWSGI worker 4 (new pid: 729)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 6 (new pid: 730)
worker 5 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 5 (new pid: 731)
worker 3 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 3 (new pid: 732)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 733)
Respawned uWSGI worker 2 (new pid: 734)
Respawned uWSGI worker 4 (new pid: 735)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 6 (new pid: 736)
worker 5 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 5 (new pid: 737)
worker 3 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 3 (new pid: 738)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 739)
Respawned uWSGI worker 2 (new pid: 740)
Respawned uWSGI worker 4 (new pid: 741)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 6 (new pid: 742)
worker 5 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 5 (new pid: 743)
worker 3 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 3 (new pid: 744)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 745)
Respawned uWSGI worker 2 (new pid: 746)
Respawned uWSGI worker 4 (new pid: 747)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 6 (new pid: 748)
worker 5 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 5 (new pid: 749)
worker 3 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 3 (new pid: 750)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 751)
Respawned uWSGI worker 2 (new pid: 752)
Respawned uWSGI worker 4 (new pid: 753)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 6 (new pid: 754)
worker 5 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 5 (new pid: 755)
worker 3 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 3 (new pid: 756)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 757)
Respawned uWSGI worker 2 (new pid: 758)
Respawned uWSGI worker 4 (new pid: 759)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 6 (new pid: 760)
worker 5 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 5 (new pid: 761)
worker 3 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 3 (new pid: 762)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 763)
Respawned uWSGI worker 2 (new pid: 764)
Respawned uWSGI worker 4 (new pid: 765)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 6 (new pid: 766)
worker 5 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 5 (new pid: 767)
worker 3 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 3 (new pid: 768)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 769)
Respawned uWSGI worker 2 (new pid: 770)
Respawned uWSGI worker 4 (new pid: 771)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 6 (new pid: 772)
worker 5 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 5 (new pid: 773)
worker 3 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 3 (new pid: 774)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 775)
Respawned uWSGI worker 2 (new pid: 776)
Respawned uWSGI worker 4 (new pid: 777)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 6 (new pid: 778)
worker 5 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 5 (new pid: 779)
worker 3 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 3 (new pid: 780)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 781)
Respawned uWSGI worker 2 (new pid: 782)
Respawned uWSGI worker 4 (new pid: 783)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 6 (new pid: 784)
worker 5 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 5 (new pid: 785)
worker 3 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 3 (new pid: 786)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 787)
Respawned uWSGI worker 2 (new pid: 788)
Respawned uWSGI worker 4 (new pid: 789)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 6 (new pid: 790)
worker 5 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 5 (new pid: 791)
worker 3 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 3 (new pid: 792)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 793)
Respawned uWSGI worker 2 (new pid: 794)
Respawned uWSGI worker 4 (new pid: 795)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 6 (new pid: 796)
worker 5 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 5 (new pid: 797)
worker 3 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 3 (new pid: 798)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 799)
Respawned uWSGI worker 2 (new pid: 800)
Respawned uWSGI worker 4 (new pid: 801)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 6 (new pid: 802)
worker 5 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 5 (new pid: 803)
worker 3 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 3 (new pid: 804)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 805)
Respawned uWSGI worker 2 (new pid: 806)
Respawned uWSGI worker 4 (new pid: 807)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 6 (new pid: 808)
worker 5 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 5 (new pid: 809)
worker 3 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 3 (new pid: 810)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 811)
Respawned uWSGI worker 2 (new pid: 812)
Respawned uWSGI worker 4 (new pid: 813)