Results

By type

          *** Starting uWSGI 2.0.20 (64bit) on [Fri May 13 06:44:33 2022] ***
compiled with version: 9.3.0 on 30 March 2022 11:08:40
os: Linux-5.4.0-96-generic #109-Ubuntu SMP Wed Jan 12 16:49:16 UTC 2022
nodename: onap-oof-has-api-69c868c78c-cvxrn
machine: x86_64
clock source: unix
pcre jit disabled
detected number of CPU cores: 8
current working directory: /app
writing pidfile to /run/conductor/conductor-uwsgi.pid
detected binary path: /usr/local/bin/uwsgi
your memory page size is 4096 bytes
detected max file descriptor number: 1048576
lock engine: pthread robust mutexes
thunder lock: disabled (you can enable it with --thunder-lock)
uwsgi socket 0 bound to UNIX address /run/conductor/uwsgi.sock fd 3
uwsgi socket 1 bound to TCP address 0.0.0.0:8080 fd 4
Python version: 3.9.1 (default, Dec 11 2020, 14:29:41)  [GCC 9.3.0]
Python main interpreter initialized at 0x557b51761dc0
python threads support enabled
your server socket listen backlog is limited to 100 connections
your mercy for graceful operations on workers is 60 seconds
mapped 510440 bytes (498 KB) for 6 cores
*** Operational MODE: preforking ***
2022-05-13 06:44:35,352||140084918299976|INFO|app|conductor.api.app: [-] Full WSGI config used: /usr/local/etc/conductor/api_paste.ini
WSGI app 0 (mountpoint='') ready in 2 seconds on interpreter 0x557b51761dc0 pid: 1 (default app)
spawned uWSGI master process (pid: 1)
spawned uWSGI worker 1 (pid: 102, cores: 1)
spawned uWSGI worker 2 (pid: 103, cores: 1)
spawned uWSGI worker 3 (pid: 104, cores: 1)
spawned uWSGI worker 4 (pid: 105, cores: 1)
spawned uWSGI worker 5 (pid: 106, cores: 1)
spawned uWSGI worker 6 (pid: 107, cores: 1)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 3 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
worker 5 lifetime reached, it was running for 301 second(s)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 108)
Respawned uWSGI worker 2 (new pid: 109)
Respawned uWSGI worker 3 (new pid: 110)
Respawned uWSGI worker 4 (new pid: 111)
Respawned uWSGI worker 5 (new pid: 112)
Respawned uWSGI worker 6 (new pid: 113)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 3 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
worker 5 lifetime reached, it was running for 301 second(s)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 114)
Respawned uWSGI worker 2 (new pid: 115)
Respawned uWSGI worker 3 (new pid: 116)
Respawned uWSGI worker 4 (new pid: 117)
Respawned uWSGI worker 5 (new pid: 118)
Respawned uWSGI worker 6 (new pid: 119)
[pid: 118|app: 0|req: 1/2] 10.233.64.97 () {44 vars in 572 bytes} [Fri May 13 06:56:22 2022] POST /sdk => generated 77 bytes in 7 msecs (HTTP/1.1 404) 8 headers in 202 bytes (2 switches on core 0)
[pid: 119|app: 0|req: 1/2] 10.233.64.97 () {42 vars in 584 bytes} [Fri May 13 06:56:22 2022] GET /nmaplowercheck1652424982 => generated 77 bytes in 7 msecs (HTTP/1.1 404) 8 headers in 202 bytes (2 switches on core 0)
[pid: 119|app: 0|req: 2/3] 10.233.64.97 () {42 vars in 546 bytes} [Fri May 13 06:56:22 2022] GET /HNAP1 => generated 77 bytes in 1 msecs (HTTP/1.1 404) 8 headers in 202 bytes (2 switches on core 0)
[pid: 119|app: 0|req: 3/4] 10.233.64.97 () {42 vars in 556 bytes} [Fri May 13 06:56:22 2022] GET /evox/about => generated 77 bytes in 1 msecs (HTTP/1.1 404) 8 headers in 202 bytes (1 switches on core 0)
[pid: 115|app: 0|req: 1/5] 10.233.64.97 () {36 vars in 374 bytes} [Fri May 13 06:56:30 2022] GET / => generated 750 bytes in 8 msecs (HTTP/1.0 200) 5 headers in 134 bytes (1 switches on core 0)
[pid: 118|app: 0|req: 2/6] 10.233.64.97 () {38 vars in 408 bytes} [Fri May 13 06:56:30 2022] GET / => generated 758 bytes in 2 msecs (HTTP/1.1 200) 5 headers in 134 bytes (2 switches on core 0)
invalid request block size: 22340 (max 4096)...skip
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 3 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
worker 5 lifetime reached, it was running for 301 second(s)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 130)
Respawned uWSGI worker 2 (new pid: 131)
Respawned uWSGI worker 3 (new pid: 132)
Respawned uWSGI worker 4 (new pid: 133)
Respawned uWSGI worker 5 (new pid: 134)
Respawned uWSGI worker 6 (new pid: 135)
worker 1 lifetime reached, it was running for 301 second(s)
worker 2 lifetime reached, it was running for 301 second(s)
worker 3 lifetime reached, it was running for 301 second(s)
worker 4 lifetime reached, it was running for 301 second(s)
worker 5 lifetime reached, it was running for 301 second(s)
worker 6 lifetime reached, it was running for 301 second(s)
Respawned uWSGI worker 1 (new pid: 141)
Respawned uWSGI worker 2 (new pid: 142)
Respawned uWSGI worker 3 (new pid: 143)
Respawned uWSGI worker 4 (new pid: 144)
Respawned uWSGI worker 5 (new pid: 145)
Respawned uWSGI worker 6 (new pid: 146)
2022-05-13 07:05:38,230||140084918299976|INFO|component|conductor.common.music.messaging.component: [-] Message e0b820b4-42a7-476e-bafc-a0d149d942f5 on topic controller enqueued.
[pid: 146|app: 0|req: 4/7] 10.233.69.97 () {46 vars in 586 bytes} [Fri May 13 07:05:38 2022] GET /v1/plans/healthcheck => generated 159 bytes in 3038 msecs (HTTP/1.1 200) 5 headers in 134 bytes (1 switches on core 0)
[The 300-second worker lifetime / respawn cycle shown above repeats, with new worker pids each round, for the remainder of the log.]
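The log's recurring pattern (six preforked workers, each recycled after ~301 seconds, a 100-connection listen backlog, a 60-second worker mercy period, and an "invalid request block size: 22340 (max 4096)" rejection) maps onto a small set of standard uWSGI options. The following is a hypothetical sketch reconstructed from the log output only, not the actual conductor-uwsgi configuration; in particular, treating port 8080 as an HTTP socket and the suggested `buffer-size` value are assumptions.

```ini
; Hypothetical uWSGI config inferred from the log above — not the real
; conductor-uwsgi.ini shipped with OOF-HAS.
[uwsgi]
master = true
pidfile = /run/conductor/conductor-uwsgi.pid
chdir = /app                           ; "current working directory: /app"
socket = /run/conductor/uwsgi.sock     ; "uwsgi socket 0 bound to UNIX address"
http-socket = 0.0.0.0:8080             ; assumption: 8080 speaks plain HTTP
processes = 6                          ; "spawned uWSGI worker 1..6"
max-worker-lifetime = 300              ; explains "running for 301 second(s)"
worker-reload-mercy = 60               ; "mercy for graceful operations ... 60 seconds"
listen = 100                           ; "listen backlog is limited to 100"
; buffer-size = 8192                   ; would raise the 4096-byte request-header
                                       ; limit behind "invalid request block size"
```

The `max-worker-lifetime` option is why every worker is respawned on a ~5-minute cadence even when idle; the "invalid request block size ... skip" line is uWSGI dropping a single oversized (or non-uwsgi-protocol) request, which `buffer-size` can accommodate if such requests are legitimate.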