Triggered by Gerrit: https://gerrit.onap.org/r/c/cps/+/137709
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on prd-centos8-docker-8c-8g-24093 (centos8-docker-8c-8g) in workspace /w/workspace/cps-master-verify-java
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine)
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-TooQT8AlG0p5/agent.5430
SSH_AGENT_PID=5432
[ssh-agent] Started.
Running ssh-add (command line suppressed)
Identity added: /w/workspace/cps-master-verify-java@tmp/private_key_8204977208652163432.key (/w/workspace/cps-master-verify-java@tmp/private_key_8204977208652163432.key)
[ssh-agent] Using credentials onap-jobbuiler (Gerrit user)
The recommended git tool is: NONE
using credential onap-jenkins-ssh
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository git://cloud.onap.org/mirror/cps.git
 > git init /w/workspace/cps-master-verify-java # timeout=10
Fetching upstream changes from git://cloud.onap.org/mirror/cps.git
 > git --version # timeout=10
 > git --version # 'git version 2.39.1'
using GIT_SSH to set credentials Gerrit user
[INFO] Currently running in a labeled security context
[INFO] Currently SELinux is 'enforcing' on the host
 > /usr/bin/chcon --type=ssh_home_t /w/workspace/cps-master-verify-java@tmp/jenkins-gitclient-ssh15858325123980883457.key
Verifying host key using manually-configured host key entries
 > git fetch --tags --force --progress -- git://cloud.onap.org/mirror/cps.git +refs/heads/*:refs/remotes/origin/* # timeout=30
 > git config remote.origin.url git://cloud.onap.org/mirror/cps.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url git://cloud.onap.org/mirror/cps.git # timeout=10
Fetching upstream changes from git://cloud.onap.org/mirror/cps.git
using GIT_SSH to set credentials Gerrit user
[INFO] Currently running in a labeled security context
[INFO] Currently SELinux is 'enforcing' on the host
 > /usr/bin/chcon --type=ssh_home_t /w/workspace/cps-master-verify-java@tmp/jenkins-gitclient-ssh12483957780645741365.key
Verifying host key using manually-configured host key entries
 > git fetch --tags --force --progress -- git://cloud.onap.org/mirror/cps.git refs/changes/09/137709/1 # timeout=30
 > git rev-parse bfc5aebcd7f9426e2cf42276ab9d5b2384ecd67a^{commit} # timeout=10
JENKINS-19022: warning: possible memory leak due to Git plugin usage; see: https://plugins.jenkins.io/git/#remove-git-plugin-buildsbybranch-builddata-script
Checking out Revision bfc5aebcd7f9426e2cf42276ab9d5b2384ecd67a (refs/changes/09/137709/1)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f bfc5aebcd7f9426e2cf42276ab9d5b2384ecd67a # timeout=30
Commit message: "Fix build issue with subscription"
 > git rev-parse FETCH_HEAD^{commit} # timeout=10
 > git rev-list --no-walk 9be8de1a367af3335c62773cd77d529300d8a021 # timeout=10
[cps-master-verify-java] $ /bin/bash /tmp/jenkins1924352991057325213.sh
---> python-tools-install.sh
Setup pyenv:
  system
* 3.8.13 (set by /opt/pyenv/version)
* 3.9.13 (set by /opt/pyenv/version)
* 3.10.6 (set by /opt/pyenv/version)
lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-ckhe
lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-ckhe/bin to PATH
Generating Requirements File
Python 3.10.6
pip 24.0 from /tmp/venv-ckhe/lib/python3.10/site-packages/pip (python
3.10) appdirs==1.4.4 argcomplete==3.3.0 aspy.yaml==1.3.0 attrs==23.2.0 autopage==0.5.2 beautifulsoup4==4.12.3 boto3==1.34.86 botocore==1.34.86 bs4==0.0.2 cachetools==5.3.3 certifi==2024.2.2 cffi==1.16.0 cfgv==3.4.0 chardet==5.2.0 charset-normalizer==3.3.2 click==8.1.7 cliff==4.6.0 cmd2==2.4.3 cryptography==3.3.2 debtcollector==3.0.0 decorator==5.1.1 defusedxml==0.7.1 Deprecated==1.2.14 distlib==0.3.8 dnspython==2.6.1 docker==4.2.2 dogpile.cache==1.3.2 email_validator==2.1.1 filelock==3.13.4 future==1.0.0 gitdb==4.0.11 GitPython==3.1.43 google-auth==2.29.0 httplib2==0.22.0 identify==2.5.35 idna==3.7 importlib-resources==1.5.0 iso8601==2.1.0 Jinja2==3.1.3 jmespath==1.0.1 jsonpatch==1.33 jsonpointer==2.4 jsonschema==4.21.1 jsonschema-specifications==2023.12.1 keystoneauth1==5.6.0 kubernetes==29.0.0 lftools==0.37.10 lxml==5.2.1 MarkupSafe==2.1.5 msgpack==1.0.8 multi_key_dict==2.0.3 munch==4.0.0 netaddr==1.2.1 netifaces==0.11.0 niet==1.4.2 nodeenv==1.8.0 oauth2client==4.1.3 oauthlib==3.2.2 openstacksdk==3.1.0 os-client-config==2.1.0 os-service-types==1.7.0 osc-lib==3.0.1 oslo.config==9.4.0 oslo.context==5.5.0 oslo.i18n==6.3.0 oslo.log==5.5.1 oslo.serialization==5.4.0 oslo.utils==7.1.0 packaging==24.0 pbr==6.0.0 platformdirs==4.2.0 prettytable==3.10.0 pyasn1==0.6.0 pyasn1_modules==0.4.0 pycparser==2.22 pygerrit2==2.0.15 PyGithub==2.3.0 pyinotify==0.9.6 PyJWT==2.8.0 PyNaCl==1.5.0 pyparsing==2.4.7 pyperclip==1.8.2 pyrsistent==0.20.0 python-cinderclient==9.5.0 python-dateutil==2.9.0.post0 python-heatclient==3.5.0 python-jenkins==1.8.2 python-keystoneclient==5.4.0 python-magnumclient==4.4.0 python-novaclient==18.6.0 python-openstackclient==6.6.0 python-swiftclient==4.5.0 PyYAML==6.0.1 referencing==0.34.0 requests==2.31.0 requests-oauthlib==2.0.0 requestsexceptions==1.4.0 rfc3986==2.0.0 rpds-py==0.18.0 rsa==4.9 ruamel.yaml==0.18.6 ruamel.yaml.clib==0.2.8 s3transfer==0.10.1 simplejson==3.19.2 six==1.16.0 smmap==5.0.1 soupsieve==2.5 stevedore==5.2.0 tabulate==0.9.0 toml==0.10.2 tomlkit==0.12.4 tqdm==4.66.2 typing_extensions==4.11.0 tzdata==2024.1 urllib3==1.26.18 virtualenv==20.25.3 wcwidth==0.2.13 websocket-client==1.7.0 wrapt==1.16.0 xdg==6.0.0 xmltodict==0.13.0 yq==3.4.1 [EnvInject] - Injecting environment variables from a build step. [EnvInject] - Injecting as environment variables the properties content SET_JDK_VERSION=openjdk17 GIT_URL="git://cloud.onap.org/mirror" [EnvInject] - Variables injected successfully. [cps-master-verify-java] $ /bin/sh /tmp/jenkins14526476109625357127.sh ---> update-java-alternatives.sh ---> Updating Java version ---> RedHat type system detected openjdk version "17.0.6-ea" 2023-01-17 LTS OpenJDK Runtime Environment (Red_Hat-17.0.6.0.9-0.3.ea.el8) (build 17.0.6-ea+9-LTS) OpenJDK 64-Bit Server VM (Red_Hat-17.0.6.0.9-0.3.ea.el8) (build 17.0.6-ea+9-LTS, mixed mode, sharing) JAVA_HOME=/usr/lib/jvm/java-17-openjdk [EnvInject] - Injecting environment variables from a build step. [EnvInject] - Injecting as environment variables the properties file path '/tmp/java.env' [EnvInject] - Variables injected successfully. provisioning config files... copy managed file [global-settings] to file:/w/workspace/cps-master-verify-java@tmp/config6792991744261121000tmp copy managed file [cps-settings] to file:/w/workspace/cps-master-verify-java@tmp/config11672974313376744147tmp [EnvInject] - Injecting environment variables from a build step. 
Unpacking https://repo.maven.apache.org/maven2/org/apache/maven/apache-maven/3.6.3/apache-maven-3.6.3-bin.zip to /w/tools/hudson.tasks.Maven_MavenInstallation/mvn36 on prd-centos8-docker-8c-8g-24093 using settings config with name cps-settings Replacing all maven server entries not found in credentials list is true using global settings config with name global-settings Replacing all maven server entries not found in credentials list is true [cps-master-verify-java] $ /w/tools/hudson.tasks.Maven_MavenInstallation/mvn36/bin/mvn -s /tmp/settings5302839131066950994.xml -gs /tmp/global-settings10263176453730450070.xml -DGERRIT_BRANCH=master -DGERRIT_PATCHSET_REVISION=bfc5aebcd7f9426e2cf42276ab9d5b2384ecd67a -DGERRIT_HOST=gerrit.onap.org -DMVN=/w/tools/hudson.tasks.Maven_MavenInstallation/mvn36/bin/mvn -DGERRIT_CHANGE_OWNER_EMAIL=lee.anjella.macabuhay@est.tech "-DGERRIT_EVENT_ACCOUNT_NAME=Lee Anjella Macabuhay" -DGERRIT_CHANGE_URL=https://gerrit.onap.org/r/c/cps/+/137709 -DGERRIT_PATCHSET_UPLOADER_EMAIL=lee.anjella.macabuhay@est.tech "-DARCHIVE_ARTIFACTS= **/target/surefire-reports/*-output.txt" -DGERRIT_EVENT_TYPE=patchset-created -DSTACK_NAME=$JOB_NAME-$BUILD_NUMBER -DGERRIT_PROJECT=cps -DGERRIT_CHANGE_NUMBER=137709 -DGERRIT_SCHEME=ssh '-DGERRIT_PATCHSET_UPLOADER=\"Lee Anjella Macabuhay\" ' -DGERRIT_PORT=29418 -DGERRIT_CHANGE_PRIVATE_STATE=false -DGERRIT_REFSPEC=refs/changes/09/137709/1 "-DGERRIT_PATCHSET_UPLOADER_NAME=Lee Anjella Macabuhay" '-DGERRIT_CHANGE_OWNER=\"Lee Anjella Macabuhay\" ' -DPROJECT=cps -DGERRIT_CHANGE_COMMIT_MESSAGE=Rml4IGJ1aWxkIGlzc3VlIHdpdGggc3Vic2NyaXB0aW9uCgpJc3N1ZS1JZDogQ1BTLTIxNjQKQ2hhbmdlLUlkOiBJNDlhMThkZjA5MjdiNWJlZmY2MmExMmY5YmMyYTJlOGU0ODY1MzdjOQpTaWduZWQtb2ZmLWJ5OiBlbWFjbGVlIDxsZWUuYW5qZWxsYS5tYWNhYnVoYXlAZXN0LnRlY2g+Cg== -DGERRIT_NAME=Primary -DGERRIT_TOPIC= "-DGERRIT_CHANGE_SUBJECT=Fix build issue with subscription" '-DGERRIT_EVENT_ACCOUNT=\"Lee Anjella Macabuhay\" ' -DGERRIT_CHANGE_WIP_STATE=false -DGERRIT_CHANGE_ID=I49a18df0927b5beff62a12f9bc2a2e8e486537c9 -DGERRIT_EVENT_HASH=22949129 -DGERRIT_VERSION=3.7.2 -DGERRIT_EVENT_ACCOUNT_EMAIL=lee.anjella.macabuhay@est.tech -DGERRIT_PATCHSET_NUMBER=1 -DMAVEN_PARAMS= "-DGERRIT_CHANGE_OWNER_NAME=Lee Anjella Macabuhay" -DMAVEN_OPTS='' clean install -B -Dorg.slf4j.simpleLogger.log.org.apache.maven.cli.transfer.Slf4jMavenTransferListener=warn [INFO] Scanning for projects... [WARNING] [WARNING] Some problems were encountered while building the effective model for org.onap.cps:cps-rest:jar:3.4.8-SNAPSHOT [WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-resources-plugin is missing. @ line 177, column 21 [WARNING] [WARNING] Some problems were encountered while building the effective model for org.onap.cps:cps-ncmp-rest:jar:3.4.8-SNAPSHOT [WARNING] 'build.plugins.plugin.version' for org.apache.maven.plugins:maven-resources-plugin is missing. @ line 211, column 21 [WARNING] [WARNING] Some problems were encountered while building the effective model for org.onap.cps:cps-ncmp-rest-stub-service:jar:3.4.8-SNAPSHOT [WARNING] 'dependencies.dependency.(groupId:artifactId:type:classifier)' must be unique: org.spockframework:spock-core:jar -> duplicate declaration of version (?) @ line 66, column 21 [WARNING] [WARNING] It is highly recommended to fix these problems because they threaten the stability of your build. [WARNING] [WARNING] For this reason, future Maven versions might no longer support building such malformed projects. 
[WARNING] [INFO] ------------------------------------------------------------------------ [INFO] Reactor Build Order: [INFO] [INFO] org.onap.cps:cps-dependencies [pom] [INFO] cps-bom [pom] [INFO] checkstyle [jar] [INFO] spotbugs [jar] [INFO] cps-parent [pom] [INFO] cps-events [jar] [INFO] cps-path-parser [jar] [INFO] cps-service [jar] [INFO] cps-rest [jar] [INFO] cps-ncmp-events [jar] [INFO] cps-ncmp-service [jar] [INFO] cps-ncmp-rest [jar] [INFO] cps-ncmp-rest-stub [pom] [INFO] cps-ncmp-rest-stub-service [jar] [INFO] cps-ncmp-rest-stub-app [jar] [INFO] cps-ri [jar] [INFO] dmi-plugin-demo-and-csit-stub [pom] [INFO] dmi-plugin-demo-and-csit-stub-service [jar] [INFO] dmi-plugin-demo-and-csit-stub-app [jar] [INFO] integration-test [jar] [INFO] cps-application [jar] [INFO] jacoco-report [jar] [INFO] cps [pom] [INFO] [INFO] -------------------< org.onap.cps:cps-dependencies >-------------------- [INFO] Building org.onap.cps:cps-dependencies 3.4.8-SNAPSHOT [1/23] [INFO] --------------------------------[ pom ]--------------------------------- [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ cps-dependencies --- [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ cps-dependencies --- [INFO] Installing /w/workspace/cps-master-verify-java/cps-dependencies/pom.xml to /home/jenkins/.m2/repository/org/onap/cps/cps-dependencies/3.4.8-SNAPSHOT/cps-dependencies-3.4.8-SNAPSHOT.pom [INFO] [INFO] ------------------------< org.onap.cps:cps-bom >------------------------ [INFO] Building cps-bom 3.4.8-SNAPSHOT [2/23] [INFO] --------------------------------[ pom ]--------------------------------- [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ cps-bom --- [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ cps-bom --- [INFO] Installing /w/workspace/cps-master-verify-java/cps-bom/pom.xml to /home/jenkins/.m2/repository/org/onap/cps/cps-bom/3.4.8-SNAPSHOT/cps-bom-3.4.8-SNAPSHOT.pom [INFO] [INFO] ----------------------< org.onap.cps:checkstyle >----------------------- [INFO] Building checkstyle 3.4.8-SNAPSHOT [3/23] [INFO] --------------------------------[ jar ]--------------------------------- [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ checkstyle --- [INFO] [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ checkstyle --- [WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent! [INFO] Copying 4 resources [INFO] [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ checkstyle --- [INFO] No sources to compile [INFO] [INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ checkstyle --- [WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent! [INFO] skip non existing resourceDirectory /w/workspace/cps-master-verify-java/checkstyle/src/test/resources [INFO] [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ checkstyle --- [INFO] No sources to compile [INFO] [INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ checkstyle --- [INFO] No tests to run. 
[INFO]
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ checkstyle ---
[INFO] Building jar: /w/workspace/cps-master-verify-java/checkstyle/target/checkstyle-3.4.8-SNAPSHOT.jar
[INFO]
[INFO] --- exec-maven-plugin:1.6.0:exec (copyright-check) @ checkstyle ---
========================================================================================================================
Copyright Check Python Script:
/w/workspace/cps-master-verify-java/cps-ncmp-service/src/test/groovy/org/onap/cps/ncmp/api/impl/events/cmsubscription/DmiCmNotificationSubscriptionCacheHandlerSpec.groovy | line 5 read "Licensed under the Apache License, Version 2.0 (the 'License');\n"
/w/workspace/cps-master-verify-java/cps-ncmp-service/src/test/groovy/org/onap/cps/ncmp/api/impl/events/cmsubscription/DmiCmNotificationSubscriptionCacheHandlerSpec.groovy | line 5 expected 'Licensed under the Apache License, Version 2.0 (the "License");\n'
/w/workspace/cps-master-verify-java/cps-ncmp-service/src/test/groovy/org/onap/cps/ncmp/api/impl/events/cmsubscription/DmiCmNotificationSubscriptionCacheHandlerSpec.groovy | line 12 read "distributed under the License is distributed on an 'AS IS' BASIS,\n"
/w/workspace/cps-master-verify-java/cps-ncmp-service/src/test/groovy/org/onap/cps/ncmp/api/impl/events/cmsubscription/DmiCmNotificationSubscriptionCacheHandlerSpec.groovy | line 12 expected 'distributed under the License is distributed on an "AS IS" BASIS,\n'
2 issue(s) found after 2 altered file(s) checked
========================================================================================================================
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ checkstyle ---
[INFO] Installing /w/workspace/cps-master-verify-java/checkstyle/target/checkstyle-3.4.8-SNAPSHOT.jar to /home/jenkins/.m2/repository/org/onap/cps/checkstyle/3.4.8-SNAPSHOT/checkstyle-3.4.8-SNAPSHOT.jar
[INFO] Installing /w/workspace/cps-master-verify-java/checkstyle/pom.xml to /home/jenkins/.m2/repository/org/onap/cps/checkstyle/3.4.8-SNAPSHOT/checkstyle-3.4.8-SNAPSHOT.pom
[INFO]
[INFO] -----------------------< org.onap.cps:spotbugs >------------------------
[INFO] Building spotbugs 3.4.8-SNAPSHOT [4/23]
[INFO] --------------------------------[ jar ]---------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ spotbugs ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spotbugs ---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ spotbugs ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ spotbugs ---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory /w/workspace/cps-master-verify-java/spotbugs/src/test/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ spotbugs ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ spotbugs ---
[INFO] No tests to run.
[INFO] [INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ spotbugs --- [INFO] Building jar: /w/workspace/cps-master-verify-java/spotbugs/target/spotbugs-3.4.8-SNAPSHOT.jar [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ spotbugs --- [INFO] Installing /w/workspace/cps-master-verify-java/spotbugs/target/spotbugs-3.4.8-SNAPSHOT.jar to /home/jenkins/.m2/repository/org/onap/cps/spotbugs/3.4.8-SNAPSHOT/spotbugs-3.4.8-SNAPSHOT.jar [INFO] Installing /w/workspace/cps-master-verify-java/spotbugs/pom.xml to /home/jenkins/.m2/repository/org/onap/cps/spotbugs/3.4.8-SNAPSHOT/spotbugs-3.4.8-SNAPSHOT.pom [INFO] [INFO] ----------------------< org.onap.cps:cps-parent >----------------------- [INFO] Building cps-parent 3.4.8-SNAPSHOT [5/23] [INFO] --------------------------------[ pom ]--------------------------------- [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ cps-parent --- [INFO] [INFO] --- maven-enforcer-plugin:3.0.0-M2:enforce (enforce-property) @ cps-parent --- [INFO] [INFO] --- maven-enforcer-plugin:3.0.0-M2:enforce (enforce-no-snapshots) @ cps-parent --- [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (pre-unit-test) @ cps-parent --- [INFO] surefireArgLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-parent/target/code-coverage/jacoco-ut.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/* [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (default-prepare-agent) @ cps-parent --- [INFO] argLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-parent/target/jacoco.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/* [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (onap-license) @ cps-parent --- [INFO] You have 0 Checkstyle violations. [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (onap-java-style) @ cps-parent --- [INFO] You have 0 Checkstyle violations. [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (cps-java-style) @ cps-parent --- [INFO] Starting audit... Audit done. [INFO] You have 0 Checkstyle violations. [INFO] [INFO] >>> spotbugs-maven-plugin:4.4.2:check (analyze-compile) > :spotbugs @ cps-parent >>> [INFO] [INFO] --- spotbugs-maven-plugin:4.4.2:spotbugs (spotbugs) @ cps-parent --- [INFO] [INFO] <<< spotbugs-maven-plugin:4.4.2:check (analyze-compile) < :spotbugs @ cps-parent <<< [INFO] [INFO] [INFO] --- spotbugs-maven-plugin:4.4.2:check (analyze-compile) @ cps-parent --- [INFO] [INFO] --- gmavenplus-plugin:1.9.0:compileTests (default) @ cps-parent --- [INFO] No sources specified for compilation. Skipping. [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:report (post-unit-test) @ cps-parent --- [INFO] Skipping JaCoCo execution due to missing execution data file. 
[INFO] [INFO] --- exec-maven-plugin:1.6.0:exec (generate-csv) @ cps-parent --- [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (pre-integration-test) @ cps-parent --- [INFO] failsafeArgLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-parent/target/code-coverage/jacoco-it.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/* [INFO] [INFO] --- maven-failsafe-plugin:3.0.0-M4:integration-test (integration-tests) @ cps-parent --- [INFO] No tests to run. [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:report (post-integration-test) @ cps-parent --- [INFO] Skipping JaCoCo execution due to missing execution data file. [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:check (coverage-check) @ cps-parent --- [INFO] Skipping JaCoCo execution due to missing execution data file:/w/workspace/cps-master-verify-java/cps-parent/target/code-coverage/jacoco-ut.exec [INFO] [INFO] --- maven-failsafe-plugin:3.0.0-M4:verify (integration-tests) @ cps-parent --- [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ cps-parent --- [INFO] Installing /w/workspace/cps-master-verify-java/cps-parent/pom.xml to /home/jenkins/.m2/repository/org/onap/cps/cps-parent/3.4.8-SNAPSHOT/cps-parent-3.4.8-SNAPSHOT.pom [INFO] [INFO] ----------------------< org.onap.cps:cps-events >----------------------- [INFO] Building cps-events 3.4.8-SNAPSHOT [6/23] [INFO] --------------------------------[ jar ]--------------------------------- [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ cps-events --- [INFO] [INFO] --- maven-enforcer-plugin:3.0.0-M2:enforce (enforce-property) @ cps-events --- [INFO] [INFO] --- maven-enforcer-plugin:3.0.0-M2:enforce (enforce-no-snapshots) @ cps-events --- [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (pre-unit-test) @ cps-events --- [INFO] surefireArgLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-events/target/code-coverage/jacoco-ut.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/* [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (default-prepare-agent) @ cps-events --- [INFO] argLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-events/target/jacoco.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/* [INFO] [INFO] --- jsonschema2pojo-maven-plugin:1.2.1:generate (default) @ cps-events --- [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (onap-license) @ cps-events --- [INFO] You have 0 Checkstyle violations. [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (onap-java-style) @ cps-events --- [INFO] You have 0 Checkstyle violations. [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (cps-java-style) @ cps-events --- [INFO] Starting audit... Audit done. [INFO] You have 0 Checkstyle violations. 
[INFO] [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ cps-events --- [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 2 resources [INFO] skip non existing resourceDirectory /w/workspace/cps-master-verify-java/cps-events/target/generated-sources/license [INFO] skip non existing resourceDirectory /w/workspace/cps-master-verify-java/cps-events/target/generated-resources/licenses [INFO] [INFO] --- maven-compiler-plugin:3.11.0:compile (default-compile) @ cps-events --- [INFO] Changes detected - recompiling the module! :dependency [INFO] Compiling 2 source files with javac [debug release 17] to target/classes [INFO] [INFO] >>> spotbugs-maven-plugin:4.4.2:check (analyze-compile) > :spotbugs @ cps-events >>> [INFO] [INFO] --- spotbugs-maven-plugin:4.4.2:spotbugs (spotbugs) @ cps-events --- [INFO] Fork Value is true [java] WARNING: A terminally deprecated method in java.lang.System has been called [java] WARNING: System::setSecurityManager has been called by edu.umd.cs.findbugs.ba.jsr305.TypeQualifierValue (file:/home/jenkins/.m2/repository/com/github/spotbugs/spotbugs/4.2.3/spotbugs-4.2.3.jar) [java] WARNING: Please consider reporting this to the maintainers of edu.umd.cs.findbugs.ba.jsr305.TypeQualifierValue [java] WARNING: System::setSecurityManager will be removed in a future release [INFO] Done SpotBugs Analysis.... [INFO] [INFO] <<< spotbugs-maven-plugin:4.4.2:check (analyze-compile) < :spotbugs @ cps-events <<< [INFO] [INFO] [INFO] --- spotbugs-maven-plugin:4.4.2:check (analyze-compile) @ cps-events --- [INFO] BugInstance size is 0 [INFO] Error size is 0 [INFO] No errors/warnings found [INFO] [INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ cps-events --- [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /w/workspace/cps-master-verify-java/cps-events/src/test/resources [INFO] [INFO] --- maven-compiler-plugin:3.11.0:testCompile (default-testCompile) @ cps-events --- [INFO] No sources to compile [INFO] [INFO] --- gmavenplus-plugin:1.9.0:compileTests (default) @ cps-events --- [INFO] No sources specified for compilation. Skipping. [INFO] [INFO] --- maven-surefire-plugin:3.0.0-M5:test (default-test) @ cps-events --- [INFO] No tests to run. [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:report (post-unit-test) @ cps-events --- [INFO] Skipping JaCoCo execution due to missing execution data file. [INFO] [INFO] --- exec-maven-plugin:1.6.0:exec (generate-csv) @ cps-events --- [INFO] [INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ cps-events --- [INFO] Building jar: /w/workspace/cps-master-verify-java/cps-events/target/cps-events-3.4.8-SNAPSHOT.jar [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (pre-integration-test) @ cps-events --- [INFO] failsafeArgLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-events/target/code-coverage/jacoco-it.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/* [INFO] [INFO] --- maven-failsafe-plugin:3.0.0-M4:integration-test (integration-tests) @ cps-events --- [INFO] No tests to run. [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:report (post-integration-test) @ cps-events --- [INFO] Skipping JaCoCo execution due to missing execution data file. 
[INFO] [INFO] --- jacoco-maven-plugin:0.8.10:check (coverage-check) @ cps-events --- [INFO] Skipping JaCoCo execution due to missing execution data file:/w/workspace/cps-master-verify-java/cps-events/target/code-coverage/jacoco-ut.exec [INFO] [INFO] --- maven-failsafe-plugin:3.0.0-M4:verify (integration-tests) @ cps-events --- [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ cps-events --- [INFO] Installing /w/workspace/cps-master-verify-java/cps-events/target/cps-events-3.4.8-SNAPSHOT.jar to /home/jenkins/.m2/repository/org/onap/cps/cps-events/3.4.8-SNAPSHOT/cps-events-3.4.8-SNAPSHOT.jar [INFO] Installing /w/workspace/cps-master-verify-java/cps-events/pom.xml to /home/jenkins/.m2/repository/org/onap/cps/cps-events/3.4.8-SNAPSHOT/cps-events-3.4.8-SNAPSHOT.pom [INFO] [INFO] --------------------< org.onap.cps:cps-path-parser >-------------------- [INFO] Building cps-path-parser 3.4.8-SNAPSHOT [7/23] [INFO] --------------------------------[ jar ]--------------------------------- [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ cps-path-parser --- [INFO] [INFO] --- maven-enforcer-plugin:3.0.0-M2:enforce (enforce-property) @ cps-path-parser --- [INFO] [INFO] --- maven-enforcer-plugin:3.0.0-M2:enforce (enforce-no-snapshots) @ cps-path-parser --- [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (pre-unit-test) @ cps-path-parser --- [INFO] surefireArgLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-path-parser/target/code-coverage/jacoco-ut.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/* [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (default-prepare-agent) @ cps-path-parser --- [INFO] argLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-path-parser/target/jacoco.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/* [INFO] [INFO] --- antlr4-maven-plugin:4.9.2:antlr4 (default) @ cps-path-parser --- [INFO] ANTLR 4: Processing source directory /w/workspace/cps-master-verify-java/cps-path-parser/src/main/antlr4 [INFO] Processing grammar: org/onap/cps/cpspath/parser/antlr4/CpsPath.g4 [WARNING] warning(131): org/onap/cps/cpspath/parser/antlr4/CpsPath.g4:64:64: greedy block ()+ contains wildcard; the non-greedy syntax ()+? may be preferred [WARNING] /w/workspace/cps-master-verify-java/org/onap/cps/cpspath/parser/antlr4/CpsPath.g4 [64:64]: greedy block ()+ contains wildcard; the non-greedy syntax ()+? may be preferred [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (onap-license) @ cps-path-parser --- [INFO] You have 0 Checkstyle violations. [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (onap-java-style) @ cps-path-parser --- [INFO] You have 0 Checkstyle violations. [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (cps-java-style) @ cps-path-parser --- [INFO] Starting audit... Audit done. [INFO] You have 0 Checkstyle violations. [INFO] [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ cps-path-parser --- [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /w/workspace/cps-master-verify-java/cps-path-parser/src/main/resources [INFO] skip non existing resourceDirectory /w/workspace/cps-master-verify-java/cps-path-parser/target/generated-sources/license [INFO] skip non existing resourceDirectory /w/workspace/cps-master-verify-java/cps-path-parser/target/generated-resources/licenses [INFO] [INFO] --- maven-compiler-plugin:3.11.0:compile (default-compile) @ cps-path-parser --- [INFO] Changes detected - recompiling the module! :dependency [INFO] Compiling 9 source files with javac [debug release 17] to target/classes [INFO] [INFO] >>> spotbugs-maven-plugin:4.4.2:check (analyze-compile) > :spotbugs @ cps-path-parser >>> [INFO] [INFO] --- spotbugs-maven-plugin:4.4.2:spotbugs (spotbugs) @ cps-path-parser --- [INFO] Fork Value is true [java] WARNING: A terminally deprecated method in java.lang.System has been called [java] WARNING: System::setSecurityManager has been called by edu.umd.cs.findbugs.ba.jsr305.TypeQualifierValue (file:/home/jenkins/.m2/repository/com/github/spotbugs/spotbugs/4.2.3/spotbugs-4.2.3.jar) [java] WARNING: Please consider reporting this to the maintainers of edu.umd.cs.findbugs.ba.jsr305.TypeQualifierValue [java] WARNING: System::setSecurityManager will be removed in a future release [INFO] Done SpotBugs Analysis.... [INFO] [INFO] <<< spotbugs-maven-plugin:4.4.2:check (analyze-compile) < :spotbugs @ cps-path-parser <<< [INFO] [INFO] [INFO] --- spotbugs-maven-plugin:4.4.2:check (analyze-compile) @ cps-path-parser --- [INFO] BugInstance size is 0 [INFO] Error size is 0 [INFO] No errors/warnings found [INFO] [INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ cps-path-parser --- [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /w/workspace/cps-master-verify-java/cps-path-parser/src/test/resources [INFO] [INFO] --- maven-compiler-plugin:3.11.0:testCompile (default-testCompile) @ cps-path-parser --- [INFO] No sources to compile [INFO] [INFO] --- gmavenplus-plugin:1.9.0:compileTests (default) @ cps-path-parser --- [INFO] Using isolated classloader, without GMavenPlus classpath. [INFO] Using Groovy 3.0.18 to perform compileTests. [INFO] Compiled 4 files. 
[INFO]
[INFO] --- maven-surefire-plugin:3.0.0-M5:test (default-test) @ cps-path-parser ---
[INFO]
[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.onap.cps.cpspath.parser.CpsPathQuerySpec
line 1:0 extraneous input 'invalid-cps-path' expecting '/'
line 1:25 mismatched input '5.0' expecting {IntegerLiteral, StringLiteral}
line 1:26 mismatched input '"' expecting {IntegerLiteral, StringLiteral}
line 1:26 mismatched input ''' expecting {IntegerLiteral, StringLiteral}
line 1:29 mismatched input ']' expecting {'=', '>', '<', '>=', '<='}
line 1:18 missing QName at ''
[INFO] Tests run: 81, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.594 s - in org.onap.cps.cpspath.parser.CpsPathQuerySpec
[INFO] Running org.onap.cps.cpspath.parser.CpsPathUtilSpec
[INFO] Tests run: 27, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.037 s - in org.onap.cps.cpspath.parser.CpsPathUtilSpec
[INFO]
[INFO] Results:
[INFO]
[INFO] Tests run: 108, Failures: 0, Errors: 0, Skipped: 0
[INFO]
[INFO]
[INFO] --- jacoco-maven-plugin:0.8.10:report (post-unit-test) @ cps-path-parser ---
[INFO] Loading execution data file /w/workspace/cps-master-verify-java/cps-path-parser/target/code-coverage/jacoco-ut.exec
[INFO] Analyzed bundle 'cps-path-parser' with 6 classes
[INFO]
[INFO] --- exec-maven-plugin:1.6.0:exec (generate-csv) @ cps-path-parser ---
[INFO]
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ cps-path-parser ---
[INFO] Building jar: /w/workspace/cps-master-verify-java/cps-path-parser/target/cps-path-parser-3.4.8-SNAPSHOT.jar
[INFO]
[INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (pre-integration-test) @ cps-path-parser ---
[INFO] failsafeArgLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-path-parser/target/code-coverage/jacoco-it.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/*
[INFO]
[INFO] --- maven-failsafe-plugin:3.0.0-M4:integration-test (integration-tests) @ cps-path-parser ---
[INFO]
[INFO] --- jacoco-maven-plugin:0.8.10:report (post-integration-test) @ cps-path-parser ---
[INFO] Skipping JaCoCo execution due to missing execution data file.
[INFO]
[INFO] --- jacoco-maven-plugin:0.8.10:check (coverage-check) @ cps-path-parser ---
[INFO] Loading execution data file /w/workspace/cps-master-verify-java/cps-path-parser/target/code-coverage/jacoco-ut.exec
[INFO] Analyzed bundle 'cps-path-parser' with 6 classes
[INFO] All coverage checks have been met.
[INFO] [INFO] --- maven-failsafe-plugin:3.0.0-M4:verify (integration-tests) @ cps-path-parser --- [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ cps-path-parser --- [INFO] Installing /w/workspace/cps-master-verify-java/cps-path-parser/target/cps-path-parser-3.4.8-SNAPSHOT.jar to /home/jenkins/.m2/repository/org/onap/cps/cps-path-parser/3.4.8-SNAPSHOT/cps-path-parser-3.4.8-SNAPSHOT.jar [INFO] Installing /w/workspace/cps-master-verify-java/cps-path-parser/pom.xml to /home/jenkins/.m2/repository/org/onap/cps/cps-path-parser/3.4.8-SNAPSHOT/cps-path-parser-3.4.8-SNAPSHOT.pom [INFO] [INFO] ----------------------< org.onap.cps:cps-service >---------------------- [INFO] Building cps-service 3.4.8-SNAPSHOT [8/23] [INFO] --------------------------------[ jar ]--------------------------------- [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ cps-service --- [INFO] [INFO] --- maven-enforcer-plugin:3.0.0-M2:enforce (enforce-property) @ cps-service --- [INFO] [INFO] --- maven-enforcer-plugin:3.0.0-M2:enforce (enforce-no-snapshots) @ cps-service --- [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (pre-unit-test) @ cps-service --- [INFO] surefireArgLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-service/target/code-coverage/jacoco-ut.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/* [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (default-prepare-agent) @ cps-service --- [INFO] argLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-service/target/jacoco.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/* [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (onap-license) @ cps-service --- [INFO] You have 0 Checkstyle violations. [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (onap-java-style) @ cps-service --- [INFO] You have 0 Checkstyle violations. [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (cps-java-style) @ cps-service --- [INFO] Starting audit... Audit done. [INFO] You have 0 Checkstyle violations. [INFO] [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ cps-service --- [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 1 resource [INFO] skip non existing resourceDirectory /w/workspace/cps-master-verify-java/cps-service/target/generated-sources/license [INFO] skip non existing resourceDirectory /w/workspace/cps-master-verify-java/cps-service/target/generated-resources/licenses [INFO] [INFO] --- maven-compiler-plugin:3.11.0:compile (default-compile) @ cps-service --- [INFO] Changes detected - recompiling the module! :dependency [INFO] Compiling 68 source files with javac [debug release 17] to target/classes [INFO] /w/workspace/cps-master-verify-java/cps-service/src/main/java/org/onap/cps/utils/PrefixResolver.java: Some input files use unchecked or unsafe operations. [INFO] /w/workspace/cps-master-verify-java/cps-service/src/main/java/org/onap/cps/utils/PrefixResolver.java: Recompile with -Xlint:unchecked for details. 
[INFO] [INFO] >>> spotbugs-maven-plugin:4.4.2:check (analyze-compile) > :spotbugs @ cps-service >>> [INFO] [INFO] --- spotbugs-maven-plugin:4.4.2:spotbugs (spotbugs) @ cps-service --- [INFO] Fork Value is true [java] WARNING: A terminally deprecated method in java.lang.System has been called [java] WARNING: System::setSecurityManager has been called by edu.umd.cs.findbugs.ba.jsr305.TypeQualifierValue (file:/home/jenkins/.m2/repository/com/github/spotbugs/spotbugs/4.2.3/spotbugs-4.2.3.jar) [java] WARNING: Please consider reporting this to the maintainers of edu.umd.cs.findbugs.ba.jsr305.TypeQualifierValue [java] WARNING: System::setSecurityManager will be removed in a future release [INFO] Done SpotBugs Analysis.... [INFO] [INFO] <<< spotbugs-maven-plugin:4.4.2:check (analyze-compile) < :spotbugs @ cps-service <<< [INFO] [INFO] [INFO] --- spotbugs-maven-plugin:4.4.2:check (analyze-compile) @ cps-service --- [INFO] BugInstance size is 0 [INFO] Error size is 0 [INFO] No errors/warnings found [INFO] [INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ cps-service --- [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 30 resources [INFO] [INFO] --- maven-compiler-plugin:3.11.0:testCompile (default-testCompile) @ cps-service --- [INFO] Changes detected - recompiling the module! :dependency [INFO] Compiling 2 source files with javac [debug release 17] to target/test-classes [INFO] [INFO] --- gmavenplus-plugin:1.9.0:compileTests (default) @ cps-service --- [INFO] Using isolated classloader, without GMavenPlus classpath. [INFO] Using Groovy 3.0.18 to perform compileTests. [INFO] Compiled 67 files. [INFO] [INFO] --- maven-surefire-plugin:3.0.0-M5:test (default-test) @ cps-service --- [INFO] [INFO] ------------------------------------------------------- [INFO] T E S T S [INFO] ------------------------------------------------------- [INFO] Running org.onap.cps.utils.DataMapUtilsSpec [INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.724 s - in org.onap.cps.utils.DataMapUtilsSpec [INFO] Running org.onap.cps.utils.GsonSpec first-container a-leaf a-Value last-container x-leaf x-value [INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.022 s - in org.onap.cps.utils.GsonSpec [INFO] Running org.onap.cps.utils.JsonObjectMapperSpec 09:56:26.836 [main] ERROR org.onap.cps.utils.JsonObjectMapper -- Parsing error occurred while converting JSON object to bytes. 09:56:26.850 [main] ERROR org.onap.cps.utils.JsonObjectMapper -- Parsing error occurred while converting Object to JSON string. 09:56:26.886 [main] ERROR org.onap.cps.utils.JsonObjectMapper -- Parsing error occurred while converting JSON content to specific class type. 09:56:26.890 [main] ERROR org.onap.cps.utils.JsonObjectMapper -- Found structurally incompatible object while converting into value type. 09:56:26.920 [main] ERROR org.onap.cps.utils.JsonObjectMapper -- Parsing error occurred while converting JSON content to Json Node. 
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.656 s - in org.onap.cps.utils.JsonObjectMapperSpec [INFO] Running org.onap.cps.utils.JsonParserStreamSpec [INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.746 s - in org.onap.cps.utils.JsonParserStreamSpec [INFO] Running org.onap.cps.utils.PrefixResolverSpec [INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.208 s - in org.onap.cps.utils.PrefixResolverSpec [INFO] Running org.onap.cps.utils.XmlFileUtilsSpec [Fatal Error] :1:1: Content is not allowed in prolog. [INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.207 s - in org.onap.cps.utils.XmlFileUtilsSpec [INFO] Running org.onap.cps.utils.YangParserHelperSpec [Fatal Error] :1:7: XML document structures must start and end within the same entity. [Fatal Error] :1:76: The end-tag for element type "bookstore" must end with a '>' delimiter. [Fatal Error] :1:1: Premature end of file. line 1:1 no viable alternative at input '/' line 1:1 no viable alternative at input '/[' line 1:11 missing QName at '' [INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.26 s - in org.onap.cps.utils.YangParserHelperSpec [INFO] Running org.onap.cps.utils.YangParserSpec [INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.031 s - in org.onap.cps.utils.YangParserSpec [INFO] Running org.onap.cps.utils.YangUtilsSpec [INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.024 s - in org.onap.cps.utils.YangUtilsSpec [INFO] Running org.onap.cps.aop.CpsLoggingAspectServiceSpec [INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.037 s - in org.onap.cps.aop.CpsLoggingAspectServiceSpec [INFO] Running org.onap.cps.api.impl.CpsAnchorServiceImplSpec [INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.032 s - in org.onap.cps.api.impl.CpsAnchorServiceImplSpec [INFO] Running org.onap.cps.api.impl.CpsDataServiceImplSpec [Fatal Error] :1:13: XML document structures must start and end within the same entity. [Fatal Error] :1:13: XML document structures must start and end within the same entity. 
[INFO] Tests run: 55, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.247 s - in org.onap.cps.api.impl.CpsDataServiceImplSpec [INFO] Running org.onap.cps.api.impl.CpsDataspaceServiceImplSpec [INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.006 s - in org.onap.cps.api.impl.CpsDataspaceServiceImplSpec [INFO] Running org.onap.cps.api.impl.CpsDeltaServiceImplSpec [INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.013 s - in org.onap.cps.api.impl.CpsDeltaServiceImplSpec [INFO] Running org.onap.cps.api.impl.CpsModuleServiceImplSpec [INFO] Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.096 s - in org.onap.cps.api.impl.CpsModuleServiceImplSpec [INFO] Running org.onap.cps.api.impl.CpsQueryServiceImplSpec [INFO] Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.012 s - in org.onap.cps.api.impl.CpsQueryServiceImplSpec [INFO] Running org.onap.cps.api.impl.E2ENetworkSliceSpec [INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.35 s - in org.onap.cps.api.impl.E2ENetworkSliceSpec [INFO] Running org.onap.cps.api.impl.YangTextSchemaSourceSetCacheSpec 09:56:23,297 |-INFO in ch.qos.logback.classic.LoggerContext[default] - This is logback-classic version 1.4.14 09:56:23,300 |-INFO in ch.qos.logback.classic.util.ContextInitializer@79703b86 - Here is a list of configurators discovered as a service, by rank: 09:56:23,300 |-INFO in ch.qos.logback.classic.util.ContextInitializer@79703b86 - org.springframework.boot.logging.logback.RootLogLevelConfigurator 09:56:23,300 |-INFO in ch.qos.logback.classic.util.ContextInitializer@79703b86 - They will be invoked in order until ExecutionStatus.DO_NOT_INVOKE_NEXT_IF_ANY is returned. 09:56:23,300 |-INFO in ch.qos.logback.classic.util.ContextInitializer@79703b86 - Constructed configurator of type class org.springframework.boot.logging.logback.RootLogLevelConfigurator 09:56:23,310 |-INFO in ch.qos.logback.classic.util.ContextInitializer@79703b86 - org.springframework.boot.logging.logback.RootLogLevelConfigurator.configure() call lasted 1 milliseconds. ExecutionStatus=INVOKE_NEXT_IF_ANY 09:56:23,310 |-INFO in ch.qos.logback.classic.util.ContextInitializer@79703b86 - Trying to configure with ch.qos.logback.classic.joran.SerializedModelConfigurator 09:56:23,312 |-INFO in ch.qos.logback.classic.util.ContextInitializer@79703b86 - Constructed configurator of type class ch.qos.logback.classic.joran.SerializedModelConfigurator 09:56:23,314 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback-test.scmo] 09:56:23,314 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback.scmo] 09:56:23,314 |-INFO in ch.qos.logback.classic.util.ContextInitializer@79703b86 - ch.qos.logback.classic.joran.SerializedModelConfigurator.configure() call lasted 2 milliseconds. 
ExecutionStatus=INVOKE_NEXT_IF_ANY 09:56:23,314 |-INFO in ch.qos.logback.classic.util.ContextInitializer@79703b86 - Trying to configure with ch.qos.logback.classic.util.DefaultJoranConfigurator 09:56:23,315 |-INFO in ch.qos.logback.classic.util.ContextInitializer@79703b86 - Constructed configurator of type class ch.qos.logback.classic.util.DefaultJoranConfigurator 09:56:23,316 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback-test.xml] 09:56:23,316 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback.xml] 09:56:23,316 |-INFO in ch.qos.logback.classic.util.ContextInitializer@79703b86 - ch.qos.logback.classic.util.DefaultJoranConfigurator.configure() call lasted 1 milliseconds. ExecutionStatus=INVOKE_NEXT_IF_ANY 09:56:23,316 |-INFO in ch.qos.logback.classic.util.ContextInitializer@79703b86 - Trying to configure with ch.qos.logback.classic.BasicConfigurator 09:56:23,317 |-INFO in ch.qos.logback.classic.util.ContextInitializer@79703b86 - Constructed configurator of type class ch.qos.logback.classic.BasicConfigurator 09:56:23,317 |-INFO in ch.qos.logback.classic.BasicConfigurator@2f8892c5 - Setting up default configuration. 09:56:23,338 |-INFO in ch.qos.logback.classic.util.ContextInitializer@79703b86 - ch.qos.logback.classic.BasicConfigurator.configure() call lasted 21 milliseconds. ExecutionStatus=NEUTRAL 09:56:29,606 |-INFO in ch.qos.logback.core.joran.util.ConfigurationWatchListUtil@4ae707c - Adding [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/defaults.xml] to configuration watch list. 09:56:29,606 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@288aaeaf - URL [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/defaults.xml] is not of type file 09:56:29,609 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word applicationName with class [org.springframework.boot.logging.logback.ApplicationNameConverter] 09:56:29,609 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word clr with class [org.springframework.boot.logging.logback.ColorConverter] 09:56:29,609 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word correlationId with class [org.springframework.boot.logging.logback.CorrelationIdConverter] 09:56:29,609 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word wex with class [org.springframework.boot.logging.logback.WhitespaceThrowableProxyConverter] 09:56:29,609 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word wEx with class [org.springframework.boot.logging.logback.ExtendedWhitespaceThrowableProxyConverter] 09:56:29,611 |-INFO in ch.qos.logback.core.joran.util.ConfigurationWatchListUtil@4ae707c - Adding [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/console-appender.xml] to configuration watch list. 
09:56:29,611 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@288aaeaf - URL [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/console-appender.xml] is not of type file 09:56:29,618 |-WARN in IfNestedWithinSecondPhaseElementSC - elements cannot be nested within an , or element 09:56:29,618 |-WARN in IfNestedWithinSecondPhaseElementSC - See also http://logback.qos.ch/codes.html#nested_if_element 09:56:29,622 |-WARN in IfNestedWithinSecondPhaseElementSC - Element at line 60 contains a nested element at line 61 09:56:29,657 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Registering a new ReconfigureOnChangeTask ReconfigureOnChangeTask(born:1713434189656) 09:56:29,657 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Will scan for changes in [file:/w/workspace/cps-master-verify-java/cps-service/target/classes/logback-spring.xml] 09:56:29,657 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Setting ReconfigureOnChangeTask scanning period to 30 seconds 09:56:29,664 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.catalina.startup.DigesterFactory] to ERROR 09:56:29,665 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@50de1c33 - Propagating ERROR level on Logger[org.apache.catalina.startup.DigesterFactory] onto the JUL framework 09:56:29,665 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.catalina.util.LifecycleBase] to ERROR 09:56:29,665 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@50de1c33 - Propagating ERROR level on Logger[org.apache.catalina.util.LifecycleBase] onto the JUL framework 09:56:29,665 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.coyote.http11.Http11NioProtocol] to WARN 09:56:29,665 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@50de1c33 - Propagating WARN level on Logger[org.apache.coyote.http11.Http11NioProtocol] onto the JUL framework 09:56:29,665 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.sshd.common.util.SecurityUtils] to WARN 09:56:29,665 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@50de1c33 - Propagating WARN level on Logger[org.apache.sshd.common.util.SecurityUtils] onto the JUL framework 09:56:29,666 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.tomcat.util.net.NioSelectorPool] to WARN 09:56:29,666 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@50de1c33 - Propagating WARN level on Logger[org.apache.tomcat.util.net.NioSelectorPool] onto the JUL framework 09:56:29,666 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.eclipse.jetty.util.component.AbstractLifeCycle] to ERROR 09:56:29,666 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@50de1c33 - Propagating ERROR level on Logger[org.eclipse.jetty.util.component.AbstractLifeCycle] onto the JUL framework 09:56:29,666 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.hibernate.validator.internal.util.Version] to WARN 09:56:29,666 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@50de1c33 - Propagating WARN level on 
Logger[org.hibernate.validator.internal.util.Version] onto the JUL framework 09:56:29,666 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.springframework.boot.actuate.endpoint.jmx] to WARN 09:56:29,666 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@50de1c33 - Propagating WARN level on Logger[org.springframework.boot.actuate.endpoint.jmx] onto the JUL framework 09:56:29,666 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [CONSOLE] 09:56:29,666 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender] 09:56:29,672 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property 09:56:29,695 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [jsonConsole] 09:56:29,695 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender] 09:56:29,702 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [net.logstash.logback.composite.loggingevent.LoggingEventJsonProviders] for [providers] property 09:56:29,708 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [net.logstash.logback.composite.loggingevent.LoggingEventPatternJsonProvider] for [pattern] property 09:56:30,227 |-INFO in ch.qos.logback.classic.pattern.DateConverter@3babb257 - Setting zoneId to "UTC" 09:56:30,236 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [asyncConsole] 09:56:30,236 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.classic.AsyncAppender] 09:56:30,552 |-INFO in ch.qos.logback.core.model.processor.conditional.IfModelHandler - Condition [property("loggingFormat").equalsIgnoreCase("json")] evaluated to false on line 61 09:56:30,554 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [CONSOLE] to ch.qos.logback.classic.AsyncAppender[asyncConsole] 09:56:30,554 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Attaching appender named [CONSOLE] to AsyncAppender. 09:56:30,554 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Setting discardingThreshold to 51 09:56:30,555 |-INFO in ch.qos.logback.classic.model.processor.RootLoggerModelHandler - Setting level of ROOT logger to INFO 09:56:30,555 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@50de1c33 - Propagating INFO level on Logger[ROOT] onto the JUL framework 09:56:30,555 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [asyncConsole] to Logger[ROOT] 09:56:30,555 |-INFO in ch.qos.logback.core.model.processor.DefaultProcessor@4c7108ce - End of configuration. 09:56:30,555 |-INFO in org.springframework.boot.logging.logback.SpringBootJoranConfigurator@31733839 - Registering current configuration as safe fallback point . 
____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \ \\/ ___)| |_)| | | | | || (_| | ) ) ) ) ' |____| .__|_| |_|_| |_\__, | / / / / =========|_|==============|___/=/_/_/_/ :: Spring Boot :: (v3.2.4) 2024-04-18T09:56:30.706Z INFO 6638 --- [ main] o.c.a.i.YangTextSchemaSourceSetCacheSpec : Starting YangTextSchemaSourceSetCacheSpec using Java 17.0.6-ea with PID 6638 (started by jenkins in /w/workspace/cps-master-verify-java/cps-service) 2024-04-18T09:56:30.707Z INFO 6638 --- [ main] o.c.a.i.YangTextSchemaSourceSetCacheSpec : No active profile set, falling back to 1 default profile: "default" 2024-04-18T09:56:31.135Z INFO 6638 --- [ main] o.c.a.i.YangTextSchemaSourceSetCacheSpec : Started YangTextSchemaSourceSetCacheSpec in 1.818 seconds (process running for 8.692) [ERROR] OpenJDK 64-Bit Server VM warning: Sharing is only supported for boot loader classes because bootstrap classpath has been appended [INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.524 s - in org.onap.cps.api.impl.YangTextSchemaSourceSetCacheSpec [INFO] Running org.onap.cps.cache.AnchorDataCacheConfigSpec . ____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \ \\/ ___)| |_)| | | | | || (_| | ) ) ) ) ' |____| .__|_| |_|_| |_\__, | / / / / =========|_|==============|___/=/_/_/_/ :: Spring Boot :: (v3.2.4) 2024-04-18T09:56:31.842Z INFO 6638 --- [ main] o.o.cps.cache.AnchorDataCacheConfigSpec : Starting AnchorDataCacheConfigSpec using Java 17.0.6-ea with PID 6638 (started by jenkins in /w/workspace/cps-master-verify-java/cps-service) 2024-04-18T09:56:31.842Z INFO 6638 --- [ main] o.o.cps.cache.AnchorDataCacheConfigSpec : No active profile set, falling back to 1 default profile: "default" 2024-04-18T09:56:31.959Z WARN 6638 --- [ main] c.h.i.impl.HazelcastInstanceFactory : Hazelcast is starting in a Java modular environment (Java 9 and newer) but without proper access to required Java packages. Use additional Java arguments to provide Hazelcast access to Java internal API. The internal API access is used to get the best performance results. Arguments to be used: --add-modules java.se --add-exports java.base/jdk.internal.ref=ALL-UNNAMED --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.management/sun.management=ALL-UNNAMED --add-opens jdk.management/com.sun.management.internal=ALL-UNNAMED 2024-04-18T09:56:32.058Z INFO 6638 --- [ main] com.hazelcast.system.logo : [10.30.106.178]:5701 [cps-and-ncmp-test-caches] [5.3.6] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 2024-04-18T09:56:32.058Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5701 [cps-and-ncmp-test-caches] [5.3.6] Copyright (c) 2008-2023, Hazelcast, Inc. All Rights Reserved. 2024-04-18T09:56:32.058Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5701 [cps-and-ncmp-test-caches] [5.3.6] Hazelcast Platform 5.3.6 (20231109 - 9903dc9) starting at [10.30.106.178]:5701 2024-04-18T09:56:32.058Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5701 [cps-and-ncmp-test-caches] [5.3.6] Cluster name: cps-and-ncmp-test-caches 2024-04-18T09:56:32.058Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5701 [cps-and-ncmp-test-caches] [5.3.6] Integrity Checker is disabled. 
Fail-fast on corrupted executables will not be performed. For more information, see the documentation for Integrity Checker. 2024-04-18T09:56:32.062Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5701 [cps-and-ncmp-test-caches] [5.3.6] The Jet engine is disabled. To enable the Jet engine on the members, do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true) - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true (for Hazelcast embedded, works only when loading config via Config.load) - Add environment variable: HZ_JET_ENABLED=true (recommended when running container image. For Hazelcast embedded, works only when loading config via Config.load) 2024-04-18T09:56:32.262Z INFO 6638 --- [ main] c.h.internal.tpc.TpcServerBootstrap : [10.30.106.178]:5701 [cps-and-ncmp-test-caches] [5.3.6] TPC: disabled 2024-04-18T09:56:32.491Z INFO 6638 --- [ main] com.hazelcast.system.security : [10.30.106.178]:5701 [cps-and-ncmp-test-caches] [5.3.6] Enable DEBUG/FINE log level for log category com.hazelcast.system.security or use -Dhazelcast.security.recommendations system property to see 🔒 security recommendations and the status of current config. 2024-04-18T09:56:32.609Z INFO 6638 --- [ main] com.hazelcast.instance.impl.Node : [10.30.106.178]:5701 [cps-and-ncmp-test-caches] [5.3.6] Using Multicast discovery 2024-04-18T09:56:32.621Z WARN 6638 --- [ main] com.hazelcast.cp.CPSubsystem : [10.30.106.178]:5701 [cps-and-ncmp-test-caches] [5.3.6] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 2024-04-18T09:56:32.933Z INFO 6638 --- [ main] c.h.internal.diagnostics.Diagnostics : [10.30.106.178]:5701 [cps-and-ncmp-test-caches] [5.3.6] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 2024-04-18T09:56:32.944Z INFO 6638 --- [ main] com.hazelcast.core.LifecycleService : [10.30.106.178]:5701 [cps-and-ncmp-test-caches] [5.3.6] [10.30.106.178]:5701 is STARTING 2024-04-18T09:56:33.158Z INFO 6638 --- [ main] c.h.i.cluster.impl.MulticastJoiner : [10.30.106.178]:5701 [cps-and-ncmp-test-caches] [5.3.6] Trying to join to discovered node: [10.30.106.210]:5701 2024-04-18T09:56:33.206Z INFO 6638 --- [.IO.thread-in-0] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5701 [cps-and-ncmp-test-caches] [5.3.6] Initialized new cluster connection between /10.30.106.178:57179 and /10.30.106.210:5701 2024-04-18T09:56:34.538Z INFO 6638 --- [.IO.thread-in-0] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5701 [cps-and-ncmp-test-caches] [5.3.6] Connection[id=1, /10.30.106.178:57179->/10.30.106.210:5701, qualifier=null, endpoint=[10.30.106.210]:5701, remoteUuid=62b107f1-49fe-4282-bfd6-c691ff7c3228, alive=false, connectionType=MEMBER, planeIndex=0] closed. 
Reason: Connection closed by the other side 2024-04-18T09:56:37.681Z INFO 6638 --- [ main] c.h.internal.cluster.ClusterService : [10.30.106.178]:5701 [cps-and-ncmp-test-caches] [5.3.6] Members {size:1, ver:1} [ Member [10.30.106.178]:5701 - c06fe5b4-e910-4dbb-a7e0-2fe60d3bf453 this ] 2024-04-18T09:56:37.701Z INFO 6638 --- [ main] com.hazelcast.core.LifecycleService : [10.30.106.178]:5701 [cps-and-ncmp-test-caches] [5.3.6] [10.30.106.178]:5701 is STARTED 2024-04-18T09:56:37.744Z INFO 6638 --- [ main] o.o.cps.cache.AnchorDataCacheConfigSpec : Started AnchorDataCacheConfigSpec in 5.937 seconds (process running for 15.301) 2024-04-18T09:56:37.800Z INFO 6638 --- [ main] o.onap.cps.cache.HazelcastCacheConfig : Enabling kubernetes mode with service-name : test-service-name [INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.998 s - in org.onap.cps.cache.AnchorDataCacheConfigSpec [INFO] Running org.onap.cps.cache.AnchorDataCacheEntrySpec [INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 s - in org.onap.cps.cache.AnchorDataCacheEntrySpec [INFO] Running org.onap.cps.cache.HazelcastCacheConfigSpec 2024-04-18T09:56:37.822Z INFO 6638 --- [ main] com.hazelcast.system.logo : [10.30.106.178]:5702 [my cluster] [5.3.6] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 2024-04-18T09:56:37.822Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5702 [my cluster] [5.3.6] Copyright (c) 2008-2023, Hazelcast, Inc. All Rights Reserved. 2024-04-18T09:56:37.822Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5702 [my cluster] [5.3.6] Hazelcast Platform 5.3.6 (20231109 - 9903dc9) starting at [10.30.106.178]:5702 2024-04-18T09:56:37.822Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5702 [my cluster] [5.3.6] Cluster name: my cluster 2024-04-18T09:56:37.822Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5702 [my cluster] [5.3.6] Integrity Checker is disabled. Fail-fast on corrupted executables will not be performed. For more information, see the documentation for Integrity Checker. 2024-04-18T09:56:37.822Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5702 [my cluster] [5.3.6] The Jet engine is disabled. To enable the Jet engine on the members, do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true) - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true (for Hazelcast embedded, works only when loading config via Config.load) - Add environment variable: HZ_JET_ENABLED=true (recommended when running container image. For Hazelcast embedded, works only when loading config via Config.load) 2024-04-18T09:56:37.828Z INFO 6638 --- [ main] c.h.internal.tpc.TpcServerBootstrap : [10.30.106.178]:5702 [my cluster] [5.3.6] TPC: disabled 2024-04-18T09:56:37.840Z INFO 6638 --- [ main] com.hazelcast.system.security : [10.30.106.178]:5702 [my cluster] [5.3.6] Enable DEBUG/FINE log level for log category com.hazelcast.system.security or use -Dhazelcast.security.recommendations system property to see 🔒 security recommendations and the status of current config. 
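For context on the discovery-related messages above ("Using Multicast discovery" and HazelcastCacheConfig's "Enabling kubernetes mode with service-name : test-service-name"): the sketch below shows the standard Hazelcast 5.x join configuration those messages correspond to. It is illustrative only and is not taken from the CPS code base; the cluster name and the "service-name" property are quoted from the log, while the wiring itself is an assumption.

// Minimal sketch, not the actual org.onap.cps.cache.HazelcastCacheConfig source.
import com.hazelcast.config.Config;
import com.hazelcast.config.JoinConfig;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;

public class HazelcastJoinConfigSketch {

    static HazelcastInstance newMember(final boolean kubernetesMode, final String serviceName) {
        final Config config = new Config();
        config.setClusterName("cps-and-ncmp-test-caches"); // cluster name seen in the log
        final JoinConfig join = config.getNetworkConfig().getJoin();
        if (kubernetesMode) {
            // corresponds to "Enabling kubernetes mode with service-name : test-service-name"
            join.getMulticastConfig().setEnabled(false);
            join.getKubernetesConfig().setEnabled(true)
                .setProperty("service-name", serviceName);
        } else {
            join.getMulticastConfig().setEnabled(true); // corresponds to "Using Multicast discovery"
        }
        return Hazelcast.newHazelcastInstance(config);
    }
}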
2024-04-18T09:56:37.848Z INFO 6638 --- [ main] com.hazelcast.instance.impl.Node : [10.30.106.178]:5702 [my cluster] [5.3.6] Using Multicast discovery 2024-04-18T09:56:37.848Z WARN 6638 --- [ main] com.hazelcast.cp.CPSubsystem : [10.30.106.178]:5702 [my cluster] [5.3.6] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 2024-04-18T09:56:37.869Z INFO 6638 --- [ main] c.h.internal.diagnostics.Diagnostics : [10.30.106.178]:5702 [my cluster] [5.3.6] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 2024-04-18T09:56:37.869Z INFO 6638 --- [ main] com.hazelcast.core.LifecycleService : [10.30.106.178]:5702 [my cluster] [5.3.6] [10.30.106.178]:5702 is STARTING 2024-04-18T09:56:41.060Z INFO 6638 --- [ main] c.h.internal.cluster.ClusterService : [10.30.106.178]:5702 [my cluster] [5.3.6] Members {size:1, ver:1} [ Member [10.30.106.178]:5702 - 4f962d49-7dc5-4a16-95f9-d0dddf87135e this ] 2024-04-18T09:56:41.060Z WARN 6638 --- [ main] com.hazelcast.instance.impl.Node : [10.30.106.178]:5702 [my cluster] [5.3.6] Config seed port is 5701 and cluster size is 1. Some of the ports seem occupied! 2024-04-18T09:56:41.061Z INFO 6638 --- [ main] com.hazelcast.core.LifecycleService : [10.30.106.178]:5702 [my cluster] [5.3.6] [10.30.106.178]:5702 is STARTED 2024-04-18T09:56:41.072Z INFO 6638 --- [ main] com.hazelcast.system.logo : [10.30.106.178]:5703 [my cluster] [5.3.6] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 2024-04-18T09:56:41.072Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5703 [my cluster] [5.3.6] Copyright (c) 2008-2023, Hazelcast, Inc. All Rights Reserved. 2024-04-18T09:56:41.072Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5703 [my cluster] [5.3.6] Hazelcast Platform 5.3.6 (20231109 - 9903dc9) starting at [10.30.106.178]:5703 2024-04-18T09:56:41.072Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5703 [my cluster] [5.3.6] Cluster name: my cluster 2024-04-18T09:56:41.072Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5703 [my cluster] [5.3.6] Integrity Checker is disabled. Fail-fast on corrupted executables will not be performed. For more information, see the documentation for Integrity Checker. 2024-04-18T09:56:41.072Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5703 [my cluster] [5.3.6] The Jet engine is disabled. To enable the Jet engine on the members, do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true) - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true (for Hazelcast embedded, works only when loading config via Config.load) - Add environment variable: HZ_JET_ENABLED=true (recommended when running container image. 
For Hazelcast embedded, works only when loading config via Config.load) 2024-04-18T09:56:41.079Z INFO 6638 --- [ main] c.h.internal.tpc.TpcServerBootstrap : [10.30.106.178]:5703 [my cluster] [5.3.6] TPC: disabled 2024-04-18T09:56:41.094Z INFO 6638 --- [ main] com.hazelcast.system.security : [10.30.106.178]:5703 [my cluster] [5.3.6] Enable DEBUG/FINE log level for log category com.hazelcast.system.security or use -Dhazelcast.security.recommendations system property to see 🔒 security recommendations and the status of current config. 2024-04-18T09:56:41.096Z INFO 6638 --- [ main] com.hazelcast.instance.impl.Node : [10.30.106.178]:5703 [my cluster] [5.3.6] Using Multicast discovery 2024-04-18T09:56:41.096Z WARN 6638 --- [ main] com.hazelcast.cp.CPSubsystem : [10.30.106.178]:5703 [my cluster] [5.3.6] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 2024-04-18T09:56:41.119Z INFO 6638 --- [ main] c.h.internal.diagnostics.Diagnostics : [10.30.106.178]:5703 [my cluster] [5.3.6] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 2024-04-18T09:56:41.120Z INFO 6638 --- [ main] com.hazelcast.core.LifecycleService : [10.30.106.178]:5703 [my cluster] [5.3.6] [10.30.106.178]:5703 is STARTING 2024-04-18T09:56:41.340Z INFO 6638 --- [ main] c.h.i.cluster.impl.MulticastJoiner : [10.30.106.178]:5703 [my cluster] [5.3.6] Trying to join to discovered node: [10.30.106.178]:5702 2024-04-18T09:56:41.344Z INFO 6638 --- [.IO.thread-in-0] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5702 [my cluster] [5.3.6] Initialized new cluster connection between /10.30.106.178:5702 and /10.30.106.178:37379 2024-04-18T09:56:41.345Z INFO 6638 --- [.IO.thread-in-0] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5703 [my cluster] [5.3.6] Initialized new cluster connection between /10.30.106.178:37379 and /10.30.106.178:5702 2024-04-18T09:56:46.460Z INFO 6638 --- [ration.thread-0] c.h.internal.cluster.ClusterService : [10.30.106.178]:5702 [my cluster] [5.3.6] Members {size:2, ver:2} [ Member [10.30.106.178]:5702 - 4f962d49-7dc5-4a16-95f9-d0dddf87135e this Member [10.30.106.178]:5703 - 10131b91-9fe1-4b86-8b6e-74e4fa55a589 ] 2024-04-18T09:56:46.470Z INFO 6638 --- [ration.thread-2] c.h.internal.cluster.ClusterService : [10.30.106.178]:5703 [my cluster] [5.3.6] Members {size:2, ver:2} [ Member [10.30.106.178]:5702 - 4f962d49-7dc5-4a16-95f9-d0dddf87135e Member [10.30.106.178]:5703 - 10131b91-9fe1-4b86-8b6e-74e4fa55a589 this ] 2024-04-18T09:56:46.470Z WARN 6638 --- [ main] com.hazelcast.instance.impl.Node : [10.30.106.178]:5703 [my cluster] [5.3.6] Config seed port is 5701 and cluster size is 2. Some of the ports seem occupied! 2024-04-18T09:56:46.471Z INFO 6638 --- [ main] com.hazelcast.core.LifecycleService : [10.30.106.178]:5703 [my cluster] [5.3.6] [10.30.106.178]:5703 is STARTED 2024-04-18T09:56:46.484Z INFO 6638 --- [ main] com.hazelcast.system.logo : [10.30.106.178]:5704 [my cluster] [5.3.6] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 2024-04-18T09:56:46.484Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5704 [my cluster] [5.3.6] Copyright (c) 2008-2023, Hazelcast, Inc. All Rights Reserved. 
2024-04-18T09:56:46.484Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5704 [my cluster] [5.3.6] Hazelcast Platform 5.3.6 (20231109 - 9903dc9) starting at [10.30.106.178]:5704 2024-04-18T09:56:46.484Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5704 [my cluster] [5.3.6] Cluster name: my cluster 2024-04-18T09:56:46.484Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5704 [my cluster] [5.3.6] Integrity Checker is disabled. Fail-fast on corrupted executables will not be performed. For more information, see the documentation for Integrity Checker. 2024-04-18T09:56:46.484Z INFO 6638 --- [ main] com.hazelcast.system : [10.30.106.178]:5704 [my cluster] [5.3.6] The Jet engine is disabled. To enable the Jet engine on the members, do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true) - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true (for Hazelcast embedded, works only when loading config via Config.load) - Add environment variable: HZ_JET_ENABLED=true (recommended when running container image. For Hazelcast embedded, works only when loading config via Config.load) 2024-04-18T09:56:46.490Z INFO 6638 --- [ main] c.h.internal.tpc.TpcServerBootstrap : [10.30.106.178]:5704 [my cluster] [5.3.6] TPC: disabled 2024-04-18T09:56:46.500Z INFO 6638 --- [ main] com.hazelcast.system.security : [10.30.106.178]:5704 [my cluster] [5.3.6] Enable DEBUG/FINE log level for log category com.hazelcast.system.security or use -Dhazelcast.security.recommendations system property to see 🔒 security recommendations and the status of current config. 2024-04-18T09:56:46.501Z INFO 6638 --- [ main] com.hazelcast.instance.impl.Node : [10.30.106.178]:5704 [my cluster] [5.3.6] Using Multicast discovery 2024-04-18T09:56:46.502Z WARN 6638 --- [ main] com.hazelcast.cp.CPSubsystem : [10.30.106.178]:5704 [my cluster] [5.3.6] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 2024-04-18T09:56:46.512Z INFO 6638 --- [ main] c.h.internal.diagnostics.Diagnostics : [10.30.106.178]:5704 [my cluster] [5.3.6] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 
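The Hazelcast startup output above repeatedly notes that the Jet engine, diagnostics and the CP Subsystem are disabled in this test run, and an earlier warning lists the JVM arguments Hazelcast wants for internal-API access on Java 9+. The sketch below only restates those options through the standard Hazelcast 5.x Config API (the Jet call is quoted verbatim from the log); it is not CPS code and does not imply the build should enable any of them.

// Illustrative sketch of the switches mentioned in the Hazelcast log output above.
import com.hazelcast.config.Config;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;

public class HazelcastOptionalFeaturesSketch {

    public static void main(final String[] args) {
        final Config config = new Config();

        // "The Jet engine is disabled" -> enable it programmatically on an embedded member:
        config.getJetConfig().setEnabled(true);
        // alternatives quoted in the log: -Dhz.jet.enabled=true or HZ_JET_ENABLED=true

        // "CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode!"
        // -> enable it by declaring at least 3 CP members:
        config.getCPSubsystemConfig().setCPMemberCount(3);

        // "Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments."
        // The modular-environment warning earlier also lists the JVM flags Hazelcast suggests, e.g.:
        //   --add-modules java.se --add-exports java.base/jdk.internal.ref=ALL-UNNAMED
        //   --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED
        //   --add-opens java.management/sun.management=ALL-UNNAMED
        //   --add-opens jdk.management/com.sun.management.internal=ALL-UNNAMED

        final HazelcastInstance member = Hazelcast.newHazelcastInstance(config);
        member.shutdown();
    }
}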
2024-04-18T09:56:46.513Z INFO 6638 --- [ main] com.hazelcast.core.LifecycleService : [10.30.106.178]:5704 [my cluster] [5.3.6] [10.30.106.178]:5704 is STARTING 2024-04-18T09:56:46.629Z INFO 6638 --- [ main] c.h.i.cluster.impl.MulticastJoiner : [10.30.106.178]:5704 [my cluster] [5.3.6] Trying to join to discovered node: [10.30.106.178]:5702 2024-04-18T09:56:46.631Z INFO 6638 --- [.IO.thread-in-1] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5702 [my cluster] [5.3.6] Initialized new cluster connection between /10.30.106.178:5702 and /10.30.106.178:33803 2024-04-18T09:56:46.632Z INFO 6638 --- [.IO.thread-in-0] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5704 [my cluster] [5.3.6] Initialized new cluster connection between /10.30.106.178:33803 and /10.30.106.178:5702 2024-04-18T09:56:51.739Z INFO 6638 --- [ration.thread-0] c.h.internal.cluster.ClusterService : [10.30.106.178]:5702 [my cluster] [5.3.6] Members {size:3, ver:3} [ Member [10.30.106.178]:5702 - 4f962d49-7dc5-4a16-95f9-d0dddf87135e this Member [10.30.106.178]:5703 - 10131b91-9fe1-4b86-8b6e-74e4fa55a589 Member [10.30.106.178]:5704 - 8c9e4411-ba9b-4bad-b929-c34cad338580 ] 2024-04-18T09:56:51.742Z INFO 6638 --- [ration.thread-0] c.h.internal.cluster.ClusterService : [10.30.106.178]:5704 [my cluster] [5.3.6] Members {size:3, ver:3} [ Member [10.30.106.178]:5702 - 4f962d49-7dc5-4a16-95f9-d0dddf87135e Member [10.30.106.178]:5703 - 10131b91-9fe1-4b86-8b6e-74e4fa55a589 Member [10.30.106.178]:5704 - 8c9e4411-ba9b-4bad-b929-c34cad338580 this ] 2024-04-18T09:56:51.743Z INFO 6638 --- [cached.thread-1] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5704 [my cluster] [5.3.6] Connecting to /10.30.106.178:5703, timeout: 10000, bind-any: true 2024-04-18T09:56:51.744Z INFO 6638 --- [ration.thread-2] c.h.internal.cluster.ClusterService : [10.30.106.178]:5703 [my cluster] [5.3.6] Members {size:3, ver:3} [ Member [10.30.106.178]:5702 - 4f962d49-7dc5-4a16-95f9-d0dddf87135e Member [10.30.106.178]:5703 - 10131b91-9fe1-4b86-8b6e-74e4fa55a589 this Member [10.30.106.178]:5704 - 8c9e4411-ba9b-4bad-b929-c34cad338580 ] 2024-04-18T09:56:51.744Z INFO 6638 --- [.IO.thread-in-2] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5703 [my cluster] [5.3.6] Initialized new cluster connection between /10.30.106.178:5703 and /10.30.106.178:53323 2024-04-18T09:56:51.745Z INFO 6638 --- [.IO.thread-in-2] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5704 [my cluster] [5.3.6] Initialized new cluster connection between /10.30.106.178:5704 and /10.30.106.178:47751 2024-04-18T09:56:51.745Z INFO 6638 --- [.IO.thread-in-1] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5703 [my cluster] [5.3.6] Initialized new cluster connection between /10.30.106.178:47751 and /10.30.106.178:5704 2024-04-18T09:56:51.745Z INFO 6638 --- [.IO.thread-in-1] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5704 [my cluster] [5.3.6] Initialized new cluster connection between /10.30.106.178:53323 and /10.30.106.178:5703 2024-04-18T09:56:52.743Z WARN 6638 --- [ main] com.hazelcast.instance.impl.Node : [10.30.106.178]:5704 [my cluster] [5.3.6] Config seed port is 5701 and cluster size is 3. Some of the ports seem occupied! 
2024-04-18T09:56:52.743Z INFO 6638 --- [ main] com.hazelcast.core.LifecycleService : [10.30.106.178]:5704 [my cluster] [5.3.6] [10.30.106.178]:5704 is STARTED [INFO] Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.945 s - in org.onap.cps.cache.HazelcastCacheConfigSpec [INFO] Running org.onap.cps.config.AsyncConfigSpec [INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.026 s - in org.onap.cps.config.AsyncConfigSpec [INFO] Running org.onap.cps.config.CacheConfigSpec [INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 s - in org.onap.cps.config.CacheConfigSpec [INFO] Running org.onap.cps.events.EventsPublisherSpec 2024-04-18T09:56:53.022Z DEBUG 6638 --- [ main] org.onap.cps.events.EventsPublisher : Successfully published event to topic : some-topic , Event : some-value 2024-04-18T09:56:53.029Z ERROR 6638 --- [ main] org.onap.cps.events.EventsPublisher : Unable to publish event to topic : some-topic due to some exception 2024-04-18T09:56:53.042Z DEBUG 6638 --- [ main] org.onap.cps.events.EventsPublisher : Successfully published event to topic : some-topic , Event : some-value 2024-04-18T09:56:53.049Z DEBUG 6638 --- [ main] org.onap.cps.events.EventsPublisher : Successfully published event to topic : some-topic , Event : some-value 2024-04-18T09:56:53.055Z DEBUG 6638 --- [ main] org.onap.cps.events.EventsPublisher : Successfully published event to topic : some-topic , Event : some-value 2024-04-18T09:56:53.056Z DEBUG 6638 --- [ main] org.onap.cps.events.EventsPublisher : Successfully published event to topic : some-topic , Event : some-value 2024-04-18T09:56:53.056Z ERROR 6638 --- [ main] org.onap.cps.events.EventsPublisher : Unable to publish event to topic : some-topic due to some exception [INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.263 s - in org.onap.cps.events.EventsPublisherSpec [INFO] Running org.onap.cps.spi.FetchDescendantsOptionSpec [INFO] Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.013 s - in org.onap.cps.spi.FetchDescendantsOptionSpec [INFO] Running org.onap.cps.spi.PaginationOptionSpec [INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.003 s - in org.onap.cps.spi.PaginationOptionSpec [INFO] Running org.onap.cps.spi.exceptions.CpsExceptionsSpec [INFO] Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.021 s - in org.onap.cps.spi.exceptions.CpsExceptionsSpec [INFO] Running org.onap.cps.spi.model.ConditionPropertiesSpec [INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.019 s - in org.onap.cps.spi.model.ConditionPropertiesSpec [INFO] Running org.onap.cps.spi.model.DataNodeBuilderSpec 2024-04-18T09:56:53.301Z WARN 6638 --- [ main] org.onap.cps.spi.model.DataNodeBuilder : Unsupported NormalizedNode type detected: class jdk.proxy2.$Proxy145 [INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.206 s - in org.onap.cps.spi.model.DataNodeBuilderSpec [INFO] Running org.onap.cps.spi.model.DeltaReportBuilderSpec [INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.004 s - in org.onap.cps.spi.model.DeltaReportBuilderSpec [INFO] Running org.onap.cps.yang.YangTextSchemaSourceSetBuilderSpec 2024-04-18T09:56:53.378Z ERROR 6638 --- [ main] o.o.y.y.p.s.reactor.BuildGlobalContext : Failed to parse YANG from source SourceSpecificContext [source=YangStatementStreamSource{identifier=RevisionSourceIdentifier [name=invalid-missing-import.yang]}, current=SOURCE_PRE_LINKAGE, finished=INIT]: 
Imported module [missing-module] was not found. [at :8:5] [INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.039 s - in org.onap.cps.yang.YangTextSchemaSourceSetBuilderSpec [INFO] [INFO] Results: [INFO] [INFO] Tests run: 309, Failures: 0, Errors: 0, Skipped: 0 [INFO] [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:report (post-unit-test) @ cps-service --- [INFO] Loading execution data file /w/workspace/cps-master-verify-java/cps-service/target/code-coverage/jacoco-ut.exec [INFO] Analyzed bundle 'cps-service' with 56 classes [INFO] [INFO] --- exec-maven-plugin:1.6.0:exec (generate-csv) @ cps-service --- [INFO] [INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ cps-service --- [INFO] Building jar: /w/workspace/cps-master-verify-java/cps-service/target/cps-service-3.4.8-SNAPSHOT.jar [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (pre-integration-test) @ cps-service --- [INFO] failsafeArgLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-service/target/code-coverage/jacoco-it.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/* [INFO] [INFO] --- maven-failsafe-plugin:3.0.0-M4:integration-test (integration-tests) @ cps-service --- [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:report (post-integration-test) @ cps-service --- [INFO] Skipping JaCoCo execution due to missing execution data file. [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:check (coverage-check) @ cps-service --- [INFO] Loading execution data file /w/workspace/cps-master-verify-java/cps-service/target/code-coverage/jacoco-ut.exec [INFO] Analyzed bundle 'cps-service' with 56 classes [INFO] All coverage checks have been met. 
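The org.onap.cps.events.EventsPublisherSpec output above shows the publisher logging "Successfully published event to topic ..." on success and "Unable to publish event to topic ... due to ..." on failure. A minimal sketch of that pattern, assuming Spring Kafka's KafkaTemplate (Spring Kafka 3.x, where send(...) returns a CompletableFuture); this is illustrative and not the actual EventsPublisher implementation.

// Minimal sketch, not the actual org.onap.cps.events.EventsPublisher.
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.core.KafkaTemplate;

public class EventsPublisherSketch<T> {

    private static final Logger log = LoggerFactory.getLogger(EventsPublisherSketch.class);

    private final KafkaTemplate<String, T> kafkaTemplate;

    public EventsPublisherSketch(final KafkaTemplate<String, T> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void publishEvent(final String topic, final String key, final T event) {
        kafkaTemplate.send(topic, key, event).whenComplete((sendResult, throwable) -> {
            if (throwable == null) {
                log.debug("Successfully published event to topic : {} , Event : {}", topic, event);
            } else {
                log.error("Unable to publish event to topic : {} due to {}", topic, throwable.getMessage());
            }
        });
    }
}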
[INFO] [INFO] --- maven-failsafe-plugin:3.0.0-M4:verify (integration-tests) @ cps-service --- [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ cps-service --- [INFO] Installing /w/workspace/cps-master-verify-java/cps-service/target/cps-service-3.4.8-SNAPSHOT.jar to /home/jenkins/.m2/repository/org/onap/cps/cps-service/3.4.8-SNAPSHOT/cps-service-3.4.8-SNAPSHOT.jar [INFO] Installing /w/workspace/cps-master-verify-java/cps-service/pom.xml to /home/jenkins/.m2/repository/org/onap/cps/cps-service/3.4.8-SNAPSHOT/cps-service-3.4.8-SNAPSHOT.pom [INFO] [INFO] -----------------------< org.onap.cps:cps-rest >------------------------ [INFO] Building cps-rest 3.4.8-SNAPSHOT [9/23] [INFO] --------------------------------[ jar ]--------------------------------- [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ cps-rest --- [INFO] [INFO] --- maven-enforcer-plugin:3.0.0-M2:enforce (enforce-property) @ cps-rest --- [INFO] [INFO] --- maven-enforcer-plugin:3.0.0-M2:enforce (enforce-no-snapshots) @ cps-rest --- [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (pre-unit-test) @ cps-rest --- [INFO] surefireArgLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-rest/target/code-coverage/jacoco-ut.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/* [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (default-prepare-agent) @ cps-rest --- [INFO] argLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-rest/target/jacoco.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/* [INFO] [INFO] --- openapi-generator-maven-plugin:6.6.0:generate (code-gen) @ cps-rest --- [INFO] Generating with dryRun=false [INFO] Output directory (/w/workspace/cps-master-verify-java/cps-rest/target/generated-sources/openapi) does not exist, or is inaccessible. No file (.openapi-generator-ignore) will be evaluated. [INFO] OpenAPI Generator: spring (server) [INFO] Generator 'spring' is considered stable. [INFO] ---------------------------------- [INFO] Set base package to invoker package (org.onap.cps.rest.controller) [INFO] Environment variable JAVA_POST_PROCESS_FILE not defined so the Java code may not be properly formatted. To define it, try 'export JAVA_POST_PROCESS_FILE="/usr/local/bin/clang-format -i"' (Linux/Mac) [INFO] NOTE: To enable file post-processing, 'enablePostProcessFile' must be set to `true` (--enable-post-process-file for CLI). 
[WARNING] Multiple schemas found in the OAS 'content' section, returning only the first one (application/json) [INFO] Processing operation createDataspace [INFO] Processing operation deleteDataspace [INFO] Processing operation createDataspaceV2 [INFO] Processing operation getAllDataspaces [INFO] Processing operation getDataspace [INFO] Processing operation createAnchor [INFO] Processing operation createAnchorV2 [INFO] Processing operation getAnchors [INFO] Processing operation getAnchor [INFO] Processing operation deleteAnchor [INFO] Processing operation createSchemaSet [INFO] Processing operation createSchemaSetV2 [INFO] Processing operation getSchemaSets [INFO] Processing operation getSchemaSet [INFO] Processing operation deleteSchemaSet [INFO] Processing operation getNodeByDataspaceAndAnchor [INFO] Processing operation getNodeByDataspaceAndAnchorV2 [INFO] Processing operation replaceNode [INFO] Processing operation createNode [WARNING] Multiple schemas found in the OAS 'content' section, returning only the first one (application/json) [INFO] Processing operation deleteDataNode [INFO] Processing operation updateNodeLeaves [INFO] Processing operation deleteListOrListElement [INFO] Processing operation replaceListContent [INFO] Processing operation addListElements [INFO] Processing operation getDeltaByDataspaceAndAnchors [INFO] Processing operation getNodesByDataspaceAndAnchorAndCpsPath [INFO] Processing operation getNodesByDataspaceAndAnchorAndCpsPathV2 [INFO] Processing operation getNodesByDataspaceAndCpsPath [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] Model MultipartFile not generated since it's marked as unused (due to form parameters) and `skipFormModel` (global property) set to true (default) [INFO] writing file /w/workspace/cps-master-verify-java/cps-rest/target/generated-sources/openapi/src/gen/java/org/onap/cps/rest/model/AnchorDetails.java [INFO] writing file /w/workspace/cps-master-verify-java/cps-rest/target/generated-sources/openapi/src/gen/java/org/onap/cps/rest/model/DataspaceDetails.java [INFO] writing file /w/workspace/cps-master-verify-java/cps-rest/target/generated-sources/openapi/src/gen/java/org/onap/cps/rest/model/ErrorMessage.java [INFO] writing file /w/workspace/cps-master-verify-java/cps-rest/target/generated-sources/openapi/src/gen/java/org/onap/cps/rest/model/ModuleReferences.java [INFO] writing file /w/workspace/cps-master-verify-java/cps-rest/target/generated-sources/openapi/src/gen/java/org/onap/cps/rest/model/SchemaSetDetails.java [WARNING] Multiple schemas found in the OAS 'content' section, returning only the first one (application/json) [WARNING] Multiple MediaTypes found, using only the first one [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] writing file /w/workspace/cps-master-verify-java/cps-rest/target/generated-sources/openapi/src/gen/java/org/onap/cps/rest/api/CpsAdminApi.java [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. 
Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] writing file /w/workspace/cps-master-verify-java/cps-rest/target/generated-sources/openapi/src/gen/java/org/onap/cps/rest/api/CpsDataApi.java [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] writing file /w/workspace/cps-master-verify-java/cps-rest/target/generated-sources/openapi/src/gen/java/org/onap/cps/rest/api/CpsQueryApi.java [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] Skipping generation of supporting files. ################################################################################ # Thanks for using OpenAPI Generator. # # Please consider donation to help us maintain this project 🙏 # # https://opencollective.com/openapi_generator/donate # ################################################################################ [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (onap-license) @ cps-rest --- [INFO] You have 0 Checkstyle violations. [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (onap-java-style) @ cps-rest --- [INFO] You have 0 Checkstyle violations. [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (cps-java-style) @ cps-rest --- [INFO] Starting audit... Audit done. [INFO] You have 0 Checkstyle violations. [INFO] [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ cps-rest --- [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /w/workspace/cps-master-verify-java/cps-rest/src/main/resources [INFO] skip non existing resourceDirectory /w/workspace/cps-master-verify-java/cps-rest/target/generated-sources/license [INFO] skip non existing resourceDirectory /w/workspace/cps-master-verify-java/cps-rest/target/generated-resources/licenses [INFO] [INFO] --- maven-compiler-plugin:3.11.0:compile (default-compile) @ cps-rest --- [INFO] Changes detected - recompiling the module! :dependency [INFO] Compiling 15 source files with javac [debug release 17] to target/classes [WARNING] /w/workspace/cps-master-verify-java/cps-rest/src/main/java/org/onap/cps/rest/controller/CpsRestInputMapper.java:[41,22] Unmapped target property: "name". Mapping from Collection element "ModuleReference moduleReferences" to "ModuleReferences moduleReferences". [INFO] /w/workspace/cps-master-verify-java/cps-rest/src/main/java/org/onap/cps/rest/controller/DataRestController.java: Some input files use or override a deprecated API. [INFO] /w/workspace/cps-master-verify-java/cps-rest/src/main/java/org/onap/cps/rest/controller/DataRestController.java: Recompile with -Xlint:deprecation for details. [INFO] [INFO] >>> spotbugs-maven-plugin:4.4.2:check (analyze-compile) > :spotbugs @ cps-rest >>> [INFO] [INFO] --- spotbugs-maven-plugin:4.4.2:spotbugs (spotbugs) @ cps-rest --- [INFO] Fork Value is true [java] WARNING: A terminally deprecated method in java.lang.System has been called [java] WARNING: System::setSecurityManager has been called by edu.umd.cs.findbugs.ba.jsr305.TypeQualifierValue (file:/home/jenkins/.m2/repository/com/github/spotbugs/spotbugs/4.2.3/spotbugs-4.2.3.jar) [java] WARNING: Please consider reporting this to the maintainers of edu.umd.cs.findbugs.ba.jsr305.TypeQualifierValue [java] WARNING: System::setSecurityManager will be removed in a future release [INFO] Done SpotBugs Analysis.... 
[INFO] [INFO] <<< spotbugs-maven-plugin:4.4.2:check (analyze-compile) < :spotbugs @ cps-rest <<< [INFO] [INFO] [INFO] --- spotbugs-maven-plugin:4.4.2:check (analyze-compile) @ cps-rest --- [INFO] BugInstance size is 0 [INFO] Error size is 0 [INFO] No errors/warnings found [INFO] [INFO] --- openapi-generator-maven-plugin:6.6.0:generate (openapi-yaml-gen) @ cps-rest --- [INFO] Generating with dryRun=false [INFO] No .openapi-generator-ignore file found. [INFO] OpenAPI Generator: openapi-yaml (documentation) [INFO] Generator 'openapi-yaml' is considered stable. [INFO] Output file [outputFile=openapi.yaml] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] Model MultipartFile not generated since it's marked as unused (due to form parameters) and `skipFormModel` (global property) set to true (default) [WARNING] Multiple schemas found in the OAS 'content' section, returning only the first one (application/json) [WARNING] Multiple MediaTypes found, using only the first one [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. 
Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] 'host' (OAS 2.0) or 'servers' (OAS 3.0) not defined in the spec. Default to [http://localhost] for server URL [http://localhost/cps/api] [INFO] writing file /w/workspace/cps-master-verify-java/cps-rest/target/generated-sources/openapi/README.md [INFO] writing file /w/workspace/cps-master-verify-java/cps-rest/target/generated-sources/openapi/openapi.yaml [INFO] writing file /w/workspace/cps-master-verify-java/cps-rest/target/generated-sources/openapi/.openapi-generator-ignore [INFO] writing file /w/workspace/cps-master-verify-java/cps-rest/target/generated-sources/openapi/.openapi-generator/VERSION [INFO] writing file /w/workspace/cps-master-verify-java/cps-rest/target/generated-sources/openapi/.openapi-generator/FILES ################################################################################ # Thanks for using OpenAPI Generator. # # Please consider donation to help us maintain this project 🙏 # # https://opencollective.com/openapi_generator/donate # ################################################################################ [INFO] [INFO] --- maven-resources-plugin:2.6:copy-resources (copy-resources) @ cps-rest --- [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 1 resource [INFO] [INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ cps-rest --- [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 6 resources [INFO] [INFO] --- maven-compiler-plugin:3.11.0:testCompile (default-testCompile) @ cps-rest --- [INFO] Changes detected - recompiling the module! :dependency [INFO] Compiling 2 source files with javac [debug release 17] to target/test-classes [INFO] [INFO] --- gmavenplus-plugin:1.9.0:compileTests (default) @ cps-rest --- [INFO] Using isolated classloader, without GMavenPlus classpath. [INFO] Using Groovy 3.0.18 to perform compileTests. [INFO] Compiled 27 files. [INFO] [INFO] --- maven-surefire-plugin:3.0.0-M5:test (default-test) @ cps-rest --- [INFO] [INFO] ------------------------------------------------------- [INFO] T E S T S [INFO] ------------------------------------------------------- 09:57:06.928 [main] INFO org.springframework.test.context.support.AnnotationConfigContextLoaderUtils -- Could not detect default configuration classes for test class [org.onap.cps.rest.controller.AdminRestControllerSpec]: AdminRestControllerSpec does not declare any static, non-private, non-final, nested classes annotated with @Configuration. 
09:57:07.156 [main] INFO org.springframework.boot.test.context.SpringBootTestContextBootstrapper -- Found @SpringBootConfiguration org.onap.cps.TestApplication for test class org.onap.cps.rest.controller.AdminRestControllerSpec 09:57:07.294 [main] INFO org.springframework.test.context.support.AnnotationConfigContextLoaderUtils -- Could not detect default configuration classes for test class [org.onap.cps.rest.controller.DataRestControllerSpec]: DataRestControllerSpec does not declare any static, non-private, non-final, nested classes annotated with @Configuration. 09:57:07.313 [main] INFO org.springframework.boot.test.context.SpringBootTestContextBootstrapper -- Found @SpringBootConfiguration org.onap.cps.TestApplication for test class org.onap.cps.rest.controller.DataRestControllerSpec 09:57:07.332 [main] INFO org.springframework.test.context.support.AnnotationConfigContextLoaderUtils -- Could not detect default configuration classes for test class [org.onap.cps.rest.controller.QueryRestControllerSpec]: QueryRestControllerSpec does not declare any static, non-private, non-final, nested classes annotated with @Configuration. 09:57:07.343 [main] INFO org.springframework.boot.test.context.SpringBootTestContextBootstrapper -- Found @SpringBootConfiguration org.onap.cps.TestApplication for test class org.onap.cps.rest.controller.QueryRestControllerSpec 09:57:07.358 [main] INFO org.springframework.test.context.support.AnnotationConfigContextLoaderUtils -- Could not detect default configuration classes for test class [org.onap.cps.rest.exceptions.CpsRestExceptionHandlerSpec]: CpsRestExceptionHandlerSpec does not declare any static, non-private, non-final, nested classes annotated with @Configuration. 09:57:07.383 [main] INFO org.springframework.boot.test.context.SpringBootTestContextBootstrapper -- Found @SpringBootConfiguration org.onap.cps.TestApplication for test class org.onap.cps.rest.exceptions.CpsRestExceptionHandlerSpec 09:57:07.430 [main] INFO org.springframework.test.context.support.AnnotationConfigContextLoaderUtils -- Could not detect default configuration classes for test class [org.onap.cps.rest.controller.AdminRestControllerSpec]: AdminRestControllerSpec does not declare any static, non-private, non-final, nested classes annotated with @Configuration. 09:57:07.440 [main] INFO org.springframework.boot.test.context.SpringBootTestContextBootstrapper -- Found @SpringBootConfiguration org.onap.cps.TestApplication for test class org.onap.cps.rest.controller.AdminRestControllerSpec 09:57:07.444 [main] INFO org.springframework.test.context.support.AnnotationConfigContextLoaderUtils -- Could not detect default configuration classes for test class [org.onap.cps.rest.controller.DataRestControllerSpec]: DataRestControllerSpec does not declare any static, non-private, non-final, nested classes annotated with @Configuration. 09:57:07.453 [main] INFO org.springframework.boot.test.context.SpringBootTestContextBootstrapper -- Found @SpringBootConfiguration org.onap.cps.TestApplication for test class org.onap.cps.rest.controller.DataRestControllerSpec 09:57:07.456 [main] INFO org.springframework.test.context.support.AnnotationConfigContextLoaderUtils -- Could not detect default configuration classes for test class [org.onap.cps.rest.controller.QueryRestControllerSpec]: QueryRestControllerSpec does not declare any static, non-private, non-final, nested classes annotated with @Configuration. 
09:57:07.464 [main] INFO org.springframework.boot.test.context.SpringBootTestContextBootstrapper -- Found @SpringBootConfiguration org.onap.cps.TestApplication for test class org.onap.cps.rest.controller.QueryRestControllerSpec 09:57:07.466 [main] INFO org.springframework.test.context.support.AnnotationConfigContextLoaderUtils -- Could not detect default configuration classes for test class [org.onap.cps.rest.exceptions.CpsRestExceptionHandlerSpec]: CpsRestExceptionHandlerSpec does not declare any static, non-private, non-final, nested classes annotated with @Configuration. 09:57:07.474 [main] INFO org.springframework.boot.test.context.SpringBootTestContextBootstrapper -- Found @SpringBootConfiguration org.onap.cps.TestApplication for test class org.onap.cps.rest.exceptions.CpsRestExceptionHandlerSpec [INFO] Running org.onap.cps.rest.controller.AdminRestControllerSpec 09:57:06,743 |-INFO in ch.qos.logback.classic.LoggerContext[default] - This is logback-classic version 1.4.14 09:57:06,746 |-INFO in ch.qos.logback.classic.util.ContextInitializer@6ea4b4b2 - Here is a list of configurators discovered as a service, by rank: 09:57:06,746 |-INFO in ch.qos.logback.classic.util.ContextInitializer@6ea4b4b2 - org.springframework.boot.logging.logback.RootLogLevelConfigurator 09:57:06,746 |-INFO in ch.qos.logback.classic.util.ContextInitializer@6ea4b4b2 - They will be invoked in order until ExecutionStatus.DO_NOT_INVOKE_NEXT_IF_ANY is returned. 09:57:06,746 |-INFO in ch.qos.logback.classic.util.ContextInitializer@6ea4b4b2 - Constructed configurator of type class org.springframework.boot.logging.logback.RootLogLevelConfigurator 09:57:06,756 |-INFO in ch.qos.logback.classic.util.ContextInitializer@6ea4b4b2 - org.springframework.boot.logging.logback.RootLogLevelConfigurator.configure() call lasted 1 milliseconds. ExecutionStatus=INVOKE_NEXT_IF_ANY 09:57:06,756 |-INFO in ch.qos.logback.classic.util.ContextInitializer@6ea4b4b2 - Trying to configure with ch.qos.logback.classic.joran.SerializedModelConfigurator 09:57:06,757 |-INFO in ch.qos.logback.classic.util.ContextInitializer@6ea4b4b2 - Constructed configurator of type class ch.qos.logback.classic.joran.SerializedModelConfigurator 09:57:06,759 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback-test.scmo] 09:57:06,759 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback.scmo] 09:57:06,759 |-INFO in ch.qos.logback.classic.util.ContextInitializer@6ea4b4b2 - ch.qos.logback.classic.joran.SerializedModelConfigurator.configure() call lasted 2 milliseconds. ExecutionStatus=INVOKE_NEXT_IF_ANY 09:57:06,759 |-INFO in ch.qos.logback.classic.util.ContextInitializer@6ea4b4b2 - Trying to configure with ch.qos.logback.classic.util.DefaultJoranConfigurator 09:57:06,760 |-INFO in ch.qos.logback.classic.util.ContextInitializer@6ea4b4b2 - Constructed configurator of type class ch.qos.logback.classic.util.DefaultJoranConfigurator 09:57:06,761 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback-test.xml] 09:57:06,761 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback.xml] 09:57:06,761 |-INFO in ch.qos.logback.classic.util.ContextInitializer@6ea4b4b2 - ch.qos.logback.classic.util.DefaultJoranConfigurator.configure() call lasted 1 milliseconds. 
ExecutionStatus=INVOKE_NEXT_IF_ANY 09:57:06,761 |-INFO in ch.qos.logback.classic.util.ContextInitializer@6ea4b4b2 - Trying to configure with ch.qos.logback.classic.BasicConfigurator 09:57:06,762 |-INFO in ch.qos.logback.classic.util.ContextInitializer@6ea4b4b2 - Constructed configurator of type class ch.qos.logback.classic.BasicConfigurator 09:57:06,762 |-INFO in ch.qos.logback.classic.BasicConfigurator@371fdf43 - Setting up default configuration. 09:57:06,780 |-INFO in ch.qos.logback.classic.util.ContextInitializer@6ea4b4b2 - ch.qos.logback.classic.BasicConfigurator.configure() call lasted 18 milliseconds. ExecutionStatus=NEUTRAL 09:57:08,821 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@37f687c1 - URL [jar:file:/w/workspace/cps-master-verify-java/cps-service/target/cps-service-3.4.8-SNAPSHOT.jar!/logback-spring.xml] is not of type file 09:57:08,876 |-INFO in ch.qos.logback.core.joran.util.ConfigurationWatchListUtil@297da2c3 - Adding [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/defaults.xml] to configuration watch list. 09:57:08,876 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@37f687c1 - URL [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/defaults.xml] is not of type file 09:57:08,879 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word applicationName with class [org.springframework.boot.logging.logback.ApplicationNameConverter] 09:57:08,879 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word clr with class [org.springframework.boot.logging.logback.ColorConverter] 09:57:08,879 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word correlationId with class [org.springframework.boot.logging.logback.CorrelationIdConverter] 09:57:08,880 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word wex with class [org.springframework.boot.logging.logback.WhitespaceThrowableProxyConverter] 09:57:08,880 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word wEx with class [org.springframework.boot.logging.logback.ExtendedWhitespaceThrowableProxyConverter] 09:57:08,882 |-INFO in ch.qos.logback.core.joran.util.ConfigurationWatchListUtil@297da2c3 - Adding [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/console-appender.xml] to configuration watch list. 
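The SpringBootTestContextBootstrapper messages a little earlier ("Could not detect default configuration classes ..." followed by "Found @SpringBootConfiguration org.onap.cps.TestApplication") describe the two-step lookup Spring performs for the REST controller specs. The hypothetical Java classes below sketch both outcomes; the annotation actually used by the Spock specs (for example @WebMvcTest rather than @SpringBootTest) is an assumption.

// Hypothetical test classes, not the CPS Spock specs.
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.context.annotation.Configuration;

@SpringBootTest
class ControllerTestWithNestedConfig {

    // Detected by AnnotationConfigContextLoaderUtils: a static, non-private,
    // non-final nested class annotated with @Configuration.
    @Configuration
    static class LocalTestConfig {
    }
}

@SpringBootTest
class ControllerTestWithoutNestedConfig {
    // No nested @Configuration here, so SpringBootTestContextBootstrapper searches
    // upwards from the test package and finds the @SpringBootConfiguration class
    // (org.onap.cps.TestApplication in this build).
}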
09:57:08,882 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@37f687c1 - URL [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/console-appender.xml] is not of type file 09:57:08,891 |-WARN in IfNestedWithinSecondPhaseElementSC - <if> elements cannot be nested within an <appender>, <logger> or <root> element 09:57:08,891 |-WARN in IfNestedWithinSecondPhaseElementSC - See also http://logback.qos.ch/codes.html#nested_if_element 09:57:08,897 |-WARN in IfNestedWithinSecondPhaseElementSC - Element <appender> at line 60 contains a nested <if> element at line 61 09:57:08,937 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Registering a new ReconfigureOnChangeTask ReconfigureOnChangeTask(born:1713434228935) 09:57:08,937 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Will scan for changes in [jar:file:/w/workspace/cps-master-verify-java/cps-service/target/cps-service-3.4.8-SNAPSHOT.jar!/logback-spring.xml] 09:57:08,937 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Setting ReconfigureOnChangeTask scanning period to 30 seconds 09:57:08,947 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.catalina.startup.DigesterFactory] to ERROR 09:57:08,947 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@6237927a - Propagating ERROR level on Logger[org.apache.catalina.startup.DigesterFactory] onto the JUL framework 09:57:08,948 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.catalina.util.LifecycleBase] to ERROR 09:57:08,948 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@6237927a - Propagating ERROR level on Logger[org.apache.catalina.util.LifecycleBase] onto the JUL framework 09:57:08,948 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.coyote.http11.Http11NioProtocol] to WARN 09:57:08,948 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@6237927a - Propagating WARN level on Logger[org.apache.coyote.http11.Http11NioProtocol] onto the JUL framework 09:57:08,949 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.sshd.common.util.SecurityUtils] to WARN 09:57:08,949 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@6237927a - Propagating WARN level on Logger[org.apache.sshd.common.util.SecurityUtils] onto the JUL framework 09:57:08,949 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.tomcat.util.net.NioSelectorPool] to WARN 09:57:08,949 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@6237927a - Propagating WARN level on Logger[org.apache.tomcat.util.net.NioSelectorPool] onto the JUL framework 09:57:08,949 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.eclipse.jetty.util.component.AbstractLifeCycle] to ERROR 09:57:08,949 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@6237927a - Propagating ERROR level on Logger[org.eclipse.jetty.util.component.AbstractLifeCycle] onto the JUL framework 09:57:08,949 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.hibernate.validator.internal.util.Version] to WARN 09:57:08,949 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@6237927a - Propagating WARN level on 
Logger[org.hibernate.validator.internal.util.Version] onto the JUL framework 09:57:08,949 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.springframework.boot.actuate.endpoint.jmx] to WARN 09:57:08,949 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@6237927a - Propagating WARN level on Logger[org.springframework.boot.actuate.endpoint.jmx] onto the JUL framework 09:57:08,949 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [CONSOLE] 09:57:08,949 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender] 09:57:08,958 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property 09:57:08,983 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [jsonConsole] 09:57:08,983 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender] 09:57:08,997 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [net.logstash.logback.composite.loggingevent.LoggingEventJsonProviders] for [providers] property 09:57:09,006 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [net.logstash.logback.composite.loggingevent.LoggingEventPatternJsonProvider] for [pattern] property 09:57:09,382 |-INFO in ch.qos.logback.classic.pattern.DateConverter@2b9bacb8 - Setting zoneId to "UTC" 09:57:09,393 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [asyncConsole] 09:57:09,393 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.classic.AsyncAppender] 09:57:09,691 |-INFO in ch.qos.logback.core.model.processor.conditional.IfModelHandler - Condition [property("loggingFormat").equalsIgnoreCase("json")] evaluated to false on line 61 09:57:09,692 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [CONSOLE] to ch.qos.logback.classic.AsyncAppender[asyncConsole] 09:57:09,692 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Attaching appender named [CONSOLE] to AsyncAppender. 09:57:09,692 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Setting discardingThreshold to 51 09:57:09,693 |-INFO in ch.qos.logback.classic.model.processor.RootLoggerModelHandler - Setting level of ROOT logger to INFO 09:57:09,693 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@6237927a - Propagating INFO level on Logger[ROOT] onto the JUL framework 09:57:09,693 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [asyncConsole] to Logger[ROOT] 09:57:09,693 |-INFO in ch.qos.logback.core.model.processor.DefaultProcessor@2850fe85 - End of configuration. 09:57:09,694 |-INFO in org.springframework.boot.logging.logback.SpringBootJoranConfigurator@6c7f3af6 - Registering current configuration as safe fallback point . 
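The logback status output above shows what cps-service's logback-spring.xml assembles: a CONSOLE appender with a PatternLayoutEncoder, a jsonConsole appender selected only when the loggingFormat property equals "json", and an asyncConsole AsyncAppender (discardingThreshold 51) attached to the ROOT logger at INFO. Purely as an illustration, a programmatic Java equivalent of the plain-console path could look like the sketch below; the project itself configures this via XML, and the encoder pattern shown is an assumption.

// Illustrative sketch only - the project configures this in logback-spring.xml, not in Java.
import ch.qos.logback.classic.AsyncAppender;
import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.encoder.PatternLayoutEncoder;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.ConsoleAppender;
import org.slf4j.LoggerFactory;

public class AsyncConsoleLoggingSketch {

    public static void main(final String[] args) {
        final LoggerContext context = (LoggerContext) LoggerFactory.getILoggerFactory();

        final PatternLayoutEncoder encoder = new PatternLayoutEncoder();
        encoder.setContext(context);
        encoder.setPattern("%d{ISO8601} %-5level %logger{36} - %msg%n"); // assumed pattern
        encoder.start();

        final ConsoleAppender<ILoggingEvent> console = new ConsoleAppender<>();
        console.setContext(context);
        console.setName("CONSOLE");
        console.setEncoder(encoder);
        console.start();

        // "Attaching appender named [CONSOLE] to AsyncAppender" / "Setting discardingThreshold to 51"
        final AsyncAppender asyncConsole = new AsyncAppender();
        asyncConsole.setContext(context);
        asyncConsole.setName("asyncConsole");
        asyncConsole.setDiscardingThreshold(51);
        asyncConsole.addAppender(console);
        asyncConsole.start();

        // "Setting level of ROOT logger to INFO" / "Attaching appender named [asyncConsole] to Logger[ROOT]"
        final Logger root = context.getLogger(Logger.ROOT_LOGGER_NAME);
        root.setLevel(Level.INFO);
        root.addAppender(asyncConsole);
    }
}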
____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \ \\/ ___)| |_)| | | | | || (_| | ) ) ) ) ' |____| .__|_| |_|_| |_\__, | / / / / =========|_|==============|___/=/_/_/_/ :: Spring Boot :: (v3.2.4) 2024-04-18T09:57:09.841Z INFO 6974 --- [ main] o.o.c.r.c.AdminRestControllerSpec : Starting AdminRestControllerSpec using Java 17.0.6-ea with PID 6974 (started by jenkins in /w/workspace/cps-master-verify-java/cps-rest) 2024-04-18T09:57:09.842Z INFO 6974 --- [ main] o.o.c.r.c.AdminRestControllerSpec : No active profile set, falling back to 1 default profile: "default" 2024-04-18T09:57:11.348Z INFO 6974 --- [ main] o.s.b.t.m.w.SpringBootMockServletContext : Initializing Spring TestDispatcherServlet '' 2024-04-18T09:57:11.348Z INFO 6974 --- [ main] o.s.t.web.servlet.TestDispatcherServlet : Initializing Servlet '' 2024-04-18T09:57:11.349Z INFO 6974 --- [ main] o.s.t.web.servlet.TestDispatcherServlet : Completed initialization in 1 ms 2024-04-18T09:57:11.375Z INFO 6974 --- [ main] o.o.c.r.c.AdminRestControllerSpec : Started AdminRestControllerSpec in 3.072 seconds (process running for 5.468) [ERROR] OpenJDK 64-Bit Server VM warning: Sharing is only supported for boot loader classes because bootstrap classpath has been appended 2024-04-18T09:57:12.591Z ERROR 6974 --- [ main] o.o.c.r.e.CpsRestExceptionHandler : Exception occurred org.onap.cps.spi.exceptions.AlreadyDefinedException: Already defined exception at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499) at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:480) at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:72) at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:105) at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:59) at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:263) at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:286) at org.onap.cps.rest.controller.AdminRestControllerSpec.$spock_feature_0_1(AdminRestControllerSpec.groovy:103) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.spockframework.util.ReflectionUtil.invokeMethod(ReflectionUtil.java:196) at org.spockframework.runtime.model.MethodInfo.lambda$new$0(MethodInfo.java:49) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeatureMethod(PlatformSpecRunner.java:324) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:50) at 
org.spockframework.runtime.SimpleFeatureNode.execute(SimpleFeatureNode.java:58) at org.spockframework.runtime.SimpleFeatureNode.execute(SimpleFeatureNode.java:15) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.IterationNode.lambda$around$0(IterationNode.java:67) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunIteration$5(PlatformSpecRunner.java:236) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runIteration(PlatformSpecRunner.java:218) at org.spockframework.runtime.IterationNode.around(IterationNode.java:67) at org.spockframework.runtime.SimpleFeatureNode.lambda$around$0(SimpleFeatureNode.java:52) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.FeatureNode.lambda$around$0(FeatureNode.java:41) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunFeature$4(PlatformSpecRunner.java:199) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeature(PlatformSpecRunner.java:192) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:41) at org.spockframework.runtime.SimpleFeatureNode.around(SimpleFeatureNode.java:52) at org.spockframework.runtime.SimpleFeatureNode.around(SimpleFeatureNode.java:15) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.SpecNode.lambda$around$0(SpecNode.java:63) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunSpec$0(PlatformSpecRunner.java:61) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at 
org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runSpec(PlatformSpecRunner.java:55) at org.spockframework.runtime.SpecNode.around(SpecNode.java:63) at org.spockframework.runtime.SpecNode.around(SpecNode.java:11) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57) at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:198) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:169) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:93) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:58) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:141) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:57) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:103) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:85) at org.junit.platform.launcher.core.DelegatingLauncher.execute(DelegatingLauncher.java:47) at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:63) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.execute(JUnitPlatformProvider.java:188) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invokeAllTests(JUnitPlatformProvider.java:154) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invoke(JUnitPlatformProvider.java:128) at 
org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:428) at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:162) at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:562) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:548) Caused by: java.lang.RuntimeException: null at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499) at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:480) at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:72) at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:105) at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:59) at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:263) at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:268) ... 88 common frames omitted 2024-04-18T09:57:12.842Z ERROR 6974 --- [ main] o.o.c.r.e.CpsRestExceptionHandler : Exception occurred org.onap.cps.spi.exceptions.ModelValidationException: Invalid ZIP archive content. at org.onap.cps.rest.utils.MultipartFileUtil.extractYangResourcesMapFromZipArchive(MultipartFileUtil.java:98) at org.onap.cps.rest.utils.MultipartFileUtil.extractYangResourcesMap(MultipartFileUtil.java:64) at org.onap.cps.rest.controller.AdminRestController.createSchemaSet(AdminRestController.java:110) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:351) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:196) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:765) at org.springframework.validation.beanvalidation.MethodValidationInterceptor.invoke(MethodValidationInterceptor.java:174) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:184) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:765) at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:717) at org.onap.cps.rest.controller.AdminRestController$$SpringCGLIB$$0.createSchemaSet() at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at 
java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:255) at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:188) at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:118) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:925) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:830) at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1089) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:979) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1014) at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:914) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:547) at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:885) at org.springframework.test.web.servlet.TestDispatcherServlet.service(TestDispatcherServlet.java:72) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:614) at org.springframework.mock.web.MockFilterChain$ServletFilterProxy.doFilter(MockFilterChain.java:165) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.test.web.servlet.MockMvc.perform(MockMvc.java:201) at org.springframework.test.web.servlet.MockMvc$perform.call(Unknown Source) at org.onap.cps.rest.controller.AdminRestControllerSpec.$spock_feature_0_7(AdminRestControllerSpec.groovy:216) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at 
org.spockframework.util.ReflectionUtil.invokeMethod(ReflectionUtil.java:196) at org.spockframework.runtime.model.MethodInfo.lambda$new$0(MethodInfo.java:49) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeatureMethod(PlatformSpecRunner.java:324) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:50) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.IterationNode.lambda$around$0(IterationNode.java:67) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunIteration$5(PlatformSpecRunner.java:236) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runIteration(PlatformSpecRunner.java:218) at org.spockframework.runtime.IterationNode.around(IterationNode.java:67) at org.spockframework.runtime.IterationNode.around(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.NodeTestTask$DefaultDynamicTestExecutor.execute(NodeTestTask.java:226) at org.spockframework.runtime.ParameterizedFeatureChildExecutor.execute(ParameterizedFeatureChildExecutor.java:104) at org.spockframework.runtime.PlatformParameterizedSpecRunner$1.runIteration(PlatformParameterizedSpecRunner.java:72) at org.spockframework.runtime.extension.IDataDriver.lambda$static$0(IDataDriver.java:37) at org.spockframework.runtime.PlatformParameterizedSpecRunner.runParameterizedFeature(PlatformParameterizedSpecRunner.java:47) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:40) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:16) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at 
org.spockframework.runtime.FeatureNode.lambda$around$0(FeatureNode.java:41) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunFeature$4(PlatformSpecRunner.java:199) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeature(PlatformSpecRunner.java:192) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:41) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:12) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.SpecNode.lambda$around$0(SpecNode.java:63) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunSpec$0(PlatformSpecRunner.java:61) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runSpec(PlatformSpecRunner.java:55) at org.spockframework.runtime.SpecNode.around(SpecNode.java:63) at org.spockframework.runtime.SpecNode.around(SpecNode.java:11) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at 
org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57) at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:198) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:169) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:93) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:58) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:141) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:57) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:103) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:85) at org.junit.platform.launcher.core.DelegatingLauncher.execute(DelegatingLauncher.java:47) at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:63) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.execute(JUnitPlatformProvider.java:188) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invokeAllTests(JUnitPlatformProvider.java:154) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invoke(JUnitPlatformProvider.java:128) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:428) at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:162) at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:562) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:548) Caused by: java.lang.IllegalArgumentException: Multiple entries with same key: component.yang=fake component content 2 and component.yang=fake component content 1 at com.google.common.collect.ImmutableMap.conflictException(ImmutableMap.java:378) at com.google.common.collect.ImmutableMap.checkNoConflict(ImmutableMap.java:372) at com.google.common.collect.RegularImmutableMap.checkNoConflictInKeyBucket(RegularImmutableMap.java:246) at com.google.common.collect.RegularImmutableMap.fromEntryArrayCheckingBucketOverflow(RegularImmutableMap.java:133) at com.google.common.collect.RegularImmutableMap.fromEntryArray(RegularImmutableMap.java:95) at com.google.common.collect.ImmutableMap$Builder.build(ImmutableMap.java:572) at com.google.common.collect.ImmutableMap$Builder.buildOrThrow(ImmutableMap.java:600) at com.google.common.collect.ImmutableMap$Builder.build(ImmutableMap.java:587) at org.onap.cps.rest.utils.MultipartFileUtil.extractYangResourcesMapFromZipArchive(MultipartFileUtil.java:90) ... 
150 common frames omitted 2024-04-18T09:57:12.998Z ERROR 6974 --- [ main] o.o.c.r.e.CpsRestExceptionHandler : Exception occurred org.springframework.web.multipart.support.MissingServletRequestPartException: Required part 'file' is not present. at org.springframework.web.servlet.mvc.method.annotation.RequestPartMethodArgumentResolver.resolveArgument(RequestPartMethodArgumentResolver.java:168) at org.springframework.web.method.support.HandlerMethodArgumentResolverComposite.resolveArgument(HandlerMethodArgumentResolverComposite.java:122) at org.springframework.web.method.support.InvocableHandlerMethod.getMethodArgumentValues(InvocableHandlerMethod.java:224) at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:178) at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:118) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:925) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:830) at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1089) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:979) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1014) at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:914) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:547) at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:885) at org.springframework.test.web.servlet.TestDispatcherServlet.service(TestDispatcherServlet.java:72) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:614) at org.springframework.mock.web.MockFilterChain$ServletFilterProxy.doFilter(MockFilterChain.java:165) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.test.web.servlet.MockMvc.perform(MockMvc.java:201) at org.springframework.test.web.servlet.MockMvc$perform.call(Unknown Source) at 
org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:47) at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125) at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:139) at org.onap.cps.rest.controller.AdminRestControllerSpec.$spock_feature_0_9(AdminRestControllerSpec.groovy:251) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.spockframework.util.ReflectionUtil.invokeMethod(ReflectionUtil.java:196) at org.spockframework.runtime.model.MethodInfo.lambda$new$0(MethodInfo.java:49) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeatureMethod(PlatformSpecRunner.java:324) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:50) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.IterationNode.lambda$around$0(IterationNode.java:67) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunIteration$5(PlatformSpecRunner.java:236) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runIteration(PlatformSpecRunner.java:218) at org.spockframework.runtime.IterationNode.around(IterationNode.java:67) at org.spockframework.runtime.IterationNode.around(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.NodeTestTask$DefaultDynamicTestExecutor.execute(NodeTestTask.java:226) at org.spockframework.runtime.ParameterizedFeatureChildExecutor.execute(ParameterizedFeatureChildExecutor.java:104) at org.spockframework.runtime.PlatformParameterizedSpecRunner$1.runIteration(PlatformParameterizedSpecRunner.java:72) at org.spockframework.runtime.extension.IDataDriver.lambda$static$0(IDataDriver.java:37) at 
org.spockframework.runtime.PlatformParameterizedSpecRunner.runParameterizedFeature(PlatformParameterizedSpecRunner.java:47) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:40) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:16) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.FeatureNode.lambda$around$0(FeatureNode.java:41) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunFeature$4(PlatformSpecRunner.java:199) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeature(PlatformSpecRunner.java:192) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:41) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:12) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.SpecNode.lambda$around$0(SpecNode.java:63) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunSpec$0(PlatformSpecRunner.java:61) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runSpec(PlatformSpecRunner.java:55) at org.spockframework.runtime.SpecNode.around(SpecNode.java:63) at org.spockframework.runtime.SpecNode.around(SpecNode.java:11) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at 
java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57) at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:198) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:169) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:93) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:58) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:141) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:57) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:103) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:85) at org.junit.platform.launcher.core.DelegatingLauncher.execute(DelegatingLauncher.java:47) at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:63) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.execute(JUnitPlatformProvider.java:188) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invokeAllTests(JUnitPlatformProvider.java:154) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invoke(JUnitPlatformProvider.java:128) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:428) at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:162) at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:562) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:548) 2024-04-18T09:57:13.013Z ERROR 6974 --- [ main] o.o.c.r.e.CpsRestExceptionHandler : Exception occurred org.springframework.web.multipart.support.MissingServletRequestPartException: Required part 'file' is not present. 
at org.springframework.web.servlet.mvc.method.annotation.RequestPartMethodArgumentResolver.resolveArgument(RequestPartMethodArgumentResolver.java:168) at org.springframework.web.method.support.HandlerMethodArgumentResolverComposite.resolveArgument(HandlerMethodArgumentResolverComposite.java:122) at org.springframework.web.method.support.InvocableHandlerMethod.getMethodArgumentValues(InvocableHandlerMethod.java:224) at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:178) at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:118) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:925) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:830) at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1089) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:979) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1014) at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:914) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:547) at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:885) at org.springframework.test.web.servlet.TestDispatcherServlet.service(TestDispatcherServlet.java:72) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:614) at org.springframework.mock.web.MockFilterChain$ServletFilterProxy.doFilter(MockFilterChain.java:165) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.test.web.servlet.MockMvc.perform(MockMvc.java:201) at org.springframework.test.web.servlet.MockMvc$perform.call(Unknown Source) at org.onap.cps.rest.controller.AdminRestControllerSpec.$spock_feature_0_9(AdminRestControllerSpec.groovy:251) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.spockframework.util.ReflectionUtil.invokeMethod(ReflectionUtil.java:196) at org.spockframework.runtime.model.MethodInfo.lambda$new$0(MethodInfo.java:49) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeatureMethod(PlatformSpecRunner.java:324) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:50) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.IterationNode.lambda$around$0(IterationNode.java:67) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunIteration$5(PlatformSpecRunner.java:236) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runIteration(PlatformSpecRunner.java:218) at org.spockframework.runtime.IterationNode.around(IterationNode.java:67) at org.spockframework.runtime.IterationNode.around(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.NodeTestTask$DefaultDynamicTestExecutor.execute(NodeTestTask.java:226) at org.spockframework.runtime.ParameterizedFeatureChildExecutor.execute(ParameterizedFeatureChildExecutor.java:104) at org.spockframework.runtime.PlatformParameterizedSpecRunner$1.runIteration(PlatformParameterizedSpecRunner.java:72) at org.spockframework.runtime.extension.IDataDriver.lambda$static$0(IDataDriver.java:37) at org.spockframework.runtime.PlatformParameterizedSpecRunner.runParameterizedFeature(PlatformParameterizedSpecRunner.java:47) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:40) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:16) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at 
org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.FeatureNode.lambda$around$0(FeatureNode.java:41) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunFeature$4(PlatformSpecRunner.java:199) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeature(PlatformSpecRunner.java:192) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:41) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:12) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.SpecNode.lambda$around$0(SpecNode.java:63) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunSpec$0(PlatformSpecRunner.java:61) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runSpec(PlatformSpecRunner.java:55) at org.spockframework.runtime.SpecNode.around(SpecNode.java:63) at org.spockframework.runtime.SpecNode.around(SpecNode.java:11) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at 
org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57) at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:198) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:169) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:93) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:58) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:141) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:57) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:103) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:85) at org.junit.platform.launcher.core.DelegatingLauncher.execute(DelegatingLauncher.java:47) at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:63) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.execute(JUnitPlatformProvider.java:188) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invokeAllTests(JUnitPlatformProvider.java:154) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invoke(JUnitPlatformProvider.java:128) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:428) at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:162) at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:562) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:548) [INFO] Tests run: 33, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.297 s - in org.onap.cps.rest.controller.AdminRestControllerSpec [INFO] Running org.onap.cps.rest.controller.CpsRestInputMapperSpec [INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.01 s - in org.onap.cps.rest.controller.CpsRestInputMapperSpec [INFO] Running org.onap.cps.rest.controller.DataRestControllerSpec . 
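The "Multiple entries with same key: component.yang=..." caused-by reported during AdminRestControllerSpec above comes from Guava's ImmutableMap.Builder, which the trace shows MultipartFileUtil using while collecting YANG resources from the uploaded ZIP: two archive entries resolving to the same file name make build() throw. A minimal, self-contained sketch of that behaviour, with placeholder keys and values mirroring the fake content in the log rather than project code:

    import com.google.common.collect.ImmutableMap;

    public class DuplicateKeySketch {
        public static void main(String[] args) {
            try {
                // Two ZIP entries mapped to the same file name: put() accepts both,
                // but build() rejects the duplicate key.
                ImmutableMap.Builder<String, String> yangResources = ImmutableMap.builder();
                yangResources.put("component.yang", "fake component content 1");
                yangResources.put("component.yang", "fake component content 2");
                yangResources.build(); // throws IllegalArgumentException: Multiple entries with same key
            } catch (IllegalArgumentException expected) {
                System.out.println(expected.getMessage());
            }
        }
    }

The test feeds such an archive on purpose and CpsRestExceptionHandler translates the resulting ModelValidationException into an error response, so the ERROR entries above do not indicate failing tests; the Surefire summary reports 0 failures and 0 errors for the spec.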
____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \ \\/ ___)| |_)| | | | | || (_| | ) ) ) ) ' |____| .__|_| |_|_| |_\__, | / / / / =========|_|==============|___/=/_/_/_/ :: Spring Boot :: (v3.2.4) 2024-04-18T09:57:13.247Z INFO 6974 --- [ main] o.o.c.r.c.DataRestControllerSpec : Starting DataRestControllerSpec using Java 17.0.6-ea with PID 6974 (started by jenkins in /w/workspace/cps-master-verify-java/cps-rest) 2024-04-18T09:57:13.248Z INFO 6974 --- [ main] o.o.c.r.c.DataRestControllerSpec : No active profile set, falling back to 1 default profile: "default" 2024-04-18T09:57:13.644Z INFO 6974 --- [ main] o.s.b.t.m.w.SpringBootMockServletContext : Initializing Spring TestDispatcherServlet '' 2024-04-18T09:57:13.644Z INFO 6974 --- [ main] o.s.t.web.servlet.TestDispatcherServlet : Initializing Servlet '' 2024-04-18T09:57:13.644Z INFO 6974 --- [ main] o.s.t.web.servlet.TestDispatcherServlet : Completed initialization in 0 ms 2024-04-18T09:57:13.650Z INFO 6974 --- [ main] o.o.c.r.c.DataRestControllerSpec : Started DataRestControllerSpec in 0.44 seconds (process running for 7.743) 2024-04-18T09:57:13.763Z ERROR 6974 --- [ main] o.o.c.r.e.CpsRestExceptionHandler : Exception occurred jakarta.validation.ValidationException: observed-timestamp must be in 'yyyy-MM-dd'T'HH:mm:ss.SSSZ' format at org.onap.cps.rest.controller.DataRestController.toOffsetDateTime(DataRestController.java:197) at org.onap.cps.rest.controller.DataRestController.createNode(DataRestController.java:76) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:351) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:196) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:765) at org.springframework.validation.beanvalidation.MethodValidationInterceptor.invoke(MethodValidationInterceptor.java:174) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:184) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:765) at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:717) at org.onap.cps.rest.controller.DataRestController$$SpringCGLIB$$0.createNode() at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:255) at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:188) at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:118) 
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:925) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:830) at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1089) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:979) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1014) at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:914) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:547) at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:885) at org.springframework.test.web.servlet.TestDispatcherServlet.service(TestDispatcherServlet.java:72) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:614) at org.springframework.mock.web.MockFilterChain$ServletFilterProxy.doFilter(MockFilterChain.java:165) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.test.web.servlet.MockMvc.perform(MockMvc.java:201) at org.springframework.test.web.servlet.MockMvc$perform.call(Unknown Source) at org.onap.cps.rest.controller.DataRestControllerSpec.$spock_feature_0_1(DataRestControllerSpec.groovy:138) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.spockframework.util.ReflectionUtil.invokeMethod(ReflectionUtil.java:196) at org.spockframework.runtime.model.MethodInfo.lambda$new$0(MethodInfo.java:49) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at 
org.spockframework.runtime.PlatformSpecRunner.runFeatureMethod(PlatformSpecRunner.java:324) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:50) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.IterationNode.lambda$around$0(IterationNode.java:67) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunIteration$5(PlatformSpecRunner.java:236) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runIteration(PlatformSpecRunner.java:218) at org.spockframework.runtime.IterationNode.around(IterationNode.java:67) at org.spockframework.runtime.IterationNode.around(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.NodeTestTask$DefaultDynamicTestExecutor.execute(NodeTestTask.java:226) at org.spockframework.runtime.ParameterizedFeatureChildExecutor.execute(ParameterizedFeatureChildExecutor.java:104) at org.spockframework.runtime.PlatformParameterizedSpecRunner$1.runIteration(PlatformParameterizedSpecRunner.java:72) at org.spockframework.runtime.extension.IDataDriver.lambda$static$0(IDataDriver.java:37) at org.spockframework.runtime.PlatformParameterizedSpecRunner.runParameterizedFeature(PlatformParameterizedSpecRunner.java:47) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:40) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:16) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.FeatureNode.lambda$around$0(FeatureNode.java:41) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunFeature$4(PlatformSpecRunner.java:199) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at 
org.spockframework.runtime.PlatformSpecRunner.runFeature(PlatformSpecRunner.java:192) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:41) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:12) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.SpecNode.lambda$around$0(SpecNode.java:63) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunSpec$0(PlatformSpecRunner.java:61) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runSpec(PlatformSpecRunner.java:55) at org.spockframework.runtime.SpecNode.around(SpecNode.java:63) at org.spockframework.runtime.SpecNode.around(SpecNode.java:11) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at 
org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57) at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:198) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:169) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:93) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:58) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:141) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:57) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:103) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:85) at org.junit.platform.launcher.core.DelegatingLauncher.execute(DelegatingLauncher.java:47) at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:63) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.execute(JUnitPlatformProvider.java:188) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invokeAllTests(JUnitPlatformProvider.java:154) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invoke(JUnitPlatformProvider.java:128) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:428) at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:162) at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:562) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:548) 2024-04-18T09:57:13.915Z ERROR 6974 --- [ main] o.o.c.r.e.CpsRestExceptionHandler : Exception occurred jakarta.validation.ValidationException: observed-timestamp must be in 'yyyy-MM-dd'T'HH:mm:ss.SSSZ' format at org.onap.cps.rest.controller.DataRestController.toOffsetDateTime(DataRestController.java:197) at org.onap.cps.rest.controller.DataRestController.addListElements(DataRestController.java:98) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:351) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:196) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:765) at org.springframework.validation.beanvalidation.MethodValidationInterceptor.invoke(MethodValidationInterceptor.java:174) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:184) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:765) at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:717) at 
org.onap.cps.rest.controller.DataRestController$$SpringCGLIB$$0.addListElements() at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:255) at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:188) at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:118) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:925) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:830) at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1089) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:979) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1014) at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:914) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:547) at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:885) at org.springframework.test.web.servlet.TestDispatcherServlet.service(TestDispatcherServlet.java:72) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:614) at org.springframework.mock.web.MockFilterChain$ServletFilterProxy.doFilter(MockFilterChain.java:165) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.test.web.servlet.MockMvc.perform(MockMvc.java:201) at org.springframework.test.web.servlet.MockMvc$perform.call(Unknown Source) at org.onap.cps.rest.controller.DataRestControllerSpec.$spock_feature_0_3(DataRestControllerSpec.groovy:193) at 
java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.spockframework.util.ReflectionUtil.invokeMethod(ReflectionUtil.java:196) at org.spockframework.runtime.model.MethodInfo.lambda$new$0(MethodInfo.java:49) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeatureMethod(PlatformSpecRunner.java:324) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:50) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.IterationNode.lambda$around$0(IterationNode.java:67) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunIteration$5(PlatformSpecRunner.java:236) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runIteration(PlatformSpecRunner.java:218) at org.spockframework.runtime.IterationNode.around(IterationNode.java:67) at org.spockframework.runtime.IterationNode.around(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.NodeTestTask$DefaultDynamicTestExecutor.execute(NodeTestTask.java:226) at org.spockframework.runtime.ParameterizedFeatureChildExecutor.execute(ParameterizedFeatureChildExecutor.java:104) at org.spockframework.runtime.PlatformParameterizedSpecRunner$1.runIteration(PlatformParameterizedSpecRunner.java:72) at org.spockframework.runtime.extension.IDataDriver.lambda$static$0(IDataDriver.java:37) at org.spockframework.runtime.PlatformParameterizedSpecRunner.runParameterizedFeature(PlatformParameterizedSpecRunner.java:47) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:40) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:16) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at 
org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.FeatureNode.lambda$around$0(FeatureNode.java:41) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunFeature$4(PlatformSpecRunner.java:199) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeature(PlatformSpecRunner.java:192) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:41) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:12) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.SpecNode.lambda$around$0(SpecNode.java:63) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunSpec$0(PlatformSpecRunner.java:61) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runSpec(PlatformSpecRunner.java:55) at org.spockframework.runtime.SpecNode.around(SpecNode.java:63) at org.spockframework.runtime.SpecNode.around(SpecNode.java:11) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at 
org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57) at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:198) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:169) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:93) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:58) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:141) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:57) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:103) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:85) at org.junit.platform.launcher.core.DelegatingLauncher.execute(DelegatingLauncher.java:47) at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:63) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.execute(JUnitPlatformProvider.java:188) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invokeAllTests(JUnitPlatformProvider.java:154) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invoke(JUnitPlatformProvider.java:128) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:428) at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:162) at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:562) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:548) 2024-04-18T09:57:13.937Z ERROR 6974 --- [ main] o.o.c.r.e.CpsRestExceptionHandler : Exception occurred jakarta.validation.ValidationException: observed-timestamp must be in 'yyyy-MM-dd'T'HH:mm:ss.SSSZ' format at org.onap.cps.rest.controller.DataRestController.toOffsetDateTime(DataRestController.java:197) at org.onap.cps.rest.controller.DataRestController.addListElements(DataRestController.java:98) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:351) at 
org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:196) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:765) at org.springframework.validation.beanvalidation.MethodValidationInterceptor.invoke(MethodValidationInterceptor.java:174) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:184) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:765) at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:717) at org.onap.cps.rest.controller.DataRestController$$SpringCGLIB$$0.addListElements() at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:255) at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:188) at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:118) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:925) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:830) at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1089) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:979) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1014) at org.springframework.web.servlet.FrameworkServlet.doPost(FrameworkServlet.java:914) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:547) at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:885) at org.springframework.test.web.servlet.TestDispatcherServlet.service(TestDispatcherServlet.java:72) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:614) at org.springframework.mock.web.MockFilterChain$ServletFilterProxy.doFilter(MockFilterChain.java:165) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at 
org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.test.web.servlet.MockMvc.perform(MockMvc.java:201) at org.springframework.test.web.servlet.MockMvc$perform.call(Unknown Source) at org.onap.cps.rest.controller.DataRestControllerSpec.$spock_feature_0_4(DataRestControllerSpec.groovy:216) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.spockframework.util.ReflectionUtil.invokeMethod(ReflectionUtil.java:196) at org.spockframework.runtime.model.MethodInfo.lambda$new$0(MethodInfo.java:49) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeatureMethod(PlatformSpecRunner.java:324) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:50) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.IterationNode.lambda$around$0(IterationNode.java:67) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunIteration$5(PlatformSpecRunner.java:236) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runIteration(PlatformSpecRunner.java:218) at org.spockframework.runtime.IterationNode.around(IterationNode.java:67) at org.spockframework.runtime.IterationNode.around(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.NodeTestTask$DefaultDynamicTestExecutor.execute(NodeTestTask.java:226) at 
org.spockframework.runtime.ParameterizedFeatureChildExecutor.execute(ParameterizedFeatureChildExecutor.java:104) at org.spockframework.runtime.PlatformParameterizedSpecRunner$1.runIteration(PlatformParameterizedSpecRunner.java:72) at org.spockframework.runtime.extension.IDataDriver.lambda$static$0(IDataDriver.java:37) at org.spockframework.runtime.PlatformParameterizedSpecRunner.runParameterizedFeature(PlatformParameterizedSpecRunner.java:47) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:40) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:16) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.FeatureNode.lambda$around$0(FeatureNode.java:41) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunFeature$4(PlatformSpecRunner.java:199) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeature(PlatformSpecRunner.java:192) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:41) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:12) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.SpecNode.lambda$around$0(SpecNode.java:63) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunSpec$0(PlatformSpecRunner.java:61) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runSpec(PlatformSpecRunner.java:55) at org.spockframework.runtime.SpecNode.around(SpecNode.java:63) at org.spockframework.runtime.SpecNode.around(SpecNode.java:11) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at 
org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57) at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:198) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:169) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:93) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:58) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:141) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:57) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:103) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:85) at org.junit.platform.launcher.core.DelegatingLauncher.execute(DelegatingLauncher.java:47) at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:63) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.execute(JUnitPlatformProvider.java:188) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invokeAllTests(JUnitPlatformProvider.java:154) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invoke(JUnitPlatformProvider.java:128) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:428) at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:162) at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:562) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:548) 2024-04-18T09:57:14.190Z ERROR 6974 --- [ main] o.o.c.r.e.CpsRestExceptionHandler : Exception occurred 
jakarta.validation.ValidationException: observed-timestamp must be in 'yyyy-MM-dd'T'HH:mm:ss.SSSZ' format at org.onap.cps.rest.controller.DataRestController.toOffsetDateTime(DataRestController.java:197) at org.onap.cps.rest.controller.DataRestController.updateNodeLeaves(DataRestController.java:138) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:351) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:196) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:765) at org.springframework.validation.beanvalidation.MethodValidationInterceptor.invoke(MethodValidationInterceptor.java:174) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:184) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:765) at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:717) at org.onap.cps.rest.controller.DataRestController$$SpringCGLIB$$0.updateNodeLeaves() at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:255) at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:188) at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:118) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:925) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:830) at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1089) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:979) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1014) at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:888) at org.springframework.test.web.servlet.TestDispatcherServlet.service(TestDispatcherServlet.java:72) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:614) at org.springframework.mock.web.MockFilterChain$ServletFilterProxy.doFilter(MockFilterChain.java:165) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100) 
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.test.web.servlet.MockMvc.perform(MockMvc.java:201) at org.springframework.test.web.servlet.MockMvc$perform.call(Unknown Source) at org.onap.cps.rest.controller.DataRestControllerSpec.$spock_feature_0_12(DataRestControllerSpec.groovy:384) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.spockframework.util.ReflectionUtil.invokeMethod(ReflectionUtil.java:196) at org.spockframework.runtime.model.MethodInfo.lambda$new$0(MethodInfo.java:49) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeatureMethod(PlatformSpecRunner.java:324) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:50) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.IterationNode.lambda$around$0(IterationNode.java:67) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunIteration$5(PlatformSpecRunner.java:236) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runIteration(PlatformSpecRunner.java:218) at org.spockframework.runtime.IterationNode.around(IterationNode.java:67) at org.spockframework.runtime.IterationNode.around(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at 
org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.NodeTestTask$DefaultDynamicTestExecutor.execute(NodeTestTask.java:226) at org.spockframework.runtime.ParameterizedFeatureChildExecutor.execute(ParameterizedFeatureChildExecutor.java:104) at org.spockframework.runtime.PlatformParameterizedSpecRunner$1.runIteration(PlatformParameterizedSpecRunner.java:72) at org.spockframework.runtime.extension.IDataDriver.lambda$static$0(IDataDriver.java:37) at org.spockframework.runtime.PlatformParameterizedSpecRunner.runParameterizedFeature(PlatformParameterizedSpecRunner.java:47) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:40) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:16) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.FeatureNode.lambda$around$0(FeatureNode.java:41) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunFeature$4(PlatformSpecRunner.java:199) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeature(PlatformSpecRunner.java:192) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:41) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:12) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.SpecNode.lambda$around$0(SpecNode.java:63) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunSpec$0(PlatformSpecRunner.java:61) at 
org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runSpec(PlatformSpecRunner.java:55) at org.spockframework.runtime.SpecNode.around(SpecNode.java:63) at org.spockframework.runtime.SpecNode.around(SpecNode.java:11) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57) at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:198) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:169) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:93) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:58) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:141) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:57) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:103) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:85) at org.junit.platform.launcher.core.DelegatingLauncher.execute(DelegatingLauncher.java:47) at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:63) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.execute(JUnitPlatformProvider.java:188) at 
org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invokeAllTests(JUnitPlatformProvider.java:154) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invoke(JUnitPlatformProvider.java:128) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:428) at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:162) at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:562) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:548) 2024-04-18T09:57:14.219Z ERROR 6974 --- [ main] o.o.c.r.e.CpsRestExceptionHandler : Exception occurred jakarta.validation.ValidationException: observed-timestamp must be in 'yyyy-MM-dd'T'HH:mm:ss.SSSZ' format at org.onap.cps.rest.controller.DataRestController.toOffsetDateTime(DataRestController.java:197) at org.onap.cps.rest.controller.DataRestController.replaceNode(DataRestController.java:148) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:351) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:196) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:765) at org.springframework.validation.beanvalidation.MethodValidationInterceptor.invoke(MethodValidationInterceptor.java:174) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:184) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:765) at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:717) at org.onap.cps.rest.controller.DataRestController$$SpringCGLIB$$0.replaceNode() at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:255) at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:188) at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:118) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:925) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:830) at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1089) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:979) at 
org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1014) at org.springframework.web.servlet.FrameworkServlet.doPut(FrameworkServlet.java:925) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:550) at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:885) at org.springframework.test.web.servlet.TestDispatcherServlet.service(TestDispatcherServlet.java:72) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:614) at org.springframework.mock.web.MockFilterChain$ServletFilterProxy.doFilter(MockFilterChain.java:165) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.test.web.servlet.MockMvc.perform(MockMvc.java:201) at org.springframework.test.web.servlet.MockMvc$perform.call(Unknown Source) at org.onap.cps.rest.controller.DataRestControllerSpec.$spock_feature_0_14(DataRestControllerSpec.groovy:429) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.spockframework.util.ReflectionUtil.invokeMethod(ReflectionUtil.java:196) at org.spockframework.runtime.model.MethodInfo.lambda$new$0(MethodInfo.java:49) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeatureMethod(PlatformSpecRunner.java:324) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:50) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at 
org.spockframework.runtime.IterationNode.lambda$around$0(IterationNode.java:67) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunIteration$5(PlatformSpecRunner.java:236) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runIteration(PlatformSpecRunner.java:218) at org.spockframework.runtime.IterationNode.around(IterationNode.java:67) at org.spockframework.runtime.IterationNode.around(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.NodeTestTask$DefaultDynamicTestExecutor.execute(NodeTestTask.java:226) at org.spockframework.runtime.ParameterizedFeatureChildExecutor.execute(ParameterizedFeatureChildExecutor.java:104) at org.spockframework.runtime.PlatformParameterizedSpecRunner$1.runIteration(PlatformParameterizedSpecRunner.java:72) at org.spockframework.runtime.extension.IDataDriver.lambda$static$0(IDataDriver.java:37) at org.spockframework.runtime.PlatformParameterizedSpecRunner.runParameterizedFeature(PlatformParameterizedSpecRunner.java:47) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:40) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:16) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.FeatureNode.lambda$around$0(FeatureNode.java:41) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunFeature$4(PlatformSpecRunner.java:199) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeature(PlatformSpecRunner.java:192) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:41) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:12) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at 
java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.SpecNode.lambda$around$0(SpecNode.java:63) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunSpec$0(PlatformSpecRunner.java:61) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runSpec(PlatformSpecRunner.java:55) at org.spockframework.runtime.SpecNode.around(SpecNode.java:63) at org.spockframework.runtime.SpecNode.around(SpecNode.java:11) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57) at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:198) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:169) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:93) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:58) at 
org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:141) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:57) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:103) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:85) at org.junit.platform.launcher.core.DelegatingLauncher.execute(DelegatingLauncher.java:47) at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:63) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.execute(JUnitPlatformProvider.java:188) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invokeAllTests(JUnitPlatformProvider.java:154) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invoke(JUnitPlatformProvider.java:128) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:428) at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:162) at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:562) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:548) 2024-04-18T09:57:14.237Z ERROR 6974 --- [ main] o.o.c.r.e.CpsRestExceptionHandler : Exception occurred jakarta.validation.ValidationException: observed-timestamp must be in 'yyyy-MM-dd'T'HH:mm:ss.SSSZ' format at org.onap.cps.rest.controller.DataRestController.toOffsetDateTime(DataRestController.java:197) at org.onap.cps.rest.controller.DataRestController.replaceListContent(DataRestController.java:158) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:351) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:196) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:765) at org.springframework.validation.beanvalidation.MethodValidationInterceptor.invoke(MethodValidationInterceptor.java:174) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:184) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:765) at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:717) at org.onap.cps.rest.controller.DataRestController$$SpringCGLIB$$0.replaceListContent() at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:255) at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:188) at 
org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:118) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:925) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:830) at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1089) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:979) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1014) at org.springframework.web.servlet.FrameworkServlet.doPut(FrameworkServlet.java:925) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:550) at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:885) at org.springframework.test.web.servlet.TestDispatcherServlet.service(TestDispatcherServlet.java:72) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:614) at org.springframework.mock.web.MockFilterChain$ServletFilterProxy.doFilter(MockFilterChain.java:165) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.test.web.servlet.MockMvc.perform(MockMvc.java:201) at org.springframework.test.web.servlet.MockMvc$perform.call(Unknown Source) at org.onap.cps.rest.controller.DataRestControllerSpec.$spock_feature_0_15(DataRestControllerSpec.groovy:454) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.spockframework.util.ReflectionUtil.invokeMethod(ReflectionUtil.java:196) at org.spockframework.runtime.model.MethodInfo.lambda$new$0(MethodInfo.java:49) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at 
org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeatureMethod(PlatformSpecRunner.java:324) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:50) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.IterationNode.lambda$around$0(IterationNode.java:67) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunIteration$5(PlatformSpecRunner.java:236) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runIteration(PlatformSpecRunner.java:218) at org.spockframework.runtime.IterationNode.around(IterationNode.java:67) at org.spockframework.runtime.IterationNode.around(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.NodeTestTask$DefaultDynamicTestExecutor.execute(NodeTestTask.java:226) at org.spockframework.runtime.ParameterizedFeatureChildExecutor.execute(ParameterizedFeatureChildExecutor.java:104) at org.spockframework.runtime.PlatformParameterizedSpecRunner$1.runIteration(PlatformParameterizedSpecRunner.java:72) at org.spockframework.runtime.extension.IDataDriver.lambda$static$0(IDataDriver.java:37) at org.spockframework.runtime.PlatformParameterizedSpecRunner.runParameterizedFeature(PlatformParameterizedSpecRunner.java:47) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:40) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:16) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.FeatureNode.lambda$around$0(FeatureNode.java:41) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunFeature$4(PlatformSpecRunner.java:199) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at 
org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeature(PlatformSpecRunner.java:192) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:41) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:12) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.SpecNode.lambda$around$0(SpecNode.java:63) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunSpec$0(PlatformSpecRunner.java:61) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runSpec(PlatformSpecRunner.java:55) at org.spockframework.runtime.SpecNode.around(SpecNode.java:63) at org.spockframework.runtime.SpecNode.around(SpecNode.java:11) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at 
org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57) at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:198) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:169) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:93) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:58) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:141) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:57) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:103) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:85) at org.junit.platform.launcher.core.DelegatingLauncher.execute(DelegatingLauncher.java:47) at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:63) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.execute(JUnitPlatformProvider.java:188) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invokeAllTests(JUnitPlatformProvider.java:154) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invoke(JUnitPlatformProvider.java:128) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:428) at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:162) at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:562) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:548) [INFO] Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.126 s - in org.onap.cps.rest.controller.DataRestControllerSpec [INFO] Running org.onap.cps.rest.controller.QueryRestControllerSpec 2024-04-18T09:57:14.250Z ERROR 6974 --- [ main] o.o.c.r.e.CpsRestExceptionHandler : Exception occurred jakarta.validation.ValidationException: observed-timestamp must be in 'yyyy-MM-dd'T'HH:mm:ss.SSSZ' format at org.onap.cps.rest.controller.DataRestController.toOffsetDateTime(DataRestController.java:197) at org.onap.cps.rest.controller.DataRestController.deleteListOrListElement(DataRestController.java:166) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:351) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:196) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:765) at 
org.springframework.validation.beanvalidation.MethodValidationInterceptor.invoke(MethodValidationInterceptor.java:174) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:184) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:765) at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:717) at org.onap.cps.rest.controller.DataRestController$$SpringCGLIB$$0.deleteListOrListElement() at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:255) at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:188) at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:118) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:925) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:830) at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1089) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:979) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1014) at org.springframework.web.servlet.FrameworkServlet.doDelete(FrameworkServlet.java:936) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:553) at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:885) at org.springframework.test.web.servlet.TestDispatcherServlet.service(TestDispatcherServlet.java:72) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:614) at org.springframework.mock.web.MockFilterChain$ServletFilterProxy.doFilter(MockFilterChain.java:165) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at 
org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.test.web.servlet.MockMvc.perform(MockMvc.java:201) at org.springframework.test.web.servlet.MockMvc$perform.call(Unknown Source) at org.onap.cps.rest.controller.DataRestControllerSpec.$spock_feature_0_16(DataRestControllerSpec.groovy:473) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.spockframework.util.ReflectionUtil.invokeMethod(ReflectionUtil.java:196) at org.spockframework.runtime.model.MethodInfo.lambda$new$0(MethodInfo.java:49) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeatureMethod(PlatformSpecRunner.java:324) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:50) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.IterationNode.lambda$around$0(IterationNode.java:67) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunIteration$5(PlatformSpecRunner.java:236) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runIteration(PlatformSpecRunner.java:218) at org.spockframework.runtime.IterationNode.around(IterationNode.java:67) at org.spockframework.runtime.IterationNode.around(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.NodeTestTask$DefaultDynamicTestExecutor.execute(NodeTestTask.java:226) at org.spockframework.runtime.ParameterizedFeatureChildExecutor.execute(ParameterizedFeatureChildExecutor.java:104) at org.spockframework.runtime.PlatformParameterizedSpecRunner$1.runIteration(PlatformParameterizedSpecRunner.java:72) at 
org.spockframework.runtime.extension.IDataDriver.lambda$static$0(IDataDriver.java:37) at org.spockframework.runtime.PlatformParameterizedSpecRunner.runParameterizedFeature(PlatformParameterizedSpecRunner.java:47) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:40) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:16) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.FeatureNode.lambda$around$0(FeatureNode.java:41) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunFeature$4(PlatformSpecRunner.java:199) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeature(PlatformSpecRunner.java:192) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:41) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:12) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.SpecNode.lambda$around$0(SpecNode.java:63) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunSpec$0(PlatformSpecRunner.java:61) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runSpec(PlatformSpecRunner.java:55) at org.spockframework.runtime.SpecNode.around(SpecNode.java:63) at org.spockframework.runtime.SpecNode.around(SpecNode.java:11) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at 
org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57) at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:198) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:169) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:93) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:58) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:141) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:57) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:103) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:85) at org.junit.platform.launcher.core.DelegatingLauncher.execute(DelegatingLauncher.java:47) at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:63) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.execute(JUnitPlatformProvider.java:188) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invokeAllTests(JUnitPlatformProvider.java:154) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invoke(JUnitPlatformProvider.java:128) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:428) at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:162) at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:562) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:548) 2024-04-18T09:57:14.262Z ERROR 6974 --- [ main] o.o.c.r.e.CpsRestExceptionHandler : Exception occurred jakarta.validation.ValidationException: observed-timestamp must be in 'yyyy-MM-dd'T'HH:mm:ss.SSSZ' format at org.onap.cps.rest.controller.DataRestController.toOffsetDateTime(DataRestController.java:197) at 
org.onap.cps.rest.controller.DataRestController.deleteDataNode(DataRestController.java:89) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:351) at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:196) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:765) at org.springframework.validation.beanvalidation.MethodValidationInterceptor.invoke(MethodValidationInterceptor.java:174) at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:184) at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:765) at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:717) at org.onap.cps.rest.controller.DataRestController$$SpringCGLIB$$0.deleteDataNode() at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:255) at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:188) at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:118) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:925) at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:830) at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1089) at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:979) at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1014) at org.springframework.web.servlet.FrameworkServlet.doDelete(FrameworkServlet.java:936) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:553) at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:885) at org.springframework.test.web.servlet.TestDispatcherServlet.service(TestDispatcherServlet.java:72) at jakarta.servlet.http.HttpServlet.service(HttpServlet.java:614) at org.springframework.mock.web.MockFilterChain$ServletFilterProxy.doFilter(MockFilterChain.java:165) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.RequestContextFilter.doFilterInternal(RequestContextFilter.java:100) at 
org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.FormContentFilter.doFilterInternal(FormContentFilter.java:93) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:201) at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:116) at org.springframework.test.web.servlet.setup.MockMvcFilterDecorator.doFilter(MockMvcFilterDecorator.java:151) at org.springframework.mock.web.MockFilterChain.doFilter(MockFilterChain.java:132) at org.springframework.test.web.servlet.MockMvc.perform(MockMvc.java:201) at org.springframework.test.web.servlet.MockMvc$perform.call(Unknown Source) at org.onap.cps.rest.controller.DataRestControllerSpec.$spock_feature_0_17(DataRestControllerSpec.groovy:495) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.spockframework.util.ReflectionUtil.invokeMethod(ReflectionUtil.java:196) at org.spockframework.runtime.model.MethodInfo.lambda$new$0(MethodInfo.java:49) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeatureMethod(PlatformSpecRunner.java:324) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:50) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.IterationNode.lambda$around$0(IterationNode.java:67) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunIteration$5(PlatformSpecRunner.java:236) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runIteration(PlatformSpecRunner.java:218) at org.spockframework.runtime.IterationNode.around(IterationNode.java:67) at org.spockframework.runtime.IterationNode.around(IterationNode.java:13) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at 
org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.NodeTestTask$DefaultDynamicTestExecutor.execute(NodeTestTask.java:226) at org.spockframework.runtime.ParameterizedFeatureChildExecutor.execute(ParameterizedFeatureChildExecutor.java:104) at org.spockframework.runtime.PlatformParameterizedSpecRunner$1.runIteration(PlatformParameterizedSpecRunner.java:72) at org.spockframework.runtime.extension.IDataDriver.lambda$static$0(IDataDriver.java:37) at org.spockframework.runtime.PlatformParameterizedSpecRunner.runParameterizedFeature(PlatformParameterizedSpecRunner.java:47) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:40) at org.spockframework.runtime.ParameterizedFeatureNode.execute(ParameterizedFeatureNode.java:16) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.FeatureNode.lambda$around$0(FeatureNode.java:41) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunFeature$4(PlatformSpecRunner.java:199) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeature(PlatformSpecRunner.java:192) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:41) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:12) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.SpecNode.lambda$around$0(SpecNode.java:63) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunSpec$0(PlatformSpecRunner.java:61) at 
org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runSpec(PlatformSpecRunner.java:55) at org.spockframework.runtime.SpecNode.around(SpecNode.java:63) at org.spockframework.runtime.SpecNode.around(SpecNode.java:11) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57) at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:198) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:169) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:93) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:58) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:141) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:57) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:103) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:85) at org.junit.platform.launcher.core.DelegatingLauncher.execute(DelegatingLauncher.java:47) at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:63) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.execute(JUnitPlatformProvider.java:188) at 
org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invokeAllTests(JUnitPlatformProvider.java:154) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invoke(JUnitPlatformProvider.java:128) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:428) at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:162) at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:562) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:548)

  .   ____          _            __ _ _
 /\\ / ___'_ __ _ _(_)_ __  __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
 \\/  ___)| |_)| | | | | || (_| |  ) ) ) )
  '  |____| .__|_| |_|_| |_\__, | / / / /
 =========|_|==============|___/=/_/_/_/
 :: Spring Boot ::                (v3.2.4)

2024-04-18T09:57:14.315Z INFO 6974 --- [ main] o.o.c.r.c.QueryRestControllerSpec : Starting QueryRestControllerSpec using Java 17.0.6-ea with PID 6974 (started by jenkins in /w/workspace/cps-master-verify-java/cps-rest)
2024-04-18T09:57:14.316Z INFO 6974 --- [ main] o.o.c.r.c.QueryRestControllerSpec : No active profile set, falling back to 1 default profile: "default"
2024-04-18T09:57:14.578Z INFO 6974 --- [ main] o.s.b.t.m.w.SpringBootMockServletContext : Initializing Spring TestDispatcherServlet ''
2024-04-18T09:57:14.578Z INFO 6974 --- [ main] o.s.t.web.servlet.TestDispatcherServlet : Initializing Servlet ''
2024-04-18T09:57:14.578Z INFO 6974 --- [ main] o.s.t.web.servlet.TestDispatcherServlet : Completed initialization in 0 ms
2024-04-18T09:57:14.583Z INFO 6974 --- [ main] o.o.c.r.c.QueryRestControllerSpec : Started QueryRestControllerSpec in 0.301 seconds (process running for 8.676)
[INFO] Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.465 s - in org.onap.cps.rest.controller.QueryRestControllerSpec
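Note on the repeated "Exception occurred" entries above: they are produced while DataRestControllerSpec drives PUT and DELETE requests through MockMvc with an invalid observed-timestamp parameter. In each case DataRestController.toOffsetDateTime (DataRestController.java:197) rejects the value with a jakarta.validation.ValidationException, which CpsRestExceptionHandler logs; the suite still finishes with Tests run: 60, Failures: 0, Errors: 0, so these traces are the expected negative-path behaviour, not build failures. The following is only a minimal sketch of such a conversion helper, assuming the jakarta.validation API is on the classpath; the class name, null handling and formatter constant are illustrative assumptions and not the actual ONAP CPS source.

    import java.time.OffsetDateTime;
    import java.time.format.DateTimeFormatter;
    import java.time.format.DateTimeParseException;
    import jakarta.validation.ValidationException;

    class ObservedTimestampExample {

        // Pattern string taken from the error message logged above; constant name is assumed.
        private static final String EXPECTED_FORMAT = "yyyy-MM-dd'T'HH:mm:ss.SSSZ";
        private static final DateTimeFormatter FORMATTER =
                DateTimeFormatter.ofPattern(EXPECTED_FORMAT);

        // Sketch of a toOffsetDateTime-style helper: parse the request parameter or fail validation.
        static OffsetDateTime toOffsetDateTime(final String observedTimestamp) {
            if (observedTimestamp == null) {
                return null; // assumption: the parameter is optional on these endpoints
            }
            try {
                return OffsetDateTime.parse(observedTimestamp, FORMATTER);
            } catch (final DateTimeParseException parseException) {
                // Same message as the one logged by CpsRestExceptionHandler in this build log.
                throw new ValidationException(
                        "observed-timestamp must be in '" + EXPECTED_FORMAT + "' format");
            }
        }

        public static void main(final String[] args) {
            // A well-formed value parses; a malformed one surfaces as the validation error
            // that the DataRestControllerSpec negative tests expect the REST layer to report.
            System.out.println(toOffsetDateTime("2024-04-18T09:57:14.219+0000"));
            try {
                toOffsetDateTime("2021-01-01"); // missing time-of-day and offset -> rejected
            } catch (final ValidationException expected) {
                System.out.println(expected.getMessage());
            }
        }
    }

Under these assumptions, "2024-04-18T09:57:14.219+0000" converts to an OffsetDateTime, while a value such as "2021-01-01" triggers the ValidationException seen in the traces above and is turned into an error response by the exception handler.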
:: Spring Boot :: (v3.2.4)
2024-04-18T09:57:14.793Z INFO 6974 --- [ main] o.o.c.r.e.CpsRestExceptionHandlerSpec : Starting CpsRestExceptionHandlerSpec using Java 17.0.6-ea with PID 6974 (started by jenkins in /w/workspace/cps-master-verify-java/cps-rest)
2024-04-18T09:57:14.793Z INFO 6974 --- [ main] o.o.c.r.e.CpsRestExceptionHandlerSpec : No active profile set, falling back to 1 default profile: "default"
2024-04-18T09:57:15.070Z INFO 6974 --- [ main] o.s.b.t.m.w.SpringBootMockServletContext : Initializing Spring TestDispatcherServlet ''
2024-04-18T09:57:15.070Z INFO 6974 --- [ main] o.s.t.web.servlet.TestDispatcherServlet : Initializing Servlet ''
2024-04-18T09:57:15.070Z INFO 6974 --- [ main] o.s.t.web.servlet.TestDispatcherServlet : Completed initialization in 0 ms
2024-04-18T09:57:15.075Z INFO 6974 --- [ main] o.o.c.r.e.CpsRestExceptionHandlerSpec : Started CpsRestExceptionHandlerSpec in 0.325 seconds (process running for 9.168)
2024-04-18T09:57:15.100Z ERROR 6974 --- [ main] o.o.c.r.e.CpsRestExceptionHandler : Exception occurred
java.lang.IllegalStateException: some error message at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499) at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:480) at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:72) at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:105) at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:59) at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:263) at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:277) at org.onap.cps.rest.exceptions.CpsRestExceptionHandlerSpec.$spock_feature_0_0(CpsRestExceptionHandlerSpec.groovy:108) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.spockframework.util.ReflectionUtil.invokeMethod(ReflectionUtil.java:196) at org.spockframework.runtime.model.MethodInfo.lambda$new$0(MethodInfo.java:49) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeatureMethod(PlatformSpecRunner.java:324) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:50) at org.spockframework.runtime.SimpleFeatureNode.execute(SimpleFeatureNode.java:58) at org.spockframework.runtime.SimpleFeatureNode.execute(SimpleFeatureNode.java:15) at
org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.IterationNode.lambda$around$0(IterationNode.java:67) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunIteration$5(PlatformSpecRunner.java:236) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runIteration(PlatformSpecRunner.java:218) at org.spockframework.runtime.IterationNode.around(IterationNode.java:67) at org.spockframework.runtime.SimpleFeatureNode.lambda$around$0(SimpleFeatureNode.java:52) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.FeatureNode.lambda$around$0(FeatureNode.java:41) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunFeature$4(PlatformSpecRunner.java:199) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeature(PlatformSpecRunner.java:192) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:41) at org.spockframework.runtime.SimpleFeatureNode.around(SimpleFeatureNode.java:52) at org.spockframework.runtime.SimpleFeatureNode.around(SimpleFeatureNode.java:15) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.SpecNode.lambda$around$0(SpecNode.java:63) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunSpec$0(PlatformSpecRunner.java:61) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runSpec(PlatformSpecRunner.java:55) at 
org.spockframework.runtime.SpecNode.around(SpecNode.java:63) at org.spockframework.runtime.SpecNode.around(SpecNode.java:11) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57) at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:198) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:169) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:93) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:58) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:141) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:57) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:103) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:85) at org.junit.platform.launcher.core.DelegatingLauncher.execute(DelegatingLauncher.java:47) at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:63) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.execute(JUnitPlatformProvider.java:188) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invokeAllTests(JUnitPlatformProvider.java:154) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invoke(JUnitPlatformProvider.java:128) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:428) at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:162) at 
org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:562) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:548) 2024-04-18T09:57:15.125Z ERROR 6974 --- [ main] o.o.c.r.e.CpsRestExceptionHandler : Exception occurred org.onap.cps.spi.exceptions.AlreadyDefinedException: Already defined exception at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499) at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:480) at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:72) at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:105) at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:59) at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:263) at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:304) at org.onap.cps.rest.exceptions.CpsRestExceptionHandlerSpec.$spock_feature_0_3(CpsRestExceptionHandlerSpec.groovy:135) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:568) at org.spockframework.util.ReflectionUtil.invokeMethod(ReflectionUtil.java:196) at org.spockframework.runtime.model.MethodInfo.lambda$new$0(MethodInfo.java:49) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeatureMethod(PlatformSpecRunner.java:324) at org.spockframework.runtime.IterationNode.execute(IterationNode.java:50) at org.spockframework.runtime.SimpleFeatureNode.execute(SimpleFeatureNode.java:58) at org.spockframework.runtime.SimpleFeatureNode.execute(SimpleFeatureNode.java:15) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.IterationNode.lambda$around$0(IterationNode.java:67) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunIteration$5(PlatformSpecRunner.java:236) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runIteration(PlatformSpecRunner.java:218) at 
org.spockframework.runtime.IterationNode.around(IterationNode.java:67) at org.spockframework.runtime.SimpleFeatureNode.lambda$around$0(SimpleFeatureNode.java:52) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.FeatureNode.lambda$around$0(FeatureNode.java:41) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunFeature$4(PlatformSpecRunner.java:199) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runFeature(PlatformSpecRunner.java:192) at org.spockframework.runtime.FeatureNode.around(FeatureNode.java:41) at org.spockframework.runtime.SimpleFeatureNode.around(SimpleFeatureNode.java:52) at org.spockframework.runtime.SimpleFeatureNode.around(SimpleFeatureNode.java:15) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.spockframework.runtime.SpockNode.sneakyInvoke(SpockNode.java:40) at org.spockframework.runtime.SpecNode.lambda$around$0(SpecNode.java:63) at org.spockframework.runtime.PlatformSpecRunner.lambda$createMethodInfoForDoRunSpec$0(PlatformSpecRunner.java:61) at org.spockframework.runtime.model.MethodInfo.invoke(MethodInfo.java:156) at org.spockframework.runtime.PlatformSpecRunner.invokeRaw(PlatformSpecRunner.java:407) at org.spockframework.runtime.PlatformSpecRunner.invoke(PlatformSpecRunner.java:390) at org.spockframework.runtime.PlatformSpecRunner.runSpec(PlatformSpecRunner.java:55) at org.spockframework.runtime.SpecNode.around(SpecNode.java:63) at org.spockframework.runtime.SpecNode.around(SpecNode.java:11) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at 
org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141) at org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137) at org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139) at org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73) at org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138) at org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95) at org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35) at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57) at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:198) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:169) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:93) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:58) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:141) at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:57) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:103) at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:85) at org.junit.platform.launcher.core.DelegatingLauncher.execute(DelegatingLauncher.java:47) at org.junit.platform.launcher.core.SessionPerRequestLauncher.execute(SessionPerRequestLauncher.java:63) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.execute(JUnitPlatformProvider.java:188) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invokeAllTests(JUnitPlatformProvider.java:154) at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invoke(JUnitPlatformProvider.java:128) at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:428) at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:162) at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:562) at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:548) Caused by: java.lang.Throwable: null at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77) at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499) at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:480) at org.codehaus.groovy.reflection.CachedConstructor.invoke(CachedConstructor.java:72) at org.codehaus.groovy.runtime.callsite.ConstructorSite$ConstructorSiteNoUnwrapNoCoerce.callConstructor(ConstructorSite.java:105) at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallConstructor(CallSiteArray.java:59) at 
org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:263) at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callConstructor(AbstractCallSite.java:268) ... 88 common frames omitted [INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.442 s - in org.onap.cps.rest.exceptions.CpsRestExceptionHandlerSpec [INFO] Running org.onap.cps.rest.utils.MultipartFileUtilSpec [INFO] Tests run: 27, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.025 s - in org.onap.cps.rest.utils.MultipartFileUtilSpec [INFO] Running org.onap.cps.rest.utils.ZipFileSizeValidatorSpec [INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.016 s - in org.onap.cps.rest.utils.ZipFileSizeValidatorSpec [INFO] [INFO] Results: [INFO] [INFO] Tests run: 167, Failures: 0, Errors: 0, Skipped: 0 [INFO] [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:report (post-unit-test) @ cps-rest --- [INFO] Loading execution data file /w/workspace/cps-master-verify-java/cps-rest/target/code-coverage/jacoco-ut.exec [INFO] Analyzed bundle 'cps-rest' with 6 classes [INFO] [INFO] --- exec-maven-plugin:1.6.0:exec (generate-csv) @ cps-rest --- [INFO] [INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ cps-rest --- [INFO] Building jar: /w/workspace/cps-master-verify-java/cps-rest/target/cps-rest-3.4.8-SNAPSHOT.jar [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (pre-integration-test) @ cps-rest --- [INFO] failsafeArgLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-rest/target/code-coverage/jacoco-it.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/* [INFO] [INFO] --- maven-failsafe-plugin:3.0.0-M4:integration-test (integration-tests) @ cps-rest --- [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:report (post-integration-test) @ cps-rest --- [INFO] Skipping JaCoCo execution due to missing execution data file. [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:check (coverage-check) @ cps-rest --- [INFO] Loading execution data file /w/workspace/cps-master-verify-java/cps-rest/target/code-coverage/jacoco-ut.exec [INFO] Analyzed bundle 'cps-rest' with 6 classes [INFO] All coverage checks have been met. 
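The ERROR entries logged by o.o.c.r.e.CpsRestExceptionHandler above are expected output: CpsRestExceptionHandlerSpec makes the mocked service layer throw, and the handler logs each exception before translating it into an HTTP error response, which is why the module still finishes with 16 passing tests. A minimal sketch of that pattern follows; the class name and the status mappings are illustrative assumptions, not the actual CPS implementation.

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.bind.annotation.RestControllerAdvice;

// Illustrative sketch only: names and status mappings are assumptions, not the
// actual org.onap.cps.rest.exceptions.CpsRestExceptionHandler.
@RestControllerAdvice
public class ExampleRestExceptionHandler {

    private static final Logger log = LoggerFactory.getLogger(ExampleRestExceptionHandler.class);

    // Hypothetical stand-in for a domain exception such as "already defined".
    public static class ExampleAlreadyDefinedException extends RuntimeException {
        public ExampleAlreadyDefinedException(final String message) {
            super(message);
        }
    }

    // Conflict-style exceptions are mapped to 409 here (assumed mapping).
    @ExceptionHandler(ExampleAlreadyDefinedException.class)
    public ResponseEntity<String> handleAlreadyDefined(final ExampleAlreadyDefinedException exception) {
        log.error("Exception occurred", exception);
        return ResponseEntity.status(HttpStatus.CONFLICT).body(exception.getMessage());
    }

    // Anything unexpected is logged (the "Exception occurred" lines above) and
    // returned as 500, which is what the spec's mocked IllegalStateException exercises.
    @ExceptionHandler(Exception.class)
    public ResponseEntity<String> handleAnyException(final Exception exception) {
        log.error("Exception occurred", exception);
        return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).body(exception.getMessage());
    }
}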
[INFO] [INFO] --- maven-failsafe-plugin:3.0.0-M4:verify (integration-tests) @ cps-rest --- [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ cps-rest --- [INFO] Installing /w/workspace/cps-master-verify-java/cps-rest/target/cps-rest-3.4.8-SNAPSHOT.jar to /home/jenkins/.m2/repository/org/onap/cps/cps-rest/3.4.8-SNAPSHOT/cps-rest-3.4.8-SNAPSHOT.jar [INFO] Installing /w/workspace/cps-master-verify-java/cps-rest/pom.xml to /home/jenkins/.m2/repository/org/onap/cps/cps-rest/3.4.8-SNAPSHOT/cps-rest-3.4.8-SNAPSHOT.pom [INFO] [INFO] --------------------< org.onap.cps:cps-ncmp-events >-------------------- [INFO] Building cps-ncmp-events 3.4.8-SNAPSHOT [10/23] [INFO] --------------------------------[ jar ]--------------------------------- [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ cps-ncmp-events --- [INFO] [INFO] --- maven-enforcer-plugin:3.0.0-M2:enforce (enforce-property) @ cps-ncmp-events --- [INFO] [INFO] --- maven-enforcer-plugin:3.0.0-M2:enforce (enforce-no-snapshots) @ cps-ncmp-events --- [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (pre-unit-test) @ cps-ncmp-events --- [INFO] surefireArgLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-ncmp-events/target/code-coverage/jacoco-ut.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/* [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (default-prepare-agent) @ cps-ncmp-events --- [INFO] argLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-ncmp-events/target/jacoco.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/* [INFO] [INFO] --- jsonschema2pojo-maven-plugin:1.2.1:generate (default) @ cps-ncmp-events --- [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (onap-license) @ cps-ncmp-events --- [INFO] You have 0 Checkstyle violations. [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (onap-java-style) @ cps-ncmp-events --- [INFO] You have 0 Checkstyle violations. [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (cps-java-style) @ cps-ncmp-events --- [INFO] Starting audit... Audit done. [INFO] You have 0 Checkstyle violations. [INFO] [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ cps-ncmp-events --- [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 12 resources [INFO] skip non existing resourceDirectory /w/workspace/cps-master-verify-java/cps-ncmp-events/target/generated-sources/license [INFO] skip non existing resourceDirectory /w/workspace/cps-master-verify-java/cps-ncmp-events/target/generated-resources/licenses [INFO] [INFO] --- maven-compiler-plugin:3.11.0:compile (default-compile) @ cps-ncmp-events --- [INFO] Changes detected - recompiling the module! 
:dependency [INFO] Compiling 43 source files with javac [debug release 17] to target/classes [INFO] [INFO] >>> spotbugs-maven-plugin:4.4.2:check (analyze-compile) > :spotbugs @ cps-ncmp-events >>> [INFO] [INFO] --- spotbugs-maven-plugin:4.4.2:spotbugs (spotbugs) @ cps-ncmp-events --- [INFO] Fork Value is true [java] WARNING: A terminally deprecated method in java.lang.System has been called [java] WARNING: System::setSecurityManager has been called by edu.umd.cs.findbugs.ba.jsr305.TypeQualifierValue (file:/home/jenkins/.m2/repository/com/github/spotbugs/spotbugs/4.2.3/spotbugs-4.2.3.jar) [java] WARNING: Please consider reporting this to the maintainers of edu.umd.cs.findbugs.ba.jsr305.TypeQualifierValue [java] WARNING: System::setSecurityManager will be removed in a future release [INFO] Done SpotBugs Analysis.... [INFO] [INFO] <<< spotbugs-maven-plugin:4.4.2:check (analyze-compile) < :spotbugs @ cps-ncmp-events <<< [INFO] [INFO] [INFO] --- spotbugs-maven-plugin:4.4.2:check (analyze-compile) @ cps-ncmp-events --- [INFO] BugInstance size is 0 [INFO] Error size is 0 [INFO] No errors/warnings found [INFO] [INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ cps-ncmp-events --- [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /w/workspace/cps-master-verify-java/cps-ncmp-events/src/test/resources [INFO] [INFO] --- maven-compiler-plugin:3.11.0:testCompile (default-testCompile) @ cps-ncmp-events --- [INFO] No sources to compile [INFO] [INFO] --- gmavenplus-plugin:1.9.0:compileTests (default) @ cps-ncmp-events --- [INFO] No sources specified for compilation. Skipping. [INFO] [INFO] --- maven-surefire-plugin:3.0.0-M5:test (default-test) @ cps-ncmp-events --- [INFO] No tests to run. [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:report (post-unit-test) @ cps-ncmp-events --- [INFO] Skipping JaCoCo execution due to missing execution data file. [INFO] [INFO] --- exec-maven-plugin:1.6.0:exec (generate-csv) @ cps-ncmp-events --- [INFO] [INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ cps-ncmp-events --- [INFO] Building jar: /w/workspace/cps-master-verify-java/cps-ncmp-events/target/cps-ncmp-events-3.4.8-SNAPSHOT.jar [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (pre-integration-test) @ cps-ncmp-events --- [INFO] failsafeArgLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-ncmp-events/target/code-coverage/jacoco-it.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/* [INFO] [INFO] --- maven-failsafe-plugin:3.0.0-M4:integration-test (integration-tests) @ cps-ncmp-events --- [INFO] No tests to run. [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:report (post-integration-test) @ cps-ncmp-events --- [INFO] Skipping JaCoCo execution due to missing execution data file. 
[INFO] [INFO] --- jacoco-maven-plugin:0.8.10:check (coverage-check) @ cps-ncmp-events --- [INFO] Skipping JaCoCo execution due to missing execution data file:/w/workspace/cps-master-verify-java/cps-ncmp-events/target/code-coverage/jacoco-ut.exec [INFO] [INFO] --- maven-failsafe-plugin:3.0.0-M4:verify (integration-tests) @ cps-ncmp-events --- [INFO] [INFO] --- maven-install-plugin:2.4:install (default-install) @ cps-ncmp-events --- [INFO] Installing /w/workspace/cps-master-verify-java/cps-ncmp-events/target/cps-ncmp-events-3.4.8-SNAPSHOT.jar to /home/jenkins/.m2/repository/org/onap/cps/cps-ncmp-events/3.4.8-SNAPSHOT/cps-ncmp-events-3.4.8-SNAPSHOT.jar [INFO] Installing /w/workspace/cps-master-verify-java/cps-ncmp-events/pom.xml to /home/jenkins/.m2/repository/org/onap/cps/cps-ncmp-events/3.4.8-SNAPSHOT/cps-ncmp-events-3.4.8-SNAPSHOT.pom [INFO] [INFO] -------------------< org.onap.cps:cps-ncmp-service >-------------------- [INFO] Building cps-ncmp-service 3.4.8-SNAPSHOT [11/23] [INFO] --------------------------------[ jar ]--------------------------------- [INFO] [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ cps-ncmp-service --- [INFO] [INFO] --- maven-enforcer-plugin:3.0.0-M2:enforce (enforce-property) @ cps-ncmp-service --- [INFO] [INFO] --- maven-enforcer-plugin:3.0.0-M2:enforce (enforce-no-snapshots) @ cps-ncmp-service --- [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (pre-unit-test) @ cps-ncmp-service --- [INFO] surefireArgLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-ncmp-service/target/code-coverage/jacoco-ut.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/* [INFO] [INFO] --- jacoco-maven-plugin:0.8.10:prepare-agent (default-prepare-agent) @ cps-ncmp-service --- [INFO] argLine set to -javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-ncmp-service/target/jacoco.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/* [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (onap-license) @ cps-ncmp-service --- [INFO] You have 0 Checkstyle violations. [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (onap-java-style) @ cps-ncmp-service --- [INFO] You have 0 Checkstyle violations. [INFO] [INFO] --- maven-checkstyle-plugin:3.1.1:check (cps-java-style) @ cps-ncmp-service --- [INFO] Starting audit... Audit done. [INFO] You have 0 Checkstyle violations. [INFO] [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ cps-ncmp-service --- [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 2 resources [INFO] skip non existing resourceDirectory /w/workspace/cps-master-verify-java/cps-ncmp-service/target/generated-sources/license [INFO] skip non existing resourceDirectory /w/workspace/cps-master-verify-java/cps-ncmp-service/target/generated-resources/licenses [INFO] [INFO] --- maven-compiler-plugin:3.11.0:compile (default-compile) @ cps-ncmp-service --- [INFO] Changes detected - recompiling the module! 
:dependency [INFO] Compiling 124 source files with javac [debug release 17] to target/classes [WARNING] /w/workspace/cps-master-verify-java/cps-ncmp-service/src/main/java/org/onap/cps/ncmp/api/impl/events/lcm/LcmEventHeaderMapper.java:[34,20] Unmapped target properties: "withEventId, withEventCorrelationId, withEventTime, withEventSource, withEventType, withEventSchema, withEventSchemaVersion". [WARNING] /w/workspace/cps-master-verify-java/cps-ncmp-service/src/main/java/org/onap/cps/ncmp/api/impl/async/NcmpAsyncRequestResponseEventMapper.java:[53,35] Unmapped target properties: "withEventId, withEventCorrelationId, withEventTime, withEventTarget, withEventType, withEventSchema, withEventSchemaVersion, event, withEvent, withForwardedEvent". [INFO] /w/workspace/cps-master-verify-java/cps-ncmp-service/src/main/java/org/onap/cps/ncmp/api/impl/events/cmsubscription/service/CmNotificationSubscriptionPersistenceServiceImpl.java: Some input files use unchecked or unsafe operations. [INFO] /w/workspace/cps-master-verify-java/cps-ncmp-service/src/main/java/org/onap/cps/ncmp/api/impl/events/cmsubscription/service/CmNotificationSubscriptionPersistenceServiceImpl.java: Recompile with -Xlint:unchecked for details. [INFO] [INFO] >>> spotbugs-maven-plugin:4.4.2:check (analyze-compile) > :spotbugs @ cps-ncmp-service >>> [INFO] [INFO] --- spotbugs-maven-plugin:4.4.2:spotbugs (spotbugs) @ cps-ncmp-service --- [INFO] Fork Value is true [java] WARNING: A terminally deprecated method in java.lang.System has been called [java] WARNING: System::setSecurityManager has been called by edu.umd.cs.findbugs.ba.jsr305.TypeQualifierValue (file:/home/jenkins/.m2/repository/com/github/spotbugs/spotbugs/4.2.3/spotbugs-4.2.3.jar) [java] WARNING: Please consider reporting this to the maintainers of edu.umd.cs.findbugs.ba.jsr305.TypeQualifierValue [java] WARNING: System::setSecurityManager will be removed in a future release [INFO] Done SpotBugs Analysis.... [INFO] [INFO] <<< spotbugs-maven-plugin:4.4.2:check (analyze-compile) < :spotbugs @ cps-ncmp-service <<< [INFO] [INFO] [INFO] --- spotbugs-maven-plugin:4.4.2:check (analyze-compile) @ cps-ncmp-service --- [INFO] BugInstance size is 0 [INFO] Error size is 0 [INFO] No errors/warnings found [INFO] [INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ cps-ncmp-service --- [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] Copying 12 resources [INFO] [INFO] --- maven-compiler-plugin:3.11.0:testCompile (default-testCompile) @ cps-ncmp-service --- [INFO] Changes detected - recompiling the module! :dependency [INFO] Compiling 2 source files with javac [debug release 17] to target/test-classes [INFO] [INFO] --- gmavenplus-plugin:1.9.0:compileTests (default) @ cps-ncmp-service --- [INFO] Using isolated classloader, without GMavenPlus classpath. [INFO] Using Groovy 3.0.18 to perform compileTests. [INFO] Compiled 188 files. 
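The two [WARNING] lines emitted while compiling cps-ncmp-service are MapStruct diagnostics: the generated mappers never populate the listed builder-style "with..." target properties, so the annotation processor flags them as unmapped. They are warnings only and do not fail this build. One common way to acknowledge fields that are intentionally set outside the mapper is sketched below with hypothetical types; this is not the project's actual mapper code.

import org.mapstruct.Mapper;
import org.mapstruct.Mapping;

// Illustrative sketch only: hypothetical types, not the CPS mappers that
// produced the warnings above.
public class MapStructUnmappedTargetExample {

    // Minimal source/target types standing in for the generated event classes.
    public static class SourceHeader {
        public String correlationId;
    }

    public static class TargetHeader {
        public String correlationId;
        public String eventId;   // deliberately filled in later, not by the mapper
        public String eventTime; // deliberately filled in later, not by the mapper
    }

    @Mapper
    public interface HeaderMapper {
        // Declaring intentionally unmapped targets as ignored removes the
        // "Unmapped target properties" warning for those fields.
        @Mapping(target = "eventId", ignore = true)
        @Mapping(target = "eventTime", ignore = true)
        TargetHeader toTarget(SourceHeader source);
    }
}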
[INFO] [INFO] --- maven-surefire-plugin:3.0.0-M5:test (default-test) @ cps-ncmp-service --- [INFO] [INFO] ------------------------------------------------------- [INFO] T E S T S [INFO] ------------------------------------------------------- [INFO] Running org.onap.cps.ncmp.init.AbstractModelLoaderSpec 09:57:38.144 [main] ERROR org.onap.cps.ncmp.init.AbstractModelLoader -- Onboarding model for NCMP failed: test message 09:57:38.193 [main] WARN org.onap.cps.ncmp.init.AbstractModelLoader -- Creating new schema set failed as schema set already exists 09:57:38.199 [main] DEBUG org.onap.cps.ncmp.init.AbstractModelLoader -- Onboarding failed as unable to read file: models/no such yang file 09:57:38.199 [main] ERROR org.onap.cps.ncmp.init.AbstractModelLoader -- Creating schema set failed: Onboarding failed as unable to read file: models/no such yang file 09:57:38.211 [main] WARN org.onap.cps.ncmp.init.AbstractModelLoader -- Deleting schema set failed: test message 09:57:38.224 [main] WARN org.onap.cps.ncmp.init.AbstractModelLoader -- Creating new anchor failed as anchor already exists 09:57:38.229 [main] ERROR org.onap.cps.ncmp.init.AbstractModelLoader -- Creating anchor failed: test message 09:57:38.265 [main] WARN org.onap.cps.ncmp.init.AbstractModelLoader -- Creating new data node 'new node' failed as data node already exists 09:57:38.270 [main] ERROR org.onap.cps.ncmp.init.AbstractModelLoader -- Creating data node failed: test message 09:57:38.277 [main] ERROR org.onap.cps.ncmp.init.AbstractModelLoader -- Updating schema set failed: test message [INFO] Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.391 s - in org.onap.cps.ncmp.init.AbstractModelLoaderSpec [INFO] Running org.onap.cps.ncmp.init.CmDataSubscriptionModelLoaderSpec 09:57:38.310 [main] INFO org.onap.cps.ncmp.init.CmDataSubscriptionModelLoader -- Subscription Models onboarded successfully 09:57:38.315 [main] INFO org.onap.cps.ncmp.init.CmDataSubscriptionModelLoader -- Creating new child data node 'some datastore' for data node 'datastores' failed as data node already exists 09:57:38.319 [main] ERROR org.onap.cps.ncmp.init.CmDataSubscriptionModelLoader -- Creating data node failed: test message [INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.03 s - in org.onap.cps.ncmp.init.CmDataSubscriptionModelLoaderSpec [INFO] Running org.onap.cps.ncmp.init.InventoryModelLoaderSpec 09:57:38.333 [main] INFO org.onap.cps.ncmp.init.InventoryModelLoader -- Inventory Model updated successfully [INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.008 s - in org.onap.cps.ncmp.init.InventoryModelLoaderSpec [INFO] Running org.onap.cps.ncmp.api.impl.DataJobServiceImplSpec 09:57:38.470 [main] INFO org.onap.cps.ncmp.api.impl.DataJobServiceImpl -- data job id for read operation is: some-job-id 09:57:38.485 [main] INFO org.onap.cps.ncmp.api.impl.DataJobServiceImpl -- data job id for write operation is: some-job-id [INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.145 s - in org.onap.cps.ncmp.api.impl.DataJobServiceImplSpec [INFO] Running org.onap.cps.ncmp.api.impl.NetworkCmProxyCmHandleQueryServiceSpec [INFO] Tests run: 28, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.186 s - in org.onap.cps.ncmp.api.impl.NetworkCmProxyCmHandleQueryServiceSpec [INFO] Running org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImplRegistrationSpec 09:57:39.020 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [cmhandle-3] into locked (for 
upgrade) state. 09:57:39.029 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [cmhandle-3] into locked (for upgrade) state. 09:57:39.033 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.040 [main] ERROR org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Unable to find data node for cm handle id : cmhandle-3 , caused by : DataNode not found 09:57:39.040 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.043 [main] ERROR org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Unable to upgrade cm handle id: cmhandle-3, caused by : some error message 09:57:39.043 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.045 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.047 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.049 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.066 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.071 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.073 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.074 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.078 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.080 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.082 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.087 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.091 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.097 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.105 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.108 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.117 [main] ERROR org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Unable to de-register cm-handle batch, retrying on each cm handle 09:57:39.118 [main] ERROR org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Unable to de-register cm-handle id : cmhandle2 , caused by : Failed 09:57:39.124 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 
09:57:39.130 [main] ERROR org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Unable to de-register cm-handle batch, retrying on each cm handle 09:57:39.130 [main] ERROR org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Unable to de-register cm-handle id : cmhandle , caused by : Failed 09:57:39.131 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.136 [main] ERROR org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Unable to de-register cm-handle batch, retrying on each cm handle 09:57:39.136 [main] ERROR org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Unable to find dataNode for cmHandleId : cmhandle , caused by : DataNode not found 09:57:39.136 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.140 [main] ERROR org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Unable to de-register cm-handle batch, retrying on each cm handle 09:57:39.140 [main] ERROR org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Unable to de-register cm-handle id: cmhandle, caused by: 09:57:39.140 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 09:57:39.142 [main] ERROR org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Unable to de-register cm-handle batch, retrying on each cm handle 09:57:39.142 [main] ERROR org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Unable to de-register cm-handle id : cmhandle , caused by : Failed 09:57:39.142 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Moving cm handles : [] into locked (for upgrade) state. 
[INFO] Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.449 s - in org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImplRegistrationSpec [INFO] Running org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImplSpec 09:57:39.422 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImpl -- Data-Sync Enabled flag is already: true [INFO] Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.277 s - in org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServiceImplSpec [INFO] Running org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandlerSpec 09:57:39.442 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Updating alternate-id for cmHandle myHandle1 with value : null) 09:57:39.446 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Creating a new DataNode with xpath /dmi-registry/cm-handles[@id='myHandle1']/public-properties[@name='newPubProp1'] , key : newPubProp1 and value : pub-val 09:57:39.446 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Building a new node with xpath /dmi-registry/cm-handles[@id='myHandle1']/public-properties[@name='newPubProp1'] with leaves (name : newPubProp1 , value : pub-val) 09:57:39.495 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Updating alternate-id for cmHandle myHandle1 with value : null) 09:57:39.495 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Creating a new DataNode with xpath /dmi-registry/cm-handles[@id='myHandle1']/public-properties[@name='publicProp4'] , key : publicProp4 and value : newPubVal 09:57:39.495 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Building a new node with xpath /dmi-registry/cm-handles[@id='myHandle1']/public-properties[@name='publicProp4'] with leaves (name : publicProp4 , value : newPubVal) 09:57:39.497 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Updating alternate-id for cmHandle myHandle1 with value : null) 09:57:39.499 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Updating alternate-id for cmHandle myHandle1 with value : null) 09:57:39.502 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Updating alternate-id for cmHandle myHandle1 with value : null) 09:57:39.502 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Creating a new DataNode with xpath /dmi-registry/cm-handles[@id='myHandle1']/additional-properties[@name='newAdditionalProp1'] , key : newAdditionalProp1 and value : add-value 09:57:39.502 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Building a new node with xpath /dmi-registry/cm-handles[@id='myHandle1']/additional-properties[@name='newAdditionalProp1'] with leaves (name : newAdditionalProp1 , value : add-value) 09:57:39.504 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Updating alternate-id for cmHandle myHandle1 with value : null) 09:57:39.504 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Creating a new DataNode with xpath /dmi-registry/cm-handles[@id='myHandle1']/additional-properties[@name='additionalProp1'] , key : additionalProp1 and value : newValue 09:57:39.504 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Building a new node with xpath 
/dmi-registry/cm-handles[@id='myHandle1']/additional-properties[@name='additionalProp1'] with leaves (name : additionalProp1 , value : newValue) 09:57:39.506 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Updating alternate-id for cmHandle myHandle1 with value : null) 09:57:39.507 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Updating alternate-id for cmHandle myHandle1 with value : null) 09:57:39.509 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Updating alternate-id for cmHandle myHandle1 with value : null) 09:57:39.511 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Updating alternate-id for cmHandle myHandle1 with value : null) 09:57:39.512 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Deleting dataNode with xpath : [/dmi-registry/cm-handles[@id='myHandle1']/public-properties[@name='publicProp3']] 09:57:39.512 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Deleting dataNode with xpath : [/dmi-registry/cm-handles[@id='myHandle1']/public-properties[@name='publicProp4']] 09:57:39.514 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Updating alternate-id for cmHandle myHandle1 with value : null) 09:57:39.516 [main] ERROR org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Unable to find dataNode for cmHandleId : cmHandleId , caused by : DataNode not found 09:57:39.519 [main] ERROR org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Unable to update cmHandle : cmHandleId , caused by : Failed 09:57:39.521 [main] ERROR org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Unable to update cm handle : cmHandleId with spaces, caused by : Name Validation Error. 
09:57:39.523 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Updating alternate-id for cmHandle myHandle1 with value : null) 09:57:39.524 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Creating a new DataNode with xpath /dmi-registry/cm-handles[@id='myHandle1']/public-properties[@name='publicProp1'] , key : publicProp1 and value : value 09:57:39.524 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Building a new node with xpath /dmi-registry/cm-handles[@id='myHandle1']/public-properties[@name='publicProp1'] with leaves (name : publicProp1 , value : value) 09:57:39.524 [main] ERROR org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Unable to find dataNode for cmHandleId : myHandle1 , caused by : DataNode not found 09:57:39.525 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Updating alternate-id for cmHandle myHandle1 with value : null) 09:57:39.525 [main] INFO org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Creating a new DataNode with xpath /dmi-registry/cm-handles[@id='myHandle1']/public-properties[@name='publicProp1'] , key : publicProp1 and value : value 09:57:39.525 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Building a new node with xpath /dmi-registry/cm-handles[@id='myHandle1']/public-properties[@name='publicProp1'] with leaves (name : publicProp1 , value : value) 09:57:39.532 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Updating alternate-id for cmHandle myHandle1 with value : alt-1) 09:57:39.535 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Updating data-producer-identifier for cmHandle cmHandleId with value : someDataProducerIdentifier) 09:57:39.536 [main] DEBUG org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Updating data-producer-identifier for cmHandle cmHandleId with value : someDataProducerIdentifier) 09:57:39.537 [main] WARN org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandler -- Unable to update dataProducerIdentifier for cmHandle myHandle1. Value for dataProducerIdentifier has been set previously. 
[INFO] Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.1 s - in org.onap.cps.ncmp.api.impl.NetworkCmProxyDataServicePropertyHandlerSpec [INFO] Running org.onap.cps.ncmp.api.impl.NetworkCmProxyQueryServiceImplSpec [INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.006 s - in org.onap.cps.ncmp.api.impl.NetworkCmProxyQueryServiceImplSpec [INFO] Running org.onap.cps.ncmp.api.impl.async.NcmpAsyncRequestResponseEventMapperSpec [INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.045 s - in org.onap.cps.ncmp.api.impl.async.NcmpAsyncRequestResponseEventMapperSpec [INFO] Running org.onap.cps.ncmp.api.impl.async.RecordFilterStrategiesSpec [INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.018 s - in org.onap.cps.ncmp.api.impl.async.RecordFilterStrategiesSpec 09:57:39.724 [main] INFO org.testcontainers.images.PullPolicy -- Image pull policy will be performed by: DefaultPullPolicy() 09:57:39.726 [main] INFO org.testcontainers.utility.ImageNameSubstitutor -- Image name substitution will be performed by: DefaultImageNameSubstitutor (composite of 'ConfigurationFileImageNameSubstitutor' and 'PrefixingImageNameSubstitutor') [INFO] Running org.onap.cps.ncmp.api.impl.async.NcmpAsyncRequestResponseEventProducerIntegrationSpec 09:57:40.480 [main] INFO org.testcontainers.dockerclient.DockerClientProviderStrategy -- Found Docker environment with local Unix socket (unix:///var/run/docker.sock) 09:57:40.492 [main] INFO org.testcontainers.DockerClientFactory -- Docker host IP address is localhost 09:57:40.514 [main] INFO org.testcontainers.DockerClientFactory -- Connected to docker: Server Version: 23.0.1 API Version: 1.42 Operating System: CentOS Stream 8 Total Memory: 31890 MB 09:57:40.518 [main] WARN org.testcontainers.utility.ResourceReaper -- ******************************************************************************** Ryuk has been disabled. This can cause unexpected behavior in your environment. ******************************************************************************** 09:57:40.522 [main] INFO org.testcontainers.DockerClientFactory -- Checking the system... 09:57:40.524 [main] INFO org.testcontainers.DockerClientFactory -- ✔︎ Docker server version should be at least 1.6.0 09:57:40.555 [main] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling docker image: registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1. Please be patient; this may take some time but only needs to be done once. 09:57:40.562 [main] INFO org.testcontainers.utility.RegistryAuthLocator -- Failure when attempting to lookup auth config. Please ignore if you don't have images in an authenticated registry. Details: (dockerImageName: registry.nordix.org/onaptest/confluentinc/cp-kafka:latest, configFile: /home/jenkins/.docker/config.json, configEnv: DOCKER_AUTH_CONFIG). Falling back to docker-java default behaviour. Exception message: Status 404: No config supplied. 
Checked in order: /home/jenkins/.docker/config.json (file not found), DOCKER_AUTH_CONFIG (not set) 09:57:42.751 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Starting to pull image 09:57:42.784 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 0 pending, 0 downloaded, 0 extracted, (0 bytes/0 bytes) 09:57:43.268 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 10 pending, 1 downloaded, 0 extracted, (1 KB/? MB) 09:57:44.921 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 9 pending, 2 downloaded, 0 extracted, (38 MB/? MB) 09:57:45.497 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 8 pending, 3 downloaded, 0 extracted, (63 MB/? MB) 09:57:45.844 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 7 pending, 4 downloaded, 0 extracted, (80 MB/? MB) 09:57:46.383 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 6 pending, 5 downloaded, 0 extracted, (91 MB/? MB) 09:57:46.885 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 6 pending, 5 downloaded, 1 extracted, (104 MB/? MB) 09:57:46.915 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 5 pending, 6 downloaded, 1 extracted, (104 MB/? MB) 09:57:46.967 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 5 pending, 6 downloaded, 2 extracted, (108 MB/? MB) 09:57:47.437 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 4 pending, 7 downloaded, 2 extracted, (126 MB/? MB) 09:57:48.411 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 3 pending, 8 downloaded, 2 extracted, (170 MB/? MB) 09:57:48.929 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 2 pending, 9 downloaded, 2 extracted, (183 MB/? MB) 09:57:51.970 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 1 pending, 10 downloaded, 2 extracted, (316 MB/? 
MB) 09:57:54.263 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 0 pending, 11 downloaded, 2 extracted, (365 MB/370 MB) 09:57:59.307 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 0 pending, 11 downloaded, 3 extracted, (367 MB/370 MB) 09:57:59.462 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 0 pending, 11 downloaded, 4 extracted, (368 MB/370 MB) 09:57:59.551 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 0 pending, 11 downloaded, 5 extracted, (368 MB/370 MB) 09:57:59.877 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 0 pending, 11 downloaded, 6 extracted, (368 MB/370 MB) 09:57:59.962 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 0 pending, 11 downloaded, 7 extracted, (368 MB/370 MB) 09:58:00.040 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 0 pending, 11 downloaded, 8 extracted, (368 MB/370 MB) 09:58:00.117 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 0 pending, 11 downloaded, 9 extracted, (368 MB/370 MB) 09:58:00.735 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 0 pending, 11 downloaded, 10 extracted, (370 MB/370 MB) 09:58:00.811 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pulling image layers: 0 pending, 11 downloaded, 11 extracted, (370 MB/370 MB) 09:58:00.832 [docker-java-stream-729162289] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Pull complete. 
11 layers, pulled in 18s (downloaded 370 MB at 20 MB/s) 09:58:00.832 [main] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Image registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 pull took PT20.276456602S 09:58:00.878 [main] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Creating container for image: registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 09:58:06.294 [main] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Container registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 is starting: ce4e9d6c11bbcf4265ccfd10a755890337cfb647b304a97358dcbc9a2063366a 09:58:11.892 [main] INFO tc.registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 -- Container registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 started in PT11.013970089S 09:58:12.042 [main] INFO org.apache.kafka.clients.consumer.ConsumerConfig -- ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.include.jmx.reporter = true auto.offset.reset = earliest bootstrap.servers = [PLAINTEXT://localhost:32768] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = consumer-test-1 client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = true exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = test group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.connect.timeout.ms = null sasl.login.read.timeout.ms = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.login.retry.backoff.max.ms = 10000 sasl.login.retry.backoff.ms = 100 sasl.mechanism = GSSAPI sasl.oauthbearer.clock.skew.seconds = 30 sasl.oauthbearer.expected.audience = null sasl.oauthbearer.expected.issuer = null sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000 sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000 sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100 sasl.oauthbearer.jwks.endpoint.url = null sasl.oauthbearer.scope.claim.name = scope sasl.oauthbearer.sub.claim.name = sub sasl.oauthbearer.token.endpoint.url = null security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 45000 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 
ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.StringDeserializer 09:58:12.175 [main] INFO org.apache.kafka.common.utils.AppInfoParser -- Kafka version: 3.6.1 09:58:12.176 [main] INFO org.apache.kafka.common.utils.AppInfoParser -- Kafka commitId: 5e3c2b738d253ff5 09:58:12.176 [main] INFO org.apache.kafka.common.utils.AppInfoParser -- Kafka startTimeMs: 1713434292173 09:57:34,846 |-INFO in ch.qos.logback.classic.LoggerContext[default] - This is logback-classic version 1.4.14 09:57:34,849 |-INFO in ch.qos.logback.classic.util.ContextInitializer@34c8140f - Here is a list of configurators discovered as a service, by rank: 09:57:34,849 |-INFO in ch.qos.logback.classic.util.ContextInitializer@34c8140f - org.springframework.boot.logging.logback.RootLogLevelConfigurator 09:57:34,849 |-INFO in ch.qos.logback.classic.util.ContextInitializer@34c8140f - They will be invoked in order until ExecutionStatus.DO_NOT_INVOKE_NEXT_IF_ANY is returned. 09:57:34,849 |-INFO in ch.qos.logback.classic.util.ContextInitializer@34c8140f - Constructed configurator of type class org.springframework.boot.logging.logback.RootLogLevelConfigurator 09:57:34,859 |-INFO in ch.qos.logback.classic.util.ContextInitializer@34c8140f - org.springframework.boot.logging.logback.RootLogLevelConfigurator.configure() call lasted 1 milliseconds. ExecutionStatus=INVOKE_NEXT_IF_ANY 09:57:34,859 |-INFO in ch.qos.logback.classic.util.ContextInitializer@34c8140f - Trying to configure with ch.qos.logback.classic.joran.SerializedModelConfigurator 09:57:34,860 |-INFO in ch.qos.logback.classic.util.ContextInitializer@34c8140f - Constructed configurator of type class ch.qos.logback.classic.joran.SerializedModelConfigurator 09:57:34,862 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback-test.scmo] 09:57:34,862 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback.scmo] 09:57:34,862 |-INFO in ch.qos.logback.classic.util.ContextInitializer@34c8140f - ch.qos.logback.classic.joran.SerializedModelConfigurator.configure() call lasted 2 milliseconds. ExecutionStatus=INVOKE_NEXT_IF_ANY 09:57:34,862 |-INFO in ch.qos.logback.classic.util.ContextInitializer@34c8140f - Trying to configure with ch.qos.logback.classic.util.DefaultJoranConfigurator 09:57:34,863 |-INFO in ch.qos.logback.classic.util.ContextInitializer@34c8140f - Constructed configurator of type class ch.qos.logback.classic.util.DefaultJoranConfigurator 09:57:34,864 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback-test.xml] 09:57:34,864 |-INFO in ch.qos.logback.classic.LoggerContext[default] - Could NOT find resource [logback.xml] 09:57:34,864 |-INFO in ch.qos.logback.classic.util.ContextInitializer@34c8140f - ch.qos.logback.classic.util.DefaultJoranConfigurator.configure() call lasted 1 milliseconds. 
ExecutionStatus=INVOKE_NEXT_IF_ANY 09:57:34,864 |-INFO in ch.qos.logback.classic.util.ContextInitializer@34c8140f - Trying to configure with ch.qos.logback.classic.BasicConfigurator 09:57:34,865 |-INFO in ch.qos.logback.classic.util.ContextInitializer@34c8140f - Constructed configurator of type class ch.qos.logback.classic.BasicConfigurator 09:57:34,865 |-INFO in ch.qos.logback.classic.BasicConfigurator@2791a2b4 - Setting up default configuration. 09:57:34,885 |-INFO in ch.qos.logback.classic.util.ContextInitializer@34c8140f - ch.qos.logback.classic.BasicConfigurator.configure() call lasted 20 milliseconds. ExecutionStatus=NEUTRAL 09:58:12,440 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@1b81fb54 - URL [jar:file:/w/workspace/cps-master-verify-java/cps-service/target/cps-service-3.4.8-SNAPSHOT.jar!/logback-spring.xml] is not of type file 09:58:12,505 |-INFO in ch.qos.logback.core.joran.util.ConfigurationWatchListUtil@7c062778 - Adding [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/defaults.xml] to configuration watch list. 09:58:12,505 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@1b81fb54 - URL [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/defaults.xml] is not of type file 09:58:12,509 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word applicationName with class [org.springframework.boot.logging.logback.ApplicationNameConverter] 09:58:12,509 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word clr with class [org.springframework.boot.logging.logback.ColorConverter] 09:58:12,509 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word correlationId with class [org.springframework.boot.logging.logback.CorrelationIdConverter] 09:58:12,509 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word wex with class [org.springframework.boot.logging.logback.WhitespaceThrowableProxyConverter] 09:58:12,509 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word wEx with class [org.springframework.boot.logging.logback.ExtendedWhitespaceThrowableProxyConverter] 09:58:12,511 |-INFO in ch.qos.logback.core.joran.util.ConfigurationWatchListUtil@7c062778 - Adding [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/console-appender.xml] to configuration watch list. 
09:58:12,511 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@1b81fb54 - URL [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/console-appender.xml] is not of type file 09:58:12,519 |-WARN in IfNestedWithinSecondPhaseElementSC - <if> elements cannot be nested within an <appender>, <logger> or <root> element 09:58:12,519 |-WARN in IfNestedWithinSecondPhaseElementSC - See also http://logback.qos.ch/codes.html#nested_if_element 09:58:12,523 |-WARN in IfNestedWithinSecondPhaseElementSC - Element at line 60 contains a nested <if> element at line 61 09:58:12,559 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Registering a new ReconfigureOnChangeTask ReconfigureOnChangeTask(born:1713434292557) 09:58:12,559 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Will scan for changes in [jar:file:/w/workspace/cps-master-verify-java/cps-service/target/cps-service-3.4.8-SNAPSHOT.jar!/logback-spring.xml] 09:58:12,559 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Setting ReconfigureOnChangeTask scanning period to 30 seconds 09:58:12,565 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.catalina.startup.DigesterFactory] to ERROR 09:58:12,565 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@81a3c9b - Propagating ERROR level on Logger[org.apache.catalina.startup.DigesterFactory] onto the JUL framework 09:58:12,566 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.catalina.util.LifecycleBase] to ERROR 09:58:12,566 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@81a3c9b - Propagating ERROR level on Logger[org.apache.catalina.util.LifecycleBase] onto the JUL framework 09:58:12,566 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.coyote.http11.Http11NioProtocol] to WARN 09:58:12,566 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@81a3c9b - Propagating WARN level on Logger[org.apache.coyote.http11.Http11NioProtocol] onto the JUL framework 09:58:12,566 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.sshd.common.util.SecurityUtils] to WARN 09:58:12,566 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@81a3c9b - Propagating WARN level on Logger[org.apache.sshd.common.util.SecurityUtils] onto the JUL framework 09:58:12,566 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.tomcat.util.net.NioSelectorPool] to WARN 09:58:12,566 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@81a3c9b - Propagating WARN level on Logger[org.apache.tomcat.util.net.NioSelectorPool] onto the JUL framework 09:58:12,566 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.eclipse.jetty.util.component.AbstractLifeCycle] to ERROR 09:58:12,566 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@81a3c9b - Propagating ERROR level on Logger[org.eclipse.jetty.util.component.AbstractLifeCycle] onto the JUL framework 09:58:12,566 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.hibernate.validator.internal.util.Version] to WARN 09:58:12,566 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@81a3c9b - Propagating WARN level on 
Logger[org.hibernate.validator.internal.util.Version] onto the JUL framework 09:58:12,566 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.springframework.boot.actuate.endpoint.jmx] to WARN 09:58:12,566 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@81a3c9b - Propagating WARN level on Logger[org.springframework.boot.actuate.endpoint.jmx] onto the JUL framework 09:58:12,567 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [CONSOLE] 09:58:12,567 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender] 09:58:12,573 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property 09:58:12,592 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [jsonConsole] 09:58:12,592 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender] 09:58:12,599 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [net.logstash.logback.composite.loggingevent.LoggingEventJsonProviders] for [providers] property 09:58:12,606 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [net.logstash.logback.composite.loggingevent.LoggingEventPatternJsonProvider] for [pattern] property 09:58:13,096 |-INFO in ch.qos.logback.classic.pattern.DateConverter@2234ab2e - Setting zoneId to "UTC" 09:58:13,106 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [asyncConsole] 09:58:13,106 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.classic.AsyncAppender] 09:58:13,399 |-INFO in ch.qos.logback.core.model.processor.conditional.IfModelHandler - Condition [property("loggingFormat").equalsIgnoreCase("json")] evaluated to false on line 61 09:58:13,400 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [CONSOLE] to ch.qos.logback.classic.AsyncAppender[asyncConsole] 09:58:13,400 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Attaching appender named [CONSOLE] to AsyncAppender. 09:58:13,401 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Setting discardingThreshold to 51 09:58:13,401 |-INFO in ch.qos.logback.classic.model.processor.RootLoggerModelHandler - Setting level of ROOT logger to INFO 09:58:13,402 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@81a3c9b - Propagating INFO level on Logger[ROOT] onto the JUL framework 09:58:13,402 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [asyncConsole] to Logger[ROOT] 09:58:13,402 |-INFO in ch.qos.logback.core.model.processor.DefaultProcessor@367baf67 - End of configuration. 09:58:13,402 |-INFO in org.springframework.boot.logging.logback.SpringBootJoranConfigurator@7b8b7337 - Registering current configuration as safe fallback point . 
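The logback lines above describe a CONSOLE appender wrapped by an AsyncAppender named asyncConsole (discardingThreshold 51) attached to the ROOT logger at INFO. The project does this through logback-spring.xml; purely as an illustration, a roughly equivalent programmatic setup in Java would be (the pattern string is an assumption, not taken from the real config):

import ch.qos.logback.classic.AsyncAppender;
import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.encoder.PatternLayoutEncoder;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.ConsoleAppender;
import org.slf4j.LoggerFactory;

public class AsyncConsoleLoggingSketch {

    public static void main(String[] args) {
        LoggerContext context = (LoggerContext) LoggerFactory.getILoggerFactory();

        // Plain-text encoder; the real pattern lives in logback-spring.xml.
        PatternLayoutEncoder encoder = new PatternLayoutEncoder();
        encoder.setContext(context);
        encoder.setPattern("%d{ISO8601} %-5level [%thread] %logger{36} : %msg%n");
        encoder.start();

        // "CONSOLE" appender, as processed by AppenderModelHandler above.
        ConsoleAppender<ILoggingEvent> console = new ConsoleAppender<>();
        console.setContext(context);
        console.setName("CONSOLE");
        console.setEncoder(encoder);
        console.start();

        // "asyncConsole" wraps CONSOLE; discardingThreshold 51 matches the log.
        AsyncAppender asyncConsole = new AsyncAppender();
        asyncConsole.setContext(context);
        asyncConsole.setName("asyncConsole");
        asyncConsole.setDiscardingThreshold(51);
        asyncConsole.addAppender(console);
        asyncConsole.start();

        // ROOT at INFO with asyncConsole attached, mirroring RootLoggerModelHandler.
        Logger root = context.getLogger(Logger.ROOT_LOGGER_NAME);
        root.setLevel(Level.INFO);
        root.addAppender(asyncConsole);

        root.info("logging through asyncConsole");
    }
}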
____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \ \\/ ___)| |_)| | | | | || (_| | ) ) ) ) ' |____| .__|_| |_|_| |_\__, | / / / / =========|_|==============|___/=/_/_/_/ :: Spring Boot :: (v3.2.4) 2024-04-18T09:58:13.504Z INFO 7140 --- [ main] uestResponseEventProducerIntegrationSpec : Starting NcmpAsyncRequestResponseEventProducerIntegrationSpec using Java 17.0.6-ea with PID 7140 (started by jenkins in /w/workspace/cps-master-verify-java/cps-ncmp-service) 2024-04-18T09:58:13.505Z INFO 7140 --- [ main] uestResponseEventProducerIntegrationSpec : No active profile set, falling back to 1 default profile: "default" 2024-04-18T09:58:13.676Z INFO 7140 --- [ main] uestResponseEventProducerIntegrationSpec : Started NcmpAsyncRequestResponseEventProducerIntegrationSpec in 1.488 seconds (process running for 39.748) [ERROR] OpenJDK 64-Bit Server VM warning: Sharing is only supported for boot loader classes because bootstrap classpath has been appended 2024-04-18T09:58:14.254Z INFO 7140 --- [ main] o.a.k.clients.consumer.KafkaConsumer : [Consumer clientId=consumer-test-1, groupId=test] Subscribed to topic(s): test-topic 2024-04-18T09:58:14.302Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : Idempotence will be disabled because retries is set to 0. 2024-04-18T09:58:14.303Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : ProducerConfig values: acks = -1 auto.include.jmx.reporter = true batch.size = 16384 bootstrap.servers = [PLAINTEXT://localhost:32768] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = producer-1 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 1 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.adaptive.partitioning.enable = true partitioner.availability.timeout.ms = 0 partitioner.class = null partitioner.ignore.keys = false receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 0 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.connect.timeout.ms = null sasl.login.read.timeout.ms = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.login.retry.backoff.max.ms = 10000 sasl.login.retry.backoff.ms = 100 sasl.mechanism = GSSAPI sasl.oauthbearer.clock.skew.seconds = 30 sasl.oauthbearer.expected.audience = null sasl.oauthbearer.expected.issuer = null sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000 sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000 sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100 sasl.oauthbearer.jwks.endpoint.url = null sasl.oauthbearer.scope.claim.name = scope sasl.oauthbearer.sub.claim.name = sub sasl.oauthbearer.token.endpoint.url = null security.protocol = 
PLAINTEXT security.providers = null send.buffer.bytes = 131072 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.springframework.kafka.support.serializer.JsonSerializer 2024-04-18T09:58:14.334Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : These configurations '[batch-size]' were supplied but are not used yet. 2024-04-18T09:58:14.335Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.6.1 2024-04-18T09:58:14.335Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 5e3c2b738d253ff5 2024-04-18T09:58:14.335Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1713434294334 2024-04-18T09:58:14.852Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Error while fetching metadata with correlation id 1 : {test-topic=LEADER_NOT_AVAILABLE} 2024-04-18T09:58:14.854Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.Metadata : [Producer clientId=producer-1] Cluster ID: m2DXnzv7RLKDzgO21Ohnhw 2024-04-18T09:58:14.983Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Error while fetching metadata with correlation id 3 : {test-topic=LEADER_NOT_AVAILABLE} 2024-04-18T09:58:15.153Z INFO 7140 --- [ main] org.apache.kafka.clients.Metadata : [Consumer clientId=consumer-test-1, groupId=test] Cluster ID: m2DXnzv7RLKDzgO21Ohnhw 2024-04-18T09:58:15.271Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-1, groupId=test] Discovered group coordinator localhost:32768 (id: 2147483646 rack: null) 2024-04-18T09:58:15.279Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-1, groupId=test] (Re-)joining group 2024-04-18T09:58:15.315Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-1, groupId=test] Request joining group due to: need to re-join with the given member-id: consumer-test-1-888a4213-17bf-4748-b157-b1b07f7f1ea2 2024-04-18T09:58:15.316Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-1, groupId=test] Request joining group due to: rebalance failed due to 'The group member needs to have a valid member id before actually entering a consumer group.' 
(MemberIdRequiredException) 2024-04-18T09:58:15.316Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-1, groupId=test] (Re-)joining group 2024-04-18T09:58:15.346Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-1, groupId=test] Successfully joined group with generation Generation{generationId=1, memberId='consumer-test-1-888a4213-17bf-4748-b157-b1b07f7f1ea2', protocol='range'} 2024-04-18T09:58:15.356Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-1, groupId=test] Finished assignment for group at generation 1: {consumer-test-1-888a4213-17bf-4748-b157-b1b07f7f1ea2=Assignment(partitions=[test-topic-0])} 2024-04-18T09:58:15.386Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-1, groupId=test] Successfully synced group in generation Generation{generationId=1, memberId='consumer-test-1-888a4213-17bf-4748-b157-b1b07f7f1ea2', protocol='range'} 2024-04-18T09:58:15.387Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-1, groupId=test] Notifying assignor about the new Assignment(partitions=[test-topic-0]) 2024-04-18T09:58:15.391Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-1, groupId=test] Adding newly assigned partitions: test-topic-0 2024-04-18T09:58:15.407Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-1, groupId=test] Found no committed offset for partition test-topic-0 2024-04-18T09:58:15.425Z INFO 7140 --- [ main] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-test-1, groupId=test] Resetting offset for partition test-topic-0 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:32768 (id: 1 rack: null)], epoch=0}}. 2024-04-18T09:58:15.583Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:15.593Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node -1 disconnected. 2024-04-18T09:58:15.669Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:15.669Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Cancelled in-flight FETCH request with correlation id 13 due to node 1 being disconnected (elapsed time since creation: 190ms, elapsed time since send: 190ms, request timeout: 30000ms) 2024-04-18T09:58:15.669Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node -1 disconnected. 2024-04-18T09:58:15.669Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 2147483646 disconnected. 
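The consumer-group sequence above (join, member-id retry, assignment of test-topic-0, offset reset to earliest) is what a plain KafkaConsumer does on its first poll. A minimal Java sketch with the same key settings as the logged ConsumerConfig (group.id=test, auto.offset.reset=earliest, String deserializers); the hard-coded bootstrap address is only an assumption and would normally come from the running Kafka container:

import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class TestTopicConsumerSketch {

    public static void main(String[] args) {
        Properties props = new Properties();
        // In the real spec this comes from the container's getBootstrapServers().
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "PLAINTEXT://localhost:32768");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "test");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Subscribing triggers the group join / partition assignment seen in the log.
            consumer.subscribe(List.of("test-topic"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(10));
            for (ConsumerRecord<String, String> record : records) {
                System.out.println(record.key() + " -> " + record.value());
            }
        }
    }
}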
2024-04-18T09:58:15.670Z INFO 7140 --- [t-thread | test] o.a.kafka.clients.FetchSessionHandler : [Consumer clientId=consumer-test-1, groupId=test] Error sending fetch request (sessionId=954140020, epoch=1) to node 1: org.apache.kafka.common.errors.DisconnectException: null 2024-04-18T09:58:15.672Z INFO 7140 --- [t-thread | test] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-1, groupId=test] Group coordinator localhost:32768 (id: 2147483646 rack: null) is unavailable or invalid due to cause: coordinator unavailable. isDisconnected: true. Rediscovery will be attempted. 2024-04-18T09:58:15.686Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:15.687Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:15.774Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:15.775Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. [INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 36.077 s - in org.onap.cps.ncmp.api.impl.async.NcmpAsyncRequestResponseEventProducerIntegrationSpec [INFO] Running org.onap.cps.ncmp.api.impl.async.DataOperationEventConsumerSpec 2024-04-18T09:58:15.826Z INFO 7140 --- [ main] t.r.n.o.2.1 : Creating container for image: registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 2024-04-18T09:58:15.840Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:15.840Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:15.866Z INFO 7140 --- [ main] t.r.n.o.2.1 : Container registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 is starting: 497bf225e8bb549b9f6ecea3ef361c8f8b0161f16c3393ff0bd4f35738d357b6 2024-04-18T09:58:15.876Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:15.876Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:16.092Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:16.093Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:16.178Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 
2024-04-18T09:58:16.179Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:16.497Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:16.498Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:16.592Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:16.592Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:17.205Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:17.205Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:17.505Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:17.505Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:18.114Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:18.114Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:18.509Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:18.509Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:19.124Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:19.124Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:19.413Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:19.414Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 
2024-04-18T09:58:20.285Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:20.286Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:20.518Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:20.519Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:21.241Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:21.241Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:21.401Z INFO 7140 --- [ main] t.r.n.o.2.1 : Container registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 started in PT5.575216736S 2024-04-18T09:58:21.406Z INFO 7140 --- [ main] o.a.k.clients.consumer.ConsumerConfig : ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.include.jmx.reporter = true auto.offset.reset = earliest bootstrap.servers = [PLAINTEXT://localhost:32770] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = consumer-test-2 client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = true exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = test group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.connect.timeout.ms = null sasl.login.read.timeout.ms = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.login.retry.backoff.max.ms = 10000 sasl.login.retry.backoff.ms = 100 sasl.mechanism = GSSAPI sasl.oauthbearer.clock.skew.seconds = 30 sasl.oauthbearer.expected.audience = null sasl.oauthbearer.expected.issuer = null 
sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000 sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000 sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100 sasl.oauthbearer.jwks.endpoint.url = null sasl.oauthbearer.scope.claim.name = scope sasl.oauthbearer.sub.claim.name = sub sasl.oauthbearer.token.endpoint.url = null security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 45000 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class io.cloudevents.kafka.CloudEventDeserializer 2024-04-18T09:58:21.411Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.6.1 2024-04-18T09:58:21.412Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 5e3c2b738d253ff5 2024-04-18T09:58:21.412Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1713434301411 09:58:21,448 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Worker thread will flush remaining events before exiting. 09:58:21,449 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Queue flush finished successfully within timeout. 09:58:21,450 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@26c07595 - URL [jar:file:/w/workspace/cps-master-verify-java/cps-service/target/cps-service-3.4.8-SNAPSHOT.jar!/logback-spring.xml] is not of type file 09:58:21,453 |-INFO in ch.qos.logback.core.joran.util.ConfigurationWatchListUtil@7c062778 - Adding [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/defaults.xml] to configuration watch list. 
09:58:21,453 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@26c07595 - URL [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/defaults.xml] is not of type file 09:58:21,456 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word applicationName with class [org.springframework.boot.logging.logback.ApplicationNameConverter] 09:58:21,456 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word clr with class [org.springframework.boot.logging.logback.ColorConverter] 09:58:21,456 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word correlationId with class [org.springframework.boot.logging.logback.CorrelationIdConverter] 09:58:21,456 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word wex with class [org.springframework.boot.logging.logback.WhitespaceThrowableProxyConverter] 09:58:21,456 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word wEx with class [org.springframework.boot.logging.logback.ExtendedWhitespaceThrowableProxyConverter] 09:58:21,457 |-INFO in ch.qos.logback.core.joran.util.ConfigurationWatchListUtil@7c062778 - Adding [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/console-appender.xml] to configuration watch list. 09:58:21,457 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@26c07595 - URL [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/console-appender.xml] is not of type file 09:58:21,460 |-WARN in IfNestedWithinSecondPhaseElementSC - <if> elements cannot be nested within an <appender>, <logger> or <root> element 09:58:21,460 |-WARN in IfNestedWithinSecondPhaseElementSC - See also http://logback.qos.ch/codes.html#nested_if_element 09:58:21,460 |-WARN in IfNestedWithinSecondPhaseElementSC - Element at line 60 contains a nested <if> element at line 61 09:58:21,461 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Registering a new ReconfigureOnChangeTask ReconfigureOnChangeTask(born:1713434301461) 09:58:21,461 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Will scan for changes in [jar:file:/w/workspace/cps-master-verify-java/cps-service/target/cps-service-3.4.8-SNAPSHOT.jar!/logback-spring.xml] 09:58:21,461 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Setting ReconfigureOnChangeTask scanning period to 30 seconds 09:58:21,462 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.catalina.startup.DigesterFactory] to ERROR 09:58:21,462 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@52181f10 - Propagating ERROR level on Logger[org.apache.catalina.startup.DigesterFactory] onto the JUL framework 09:58:21,462 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.catalina.util.LifecycleBase] to ERROR 09:58:21,462 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@52181f10 - Propagating ERROR level on Logger[org.apache.catalina.util.LifecycleBase] onto the JUL framework 09:58:21,462 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger 
[org.apache.coyote.http11.Http11NioProtocol] to WARN 09:58:21,462 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@52181f10 - Propagating WARN level on Logger[org.apache.coyote.http11.Http11NioProtocol] onto the JUL framework 09:58:21,462 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.sshd.common.util.SecurityUtils] to WARN 09:58:21,462 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@52181f10 - Propagating WARN level on Logger[org.apache.sshd.common.util.SecurityUtils] onto the JUL framework 09:58:21,462 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.tomcat.util.net.NioSelectorPool] to WARN 09:58:21,462 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@52181f10 - Propagating WARN level on Logger[org.apache.tomcat.util.net.NioSelectorPool] onto the JUL framework 09:58:21,463 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.eclipse.jetty.util.component.AbstractLifeCycle] to ERROR 09:58:21,463 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@52181f10 - Propagating ERROR level on Logger[org.eclipse.jetty.util.component.AbstractLifeCycle] onto the JUL framework 09:58:21,463 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.hibernate.validator.internal.util.Version] to WARN 09:58:21,463 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@52181f10 - Propagating WARN level on Logger[org.hibernate.validator.internal.util.Version] onto the JUL framework 09:58:21,463 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.springframework.boot.actuate.endpoint.jmx] to WARN 09:58:21,463 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@52181f10 - Propagating WARN level on Logger[org.springframework.boot.actuate.endpoint.jmx] onto the JUL framework 09:58:21,463 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [CONSOLE] 09:58:21,463 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender] 09:58:21,469 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property 09:58:21,473 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [jsonConsole] 09:58:21,473 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender] 09:58:21,473 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [net.logstash.logback.composite.loggingevent.LoggingEventJsonProviders] for [providers] property 09:58:21,473 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [net.logstash.logback.composite.loggingevent.LoggingEventPatternJsonProvider] for [pattern] property 09:58:21,482 |-INFO in ch.qos.logback.classic.pattern.DateConverter@6f8d320c - Setting zoneId to "UTC" 09:58:21,482 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [asyncConsole] 09:58:21,482 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.classic.AsyncAppender] 09:58:21,488 |-INFO in 
ch.qos.logback.core.model.processor.conditional.IfModelHandler - Condition [property("loggingFormat").equalsIgnoreCase("json")] evaluated to false on line 61 09:58:21,488 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [CONSOLE] to ch.qos.logback.classic.AsyncAppender[asyncConsole] 09:58:21,488 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Attaching appender named [CONSOLE] to AsyncAppender. 09:58:21,488 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Setting discardingThreshold to 51 09:58:21,488 |-INFO in ch.qos.logback.classic.model.processor.RootLoggerModelHandler - Setting level of ROOT logger to INFO 09:58:21,488 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@52181f10 - Propagating INFO level on Logger[ROOT] onto the JUL framework 09:58:21,488 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [asyncConsole] to Logger[ROOT] 09:58:21,488 |-INFO in ch.qos.logback.core.model.processor.DefaultProcessor@46daaaab - End of configuration. 09:58:21,488 |-INFO in org.springframework.boot.logging.logback.SpringBootJoranConfigurator@63130338 - Registering current configuration as safe fallback point . ____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \ \\/ ___)| |_)| | | | | || (_| | ) ) ) ) ' |____| .__|_| |_|_| |_\__, | / / / / =========|_|==============|___/=/_/_/_/ :: Spring Boot :: (v3.2.4) 2024-04-18T09:58:21.495Z INFO 7140 --- [ main] c.n.a.i.a.DataOperationEventConsumerSpec : Starting DataOperationEventConsumerSpec using Java 17.0.6-ea with PID 7140 (started by jenkins in /w/workspace/cps-master-verify-java/cps-ncmp-service) 2024-04-18T09:58:21.495Z INFO 7140 --- [ main] c.n.a.i.a.DataOperationEventConsumerSpec : No active profile set, falling back to 1 default profile: "default" 2024-04-18T09:58:21.528Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:21.528Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:21.672Z INFO 7140 --- [ main] c.n.a.i.a.DataOperationEventConsumerSpec : Started DataOperationEventConsumerSpec in 0.257 seconds (process running for 47.744) 2024-04-18T09:58:21.681Z INFO 7140 --- [ main] o.a.k.clients.consumer.KafkaConsumer : [Consumer clientId=consumer-test-2, groupId=test] Subscribed to topic(s): client-topic 2024-04-18T09:58:21.747Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : Idempotence will be disabled because retries is set to 0. 
2024-04-18T09:58:21.747Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : ProducerConfig values: acks = -1 auto.include.jmx.reporter = true batch.size = 16384 bootstrap.servers = [PLAINTEXT://localhost:32770] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = producer-2 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 1 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.adaptive.partitioning.enable = true partitioner.availability.timeout.ms = 0 partitioner.class = null partitioner.ignore.keys = false receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 0 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.connect.timeout.ms = null sasl.login.read.timeout.ms = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.login.retry.backoff.max.ms = 10000 sasl.login.retry.backoff.ms = 100 sasl.mechanism = GSSAPI sasl.oauthbearer.clock.skew.seconds = 30 sasl.oauthbearer.expected.audience = null sasl.oauthbearer.expected.issuer = null sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000 sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000 sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100 sasl.oauthbearer.jwks.endpoint.url = null sasl.oauthbearer.scope.claim.name = scope sasl.oauthbearer.sub.claim.name = sub sasl.oauthbearer.token.endpoint.url = null security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class io.cloudevents.kafka.CloudEventSerializer 2024-04-18T09:58:21.751Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : These configurations '[batch-size]' were supplied but are not used yet. 
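producer-2 above serialises values with io.cloudevents.kafka.CloudEventSerializer and publishes to client-topic. A hedged Java sketch of such a producer follows; the event id/type/source are made-up placeholders (the real test publishes an NCMP data-operation event), and the bootstrap address stands in for the second container's mapped port:

import java.net.URI;
import java.util.Properties;
import java.util.UUID;

import io.cloudevents.CloudEvent;
import io.cloudevents.core.builder.CloudEventBuilder;
import io.cloudevents.kafka.CloudEventSerializer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class ClientTopicProducerSketch {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "PLAINTEXT://localhost:32770");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, CloudEventSerializer.class.getName());
        // retries = 0 matches the logged config and is why idempotence is disabled.
        props.put(ProducerConfig.RETRIES_CONFIG, 0);

        // Placeholder CloudEvent with the three mandatory attributes (id, source, type).
        CloudEvent event = CloudEventBuilder.v1()
                .withId(UUID.randomUUID().toString())
                .withType("example.data.operation.event")        // hypothetical type
                .withSource(URI.create("urn:example:cps-ncmp-test"))
                .build();

        try (KafkaProducer<String, CloudEvent> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("client-topic", "some-key", event));
            producer.flush();
        }
    }
}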
2024-04-18T09:58:21.751Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.6.1 2024-04-18T09:58:21.751Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 5e3c2b738d253ff5 2024-04-18T09:58:21.751Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1713434301751 2024-04-18T09:58:21.862Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Error while fetching metadata with correlation id 1 : {client-topic=LEADER_NOT_AVAILABLE} 2024-04-18T09:58:21.863Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.Metadata : [Producer clientId=producer-2] Cluster ID: O74acv3FSqCAywmFaZQ6tg 2024-04-18T09:58:21.993Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Error while fetching metadata with correlation id 3 : {client-topic=LEADER_NOT_AVAILABLE} 2024-04-18T09:58:22.124Z INFO 7140 --- [ main] org.apache.kafka.clients.Metadata : [Consumer clientId=consumer-test-2, groupId=test] Cluster ID: O74acv3FSqCAywmFaZQ6tg 2024-04-18T09:58:22.186Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:22.186Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:22.241Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-2, groupId=test] Discovered group coordinator localhost:32770 (id: 2147483646 rack: null) 2024-04-18T09:58:22.242Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-2, groupId=test] (Re-)joining group 2024-04-18T09:58:22.262Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-2, groupId=test] Request joining group due to: need to re-join with the given member-id: consumer-test-2-f365ba1c-b66c-49b2-82b4-504c746bab97 2024-04-18T09:58:22.262Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-2, groupId=test] Request joining group due to: rebalance failed due to 'The group member needs to have a valid member id before actually entering a consumer group.' 
(MemberIdRequiredException) 2024-04-18T09:58:22.262Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-2, groupId=test] (Re-)joining group 2024-04-18T09:58:22.287Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-2, groupId=test] Successfully joined group with generation Generation{generationId=1, memberId='consumer-test-2-f365ba1c-b66c-49b2-82b4-504c746bab97', protocol='range'} 2024-04-18T09:58:22.288Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-2, groupId=test] Finished assignment for group at generation 1: {consumer-test-2-f365ba1c-b66c-49b2-82b4-504c746bab97=Assignment(partitions=[client-topic-0])} 2024-04-18T09:58:22.305Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-2, groupId=test] Successfully synced group in generation Generation{generationId=1, memberId='consumer-test-2-f365ba1c-b66c-49b2-82b4-504c746bab97', protocol='range'} 2024-04-18T09:58:22.305Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-2, groupId=test] Notifying assignor about the new Assignment(partitions=[client-topic-0]) 2024-04-18T09:58:22.305Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-2, groupId=test] Adding newly assigned partitions: client-topic-0 2024-04-18T09:58:22.315Z INFO 7140 --- [ main] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-2, groupId=test] Found no committed offset for partition client-topic-0 2024-04-18T09:58:22.329Z INFO 7140 --- [ main] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-test-2, groupId=test] Resetting offset for partition client-topic-0 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:32770 (id: 1 rack: null)], epoch=0}}. 
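On the receiving side, consumer-test-2 deserialises client-topic values as CloudEvents (value.deserializer = io.cloudevents.kafka.CloudEventDeserializer). A matching consumer sketch, under the same assumptions as the producer sketch above:

import java.time.Duration;
import java.util.List;
import java.util.Properties;

import io.cloudevents.CloudEvent;
import io.cloudevents.kafka.CloudEventDeserializer;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ClientTopicConsumerSketch {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "PLAINTEXT://localhost:32770");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "test");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, CloudEventDeserializer.class.getName());

        try (KafkaConsumer<String, CloudEvent> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("client-topic"));
            for (ConsumerRecord<String, CloudEvent> record : consumer.poll(Duration.ofSeconds(10))) {
                // Print the CloudEvent attributes rather than the raw payload.
                System.out.println(record.value().getType() + " " + record.value().getId());
            }
        }
    }
}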
2024-04-18T09:58:22.398Z INFO 7140 --- [ main] o.a.k.clients.consumer.ConsumerConfig : ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.include.jmx.reporter = true auto.offset.reset = earliest bootstrap.servers = [PLAINTEXT://localhost:32770] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = consumer-test-3 client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = true exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = test group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.connect.timeout.ms = null sasl.login.read.timeout.ms = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.login.retry.backoff.max.ms = 10000 sasl.login.retry.backoff.ms = 100 sasl.mechanism = GSSAPI sasl.oauthbearer.clock.skew.seconds = 30 sasl.oauthbearer.expected.audience = null sasl.oauthbearer.expected.issuer = null sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000 sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000 sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100 sasl.oauthbearer.jwks.endpoint.url = null sasl.oauthbearer.scope.claim.name = scope sasl.oauthbearer.sub.claim.name = sub sasl.oauthbearer.token.endpoint.url = null security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 45000 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class io.cloudevents.kafka.CloudEventDeserializer 2024-04-18T09:58:22.402Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.6.1 2024-04-18T09:58:22.402Z INFO 7140 --- [ main] 
o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 5e3c2b738d253ff5 2024-04-18T09:58:22.402Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1713434302402 2024-04-18T09:58:22.408Z INFO 7140 --- [ main] o.a.k.clients.consumer.ConsumerConfig : ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.include.jmx.reporter = true auto.offset.reset = earliest bootstrap.servers = [PLAINTEXT://localhost:32770] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = consumer-test-4 client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = true exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = test group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.connect.timeout.ms = null sasl.login.read.timeout.ms = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.login.retry.backoff.max.ms = 10000 sasl.login.retry.backoff.ms = 100 sasl.mechanism = GSSAPI sasl.oauthbearer.clock.skew.seconds = 30 sasl.oauthbearer.expected.audience = null sasl.oauthbearer.expected.issuer = null sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000 sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000 sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100 sasl.oauthbearer.jwks.endpoint.url = null sasl.oauthbearer.scope.claim.name = scope sasl.oauthbearer.sub.claim.name = sub sasl.oauthbearer.token.endpoint.url = null security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 45000 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class 
io.cloudevents.kafka.CloudEventDeserializer 2024-04-18T09:58:22.410Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.6.1 2024-04-18T09:58:22.410Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 5e3c2b738d253ff5 2024-04-18T09:58:22.410Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1713434302410 2024-04-18T09:58:22.475Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:22.476Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node -1 disconnected. 2024-04-18T09:58:22.494Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:22.494Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Cancelled in-flight FETCH request with correlation id 13 due to node 1 being disconnected (elapsed time since creation: 122ms, elapsed time since send: 122ms, request timeout: 30000ms) 2024-04-18T09:58:22.495Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node -1 disconnected. 2024-04-18T09:58:22.495Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 2147483646 disconnected. 2024-04-18T09:58:22.495Z INFO 7140 --- [t-thread | test] o.a.kafka.clients.FetchSessionHandler : [Consumer clientId=consumer-test-2, groupId=test] Error sending fetch request (sessionId=1241872458, epoch=1) to node 1: org.apache.kafka.common.errors.DisconnectException: null 2024-04-18T09:58:22.495Z INFO 7140 --- [t-thread | test] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-test-2, groupId=test] Group coordinator localhost:32770 (id: 2147483646 rack: null) is unavailable or invalid due to cause: coordinator unavailable. isDisconnected: true. Rediscovery will be attempted. 2024-04-18T09:58:22.577Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:22.577Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:22.596Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:22.596Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:22.638Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:22.638Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:22.679Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 
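The two ConsumerConfig dumps above (consumer-test-3 and consumer-test-4) differ only in client.id; both deserialize keys as Strings and values as CloudEvents via io.cloudevents.kafka.CloudEventDeserializer. A hedged sketch of building such a test consumer from those dumped values; the factory method name and the bootstrap-servers parameter are illustrative, not taken from the CPS code base:

import java.util.Map;

import io.cloudevents.CloudEvent;
import io.cloudevents.kafka.CloudEventDeserializer;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class CloudEventTestConsumerSketch {

    // Mirrors the dumped values: group "test", earliest offset reset, String keys, CloudEvent values.
    public static KafkaConsumer<String, CloudEvent> createConsumer(String bootstrapServers) {
        Map<String, Object> config = Map.of(
                ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers,
                ConsumerConfig.GROUP_ID_CONFIG, "test",
                ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest",
                ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class,
                ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, CloudEventDeserializer.class);
        return new KafkaConsumer<>(config);
    }
}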
2024-04-18T09:58:22.679Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. [INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.876 s - in org.onap.cps.ncmp.api.impl.async.DataOperationEventConsumerSpec [INFO] Running org.onap.cps.ncmp.api.impl.async.FilterStrategiesIntegrationSpec 2024-04-18T09:58:22.711Z INFO 7140 --- [ main] t.r.n.o.2.1 : Creating container for image: registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 2024-04-18T09:58:22.756Z INFO 7140 --- [ main] t.r.n.o.2.1 : Container registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 is starting: f5c2a70e9a0403e2ef2b8da0c437a36134cb14e7d134bff11689eb5a3578ac19 2024-04-18T09:58:22.798Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:22.798Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:22.934Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:22.934Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:23.101Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:23.101Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:23.149Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:23.149Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:23.388Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:23.388Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:23.607Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:23.607Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:23.650Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 
2024-04-18T09:58:23.650Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:24.229Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:24.229Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:24.244Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:24.244Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:24.421Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:24.421Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:24.858Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:24.858Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:25.085Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:25.085Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:25.401Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:25.401Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:25.426Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:25.426Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:25.940Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:25.941Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 
2024-04-18T09:58:25.964Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:25.964Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:26.560Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:26.560Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:26.632Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:26.632Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:26.970Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:26.970Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:27.071Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:27.071Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:27.527Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:27.528Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:27.830Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:27.830Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:27.838Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:27.838Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:28.083Z INFO 7140 --- [ main] t.r.n.o.2.1 : Container registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 started in PT5.37176898S 09:58:28,133 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Worker thread will flush remaining events before exiting. 
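The "Creating container for image: registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 ... started in PT5.37176898S" lines come from Testcontainers starting a throw-away Kafka broker for FilterStrategiesIntegrationSpec. A sketch of the equivalent Testcontainers usage, assuming the standard KafkaContainer API; the actual test base class in the CPS code is not shown in this log:

import org.testcontainers.containers.KafkaContainer;
import org.testcontainers.utility.DockerImageName;

public class KafkaTestContainerSketch {

    public static void main(String[] args) {
        // The image is pulled through the ONAP mirror; asCompatibleSubstituteFor tells
        // Testcontainers it behaves like the upstream confluentinc/cp-kafka image.
        DockerImageName image = DockerImageName
                .parse("registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1")
                .asCompatibleSubstituteFor("confluentinc/cp-kafka");

        try (KafkaContainer kafka = new KafkaContainer(image)) {
            kafka.start(); // "Container ... is starting" / "... started in PT..." in the log
            // The broker listens on a random mapped port (32768/32770/32772 in this run),
            // which is why bootstrap.servers differs between the dumped configs.
            String bootstrapServers = kafka.getBootstrapServers();
            System.out.println("Kafka test broker at " + bootstrapServers);
        }
    }
}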
09:58:28,134 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Queue flush finished successfully within timeout. 09:58:28,135 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@fd27d5e - URL [jar:file:/w/workspace/cps-master-verify-java/cps-service/target/cps-service-3.4.8-SNAPSHOT.jar!/logback-spring.xml] is not of type file 09:58:28,138 |-INFO in ch.qos.logback.core.joran.util.ConfigurationWatchListUtil@7c062778 - Adding [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/defaults.xml] to configuration watch list. 09:58:28,138 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@fd27d5e - URL [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/defaults.xml] is not of type file 09:58:28,140 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word applicationName with class [org.springframework.boot.logging.logback.ApplicationNameConverter] 09:58:28,140 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word clr with class [org.springframework.boot.logging.logback.ColorConverter] 09:58:28,140 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word correlationId with class [org.springframework.boot.logging.logback.CorrelationIdConverter] 09:58:28,140 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word wex with class [org.springframework.boot.logging.logback.WhitespaceThrowableProxyConverter] 09:58:28,140 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word wEx with class [org.springframework.boot.logging.logback.ExtendedWhitespaceThrowableProxyConverter] 09:58:28,140 |-INFO in ch.qos.logback.core.joran.util.ConfigurationWatchListUtil@7c062778 - Adding [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/console-appender.xml] to configuration watch list. 
09:58:28,140 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@fd27d5e - URL [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/console-appender.xml] is not of type file 09:58:28,142 |-WARN in IfNestedWithinSecondPhaseElementSC - <if> elements cannot be nested within an <appender>, <logger> or <root> element 09:58:28,142 |-WARN in IfNestedWithinSecondPhaseElementSC - See also http://logback.qos.ch/codes.html#nested_if_element 09:58:28,142 |-WARN in IfNestedWithinSecondPhaseElementSC - Element <appender> at line 60 contains a nested <if> element at line 61 09:58:28,143 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Registering a new ReconfigureOnChangeTask ReconfigureOnChangeTask(born:1713434308143) 09:58:28,143 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Will scan for changes in [jar:file:/w/workspace/cps-master-verify-java/cps-service/target/cps-service-3.4.8-SNAPSHOT.jar!/logback-spring.xml] 09:58:28,143 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Setting ReconfigureOnChangeTask scanning period to 30 seconds 09:58:28,152 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.catalina.startup.DigesterFactory] to ERROR 09:58:28,152 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@3ae58b76 - Propagating ERROR level on Logger[org.apache.catalina.startup.DigesterFactory] onto the JUL framework 09:58:28,152 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.catalina.util.LifecycleBase] to ERROR 09:58:28,152 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@3ae58b76 - Propagating ERROR level on Logger[org.apache.catalina.util.LifecycleBase] onto the JUL framework 09:58:28,152 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.coyote.http11.Http11NioProtocol] to WARN 09:58:28,152 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@3ae58b76 - Propagating WARN level on Logger[org.apache.coyote.http11.Http11NioProtocol] onto the JUL framework 09:58:28,152 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.sshd.common.util.SecurityUtils] to WARN 09:58:28,152 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@3ae58b76 - Propagating WARN level on Logger[org.apache.sshd.common.util.SecurityUtils] onto the JUL framework 09:58:28,152 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.tomcat.util.net.NioSelectorPool] to WARN 09:58:28,152 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@3ae58b76 - Propagating WARN level on Logger[org.apache.tomcat.util.net.NioSelectorPool] onto the JUL framework 09:58:28,152 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.eclipse.jetty.util.component.AbstractLifeCycle] to ERROR 09:58:28,152 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@3ae58b76 - Propagating ERROR level on Logger[org.eclipse.jetty.util.component.AbstractLifeCycle] onto the JUL framework 09:58:28,152 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.hibernate.validator.internal.util.Version] to WARN 09:58:28,152 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@3ae58b76 - Propagating WARN level on
Logger[org.hibernate.validator.internal.util.Version] onto the JUL framework 09:58:28,152 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.springframework.boot.actuate.endpoint.jmx] to WARN 09:58:28,152 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@3ae58b76 - Propagating WARN level on Logger[org.springframework.boot.actuate.endpoint.jmx] onto the JUL framework 09:58:28,152 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [CONSOLE] 09:58:28,152 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender] 09:58:28,153 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property 09:58:28,154 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [jsonConsole] 09:58:28,154 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender] 09:58:28,154 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [net.logstash.logback.composite.loggingevent.LoggingEventJsonProviders] for [providers] property 09:58:28,154 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [net.logstash.logback.composite.loggingevent.LoggingEventPatternJsonProvider] for [pattern] property 09:58:28,160 |-INFO in ch.qos.logback.classic.pattern.DateConverter@81b2f0e - Setting zoneId to "UTC" 09:58:28,161 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [asyncConsole] 09:58:28,161 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.classic.AsyncAppender] 09:58:28,164 |-INFO in ch.qos.logback.core.model.processor.conditional.IfModelHandler - Condition [property("loggingFormat").equalsIgnoreCase("json")] evaluated to false on line 61 09:58:28,164 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [CONSOLE] to ch.qos.logback.classic.AsyncAppender[asyncConsole] 09:58:28,164 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Attaching appender named [CONSOLE] to AsyncAppender. 09:58:28,164 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Setting discardingThreshold to 51 09:58:28,166 |-INFO in ch.qos.logback.classic.model.processor.RootLoggerModelHandler - Setting level of ROOT logger to INFO 09:58:28,166 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@3ae58b76 - Propagating INFO level on Logger[ROOT] onto the JUL framework 09:58:28,166 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [asyncConsole] to Logger[ROOT] 09:58:28,166 |-INFO in ch.qos.logback.core.model.processor.DefaultProcessor@2b843043 - End of configuration. 09:58:28,166 |-INFO in org.springframework.boot.logging.logback.SpringBootJoranConfigurator@37a080e0 - Registering current configuration as safe fallback point . 
[Spring Boot ASCII banner] :: Spring Boot :: (v3.2.4) 2024-04-18T09:58:28.173Z INFO 7140 --- [ main] .n.a.i.a.FilterStrategiesIntegrationSpec : Starting FilterStrategiesIntegrationSpec using Java 17.0.6-ea with PID 7140 (started by jenkins in /w/workspace/cps-master-verify-java/cps-ncmp-service) 2024-04-18T09:58:28.173Z INFO 7140 --- [ main] .n.a.i.a.FilterStrategiesIntegrationSpec : No active profile set, falling back to 1 default profile: "default" 2024-04-18T09:58:28.282Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:28.283Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:28.695Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:28.695Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:28.747Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:28.748Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:28.855Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:28.855Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available.
2024-04-18T09:58:28.914Z INFO 7140 --- [ main] o.a.k.clients.consumer.ConsumerConfig : ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.include.jmx.reporter = true auto.offset.reset = latest bootstrap.servers = [PLAINTEXT://localhost:32772] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = consumer-ncmp-data-operation-event-group-5 client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = ncmp-data-operation-event-group group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.connect.timeout.ms = null sasl.login.read.timeout.ms = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.login.retry.backoff.max.ms = 10000 sasl.login.retry.backoff.ms = 100 sasl.mechanism = GSSAPI sasl.oauthbearer.clock.skew.seconds = 30 sasl.oauthbearer.expected.audience = null sasl.oauthbearer.expected.issuer = null sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000 sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000 sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100 sasl.oauthbearer.jwks.endpoint.url = null sasl.oauthbearer.scope.claim.name = scope sasl.oauthbearer.sub.claim.name = sub sasl.oauthbearer.token.endpoint.url = null security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 45000 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.springframework.kafka.support.serializer.ErrorHandlingDeserializer 2024-04-18T09:58:28.917Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : 
Kafka version: 3.6.1 2024-04-18T09:58:28.918Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 5e3c2b738d253ff5 2024-04-18T09:58:28.918Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1713434308917 2024-04-18T09:58:28.919Z INFO 7140 --- [ main] fkaConsumerFactory$ExtendedKafkaConsumer : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Subscribed to topic(s): ncmp-async-m2m 2024-04-18T09:58:28.933Z INFO 7140 --- [ main] o.a.k.clients.consumer.ConsumerConfig : ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.include.jmx.reporter = true auto.offset.reset = latest bootstrap.servers = [PLAINTEXT://localhost:32772] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = consumer-ncmp-async-rest-request-event-group-6 client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = ncmp-async-rest-request-event-group group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.connect.timeout.ms = null sasl.login.read.timeout.ms = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.login.retry.backoff.max.ms = 10000 sasl.login.retry.backoff.ms = 100 sasl.mechanism = GSSAPI sasl.oauthbearer.clock.skew.seconds = 30 sasl.oauthbearer.expected.audience = null sasl.oauthbearer.expected.issuer = null sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000 sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000 sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100 sasl.oauthbearer.jwks.endpoint.url = null sasl.oauthbearer.scope.claim.name = scope sasl.oauthbearer.sub.claim.name = sub sasl.oauthbearer.token.endpoint.url = null security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 45000 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null 
ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.springframework.kafka.support.serializer.ErrorHandlingDeserializer 2024-04-18T09:58:28.947Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.6.1 2024-04-18T09:58:28.947Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 5e3c2b738d253ff5 2024-04-18T09:58:28.947Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1713434308947 2024-04-18T09:58:28.947Z INFO 7140 --- [ main] fkaConsumerFactory$ExtendedKafkaConsumer : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Subscribed to topic(s): ncmp-async-m2m 2024-04-18T09:58:28.957Z INFO 7140 --- [ main] .n.a.i.a.FilterStrategiesIntegrationSpec : Started FilterStrategiesIntegrationSpec in 0.846 seconds (process running for 55.029) 2024-04-18T09:58:29.035Z WARN 7140 --- [ntainer#1-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Error while fetching metadata with correlation id 2 : {ncmp-async-m2m=UNKNOWN_TOPIC_OR_PARTITION} 2024-04-18T09:58:29.035Z INFO 7140 --- [ntainer#1-0-C-1] org.apache.kafka.clients.Metadata : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Cluster ID: 8n_pjzTGSXyhIyNVr2prVA 2024-04-18T09:58:29.070Z WARN 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Error while fetching metadata with correlation id 2 : {ncmp-async-m2m=LEADER_NOT_AVAILABLE} 2024-04-18T09:58:29.070Z INFO 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.Metadata : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Cluster ID: 8n_pjzTGSXyhIyNVr2prVA 2024-04-18T09:58:29.167Z WARN 7140 --- [ntainer#1-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Error while fetching metadata with correlation id 4 : {ncmp-async-m2m=LEADER_NOT_AVAILABLE} 2024-04-18T09:58:29.199Z WARN 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Error while fetching metadata with correlation id 4 : {ncmp-async-m2m=LEADER_NOT_AVAILABLE} 2024-04-18T09:58:29.278Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Discovered group coordinator localhost:32772 (id: 2147483646 rack: null) 2024-04-18T09:58:29.279Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] (Re-)joining group 2024-04-18T09:58:29.304Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, 
groupId=ncmp-data-operation-event-group] Discovered group coordinator localhost:32772 (id: 2147483646 rack: null) 2024-04-18T09:58:29.306Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Request joining group due to: need to re-join with the given member-id: consumer-ncmp-async-rest-request-event-group-6-ed07ea2f-9625-4f1d-a47a-161624dfa47c 2024-04-18T09:58:29.306Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] (Re-)joining group 2024-04-18T09:58:29.306Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Request joining group due to: rebalance failed due to 'The group member needs to have a valid member id before actually entering a consumer group.' (MemberIdRequiredException) 2024-04-18T09:58:29.306Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] (Re-)joining group 2024-04-18T09:58:29.321Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Request joining group due to: need to re-join with the given member-id: consumer-ncmp-data-operation-event-group-5-6ad0ae06-9138-433c-b228-a2e565abf125 2024-04-18T09:58:29.322Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Request joining group due to: rebalance failed due to 'The group member needs to have a valid member id before actually entering a consumer group.' 
(MemberIdRequiredException) 2024-04-18T09:58:29.322Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] (Re-)joining group 2024-04-18T09:58:29.337Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Successfully joined group with generation Generation{generationId=1, memberId='consumer-ncmp-async-rest-request-event-group-6-ed07ea2f-9625-4f1d-a47a-161624dfa47c', protocol='range'} 2024-04-18T09:58:29.337Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Finished assignment for group at generation 1: {consumer-ncmp-async-rest-request-event-group-6-ed07ea2f-9625-4f1d-a47a-161624dfa47c=Assignment(partitions=[ncmp-async-m2m-0])} 2024-04-18T09:58:29.340Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Successfully joined group with generation Generation{generationId=1, memberId='consumer-ncmp-data-operation-event-group-5-6ad0ae06-9138-433c-b228-a2e565abf125', protocol='range'} 2024-04-18T09:58:29.341Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Finished assignment for group at generation 1: {consumer-ncmp-data-operation-event-group-5-6ad0ae06-9138-433c-b228-a2e565abf125=Assignment(partitions=[ncmp-async-m2m-0])} 2024-04-18T09:58:29.389Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:29.389Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 
2024-04-18T09:58:29.400Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Successfully synced group in generation Generation{generationId=1, memberId='consumer-ncmp-data-operation-event-group-5-6ad0ae06-9138-433c-b228-a2e565abf125', protocol='range'} 2024-04-18T09:58:29.400Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Notifying assignor about the new Assignment(partitions=[ncmp-async-m2m-0]) 2024-04-18T09:58:29.400Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Adding newly assigned partitions: ncmp-async-m2m-0 2024-04-18T09:58:29.402Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Successfully synced group in generation Generation{generationId=1, memberId='consumer-ncmp-async-rest-request-event-group-6-ed07ea2f-9625-4f1d-a47a-161624dfa47c', protocol='range'} 2024-04-18T09:58:29.402Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Notifying assignor about the new Assignment(partitions=[ncmp-async-m2m-0]) 2024-04-18T09:58:29.402Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Adding newly assigned partitions: ncmp-async-m2m-0 2024-04-18T09:58:29.408Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Found no committed offset for partition ncmp-async-m2m-0 2024-04-18T09:58:29.409Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Found no committed offset for partition ncmp-async-m2m-0 2024-04-18T09:58:29.411Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Found no committed offset for partition ncmp-async-m2m-0 2024-04-18T09:58:29.411Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Found no committed offset for partition ncmp-async-m2m-0 2024-04-18T09:58:29.424Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Resetting offset for partition ncmp-async-m2m-0 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:32772 (id: 1 rack: null)], epoch=0}}. 
2024-04-18T09:58:29.428Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Resetting offset for partition ncmp-async-m2m-0 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:32772 (id: 1 rack: null)], epoch=0}}. 2024-04-18T09:58:29.449Z INFO 7140 --- [ntainer#1-0-C-1] o.s.k.l.KafkaMessageListenerContainer : ncmp-async-rest-request-event-group: partitions assigned: [ncmp-async-m2m-0] 2024-04-18T09:58:29.449Z INFO 7140 --- [ntainer#0-0-C-1] o.s.k.l.KafkaMessageListenerContainer : ncmp-data-operation-event-group: partitions assigned: [ncmp-async-m2m-0] 2024-04-18T09:58:29.479Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : Idempotence will be disabled because retries is set to 0. 2024-04-18T09:58:29.480Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : ProducerConfig values: acks = -1 auto.include.jmx.reporter = true batch.size = 16384 bootstrap.servers = [PLAINTEXT://localhost:32772] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = producer-3 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 1 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.adaptive.partitioning.enable = true partitioner.availability.timeout.ms = 0 partitioner.class = null partitioner.ignore.keys = false receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 0 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.connect.timeout.ms = null sasl.login.read.timeout.ms = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.login.retry.backoff.max.ms = 10000 sasl.login.retry.backoff.ms = 100 sasl.mechanism = GSSAPI sasl.oauthbearer.clock.skew.seconds = 30 sasl.oauthbearer.expected.audience = null sasl.oauthbearer.expected.issuer = null sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000 sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000 sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100 sasl.oauthbearer.jwks.endpoint.url = null sasl.oauthbearer.scope.claim.name = scope sasl.oauthbearer.sub.claim.name = sub sasl.oauthbearer.token.endpoint.url = null security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null 
ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class io.cloudevents.kafka.CloudEventSerializer 2024-04-18T09:58:29.484Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : These configurations '[batch-size]' were supplied but are not used yet. 2024-04-18T09:58:29.485Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.6.1 2024-04-18T09:58:29.485Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 5e3c2b738d253ff5 2024-04-18T09:58:29.485Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1713434309484 2024-04-18T09:58:29.492Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.Metadata : [Producer clientId=producer-3] Cluster ID: 8n_pjzTGSXyhIyNVr2prVA 2024-04-18T09:58:29.657Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:29.657Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:29.709Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:29.709Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:29.761Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:29.761Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:29.806Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : Idempotence will be disabled because retries is set to 0. 
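The producer-3 dump above uses retries = 0 (hence "Idempotence will be disabled because retries is set to 0.") and a CloudEventSerializer for values, and the client reports an unused key '[batch-size]', which likely means a dash-separated property reached the client instead of Kafka's native batch.size. A hedged sketch of equivalent native producer settings; the factory method and parameter names are illustrative, not taken from the CPS code base:

import java.util.HashMap;
import java.util.Map;

import io.cloudevents.CloudEvent;
import io.cloudevents.kafka.CloudEventSerializer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

public class CloudEventTestProducerSketch {

    public static KafkaProducer<String, CloudEvent> createProducer(String bootstrapServers) {
        Map<String, Object> config = new HashMap<>();
        config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        // retries=0 is what triggers "Idempotence will be disabled because retries is set to 0."
        config.put(ProducerConfig.RETRIES_CONFIG, 0);
        // Native key is "batch.size"; an unrecognised key such as "batch-size" is reported as
        // "These configurations '[batch-size]' were supplied but are not used yet."
        config.put(ProducerConfig.BATCH_SIZE_CONFIG, 16384);
        config.put(ProducerConfig.LINGER_MS_CONFIG, 1);
        config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, CloudEventSerializer.class);
        return new KafkaProducer<>(config);
    }
}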
2024-04-18T09:58:29.806Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : ProducerConfig values: acks = -1 auto.include.jmx.reporter = true batch.size = 16384 bootstrap.servers = [PLAINTEXT://localhost:32772] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = producer-4 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 1 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.adaptive.partitioning.enable = true partitioner.availability.timeout.ms = 0 partitioner.class = null partitioner.ignore.keys = false receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 0 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.connect.timeout.ms = null sasl.login.read.timeout.ms = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.login.retry.backoff.max.ms = 10000 sasl.login.retry.backoff.ms = 100 sasl.mechanism = GSSAPI sasl.oauthbearer.clock.skew.seconds = 30 sasl.oauthbearer.expected.audience = null sasl.oauthbearer.expected.issuer = null sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000 sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000 sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100 sasl.oauthbearer.jwks.endpoint.url = null sasl.oauthbearer.scope.claim.name = scope sasl.oauthbearer.sub.claim.name = sub sasl.oauthbearer.token.endpoint.url = null security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.springframework.kafka.support.serializer.JsonSerializer 2024-04-18T09:58:29.809Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : These configurations '[batch-size]' were supplied but are not used yet. 
2024-04-18T09:58:29.809Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.6.1 2024-04-18T09:58:29.809Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 5e3c2b738d253ff5 2024-04-18T09:58:29.809Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1713434309809 2024-04-18T09:58:29.822Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.Metadata : [Producer clientId=producer-4] Cluster ID: 8n_pjzTGSXyhIyNVr2prVA 2024-04-18T09:58:29.870Z ERROR 7140 --- [ntainer#0-0-C-1] o.s.kafka.listener.DefaultErrorHandler : Backoff FixedBackOff{interval=0, currentAttempts=1, maxAttempts=0} exhausted for ncmp-async-m2m-0@1 org.springframework.kafka.listener.ListenerExecutionFailedException: Listener failed at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.decorateException(KafkaMessageListenerContainer.java:2954) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.checkDeser(KafkaMessageListenerContainer.java:3002) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeOnMessage(KafkaMessageListenerContainer.java:2854) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.lambda$doInvokeRecordListener$55(KafkaMessageListenerContainer.java:2777) at io.micrometer.observation.Observation.observe(Observation.java:499) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java:2776) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeWithRecords(KafkaMessageListenerContainer.java:2625) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java:2511) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:2153) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeIfHaveRecords(KafkaMessageListenerContainer.java:1493) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:1458) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:1328) at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) at java.base/java.lang.Thread.run(Thread.java:833) Caused by: org.springframework.kafka.support.serializer.DeserializationException: failed to deserialize at org.springframework.kafka.support.serializer.SerializationUtils.deserializationException(SerializationUtils.java:158) at org.springframework.kafka.support.serializer.ErrorHandlingDeserializer.deserialize(ErrorHandlingDeserializer.java:218) at org.apache.kafka.common.serialization.Deserializer.deserialize(Deserializer.java:73) at org.apache.kafka.clients.consumer.internals.CompletedFetch.parseRecord(CompletedFetch.java:300) at org.apache.kafka.clients.consumer.internals.CompletedFetch.fetchRecords(CompletedFetch.java:263) at org.apache.kafka.clients.consumer.internals.AbstractFetch.fetchRecords(AbstractFetch.java:340) at org.apache.kafka.clients.consumer.internals.AbstractFetch.collectFetch(AbstractFetch.java:306) at org.apache.kafka.clients.consumer.KafkaConsumer.pollForFetches(KafkaConsumer.java:1262) at 
org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1186) at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1159) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollConsumer(KafkaMessageListenerContainer.java:1664) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doPoll(KafkaMessageListenerContainer.java:1639) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:1437) ... 3 common frames omitted Caused by: io.cloudevents.rw.CloudEventRWException: Could not parse. Unknown encoding. Invalid content type or spec version at io.cloudevents.rw.CloudEventRWException.newUnknownEncodingException(CloudEventRWException.java:201) at io.cloudevents.core.message.impl.MessageUtils.parseStructuredOrBinaryMessage(MessageUtils.java:80) at io.cloudevents.kafka.KafkaMessageFactory.createReader(KafkaMessageFactory.java:65) at io.cloudevents.kafka.CloudEventDeserializer.deserialize(CloudEventDeserializer.java:60) at io.cloudevents.kafka.CloudEventDeserializer.deserialize(CloudEventDeserializer.java:34) at org.springframework.kafka.support.serializer.ErrorHandlingDeserializer.deserialize(ErrorHandlingDeserializer.java:215) ... 14 common frames omitted 2024-04-18T09:58:29.941Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : Idempotence will be disabled because retries is set to 0. 2024-04-18T09:58:29.942Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : ProducerConfig values: acks = -1 auto.include.jmx.reporter = true batch.size = 16384 bootstrap.servers = [PLAINTEXT://localhost:32772] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = producer-5 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 1 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.adaptive.partitioning.enable = true partitioner.availability.timeout.ms = 0 partitioner.class = null partitioner.ignore.keys = false receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 0 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.connect.timeout.ms = null sasl.login.read.timeout.ms = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.login.retry.backoff.max.ms = 10000 sasl.login.retry.backoff.ms = 100 sasl.mechanism = GSSAPI sasl.oauthbearer.clock.skew.seconds = 30 sasl.oauthbearer.expected.audience = null sasl.oauthbearer.expected.issuer = null sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000 sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000 sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100 
sasl.oauthbearer.jwks.endpoint.url = null sasl.oauthbearer.scope.claim.name = scope sasl.oauthbearer.sub.claim.name = sub sasl.oauthbearer.token.endpoint.url = null security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class io.cloudevents.kafka.CloudEventSerializer 2024-04-18T09:58:29.944Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : These configurations '[batch-size]' were supplied but are not used yet. 2024-04-18T09:58:29.945Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.6.1 2024-04-18T09:58:29.945Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 5e3c2b738d253ff5 2024-04-18T09:58:29.945Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1713434309944 2024-04-18T09:58:29.953Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.Metadata : [Producer clientId=producer-5] Cluster ID: 8n_pjzTGSXyhIyNVr2prVA 2024-04-18T09:58:30.268Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : Idempotence will be disabled because retries is set to 0. 
2024-04-18T09:58:30.269Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : ProducerConfig values: acks = -1 auto.include.jmx.reporter = true batch.size = 16384 bootstrap.servers = [PLAINTEXT://localhost:32772] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = producer-6 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 1 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.adaptive.partitioning.enable = true partitioner.availability.timeout.ms = 0 partitioner.class = null partitioner.ignore.keys = false receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 0 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.connect.timeout.ms = null sasl.login.read.timeout.ms = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.login.retry.backoff.max.ms = 10000 sasl.login.retry.backoff.ms = 100 sasl.mechanism = GSSAPI sasl.oauthbearer.clock.skew.seconds = 30 sasl.oauthbearer.expected.audience = null sasl.oauthbearer.expected.issuer = null sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000 sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000 sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100 sasl.oauthbearer.jwks.endpoint.url = null sasl.oauthbearer.scope.claim.name = scope sasl.oauthbearer.sub.claim.name = sub sasl.oauthbearer.token.endpoint.url = null security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class io.cloudevents.kafka.CloudEventSerializer 2024-04-18T09:58:30.272Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : These configurations '[batch-size]' were supplied but are not used yet. 
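(Editor's note, not part of the console output: this producer uses io.cloudevents.kafka.CloudEventSerializer as its value serializer. A minimal sketch of sending a CloudEvent through such a producer follows; the event id, type, source and topic are placeholders and only the bootstrap address and serializer classes mirror the config logged above.)

import java.net.URI;
import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;
import io.cloudevents.CloudEvent;
import io.cloudevents.core.builder.CloudEventBuilder;
import io.cloudevents.kafka.CloudEventSerializer;

public class CloudEventProducerSketch {
    public static void main(String[] args) {
        Map<String, Object> config = new HashMap<>();
        config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "PLAINTEXT://localhost:32772");
        config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, CloudEventSerializer.class);

        // Event attributes below are illustrative placeholders only.
        CloudEvent event = CloudEventBuilder.v1()
            .withId("sample-id")
            .withType("org.example.sample-event")
            .withSource(URI.create("urn:example:producer"))
            .withData("application/json", "{\"status\":\"ok\"}".getBytes())
            .build();

        try (KafkaProducer<String, CloudEvent> producer = new KafkaProducer<>(config)) {
            producer.send(new ProducerRecord<>("some-topic", "key-1", event));
        }
    }
}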
2024-04-18T09:58:30.272Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.6.1 2024-04-18T09:58:30.272Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 5e3c2b738d253ff5 2024-04-18T09:58:30.272Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1713434310272 2024-04-18T09:58:30.281Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.Metadata : [Producer clientId=producer-6] Cluster ID: 8n_pjzTGSXyhIyNVr2prVA 2024-04-18T09:58:30.394Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:30.394Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:30.513Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:30.514Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:30.592Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : Idempotence will be disabled because retries is set to 0. 2024-04-18T09:58:30.592Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : ProducerConfig values: acks = -1 auto.include.jmx.reporter = true batch.size = 16384 bootstrap.servers = [PLAINTEXT://localhost:32772] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = producer-7 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 1 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.adaptive.partitioning.enable = true partitioner.availability.timeout.ms = 0 partitioner.class = null partitioner.ignore.keys = false receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 0 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.connect.timeout.ms = null sasl.login.read.timeout.ms = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.login.retry.backoff.max.ms = 10000 sasl.login.retry.backoff.ms = 100 sasl.mechanism = GSSAPI sasl.oauthbearer.clock.skew.seconds = 30 sasl.oauthbearer.expected.audience = null sasl.oauthbearer.expected.issuer = null sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000 sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000 sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100 sasl.oauthbearer.jwks.endpoint.url = null 
sasl.oauthbearer.scope.claim.name = scope sasl.oauthbearer.sub.claim.name = sub sasl.oauthbearer.token.endpoint.url = null security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class io.cloudevents.kafka.CloudEventSerializer 2024-04-18T09:58:30.595Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : These configurations '[batch-size]' were supplied but are not used yet. 2024-04-18T09:58:30.595Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.6.1 2024-04-18T09:58:30.596Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 5e3c2b738d253ff5 2024-04-18T09:58:30.596Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1713434310595 2024-04-18T09:58:30.601Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.Metadata : [Producer clientId=producer-7] Cluster ID: 8n_pjzTGSXyhIyNVr2prVA 2024-04-18T09:58:30.664Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:30.664Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:30.765Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:30.765Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:30.912Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : Idempotence will be disabled because retries is set to 0. 
2024-04-18T09:58:30.912Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : ProducerConfig values: acks = -1 auto.include.jmx.reporter = true batch.size = 16384 bootstrap.servers = [PLAINTEXT://localhost:32772] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = producer-8 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 1 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.adaptive.partitioning.enable = true partitioner.availability.timeout.ms = 0 partitioner.class = null partitioner.ignore.keys = false receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 0 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.connect.timeout.ms = null sasl.login.read.timeout.ms = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.login.retry.backoff.max.ms = 10000 sasl.login.retry.backoff.ms = 100 sasl.mechanism = GSSAPI sasl.oauthbearer.clock.skew.seconds = 30 sasl.oauthbearer.expected.audience = null sasl.oauthbearer.expected.issuer = null sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000 sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000 sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100 sasl.oauthbearer.jwks.endpoint.url = null sasl.oauthbearer.scope.claim.name = scope sasl.oauthbearer.sub.claim.name = sub sasl.oauthbearer.token.endpoint.url = null security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.springframework.kafka.support.serializer.JsonSerializer 2024-04-18T09:58:30.915Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : These configurations '[batch-size]' were supplied but are not used yet. 
2024-04-18T09:58:30.915Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.6.1 2024-04-18T09:58:30.915Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 5e3c2b738d253ff5 2024-04-18T09:58:30.915Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1713434310915 2024-04-18T09:58:30.921Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.Metadata : [Producer clientId=producer-8] Cluster ID: 8n_pjzTGSXyhIyNVr2prVA 2024-04-18T09:58:30.935Z ERROR 7140 --- [ntainer#1-0-C-1] o.s.kafka.listener.DefaultErrorHandler : Backoff FixedBackOff{interval=0, currentAttempts=1, maxAttempts=0} exhausted for ncmp-async-m2m-0@5 org.springframework.kafka.listener.ListenerExecutionFailedException: Listener method could not be invoked with the incoming message Endpoint handler details: Method [public void org.onap.cps.ncmp.api.impl.async.AsyncRestRequestResponseEventConsumer.consumeAndForward(org.onap.cps.ncmp.event.model.DmiAsyncRequestResponseEvent)] Bean [org.onap.cps.ncmp.api.impl.async.AsyncRestRequestResponseEventConsumer@38514b24] at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.decorateException(KafkaMessageListenerContainer.java:2950) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeOnMessage(KafkaMessageListenerContainer.java:2895) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeOnMessage(KafkaMessageListenerContainer.java:2859) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.lambda$doInvokeRecordListener$55(KafkaMessageListenerContainer.java:2777) at io.micrometer.observation.Observation.observe(Observation.java:499) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java:2776) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeWithRecords(KafkaMessageListenerContainer.java:2625) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java:2511) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:2153) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeIfHaveRecords(KafkaMessageListenerContainer.java:1493) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:1458) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:1328) at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) at java.base/java.lang.Thread.run(Thread.java:833) Suppressed: org.springframework.kafka.listener.ListenerExecutionFailedException: Restored Stack Trace at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.checkAckArg(MessagingMessageListenerAdapter.java:402) at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java:380) at org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:92) at 
org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:53) at org.springframework.kafka.listener.adapter.FilteringMessageListenerAdapter.onMessage(FilteringMessageListenerAdapter.java:73) at org.springframework.kafka.listener.adapter.FilteringMessageListenerAdapter.onMessage(FilteringMessageListenerAdapter.java:37) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeOnMessage(KafkaMessageListenerContainer.java:2881) Caused by: org.springframework.messaging.converter.MessageConversionException: Cannot handle message at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java:380) at org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:92) at org.springframework.kafka.listener.adapter.RecordMessagingMessageListenerAdapter.onMessage(RecordMessagingMessageListenerAdapter.java:53) at org.springframework.kafka.listener.adapter.FilteringMessageListenerAdapter.onMessage(FilteringMessageListenerAdapter.java:73) at org.springframework.kafka.listener.adapter.FilteringMessageListenerAdapter.onMessage(FilteringMessageListenerAdapter.java:37) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeOnMessage(KafkaMessageListenerContainer.java:2881) ... 12 common frames omitted Caused by: org.springframework.messaging.converter.MessageConversionException: Cannot convert from [java.lang.String] to [org.onap.cps.ncmp.event.model.DmiAsyncRequestResponseEvent] for GenericMessage [payload=simple string event, headers={kafka_offset=5, kafka_consumer=org.springframework.kafka.core.DefaultKafkaConsumerFactory$ExtendedKafkaConsumer@48310a76, kafka_timestampType=CREATE_TIME, kafka_receivedPartitionId=0, kafka_receivedTopic=ncmp-async-m2m, kafka_receivedTimestamp=1713434310921, kafka_groupId=ncmp-async-rest-request-event-group}] at org.springframework.messaging.handler.annotation.support.PayloadMethodArgumentResolver.resolveArgument(PayloadMethodArgumentResolver.java:151) at org.springframework.kafka.annotation.KafkaNullAwarePayloadArgumentResolver.resolveArgument(KafkaNullAwarePayloadArgumentResolver.java:48) at org.springframework.messaging.handler.invocation.HandlerMethodArgumentResolverComposite.resolveArgument(HandlerMethodArgumentResolverComposite.java:118) at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.getMethodArgumentValues(InvocableHandlerMethod.java:147) at org.springframework.messaging.handler.invocation.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:115) at org.springframework.kafka.listener.adapter.HandlerAdapter.invoke(HandlerAdapter.java:56) at org.springframework.kafka.listener.adapter.MessagingMessageListenerAdapter.invokeHandler(MessagingMessageListenerAdapter.java:376) ... 
17 common frames omitted 2024-04-18T09:58:30.946Z ERROR 7140 --- [ntainer#0-0-C-1] o.s.kafka.listener.DefaultErrorHandler : Backoff FixedBackOff{interval=0, currentAttempts=1, maxAttempts=0} exhausted for ncmp-async-m2m-0@5 org.springframework.kafka.listener.ListenerExecutionFailedException: Listener failed at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.decorateException(KafkaMessageListenerContainer.java:2954) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.checkDeser(KafkaMessageListenerContainer.java:3002) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeOnMessage(KafkaMessageListenerContainer.java:2854) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.lambda$doInvokeRecordListener$55(KafkaMessageListenerContainer.java:2777) at io.micrometer.observation.Observation.observe(Observation.java:499) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java:2776) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeWithRecords(KafkaMessageListenerContainer.java:2625) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java:2511) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:2153) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeIfHaveRecords(KafkaMessageListenerContainer.java:1493) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:1458) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:1328) at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) at java.base/java.lang.Thread.run(Thread.java:833) Caused by: org.springframework.kafka.support.serializer.DeserializationException: failed to deserialize at org.springframework.kafka.support.serializer.SerializationUtils.deserializationException(SerializationUtils.java:158) at org.springframework.kafka.support.serializer.ErrorHandlingDeserializer.deserialize(ErrorHandlingDeserializer.java:218) at org.apache.kafka.common.serialization.Deserializer.deserialize(Deserializer.java:73) at org.apache.kafka.clients.consumer.internals.CompletedFetch.parseRecord(CompletedFetch.java:300) at org.apache.kafka.clients.consumer.internals.CompletedFetch.fetchRecords(CompletedFetch.java:263) at org.apache.kafka.clients.consumer.internals.AbstractFetch.fetchRecords(AbstractFetch.java:340) at org.apache.kafka.clients.consumer.internals.AbstractFetch.collectFetch(AbstractFetch.java:306) at org.apache.kafka.clients.consumer.KafkaConsumer.pollForFetches(KafkaConsumer.java:1262) at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1186) at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1159) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollConsumer(KafkaMessageListenerContainer.java:1664) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doPoll(KafkaMessageListenerContainer.java:1639) at 
org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:1437) ... 3 common frames omitted Caused by: io.cloudevents.rw.CloudEventRWException: Could not parse. Unknown encoding. Invalid content type or spec version at io.cloudevents.rw.CloudEventRWException.newUnknownEncodingException(CloudEventRWException.java:201) at io.cloudevents.core.message.impl.MessageUtils.parseStructuredOrBinaryMessage(MessageUtils.java:80) at io.cloudevents.kafka.KafkaMessageFactory.createReader(KafkaMessageFactory.java:65) at io.cloudevents.kafka.CloudEventDeserializer.deserialize(CloudEventDeserializer.java:60) at io.cloudevents.kafka.CloudEventDeserializer.deserialize(CloudEventDeserializer.java:34) at org.springframework.kafka.support.serializer.ErrorHandlingDeserializer.deserialize(ErrorHandlingDeserializer.java:215) ... 14 common frames omitted 2024-04-18T09:58:31.281Z INFO 7140 --- [ntainer#1-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Node -1 disconnected. 2024-04-18T09:58:31.282Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:31.282Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:31.282Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:31.282Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node -1 disconnected. 2024-04-18T09:58:31.282Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:31.282Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node -1 disconnected. 2024-04-18T09:58:31.282Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node -1 disconnected. 2024-04-18T09:58:31.282Z INFO 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Node 1 disconnected. 2024-04-18T09:58:31.282Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:31.282Z INFO 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Cancelled in-flight FETCH request with correlation id 27 due to node 1 being disconnected (elapsed time since creation: 339ms, elapsed time since send: 339ms, request timeout: 30000ms) 2024-04-18T09:58:31.283Z INFO 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Node -1 disconnected. 2024-04-18T09:58:31.283Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node -1 disconnected. 
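(Editor's note, not part of the console output: the two ERROR traces above show the expected failure path when a record that is not a valid CloudEvent reaches a consumer whose delegate deserializer is CloudEventDeserializer: the delegate throws, ErrorHandlingDeserializer wraps it in a DeserializationException, and the DefaultErrorHandler with a FixedBackOff of maxAttempts=0 gives up after one attempt. A minimal consumer-factory sketch of that wiring is shown below, assuming standard Spring Kafka and CloudEvents APIs; the group id is a placeholder.)

import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import io.cloudevents.kafka.CloudEventDeserializer;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.support.serializer.ErrorHandlingDeserializer;

public class CloudEventConsumerConfigSketch {
    public static DefaultKafkaConsumerFactory<String, Object> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "PLAINTEXT://localhost:32772");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "sample-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        // Wrap the real deserializer so a poison record surfaces as a
        // DeserializationException instead of killing the consumer thread.
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer.class);
        props.put(ErrorHandlingDeserializer.VALUE_DESERIALIZER_CLASS, CloudEventDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }
}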
2024-04-18T09:58:31.283Z INFO 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Node 2147483646 disconnected. 2024-04-18T09:58:31.283Z INFO 7140 --- [ntainer#0-0-C-1] o.a.kafka.clients.FetchSessionHandler : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Error sending fetch request (sessionId=264781322, epoch=6) to node 1: org.apache.kafka.common.errors.DisconnectException: null 2024-04-18T09:58:31.283Z INFO 7140 --- [ion-event-group] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Group coordinator localhost:32772 (id: 2147483646 rack: null) is unavailable or invalid due to cause: coordinator unavailable. isDisconnected: true. Rediscovery will be attempted. 2024-04-18T09:58:31.283Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:31.284Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node -1 disconnected. 2024-04-18T09:58:31.285Z INFO 7140 --- [est-event-group] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Node 1 disconnected. 2024-04-18T09:58:31.285Z INFO 7140 --- [est-event-group] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Cancelled in-flight FETCH request with correlation id 27 due to node 1 being disconnected (elapsed time since creation: 353ms, elapsed time since send: 353ms, request timeout: 30000ms) 2024-04-18T09:58:31.285Z INFO 7140 --- [est-event-group] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Cancelled in-flight METADATA request with correlation id 29 due to node 1 being disconnected (elapsed time since creation: 254ms, elapsed time since send: 254ms, request timeout: 30000ms) 2024-04-18T09:58:31.285Z INFO 7140 --- [est-event-group] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Node 2147483646 disconnected. 2024-04-18T09:58:31.285Z INFO 7140 --- [est-event-group] o.a.kafka.clients.FetchSessionHandler : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Error sending fetch request (sessionId=184214438, epoch=6) to node 1: org.apache.kafka.common.errors.DisconnectException: null 2024-04-18T09:58:31.285Z INFO 7140 --- [est-event-group] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Group coordinator localhost:32772 (id: 2147483646 rack: null) is unavailable or invalid due to cause: coordinator unavailable. isDisconnected: true. Rediscovery will be attempted. 2024-04-18T09:58:31.287Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node -1 disconnected. 2024-04-18T09:58:31.383Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 
2024-04-18T09:58:31.383Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:31.383Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:31.383Z INFO 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Node 1 disconnected. 2024-04-18T09:58:31.383Z WARN 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:31.383Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:31.383Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:31.384Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:31.384Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:31.384Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:31.384Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:31.385Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:31.385Z INFO 7140 --- [ntainer#1-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Node 1 disconnected. 2024-04-18T09:58:31.385Z WARN 7140 --- [ntainer#1-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:31.387Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:31.387Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:31.468Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 
2024-04-18T09:58:31.468Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:31.483Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:31.484Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:31.484Z INFO 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Node 1 disconnected. 2024-04-18T09:58:31.484Z WARN 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:31.485Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:31.485Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:31.488Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:31.488Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:31.498Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:31.498Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 
2024-04-18T09:58:31.527Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Revoke previously assigned partitions ncmp-async-m2m-0 2024-04-18T09:58:31.528Z INFO 7140 --- [ntainer#0-0-C-1] o.s.k.l.KafkaMessageListenerContainer : ncmp-data-operation-event-group: partitions revoked: [ncmp-async-m2m-0] 2024-04-18T09:58:31.529Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Resetting generation and member id due to: consumer pro-actively leaving the group 2024-04-18T09:58:31.529Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Request joining group due to: consumer pro-actively leaving the group 2024-04-18T09:58:31.529Z INFO 7140 --- [ntainer#0-0-C-1] fkaConsumerFactory$ExtendedKafkaConsumer : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Unsubscribed all topics or patterns and assigned partitions 2024-04-18T09:58:31.529Z INFO 7140 --- [est-event-group] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Node 1 disconnected. 2024-04-18T09:58:31.530Z WARN 7140 --- [est-event-group] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 
2024-04-18T09:58:31.530Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Revoke previously assigned partitions ncmp-async-m2m-0 2024-04-18T09:58:31.530Z INFO 7140 --- [ntainer#1-0-C-1] o.s.k.l.KafkaMessageListenerContainer : ncmp-async-rest-request-event-group: partitions revoked: [ncmp-async-m2m-0] 2024-04-18T09:58:31.530Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Resetting generation and member id due to: consumer pro-actively leaving the group 2024-04-18T09:58:31.530Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Request joining group due to: consumer pro-actively leaving the group 2024-04-18T09:58:31.530Z INFO 7140 --- [ntainer#1-0-C-1] fkaConsumerFactory$ExtendedKafkaConsumer : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Unsubscribed all topics or patterns and assigned partitions 2024-04-18T09:58:31.531Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Resetting generation and member id due to: consumer pro-actively leaving the group 2024-04-18T09:58:31.531Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-6, groupId=ncmp-async-rest-request-event-group] Request joining group due to: consumer pro-actively leaving the group 2024-04-18T09:58:31.532Z INFO 7140 --- [ntainer#1-0-C-1] o.apache.kafka.common.metrics.Metrics : Metrics scheduler closed 2024-04-18T09:58:31.533Z INFO 7140 --- [ntainer#1-0-C-1] o.apache.kafka.common.metrics.Metrics : Closing reporter org.apache.kafka.common.metrics.JmxReporter 2024-04-18T09:58:31.533Z INFO 7140 --- [ntainer#1-0-C-1] o.apache.kafka.common.metrics.Metrics : Metrics reporters closed 2024-04-18T09:58:31.534Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:31.534Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:31.534Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:31.534Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:31.534Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:31.534Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 
2024-04-18T09:58:31.535Z INFO 7140 --- [ntainer#1-0-C-1] o.a.kafka.common.utils.AppInfoParser : App info kafka.consumer for consumer-ncmp-async-rest-request-event-group-6 unregistered 2024-04-18T09:58:31.535Z INFO 7140 --- [ntainer#1-0-C-1] o.s.k.l.KafkaMessageListenerContainer : ncmp-async-rest-request-event-group: Consumer stopped 2024-04-18T09:58:31.536Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Resetting generation and member id due to: consumer pro-actively leaving the group 2024-04-18T09:58:31.536Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-5, groupId=ncmp-data-operation-event-group] Request joining group due to: consumer pro-actively leaving the group 2024-04-18T09:58:31.536Z INFO 7140 --- [ntainer#0-0-C-1] o.apache.kafka.common.metrics.Metrics : Metrics scheduler closed 2024-04-18T09:58:31.536Z INFO 7140 --- [ntainer#0-0-C-1] o.apache.kafka.common.metrics.Metrics : Closing reporter org.apache.kafka.common.metrics.JmxReporter 2024-04-18T09:58:31.536Z INFO 7140 --- [ntainer#0-0-C-1] o.apache.kafka.common.metrics.Metrics : Metrics reporters closed 2024-04-18T09:58:31.538Z INFO 7140 --- [ntainer#0-0-C-1] o.a.kafka.common.utils.AppInfoParser : App info kafka.consumer for consumer-ncmp-data-operation-event-group-5 unregistered 2024-04-18T09:58:31.538Z INFO 7140 --- [ntainer#0-0-C-1] o.s.k.l.KafkaMessageListenerContainer : ncmp-data-operation-event-group: Consumer stopped [INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.83 s - in org.onap.cps.ncmp.api.impl.async.FilterStrategiesIntegrationSpec [INFO] Running org.onap.cps.ncmp.api.impl.async.SerializationIntegrationSpec 2024-04-18T09:58:31.548Z INFO 7140 --- [ main] t.r.n.o.2.1 : Creating container for image: registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 2024-04-18T09:58:31.568Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:31.568Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:31.604Z INFO 7140 --- [ main] t.r.n.o.2.1 : Container registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 is starting: bd0752b9b9ed7c44aff0d4d113932067fb502aba5ab115f1825032ead27c7421 2024-04-18T09:58:31.684Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:31.686Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:31.736Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:31.736Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:31.736Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 
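(Editor's note, not part of the console output: the "Creating container for image" line above is Testcontainers starting a fresh cp-kafka broker for the next spec, which is also why the bootstrap servers in these logs move between mapped ports such as 32768, 32770 and 32772. A minimal sketch of starting such a container follows, assuming the org.testcontainers kafka module; only the image name is taken from the log.)

import org.testcontainers.containers.KafkaContainer;
import org.testcontainers.utility.DockerImageName;

public class KafkaTestcontainerSketch {
    public static void main(String[] args) {
        DockerImageName image = DockerImageName
            .parse("registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1")
            .asCompatibleSubstituteFor("confluentinc/cp-kafka");

        try (KafkaContainer kafka = new KafkaContainer(image)) {
            kafka.start();
            // The mapped port differs on every run; tests read it from here.
            System.out.println(kafka.getBootstrapServers());
        }
    }
}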
2024-04-18T09:58:31.736Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:31.737Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:31.737Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:31.738Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:31.738Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:31.739Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:31.739Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:31.869Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:31.869Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:32.088Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:32.088Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:32.088Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:32.089Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:32.138Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:32.138Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:32.189Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:32.189Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:32.191Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 
2024-04-18T09:58:32.191Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:32.191Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:32.191Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:32.535Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:32.535Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:32.602Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:32.602Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:32.676Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:32.676Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:32.954Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:32.954Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:33.000Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:33.000Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:33.012Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:33.012Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:33.044Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:33.044Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:33.062Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 
2024-04-18T09:58:33.062Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:33.063Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:33.063Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:33.076Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:33.076Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:33.395Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:33.395Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:33.605Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:33.605Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:33.738Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:33.738Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:33.908Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:33.908Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:34.005Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:34.005Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:34.048Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:34.049Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:34.079Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 
2024-04-18T09:58:34.079Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:34.116Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:34.116Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:34.167Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:34.167Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:34.217Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:34.217Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:34.501Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:34.501Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:34.809Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:34.809Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:34.846Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:34.846Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:34.920Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:34.920Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:35.082Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:35.082Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:35.126Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 
2024-04-18T09:58:35.126Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:35.185Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:35.185Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:35.185Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:35.185Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:35.204Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:35.204Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:35.233Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:35.233Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:35.355Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:35.355Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:35.751Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:35.751Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:35.785Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:35.785Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:35.912Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:35.912Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:35.990Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 
2024-04-18T09:58:35.990Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:36.159Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:36.159Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:36.237Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:36.237Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:36.254Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:36.254Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:36.298Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:36.298Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:36.396Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:36.396Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:36.510Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:36.510Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:36.705Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:36.705Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:36.841Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:36.841Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:36.852Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 
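The repeating "Node 1 disconnected" / "Broker may not be available" warnings above come from producers and test consumers that are still pointed at broker ports (32768, 32770, 32772) of embedded Kafka brokers that appear to have already been stopped, while the broker for the current test context listens on 32774. A minimal sketch of the usual remedy, closing leftover clients in test cleanup, is shown below; the fields and timeouts are illustrative assumptions, not the project's actual test code.

// Hedged sketch: closing leftover Kafka clients so they stop probing a stopped test broker.
import java.time.Duration;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;

class TestClientCleanupSketch {
    private KafkaProducer<String, String> producer;   // illustrative fields, not the project's
    private KafkaConsumer<String, String> consumer;

    void cleanup() {
        if (producer != null) {
            producer.close(Duration.ofSeconds(5));     // flushes pending records and releases sender threads
        }
        if (consumer != null) {
            consumer.close(Duration.ofSeconds(5));     // leaves the group and stops fetching
        }
    }
}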
2024-04-18T09:58:36.852Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:36.859Z INFO 7140 --- [ main] t.r.n.o.2.1 : Container registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 started in PT5.311421992S 09:58:36,887 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Worker thread will flush remaining events before exiting. 09:58:36,887 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Queue flush finished successfully within timeout. 09:58:36,888 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@78b62985 - URL [jar:file:/w/workspace/cps-master-verify-java/cps-service/target/cps-service-3.4.8-SNAPSHOT.jar!/logback-spring.xml] is not of type file 09:58:36,890 |-INFO in ch.qos.logback.core.joran.util.ConfigurationWatchListUtil@7c062778 - Adding [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/defaults.xml] to configuration watch list. 09:58:36,890 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@78b62985 - URL [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/defaults.xml] is not of type file 09:58:36,892 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word applicationName with class [org.springframework.boot.logging.logback.ApplicationNameConverter] 09:58:36,892 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word clr with class [org.springframework.boot.logging.logback.ColorConverter] 09:58:36,892 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word correlationId with class [org.springframework.boot.logging.logback.CorrelationIdConverter] 09:58:36,892 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word wex with class [org.springframework.boot.logging.logback.WhitespaceThrowableProxyConverter] 09:58:36,892 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word wEx with class [org.springframework.boot.logging.logback.ExtendedWhitespaceThrowableProxyConverter] 09:58:36,892 |-INFO in ch.qos.logback.core.joran.util.ConfigurationWatchListUtil@7c062778 - Adding [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/console-appender.xml] to configuration watch list. 
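The "Container registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1 started in PT5.311421992S" line above is Testcontainers reporting a fresh embedded Kafka broker. A minimal sketch of how such a broker is typically started, assuming the Testcontainers KafkaContainer API and treating the mirrored image as a substitute for confluentinc/cp-kafka, follows; it is illustrative and not the project's actual test fixture.

// Hedged sketch (assumption: Testcontainers' KafkaContainer API); not the project's actual setup.
import org.testcontainers.containers.KafkaContainer;
import org.testcontainers.utility.DockerImageName;

class KafkaContainerSketch {
    public static void main(String[] args) {
        // Use the mirrored Confluent image seen in the log and declare it compatible with confluentinc/cp-kafka.
        DockerImageName image = DockerImageName
                .parse("registry.nordix.org/onaptest/confluentinc/cp-kafka:6.2.1")
                .asCompatibleSubstituteFor("confluentinc/cp-kafka");
        try (KafkaContainer kafka = new KafkaContainer(image)) {
            kafka.start();                                   // broker is mapped to a random host port, e.g. localhost:32774
            System.out.println(kafka.getBootstrapServers()); // value then used as bootstrap.servers in the tests
        }
    }
}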
09:58:36,892 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@78b62985 - URL [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/console-appender.xml] is not of type file 09:58:36,895 |-WARN in IfNestedWithinSecondPhaseElementSC - <if> elements cannot be nested within an <appender>, <logger> or <root> element 09:58:36,895 |-WARN in IfNestedWithinSecondPhaseElementSC - See also http://logback.qos.ch/codes.html#nested_if_element 09:58:36,895 |-WARN in IfNestedWithinSecondPhaseElementSC - Element <appender> at line 60 contains a nested <if> element at line 61 09:58:36,895 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Registering a new ReconfigureOnChangeTask ReconfigureOnChangeTask(born:1713434316895) 09:58:36,895 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Will scan for changes in [jar:file:/w/workspace/cps-master-verify-java/cps-service/target/cps-service-3.4.8-SNAPSHOT.jar!/logback-spring.xml] 09:58:36,895 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Setting ReconfigureOnChangeTask scanning period to 30 seconds 09:58:36,896 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.catalina.startup.DigesterFactory] to ERROR 09:58:36,896 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@63bbd5b1 - Propagating ERROR level on Logger[org.apache.catalina.startup.DigesterFactory] onto the JUL framework 09:58:36,896 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.catalina.util.LifecycleBase] to ERROR 09:58:36,896 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@63bbd5b1 - Propagating ERROR level on Logger[org.apache.catalina.util.LifecycleBase] onto the JUL framework 09:58:36,896 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.coyote.http11.Http11NioProtocol] to WARN 09:58:36,896 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@63bbd5b1 - Propagating WARN level on Logger[org.apache.coyote.http11.Http11NioProtocol] onto the JUL framework 09:58:36,896 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.sshd.common.util.SecurityUtils] to WARN 09:58:36,896 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@63bbd5b1 - Propagating WARN level on Logger[org.apache.sshd.common.util.SecurityUtils] onto the JUL framework 09:58:36,896 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.tomcat.util.net.NioSelectorPool] to WARN 09:58:36,896 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@63bbd5b1 - Propagating WARN level on Logger[org.apache.tomcat.util.net.NioSelectorPool] onto the JUL framework 09:58:36,896 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.eclipse.jetty.util.component.AbstractLifeCycle] to ERROR 09:58:36,896 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@63bbd5b1 - Propagating ERROR level on Logger[org.eclipse.jetty.util.component.AbstractLifeCycle] onto the JUL framework 09:58:36,896 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.hibernate.validator.internal.util.Version] to WARN 09:58:36,896 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@63bbd5b1 - Propagating WARN level on 
Logger[org.hibernate.validator.internal.util.Version] onto the JUL framework 09:58:36,896 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.springframework.boot.actuate.endpoint.jmx] to WARN 09:58:36,896 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@63bbd5b1 - Propagating WARN level on Logger[org.springframework.boot.actuate.endpoint.jmx] onto the JUL framework 09:58:36,896 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [CONSOLE] 09:58:36,896 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender] 09:58:36,896 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property 09:58:36,897 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [jsonConsole] 09:58:36,897 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender] 09:58:36,897 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [net.logstash.logback.composite.loggingevent.LoggingEventJsonProviders] for [providers] property 09:58:36,897 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [net.logstash.logback.composite.loggingevent.LoggingEventPatternJsonProvider] for [pattern] property 09:58:36,902 |-INFO in ch.qos.logback.classic.pattern.DateConverter@17087df1 - Setting zoneId to "UTC" 09:58:36,902 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [asyncConsole] 09:58:36,902 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.classic.AsyncAppender] 09:58:36,905 |-INFO in ch.qos.logback.core.model.processor.conditional.IfModelHandler - Condition [property("loggingFormat").equalsIgnoreCase("json")] evaluated to false on line 61 09:58:36,905 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [CONSOLE] to ch.qos.logback.classic.AsyncAppender[asyncConsole] 09:58:36,905 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Attaching appender named [CONSOLE] to AsyncAppender. 09:58:36,905 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Setting discardingThreshold to 51 09:58:36,906 |-INFO in ch.qos.logback.classic.model.processor.RootLoggerModelHandler - Setting level of ROOT logger to INFO 09:58:36,906 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@63bbd5b1 - Propagating INFO level on Logger[ROOT] onto the JUL framework 09:58:36,906 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [asyncConsole] to Logger[ROOT] 09:58:36,906 |-INFO in ch.qos.logback.core.model.processor.DefaultProcessor@12a79800 - End of configuration. 09:58:36,906 |-INFO in org.springframework.boot.logging.logback.SpringBootJoranConfigurator@5b9461d0 - Registering current configuration as safe fallback point . 
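The logback status messages above describe the effective logging setup: a CONSOLE appender wrapped by an AsyncAppender named asyncConsole with discardingThreshold 51, attached to the ROOT logger at INFO. The project configures this through logback-spring.xml; the Java sketch below only mirrors those reported settings programmatically, with an illustrative pattern string.

// Hedged sketch of the configuration the status messages describe; the project does this in logback-spring.xml, not in code.
import ch.qos.logback.classic.AsyncAppender;
import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.encoder.PatternLayoutEncoder;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.ConsoleAppender;
import org.slf4j.LoggerFactory;

class AsyncConsoleSketch {
    static void configure() {
        LoggerContext context = (LoggerContext) LoggerFactory.getILoggerFactory();

        PatternLayoutEncoder encoder = new PatternLayoutEncoder();
        encoder.setContext(context);
        encoder.setPattern("%d %-5level [%thread] %logger{36} : %msg%n"); // illustrative pattern
        encoder.start();

        ConsoleAppender<ILoggingEvent> console = new ConsoleAppender<>();
        console.setContext(context);
        console.setName("CONSOLE");
        console.setEncoder(encoder);
        console.start();

        AsyncAppender asyncConsole = new AsyncAppender();
        asyncConsole.setContext(context);
        asyncConsole.setName("asyncConsole");
        asyncConsole.setDiscardingThreshold(51); // value reported in the status output above
        asyncConsole.addAppender(console);
        asyncConsole.start();

        Logger root = context.getLogger(Logger.ROOT_LOGGER_NAME);
        root.setLevel(Level.INFO);
        root.addAppender(asyncConsole);
    }
}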
____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \ \\/ ___)| |_)| | | | | || (_| | ) ) ) ) ' |____| .__|_| |_|_| |_\__, | / / / / =========|_|==============|___/=/_/_/_/ :: Spring Boot :: (v3.2.4) 2024-04-18T09:58:36.910Z INFO 7140 --- [ main] o.c.n.a.i.a.SerializationIntegrationSpec : Starting SerializationIntegrationSpec using Java 17.0.6-ea with PID 7140 (started by jenkins in /w/workspace/cps-master-verify-java/cps-ncmp-service) 2024-04-18T09:58:36.910Z INFO 7140 --- [ main] o.c.n.a.i.a.SerializationIntegrationSpec : No active profile set, falling back to 1 default profile: "default" 2024-04-18T09:58:36.916Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:36.916Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:37.157Z INFO 7140 --- [ main] o.a.k.clients.consumer.ConsumerConfig : ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.include.jmx.reporter = true auto.offset.reset = latest bootstrap.servers = [PLAINTEXT://localhost:32774] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = consumer-ncmp-data-operation-event-group-7 client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = ncmp-data-operation-event-group group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.connect.timeout.ms = null sasl.login.read.timeout.ms = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.login.retry.backoff.max.ms = 10000 sasl.login.retry.backoff.ms = 100 sasl.mechanism = GSSAPI sasl.oauthbearer.clock.skew.seconds = 30 sasl.oauthbearer.expected.audience = null sasl.oauthbearer.expected.issuer = null sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000 sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000 sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100 sasl.oauthbearer.jwks.endpoint.url = null sasl.oauthbearer.scope.claim.name = 
scope sasl.oauthbearer.sub.claim.name = sub sasl.oauthbearer.token.endpoint.url = null security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 45000 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.springframework.kafka.support.serializer.ErrorHandlingDeserializer 2024-04-18T09:58:37.159Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.6.1 2024-04-18T09:58:37.159Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 5e3c2b738d253ff5 2024-04-18T09:58:37.159Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1713434317159 2024-04-18T09:58:37.159Z INFO 7140 --- [ main] fkaConsumerFactory$ExtendedKafkaConsumer : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Subscribed to topic(s): ncmp-async-m2m 2024-04-18T09:58:37.160Z INFO 7140 --- [ main] o.a.k.clients.consumer.ConsumerConfig : ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.include.jmx.reporter = true auto.offset.reset = latest bootstrap.servers = [PLAINTEXT://localhost:32774] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = consumer-ncmp-async-rest-request-event-group-8 client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = ncmp-async-rest-request-event-group group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 500 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor, class org.apache.kafka.clients.consumer.CooperativeStickyAssignor] receive.buffer.bytes = 65536 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.connect.timeout.ms = null sasl.login.read.timeout.ms = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 
sasl.login.refresh.window.jitter = 0.05 sasl.login.retry.backoff.max.ms = 10000 sasl.login.retry.backoff.ms = 100 sasl.mechanism = GSSAPI sasl.oauthbearer.clock.skew.seconds = 30 sasl.oauthbearer.expected.audience = null sasl.oauthbearer.expected.issuer = null sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000 sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000 sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100 sasl.oauthbearer.jwks.endpoint.url = null sasl.oauthbearer.scope.claim.name = scope sasl.oauthbearer.sub.claim.name = sub sasl.oauthbearer.token.endpoint.url = null security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 45000 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.springframework.kafka.support.serializer.ErrorHandlingDeserializer 2024-04-18T09:58:37.168Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.6.1 2024-04-18T09:58:37.168Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 5e3c2b738d253ff5 2024-04-18T09:58:37.168Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1713434317167 2024-04-18T09:58:37.168Z INFO 7140 --- [ main] fkaConsumerFactory$ExtendedKafkaConsumer : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Subscribed to topic(s): ncmp-async-m2m 2024-04-18T09:58:37.169Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:37.169Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 
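The two ConsumerConfig dumps above show consumers in the ncmp-data-operation-event-group and ncmp-async-rest-request-event-group groups subscribing to ncmp-async-m2m, with Spring's ErrorHandlingDeserializer as the value deserializer. A hedged sketch of equivalent plain-client properties follows; the CloudEventDeserializer delegate is an assumption taken from the stack trace further down, not from the dump itself.

// Hedged sketch of consumer properties matching the dumped ConsumerConfig values; not copied from the project's code.
import java.util.List;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.kafka.support.serializer.ErrorHandlingDeserializer;
import io.cloudevents.CloudEvent;
import io.cloudevents.kafka.CloudEventDeserializer;

class NcmpConsumerSketch {
    static KafkaConsumer<String, CloudEvent> build(String bootstrapServers) {
        Map<String, Object> props = Map.of(
                ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers,          // e.g. PLAINTEXT://localhost:32774
                ConsumerConfig.GROUP_ID_CONFIG, "ncmp-data-operation-event-group",
                ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false,
                ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "latest",
                ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class,
                // ErrorHandlingDeserializer wraps the real deserializer so a poison record
                // surfaces as a DeserializationException instead of killing the poll loop.
                ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer.class,
                ErrorHandlingDeserializer.VALUE_DESERIALIZER_CLASS, CloudEventDeserializer.class);
        KafkaConsumer<String, CloudEvent> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(List.of("ncmp-async-m2m"));
        return consumer;
    }
}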
2024-04-18T09:58:37.171Z INFO 7140 --- [ main] o.c.n.a.i.a.SerializationIntegrationSpec : Started SerializationIntegrationSpec in 0.305 seconds (process running for 63.244) 2024-04-18T09:58:37.236Z WARN 7140 --- [ntainer#1-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Error while fetching metadata with correlation id 2 : {ncmp-async-m2m=UNKNOWN_TOPIC_OR_PARTITION} 2024-04-18T09:58:37.236Z INFO 7140 --- [ntainer#1-0-C-1] org.apache.kafka.clients.Metadata : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Cluster ID: vtNWy151Q3mvPMYUL824eQ 2024-04-18T09:58:37.269Z WARN 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Error while fetching metadata with correlation id 2 : {ncmp-async-m2m=LEADER_NOT_AVAILABLE} 2024-04-18T09:58:37.269Z INFO 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.Metadata : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Cluster ID: vtNWy151Q3mvPMYUL824eQ 2024-04-18T09:58:37.282Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:37.282Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:37.351Z WARN 7140 --- [ntainer#1-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Error while fetching metadata with correlation id 4 : {ncmp-async-m2m=LEADER_NOT_AVAILABLE} 2024-04-18T09:58:37.391Z WARN 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Error while fetching metadata with correlation id 4 : {ncmp-async-m2m=LEADER_NOT_AVAILABLE} 2024-04-18T09:58:37.399Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:37.399Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:37.415Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:37.415Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:37.421Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:37.421Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 
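The UNKNOWN_TOPIC_OR_PARTITION and LEADER_NOT_AVAILABLE warnings above are the usual transient metadata errors while the broker auto-creates ncmp-async-m2m on first use. One way tests avoid that window is to create the topic up front with the Kafka Admin client, as in the sketch below; this is an illustration, not what this build actually does.

// Hedged sketch: pre-creating the topic so the first fetch does not race topic auto-creation.
import java.util.List;
import java.util.Map;
import org.apache.kafka.clients.admin.Admin;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

class TopicSetupSketch {
    static void createTopic(String bootstrapServers) throws Exception {
        try (Admin admin = Admin.create(Map.of(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers))) {
            NewTopic topic = new NewTopic("ncmp-async-m2m", 1, (short) 1); // 1 partition, replication factor 1
            admin.createTopics(List.of(topic)).all().get();                // block until the broker knows the topic
        }
    }
}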
2024-04-18T09:58:37.458Z WARN 7140 --- [ntainer#1-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Error while fetching metadata with correlation id 6 : {ncmp-async-m2m=LEADER_NOT_AVAILABLE} 2024-04-18T09:58:37.467Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Discovered group coordinator localhost:32774 (id: 2147483646 rack: null) 2024-04-18T09:58:37.468Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] (Re-)joining group 2024-04-18T09:58:37.470Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:37.470Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:37.487Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Request joining group due to: need to re-join with the given member-id: consumer-ncmp-async-rest-request-event-group-8-d413a327-b7e7-4705-bf61-ae6c96457a52 2024-04-18T09:58:37.487Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Request joining group due to: rebalance failed due to 'The group member needs to have a valid member id before actually entering a consumer group.' (MemberIdRequiredException) 2024-04-18T09:58:37.487Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] (Re-)joining group 2024-04-18T09:58:37.504Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Discovered group coordinator localhost:32774 (id: 2147483646 rack: null) 2024-04-18T09:58:37.504Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] (Re-)joining group 2024-04-18T09:58:37.510Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Request joining group due to: need to re-join with the given member-id: consumer-ncmp-data-operation-event-group-7-4b0e2ad1-5dce-40b6-abdd-73f4d439f343 2024-04-18T09:58:37.510Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Request joining group due to: rebalance failed due to 'The group member needs to have a valid member id before actually entering a consumer group.' 
(MemberIdRequiredException) 2024-04-18T09:58:37.510Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] (Re-)joining group 2024-04-18T09:58:37.515Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Successfully joined group with generation Generation{generationId=1, memberId='consumer-ncmp-async-rest-request-event-group-8-d413a327-b7e7-4705-bf61-ae6c96457a52', protocol='range'} 2024-04-18T09:58:37.521Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Successfully joined group with generation Generation{generationId=1, memberId='consumer-ncmp-data-operation-event-group-7-4b0e2ad1-5dce-40b6-abdd-73f4d439f343', protocol='range'} 2024-04-18T09:58:37.521Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Finished assignment for group at generation 1: {consumer-ncmp-data-operation-event-group-7-4b0e2ad1-5dce-40b6-abdd-73f4d439f343=Assignment(partitions=[ncmp-async-m2m-0])} 2024-04-18T09:58:37.566Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Finished assignment for group at generation 1: {consumer-ncmp-async-rest-request-event-group-8-d413a327-b7e7-4705-bf61-ae6c96457a52=Assignment(partitions=[ncmp-async-m2m-0])} 2024-04-18T09:58:37.573Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Successfully synced group in generation Generation{generationId=1, memberId='consumer-ncmp-async-rest-request-event-group-8-d413a327-b7e7-4705-bf61-ae6c96457a52', protocol='range'} 2024-04-18T09:58:37.573Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Successfully synced group in generation Generation{generationId=1, memberId='consumer-ncmp-data-operation-event-group-7-4b0e2ad1-5dce-40b6-abdd-73f4d439f343', protocol='range'} 2024-04-18T09:58:37.573Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Notifying assignor about the new Assignment(partitions=[ncmp-async-m2m-0]) 2024-04-18T09:58:37.573Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Notifying assignor about the new Assignment(partitions=[ncmp-async-m2m-0]) 2024-04-18T09:58:37.573Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Adding newly assigned partitions: ncmp-async-m2m-0 2024-04-18T09:58:37.573Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, 
groupId=ncmp-async-rest-request-event-group] Adding newly assigned partitions: ncmp-async-m2m-0 2024-04-18T09:58:37.584Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Found no committed offset for partition ncmp-async-m2m-0 2024-04-18T09:58:37.586Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Found no committed offset for partition ncmp-async-m2m-0 2024-04-18T09:58:37.587Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Found no committed offset for partition ncmp-async-m2m-0 2024-04-18T09:58:37.588Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : Idempotence will be disabled because retries is set to 0. 2024-04-18T09:58:37.588Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : ProducerConfig values: acks = -1 auto.include.jmx.reporter = true batch.size = 16384 bootstrap.servers = [PLAINTEXT://localhost:32774] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = producer-9 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 1 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.adaptive.partitioning.enable = true partitioner.availability.timeout.ms = 0 partitioner.class = null partitioner.ignore.keys = false receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 0 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.connect.timeout.ms = null sasl.login.read.timeout.ms = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.login.retry.backoff.max.ms = 10000 sasl.login.retry.backoff.ms = 100 sasl.mechanism = GSSAPI sasl.oauthbearer.clock.skew.seconds = 30 sasl.oauthbearer.expected.audience = null sasl.oauthbearer.expected.issuer = null sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000 sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000 sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100 sasl.oauthbearer.jwks.endpoint.url = null sasl.oauthbearer.scope.claim.name = scope sasl.oauthbearer.sub.claim.name = sub sasl.oauthbearer.token.endpoint.url = null security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https 
ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class io.cloudevents.kafka.CloudEventSerializer 2024-04-18T09:58:37.591Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : These configurations '[batch-size]' were supplied but are not used yet. 2024-04-18T09:58:37.591Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.6.1 2024-04-18T09:58:37.591Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 5e3c2b738d253ff5 2024-04-18T09:58:37.591Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1713434317591 2024-04-18T09:58:37.592Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Found no committed offset for partition ncmp-async-m2m-0 2024-04-18T09:58:37.601Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Resetting offset for partition ncmp-async-m2m-0 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:32774 (id: 1 rack: null)], epoch=0}}. 2024-04-18T09:58:37.602Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.SubscriptionState : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Resetting offset for partition ncmp-async-m2m-0 to position FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[localhost:32774 (id: 1 rack: null)], epoch=0}}. 2024-04-18T09:58:37.605Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.Metadata : [Producer clientId=producer-9] Cluster ID: vtNWy151Q3mvPMYUL824eQ 2024-04-18T09:58:37.618Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:37.618Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:37.622Z INFO 7140 --- [ntainer#0-0-C-1] o.s.k.l.KafkaMessageListenerContainer : ncmp-data-operation-event-group: partitions assigned: [ncmp-async-m2m-0] 2024-04-18T09:58:37.628Z INFO 7140 --- [ntainer#1-0-C-1] o.s.k.l.KafkaMessageListenerContainer : ncmp-async-rest-request-event-group: partitions assigned: [ncmp-async-m2m-0] 2024-04-18T09:58:37.717Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : Idempotence will be disabled because retries is set to 0. 
2024-04-18T09:58:37.717Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : ProducerConfig values: acks = -1 auto.include.jmx.reporter = true batch.size = 16384 bootstrap.servers = [PLAINTEXT://localhost:32774] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = producer-10 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 1 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.adaptive.partitioning.enable = true partitioner.availability.timeout.ms = 0 partitioner.class = null partitioner.ignore.keys = false receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 0 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.connect.timeout.ms = null sasl.login.read.timeout.ms = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.login.retry.backoff.max.ms = 10000 sasl.login.retry.backoff.ms = 100 sasl.mechanism = GSSAPI sasl.oauthbearer.clock.skew.seconds = 30 sasl.oauthbearer.expected.audience = null sasl.oauthbearer.expected.issuer = null sasl.oauthbearer.jwks.endpoint.refresh.ms = 3600000 sasl.oauthbearer.jwks.endpoint.retry.backoff.max.ms = 10000 sasl.oauthbearer.jwks.endpoint.retry.backoff.ms = 100 sasl.oauthbearer.jwks.endpoint.url = null sasl.oauthbearer.scope.claim.name = scope sasl.oauthbearer.sub.claim.name = sub sasl.oauthbearer.token.endpoint.url = null security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2, TLSv1.3] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.3 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.springframework.kafka.support.serializer.JsonSerializer 2024-04-18T09:58:37.721Z INFO 7140 --- [ main] o.a.k.clients.producer.ProducerConfig : These configurations '[batch-size]' were supplied but are not used yet. 
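The ProducerConfig dumps above show retries = 0 (which is why idempotence is disabled), CloudEventSerializer or JsonSerializer for values, and an unused '[batch-size]' entry, which is how Kafka reports a supplied key that does not match a known config name such as batch.size. A hedged sketch of a producer with the same shape of settings follows; the topic, key and event attributes are illustrative.

// Hedged sketch of a producer mirroring the dumped ProducerConfig values; illustrative only.
import java.net.URI;
import java.util.Map;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;
import io.cloudevents.CloudEvent;
import io.cloudevents.core.builder.CloudEventBuilder;
import io.cloudevents.kafka.CloudEventSerializer;

class CloudEventProducerSketch {
    static void send(String bootstrapServers) {
        Map<String, Object> props = Map.of(
                ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers,
                ProducerConfig.RETRIES_CONFIG, 0,                // retries=0 is why idempotence is reported as disabled
                ProducerConfig.BATCH_SIZE_CONFIG, 16384,         // "batch.size"; a literal "batch-size" key would show up as unused
                ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class,
                ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, CloudEventSerializer.class);

        CloudEvent event = CloudEventBuilder.v1()
                .withId("some-id")                               // illustrative values
                .withType("example.event.type")
                .withSource(URI.create("urn:example:source"))
                .build();

        try (KafkaProducer<String, CloudEvent> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("ncmp-async-m2m", "some-key", event));
        }
    }
}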
2024-04-18T09:58:37.721Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka version: 3.6.1 2024-04-18T09:58:37.721Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: 5e3c2b738d253ff5 2024-04-18T09:58:37.721Z INFO 7140 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1713434317721 2024-04-18T09:58:37.740Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.Metadata : [Producer clientId=producer-10] Cluster ID: vtNWy151Q3mvPMYUL824eQ 2024-04-18T09:58:37.758Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:37.758Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:37.759Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:37.760Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:37.776Z ERROR 7140 --- [ntainer#0-0-C-1] o.s.kafka.listener.DefaultErrorHandler : Backoff FixedBackOff{interval=0, currentAttempts=1, maxAttempts=0} exhausted for ncmp-async-m2m-0@1 org.springframework.kafka.listener.ListenerExecutionFailedException: Listener failed at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.decorateException(KafkaMessageListenerContainer.java:2954) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.checkDeser(KafkaMessageListenerContainer.java:3002) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeOnMessage(KafkaMessageListenerContainer.java:2854) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.lambda$doInvokeRecordListener$55(KafkaMessageListenerContainer.java:2777) at io.micrometer.observation.Observation.observe(Observation.java:499) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeRecordListener(KafkaMessageListenerContainer.java:2776) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doInvokeWithRecords(KafkaMessageListenerContainer.java:2625) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeRecordListener(KafkaMessageListenerContainer.java:2511) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeListener(KafkaMessageListenerContainer.java:2153) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.invokeIfHaveRecords(KafkaMessageListenerContainer.java:1493) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:1458) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.run(KafkaMessageListenerContainer.java:1328) at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804) at java.base/java.lang.Thread.run(Thread.java:833) Caused by: org.springframework.kafka.support.serializer.DeserializationException: failed to deserialize at 
org.springframework.kafka.support.serializer.SerializationUtils.deserializationException(SerializationUtils.java:158) at org.springframework.kafka.support.serializer.ErrorHandlingDeserializer.deserialize(ErrorHandlingDeserializer.java:218) at org.apache.kafka.common.serialization.Deserializer.deserialize(Deserializer.java:73) at org.apache.kafka.clients.consumer.internals.CompletedFetch.parseRecord(CompletedFetch.java:300) at org.apache.kafka.clients.consumer.internals.CompletedFetch.fetchRecords(CompletedFetch.java:263) at org.apache.kafka.clients.consumer.internals.AbstractFetch.fetchRecords(AbstractFetch.java:340) at org.apache.kafka.clients.consumer.internals.AbstractFetch.collectFetch(AbstractFetch.java:306) at org.apache.kafka.clients.consumer.KafkaConsumer.pollForFetches(KafkaConsumer.java:1262) at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1186) at org.apache.kafka.clients.consumer.KafkaConsumer.poll(KafkaConsumer.java:1159) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollConsumer(KafkaMessageListenerContainer.java:1664) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.doPoll(KafkaMessageListenerContainer.java:1639) at org.springframework.kafka.listener.KafkaMessageListenerContainer$ListenerConsumer.pollAndInvoke(KafkaMessageListenerContainer.java:1437) ... 3 common frames omitted Caused by: io.cloudevents.rw.CloudEventRWException: Could not parse. Unknown encoding. Invalid content type or spec version at io.cloudevents.rw.CloudEventRWException.newUnknownEncodingException(CloudEventRWException.java:201) at io.cloudevents.core.message.impl.MessageUtils.parseStructuredOrBinaryMessage(MessageUtils.java:80) at io.cloudevents.kafka.KafkaMessageFactory.createReader(KafkaMessageFactory.java:65) at io.cloudevents.kafka.CloudEventDeserializer.deserialize(CloudEventDeserializer.java:60) at io.cloudevents.kafka.CloudEventDeserializer.deserialize(CloudEventDeserializer.java:34) at org.springframework.kafka.support.serializer.ErrorHandlingDeserializer.deserialize(ErrorHandlingDeserializer.java:215) ... 14 common frames omitted 2024-04-18T09:58:37.915Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:37.915Z INFO 7140 --- [ntainer#1-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Node 2147483646 disconnected. 2024-04-18T09:58:37.915Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node -1 disconnected. 2024-04-18T09:58:37.915Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected. 2024-04-18T09:58:37.916Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node -1 disconnected. 2024-04-18T09:58:37.916Z INFO 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Node 1 disconnected. 
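The stack trace above shows a record on ncmp-async-m2m that CloudEventDeserializer cannot parse: the payload was written with JsonSerializer, so it carries neither the structured content type application/cloudevents+json nor binary-mode ce_* headers. ErrorHandlingDeserializer wraps the failure as a DeserializationException, and DefaultErrorHandler stops retrying immediately because of FixedBackOff{interval=0, maxAttempts=0}. A rough sketch of that consumer wiring, using a placeholder broker address and group id rather than the actual CPS settings:

    import java.util.HashMap;
    import java.util.Map;

    import io.cloudevents.kafka.CloudEventDeserializer;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.springframework.kafka.support.serializer.ErrorHandlingDeserializer;

    class CloudEventConsumerPropertiesSketch {
        // Value deserialization is delegated to CloudEventDeserializer via
        // ErrorHandlingDeserializer, so a non-CloudEvent payload surfaces as a
        // DeserializationException instead of failing repeatedly inside poll().
        static Map<String, Object> consumerProperties() {
            final Map<String, Object> properties = new HashMap<>();
            properties.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder address
            properties.put(ConsumerConfig.GROUP_ID_CONFIG, "ncmp-async-m2m-demo");     // hypothetical group id
            properties.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
            properties.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer.class);
            properties.put(ErrorHandlingDeserializer.VALUE_DESERIALIZER_CLASS, CloudEventDeserializer.class);
            return properties;
        }
    }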
2024-04-18T09:58:37.916Z INFO 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Cancelled in-flight FETCH request with correlation id 19 due to node 1 being disconnected (elapsed time since creation: 143ms, elapsed time since send: 143ms, request timeout: 30000ms) 2024-04-18T09:58:37.916Z INFO 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Node -1 disconnected. 2024-04-18T09:58:37.916Z INFO 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Node 2147483646 disconnected. 2024-04-18T09:58:37.916Z INFO 7140 --- [est-event-group] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Node 1 disconnected. 2024-04-18T09:58:37.917Z INFO 7140 --- [est-event-group] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Cancelled in-flight FETCH request with correlation id 20 due to node 1 being disconnected (elapsed time since creation: 139ms, elapsed time since send: 139ms, request timeout: 30000ms) 2024-04-18T09:58:37.917Z INFO 7140 --- [ntainer#0-0-C-1] o.a.kafka.clients.FetchSessionHandler : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Error sending fetch request (sessionId=68292263, epoch=2) to node 1: org.apache.kafka.common.errors.DisconnectException: null 2024-04-18T09:58:37.917Z INFO 7140 --- [est-event-group] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Cancelled in-flight METADATA request with correlation id 22 due to node 1 being disconnected (elapsed time since creation: 40ms, elapsed time since send: 40ms, request timeout: 30000ms) 2024-04-18T09:58:37.917Z INFO 7140 --- [ion-event-group] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Group coordinator localhost:32774 (id: 2147483646 rack: null) is unavailable or invalid due to cause: coordinator unavailable. isDisconnected: true. Rediscovery will be attempted. 2024-04-18T09:58:37.917Z INFO 7140 --- [est-event-group] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Node -1 disconnected. 2024-04-18T09:58:37.917Z INFO 7140 --- [est-event-group] o.a.kafka.clients.FetchSessionHandler : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Error sending fetch request (sessionId=1697430125, epoch=2) to node 1: org.apache.kafka.common.errors.DisconnectException: null 2024-04-18T09:58:37.917Z INFO 7140 --- [est-event-group] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Group coordinator localhost:32774 (id: 2147483646 rack: null) is unavailable or invalid due to cause: coordinator unavailable. isDisconnected: true. Rediscovery will be attempted. 
2024-04-18T09:58:38.016Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:38.016Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected. 2024-04-18T09:58:38.016Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:38.016Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:38.017Z INFO 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Node 1 disconnected. 2024-04-18T09:58:38.017Z WARN 7140 --- [ntainer#0-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:38.017Z INFO 7140 --- [ntainer#1-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Node 1 disconnected. 2024-04-18T09:58:38.017Z WARN 7140 --- [ntainer#1-0-C-1] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 
2024-04-18T09:58:38.094Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Revoke previously assigned partitions ncmp-async-m2m-0 2024-04-18T09:58:38.094Z INFO 7140 --- [ntainer#0-0-C-1] o.s.k.l.KafkaMessageListenerContainer : ncmp-data-operation-event-group: partitions revoked: [ncmp-async-m2m-0] 2024-04-18T09:58:38.094Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Resetting generation and member id due to: consumer pro-actively leaving the group 2024-04-18T09:58:38.094Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Request joining group due to: consumer pro-actively leaving the group 2024-04-18T09:58:38.094Z INFO 7140 --- [ntainer#0-0-C-1] fkaConsumerFactory$ExtendedKafkaConsumer : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Unsubscribed all topics or patterns and assigned partitions 2024-04-18T09:58:38.094Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Resetting generation and member id due to: consumer pro-actively leaving the group 2024-04-18T09:58:38.094Z INFO 7140 --- [ntainer#0-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-data-operation-event-group-7, groupId=ncmp-data-operation-event-group] Request joining group due to: consumer pro-actively leaving the group 2024-04-18T09:58:38.094Z INFO 7140 --- [ntainer#0-0-C-1] o.apache.kafka.common.metrics.Metrics : Metrics scheduler closed 2024-04-18T09:58:38.094Z INFO 7140 --- [ntainer#0-0-C-1] o.apache.kafka.common.metrics.Metrics : Closing reporter org.apache.kafka.common.metrics.JmxReporter 2024-04-18T09:58:38.094Z INFO 7140 --- [ntainer#0-0-C-1] o.apache.kafka.common.metrics.Metrics : Metrics reporters closed 2024-04-18T09:58:38.095Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Revoke previously assigned partitions ncmp-async-m2m-0 2024-04-18T09:58:38.095Z INFO 7140 --- [ntainer#1-0-C-1] o.s.k.l.KafkaMessageListenerContainer : ncmp-async-rest-request-event-group: partitions revoked: [ncmp-async-m2m-0] 2024-04-18T09:58:38.095Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Resetting generation and member id due to: consumer pro-actively leaving the group 2024-04-18T09:58:38.095Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Request joining group due to: consumer pro-actively leaving the group 2024-04-18T09:58:38.095Z INFO 7140 --- [ntainer#1-0-C-1] fkaConsumerFactory$ExtendedKafkaConsumer : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Unsubscribed all topics or patterns and assigned partitions 2024-04-18T09:58:38.095Z INFO 7140 --- [ntainer#1-0-C-1] 
o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Resetting generation and member id due to: consumer pro-actively leaving the group 2024-04-18T09:58:38.095Z INFO 7140 --- [ntainer#1-0-C-1] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer-ncmp-async-rest-request-event-group-8, groupId=ncmp-async-rest-request-event-group] Request joining group due to: consumer pro-actively leaving the group 2024-04-18T09:58:38.096Z INFO 7140 --- [ntainer#0-0-C-1] o.a.kafka.common.utils.AppInfoParser : App info kafka.consumer for consumer-ncmp-data-operation-event-group-7 unregistered 2024-04-18T09:58:38.096Z INFO 7140 --- [ntainer#0-0-C-1] o.s.k.l.KafkaMessageListenerContainer : ncmp-data-operation-event-group: Consumer stopped 2024-04-18T09:58:38.096Z INFO 7140 --- [ntainer#1-0-C-1] o.apache.kafka.common.metrics.Metrics : Metrics scheduler closed 2024-04-18T09:58:38.096Z INFO 7140 --- [ntainer#1-0-C-1] o.apache.kafka.common.metrics.Metrics : Closing reporter org.apache.kafka.common.metrics.JmxReporter 2024-04-18T09:58:38.096Z INFO 7140 --- [ntainer#1-0-C-1] o.apache.kafka.common.metrics.Metrics : Metrics reporters closed 2024-04-18T09:58:38.098Z INFO 7140 --- [ntainer#1-0-C-1] o.a.kafka.common.utils.AppInfoParser : App info kafka.consumer for consumer-ncmp-async-rest-request-event-group-8 unregistered 2024-04-18T09:58:38.098Z INFO 7140 --- [ntainer#1-0-C-1] o.s.k.l.KafkaMessageListenerContainer : ncmp-async-rest-request-event-group: Consumer stopped [INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.553 s - in org.onap.cps.ncmp.api.impl.async.SerializationIntegrationSpec [INFO] Running org.onap.cps.ncmp.api.impl.client.DmiRestClientSpec 2024-04-18T09:58:38.117Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:38.117Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:38.123Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:38.123Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:38.129Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:38.129Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:38.171Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected. 2024-04-18T09:58:38.171Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:38.253Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 
2024-04-18T09:58:38.253Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 09:58:38,284 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Worker thread will flush remaining events before exiting. 09:58:38,284 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Queue flush finished successfully within timeout. 09:58:38,285 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@716e729b - URL [jar:file:/w/workspace/cps-master-verify-java/cps-service/target/cps-service-3.4.8-SNAPSHOT.jar!/logback-spring.xml] is not of type file 09:58:38,287 |-INFO in ch.qos.logback.core.joran.util.ConfigurationWatchListUtil@7c062778 - Adding [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/defaults.xml] to configuration watch list. 09:58:38,287 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@716e729b - URL [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/defaults.xml] is not of type file 09:58:38,289 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word applicationName with class [org.springframework.boot.logging.logback.ApplicationNameConverter] 09:58:38,289 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word clr with class [org.springframework.boot.logging.logback.ColorConverter] 09:58:38,289 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word correlationId with class [org.springframework.boot.logging.logback.CorrelationIdConverter] 09:58:38,289 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word wex with class [org.springframework.boot.logging.logback.WhitespaceThrowableProxyConverter] 09:58:38,289 |-INFO in ch.qos.logback.core.joran.action.ConversionRuleAction - registering conversion word wEx with class [org.springframework.boot.logging.logback.ExtendedWhitespaceThrowableProxyConverter] 09:58:38,289 |-INFO in ch.qos.logback.core.joran.util.ConfigurationWatchListUtil@7c062778 - Adding [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/console-appender.xml] to configuration watch list. 
09:58:38,289 |-INFO in ch.qos.logback.core.joran.spi.ConfigurationWatchList@716e729b - URL [jar:file:/home/jenkins/.m2/repository/org/springframework/boot/spring-boot/3.2.4/spring-boot-3.2.4.jar!/org/springframework/boot/logging/logback/console-appender.xml] is not of type file 09:58:38,292 |-WARN in IfNestedWithinSecondPhaseElementSC - <if> elements cannot be nested within an <appender>, <logger> or <root> element 09:58:38,292 |-WARN in IfNestedWithinSecondPhaseElementSC - See also http://logback.qos.ch/codes.html#nested_if_element 09:58:38,292 |-WARN in IfNestedWithinSecondPhaseElementSC - Element <appender> at line 60 contains a nested <if> element at line 61 09:58:38,292 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Registering a new ReconfigureOnChangeTask ReconfigureOnChangeTask(born:1713434318292) 09:58:38,292 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Will scan for changes in [jar:file:/w/workspace/cps-master-verify-java/cps-service/target/cps-service-3.4.8-SNAPSHOT.jar!/logback-spring.xml] 09:58:38,292 |-INFO in ch.qos.logback.classic.model.processor.ConfigurationModelHandlerFull - Setting ReconfigureOnChangeTask scanning period to 30 seconds 09:58:38,293 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.catalina.startup.DigesterFactory] to ERROR 09:58:38,293 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@a98b4cf - Propagating ERROR level on Logger[org.apache.catalina.startup.DigesterFactory] onto the JUL framework 09:58:38,293 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.catalina.util.LifecycleBase] to ERROR 09:58:38,293 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@a98b4cf - Propagating ERROR level on Logger[org.apache.catalina.util.LifecycleBase] onto the JUL framework 09:58:38,293 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.coyote.http11.Http11NioProtocol] to WARN 09:58:38,293 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@a98b4cf - Propagating WARN level on Logger[org.apache.coyote.http11.Http11NioProtocol] onto the JUL framework 09:58:38,293 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.sshd.common.util.SecurityUtils] to WARN 09:58:38,293 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@a98b4cf - Propagating WARN level on Logger[org.apache.sshd.common.util.SecurityUtils] onto the JUL framework 09:58:38,293 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.apache.tomcat.util.net.NioSelectorPool] to WARN 09:58:38,293 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@a98b4cf - Propagating WARN level on Logger[org.apache.tomcat.util.net.NioSelectorPool] onto the JUL framework 09:58:38,293 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.eclipse.jetty.util.component.AbstractLifeCycle] to ERROR 09:58:38,293 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@a98b4cf - Propagating ERROR level on Logger[org.eclipse.jetty.util.component.AbstractLifeCycle] onto the JUL framework 09:58:38,293 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.hibernate.validator.internal.util.Version] to WARN 09:58:38,293 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@a98b4cf - Propagating WARN level on 
Logger[org.hibernate.validator.internal.util.Version] onto the JUL framework 09:58:38,293 |-INFO in ch.qos.logback.classic.model.processor.LoggerModelHandler - Setting level of logger [org.springframework.boot.actuate.endpoint.jmx] to WARN 09:58:38,293 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@a98b4cf - Propagating WARN level on Logger[org.springframework.boot.actuate.endpoint.jmx] onto the JUL framework 09:58:38,293 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [CONSOLE] 09:58:38,293 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender] 09:58:38,293 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [ch.qos.logback.classic.encoder.PatternLayoutEncoder] for [encoder] property 09:58:38,294 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [jsonConsole] 09:58:38,294 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender] 09:58:38,294 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [net.logstash.logback.composite.loggingevent.LoggingEventJsonProviders] for [providers] property 09:58:38,294 |-INFO in ch.qos.logback.core.model.processor.ImplicitModelHandler - Assuming default type [net.logstash.logback.composite.loggingevent.LoggingEventPatternJsonProvider] for [pattern] property 09:58:38,298 |-INFO in ch.qos.logback.classic.pattern.DateConverter@17c345e3 - Setting zoneId to "UTC" 09:58:38,299 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - Processing appender named [asyncConsole] 09:58:38,299 |-INFO in ch.qos.logback.core.model.processor.AppenderModelHandler - About to instantiate appender of type [ch.qos.logback.classic.AsyncAppender] 09:58:38,302 |-INFO in ch.qos.logback.core.model.processor.conditional.IfModelHandler - Condition [property("loggingFormat").equalsIgnoreCase("json")] evaluated to false on line 61 09:58:38,302 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [CONSOLE] to ch.qos.logback.classic.AsyncAppender[asyncConsole] 09:58:38,302 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Attaching appender named [CONSOLE] to AsyncAppender. 09:58:38,302 |-INFO in ch.qos.logback.classic.AsyncAppender[asyncConsole] - Setting discardingThreshold to 51 09:58:38,302 |-INFO in ch.qos.logback.classic.model.processor.RootLoggerModelHandler - Setting level of ROOT logger to INFO 09:58:38,302 |-INFO in ch.qos.logback.classic.jul.LevelChangePropagator@a98b4cf - Propagating INFO level on Logger[ROOT] onto the JUL framework 09:58:38,302 |-INFO in ch.qos.logback.core.model.processor.AppenderRefModelHandler - Attaching appender named [asyncConsole] to Logger[ROOT] 09:58:38,302 |-INFO in ch.qos.logback.core.model.processor.DefaultProcessor@22a48ae3 - End of configuration. 09:58:38,302 |-INFO in org.springframework.boot.logging.logback.SpringBootJoranConfigurator@20dcce86 - Registering current configuration as safe fallback point . 
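The logback status lines above describe the configuration assembled from logback-spring.xml: a CONSOLE ConsoleAppender with a PatternLayoutEncoder, a jsonConsole appender selected only when loggingFormat is json (the condition evaluated to false here), and an asyncConsole AsyncAppender with discardingThreshold 51 attached to the ROOT logger at INFO. A rough programmatic equivalent of that wiring; the pattern string is illustrative and the conditional jsonConsole branch is omitted:

    import ch.qos.logback.classic.AsyncAppender;
    import ch.qos.logback.classic.Level;
    import ch.qos.logback.classic.Logger;
    import ch.qos.logback.classic.LoggerContext;
    import ch.qos.logback.classic.encoder.PatternLayoutEncoder;
    import ch.qos.logback.classic.spi.ILoggingEvent;
    import ch.qos.logback.core.ConsoleAppender;
    import org.slf4j.LoggerFactory;

    class AsyncConsoleLoggingSketch {
        static void configure() {
            final LoggerContext context = (LoggerContext) LoggerFactory.getILoggerFactory();

            final PatternLayoutEncoder encoder = new PatternLayoutEncoder();
            encoder.setContext(context);
            encoder.setPattern("%d{yyyy-MM-dd'T'HH:mm:ss.SSSXXX} %-5level %logger{36} - %msg%n"); // illustrative pattern
            encoder.start();

            final ConsoleAppender<ILoggingEvent> console = new ConsoleAppender<>();
            console.setContext(context);
            console.setName("CONSOLE");
            console.setEncoder(encoder);
            console.start();

            final AsyncAppender asyncConsole = new AsyncAppender();
            asyncConsole.setContext(context);
            asyncConsole.setName("asyncConsole");
            asyncConsole.setDiscardingThreshold(51); // threshold reported in the status output above
            asyncConsole.addAppender(console);
            asyncConsole.start();

            final Logger rootLogger = context.getLogger(org.slf4j.Logger.ROOT_LOGGER_NAME);
            rootLogger.setLevel(Level.INFO);
            rootLogger.addAppender(asyncConsole);
        }
    }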
____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \ \\/ ___)| |_)| | | | | || (_| | ) ) ) ) ' |____| .__|_| |_|_| |_\__, | / / / / =========|_|==============|___/=/_/_/_/ :: Spring Boot :: (v3.2.4) 2024-04-18T09:58:38.306Z INFO 7140 --- [ main] o.o.c.n.a.impl.client.DmiRestClientSpec : Starting DmiRestClientSpec using Java 17.0.6-ea with PID 7140 (started by jenkins in /w/workspace/cps-master-verify-java/cps-ncmp-service) 2024-04-18T09:58:38.306Z INFO 7140 --- [ main] o.o.c.n.a.impl.client.DmiRestClientSpec : No active profile set, falling back to 1 default profile: "default" 2024-04-18T09:58:38.318Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:38.318Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:38.322Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:38.322Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:38.330Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:38.330Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:38.354Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:38.354Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:38.373Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected. 2024-04-18T09:58:38.373Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:38.410Z INFO 7140 --- [ main] o.o.c.n.a.impl.client.DmiRestClientSpec : Started DmiRestClientSpec in 0.143 seconds (process running for 64.483) 2024-04-18T09:58:38.432Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:38.432Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:38.446Z WARN 7140 --- [ main] o.o.c.n.api.impl.client.DmiRestClient : Failed to retrieve health status from some url. 
Error Message: class org.onap.cps.ncmp.api.impl.client.DmiRestClientSpec$__spock_feature_0_3prov1_closure3 cannot be cast to class com.fasterxml.jackson.databind.JsonNode (org.onap.cps.ncmp.api.impl.client.DmiRestClientSpec$__spock_feature_0_3prov1_closure3 and com.fasterxml.jackson.databind.JsonNode are in unnamed module of loader 'app') [INFO] Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.352 s - in org.onap.cps.ncmp.api.impl.client.DmiRestClientSpec [INFO] Running org.onap.cps.ncmp.api.impl.config.HttpClientConfigurationSpec . ____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \ \\/ ___)| |_)| | | | | || (_| | ) ) ) ) ' |____| .__|_| |_|_| |_\__, | / / / / =========|_|==============|___/=/_/_/_/ :: Spring Boot :: (v3.2.4) 2024-04-18T09:58:38.485Z INFO 7140 --- [ main] .o.c.n.a.i.c.HttpClientConfigurationSpec : Starting HttpClientConfigurationSpec using Java 17.0.6-ea with PID 7140 (started by jenkins in /w/workspace/cps-master-verify-java/cps-ncmp-service) 2024-04-18T09:58:38.486Z INFO 7140 --- [ main] .o.c.n.a.i.c.HttpClientConfigurationSpec : No active profile set, falling back to 1 default profile: "default" 2024-04-18T09:58:38.501Z INFO 7140 --- [ main] .o.c.n.a.i.c.HttpClientConfigurationSpec : Started HttpClientConfigurationSpec in 0.034 seconds (process running for 64.574) [INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.043 s - in org.onap.cps.ncmp.api.impl.config.HttpClientConfigurationSpec [INFO] Running org.onap.cps.ncmp.api.impl.config.NcmpConfigurationSpec . ____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \ \\/ ___)| |_)| | | | | || (_| | ) ) ) ) ' |____| .__|_| |_|_| |_\__, | / / / / =========|_|==============|___/=/_/_/_/ :: Spring Boot :: (v3.2.4) 2024-04-18T09:58:38.531Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:38.531Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:38.531Z INFO 7140 --- [ main] o.o.c.n.a.i.c.NcmpConfigurationSpec : Starting NcmpConfigurationSpec using Java 17.0.6-ea with PID 7140 (started by jenkins in /w/workspace/cps-master-verify-java/cps-ncmp-service) 2024-04-18T09:58:38.531Z INFO 7140 --- [ main] o.o.c.n.a.i.c.NcmpConfigurationSpec : No active profile set, falling back to 1 default profile: "default" 2024-04-18T09:58:38.537Z INFO 7140 --- [ main] o.o.c.n.a.i.c.NcmpConfigurationSpec : Started NcmpConfigurationSpec in 0.023 seconds (process running for 64.61) 2024-04-18T09:58:38.621Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:38.621Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:38.677Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:38.677Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 
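The warning above comes from a negative branch of DmiRestClientSpec: the stubbed response body is a Groovy closure, so casting it to a Jackson JsonNode fails and the client logs the failure rather than propagating it (all 15 tests still pass). For reference, a tiny sketch of reading a health-status field from a real JSON body with Jackson; the class, method and field names are illustrative only, not the CPS DmiRestClient code:

    import com.fasterxml.jackson.databind.JsonNode;
    import com.fasterxml.jackson.databind.ObjectMapper;

    class HealthStatusParsingSketch {
        private static final ObjectMapper OBJECT_MAPPER = new ObjectMapper();

        // Reads a body such as {"status":"UP"} and returns the status text,
        // falling back to an empty string when the body is not valid JSON.
        static String extractStatus(final String responseBody) {
            try {
                final JsonNode rootNode = OBJECT_MAPPER.readTree(responseBody);
                return rootNode.path("status").asText("");
            } catch (final Exception exception) {
                return "";
            }
        }
    }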
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.213 s - in org.onap.cps.ncmp.api.impl.config.NcmpConfigurationSpec [INFO] Running org.onap.cps.ncmp.api.impl.config.embeddedcache.CmNotificationSubscriptionCacheConfigSpec . ____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \ \\/ ___)| |_)| | | | | || (_| | ) ) ) ) ' |____| .__|_| |_|_| |_\__, | / / / / =========|_|==============|___/=/_/_/_/ :: Spring Boot :: (v3.2.4) 2024-04-18T09:58:38.742Z INFO 7140 --- [ main] mNotificationSubscriptionCacheConfigSpec : Starting CmNotificationSubscriptionCacheConfigSpec using Java 17.0.6-ea with PID 7140 (started by jenkins in /w/workspace/cps-master-verify-java/cps-ncmp-service) 2024-04-18T09:58:38.742Z INFO 7140 --- [ main] mNotificationSubscriptionCacheConfigSpec : No active profile set, falling back to 1 default profile: "default" 2024-04-18T09:58:38.776Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected. 2024-04-18T09:58:38.776Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:38.819Z WARN 7140 --- [ main] c.h.i.impl.HazelcastInstanceFactory : Hazelcast is starting in a Java modular environment (Java 9 and newer) but without proper access to required Java packages. Use additional Java arguments to provide Hazelcast access to Java internal API. The internal API access is used to get the best performance results. Arguments to be used: --add-modules java.se --add-exports java.base/jdk.internal.ref=ALL-UNNAMED --add-opens java.base/java.lang=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.management/sun.management=ALL-UNNAMED --add-opens jdk.management/com.sun.management.internal=ALL-UNNAMED 2024-04-18T09:58:38.823Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:38.824Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:38.901Z INFO 7140 --- [ main] com.hazelcast.system.logo : [10.250.0.254]:5701 [cps-and-ncmp-test-caches] [5.3.6] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 2024-04-18T09:58:38.901Z INFO 7140 --- [ main] com.hazelcast.system : [10.250.0.254]:5701 [cps-and-ncmp-test-caches] [5.3.6] Copyright (c) 2008-2023, Hazelcast, Inc. All Rights Reserved. 2024-04-18T09:58:38.901Z INFO 7140 --- [ main] com.hazelcast.system : [10.250.0.254]:5701 [cps-and-ncmp-test-caches] [5.3.6] Hazelcast Platform 5.3.6 (20231109 - 9903dc9) starting at [10.250.0.254]:5701 2024-04-18T09:58:38.901Z INFO 7140 --- [ main] com.hazelcast.system : [10.250.0.254]:5701 [cps-and-ncmp-test-caches] [5.3.6] Cluster name: cps-and-ncmp-test-caches 2024-04-18T09:58:38.901Z INFO 7140 --- [ main] com.hazelcast.system : [10.250.0.254]:5701 [cps-and-ncmp-test-caches] [5.3.6] Integrity Checker is disabled. Fail-fast on corrupted executables will not be performed. For more information, see the documentation for Integrity Checker. 
2024-04-18T09:58:38.904Z INFO 7140 --- [ main] com.hazelcast.system : [10.250.0.254]:5701 [cps-and-ncmp-test-caches] [5.3.6] The Jet engine is disabled. To enable the Jet engine on the members, do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true) - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true (for Hazelcast embedded, works only when loading config via Config.load) - Add environment variable: HZ_JET_ENABLED=true (recommended when running container image. For Hazelcast embedded, works only when loading config via Config.load) 2024-04-18T09:58:39.026Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:39.026Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:39.085Z INFO 7140 --- [ main] c.h.internal.tpc.TpcServerBootstrap : [10.250.0.254]:5701 [cps-and-ncmp-test-caches] [5.3.6] TPC: disabled 2024-04-18T09:58:39.159Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:39.159Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:39.283Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:39.283Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:39.286Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:39.287Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:39.294Z INFO 7140 --- [ main] com.hazelcast.system.security : [10.250.0.254]:5701 [cps-and-ncmp-test-caches] [5.3.6] Enable DEBUG/FINE log level for log category com.hazelcast.system.security or use -Dhazelcast.security.recommendations system property to see 🔒 security recommendations and the status of current config. 2024-04-18T09:58:39.322Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:39.322Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:39.391Z INFO 7140 --- [ main] com.hazelcast.instance.impl.Node : [10.250.0.254]:5701 [cps-and-ncmp-test-caches] [5.3.6] Using Multicast discovery 2024-04-18T09:58:39.395Z WARN 7140 --- [ main] com.hazelcast.cp.CPSubsystem : [10.250.0.254]:5701 [cps-and-ncmp-test-caches] [5.3.6] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! 
Please note that UNSAFE mode will not provide strong consistency guarantees. 2024-04-18T09:58:39.409Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:39.410Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:39.410Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:39.410Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:39.485Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:39.485Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:39.574Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:39.574Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:39.579Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected. 2024-04-18T09:58:39.579Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:39.629Z INFO 7140 --- [ main] c.h.internal.diagnostics.Diagnostics : [10.250.0.254]:5701 [cps-and-ncmp-test-caches] [5.3.6] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 2024-04-18T09:58:39.635Z INFO 7140 --- [ main] com.hazelcast.core.LifecycleService : [10.250.0.254]:5701 [cps-and-ncmp-test-caches] [5.3.6] [10.250.0.254]:5701 is STARTING 2024-04-18T09:58:39.735Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:39.735Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:39.927Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:39.927Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:40.066Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:40.066Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 
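The Hazelcast member above starts with cluster name cps-and-ncmp-test-caches, multicast discovery, the Jet engine disabled and the CP Subsystem in UNSAFE mode. A minimal embedded-member sketch along those lines; the map name is hypothetical, and the commented Jet line only shows the Java-API switch quoted in the startup message:

    import com.hazelcast.config.Config;
    import com.hazelcast.core.Hazelcast;
    import com.hazelcast.core.HazelcastInstance;
    import com.hazelcast.map.IMap;

    class EmbeddedTestCacheSketch {
        static HazelcastInstance startMember() {
            final Config config = new Config();
            config.setClusterName("cps-and-ncmp-test-caches"); // cluster name reported above
            // config.getJetConfig().setEnabled(true);         // Jet switch quoted above; left off here, as in this build
            final HazelcastInstance hazelcastInstance = Hazelcast.newHazelcastInstance(config);
            final IMap<String, String> demoCache = hazelcastInstance.getMap("demo-cache"); // hypothetical map name
            demoCache.put("some-key", "some-value");
            return hazelcastInstance;
        }
    }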
2024-04-18T09:58:40.230Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:40.230Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:40.414Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:40.414Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:40.425Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:40.426Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:40.440Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:40.440Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:40.487Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:40.487Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:40.512Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:40.512Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:40.538Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:40.538Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:40.583Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected. 2024-04-18T09:58:40.583Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:40.628Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:40.628Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 
2024-04-18T09:58:40.689Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:40.689Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:40.832Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:40.832Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:40.919Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:40.919Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:41.232Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:41.232Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:41.365Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:41.365Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:41.428Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:41.428Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:41.435Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected. 2024-04-18T09:58:41.435Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:41.440Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:41.440Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:41.481Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:41.481Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 
2024-04-18T09:58:41.494Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:41.494Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:41.542Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:41.542Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:41.568Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:41.568Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:41.692Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:41.692Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:41.836Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:41.836Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:42.073Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:42.073Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:42.219Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:42.219Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:42.288Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected. 2024-04-18T09:58:42.288Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:42.347Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:42.347Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 
2024-04-18T09:58:42.371Z INFO 7140 --- [ main] c.h.internal.cluster.ClusterService : [10.250.0.254]:5701 [cps-and-ncmp-test-caches] [5.3.6] Members {size:1, ver:1} [ Member [10.250.0.254]:5701 - 8de7e085-0d71-4d9b-a410-9f074e45b83d this ] 2024-04-18T09:58:42.390Z INFO 7140 --- [ main] com.hazelcast.core.LifecycleService : [10.250.0.254]:5701 [cps-and-ncmp-test-caches] [5.3.6] [10.250.0.254]:5701 is STARTED 2024-04-18T09:58:42.425Z INFO 7140 --- [ main] mNotificationSubscriptionCacheConfigSpec : Started CmNotificationSubscriptionCacheConfigSpec in 3.699 seconds (process running for 68.497) 2024-04-18T09:58:42.435Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:42.435Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:42.496Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:42.496Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:42.522Z INFO 7140 --- [ main] c.h.i.p.impl.PartitionStateManager : [10.250.0.254]:5701 [cps-and-ncmp-test-caches] [5.3.6] Initializing cluster partition table arrangement... 2024-04-18T09:58:42.531Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:42.531Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:42.544Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:42.544Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:42.586Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:42.586Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. [INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.946 s - in org.onap.cps.ncmp.api.impl.config.embeddedcache.CmNotificationSubscriptionCacheConfigSpec [INFO] Running org.onap.cps.ncmp.api.impl.config.embeddedcache.SynchronizationCacheConfigSpec . 
____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \ ( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \ \\/ ___)| |_)| | | | | || (_| | ) ) ) ) ' |____| .__|_| |_|_| |_\__, | / / / / =========|_|==============|___/=/_/_/_/ :: Spring Boot :: (v3.2.4) 2024-04-18T09:58:42.692Z INFO 7140 --- [ main] n.a.i.c.e.SynchronizationCacheConfigSpec : Starting SynchronizationCacheConfigSpec using Java 17.0.6-ea with PID 7140 (started by jenkins in /w/workspace/cps-master-verify-java/cps-ncmp-service) 2024-04-18T09:58:42.692Z INFO 7140 --- [ main] n.a.i.c.e.SynchronizationCacheConfigSpec : No active profile set, falling back to 1 default profile: "default" 2024-04-18T09:58:42.724Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:42.724Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:42.730Z INFO 7140 --- [ main] com.hazelcast.system.logo : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 2024-04-18T09:58:42.730Z INFO 7140 --- [ main] com.hazelcast.system : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Copyright (c) 2008-2023, Hazelcast, Inc. All Rights Reserved. 2024-04-18T09:58:42.730Z INFO 7140 --- [ main] com.hazelcast.system : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Hazelcast Platform 5.3.6 (20231109 - 9903dc9) starting at [10.30.106.178]:5702 2024-04-18T09:58:42.730Z INFO 7140 --- [ main] com.hazelcast.system : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Cluster name: cps-and-ncmp-test-caches 2024-04-18T09:58:42.730Z INFO 7140 --- [ main] com.hazelcast.system : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Integrity Checker is disabled. Fail-fast on corrupted executables will not be performed. For more information, see the documentation for Integrity Checker. 2024-04-18T09:58:42.730Z INFO 7140 --- [ main] com.hazelcast.system : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] The Jet engine is disabled. To enable the Jet engine on the members, do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true) - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true (for Hazelcast embedded, works only when loading config via Config.load) - Add environment variable: HZ_JET_ENABLED=true (recommended when running container image. For Hazelcast embedded, works only when loading config via Config.load) 2024-04-18T09:58:42.734Z INFO 7140 --- [ main] c.h.internal.tpc.TpcServerBootstrap : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] TPC: disabled 2024-04-18T09:58:42.744Z INFO 7140 --- [ main] com.hazelcast.system.security : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Enable DEBUG/FINE log level for log category com.hazelcast.system.security or use -Dhazelcast.security.recommendations system property to see 🔒 security recommendations and the status of current config. 2024-04-18T09:58:42.745Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 
2024-04-18T09:58:42.745Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:42.745Z INFO 7140 --- [ main] com.hazelcast.instance.impl.Node : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Using Multicast discovery 2024-04-18T09:58:42.745Z WARN 7140 --- [ main] com.hazelcast.cp.CPSubsystem : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 2024-04-18T09:58:42.758Z INFO 7140 --- [ main] c.h.internal.diagnostics.Diagnostics : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 2024-04-18T09:58:42.759Z INFO 7140 --- [ main] com.hazelcast.core.LifecycleService : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] [10.30.106.178]:5702 is STARTING 2024-04-18T09:58:42.776Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:42.776Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:42.862Z INFO 7140 --- [ main] c.h.i.cluster.impl.MulticastJoiner : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Trying to join to discovered node: [10.30.106.210]:5702 2024-04-18T09:58:42.888Z INFO 7140 --- [.IO.thread-in-0] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Initialized new cluster connection between /10.30.106.178:54229 and /10.30.106.210:5702 2024-04-18T09:58:43.115Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:43.115Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:43.335Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:43.336Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:43.340Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:43.340Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:43.421Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:43.421Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 
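The CP Subsystem warning above means CP data structures (locks, atomics, and so on) run without Raft-backed strong consistency unless a CP member count is configured, and the Diagnostics entry names the JVM flag that turns diagnostics on. A hedged sketch, not the project's actual configuration:

// Hedged illustration only, not code from this repository.
import com.hazelcast.config.Config;

public class CpSubsystemSketch {
    public static Config cpEnabledConfig() {
        Config config = new Config();
        // With a CP member count of at least 3, CP data structures get strong consistency
        // instead of the UNSAFE mode the warning above describes.
        config.getCPSubsystemConfig().setCPMemberCount(3);
        return config;
    }
    // Diagnostics, as the log suggests, is enabled with a JVM argument:
    //   -Dhazelcast.diagnostics.enabled=true
}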
2024-04-18T09:58:43.435Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:43.435Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:43.473Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected. 2024-04-18T09:58:43.473Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:43.505Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:43.505Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:43.522Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:43.522Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:43.624Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:43.624Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:43.677Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:43.677Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:43.806Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:43.806Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:43.981Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:43.981Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:44.270Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:44.270Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 
2024-04-18T09:58:44.277Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:44.277Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:44.339Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:44.339Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:44.443Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:44.443Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:44.509Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:44.509Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:44.533Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected. 2024-04-18T09:58:44.533Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:44.582Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:44.582Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:44.628Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:44.628Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:44.638Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:44.638Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:44.677Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:44.677Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 
2024-04-18T09:58:44.961Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:44.961Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:45.036Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:45.036Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:45.224Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:45.224Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:45.229Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:45.229Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:45.394Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:45.394Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:45.436Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:45.436Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:45.446Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:45.446Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:45.481Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:45.481Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:45.640Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:45.640Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 
2024-04-18T09:58:45.665Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:45.665Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:45.687Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected. 2024-04-18T09:58:45.687Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:45.831Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:45.831Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:45.965Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:45.966Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:46.090Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:46.090Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:46.133Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:46.133Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:46.179Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:46.179Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:46.349Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:46.349Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:46.389Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:46.389Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 
2024-04-18T09:58:46.448Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:46.448Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:46.534Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:46.534Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:46.591Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected. 2024-04-18T09:58:46.591Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:46.736Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:46.736Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:46.818Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:46.818Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:46.818Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:46.818Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:46.844Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:46.844Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:47.193Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:47.193Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:47.238Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:47.238Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 
2024-04-18T09:58:47.283Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:47.283Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:47.302Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:47.302Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:47.489Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:47.489Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:47.543Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:47.543Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:47.551Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:47.551Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:47.589Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:47.589Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:47.746Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected. 2024-04-18T09:58:47.746Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:47.772Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:47.772Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:47.826Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:47.826Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 
2024-04-18T09:58:47.947Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:47.947Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:48.155Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:48.155Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:48.198Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:48.198Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:48.237Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:48.237Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:48.242Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:48.242Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:48.443Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:48.443Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:48.493Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:48.493Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:48.554Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:48.554Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 
2024-04-18T09:58:48.626Z INFO 7140 --- [.IO.thread-in-2] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Initialized new cluster connection between /10.30.106.178:5702 and /10.30.106.210:43855
2024-04-18T09:58:48.629Z INFO 7140 --- [ration.thread-2] c.h.internal.cluster.ClusterService : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Members {size:6, ver:6} [
  Member [10.30.106.210]:5702 - aba42cdc-6165-479b-bf80-e6b5538d9895
  Member [10.30.106.210]:5703 - 649a1443-cdd6-44d4-96ff-ff8dd6afb7b8
  Member [10.30.106.210]:5704 - d5e321f2-948a-4d03-a965-a491d9cf9800
  Member [10.30.106.210]:5705 - 37ae080b-737b-4610-860a-8bf039d3abad
  Member [10.30.106.178]:5702 - f33cb85b-d858-47bf-8e54-9d1a2f7c5618 this
  Member [10.30.106.210]:5706 - 0a9b1084-00ef-43a0-abe2-9848367a3706
]
2024-04-18T09:58:48.631Z INFO 7140 --- [cached.thread-2] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Connecting to /10.30.106.210:5703, timeout: 10000, bind-any: true
2024-04-18T09:58:48.631Z INFO 7140 --- [cached.thread-3] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Connecting to /10.30.106.210:5704, timeout: 10000, bind-any: true
2024-04-18T09:58:48.632Z INFO 7140 --- [.IO.thread-in-0] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Initialized new cluster connection between /10.30.106.178:5702 and /10.30.106.210:48947
2024-04-18T09:58:48.632Z INFO 7140 --- [cached.thread-6] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Connecting to /10.30.106.210:5706, timeout: 10000, bind-any: true
2024-04-18T09:58:48.634Z INFO 7140 --- [.IO.thread-in-0] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Initialized new cluster connection between /10.30.106.178:42527 and /10.30.106.210:5704
2024-04-18T09:58:48.634Z INFO 7140 --- [.IO.thread-in-1] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Initialized new cluster connection between /10.30.106.178:5702 and /10.30.106.210:53181
2024-04-18T09:58:48.634Z INFO 7140 --- [.IO.thread-in-2] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Initialized new cluster connection between /10.30.106.178:39071 and /10.30.106.210:5703
2024-04-18T09:58:48.638Z INFO 7140 --- [.IO.thread-in-1] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Initialized new cluster connection between /10.30.106.178:5702 and /10.30.106.210:50613
2024-04-18T09:58:48.640Z INFO 7140 --- [.IO.thread-in-1] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Initialized new cluster connection between /10.30.106.178:51425 and /10.30.106.210:5706
2024-04-18T09:58:48.676Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected.
2024-04-18T09:58:48.676Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available.
2024-04-18T09:58:48.699Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected.
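The ClusterService entry above shows this test member joining a six-member cluster whose other members live on a different host (10.30.106.210), discovered over multicast under the shared cluster name. A sketch of such a multicast-joining member; only the cluster name and the discovery mechanism come from the log, the rest is illustrative:

// Hedged illustration only, not code from this repository.
import com.hazelcast.config.Config;
import com.hazelcast.config.JoinConfig;
import com.hazelcast.core.Hazelcast;

public class MulticastJoinSketch {
    public static void main(String[] args) {
        Config config = new Config();
        config.setClusterName("cps-and-ncmp-test-caches"); // any member with this name on the LAN can join
        JoinConfig join = config.getNetworkConfig().getJoin();
        join.getMulticastConfig().setEnabled(true);  // "Using Multicast discovery", as logged
        join.getTcpIpConfig().setEnabled(false);
        Hazelcast.newHazelcastInstance(config);
        // Side effect visible in this build: members from unrelated test runs on the same
        // subnet (here 10.30.106.210) join the same cluster, inflating the member list.
    }
}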
2024-04-18T09:58:48.699Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:48.732Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:48.732Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:48.749Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:48.749Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:48.850Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:48.850Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:49.096Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:49.096Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:49.200Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:49.200Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:49.260Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:49.260Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:49.346Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:49.346Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:49.409Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:49.409Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:49.447Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 
2024-04-18T09:58:49.447Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available.
2024-04-18T09:58:49.556Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected.
2024-04-18T09:58:49.556Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available.
2024-04-18T09:58:49.602Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected.
2024-04-18T09:58:49.602Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available.
2024-04-18T09:58:49.629Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected.
$ ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 5432 killed;
[ssh-agent] Stopped.
Build was aborted
Aborted by new patch set.
[PostBuildScript] - [INFO] Executing post build scripts.
[cps-master-verify-java] $ /bin/bash /tmp/jenkins7311244027517939639.sh
---> sysstat.sh
2024-04-18T09:58:49.629Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available.
2024-04-18T09:58:49.632Z INFO 7140 --- [ main] com.hazelcast.core.LifecycleService : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] [10.30.106.178]:5702 is STARTED
2024-04-18T09:58:49.639Z INFO 7140 --- [ main] com.hazelcast.system.logo : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] [Hazelcast ASCII logo]
2024-04-18T09:58:49.639Z INFO 7140 --- [ main] com.hazelcast.system : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Copyright (c) 2008-2023, Hazelcast, Inc. All Rights Reserved.
2024-04-18T09:58:49.639Z INFO 7140 --- [ main] com.hazelcast.system : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Hazelcast Platform 5.3.6 (20231109 - 9903dc9) starting at [10.30.106.178]:5703
2024-04-18T09:58:49.639Z INFO 7140 --- [ main] com.hazelcast.system : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Cluster name: cps-and-ncmp-test-caches
2024-04-18T09:58:49.639Z INFO 7140 --- [ main] com.hazelcast.system : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Integrity Checker is disabled. Fail-fast on corrupted executables will not be performed. For more information, see the documentation for Integrity Checker.
2024-04-18T09:58:49.639Z INFO 7140 --- [ main] com.hazelcast.system : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] The Jet engine is disabled. To enable the Jet engine on the members, do one of the following:
  - Change member config using Java API: config.getJetConfig().setEnabled(true)
  - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true
  - Add system property: -Dhz.jet.enabled=true (for Hazelcast embedded, works only when loading config via Config.load)
  - Add environment variable: HZ_JET_ENABLED=true (recommended when running container image. For Hazelcast embedded, works only when loading config via Config.load)
2024-04-18T09:58:49.645Z INFO 7140 --- [ main] c.h.internal.tpc.TpcServerBootstrap : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] TPC: disabled
2024-04-18T09:58:49.654Z INFO 7140 --- [ main] com.hazelcast.system.security : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Enable DEBUG/FINE log level for log category com.hazelcast.system.security or use -Dhazelcast.security.recommendations system property to see 🔒 security recommendations and the status of current config.
2024-04-18T09:58:49.655Z INFO 7140 --- [ main] com.hazelcast.instance.impl.Node : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Using Multicast discovery
2024-04-18T09:58:49.655Z WARN 7140 --- [ main] com.hazelcast.cp.CPSubsystem : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees.
2024-04-18T09:58:49.684Z INFO 7140 --- [ main] c.h.internal.diagnostics.Diagnostics : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments.
2024-04-18T09:58:49.685Z INFO 7140 --- [ main] com.hazelcast.core.LifecycleService : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] [10.30.106.178]:5703 is STARTING
2024-04-18T09:58:49.753Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected.
2024-04-18T09:58:49.753Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available.
2024-04-18T09:58:49.838Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected.
2024-04-18T09:58:49.838Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available.
2024-04-18T09:58:49.855Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected.
2024-04-18T09:58:49.855Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available.
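The second member's banner repeats the declarative options for enabling Jet. A hedged sketch of the system-property variant, assuming Hazelcast 5's Config.load() resolution as the message itself describes:

// Hedged illustration only, not code from this repository.
import com.hazelcast.config.Config;
import com.hazelcast.core.Hazelcast;

public class JetPropertySketch {
    public static void main(String[] args) {
        System.setProperty("hz.jet.enabled", "true"); // equivalent to -Dhz.jet.enabled=true on the JVM command line
        Config config = Config.load();                // picks up system property / environment variable overrides
        Hazelcast.newHazelcastInstance(config);
    }
}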
2024-04-18T09:58:49.867Z INFO 7140 --- [ main] c.h.i.cluster.impl.MulticastJoiner : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Trying to join to discovered node: [10.30.106.210]:5702 2024-04-18T09:58:49.872Z INFO 7140 --- [.IO.thread-in-0] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Initialized new cluster connection between /10.30.106.178:57147 and /10.30.106.210:5702 2024-04-18T09:58:50.157Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:50.157Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:50.266Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:50.266Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:50.300Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:50.300Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:50.400Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:50.400Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:50.432Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:50.432Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:50.458Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:50.458Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:50.467Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:50.467Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:50.651Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:50.651Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 
2024-04-18T09:58:50.706Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected. 2024-04-18T09:58:50.706Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:50.907Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:50.908Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:50.957Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:50.958Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:50.992Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:50.992Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:51.061Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:51.061Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:51.204Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:51.204Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:51.270Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:51.270Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:51.285Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:51.285Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:51.304Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:51.304Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 
2024-04-18T09:58:51.361Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:51.361Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:51.505Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:51.505Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:51.671Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:51.671Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:51.710Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected. 2024-04-18T09:58:51.710Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:51.760Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:51.760Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:51.860Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:51.860Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:51.948Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:51.948Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:52.115Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:52.115Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:52.257Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:52.257Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 
2024-04-18T09:58:52.359Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:52.359Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:52.475Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:52.475Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:52.490Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:52.490Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:52.524Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:52.525Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:52.563Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:52.563Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:52.660Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:52.660Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:52.762Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected. 2024-04-18T09:58:52.762Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:52.801Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:52.801Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:52.862Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:52.862Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 
2024-04-18T09:58:52.914Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:52.914Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:53.269Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:53.269Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:53.312Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:53.313Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:53.343Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:53.343Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:53.412Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:53.412Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:53.577Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:53.577Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:53.629Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:53.629Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:53.666Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:53.666Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:53.713Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:53.713Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 
2024-04-18T09:58:53.817Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:53.817Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:53.864Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected. 2024-04-18T09:58:53.865Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:53.904Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:53.904Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:53.966Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected. 2024-04-18T09:58:53.966Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:54.316Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected. 2024-04-18T09:58:54.316Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available. 2024-04-18T09:58:54.465Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected. 2024-04-18T09:58:54.465Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:54.473Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected. 2024-04-18T09:58:54.473Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:54.497Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected. 2024-04-18T09:58:54.497Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:54.532Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected. 2024-04-18T09:58:54.532Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 
2024-04-18T09:58:54.671Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected. 2024-04-18T09:58:54.671Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available. 2024-04-18T09:58:54.731Z INFO 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Node 1 disconnected. 2024-04-18T09:58:54.731Z WARN 7140 --- [ad | producer-5] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-5] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:54.768Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Node 1 disconnected. 2024-04-18T09:58:54.768Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-1, groupId=test] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available. 2024-04-18T09:58:54.817Z INFO 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Node 1 disconnected. 2024-04-18T09:58:54.817Z WARN 7140 --- [ad | producer-7] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-7] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 2024-04-18T09:58:54.857Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected. 2024-04-18T09:58:54.857Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available. 
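Every producer and consumer retry above targets node 1 through container-mapped localhost ports (32768, 32770, 32772, 32774), and each attempt ends in "Broker may not be available", which suggests the brokers behind those mappings were already gone while the clients kept reconnecting. When chasing this kind of failure on the build node, a quick reachability check over exactly those ports is usually enough to confirm it; a minimal sketch, not part of the job itself, with the port list taken from the warnings above:

for port in 32768 32770 32772 32774; do
  # -z: probe only, send no data; -w 2: give up after two seconds per port
  if nc -z -w 2 localhost "${port}"; then
    echo "port ${port}: listener present"
  else
    echo "port ${port}: connection refused or timed out"
  fi
done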
2024-04-18T09:58:54.979Z INFO 7140 --- [ration.thread-0] c.h.internal.cluster.ClusterService : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Members {size:7, ver:7} [
    Member [10.30.106.210]:5702 - aba42cdc-6165-479b-bf80-e6b5538d9895
    Member [10.30.106.210]:5703 - 649a1443-cdd6-44d4-96ff-ff8dd6afb7b8
    Member [10.30.106.210]:5704 - d5e321f2-948a-4d03-a965-a491d9cf9800
    Member [10.30.106.210]:5705 - 37ae080b-737b-4610-860a-8bf039d3abad
    Member [10.30.106.178]:5702 - f33cb85b-d858-47bf-8e54-9d1a2f7c5618 this
    Member [10.30.106.210]:5706 - 0a9b1084-00ef-43a0-abe2-9848367a3706
    Member [10.30.106.178]:5703 - c23ef9fc-86df-4408-ba43-97d826a011bc
]
2024-04-18T09:58:54.981Z INFO 7140 --- [.IO.thread-in-1] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Initialized new cluster connection between /10.30.106.178:5703 and /10.30.106.178:37165
2024-04-18T09:58:54.982Z INFO 7140 --- [.IO.thread-in-2] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Initialized new cluster connection between /10.30.106.178:37165 and /10.30.106.178:5703
2024-04-18T09:58:54.983Z INFO 7140 --- [.IO.thread-in-2] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Initialized new cluster connection between /10.30.106.178:5703 and /10.30.106.210:35305
2024-04-18T09:58:54.983Z INFO 7140 --- [ration.thread-0] c.h.internal.cluster.ClusterService : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Members {size:7, ver:7} [
    Member [10.30.106.210]:5702 - aba42cdc-6165-479b-bf80-e6b5538d9895
    Member [10.30.106.210]:5703 - 649a1443-cdd6-44d4-96ff-ff8dd6afb7b8
    Member [10.30.106.210]:5704 - d5e321f2-948a-4d03-a965-a491d9cf9800
    Member [10.30.106.210]:5705 - 37ae080b-737b-4610-860a-8bf039d3abad
    Member [10.30.106.178]:5702 - f33cb85b-d858-47bf-8e54-9d1a2f7c5618
    Member [10.30.106.210]:5706 - 0a9b1084-00ef-43a0-abe2-9848367a3706
    Member [10.30.106.178]:5703 - c23ef9fc-86df-4408-ba43-97d826a011bc this
]
2024-04-18T09:58:54.984Z INFO 7140 --- [cached.thread-2] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Connecting to /10.30.106.210:5703, timeout: 10000, bind-any: true
2024-04-18T09:58:54.984Z INFO 7140 --- [cached.thread-6] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Connecting to /10.30.106.210:5705, timeout: 10000, bind-any: true
2024-04-18T09:58:54.984Z INFO 7140 --- [cached.thread-5] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Connecting to /10.30.106.210:5704, timeout: 10000, bind-any: true
2024-04-18T09:58:54.986Z INFO 7140 --- [cached.thread-3] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Connecting to /10.30.106.210:5706, timeout: 10000, bind-any: true
2024-04-18T09:58:54.989Z INFO 7140 --- [.IO.thread-in-0] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Initialized new cluster connection between /10.30.106.178:34925 and /10.30.106.210:5703
2024-04-18T09:58:54.991Z INFO 7140 --- [.IO.thread-in-1] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Initialized new cluster connection between /10.30.106.178:58137 and /10.30.106.210:5704
2024-04-18T09:58:54.991Z INFO 7140 --- [.IO.thread-in-1] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Initialized new cluster connection between /10.30.106.178:5703 and /10.30.106.210:55963
2024-04-18T09:58:54.999Z INFO 7140 --- [.IO.thread-in-2] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Initialized new cluster connection between /10.30.106.178:57269 and /10.30.106.210:5705
2024-04-18T09:58:54.999Z INFO 7140 --- [.IO.thread-in-0] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Initialized new cluster connection between /10.30.106.178:35193 and /10.30.106.210:5706
2024-04-18T09:58:55.067Z INFO 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Node 1 disconnected.
2024-04-18T09:58:55.067Z WARN 7140 --- [t-thread | test] org.apache.kafka.clients.NetworkClient : [Consumer clientId=consumer-test-2, groupId=test] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available.
2024-04-18T09:58:55.069Z INFO 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Node 1 disconnected.
2024-04-18T09:58:55.069Z WARN 7140 --- [ad | producer-9] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-9] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available.
2024-04-18T09:58:55.220Z INFO 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Node 1 disconnected.
2024-04-18T09:58:55.220Z WARN 7140 --- [ad | producer-2] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-2] Connection to node 1 (localhost/127.0.0.1:32770) could not be established. Broker may not be available.
2024-04-18T09:58:55.529Z INFO 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Node 1 disconnected.
2024-04-18T09:58:55.529Z WARN 7140 --- [ad | producer-3] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-3] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available.
2024-04-18T09:58:55.550Z INFO 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Node 1 disconnected.
2024-04-18T09:58:55.550Z WARN 7140 --- [ad | producer-8] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-8] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available.
2024-04-18T09:58:55.589Z INFO 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Node 1 disconnected.
2024-04-18T09:58:55.589Z WARN 7140 --- [ad | producer-1] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-1] Connection to node 1 (localhost/127.0.0.1:32768) could not be established. Broker may not be available.
2024-04-18T09:58:55.670Z INFO 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Node 1 disconnected.
2024-04-18T09:58:55.670Z WARN 7140 --- [ad | producer-4] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-4] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available.
2024-04-18T09:58:55.727Z INFO 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Node 1 disconnected.
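The membership view above shows the cps-and-ncmp-test-caches cluster with seven members, five of them on 10.30.106.210 rather than on this executor (10.30.106.178), and the connection-refused messages that follow all point at those remote 5702-5706 member ports. If it is unclear whether anything is still answering there, a plain-bash probe from the build node (using bash's /dev/tcp pseudo-device, so it works even without nc installed) can settle it; an illustrative sketch only, with the address and port range copied from the membership list above:

for port in 5702 5703 5704 5705 5706; do
  # open and immediately discard a TCP connection; timeout guards against hangs
  if timeout 2 bash -c "exec 3<>/dev/tcp/10.30.106.210/${port}" 2>/dev/null; then
    echo "10.30.106.210:${port} is accepting connections"
  else
    echo "10.30.106.210:${port} is not reachable"
  fi
done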
2024-04-18T09:58:55.727Z WARN 7140 --- [d | producer-10] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-10] Connection to node 1 (localhost/127.0.0.1:32774) could not be established. Broker may not be available.
2024-04-18T09:58:55.762Z INFO 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Node 1 disconnected.
2024-04-18T09:58:55.762Z WARN 7140 --- [ad | producer-6] org.apache.kafka.clients.NetworkClient : [Producer clientId=producer-6] Connection to node 1 (localhost/127.0.0.1:32772) could not be established. Broker may not be available.
2024-04-18T09:58:55.793Z WARN 7140 --- [.async.thread-5] c.h.i.p.o.MigrationRequestOperation : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Failure while executing MigrationInfo{uuid=cd361304-84b5-49f6-bf46-9fcfa3b2f9ae, partitionId=186, source=[10.30.106.210]:5702 - aba42cdc-6165-479b-bf80-e6b5538d9895, sourceCurrentReplicaIndex=5, sourceNewReplicaIndex=6, destination=[10.30.106.210]:5705 - 37ae080b-737b-4610-860a-8bf039d3abad, destinationCurrentReplicaIndex=-1, destinationNewReplicaIndex=5, master=[10.30.106.210]:5702, initialPartitionVersion=9, partitionVersionIncrement=2, status=ACTIVE}
java.lang.IllegalStateException: Migration operation is received before startup is completed. Sender: [10.30.106.178]:5702
	at com.hazelcast.internal.partition.operation.BaseMigrationOperation.verifyNodeStarted(BaseMigrationOperation.java:108)
	at com.hazelcast.internal.partition.operation.BaseMigrationOperation.beforeRun(BaseMigrationOperation.java:90)
	at com.hazelcast.internal.partition.operation.MigrationOperation.beforeRun(MigrationOperation.java:57)
	at com.hazelcast.spi.impl.operationservice.impl.OperationRunnerImpl.run(OperationRunnerImpl.java:269)
	at com.hazelcast.spi.impl.operationservice.impl.OperationRunnerImpl.run(OperationRunnerImpl.java:502)
	at com.hazelcast.spi.impl.operationexecutor.impl.OperationThread.process(OperationThread.java:202)
	at com.hazelcast.spi.impl.operationexecutor.impl.OperationThread.process(OperationThread.java:142)
	at com.hazelcast.spi.impl.operationexecutor.impl.OperationThread.loop(OperationThread.java:134)
	at com.hazelcast.spi.impl.operationexecutor.impl.OperationThread.executeRun(OperationThread.java:115)
	at com.hazelcast.internal.util.executor.HazelcastManagedThread.run(HazelcastManagedThread.java:111)
2024-04-18T09:58:55.795Z INFO 7140 --- [.IO.thread-in-1] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Connection[id=2, /10.30.106.178:5702->/10.30.106.210:53181, qualifier=null, endpoint=[10.30.106.210]:5704, remoteUuid=d5e321f2-948a-4d03-a965-a491d9cf9800, alive=false, connectionType=MEMBER, planeIndex=0] closed. Reason: Connection closed by the other side
2024-04-18T09:58:55.796Z INFO 7140 --- [.IO.thread-in-1] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Connection[id=6, /10.30.106.178:58137->/10.30.106.210:5704, qualifier=null, endpoint=[10.30.106.210]:5704, remoteUuid=d5e321f2-948a-4d03-a965-a491d9cf9800, alive=false, connectionType=MEMBER, planeIndex=0] closed.
Reason: Connection closed by the other side 2024-04-18T09:58:55.797Z INFO 7140 --- [.IO.thread-in-2] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Connection[id=6, /10.30.106.178:39071->/10.30.106.210:5703, qualifier=null, endpoint=[10.30.106.210]:5703, remoteUuid=649a1443-cdd6-44d4-96ff-ff8dd6afb7b8, alive=false, connectionType=MEMBER, planeIndex=0] closed. Reason: Connection closed by the other side 2024-04-18T09:58:55.798Z INFO 7140 --- [.IO.thread-in-1] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Connection[id=8, /10.30.106.178:5703->/10.30.106.210:55963, qualifier=null, endpoint=[10.30.106.210]:5703, remoteUuid=649a1443-cdd6-44d4-96ff-ff8dd6afb7b8, alive=false, connectionType=MEMBER, planeIndex=0] closed. Reason: Connection closed by the other side 2024-04-18T09:58:55.799Z INFO 7140 --- [cached.thread-3] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Connecting to /10.30.106.210:5703, timeout: 10000, bind-any: true 2024-04-18T09:58:55.799Z INFO 7140 --- [cached.thread-7] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Connecting to /10.30.106.210:5704, timeout: 10000, bind-any: true 2024-04-18T09:58:55.799Z INFO 7140 --- [cached.thread-1] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Connecting to /10.30.106.210:5704, timeout: 10000, bind-any: true 2024-04-18T09:58:55.799Z INFO 7140 --- [cached.thread-4] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Connecting to /10.30.106.210:5703, timeout: 10000, bind-any: true 2024-04-18T09:58:55.799Z INFO 7140 --- [.IO.thread-in-0] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Connection[id=5, /10.30.106.178:34925->/10.30.106.210:5703, qualifier=null, endpoint=[10.30.106.210]:5703, remoteUuid=649a1443-cdd6-44d4-96ff-ff8dd6afb7b8, alive=false, connectionType=MEMBER, planeIndex=0] closed. Reason: Connection closed by the other side 2024-04-18T09:58:55.799Z INFO 7140 --- [cached.thread-3] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Could not connect to: /10.30.106.210:5703. Reason: IOException[Connection refused to address /10.30.106.210:5703] 2024-04-18T09:58:55.800Z INFO 7140 --- [cached.thread-7] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Could not connect to: /10.30.106.210:5704. Reason: IOException[Connection refused to address /10.30.106.210:5704] 2024-04-18T09:58:55.800Z INFO 7140 --- [cached.thread-3] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Connecting to /10.30.106.210:5703, timeout: 10000, bind-any: true 2024-04-18T09:58:55.800Z INFO 7140 --- [cached.thread-4] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Could not connect to: /10.30.106.210:5703. Reason: IOException[Connection refused to address /10.30.106.210:5703] 2024-04-18T09:58:55.800Z INFO 7140 --- [.IO.thread-in-2] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Connection[id=3, /10.30.106.178:5702->/10.30.106.210:43855, qualifier=null, endpoint=[10.30.106.210]:5705, remoteUuid=37ae080b-737b-4610-860a-8bf039d3abad, alive=false, connectionType=MEMBER, planeIndex=0] closed. 
Reason: Connection closed by the other side 2024-04-18T09:58:55.800Z INFO 7140 --- [cached.thread-1] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Could not connect to: /10.30.106.210:5704. Reason: IOException[Connection refused to address /10.30.106.210:5704] 2024-04-18T09:58:55.800Z INFO 7140 --- [cached.thread-7] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Connecting to /10.30.106.210:5705, timeout: 10000, bind-any: true 2024-04-18T09:58:55.800Z INFO 7140 --- [cached.thread-3] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Could not connect to: /10.30.106.210:5703. Reason: IOException[Connection refused to address /10.30.106.210:5703] 2024-04-18T09:58:55.801Z INFO 7140 --- [cached.thread-7] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Could not connect to: /10.30.106.210:5705. Reason: IOException[Connection refused to address /10.30.106.210:5705] 2024-04-18T09:58:55.801Z INFO 7140 --- [.IO.thread-in-2] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Connection[id=4, /10.30.106.178:57269->/10.30.106.210:5705, qualifier=null, endpoint=[10.30.106.210]:5705, remoteUuid=37ae080b-737b-4610-860a-8bf039d3abad, alive=false, connectionType=MEMBER, planeIndex=0] closed. Reason: Connection closed by the other side 2024-04-18T09:58:55.802Z INFO 7140 --- [.IO.thread-in-2] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Connection[id=3, /10.30.106.178:5703->/10.30.106.210:35305, qualifier=null, endpoint=[10.30.106.210]:5705, remoteUuid=37ae080b-737b-4610-860a-8bf039d3abad, alive=false, connectionType=MEMBER, planeIndex=0] closed. Reason: Connection closed by the other side 2024-04-18T09:58:55.802Z INFO 7140 --- [cached.thread-3] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Connecting to /10.30.106.210:5705, timeout: 10000, bind-any: true 2024-04-18T09:58:55.802Z INFO 7140 --- [cached.thread-2] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Connecting to /10.30.106.210:5705, timeout: 10000, bind-any: true 2024-04-18T09:58:55.802Z INFO 7140 --- [cached.thread-3] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Could not connect to: /10.30.106.210:5705. Reason: IOException[Connection refused to address /10.30.106.210:5705] 2024-04-18T09:58:55.802Z INFO 7140 --- [.IO.thread-in-0] c.h.i.server.tcp.TcpServerConnection : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Connection[id=7, /10.30.106.178:42527->/10.30.106.210:5704, qualifier=null, endpoint=[10.30.106.210]:5704, remoteUuid=d5e321f2-948a-4d03-a965-a491d9cf9800, alive=false, connectionType=MEMBER, planeIndex=0] closed. Reason: Connection closed by the other side 2024-04-18T09:58:55.803Z INFO 7140 --- [cached.thread-2] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5703 [cps-and-ncmp-test-caches] [5.3.6] Could not connect to: /10.30.106.210:5705. 
Reason: IOException[Connection refused to address /10.30.106.210:5705]
2024-04-18T09:58:55.803Z INFO 7140 --- [cached.thread-7] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Connecting to /10.30.106.210:5704, timeout: 10000, bind-any: true
2024-04-18T09:58:55.803Z INFO 7140 --- [cached.thread-7] c.h.i.server.tcp.TcpServerConnector : [10.30.106.178]:5702 [cps-and-ncmp-test-caches] [5.3.6] Could not connect to: /10.30.106.210:5704. Reason: IOException[Connection refused to address /10.30.106.210:5704]
2024-04-18T09:58:55.804Z INFO 7140 --- [.ShutdownThread] com.hazelcast.instance.impl.Node : [10.250.0.254]:5701 [cps-and-ncmp-test-caches] [5.3.6] Running shutdown hook... Current node state: ACTIVE
2024-04-18T09:58:55.804Z INFO 7140 --- [.ShutdownThread] com.hazelcast.core.LifecycleService : [10.250.0.254]:5701 [cps-and-ncmp-test-caches] [5.3.6] [10.250.0.254]:5701 is SHUTTING_DOWN
[cps-master-verify-java] $ /bin/bash /tmp/jenkins5795579239450030441.sh
---> package-listing.sh
++ facter osfamily
++ tr '[:upper:]' '[:lower:]'
+ OS_FAMILY=redhat
+ workspace=/w/workspace/cps-master-verify-java
+ START_PACKAGES=/tmp/packages_start.txt
+ END_PACKAGES=/tmp/packages_end.txt
+ DIFF_PACKAGES=/tmp/packages_diff.txt
+ PACKAGES=/tmp/packages_start.txt
+ '[' /w/workspace/cps-master-verify-java ']'
+ PACKAGES=/tmp/packages_end.txt
+ case "${OS_FAMILY}" in
+ rpm -qa
+ sort
[INFO]
[INFO] Results:
[INFO]
[INFO] Tests run: 187, Failures: 0, Errors: 0, Skipped: 0
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for cps 3.4.8-SNAPSHOT:
[INFO]
[INFO] org.onap.cps:cps-dependencies ...................... SUCCESS [ 0.958 s]
[INFO] cps-bom ............................................ SUCCESS [ 0.005 s]
[INFO] checkstyle ......................................... SUCCESS [ 3.472 s]
[INFO] spotbugs ........................................... SUCCESS [ 0.021 s]
[INFO] cps-parent ......................................... SUCCESS [ 9.154 s]
[INFO] cps-events ......................................... SUCCESS [ 6.016 s]
[INFO] cps-path-parser .................................... SUCCESS [ 11.120 s]
[INFO] cps-service ........................................ SUCCESS [ 46.440 s]
[INFO] cps-rest ........................................... SUCCESS [ 21.626 s]
[INFO] cps-ncmp-events .................................... SUCCESS [ 4.863 s]
[INFO] cps-ncmp-service ................................... FAILURE [01:35 min]
[INFO] cps-ncmp-rest ...................................... SKIPPED
[INFO] cps-ncmp-rest-stub ................................. SKIPPED
[INFO] cps-ncmp-rest-stub-service ......................... SKIPPED
[INFO] cps-ncmp-rest-stub-app ............................. SKIPPED
[INFO] cps-ri ............................................. SKIPPED
[INFO] dmi-plugin-demo-and-csit-stub ...................... SKIPPED
[INFO] dmi-plugin-demo-and-csit-stub-service .............. SKIPPED
[INFO] dmi-plugin-demo-and-csit-stub-app .................. SKIPPED
[INFO] integration-test ................................... SKIPPED
[INFO] cps-application .................................... SKIPPED
[INFO] jacoco-report ...................................... SKIPPED
[INFO] cps ................................................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:22 min
[INFO] Finished at: 2024-04-18T09:58:56Z
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:3.0.0-M5:test (default-test) on project cps-ncmp-service: There are test failures.
[ERROR]
[ERROR] Please refer to /w/workspace/cps-master-verify-java/cps-ncmp-service/target/surefire-reports for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date].dump, [date]-jvmRun[N].dump and [date].dumpstream.
[ERROR] The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /w/workspace/cps-master-verify-java/cps-ncmp-service && /usr/lib/jvm/java-17-openjdk-17.0.6.0.9-0.3.ea.el8.x86_64/bin/java '-javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-ncmp-service/target/code-coverage/jacoco-ut.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/*' -jar /w/workspace/cps-master-verify-java/cps-ncmp-service/target/surefire/surefirebooter4526204341005336383.jar /w/workspace/cps-master-verify-java/cps-ncmp-service/target/surefire 2024-04-18T09-55-56_262-jvmRun1 surefire645724550391534501tmp surefire_310805534627089720277tmp
[ERROR] Error occurred in starting fork, check output in log
[ERROR] Process Exit Code: 143
[ERROR] org.apache.maven.surefire.booter.SurefireBooterForkException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /w/workspace/cps-master-verify-java/cps-ncmp-service && /usr/lib/jvm/java-17-openjdk-17.0.6.0.9-0.3.ea.el8.x86_64/bin/java '-javaagent:/home/jenkins/.m2/repository/org/jacoco/org.jacoco.agent/0.8.10/org.jacoco.agent-0.8.10-runtime.jar=destfile=/w/workspace/cps-master-verify-java/cps-ncmp-service/target/code-coverage/jacoco-ut.exec,excludes=org/onap/cps/event/model/*:org/onap/cps/rest/model/*:org/onap/cps/cpspath/parser/antlr4/*:org/onap/cps/ncmp/rest/model/*:org/onap/cps/**/*MapperImpl.class:org/onap/cps/ncmp/rest/stub/*' -jar /w/workspace/cps-master-verify-java/cps-ncmp-service/target/surefire/surefirebooter4526204341005336383.jar /w/workspace/cps-master-verify-java/cps-ncmp-service/target/surefire 2024-04-18T09-55-56_262-jvmRun1 surefire645724550391534501tmp surefire_310805534627089720277tmp
[ERROR] Error occurred in starting fork, check output in log
[ERROR] Process Exit Code: 143
[ERROR] at org.apache.maven.plugin.surefire.booterclient.ForkStarter.fork(ForkStarter.java:748)
[ERROR] at org.apache.maven.plugin.surefire.booterclient.ForkStarter.run(ForkStarter.java:305)
[ERROR] at org.apache.maven.plugin.surefire.booterclient.ForkStarter.run(ForkStarter.java:265)
[ERROR] at org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeProvider(AbstractSurefireMojo.java:1314)
[ERROR] at org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeAfterPreconditionsChecked(AbstractSurefireMojo.java:1159)
[ERROR] at org.apache.maven.plugin.surefire.AbstractSurefireMojo.execute(AbstractSurefireMojo.java:932)
[ERROR] at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:137)
[ERROR] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:210)
[ERROR] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:156)
[ERROR] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:148)
[ERROR] at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:117)
[ERROR] at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:81)
[ERROR] at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:56)
[ERROR] at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
[ERROR] at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:305)
[ERROR] at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:192)
[ERROR] at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:105)
[ERROR] at org.apache.maven.cli.MavenCli.execute(MavenCli.java:957)
[ERROR] at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:289)
[ERROR] at org.apache.maven.cli.MavenCli.main(MavenCli.java:193)
[ERROR] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[ERROR] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
[ERROR] at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[ERROR] at java.base/java.lang.reflect.Method.invoke(Method.java:568)
[ERROR] at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:282)
[ERROR] at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:225)
[ERROR] at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:406)
[ERROR] at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:347)
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn -rf :cps-ncmp-service
+ '[' -f /tmp/packages_start.txt ']'
+ '[' -f /tmp/packages_end.txt ']'
+ diff /tmp/packages_start.txt /tmp/packages_end.txt
+ '[' /w/workspace/cps-master-verify-java ']'
+ mkdir -p /w/workspace/cps-master-verify-java/archives/
+ cp -f /tmp/packages_diff.txt /tmp/packages_end.txt /tmp/packages_start.txt /w/workspace/cps-master-verify-java/archives/
[cps-master-verify-java] $ /bin/bash /tmp/jenkins16986190399033028341.sh
---> capture-instance-metadata.sh
Setup pyenv:
  system
  3.8.13
  3.9.13
* 3.10.6 (set by /w/workspace/cps-master-verify-java/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-ckhe from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-ckhe/bin to PATH
INFO: Running in OpenStack, capturing instance metadata
[cps-master-verify-java] $ /bin/bash /tmp/jenkins16331361403812001463.sh
provisioning config files...
copy managed file [jenkins-log-archives-settings] to file:/w/workspace/cps-master-verify-java@tmp/config7692347772648885259tmp
Regular expression run condition: Expression=[^.*logs-s3.*], Label=[]
Run condition [Regular expression match] preventing perform for step [Provide Configuration files]
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
SERVER_ID=logs
[EnvInject] - Variables injected successfully.
[cps-master-verify-java] $ /bin/bash /tmp/jenkins3869908768910255348.sh
---> create-netrc.sh
[cps-master-verify-java] $ /bin/bash /tmp/jenkins15496777175091701310.sh
---> python-tools-install.sh
Setup pyenv:
  system
  3.8.13
  3.9.13
* 3.10.6 (set by /w/workspace/cps-master-verify-java/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-ckhe from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-ckhe/bin to PATH
[cps-master-verify-java] $ /bin/bash /tmp/jenkins5170380019924305331.sh
---> sudo-logs.sh
Archiving 'sudo' log..
[cps-master-verify-java] $ /bin/bash /tmp/jenkins14822945404805242107.sh
---> job-cost.sh
Setup pyenv:
  system
  3.8.13
  3.9.13
* 3.10.6 (set by /w/workspace/cps-master-verify-java/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-ckhe from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing: zipp==1.1.0 python-openstackclient urllib3~=1.26.15
lf-activate-venv(): INFO: Adding /tmp/venv-ckhe/bin to PATH
INFO: No Stack...
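The Surefire failure above reports Process Exit Code 143, which normally means the forked test JVM was killed with SIGTERM (128 + 15) rather than crashing on its own, and Maven points at the surefire-reports directory, possible dump files, and the -e/-X switches plus the -rf resume option. A follow-up on the build node or in a local checkout could look roughly like this; the paths are the ones printed in the error above, while the 'test' goal and the -pl/-am variant are stand-ins for however the job actually invokes Maven:

# Inspect what the fork managed to write before it was killed
ls -lt /w/workspace/cps-master-verify-java/cps-ncmp-service/target/surefire-reports | head
find /w/workspace/cps-master-verify-java/cps-ncmp-service/target -name '*.dump*' -exec tail -n 50 {} \;

# Resume the reactor from the failing module with full stack traces
mvn -e -rf :cps-ncmp-service test

# Or run only cps-ncmp-service (plus the modules it needs) with full debug logging
mvn -X -pl cps-ncmp-service -am test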
INFO: Retrieving Pricing Info for: v3-standard-8
INFO: Archiving Costs
[cps-master-verify-java] $ /bin/bash -l /tmp/jenkins17221672153292488059.sh
---> logs-deploy.sh
Setup pyenv:
  system
  3.8.13
  3.9.13
* 3.10.6 (set by /w/workspace/cps-master-verify-java/.python-version)
lf-activate-venv(): INFO: Reuse venv:/tmp/venv-ckhe from file:/tmp/.os_lf_venv
lf-activate-venv(): INFO: Installing: lftools
lf-activate-venv(): INFO: Adding /tmp/venv-ckhe/bin to PATH
INFO: Nexus URL https://nexus.onap.org path production/vex-yul-ecomp-jenkins-1/cps-master-verify-java/7255
INFO: archiving workspace using pattern(s): -p **/target/surefire-reports/*-output.txt
Archives upload complete.
INFO: archiving logs to Nexus

---> uname -a:
Linux prd-centos8-docker-8c-8g-24093.novalocal 4.18.0-448.el8.x86_64 #1 SMP Wed Jan 18 15:02:46 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux

---> lscpu:
Architecture:        x86_64
CPU op-mode(s):      32-bit, 64-bit
Byte Order:          Little Endian
CPU(s):              8
On-line CPU(s) list: 0-7
Thread(s) per core:  1
Core(s) per socket:  1
Socket(s):           8
NUMA node(s):        1
Vendor ID:           AuthenticAMD
CPU family:          23
Model:               49
Model name:          AMD EPYC-Rome Processor
Stepping:            0
CPU MHz:             2800.000
BogoMIPS:            5600.00
Virtualization:      AMD-V
Hypervisor vendor:   KVM
Virtualization type: full
L1d cache:           32K
L1i cache:           32K
L2 cache:            512K
L3 cache:            16384K
NUMA node0 CPU(s):   0-7
Flags:               fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 xsaves clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities

---> nproc:
8

---> df -h:
Filesystem  Size  Used  Avail  Use%  Mounted on
devtmpfs    16G   0     16G    0%    /dev
tmpfs       16G   0     16G    0%    /dev/shm
tmpfs       16G   17M   16G    1%    /run
tmpfs       16G   0     16G    0%    /sys/fs/cgroup
/dev/vda1   160G  12G   149G   7%    /
tmpfs       3.2G  0     3.2G   0%    /run/user/1001

---> free -m:
       total  used  free   shared  buff/cache  available
Mem:   31890  1044  27334  19      3511        30382
Swap:  1023   0     1023

---> ip addr:
1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet 127.0.0.1/8 scope host lo
       valid_lft forever preferred_lft forever
    inet6 ::1/128 scope host
       valid_lft forever preferred_lft forever
2: eth0: mtu 1458 qdisc mq state UP group default qlen 1000
    link/ether fa:16:3e:12:2d:fd brd ff:ff:ff:ff:ff:ff
    altname enp0s3
    altname ens3
    inet 10.30.106.178/23 brd 10.30.107.255 scope global dynamic noprefixroute eth0
       valid_lft 86069sec preferred_lft 86069sec
    inet6 fe80::f816:3eff:fe12:2dfd/64 scope link
       valid_lft forever preferred_lft forever
3: docker0: mtu 1500 qdisc noqueue state DOWN group default
    link/ether 02:42:39:fc:82:19 brd ff:ff:ff:ff:ff:ff
    inet 10.250.0.254/24 brd 10.250.0.255 scope global docker0
       valid_lft forever preferred_lft forever
    inet6 fe80::42:39ff:fefc:8219/64 scope link
       valid_lft forever preferred_lft forever

---> sar -b -r -n DEV:
Linux 4.18.0-448.el8.x86_64 (centos-stream-8-docker-63ec32c8-5733-60f9-7d79-7a12e2b2c472.nova) 04/18/2024 _x86_64_ (8 CPU)

09:54:00 LINUX RESTART (8 CPU)

09:55:01 AM  tps     rtps   wtps    bread/s  bwrtn/s
09:56:01 AM  225.61  19.96  205.65  3789.67  10521.31
09:57:01 AM  139.63  1.43   138.19  492.18   12942.89
09:58:01 AM  48.15   0.10   48.05   27.72    4270.11
09:59:01 AM  91.34   12.45  78.89   875.31   26402.50
Average:     126.18  8.49   117.69  1296.26  13534.23

09:55:01 AM  kbmemfree  kbavail   kbmemused  %memused  kbbuffers  kbcached  kbcommit  %commit  kbactive  kbinact  kbdirty
09:56:01 AM  28512196   30535752  4143508    12.69     2688       2339036   2431596   7.21     238880    3428840  174564
09:57:01 AM  27529752   29798328  5125952    15.70     2688       2579964   3085696   9.16     308840    4322160  50920
09:58:01 AM  26058680   29143176  6597024    20.20     2688       3380016   3136752   9.31     386332    5655916  684948
09:59:01 AM  27997800   31111632  4657904    14.26     2688       3407644   1510180   4.48     666312    3441924  4
Average:     27524607   30147222  5131097    15.71     2688       2926665   2541056   7.54     400091    4212210  227609

09:55:01 AM  IFACE    rxpck/s  txpck/s  rxkB/s   txkB/s  rxcmp/s  txcmp/s  rxmcst/s  %ifutil
09:56:01 AM  lo       0.00     0.00     0.00     0.00    0.00     0.00     0.00      0.00
09:56:01 AM  eth0     285.65   243.00   2823.83  54.00   0.00     0.00     0.00      0.00
09:56:01 AM  docker0  0.00     0.00     0.00     0.00    0.00     0.00     0.00      0.00
09:57:01 AM  lo       2.23     2.23     0.45     0.45    0.00     0.00     0.00      0.00
09:57:01 AM  eth0     273.57   212.38   3449.34  28.74   0.00     0.00     0.00      0.00
09:57:01 AM  docker0  0.00     0.00     0.00     0.00    0.00     0.00     0.00      0.00
09:58:01 AM  lo       0.00     0.00     0.00     0.00    0.00     0.00     0.00      0.00
09:58:01 AM  eth0     958.43   323.91   6481.02  31.87   0.00     0.00     0.00      0.00
09:58:01 AM  docker0  0.00     0.00     0.00     0.00    0.00     0.00     0.00      0.00
09:59:01 AM  lo       29.14    29.14    3.13     3.13    0.00     0.00     0.00      0.00
09:59:01 AM  eth0     86.40    82.16    51.47    23.85   0.00     0.00     0.00      0.00
09:59:01 AM  docker0  6.58     7.21     0.84     0.85    0.00     0.00     0.00      0.00
Average:     lo       7.84     7.84     0.89     0.89    0.00     0.00     0.00      0.00
Average:     eth0     401.02   215.36   3201.41  34.61   0.00     0.00     0.00      0.00
Average:     docker0  1.65     1.80     0.21     0.21    0.00     0.00     0.00      0.00

---> sar -P ALL:
Linux 4.18.0-448.el8.x86_64 (centos-stream-8-docker-63ec32c8-5733-60f9-7d79-7a12e2b2c472.nova) 04/18/2024 _x86_64_ (8 CPU)

09:54:00 LINUX RESTART (8 CPU)

09:55:01 AM  CPU  %user  %nice  %system  %iowait  %steal  %idle
09:56:01 AM  all  20.32  0.00   1.29     1.09     0.07    77.23
09:56:01 AM  0    25.24  0.00   1.60     1.44     0.08    71.64
09:56:01 AM  1    22.39  0.00   0.73     0.85     0.10    75.92
09:56:01 AM  2    8.31   0.00   0.77     0.85     0.07    90.00
09:56:01 AM  3    28.02  0.00   2.22     1.65     0.07    68.03
09:56:01 AM  4    10.86  0.00   0.97     1.22     0.08    86.86
09:56:01 AM  5    24.81  0.00   1.84     0.92     0.05    72.38
09:56:01 AM  6    21.55  0.00   1.27     1.52     0.05    75.61
09:56:01 AM  7    21.33  0.00   0.95     0.23     0.05    77.44
09:57:01 AM  all  31.63  0.00   1.05     0.84     0.08    66.40
09:57:01 AM  0    30.41  0.00   0.73     0.33     0.08    68.44
09:57:01 AM  1    29.22  0.00   1.07     1.18     0.08    68.44
09:57:01 AM  2    14.37  0.00   0.89     2.34     0.08    82.32
09:57:01 AM  3    46.40  0.00   1.56     0.60     0.08    51.35
09:57:01 AM  4    35.32  0.00   1.51     0.95     0.08    62.13
09:57:01 AM  5    38.61  0.00   0.64     1.12     0.07    59.57
09:57:01 AM  6    41.17  0.00   1.34     0.02     0.07    57.41
09:57:01 AM  7    17.58  0.00   0.69     0.15     0.10    81.48
09:58:01 AM  all  39.71  0.00   2.32     0.27     0.09    57.62
09:58:01 AM  0    42.33  0.00   2.03     0.02     0.07    55.55
09:58:01 AM  1    47.49  0.00   2.39     1.35     0.10    48.67
09:58:01 AM  2    29.42  0.00   2.09     0.00     0.10    68.39
09:58:01 AM  3    42.96  0.00   2.25     0.03     0.08    54.67
09:58:01 AM  4    43.70  0.00   2.38     0.20     0.08    53.63
09:58:01 AM  5    30.44  0.00   2.68     0.27     0.10    66.51
09:58:01 AM  6    38.14  0.00   2.62     0.25     0.10    58.89
09:58:01 AM  7    43.17  0.00   2.11     0.02     0.08    54.62
09:59:01 AM  all  18.18  0.29   2.02     1.22     0.09    78.20
09:59:01 AM  0    15.34  0.05   1.59     0.18     0.10    82.73
09:59:01 AM  1    17.96  0.32   2.43     2.41     0.08    76.81
09:59:01 AM  2    16.97  0.02   1.61     1.56     0.08    79.76
09:59:01 AM  3    19.18  0.00   1.93     0.27     0.07    78.56
09:59:01 AM  4    15.93  0.03   1.71     0.03     0.10    82.20
09:59:01 AM  5    23.03  0.18   2.16     0.03     0.08    74.51
09:59:01 AM  6    21.81  0.02   2.51     2.19     0.08    73.39
09:59:01 AM  7    15.20  1.72   2.27     3.04     0.10    77.66
Average:     all  27.44  0.07   1.67     0.85     0.08    69.88
Average:     0    28.31  0.01   1.49     0.49     0.08    69.61
Average:     1    29.23  0.08   1.65     1.45     0.09    67.50
Average:     2    17.25  0.00   1.34     1.19     0.08    80.13
Average:     3    34.13  0.00   1.99     0.64     0.08    63.16
Average:     4    26.42  0.01   1.64     0.60     0.09    71.24
Average:     5    29.22  0.05   1.83     0.59     0.08    68.24
Average:     6    30.66  0.00   1.93     1.00     0.08    66.33
Average:     7    24.30  0.43   1.51     0.86     0.08    72.82