02:14:02 Started by upstream project "integration-distribution-test-titanium" build number 320 02:14:02 originally caused by: 02:14:02 Started by upstream project "autorelease-release-titanium-mvn39-openjdk21" build number 327 02:14:02 originally caused by: 02:14:02 Started by timer 02:14:02 Running as SYSTEM 02:14:02 [EnvInject] - Loading node environment variables. 02:14:02 Building remotely on prd-centos8-robot-2c-8g-43664 (centos8-robot-2c-8g) in workspace /w/workspace/openflowplugin-csit-3node-clustering-only-titanium 02:14:02 [ssh-agent] Looking for ssh-agent implementation... 02:14:02 [ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine) 02:14:02 $ ssh-agent 02:14:02 SSH_AUTH_SOCK=/tmp/ssh-vdBcN5h7eWTm/agent.5298 02:14:02 SSH_AGENT_PID=5299 02:14:02 [ssh-agent] Started. 02:14:02 Running ssh-add (command line suppressed) 02:14:02 Identity added: /w/workspace/openflowplugin-csit-3node-clustering-only-titanium@tmp/private_key_7396891247760880209.key (/w/workspace/openflowplugin-csit-3node-clustering-only-titanium@tmp/private_key_7396891247760880209.key) 02:14:02 [ssh-agent] Using credentials jenkins (Release Engineering Jenkins Key) 02:14:02 The recommended git tool is: NONE 02:14:06 using credential opendaylight-jenkins-ssh 02:14:06 Wiping out workspace first. 02:14:06 Cloning the remote Git repository 02:14:06 Cloning repository git://devvexx.opendaylight.org/mirror/integration/test 02:14:06 > git init /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test # timeout=10 02:14:06 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/integration/test 02:14:06 > git --version # timeout=10 02:14:06 > git --version # 'git version 2.43.0' 02:14:06 using GIT_SSH to set credentials Release Engineering Jenkins Key 02:14:06 [INFO] Currently running in a labeled security context 02:14:06 [INFO] Currently SELinux is 'enforcing' on the host 02:14:06 > /usr/bin/chcon --type=ssh_home_t /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test@tmp/jenkins-gitclient-ssh3392629283066198809.key 02:14:06 Verifying host key using known hosts file 02:14:06 You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification. 
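The ssh-agent lines above are the standard key-agent bootstrap Jenkins performs before any Git or SSH operation in this job. A minimal stand-alone sketch of that same sequence, assuming a Bourne-style shell; the key path is the one printed in the log and would differ per build:

#!/bin/bash
# Start an ssh-agent and load the build's private key into it, mirroring the
# SSH_AUTH_SOCK / SSH_AGENT_PID / "Identity added" lines above.
eval "$(ssh-agent -s)"
ssh-add "/w/workspace/openflowplugin-csit-3node-clustering-only-titanium@tmp/private_key_7396891247760880209.key"
# Later steps (git fetch, scp to the stack nodes) authenticate through this agent.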
02:14:06 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/integration/test +refs/heads/*:refs/remotes/origin/* # timeout=10 02:14:09 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/integration/test # timeout=10 02:14:09 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10 02:14:10 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/integration/test # timeout=10 02:14:10 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/integration/test 02:14:10 using GIT_SSH to set credentials Release Engineering Jenkins Key 02:14:10 [INFO] Currently running in a labeled security context 02:14:10 [INFO] Currently SELinux is 'enforcing' on the host 02:14:10 > /usr/bin/chcon --type=ssh_home_t /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test@tmp/jenkins-gitclient-ssh5053615801612199646.key 02:14:10 Verifying host key using known hosts file 02:14:10 You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification. 02:14:10 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/integration/test master # timeout=10 02:14:10 > git rev-parse FETCH_HEAD^{commit} # timeout=10 02:14:10 Checking out Revision 074069f5d4a1380883ffe1b3642610e77516dcc9 (origin/master) 02:14:10 > git config core.sparsecheckout # timeout=10 02:14:10 > git checkout -f 074069f5d4a1380883ffe1b3642610e77516dcc9 # timeout=10 02:14:11 Commit message: "Update car perf tests for RFC8040 compliance" 02:14:11 > git rev-parse FETCH_HEAD^{commit} # timeout=10 02:14:11 > git rev-list --no-walk 7a5ac501dd6c185ea1e99b3d109765d08b47774f # timeout=10 02:14:11 No emails were triggered. 02:14:11 provisioning config files... 
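The fetch/checkout trace above pins the integration/test repository to an exact revision rather than a branch tip. A rough command-line equivalent of what the Jenkins Git plugin does here, using the mirror URL and commit shown in the log:

#!/bin/bash
# Approximate re-play of the clone above: init an empty repo, fetch from the
# mirror, then force-checkout the exact revision the job built against.
git init test
cd test
git fetch --tags --force git://devvexx.opendaylight.org/mirror/integration/test \
    "+refs/heads/*:refs/remotes/origin/*"
git checkout -f 074069f5d4a1380883ffe1b3642610e77516dcc9   # "Update car perf tests for RFC8040 compliance"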
02:14:11 copy managed file [npmrc] to file:/home/jenkins/.npmrc 02:14:11 copy managed file [pipconf] to file:/home/jenkins/.config/pip/pip.conf 02:14:11 copy managed file [clouds-yaml] to file:/home/jenkins/.config/openstack/clouds.yaml 02:14:11 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins9507693093363722491.sh 02:14:11 ---> python-tools-install.sh 02:14:11 Setup pyenv: 02:14:12 system 02:14:12 * 3.8.13 (set by /opt/pyenv/version) 02:14:12 * 3.9.13 (set by /opt/pyenv/version) 02:14:12 * 3.10.13 (set by /opt/pyenv/version) 02:14:12 * 3.11.7 (set by /opt/pyenv/version) 02:14:24 lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-zBC1 02:14:24 lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv 02:14:34 lf-activate-venv(): INFO: Installing: lftools 02:15:28 lf-activate-venv(): INFO: Adding /tmp/venv-zBC1/bin to PATH 02:15:28 Generating Requirements File 02:15:50 Python 3.11.7 02:15:50 pip 25.1.1 from /tmp/venv-zBC1/lib/python3.11/site-packages/pip (python 3.11) 02:15:51 appdirs==1.4.4 02:15:51 argcomplete==3.6.2 02:15:51 aspy.yaml==1.3.0 02:15:51 attrs==25.3.0 02:15:51 autopage==0.5.2 02:15:51 beautifulsoup4==4.13.4 02:15:51 boto3==1.39.8 02:15:51 botocore==1.39.8 02:15:51 bs4==0.0.2 02:15:51 cachetools==5.5.2 02:15:51 certifi==2025.7.14 02:15:51 cffi==1.17.1 02:15:51 cfgv==3.4.0 02:15:51 chardet==5.2.0 02:15:51 charset-normalizer==3.4.2 02:15:51 click==8.2.1 02:15:51 cliff==4.10.0 02:15:51 cmd2==2.7.0 02:15:51 cryptography==3.3.2 02:15:51 debtcollector==3.0.0 02:15:51 decorator==5.2.1 02:15:51 defusedxml==0.7.1 02:15:51 Deprecated==1.2.18 02:15:51 distlib==0.4.0 02:15:51 dnspython==2.7.0 02:15:51 docker==7.1.0 02:15:51 dogpile.cache==1.4.0 02:15:51 durationpy==0.10 02:15:51 email_validator==2.2.0 02:15:51 filelock==3.18.0 02:15:51 future==1.0.0 02:15:51 gitdb==4.0.12 02:15:51 GitPython==3.1.44 02:15:51 google-auth==2.40.3 02:15:51 httplib2==0.22.0 02:15:51 identify==2.6.12 02:15:51 idna==3.10 02:15:51 importlib-resources==1.5.0 02:15:51 iso8601==2.1.0 02:15:51 Jinja2==3.1.6 02:15:51 jmespath==1.0.1 02:15:51 jsonpatch==1.33 02:15:51 jsonpointer==3.0.0 02:15:51 jsonschema==4.24.1 02:15:51 jsonschema-specifications==2025.4.1 02:15:51 keystoneauth1==5.11.1 02:15:51 kubernetes==33.1.0 02:15:51 lftools==0.37.13 02:15:51 lxml==6.0.0 02:15:51 markdown-it-py==3.0.0 02:15:51 MarkupSafe==3.0.2 02:15:51 mdurl==0.1.2 02:15:51 msgpack==1.1.1 02:15:51 multi_key_dict==2.0.3 02:15:51 munch==4.0.0 02:15:51 netaddr==1.3.0 02:15:51 niet==1.4.2 02:15:51 nodeenv==1.9.1 02:15:51 oauth2client==4.1.3 02:15:51 oauthlib==3.3.1 02:15:51 openstacksdk==4.6.0 02:15:51 os-client-config==2.3.0 02:15:51 os-service-types==1.8.0 02:15:51 osc-lib==4.1.0 02:15:51 oslo.config==10.0.0 02:15:51 oslo.context==6.0.0 02:15:51 oslo.i18n==6.5.1 02:15:51 oslo.log==7.2.0 02:15:51 oslo.serialization==5.7.0 02:15:51 oslo.utils==9.0.0 02:15:51 packaging==25.0 02:15:51 pbr==6.1.1 02:15:51 platformdirs==4.3.8 02:15:51 prettytable==3.16.0 02:15:51 psutil==7.0.0 02:15:51 pyasn1==0.6.1 02:15:51 pyasn1_modules==0.4.2 02:15:51 pycparser==2.22 02:15:51 pygerrit2==2.0.15 02:15:51 PyGithub==2.6.1 02:15:51 Pygments==2.19.2 02:15:51 PyJWT==2.10.1 02:15:51 PyNaCl==1.5.0 02:15:51 pyparsing==2.4.7 02:15:51 pyperclip==1.9.0 02:15:51 pyrsistent==0.20.0 02:15:51 python-cinderclient==9.7.0 02:15:51 python-dateutil==2.9.0.post0 02:15:51 python-heatclient==4.3.0 02:15:51 python-jenkins==1.8.2 02:15:51 python-keystoneclient==5.6.0 02:15:51 python-magnumclient==4.8.1 02:15:51 python-openstackclient==8.1.0 
02:15:51 python-swiftclient==4.8.0 02:15:51 PyYAML==6.0.2 02:15:51 referencing==0.36.2 02:15:51 requests==2.32.4 02:15:51 requests-oauthlib==2.0.0 02:15:51 requestsexceptions==1.4.0 02:15:51 rfc3986==2.0.0 02:15:51 rich==14.0.0 02:15:51 rich-argparse==1.7.1 02:15:51 rpds-py==0.26.0 02:15:51 rsa==4.9.1 02:15:51 ruamel.yaml==0.18.14 02:15:51 ruamel.yaml.clib==0.2.12 02:15:51 s3transfer==0.13.0 02:15:51 simplejson==3.20.1 02:15:51 six==1.17.0 02:15:51 smmap==5.0.2 02:15:51 soupsieve==2.7 02:15:51 stevedore==5.4.1 02:15:51 tabulate==0.9.0 02:15:51 toml==0.10.2 02:15:51 tomlkit==0.13.3 02:15:51 tqdm==4.67.1 02:15:51 typing_extensions==4.14.1 02:15:51 tzdata==2025.2 02:15:51 urllib3==1.26.20 02:15:51 virtualenv==20.31.2 02:15:51 wcwidth==0.2.13 02:15:51 websocket-client==1.8.0 02:15:51 wrapt==1.17.2 02:15:51 xdg==6.0.0 02:15:51 xmltodict==0.14.2 02:15:51 yq==3.4.3 02:15:51 [EnvInject] - Injecting environment variables from a build step. 02:15:51 [EnvInject] - Injecting as environment variables the properties content 02:15:51 OS_STACK_TEMPLATE=csit-2-instance-type.yaml 02:15:51 OS_CLOUD=vex 02:15:51 OS_STACK_NAME=releng-openflowplugin-csit-3node-clustering-only-titanium-313 02:15:51 OS_STACK_TEMPLATE_DIR=openstack-hot 02:15:51 02:15:51 [EnvInject] - Variables injected successfully. 02:15:51 provisioning config files... 02:15:51 copy managed file [clouds-yaml] to file:/home/jenkins/.config/openstack/clouds.yaml 02:15:51 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins18036135622938013727.sh 02:15:51 ---> Create parameters file for OpenStack HOT 02:15:51 OpenStack Heat parameters generated 02:15:51 ----------------------------------- 02:15:51 parameters: 02:15:51 vm_0_count: '3' 02:15:51 vm_0_flavor: 'v3-standard-4' 02:15:51 vm_0_image: 'ZZCI - Ubuntu 22.04 - builder - x86_64 - 20250201-010426.857' 02:15:51 vm_1_count: '1' 02:15:51 vm_1_flavor: 'v3-standard-2' 02:15:51 vm_1_image: 'ZZCI - Ubuntu 22.04 - mininet-ovs-217 - x86_64 - 20250201-060151.911' 02:15:51 02:15:51 job_name: '62057-313' 02:15:51 silo: 'releng' 02:15:51 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash -l /tmp/jenkins12041160784780699352.sh 02:15:51 ---> Create HEAT stack 02:15:51 + source /home/jenkins/lf-env.sh 02:15:51 + lf-activate-venv --python python3 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient yq 02:15:51 ++ mktemp -d /tmp/venv-XXXX 02:15:51 + lf_venv=/tmp/venv-CJmz 02:15:51 + local venv_file=/tmp/.os_lf_venv 02:15:51 + local python=python3 02:15:51 + local options 02:15:51 + local set_path=true 02:15:51 + local install_args= 02:15:51 ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --python python3 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient yq 02:15:51 + options=' --python '\''python3'\'' -- '\''lftools[openstack]'\'' '\''kubernetes'\'' '\''niet'\'' '\''python-heatclient'\'' '\''python-openstackclient'\'' '\''python-magnumclient'\'' '\''yq'\''' 02:15:51 + eval set -- ' --python '\''python3'\'' -- '\''lftools[openstack]'\'' '\''kubernetes'\'' '\''niet'\'' '\''python-heatclient'\'' '\''python-openstackclient'\'' '\''python-magnumclient'\'' '\''yq'\''' 02:15:51 ++ set -- --python python3 -- 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient yq 02:15:51 + true 02:15:51 + case $1 in 02:15:51 + python=python3 02:15:51 + shift 2 02:15:51 + true 02:15:51 + case $1 in 02:15:51 + shift 
02:15:51 + break 02:15:51 + case $python in 02:15:51 + local pkg_list= 02:15:51 + [[ -d /opt/pyenv ]] 02:15:51 + echo 'Setup pyenv:' 02:15:51 Setup pyenv: 02:15:51 + export PYENV_ROOT=/opt/pyenv 02:15:51 + PYENV_ROOT=/opt/pyenv 02:15:51 + export PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 02:15:51 + PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 02:15:51 + pyenv versions 02:15:51 system 02:15:51 3.8.13 02:15:51 3.9.13 02:15:51 3.10.13 02:15:51 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 02:15:51 + command -v pyenv 02:15:51 ++ pyenv init - --no-rehash 02:15:51 + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH); 02:15:51 for i in ${!paths[@]}; do 02:15:51 if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\''; 02:15:51 fi; done; 02:15:51 echo "${paths[*]}"'\'')" 02:15:51 export PATH="/opt/pyenv/shims:${PATH}" 02:15:51 export PYENV_SHELL=bash 02:15:51 source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\'' 02:15:51 pyenv() { 02:15:51 local command 02:15:51 command="${1:-}" 02:15:51 if [ "$#" -gt 0 ]; then 02:15:51 shift 02:15:51 fi 02:15:51 02:15:51 case "$command" in 02:15:51 rehash|shell) 02:15:51 eval "$(pyenv "sh-$command" "$@")" 02:15:51 ;; 02:15:51 *) 02:15:51 command pyenv "$command" "$@" 02:15:51 ;; 02:15:51 esac 02:15:51 }' 02:15:51 +++ bash --norc -ec 'IFS=:; paths=($PATH); 02:15:51 for i in ${!paths[@]}; do 02:15:51 if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\''; 02:15:51 fi; done; 02:15:51 echo "${paths[*]}"' 02:15:51 ++ PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 02:15:51 ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 02:15:51 ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 02:15:51 ++ export PYENV_SHELL=bash 02:15:51 ++ PYENV_SHELL=bash 02:15:51 ++ source /opt/pyenv/libexec/../completions/pyenv.bash 02:15:51 +++ complete -F _pyenv pyenv 02:15:51 ++ lf-pyver python3 02:15:51 ++ local py_version_xy=python3 02:15:51 ++ local py_version_xyz= 02:15:51 ++ awk '{ print $1 }' 02:15:51 ++ pyenv versions 02:15:51 ++ local command 02:15:51 ++ command=versions 02:15:51 ++ '[' 1 -gt 0 ']' 02:15:51 ++ shift 02:15:51 ++ case "$command" in 02:15:51 ++ command pyenv versions 02:15:51 ++ pyenv versions 02:15:51 ++ sed 's/^[ *]* //' 02:15:51 ++ grep -E '^[0-9.]*[0-9]$' 02:15:52 ++ [[ ! 
-s /tmp/.pyenv_versions ]] 02:15:52 +++ tail -n 1 02:15:52 +++ grep '^3' /tmp/.pyenv_versions 02:15:52 +++ sort -V 02:15:52 ++ py_version_xyz=3.11.7 02:15:52 ++ [[ -z 3.11.7 ]] 02:15:52 ++ echo 3.11.7 02:15:52 ++ return 0 02:15:52 + pyenv local 3.11.7 02:15:52 + local command 02:15:52 + command=local 02:15:52 + '[' 2 -gt 0 ']' 02:15:52 + shift 02:15:52 + case "$command" in 02:15:52 + command pyenv local 3.11.7 02:15:52 + pyenv local 3.11.7 02:15:52 + for arg in "$@" 02:15:52 + case $arg in 02:15:52 + pkg_list+='lftools[openstack] ' 02:15:52 + for arg in "$@" 02:15:52 + case $arg in 02:15:52 + pkg_list+='kubernetes ' 02:15:52 + for arg in "$@" 02:15:52 + case $arg in 02:15:52 + pkg_list+='niet ' 02:15:52 + for arg in "$@" 02:15:52 + case $arg in 02:15:52 + pkg_list+='python-heatclient ' 02:15:52 + for arg in "$@" 02:15:52 + case $arg in 02:15:52 + pkg_list+='python-openstackclient ' 02:15:52 + for arg in "$@" 02:15:52 + case $arg in 02:15:52 + pkg_list+='python-magnumclient ' 02:15:52 + for arg in "$@" 02:15:52 + case $arg in 02:15:52 + pkg_list+='yq ' 02:15:52 + [[ -f /tmp/.os_lf_venv ]] 02:15:52 ++ cat /tmp/.os_lf_venv 02:15:52 + lf_venv=/tmp/venv-zBC1 02:15:52 + echo 'lf-activate-venv(): INFO: Reuse venv:/tmp/venv-zBC1 from' file:/tmp/.os_lf_venv 02:15:52 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-zBC1 from file:/tmp/.os_lf_venv 02:15:52 + /tmp/venv-zBC1/bin/python3 -m pip install --upgrade --quiet pip 'setuptools<66' virtualenv 02:15:54 + [[ -z lftools[openstack] kubernetes niet python-heatclient python-openstackclient python-magnumclient yq ]] 02:15:54 + echo 'lf-activate-venv(): INFO: Installing: lftools[openstack] kubernetes niet python-heatclient python-openstackclient python-magnumclient yq ' 02:15:54 lf-activate-venv(): INFO: Installing: lftools[openstack] kubernetes niet python-heatclient python-openstackclient python-magnumclient yq 02:15:54 + /tmp/venv-zBC1/bin/python3 -m pip install --upgrade --quiet --upgrade-strategy eager 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient yq 02:16:30 + type python3 02:16:30 + true 02:16:30 + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-zBC1/bin to PATH' 02:16:30 lf-activate-venv(): INFO: Adding /tmp/venv-zBC1/bin to PATH 02:16:30 + PATH=/tmp/venv-zBC1/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 02:16:30 + return 0 02:16:30 + openstack --os-cloud vex limits show --absolute 02:16:31 +--------------------------+---------+ 02:16:31 | Name | Value | 02:16:31 +--------------------------+---------+ 02:16:31 | maxTotalInstances | -1 | 02:16:31 | maxTotalCores | 450 | 02:16:31 | maxTotalRAMSize | 1000000 | 02:16:31 | maxServerMeta | 128 | 02:16:31 | maxImageMeta | 128 | 02:16:31 | maxPersonality | 5 | 02:16:31 | maxPersonalitySize | 10240 | 02:16:31 | maxTotalKeypairs | 100 | 02:16:31 | maxServerGroups | 10 | 02:16:31 | maxServerGroupMembers | 10 | 02:16:31 | maxTotalFloatingIps | -1 | 02:16:31 | maxSecurityGroups | -1 | 02:16:31 | maxSecurityGroupRules | -1 | 02:16:31 | totalRAMUsed | 729088 | 02:16:31 | totalCoresUsed | 178 | 02:16:31 | totalInstancesUsed | 62 | 02:16:31 | totalFloatingIpsUsed | 0 | 02:16:31 | totalSecurityGroupsUsed | 0 | 02:16:31 | totalServerGroupsUsed | 0 | 02:16:31 | maxTotalVolumes | -1 | 02:16:31 | maxTotalSnapshots | 10 | 02:16:31 | maxTotalVolumeGigabytes | 4096 | 02:16:31 | maxTotalBackups | 10 | 02:16:31 | maxTotalBackupGigabytes | 1000 | 02:16:31 | 
totalVolumesUsed | 3 | 02:16:31 | totalGigabytesUsed | 60 | 02:16:31 | totalSnapshotsUsed | 0 | 02:16:31 | totalBackupsUsed | 0 | 02:16:31 | totalBackupGigabytesUsed | 0 | 02:16:31 +--------------------------+---------+ 02:16:32 + pushd /opt/ciman/openstack-hot 02:16:32 /opt/ciman/openstack-hot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium 02:16:32 + lftools openstack --os-cloud vex stack create releng-openflowplugin-csit-3node-clustering-only-titanium-313 csit-2-instance-type.yaml /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/stack-parameters.yaml 02:16:59 Creating stack releng-openflowplugin-csit-3node-clustering-only-titanium-313 02:16:59 Waiting to initialize infrastructure... 02:16:59 Stack initialization successful. 02:16:59 ------------------------------------ 02:16:59 Stack Details 02:16:59 ------------------------------------ 02:16:59 {'added': None, 02:16:59 'capabilities': [], 02:16:59 'created_at': '2025-07-18T02:16:35Z', 02:16:59 'deleted': None, 02:16:59 'deleted_at': None, 02:16:59 'description': 'No description', 02:16:59 'environment': None, 02:16:59 'environment_files': None, 02:16:59 'files': None, 02:16:59 'files_container': None, 02:16:59 'id': '927e3b09-372f-41a1-83b3-926c461a45a9', 02:16:59 'is_rollback_disabled': True, 02:16:59 'links': [{'href': 'https://orchestration.public.mtl1.vexxhost.net/v1/12c36e260d8e4bb2913965203b1b491f/stacks/releng-openflowplugin-csit-3node-clustering-only-titanium-313/927e3b09-372f-41a1-83b3-926c461a45a9', 02:16:59 'rel': 'self'}], 02:16:59 'location': Munch({'cloud': 'vex', 'region_name': 'ca-ymq-1', 'zone': None, 'project': Munch({'id': '12c36e260d8e4bb2913965203b1b491f', 'name': '61975f2c-7c17-4d69-82fa-c3ae420ad6fd', 'domain_id': None, 'domain_name': 'Default'})}), 02:16:59 'name': 'releng-openflowplugin-csit-3node-clustering-only-titanium-313', 02:16:59 'notification_topics': [], 02:16:59 'outputs': [{'description': 'IP addresses of the 2nd vm types', 02:16:59 'output_key': 'vm_1_ips', 02:16:59 'output_value': ['10.30.171.44']}, 02:16:59 {'description': 'IP addresses of the 1st vm types', 02:16:59 'output_key': 'vm_0_ips', 02:16:59 'output_value': ['10.30.170.189', 02:16:59 '10.30.171.89', 02:16:59 '10.30.171.50']}], 02:16:59 'owner_id': ****, 02:16:59 'parameters': {'OS::project_id': '12c36e260d8e4bb2913965203b1b491f', 02:16:59 'OS::stack_id': '927e3b09-372f-41a1-83b3-926c461a45a9', 02:16:59 'OS::stack_name': 'releng-openflowplugin-csit-3node-clustering-only-titanium-313', 02:16:59 'job_name': '62057-313', 02:16:59 'silo': 'releng', 02:16:59 'vm_0_count': '3', 02:16:59 'vm_0_flavor': 'v3-standard-4', 02:16:59 'vm_0_image': 'ZZCI - Ubuntu 22.04 - builder - x86_64 - ' 02:16:59 '20250201-010426.857', 02:16:59 'vm_1_count': '1', 02:16:59 'vm_1_flavor': 'v3-standard-2', 02:16:59 'vm_1_image': 'ZZCI - Ubuntu 22.04 - mininet-ovs-217 - x86_64 ' 02:16:59 '- 20250201-060151.911'}, 02:16:59 'parent_id': None, 02:16:59 'replaced': None, 02:16:59 'status': 'CREATE_COMPLETE', 02:16:59 'status_reason': 'Stack CREATE completed successfully', 02:16:59 'tags': [], 02:16:59 'template': None, 02:16:59 'template_description': 'No description', 02:16:59 'template_url': None, 02:16:59 'timeout_mins': 15, 02:16:59 'unchanged': None, 02:16:59 'updated': None, 02:16:59 'updated_at': None, 02:16:59 'user_project_id': '9c06b570a3194dba86913fc25aae4490'} 02:16:59 ------------------------------------ 02:16:59 + popd 02:16:59 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium 02:16:59 
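The stack above is created through lftools' OpenStack helper rather than the raw Heat client, and its outputs carry the VM addresses used by the rest of the job. A condensed sketch of the same call plus reading the addresses back out; the stack name, cloud, template, and jq filter are the ones printed in the log, while the parameters file path is shortened here:

#!/bin/bash
# Create the Heat stack from the HOT template and generated parameters file,
# then list the VM IPs published in the stack outputs.
STACK=releng-openflowplugin-csit-3node-clustering-only-titanium-313
lftools openstack --os-cloud vex stack create "$STACK" \
    csit-2-instance-type.yaml stack-parameters.yaml
openstack --os-cloud vex stack show -f json -c outputs "$STACK" \
    | jq -r '.outputs[] | select(.output_key | match("^vm_[0-9]+_ips$")) | .output_value[]'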
[openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash -l /tmp/jenkins5161568321313435582.sh 02:16:59 ---> Copy SSH public keys to CSIT lab 02:16:59 Setup pyenv: 02:16:59 system 02:16:59 3.8.13 02:16:59 3.9.13 02:16:59 3.10.13 02:16:59 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 02:16:59 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-zBC1 from file:/tmp/.os_lf_venv 02:17:01 lf-activate-venv(): INFO: Installing: lftools[openstack] kubernetes python-heatclient python-openstackclient 02:17:15 lf-activate-venv(): INFO: Adding /tmp/venv-zBC1/bin to PATH 02:17:17 SSH not responding on 10.30.171.50. Retrying in 10 seconds... 02:17:17 SSH not responding on 10.30.171.89. Retrying in 10 seconds... 02:17:17 SSH not responding on 10.30.170.189. Retrying in 10 seconds... 02:17:23 SSH not responding on 10.30.171.44. Retrying in 10 seconds... 02:17:27 Ping to 10.30.171.50 successful. 02:17:27 Ping to 10.30.171.89 successful. 02:17:27 Ping to 10.30.170.189 successful. 02:17:27 SSH not responding on 10.30.171.89. Retrying in 10 seconds... 02:17:27 SSH not responding on 10.30.171.50. Retrying in 10 seconds... 02:17:29 SSH not responding on 10.30.170.189. Retrying in 10 seconds... 02:17:33 Ping to 10.30.171.44 successful. 02:17:33 SSH not responding on 10.30.171.44. Retrying in 10 seconds... 02:17:37 Ping to 10.30.171.89 successful. 02:17:37 Ping to 10.30.171.50 successful. 02:17:38 SSH not responding on 10.30.171.89. Retrying in 10 seconds... 02:17:39 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:17:39 Ping to 10.30.170.189 successful. 02:17:39 releng-62057-313-0-builder-2 02:17:39 Successfully copied public keys to slave 10.30.171.50 02:17:40 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:17:41 releng-62057-313-0-builder-0 02:17:41 Successfully copied public keys to slave 10.30.170.189 02:17:43 Ping to 10.30.171.44 successful. 02:17:43 SSH not responding on 10.30.171.44. Retrying in 10 seconds... 02:17:48 Ping to 10.30.171.89 successful. 02:17:49 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:17:49 releng-62057-313-0-builder-1 02:17:49 Successfully copied public keys to slave 10.30.171.89 02:17:53 Ping to 10.30.171.44 successful. 02:17:54 SSH not responding on 10.30.171.44. Retrying in 10 seconds... 02:18:04 Ping to 10.30.171.44 successful. 02:18:05 Warning: Permanently added '10.30.171.44' (ECDSA) to the list of known hosts. 02:18:06 releng-62057-313-1-mininet-ovs-217-0 02:18:06 Successfully copied public keys to slave 10.30.171.44 02:18:06 Process 6527 ready. 02:18:06 Process 6528 ready. 02:18:06 Process 6529 ready. 02:18:06 Process 6530 ready. 02:18:06 SSH ready on all stack servers. 
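The "SSH not responding ... Retrying in 10 seconds" lines above come from a readiness loop that waits for each freshly booted stack node before copying keys. A simplified sketch of that kind of wait loop, assuming plain ping and ssh; the real job script may differ in detail:

#!/bin/bash
# Wait for a stack node to answer ping and then SSH, retrying every 10 seconds,
# echoing messages in the same style as the log above.
wait_for_ssh() {
    local ip=$1
    while true; do
        ping -c1 -W2 "$ip" >/dev/null 2>&1 && echo "Ping to $ip successful."
        if ssh -o ConnectTimeout=5 -o StrictHostKeyChecking=no "$ip" true 2>/dev/null; then
            echo "SSH ready on $ip."
            return 0
        fi
        echo "SSH not responding on $ip. Retrying in 10 seconds..."
        sleep 10
    done
}
wait_for_ssh 10.30.171.44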
02:18:06 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash -l /tmp/jenkins16375077760658411694.sh 02:18:06 Setup pyenv: 02:18:06 system 02:18:06 3.8.13 02:18:06 3.9.13 02:18:06 3.10.13 02:18:06 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 02:18:10 lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-bBfd 02:18:10 lf-activate-venv(): INFO: Save venv in file: /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.robot_venv 02:18:14 lf-activate-venv(): INFO: Installing: setuptools wheel 02:18:16 lf-activate-venv(): INFO: Adding /tmp/venv-bBfd/bin to PATH 02:18:16 + echo 'Installing Python Requirements' 02:18:16 Installing Python Requirements 02:18:16 + cat 02:18:16 + python -m pip install -r requirements.txt 02:18:16 Looking in indexes: https://nexus3.opendaylight.org/repository/PyPi/simple 02:18:16 Collecting docker-py (from -r requirements.txt (line 1)) 02:18:16 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/docker-py/1.10.6/docker_py-1.10.6-py2.py3-none-any.whl (50 kB) 02:18:16 Collecting ipaddr (from -r requirements.txt (line 2)) 02:18:16 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/ipaddr/2.2.0/ipaddr-2.2.0.tar.gz (26 kB) 02:18:16 Preparing metadata (setup.py): started 02:18:16 Preparing metadata (setup.py): finished with status 'done' 02:18:16 Collecting netaddr (from -r requirements.txt (line 3)) 02:18:16 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/netaddr/1.3.0/netaddr-1.3.0-py3-none-any.whl (2.3 MB) 02:18:16 Collecting netifaces (from -r requirements.txt (line 4)) 02:18:16 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/netifaces/0.11.0/netifaces-0.11.0.tar.gz (30 kB) 02:18:16 Preparing metadata (setup.py): started 02:18:16 Preparing metadata (setup.py): finished with status 'done' 02:18:17 Collecting pyhocon (from -r requirements.txt (line 5)) 02:18:17 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pyhocon/0.3.61/pyhocon-0.3.61-py3-none-any.whl (25 kB) 02:18:17 Collecting requests (from -r requirements.txt (line 6)) 02:18:17 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/requests/2.32.4/requests-2.32.4-py3-none-any.whl (64 kB) 02:18:17 Collecting robotframework (from -r requirements.txt (line 7)) 02:18:17 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework/7.3.2/robotframework-7.3.2-py3-none-any.whl (795 kB) 02:18:17 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 795.1/795.1 kB 18.8 MB/s eta 0:00:00 02:18:17 Collecting robotframework-httplibrary (from -r requirements.txt (line 8)) 02:18:17 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-httplibrary/0.4.2/robotframework-httplibrary-0.4.2.tar.gz (9.1 kB) 02:18:17 Preparing metadata (setup.py): started 02:18:17 Preparing metadata (setup.py): finished with status 'done' 02:18:17 Collecting robotframework-requests==0.9.7 (from -r requirements.txt (line 9)) 02:18:17 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-requests/0.9.7/robotframework_requests-0.9.7-py3-none-any.whl (21 kB) 02:18:17 Collecting robotframework-selenium2library (from -r requirements.txt (line 10)) 02:18:17 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-selenium2library/3.0.0/robotframework_selenium2library-3.0.0-py2.py3-none-any.whl (6.2 kB) 02:18:17 Collecting robotframework-sshlibrary==3.8.0 (from -r 
requirements.txt (line 11)) 02:18:17 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-sshlibrary/3.8.0/robotframework-sshlibrary-3.8.0.tar.gz (51 kB) 02:18:17 Preparing metadata (setup.py): started 02:18:17 Preparing metadata (setup.py): finished with status 'done' 02:18:17 Collecting scapy (from -r requirements.txt (line 12)) 02:18:17 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/scapy/2.6.1/scapy-2.6.1-py3-none-any.whl (2.4 MB) 02:18:17 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.4/2.4 MB 61.9 MB/s eta 0:00:00 02:18:17 Collecting jsonpath-rw (from -r requirements.txt (line 15)) 02:18:17 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/jsonpath-rw/1.4.0/jsonpath-rw-1.4.0.tar.gz (13 kB) 02:18:17 Preparing metadata (setup.py): started 02:18:18 Preparing metadata (setup.py): finished with status 'done' 02:18:18 Collecting elasticsearch (from -r requirements.txt (line 18)) 02:18:18 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch/9.0.2/elasticsearch-9.0.2-py3-none-any.whl (914 kB) 02:18:18 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 914.3/914.3 kB 42.1 MB/s eta 0:00:00 02:18:18 Collecting elasticsearch-dsl (from -r requirements.txt (line 19)) 02:18:18 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch-dsl/8.18.0/elasticsearch_dsl-8.18.0-py3-none-any.whl (10 kB) 02:18:18 Collecting pyangbind (from -r requirements.txt (line 22)) 02:18:18 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pyangbind/0.8.6/pyangbind-0.8.6-py3-none-any.whl (52 kB) 02:18:18 Collecting isodate (from -r requirements.txt (line 25)) 02:18:18 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/isodate/0.7.2/isodate-0.7.2-py3-none-any.whl (22 kB) 02:18:18 Collecting jmespath (from -r requirements.txt (line 28)) 02:18:18 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/jmespath/1.0.1/jmespath-1.0.1-py3-none-any.whl (20 kB) 02:18:18 Collecting jsonpatch (from -r requirements.txt (line 31)) 02:18:18 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/jsonpatch/1.33/jsonpatch-1.33-py2.py3-none-any.whl (12 kB) 02:18:18 Collecting paramiko>=1.15.3 (from robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 02:18:18 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/paramiko/3.5.1/paramiko-3.5.1-py3-none-any.whl (227 kB) 02:18:18 Collecting scp>=0.13.0 (from robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 02:18:18 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/scp/0.15.0/scp-0.15.0-py2.py3-none-any.whl (8.8 kB) 02:18:18 Collecting docker-pycreds>=0.2.1 (from docker-py->-r requirements.txt (line 1)) 02:18:18 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/docker-pycreds/0.4.0/docker_pycreds-0.4.0-py2.py3-none-any.whl (9.0 kB) 02:18:18 Collecting six>=1.4.0 (from docker-py->-r requirements.txt (line 1)) 02:18:18 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/six/1.17.0/six-1.17.0-py2.py3-none-any.whl (11 kB) 02:18:18 Collecting websocket-client>=0.32.0 (from docker-py->-r requirements.txt (line 1)) 02:18:18 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/websocket-client/1.8.0/websocket_client-1.8.0-py3-none-any.whl (58 kB) 02:18:18 Collecting pyparsing<4,>=2 (from pyhocon->-r requirements.txt (line 5)) 02:18:18 Using cached 
https://nexus3.opendaylight.org/repository/PyPi/packages/pyparsing/3.2.3/pyparsing-3.2.3-py3-none-any.whl (111 kB) 02:18:18 Collecting charset_normalizer<4,>=2 (from requests->-r requirements.txt (line 6)) 02:18:18 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/charset-normalizer/3.4.2/charset_normalizer-3.4.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (147 kB) 02:18:19 Collecting idna<4,>=2.5 (from requests->-r requirements.txt (line 6)) 02:18:19 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/idna/3.10/idna-3.10-py3-none-any.whl (70 kB) 02:18:19 Collecting urllib3<3,>=1.21.1 (from requests->-r requirements.txt (line 6)) 02:18:19 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/urllib3/2.5.0/urllib3-2.5.0-py3-none-any.whl (129 kB) 02:18:19 Collecting certifi>=2017.4.17 (from requests->-r requirements.txt (line 6)) 02:18:19 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/certifi/2025.7.14/certifi-2025.7.14-py3-none-any.whl (162 kB) 02:18:19 Collecting webtest>=2.0 (from robotframework-httplibrary->-r requirements.txt (line 8)) 02:18:19 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/webtest/3.0.6/webtest-3.0.6-py3-none-any.whl (32 kB) 02:18:19 Collecting jsonpointer (from robotframework-httplibrary->-r requirements.txt (line 8)) 02:18:19 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/jsonpointer/3.0.0/jsonpointer-3.0.0-py2.py3-none-any.whl (7.6 kB) 02:18:19 Collecting robotframework-seleniumlibrary>=3.0.0 (from robotframework-selenium2library->-r requirements.txt (line 10)) 02:18:19 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-seleniumlibrary/6.7.1/robotframework_seleniumlibrary-6.7.1-py2.py3-none-any.whl (104 kB) 02:18:19 Collecting ply (from jsonpath-rw->-r requirements.txt (line 15)) 02:18:19 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/ply/3.11/ply-3.11-py2.py3-none-any.whl (49 kB) 02:18:19 Collecting decorator (from jsonpath-rw->-r requirements.txt (line 15)) 02:18:19 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/decorator/5.2.1/decorator-5.2.1-py3-none-any.whl (9.2 kB) 02:18:19 Collecting elastic-transport<9,>=8.15.1 (from elasticsearch->-r requirements.txt (line 18)) 02:18:19 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elastic-transport/8.17.1/elastic_transport-8.17.1-py3-none-any.whl (64 kB) 02:18:19 Collecting python-dateutil (from elasticsearch->-r requirements.txt (line 18)) 02:18:19 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/python-dateutil/2.9.0.post0/python_dateutil-2.9.0.post0-py2.py3-none-any.whl (229 kB) 02:18:19 Collecting typing-extensions (from elasticsearch->-r requirements.txt (line 18)) 02:18:19 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/typing-extensions/4.14.1/typing_extensions-4.14.1-py3-none-any.whl (43 kB) 02:18:19 Collecting elasticsearch (from -r requirements.txt (line 18)) 02:18:19 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch/8.18.1/elasticsearch-8.18.1-py3-none-any.whl (906 kB) 02:18:19 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 906.3/906.3 kB 48.5 MB/s eta 0:00:00 02:18:19 Collecting pyang (from pyangbind->-r requirements.txt (line 22)) 02:18:19 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pyang/2.6.1/pyang-2.6.1-py2.py3-none-any.whl (594 kB) 02:18:19 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 
594.7/594.7 kB 37.7 MB/s eta 0:00:00 02:18:20 Collecting lxml (from pyangbind->-r requirements.txt (line 22)) 02:18:20 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/lxml/6.0.0/lxml-6.0.0-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (5.2 MB) 02:18:20 Collecting regex (from pyangbind->-r requirements.txt (line 22)) 02:18:20 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/regex/2024.11.6/regex-2024.11.6-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (792 kB) 02:18:20 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 792.7/792.7 kB 26.7 MB/s eta 0:00:00 02:18:20 Collecting enum34 (from pyangbind->-r requirements.txt (line 22)) 02:18:20 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/enum34/1.1.10/enum34-1.1.10-py3-none-any.whl (11 kB) 02:18:21 Collecting bcrypt>=3.2 (from paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 02:18:21 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/bcrypt/4.3.0/bcrypt-4.3.0-cp39-abi3-manylinux_2_28_x86_64.whl (284 kB) 02:18:21 Collecting cryptography>=3.3 (from paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 02:18:21 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/cryptography/45.0.5/cryptography-45.0.5-cp311-abi3-manylinux_2_28_x86_64.whl (4.5 MB) 02:18:21 Collecting pynacl>=1.5 (from paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 02:18:21 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/pynacl/1.5.0/PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl (856 kB) 02:18:21 Collecting cffi>=1.14 (from cryptography>=3.3->paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 02:18:21 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/cffi/1.17.1/cffi-1.17.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (467 kB) 02:18:21 Collecting pycparser (from cffi>=1.14->cryptography>=3.3->paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 02:18:21 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/pycparser/2.22/pycparser-2.22-py3-none-any.whl (117 kB) 02:18:21 Collecting selenium>=4.3.0 (from robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 02:18:21 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/selenium/4.34.2/selenium-4.34.2-py3-none-any.whl (9.4 MB) 02:18:22 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 9.4/9.4 MB 82.3 MB/s eta 0:00:00 02:18:22 Collecting robotframework-pythonlibcore>=4.4.1 (from robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 02:18:22 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-pythonlibcore/4.4.1/robotframework_pythonlibcore-4.4.1-py2.py3-none-any.whl (12 kB) 02:18:22 Collecting click>=8.0 (from robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 02:18:22 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/click/8.2.1/click-8.2.1-py3-none-any.whl (102 kB) 02:18:22 Collecting trio~=0.30.0 (from selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 02:18:22 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/trio/0.30.0/trio-0.30.0-py3-none-any.whl (499 
kB) 02:18:22 Collecting trio-websocket~=0.12.2 (from selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 02:18:22 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/trio-websocket/0.12.2/trio_websocket-0.12.2-py3-none-any.whl (21 kB) 02:18:22 Collecting attrs>=23.2.0 (from trio~=0.30.0->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 02:18:22 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/attrs/25.3.0/attrs-25.3.0-py3-none-any.whl (63 kB) 02:18:22 Collecting sortedcontainers (from trio~=0.30.0->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 02:18:22 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/sortedcontainers/2.4.0/sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB) 02:18:22 Collecting outcome (from trio~=0.30.0->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 02:18:22 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/outcome/1.3.0.post0/outcome-1.3.0.post0-py2.py3-none-any.whl (10 kB) 02:18:22 Collecting sniffio>=1.3.0 (from trio~=0.30.0->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 02:18:22 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/sniffio/1.3.1/sniffio-1.3.1-py3-none-any.whl (10 kB) 02:18:22 Collecting wsproto>=0.14 (from trio-websocket~=0.12.2->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 02:18:22 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/wsproto/1.2.0/wsproto-1.2.0-py3-none-any.whl (24 kB) 02:18:22 Collecting pysocks!=1.5.7,<2.0,>=1.5.6 (from urllib3[socks]~=2.5.0->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 02:18:22 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pysocks/1.7.1/PySocks-1.7.1-py3-none-any.whl (16 kB) 02:18:22 Collecting WebOb>=1.2 (from webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) 02:18:22 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/webob/1.8.9/WebOb-1.8.9-py2.py3-none-any.whl (115 kB) 02:18:23 Collecting waitress>=3.0.2 (from webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) 02:18:23 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/waitress/3.0.2/waitress-3.0.2-py3-none-any.whl (56 kB) 02:18:23 Collecting beautifulsoup4 (from webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) 02:18:23 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/beautifulsoup4/4.13.4/beautifulsoup4-4.13.4-py3-none-any.whl (187 kB) 02:18:23 Collecting h11<1,>=0.9.0 (from wsproto>=0.14->trio-websocket~=0.12.2->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 02:18:23 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/h11/0.16.0/h11-0.16.0-py3-none-any.whl (37 kB) 02:18:23 Collecting soupsieve>1.2 (from beautifulsoup4->webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) 02:18:23 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/soupsieve/2.7/soupsieve-2.7-py3-none-any.whl (36 kB) 
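Every download in this install resolves through OpenDaylight's Nexus PyPI proxy rather than pypi.org (the "Looking in indexes" line at the start of the pip run). A minimal way to reproduce the same install locally, assuming the proxy URL shown in the log and a requirements.txt in the current directory:

#!/bin/bash
# Install the Robot Framework test requirements through the Nexus PyPI mirror
# the job uses (index URL taken from the "Looking in indexes" line above).
python3 -m venv robot_venv && source robot_venv/bin/activate
python -m pip install -r requirements.txt \
    --index-url https://nexus3.opendaylight.org/repository/PyPi/simple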
02:18:23 Building wheels for collected packages: robotframework-sshlibrary, ipaddr, netifaces, robotframework-httplibrary, jsonpath-rw 02:18:23 DEPRECATION: Building 'robotframework-sshlibrary' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'robotframework-sshlibrary'. Discussion can be found at https://github.com/pypa/pip/issues/6334 02:18:23 Building wheel for robotframework-sshlibrary (setup.py): started 02:18:23 Building wheel for robotframework-sshlibrary (setup.py): finished with status 'done' 02:18:23 Created wheel for robotframework-sshlibrary: filename=robotframework_sshlibrary-3.8.0-py3-none-any.whl size=55205 sha256=6c89aaab9d2211089a945fe992a58ec7a86353fbb279e91d690e652c3c665cce 02:18:23 Stored in directory: /home/jenkins/.cache/pip/wheels/f7/c9/b3/a977b7bcc410d45ae27d240df3d00a12585509180e373ecccc 02:18:23 DEPRECATION: Building 'ipaddr' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'ipaddr'. Discussion can be found at https://github.com/pypa/pip/issues/6334 02:18:23 Building wheel for ipaddr (setup.py): started 02:18:23 Building wheel for ipaddr (setup.py): finished with status 'done' 02:18:23 Created wheel for ipaddr: filename=ipaddr-2.2.0-py3-none-any.whl size=18353 sha256=d37ee5f2fdc7a693a65b50e97d09b08b634938f45cc7b7e0606a88928fcec119 02:18:23 Stored in directory: /home/jenkins/.cache/pip/wheels/dc/6c/04/da2d847fa8d45c59af3e1d83e2acc29cb8adcbaf04c0898dbf 02:18:23 DEPRECATION: Building 'netifaces' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'netifaces'. Discussion can be found at https://github.com/pypa/pip/issues/6334 02:18:23 Building wheel for netifaces (setup.py): started 02:18:26 Building wheel for netifaces (setup.py): finished with status 'done' 02:18:26 Created wheel for netifaces: filename=netifaces-0.11.0-cp311-cp311-linux_x86_64.whl size=41078 sha256=7cd01bb9bf907d6680847f5ef8ee264dd0a1b3233e9a0ba64d8da832f86924f6 02:18:26 Stored in directory: /home/jenkins/.cache/pip/wheels/f8/18/88/e61d54b995bea304bdb1d040a92b72228a1bf72ca2a3eba7c9 02:18:26 DEPRECATION: Building 'robotframework-httplibrary' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'robotframework-httplibrary'. 
Discussion can be found at https://github.com/pypa/pip/issues/6334 02:18:26 Building wheel for robotframework-httplibrary (setup.py): started 02:18:26 Building wheel for robotframework-httplibrary (setup.py): finished with status 'done' 02:18:26 Created wheel for robotframework-httplibrary: filename=robotframework_httplibrary-0.4.2-py3-none-any.whl size=10014 sha256=2fe5f05b3270fa9ffecbecfc612d4e9d3749ab4b570354dc866355bcc003dd2b 02:18:26 Stored in directory: /home/jenkins/.cache/pip/wheels/aa/bc/0d/9a20dd51effef392aae2733cb4c7b66c6fa29fca33d88b57ed 02:18:26 DEPRECATION: Building 'jsonpath-rw' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'jsonpath-rw'. Discussion can be found at https://github.com/pypa/pip/issues/6334 02:18:26 Building wheel for jsonpath-rw (setup.py): started 02:18:26 Building wheel for jsonpath-rw (setup.py): finished with status 'done' 02:18:26 Created wheel for jsonpath-rw: filename=jsonpath_rw-1.4.0-py3-none-any.whl size=15176 sha256=38dffd53a730965d67871109d5feaef85bc4a5a5b307764c7256f75e978e7193 02:18:26 Stored in directory: /home/jenkins/.cache/pip/wheels/f1/54/63/9a8da38cefae13755097b36cc852decc25d8ef69c37d58d4eb 02:18:27 Successfully built robotframework-sshlibrary ipaddr netifaces robotframework-httplibrary jsonpath-rw 02:18:27 Installing collected packages: sortedcontainers, ply, netifaces, ipaddr, enum34, websocket-client, WebOb, waitress, urllib3, typing-extensions, soupsieve, sniffio, six, scapy, robotframework-pythonlibcore, robotframework, regex, pysocks, pyparsing, pycparser, netaddr, lxml, jsonpointer, jmespath, isodate, idna, h11, decorator, click, charset_normalizer, certifi, bcrypt, attrs, wsproto, requests, python-dateutil, pyhocon, pyang, outcome, jsonpath-rw, jsonpatch, elastic-transport, docker-pycreds, cffi, beautifulsoup4, webtest, trio, robotframework-requests, pynacl, pyangbind, elasticsearch, docker-py, cryptography, trio-websocket, robotframework-httplibrary, paramiko, elasticsearch-dsl, selenium, scp, robotframework-sshlibrary, robotframework-seleniumlibrary, robotframework-selenium2library 02:18:34 02:18:34 Successfully installed WebOb-1.8.9 attrs-25.3.0 bcrypt-4.3.0 beautifulsoup4-4.13.4 certifi-2025.7.14 cffi-1.17.1 charset_normalizer-3.4.2 click-8.2.1 cryptography-45.0.5 decorator-5.2.1 docker-py-1.10.6 docker-pycreds-0.4.0 elastic-transport-8.17.1 elasticsearch-8.18.1 elasticsearch-dsl-8.18.0 enum34-1.1.10 h11-0.16.0 idna-3.10 ipaddr-2.2.0 isodate-0.7.2 jmespath-1.0.1 jsonpatch-1.33 jsonpath-rw-1.4.0 jsonpointer-3.0.0 lxml-6.0.0 netaddr-1.3.0 netifaces-0.11.0 outcome-1.3.0.post0 paramiko-3.5.1 ply-3.11 pyang-2.6.1 pyangbind-0.8.6 pycparser-2.22 pyhocon-0.3.61 pynacl-1.5.0 pyparsing-3.2.3 pysocks-1.7.1 python-dateutil-2.9.0.post0 regex-2024.11.6 requests-2.32.4 robotframework-7.3.2 robotframework-httplibrary-0.4.2 robotframework-pythonlibcore-4.4.1 robotframework-requests-0.9.7 robotframework-selenium2library-3.0.0 robotframework-seleniumlibrary-6.7.1 robotframework-sshlibrary-3.8.0 scapy-2.6.1 scp-0.15.0 selenium-4.34.2 six-1.17.0 sniffio-1.3.1 sortedcontainers-2.4.0 soupsieve-2.7 trio-0.30.0 trio-websocket-0.12.2 typing-extensions-4.14.1 urllib3-2.5.0 waitress-3.0.2 websocket-client-1.8.0 webtest-3.0.6 wsproto-1.2.0 02:18:35 + pip freeze 
02:18:35 attrs==25.3.0 02:18:35 bcrypt==4.3.0 02:18:35 beautifulsoup4==4.13.4 02:18:35 certifi==2025.7.14 02:18:35 cffi==1.17.1 02:18:35 charset-normalizer==3.4.2 02:18:35 click==8.2.1 02:18:35 cryptography==45.0.5 02:18:35 decorator==5.2.1 02:18:35 distlib==0.4.0 02:18:35 docker-py==1.10.6 02:18:35 docker-pycreds==0.4.0 02:18:35 elastic-transport==8.17.1 02:18:35 elasticsearch==8.18.1 02:18:35 elasticsearch-dsl==8.18.0 02:18:35 enum34==1.1.10 02:18:35 filelock==3.18.0 02:18:35 h11==0.16.0 02:18:35 idna==3.10 02:18:35 ipaddr==2.2.0 02:18:35 isodate==0.7.2 02:18:35 jmespath==1.0.1 02:18:35 jsonpatch==1.33 02:18:35 jsonpath-rw==1.4.0 02:18:35 jsonpointer==3.0.0 02:18:35 lxml==6.0.0 02:18:35 netaddr==1.3.0 02:18:35 netifaces==0.11.0 02:18:35 outcome==1.3.0.post0 02:18:35 paramiko==3.5.1 02:18:35 platformdirs==4.3.8 02:18:35 ply==3.11 02:18:35 pyang==2.6.1 02:18:35 pyangbind==0.8.6 02:18:35 pycparser==2.22 02:18:35 pyhocon==0.3.61 02:18:35 PyNaCl==1.5.0 02:18:35 pyparsing==3.2.3 02:18:35 PySocks==1.7.1 02:18:35 python-dateutil==2.9.0.post0 02:18:35 regex==2024.11.6 02:18:35 requests==2.32.4 02:18:35 robotframework==7.3.2 02:18:35 robotframework-httplibrary==0.4.2 02:18:35 robotframework-pythonlibcore==4.4.1 02:18:35 robotframework-requests==0.9.7 02:18:35 robotframework-selenium2library==3.0.0 02:18:35 robotframework-seleniumlibrary==6.7.1 02:18:35 robotframework-sshlibrary==3.8.0 02:18:35 scapy==2.6.1 02:18:35 scp==0.15.0 02:18:35 selenium==4.34.2 02:18:35 six==1.17.0 02:18:35 sniffio==1.3.1 02:18:35 sortedcontainers==2.4.0 02:18:35 soupsieve==2.7 02:18:35 trio==0.30.0 02:18:35 trio-websocket==0.12.2 02:18:35 typing_extensions==4.14.1 02:18:35 urllib3==2.5.0 02:18:35 virtualenv==20.31.2 02:18:35 waitress==3.0.2 02:18:35 WebOb==1.8.9 02:18:35 websocket-client==1.8.0 02:18:35 WebTest==3.0.6 02:18:35 wsproto==1.2.0 02:18:35 [EnvInject] - Injecting environment variables from a build step. 02:18:35 [EnvInject] - Injecting as environment variables the properties file path 'env.properties' 02:18:35 [EnvInject] - Variables injected successfully. 02:18:35 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash -l /tmp/jenkins5797745009338596719.sh 02:18:35 Setup pyenv: 02:18:35 system 02:18:35 3.8.13 02:18:35 3.9.13 02:18:35 3.10.13 02:18:35 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 02:18:36 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-zBC1 from file:/tmp/.os_lf_venv 02:18:37 lf-activate-venv(): INFO: Installing: python-heatclient python-openstackclient yq 02:18:45 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 02:18:45 lftools 0.37.13 requires urllib3<2.1.0, but you have urllib3 2.5.0 which is incompatible. 02:18:45 lf-activate-venv(): INFO: Adding /tmp/venv-zBC1/bin to PATH 02:18:45 + ODL_SYSTEM=() 02:18:45 + TOOLS_SYSTEM=() 02:18:45 + OPENSTACK_SYSTEM=() 02:18:45 + OPENSTACK_CONTROLLERS=() 02:18:45 + mapfile -t ADDR 02:18:45 ++ jq -r '.outputs[] | select(.output_key | match("^vm_[0-9]+_ips$")) | .output_value | .[]' 02:18:45 ++ openstack stack show -f json -c outputs releng-openflowplugin-csit-3node-clustering-only-titanium-313 02:18:47 + for i in "${ADDR[@]}" 02:18:47 ++ ssh 10.30.170.189 hostname -s 02:18:47 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 
02:18:47 + REMHOST=releng-62057-313-0-builder-0 02:18:47 + case ${REMHOST} in 02:18:47 + ODL_SYSTEM=("${ODL_SYSTEM[@]}" "${i}") 02:18:47 + for i in "${ADDR[@]}" 02:18:48 ++ ssh 10.30.171.89 hostname -s 02:18:48 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:18:48 + REMHOST=releng-62057-313-0-builder-1 02:18:48 + case ${REMHOST} in 02:18:48 + ODL_SYSTEM=("${ODL_SYSTEM[@]}" "${i}") 02:18:48 + for i in "${ADDR[@]}" 02:18:48 ++ ssh 10.30.171.50 hostname -s 02:18:48 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:18:48 + REMHOST=releng-62057-313-0-builder-2 02:18:48 + case ${REMHOST} in 02:18:48 + ODL_SYSTEM=("${ODL_SYSTEM[@]}" "${i}") 02:18:48 + for i in "${ADDR[@]}" 02:18:48 ++ ssh 10.30.171.44 hostname -s 02:18:48 Warning: Permanently added '10.30.171.44' (ECDSA) to the list of known hosts. 02:18:49 + REMHOST=releng-62057-313-1-mininet-ovs-217-0 02:18:49 + case ${REMHOST} in 02:18:49 + TOOLS_SYSTEM=("${TOOLS_SYSTEM[@]}" "${i}") 02:18:49 + echo NUM_ODL_SYSTEM=3 02:18:49 + echo NUM_TOOLS_SYSTEM=1 02:18:49 + '[' '' == yes ']' 02:18:49 + NUM_OPENSTACK_SYSTEM=0 02:18:49 + echo NUM_OPENSTACK_SYSTEM=0 02:18:49 + '[' 0 -eq 2 ']' 02:18:49 + echo ODL_SYSTEM_IP=10.30.170.189 02:18:49 ++ seq 0 2 02:18:49 + for i in $(seq 0 $(( ${#ODL_SYSTEM[@]} - 1 ))) 02:18:49 + echo ODL_SYSTEM_1_IP=10.30.170.189 02:18:49 + for i in $(seq 0 $(( ${#ODL_SYSTEM[@]} - 1 ))) 02:18:49 + echo ODL_SYSTEM_2_IP=10.30.171.89 02:18:49 + for i in $(seq 0 $(( ${#ODL_SYSTEM[@]} - 1 ))) 02:18:49 + echo ODL_SYSTEM_3_IP=10.30.171.50 02:18:49 + echo TOOLS_SYSTEM_IP=10.30.171.44 02:18:49 ++ seq 0 0 02:18:49 + for i in $(seq 0 $(( ${#TOOLS_SYSTEM[@]} - 1 ))) 02:18:49 + echo TOOLS_SYSTEM_1_IP=10.30.171.44 02:18:49 + openstack_index=0 02:18:49 + NUM_OPENSTACK_CONTROL_NODES=1 02:18:49 + echo NUM_OPENSTACK_CONTROL_NODES=1 02:18:49 ++ seq 0 0 02:18:49 + for i in $(seq 0 $((NUM_OPENSTACK_CONTROL_NODES - 1))) 02:18:49 + echo OPENSTACK_CONTROL_NODE_1_IP= 02:18:49 + NUM_OPENSTACK_COMPUTE_NODES=-1 02:18:49 + echo NUM_OPENSTACK_COMPUTE_NODES=-1 02:18:49 + '[' -1 -ge 2 ']' 02:18:49 ++ seq 0 -2 02:18:49 + NUM_OPENSTACK_HAPROXY_NODES=0 02:18:49 + echo NUM_OPENSTACK_HAPROXY_NODES=0 02:18:49 ++ seq 0 -1 02:18:49 + echo 'Contents of slave_addresses.txt:' 02:18:49 Contents of slave_addresses.txt: 02:18:49 + cat slave_addresses.txt 02:18:49 NUM_ODL_SYSTEM=3 02:18:49 NUM_TOOLS_SYSTEM=1 02:18:49 NUM_OPENSTACK_SYSTEM=0 02:18:49 ODL_SYSTEM_IP=10.30.170.189 02:18:49 ODL_SYSTEM_1_IP=10.30.170.189 02:18:49 ODL_SYSTEM_2_IP=10.30.171.89 02:18:49 ODL_SYSTEM_3_IP=10.30.171.50 02:18:49 TOOLS_SYSTEM_IP=10.30.171.44 02:18:49 TOOLS_SYSTEM_1_IP=10.30.171.44 02:18:49 NUM_OPENSTACK_CONTROL_NODES=1 02:18:49 OPENSTACK_CONTROL_NODE_1_IP= 02:18:49 NUM_OPENSTACK_COMPUTE_NODES=-1 02:18:49 NUM_OPENSTACK_HAPROXY_NODES=0 02:18:49 [EnvInject] - Injecting environment variables from a build step. 02:18:49 [EnvInject] - Injecting as environment variables the properties file path 'slave_addresses.txt' 02:18:49 [EnvInject] - Variables injected successfully. 02:18:49 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/sh /tmp/jenkins3926195428128410226.sh 02:18:49 Preparing for JRE Version 21 02:18:49 Karaf artifact is karaf 02:18:49 Karaf project is integration 02:18:49 Java home is /usr/lib/jvm/java-21-openjdk-amd64 02:18:49 [EnvInject] - Injecting environment variables from a build step. 
02:18:49 [EnvInject] - Injecting as environment variables the properties file path 'set_variables.env' 02:18:49 [EnvInject] - Variables injected successfully. 02:18:49 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins1741286717800299608.sh 02:18:49 Distribution bundle URL is https://nexus.opendaylight.org/content/repositories//autorelease-9021/org/opendaylight/integration/karaf/0.22.0/karaf-0.22.0.zip 02:18:49 Distribution bundle is karaf-0.22.0.zip 02:18:49 Distribution bundle version is 0.22.0 02:18:49 Distribution folder is karaf-0.22.0 02:18:49 Nexus prefix is https://nexus.opendaylight.org 02:18:49 [EnvInject] - Injecting environment variables from a build step. 02:18:49 [EnvInject] - Injecting as environment variables the properties file path 'detect_variables.env' 02:18:49 [EnvInject] - Variables injected successfully. 02:18:49 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash -l /tmp/jenkins15407900888449714866.sh 02:18:49 Setup pyenv: 02:18:49 system 02:18:49 3.8.13 02:18:49 3.9.13 02:18:49 3.10.13 02:18:49 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 02:18:49 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-zBC1 from file:/tmp/.os_lf_venv 02:18:51 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 02:18:51 lftools 0.37.13 requires urllib3<2.1.0, but you have urllib3 2.5.0 which is incompatible. 02:18:51 lf-activate-venv(): INFO: Installing: python-heatclient python-openstackclient 02:18:57 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 02:18:57 lftools 0.37.13 requires urllib3<2.1.0, but you have urllib3 2.5.0 which is incompatible. 02:18:57 lf-activate-venv(): INFO: Adding /tmp/venv-zBC1/bin to PATH 02:18:57 Copying common-functions.sh to /tmp 02:18:59 Copying common-functions.sh to 10.30.170.189:/tmp 02:18:59 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:18:59 Copying common-functions.sh to 10.30.171.89:/tmp 02:18:59 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:19:00 Copying common-functions.sh to 10.30.171.50:/tmp 02:19:00 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:19:00 Copying common-functions.sh to 10.30.171.44:/tmp 02:19:00 Warning: Permanently added '10.30.171.44' (ECDSA) to the list of known hosts. 
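The four "Copying common-functions.sh to <ip>:/tmp" lines above distribute the shared helper script to every stack node before cluster configuration starts. A compact sketch of that distribution step, using the node addresses from slave_addresses.txt; the scp options are illustrative, not necessarily the job's exact flags:

#!/bin/bash
# Push the shared helper script to /tmp locally and on every ODL and tools node,
# as in the "Copying common-functions.sh to ..." lines above.
cp common-functions.sh /tmp
for ip in "$ODL_SYSTEM_1_IP" "$ODL_SYSTEM_2_IP" "$ODL_SYSTEM_3_IP" "$TOOLS_SYSTEM_1_IP"; do
    echo "Copying common-functions.sh to ${ip}:/tmp"
    scp -o StrictHostKeyChecking=no common-functions.sh "${ip}:/tmp"
done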
02:19:01 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins9559725457317136317.sh 02:19:01 common-functions.sh is being sourced 02:19:01 common-functions environment: 02:19:01 MAVENCONF: /tmp/karaf-0.22.0/etc/org.ops4j.pax.url.mvn.cfg 02:19:01 ACTUALFEATURES: 02:19:01 FEATURESCONF: /tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:01 CUSTOMPROP: /tmp/karaf-0.22.0/etc/custom.properties 02:19:01 LOGCONF: /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:01 MEMCONF: /tmp/karaf-0.22.0/bin/setenv 02:19:01 CONTROLLERMEM: 2048m 02:19:01 AKKACONF: /tmp/karaf-0.22.0/configuration/initial/pekko.conf 02:19:01 MODULESCONF: /tmp/karaf-0.22.0/configuration/initial/modules.conf 02:19:01 MODULESHARDSCONF: /tmp/karaf-0.22.0/configuration/initial/module-shards.conf 02:19:01 SUITES: 02:19:01 02:19:01 ################################################# 02:19:01 ## Configure Cluster and Start ## 02:19:01 ################################################# 02:19:01 ACTUALFEATURES: odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer 02:19:01 SPACE_SEPARATED_FEATURES: odl-infrautils-ready odl-jolokia odl-openflowplugin-flow-services-rest odl-openflowplugin-app-table-miss-enforcer 02:19:01 Locating script plan to use... 02:19:01 Finished running script plans 02:19:01 Configuring member-1 with IP address 10.30.170.189 02:19:01 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:19:01 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:19:01 + source /tmp/common-functions.sh karaf-0.22.0 02:19:01 common-functions.sh is being sourced 02:19:01 ++ [[ /tmp/common-functions.sh == \/\t\m\p\/\c\o\n\f\i\g\u\r\a\t\i\o\n\-\s\c\r\i\p\t\.\s\h ]] 02:19:01 ++ echo 'common-functions.sh is being sourced' 02:19:01 ++ BUNDLEFOLDER=karaf-0.22.0 02:19:01 ++ export MAVENCONF=/tmp/karaf-0.22.0/etc/org.ops4j.pax.url.mvn.cfg 02:19:01 ++ MAVENCONF=/tmp/karaf-0.22.0/etc/org.ops4j.pax.url.mvn.cfg 02:19:01 ++ export FEATURESCONF=/tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:01 ++ FEATURESCONF=/tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:01 ++ export CUSTOMPROP=/tmp/karaf-0.22.0/etc/custom.properties 02:19:01 ++ CUSTOMPROP=/tmp/karaf-0.22.0/etc/custom.properties 02:19:01 ++ export LOGCONF=/tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:01 ++ LOGCONF=/tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:01 ++ export MEMCONF=/tmp/karaf-0.22.0/bin/setenv 02:19:01 ++ MEMCONF=/tmp/karaf-0.22.0/bin/setenv 02:19:01 ++ export CONTROLLERMEM= 02:19:01 ++ CONTROLLERMEM= 02:19:01 ++ '[' '' = calcium ']' 02:19:01 ++ CLUSTER_SYSTEM=pekko 02:19:01 ++ export AKKACONF=/tmp/karaf-0.22.0/configuration/initial/pekko.conf 02:19:01 ++ AKKACONF=/tmp/karaf-0.22.0/configuration/initial/pekko.conf 02:19:01 ++ export MODULESCONF=/tmp/karaf-0.22.0/configuration/initial/modules.conf 02:19:01 ++ MODULESCONF=/tmp/karaf-0.22.0/configuration/initial/modules.conf 02:19:01 ++ export MODULESHARDSCONF=/tmp/karaf-0.22.0/configuration/initial/module-shards.conf 02:19:01 ++ MODULESHARDSCONF=/tmp/karaf-0.22.0/configuration/initial/module-shards.conf 02:19:01 ++ print_common_env 02:19:01 ++ cat 02:19:01 common-functions environment: 02:19:01 MAVENCONF: /tmp/karaf-0.22.0/etc/org.ops4j.pax.url.mvn.cfg 02:19:01 ACTUALFEATURES: 02:19:01 FEATURESCONF: /tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:01 CUSTOMPROP: /tmp/karaf-0.22.0/etc/custom.properties 02:19:01 LOGCONF: 
/tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:01 MEMCONF: /tmp/karaf-0.22.0/bin/setenv 02:19:01 CONTROLLERMEM: 02:19:01 AKKACONF: /tmp/karaf-0.22.0/configuration/initial/pekko.conf 02:19:01 MODULESCONF: /tmp/karaf-0.22.0/configuration/initial/modules.conf 02:19:01 MODULESHARDSCONF: /tmp/karaf-0.22.0/configuration/initial/module-shards.conf 02:19:01 SUITES: 02:19:01 02:19:01 ++ SSH='ssh -t -t' 02:19:01 ++ extra_services_cntl=' dnsmasq.service httpd.service libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service rabbitmq-server.service ' 02:19:01 ++ extra_services_cmp=' libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service ' 02:19:01 Changing to /tmp 02:19:01 Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9021/org/opendaylight/integration/karaf/0.22.0/karaf-0.22.0.zip 02:19:01 + echo 'Changing to /tmp' 02:19:01 + cd /tmp 02:19:01 + echo 'Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9021/org/opendaylight/integration/karaf/0.22.0/karaf-0.22.0.zip' 02:19:01 + wget --progress=dot:mega https://nexus.opendaylight.org/content/repositories//autorelease-9021/org/opendaylight/integration/karaf/0.22.0/karaf-0.22.0.zip 02:19:01 --2025-07-18 02:19:01-- https://nexus.opendaylight.org/content/repositories//autorelease-9021/org/opendaylight/integration/karaf/0.22.0/karaf-0.22.0.zip 02:19:01 Resolving nexus.opendaylight.org (nexus.opendaylight.org)... 199.204.45.87, 2604:e100:1:0:f816:3eff:fe45:48d6 02:19:01 Connecting to nexus.opendaylight.org (nexus.opendaylight.org)|199.204.45.87|:443... connected. 02:19:01 HTTP request sent, awaiting response... 200 OK 02:19:01 Length: 187527106 (179M) [application/zip] 02:19:01 Saving to: ‘karaf-0.22.0.zip’ 02:19:01 02:19:01 0K ........ ........ ........ ........ ........ ........ 1% 56.3M 3s 02:19:01 3072K ........ ........ ........ ........ ........ ........ 3% 98.0M 2s 02:19:01 6144K ........ ........ ........ ........ ........ ........ 5% 147M 2s 02:19:01 9216K ........ ........ ........ ........ ........ ........ 6% 156M 2s 02:19:01 12288K ........ ........ ........ ........ ........ ........ 8% 176M 2s 02:19:01 15360K ........ ........ ........ ........ ........ ........ 10% 209M 1s 02:19:01 18432K ........ ........ ........ ........ ........ ........ 11% 181M 1s 02:19:01 21504K ........ ........ ........ ........ ........ ........ 13% 209M 1s 02:19:01 24576K ........ ........ ........ ........ ........ ........ 15% 203M 1s 02:19:01 27648K ........ ........ ........ ........ ........ ........ 16% 210M 1s 02:19:01 30720K ........ ........ ........ ........ ........ ........ 18% 189M 1s 02:19:01 33792K ........ ........ ........ ........ ........ ........ 20% 204M 1s 02:19:01 36864K ........ ........ ........ ........ ........ ........ 21% 209M 1s 02:19:01 39936K ........ ........ ........ ........ ........ ........ 23% 166M 1s 02:19:01 43008K ........ ........ ........ ........ ........ ........ 25% 237M 1s 02:19:02 46080K ........ ........ ........ ........ ........ ........ 26% 199M 1s 02:19:02 49152K ........ ........ ........ ........ ........ ........ 28% 224M 1s 02:19:02 52224K ........ ........ ........ ........ ........ ........ 30% 228M 1s 02:19:02 55296K ........ ........ ........ ........ ........ ........ 31% 216M 1s 02:19:02 58368K ........ ........ ........ ........ ........ ........ 33% 220M 1s 02:19:02 61440K ........ ........ ........ ........ ........ ........ 
35% 139M 1s 02:19:02 64512K ........ ........ ........ ........ ........ ........ 36% 208M 1s 02:19:02 67584K ........ ........ ........ ........ ........ ........ 38% 194M 1s 02:19:02 70656K ........ ........ ........ ........ ........ ........ 40% 216M 1s 02:19:02 73728K ........ ........ ........ ........ ........ ........ 41% 227M 1s 02:19:02 76800K ........ ........ ........ ........ ........ ........ 43% 227M 1s 02:19:02 79872K ........ ........ ........ ........ ........ ........ 45% 221M 1s 02:19:02 82944K ........ ........ ........ ........ ........ ........ 46% 249M 1s 02:19:02 86016K ........ ........ ........ ........ ........ ........ 48% 171M 1s 02:19:02 89088K ........ ........ ........ ........ ........ ........ 50% 138M 1s 02:19:02 92160K ........ ........ ........ ........ ........ ........ 52% 116M 1s 02:19:02 95232K ........ ........ ........ ........ ........ ........ 53% 188M 0s 02:19:02 98304K ........ ........ ........ ........ ........ ........ 55% 181M 0s 02:19:02 101376K ........ ........ ........ ........ ........ ........ 57% 137M 0s 02:19:02 104448K ........ ........ ........ ........ ........ ........ 58% 132M 0s 02:19:02 107520K ........ ........ ........ ........ ........ ........ 60% 113M 0s 02:19:02 110592K ........ ........ ........ ........ ........ ........ 62% 139M 0s 02:19:02 113664K ........ ........ ........ ........ ........ ........ 63% 212M 0s 02:19:02 116736K ........ ........ ........ ........ ........ ........ 65% 151M 0s 02:19:02 119808K ........ ........ ........ ........ ........ ........ 67% 223M 0s 02:19:02 122880K ........ ........ ........ ........ ........ ........ 68% 208M 0s 02:19:02 125952K ........ ........ ........ ........ ........ ........ 70% 162M 0s 02:19:02 129024K ........ ........ ........ ........ ........ ........ 72% 128M 0s 02:19:02 132096K ........ ........ ........ ........ ........ ........ 73% 125M 0s 02:19:02 135168K ........ ........ ........ ........ ........ ........ 75% 139M 0s 02:19:02 138240K ........ ........ ........ ........ ........ ........ 77% 112M 0s 02:19:02 141312K ........ ........ ........ ........ ........ ........ 78% 113M 0s 02:19:02 144384K ........ ........ ........ ........ ........ ........ 80% 205M 0s 02:19:02 147456K ........ ........ ........ ........ ........ ........ 82% 209M 0s 02:19:02 150528K ........ ........ ........ ........ ........ ........ 83% 173M 0s 02:19:02 153600K ........ ........ ........ ........ ........ ........ 85% 168M 0s 02:19:02 156672K ........ ........ ........ ........ ........ ........ 87% 156M 0s 02:19:02 159744K ........ ........ ........ ........ ........ ........ 88% 220M 0s 02:19:02 162816K ........ ........ ........ ........ ........ ........ 90% 233M 0s 02:19:02 165888K ........ ........ ........ ........ ........ ........ 92% 244M 0s 02:19:02 168960K ........ ........ ........ ........ ........ ........ 93% 199M 0s 02:19:02 172032K ........ ........ ........ ........ ........ ........ 95% 213M 0s 02:19:02 175104K ........ ........ ........ ........ ........ ........ 97% 188M 0s 02:19:02 178176K ........ ........ ........ ........ ........ ........ 98% 201M 0s 02:19:02 181248K ........ ........ ........ ..... 100% 196M=1.1s 02:19:02 02:19:02 2025-07-18 02:19:02 (168 MB/s) - ‘karaf-0.22.0.zip’ saved [187527106/187527106] 02:19:02 02:19:02 Extracting the new controller... 02:19:02 + echo 'Extracting the new controller...' 02:19:02 + unzip -q karaf-0.22.0.zip 02:19:04 Adding external repositories... 02:19:04 + echo 'Adding external repositories...' 
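The "Adding external repositories..." step that follows is a single long sed substitution seeding org.ops4j.pax.url.mvn.repositories with the OpenDaylight Nexus mirrors plus the stock upstream repositories. The same edit is easier to follow when the repository list is assembled first; here is a sketch of an equivalent form, with only the first entries spelled out (the full list appears in the traced command below, and '%' is used as the sed delimiter because the URLs contain '/').

# Sketch only: readable equivalent of the repository substitution traced below.
MAVENCONF=/tmp/karaf-0.22.0/etc/org.ops4j.pax.url.mvn.cfg
repo_list=(
    'https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots'
    'https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror'
    'http://repo1.maven.org/maven2@id=central'
    # remaining entries exactly as in the traced sed command below
)
repos=$(printf '%s, ' "${repo_list[@]}")     # join with ", " as the cfg expects
repos=${repos%, }                            # drop the trailing separator
sed -ie "s%org.ops4j.pax.url.mvn.repositories=%org.ops4j.pax.url.mvn.repositories=${repos}%g" "$MAVENCONF"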
02:19:04 + sed -ie 's%org.ops4j.pax.url.mvn.repositories=%org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases%g' /tmp/karaf-0.22.0/etc/org.ops4j.pax.url.mvn.cfg 02:19:04 + cat /tmp/karaf-0.22.0/etc/org.ops4j.pax.url.mvn.cfg 02:19:04 ################################################################################ 02:19:04 # 02:19:04 # Licensed to the Apache Software Foundation (ASF) under one or more 02:19:04 # contributor license agreements. See the NOTICE file distributed with 02:19:04 # this work for additional information regarding copyright ownership. 02:19:04 # The ASF licenses this file to You under the Apache License, Version 2.0 02:19:04 # (the "License"); you may not use this file except in compliance with 02:19:04 # the License. You may obtain a copy of the License at 02:19:04 # 02:19:04 # http://www.apache.org/licenses/LICENSE-2.0 02:19:04 # 02:19:04 # Unless required by applicable law or agreed to in writing, software 02:19:04 # distributed under the License is distributed on an "AS IS" BASIS, 02:19:04 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 02:19:04 # See the License for the specific language governing permissions and 02:19:04 # limitations under the License. 02:19:04 # 02:19:04 ################################################################################ 02:19:04 02:19:04 # 02:19:04 # If set to true, the following property will not allow any certificate to be used 02:19:04 # when accessing Maven repositories through SSL 02:19:04 # 02:19:04 #org.ops4j.pax.url.mvn.certificateCheck= 02:19:04 02:19:04 # 02:19:04 # Path to the local Maven settings file. 02:19:04 # The repositories defined in this file will be automatically added to the list 02:19:04 # of default repositories if the 'org.ops4j.pax.url.mvn.repositories' property 02:19:04 # below is not set. 02:19:04 # The following locations are checked for the existence of the settings.xml file 02:19:04 # * 1. looks for the specified url 02:19:04 # * 2. if not found looks for ${user.home}/.m2/settings.xml 02:19:04 # * 3. if not found looks for ${maven.home}/conf/settings.xml 02:19:04 # * 4. if not found looks for ${M2_HOME}/conf/settings.xml 02:19:04 # 02:19:04 #org.ops4j.pax.url.mvn.settings= 02:19:04 02:19:04 # 02:19:04 # Path to the local Maven repository which is used to avoid downloading 02:19:04 # artifacts when they already exist locally. 02:19:04 # The value of this property will be extracted from the settings.xml file 02:19:04 # above, or defaulted to: 02:19:04 # System.getProperty( "user.home" ) + "/.m2/repository" 02:19:04 # 02:19:04 org.ops4j.pax.url.mvn.localRepository=${karaf.home}/${karaf.default.repository} 02:19:04 02:19:04 # 02:19:04 # Default this to false. 
It's just weird to use undocumented repos 02:19:04 # 02:19:04 org.ops4j.pax.url.mvn.useFallbackRepositories=false 02:19:04 02:19:04 # 02:19:04 # Uncomment if you don't wanna use the proxy settings 02:19:04 # from the Maven conf/settings.xml file 02:19:04 # 02:19:04 # org.ops4j.pax.url.mvn.proxySupport=false 02:19:04 02:19:04 # 02:19:04 # Comma separated list of repositories scanned when resolving an artifact. 02:19:04 # Those repositories will be checked before iterating through the 02:19:04 # below list of repositories and even before the local repository 02:19:04 # A repository url can be appended with zero or more of the following flags: 02:19:04 # @snapshots : the repository contains snaphots 02:19:04 # @noreleases : the repository does not contain any released artifacts 02:19:04 # 02:19:04 # The following property value will add the system folder as a repo. 02:19:04 # 02:19:04 org.ops4j.pax.url.mvn.defaultRepositories=\ 02:19:04 file:${karaf.home}/${karaf.default.repository}@id=system.repository@snapshots,\ 02:19:04 file:${karaf.data}/kar@id=kar.repository@multi@snapshots,\ 02:19:04 file:${karaf.base}/${karaf.default.repository}@id=child.system.repository@snapshots 02:19:04 02:19:04 # Use the default local repo (e.g.~/.m2/repository) as a "remote" repo 02:19:04 #org.ops4j.pax.url.mvn.defaultLocalRepoAsRemote=false 02:19:04 02:19:04 # 02:19:04 # Comma separated list of repositories scanned when resolving an artifact. 02:19:04 # The default list includes the following repositories: 02:19:04 # http://repo1.maven.org/maven2@id=central 02:19:04 # http://repository.springsource.com/maven/bundles/release@id=spring.ebr 02:19:04 # http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external 02:19:04 # http://zodiac.springsource.com/maven/bundles/release@id=gemini 02:19:04 # http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases 02:19:04 # https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases 02:19:04 # https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 02:19:04 # To add repositories to the default ones, prepend '+' to the list of repositories 02:19:04 # to add. 02:19:04 # A repository url can be appended with zero or more of the following flags: 02:19:04 # @snapshots : the repository contains snapshots 02:19:04 # @noreleases : the repository does not contain any released artifacts 02:19:04 # @id=repository.id : the id for the repository, just like in the settings.xml this is optional but recommended 02:19:04 # 02:19:04 org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 02:19:04 02:19:04 ### ^^^ No remote repositories. 
This is the only ODL change compared to Karaf defaults.Configuring the startup features... 02:19:04 + [[ True == \T\r\u\e ]] 02:19:04 + echo 'Configuring the startup features...' 02:19:04 + sed -ie 's/\(featuresBoot=\|featuresBoot =\)/featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer,/g' /tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:04 + FEATURE_TEST_STRING=features-test 02:19:04 + FEATURE_TEST_VERSION=0.22.0 02:19:04 + KARAF_VERSION=karaf4 02:19:04 + [[ integration == \i\n\t\e\g\r\a\t\i\o\n ]] 02:19:04 + sed -ie 's%\(featuresRepositories=\|featuresRepositories =\)%featuresRepositories = mvn:org.opendaylight.integration/features-test/0.22.0/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features,%g' /tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:04 ################################################################################ 02:19:04 # 02:19:04 # Licensed to the Apache Software Foundation (ASF) under one or more 02:19:04 # contributor license agreements. See the NOTICE file distributed with 02:19:04 # this work for additional information regarding copyright ownership. 02:19:04 # The ASF licenses this file to You under the Apache License, Version 2.0 02:19:04 # (the "License"); you may not use this file except in compliance with 02:19:04 # the License. You may obtain a copy of the License at 02:19:04 # 02:19:04 # http://www.apache.org/licenses/LICENSE-2.0 02:19:04 # 02:19:04 # Unless required by applicable law or agreed to in writing, software 02:19:04 # distributed under the License is distributed on an "AS IS" BASIS, 02:19:04 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 02:19:04 # See the License for the specific language governing permissions and 02:19:04 # limitations under the License. 02:19:04 # 02:19:04 ################################################################################ 02:19:04 02:19:04 # 02:19:04 # Comma separated list of features repositories to register by default 02:19:04 # 02:19:04 featuresRepositories = mvn:org.opendaylight.integration/features-test/0.22.0/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features, file:${karaf.etc}/7dcf83ed-3930-47c1-8604-2be667442bd7.xml 02:19:04 02:19:04 # 02:19:04 # Comma separated list of features to install at startup 02:19:04 # 02:19:04 featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer, 6529f430-fdfc-4281-a08e-c1b54a0a72d5 02:19:04 02:19:04 # 02:19:04 # Resource repositories (OBR) that the features resolver can use 02:19:04 # to resolve requirements/capabilities 02:19:04 # 02:19:04 # The format of the resourceRepositories is 02:19:04 # resourceRepositories=[xml:url|json:url],... 02:19:04 # for Instance: 02:19:04 # 02:19:04 #resourceRepositories=xml:http://host/path/to/index.xml 02:19:04 # or 02:19:04 #resourceRepositories=json:http://host/path/to/index.json 02:19:04 # 02:19:04 02:19:04 # 02:19:04 # Defines if the boot features are started in asynchronous mode (in a dedicated thread) 02:19:04 # 02:19:04 featuresBootAsynchronous=false 02:19:04 02:19:04 # 02:19:04 # Service requirements enforcement 02:19:04 # 02:19:04 # By default, the feature resolver checks the service requirements/capabilities of 02:19:04 # bundles for new features (xml schema >= 1.3.0) in order to automatically installs 02:19:04 # the required bundles. 
02:19:04 # The following flag can have those values: 02:19:04 # - disable: service requirements are completely ignored 02:19:04 # - default: service requirements are ignored for old features 02:19:04 # - enforce: service requirements are always verified 02:19:04 # 02:19:04 #serviceRequirements=default 02:19:04 02:19:04 # 02:19:04 # Store cfg file for config element in feature 02:19:04 # 02:19:04 #configCfgStore=true 02:19:04 02:19:04 # 02:19:04 # Define if the feature service automatically refresh bundles 02:19:04 # 02:19:04 autoRefresh=true 02:19:04 02:19:04 # 02:19:04 # Configuration of features processing mechanism (overrides, blacklisting, modification of features) 02:19:04 # XML file defines instructions related to features processing 02:19:04 # versions.properties may declare properties to resolve placeholders in XML file 02:19:04 # both files are relative to ${karaf.etc} 02:19:04 # 02:19:04 #featureProcessing=org.apache.karaf.features.xml 02:19:04 #featureProcessingVersions=versions.properties 02:19:04 + [[ ! -z '' ]] 02:19:04 + cat /tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:04 + configure_karaf_log karaf4 '' 02:19:04 + local -r karaf_version=karaf4 02:19:04 + local -r controllerdebugmap= 02:19:04 + local logapi=log4j 02:19:04 + grep log4j2 /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:04 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 02:19:04 log4j2.rootLogger.level = INFO 02:19:04 #log4j2.rootLogger.type = asyncRoot 02:19:04 #log4j2.rootLogger.includeLocation = false 02:19:04 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 02:19:04 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 02:19:04 log4j2.rootLogger.appenderRef.Console.ref = Console 02:19:04 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 02:19:04 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 02:19:04 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 02:19:04 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 02:19:04 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 02:19:04 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 02:19:04 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 02:19:04 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 02:19:04 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 02:19:04 log4j2.logger.spifly.name = org.apache.aries.spifly 02:19:04 log4j2.logger.spifly.level = WARN 02:19:04 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 02:19:04 log4j2.logger.audit.level = INFO 02:19:04 log4j2.logger.audit.additivity = false 02:19:04 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 02:19:04 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 02:19:04 log4j2.appender.console.type = Console 02:19:04 log4j2.appender.console.name = Console 02:19:04 log4j2.appender.console.layout.type = PatternLayout 02:19:04 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 02:19:04 log4j2.appender.rolling.type = RollingRandomAccessFile 02:19:04 log4j2.appender.rolling.name = RollingFile 02:19:04 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 02:19:04 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 02:19:04 
#log4j2.appender.rolling.immediateFlush = false 02:19:04 log4j2.appender.rolling.append = true 02:19:04 log4j2.appender.rolling.layout.type = PatternLayout 02:19:04 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 02:19:04 log4j2.appender.rolling.policies.type = Policies 02:19:04 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 02:19:04 log4j2.appender.rolling.policies.size.size = 64MB 02:19:04 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 02:19:04 log4j2.appender.rolling.strategy.max = 7 02:19:04 log4j2.appender.audit.type = RollingRandomAccessFile 02:19:04 log4j2.appender.audit.name = AuditRollingFile 02:19:04 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 02:19:04 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 02:19:04 log4j2.appender.audit.append = true 02:19:04 log4j2.appender.audit.layout.type = PatternLayout 02:19:04 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 02:19:04 log4j2.appender.audit.policies.type = Policies 02:19:04 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 02:19:04 log4j2.appender.audit.policies.size.size = 8MB 02:19:04 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 02:19:04 log4j2.appender.audit.strategy.max = 7 02:19:04 log4j2.appender.osgi.type = PaxOsgi 02:19:04 log4j2.appender.osgi.name = PaxOsgi 02:19:04 log4j2.appender.osgi.filter = * 02:19:04 #log4j2.logger.aether.name = shaded.org.eclipse.aether 02:19:04 #log4j2.logger.aether.level = TRACE 02:19:04 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 02:19:04 #log4j2.logger.http-headers.level = DEBUG 02:19:04 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 02:19:04 #log4j2.logger.maven.level = TRACE 02:19:04 + logapi=log4j2 02:19:04 Configuring the karaf log... karaf_version: karaf4, logapi: log4j2 02:19:04 + echo 'Configuring the karaf log... karaf_version: karaf4, logapi: log4j2' 02:19:04 + '[' log4j2 == log4j2 ']' 02:19:04 + sed -ie 's/log4j2.appender.rolling.policies.size.size = 64MB/log4j2.appender.rolling.policies.size.size = 1GB/g' /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:04 + orgmodule=org.opendaylight.yangtools.yang.parser.repo.YangTextSchemaContextResolver 02:19:04 + orgmodule_=org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver 02:19:04 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN' 02:19:04 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN' 02:19:04 controllerdebugmap: 02:19:04 cat /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:04 + unset IFS 02:19:04 + echo 'controllerdebugmap: ' 02:19:04 + '[' -n '' ']' 02:19:04 + echo 'cat /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg' 02:19:04 + cat /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:04 ################################################################################ 02:19:04 # 02:19:04 # Licensed to the Apache Software Foundation (ASF) under one or more 02:19:04 # contributor license agreements. See the NOTICE file distributed with 02:19:04 # this work for additional information regarding copyright ownership. 02:19:04 # The ASF licenses this file to You under the Apache License, Version 2.0 02:19:04 # (the "License"); you may not use this file except in compliance with 02:19:04 # the License. 
You may obtain a copy of the License at 02:19:04 # 02:19:04 # http://www.apache.org/licenses/LICENSE-2.0 02:19:04 # 02:19:04 # Unless required by applicable law or agreed to in writing, software 02:19:04 # distributed under the License is distributed on an "AS IS" BASIS, 02:19:04 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 02:19:04 # See the License for the specific language governing permissions and 02:19:04 # limitations under the License. 02:19:04 # 02:19:04 ################################################################################ 02:19:04 02:19:04 # Common pattern layout for appenders 02:19:04 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 02:19:04 02:19:04 # Root logger 02:19:04 log4j2.rootLogger.level = INFO 02:19:04 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 02:19:04 #log4j2.rootLogger.type = asyncRoot 02:19:04 #log4j2.rootLogger.includeLocation = false 02:19:04 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 02:19:04 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 02:19:04 log4j2.rootLogger.appenderRef.Console.ref = Console 02:19:04 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 02:19:04 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 02:19:04 02:19:04 # Filters for logs marked by org.opendaylight.odlparent.Markers 02:19:04 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 02:19:04 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 02:19:04 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 02:19:04 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 02:19:04 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 02:19:04 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 02:19:04 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 02:19:04 02:19:04 # Loggers configuration 02:19:04 02:19:04 # Spifly logger 02:19:04 log4j2.logger.spifly.name = org.apache.aries.spifly 02:19:04 log4j2.logger.spifly.level = WARN 02:19:04 02:19:04 # Security audit logger 02:19:04 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 02:19:04 log4j2.logger.audit.level = INFO 02:19:04 log4j2.logger.audit.additivity = false 02:19:04 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 02:19:04 02:19:04 # Appenders configuration 02:19:04 02:19:04 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 02:19:04 log4j2.appender.console.type = Console 02:19:04 log4j2.appender.console.name = Console 02:19:04 log4j2.appender.console.layout.type = PatternLayout 02:19:04 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 02:19:04 02:19:04 # Rolling file appender 02:19:04 log4j2.appender.rolling.type = RollingRandomAccessFile 02:19:04 log4j2.appender.rolling.name = RollingFile 02:19:04 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 02:19:04 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 02:19:04 # uncomment to not force a disk flush 02:19:04 #log4j2.appender.rolling.immediateFlush = false 02:19:04 log4j2.appender.rolling.append = true 02:19:04 log4j2.appender.rolling.layout.type = PatternLayout 02:19:04 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 
02:19:04 log4j2.appender.rolling.policies.type = Policies 02:19:04 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 02:19:04 log4j2.appender.rolling.policies.size.size = 1GB 02:19:04 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 02:19:04 log4j2.appender.rolling.strategy.max = 7 02:19:04 02:19:04 # Audit file appender 02:19:04 log4j2.appender.audit.type = RollingRandomAccessFile 02:19:04 log4j2.appender.audit.name = AuditRollingFile 02:19:04 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 02:19:04 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 02:19:04 log4j2.appender.audit.append = true 02:19:04 log4j2.appender.audit.layout.type = PatternLayout 02:19:04 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 02:19:04 log4j2.appender.audit.policies.type = Policies 02:19:04 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 02:19:04 log4j2.appender.audit.policies.size.size = 8MB 02:19:04 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 02:19:04 log4j2.appender.audit.strategy.max = 7 02:19:04 02:19:04 # OSGi appender 02:19:04 log4j2.appender.osgi.type = PaxOsgi 02:19:04 log4j2.appender.osgi.name = PaxOsgi 02:19:04 log4j2.appender.osgi.filter = * 02:19:04 02:19:04 # help with identification of maven-related problems with pax-url-aether 02:19:04 #log4j2.logger.aether.name = shaded.org.eclipse.aether 02:19:04 #log4j2.logger.aether.level = TRACE 02:19:04 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 02:19:04 #log4j2.logger.http-headers.level = DEBUG 02:19:04 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 02:19:04 #log4j2.logger.maven.level = TRACE 02:19:04 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 02:19:04 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 02:19:04 + set_java_vars /usr/lib/jvm/java-21-openjdk-amd64 2048m /tmp/karaf-0.22.0/bin/setenv 02:19:04 + local -r java_home=/usr/lib/jvm/java-21-openjdk-amd64 02:19:04 + local -r controllermem=2048m 02:19:04 Configure 02:19:04 java home: /usr/lib/jvm/java-21-openjdk-amd64 02:19:04 + local -r memconf=/tmp/karaf-0.22.0/bin/setenv 02:19:04 + echo Configure 02:19:04 + echo ' java home: /usr/lib/jvm/java-21-openjdk-amd64' 02:19:04 + echo ' max memory: 2048m' 02:19:04 max memory: 2048m 02:19:04 memconf: /tmp/karaf-0.22.0/bin/setenv 02:19:04 + echo ' memconf: /tmp/karaf-0.22.0/bin/setenv' 02:19:04 + sed -ie 's%^# export JAVA_HOME%export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64}%g' /tmp/karaf-0.22.0/bin/setenv 02:19:04 + sed -ie 's/JAVA_MAX_MEM="2048m"/JAVA_MAX_MEM=2048m/g' /tmp/karaf-0.22.0/bin/setenv 02:19:04 cat /tmp/karaf-0.22.0/bin/setenv 02:19:04 + echo 'cat /tmp/karaf-0.22.0/bin/setenv' 02:19:04 + cat /tmp/karaf-0.22.0/bin/setenv 02:19:04 #!/bin/sh 02:19:04 # 02:19:04 # Licensed to the Apache Software Foundation (ASF) under one or more 02:19:04 # contributor license agreements. See the NOTICE file distributed with 02:19:04 # this work for additional information regarding copyright ownership. 02:19:04 # The ASF licenses this file to You under the Apache License, Version 2.0 02:19:04 # (the "License"); you may not use this file except in compliance with 02:19:04 # the License. 
You may obtain a copy of the License at 02:19:04 # 02:19:04 # http://www.apache.org/licenses/LICENSE-2.0 02:19:04 # 02:19:04 # Unless required by applicable law or agreed to in writing, software 02:19:04 # distributed under the License is distributed on an "AS IS" BASIS, 02:19:04 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 02:19:04 # See the License for the specific language governing permissions and 02:19:04 # limitations under the License. 02:19:04 # 02:19:04 02:19:04 # 02:19:04 # handle specific scripts; the SCRIPT_NAME is exactly the name of the Karaf 02:19:04 # script: client, instance, shell, start, status, stop, karaf 02:19:04 # 02:19:04 # if [ "${KARAF_SCRIPT}" == "SCRIPT_NAME" ]; then 02:19:04 # Actions go here... 02:19:04 # fi 02:19:04 02:19:04 # 02:19:04 # general settings which should be applied for all scripts go here; please keep 02:19:04 # in mind that it is possible that scripts might be executed more than once, e.g. 02:19:04 # in example of the start script where the start script is executed first and the 02:19:04 # karaf script afterwards. 02:19:04 # 02:19:04 02:19:04 # 02:19:04 # The following section shows the possible configuration options for the default 02:19:04 # karaf scripts 02:19:04 # 02:19:04 export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64} # Location of Java installation 02:19:04 # export JAVA_OPTS # Generic JVM options, for instance, where you can pass the memory configuration 02:19:04 # export JAVA_NON_DEBUG_OPTS # Additional non-debug JVM options 02:19:04 # export EXTRA_JAVA_OPTS # Additional JVM options 02:19:04 # export KARAF_HOME # Karaf home folder 02:19:04 # export KARAF_DATA # Karaf data folder 02:19:04 # export KARAF_BASE # Karaf base folder 02:19:04 # export KARAF_ETC # Karaf etc folder 02:19:04 # export KARAF_LOG # Karaf log folder 02:19:04 # export KARAF_SYSTEM_OPTS # First citizen Karaf options 02:19:04 # export KARAF_OPTS # Additional available Karaf options 02:19:04 # export KARAF_DEBUG # Enable debug mode 02:19:04 # export KARAF_REDIRECT # Enable/set the std/err redirection when using bin/start 02:19:04 # export KARAF_NOROOT # Prevent execution as root if set to true 02:19:04 Set Java version 02:19:04 + echo 'Set Java version' 02:19:04 + sudo /usr/sbin/alternatives --install /usr/bin/java java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 1 02:19:04 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 02:19:04 sudo: a password is required 02:19:04 + sudo /usr/sbin/alternatives --set java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 02:19:04 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 02:19:04 sudo: a password is required 02:19:04 JDK default version ... 02:19:04 + echo 'JDK default version ...' 02:19:04 + java -version 02:19:04 openjdk version "21.0.5" 2024-10-15 02:19:04 OpenJDK Runtime Environment (build 21.0.5+11-Ubuntu-1ubuntu122.04) 02:19:04 OpenJDK 64-Bit Server VM (build 21.0.5+11-Ubuntu-1ubuntu122.04, mixed mode, sharing) 02:19:04 Set JAVA_HOME 02:19:04 + echo 'Set JAVA_HOME' 02:19:04 + export JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 02:19:04 + JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 02:19:04 ++ readlink -e /usr/lib/jvm/java-21-openjdk-amd64/bin/java 02:19:04 Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java 02:19:04 Listing all open ports on controller system... 
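set_java_vars, traced above, pins the controller to the requested JDK by editing bin/setenv directly; the sudo alternatives calls are best effort and fail here because sudo has no TTY for a password, after which the script continues. A condensed sketch of the effective edits, using the paths reported in this run:

# Sketch only: the bin/setenv edits performed by set_java_vars (paths from this log).
java_home=/usr/lib/jvm/java-21-openjdk-amd64
memconf=/tmp/karaf-0.22.0/bin/setenv
# Uncomment and pin JAVA_HOME so bin/karaf starts under JDK 21 regardless of the system default.
sed -ie "s%^# export JAVA_HOME%export JAVA_HOME=\${JAVA_HOME:-${java_home}}%g" "$memconf"
# Mirror of the traced JAVA_MAX_MEM edit (CONTROLLERMEM is 2048m in this run).
sed -ie 's/JAVA_MAX_MEM="2048m"/JAVA_MAX_MEM=2048m/g' "$memconf"
# Sanity checks, as in the trace: report the active JVM and resolve the JAVA_HOME binary.
java -version
export JAVA_HOME="$java_home"
readlink -e "${JAVA_HOME}/bin/java"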
02:19:04 + JAVA_RESOLVED=/usr/lib/jvm/java-21-openjdk-amd64/bin/java 02:19:04 + echo 'Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java' 02:19:04 + echo 'Listing all open ports on controller system...' 02:19:04 + netstat -pnatu 02:19:04 /tmp/configuration-script.sh: line 40: netstat: command not found 02:19:04 Configuring cluster 02:19:04 + '[' -f /tmp/custom_shard_config.txt ']' 02:19:04 + echo 'Configuring cluster' 02:19:04 + /tmp/karaf-0.22.0/bin/configure_cluster.sh 1 10.30.170.189 10.30.171.89 10.30.171.50 02:19:04 ################################################ 02:19:04 ## Configure Cluster ## 02:19:04 ################################################ 02:19:04 ERROR: Cluster configurations files not found. Please configure clustering feature. 02:19:04 Dump pekko.conf 02:19:04 + echo 'Dump pekko.conf' 02:19:04 + cat /tmp/karaf-0.22.0/configuration/initial/pekko.conf 02:19:04 cat: /tmp/karaf-0.22.0/configuration/initial/pekko.conf: No such file or directory 02:19:04 Dump modules.conf 02:19:04 + echo 'Dump modules.conf' 02:19:04 + cat /tmp/karaf-0.22.0/configuration/initial/modules.conf 02:19:04 cat: /tmp/karaf-0.22.0/configuration/initial/modules.conf: No such file or directory 02:19:04 Dump module-shards.conf 02:19:04 + echo 'Dump module-shards.conf' 02:19:04 + cat /tmp/karaf-0.22.0/configuration/initial/module-shards.conf 02:19:04 cat: /tmp/karaf-0.22.0/configuration/initial/module-shards.conf: No such file or directory 02:19:04 Configuring member-2 with IP address 10.30.171.89 02:19:04 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:19:04 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:19:05 + source /tmp/common-functions.sh karaf-0.22.0 02:19:05 ++ [[ /tmp/common-functions.sh == \/\t\m\p\/\c\o\n\f\i\g\u\r\a\t\i\o\n\-\s\c\r\i\p\t\.\s\h ]] 02:19:05 ++ echo 'common-functions.sh is being sourced' 02:19:05 common-functions.sh is being sourced 02:19:05 ++ BUNDLEFOLDER=karaf-0.22.0 02:19:05 ++ export MAVENCONF=/tmp/karaf-0.22.0/etc/org.ops4j.pax.url.mvn.cfg 02:19:05 ++ MAVENCONF=/tmp/karaf-0.22.0/etc/org.ops4j.pax.url.mvn.cfg 02:19:05 ++ export FEATURESCONF=/tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:05 ++ FEATURESCONF=/tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:05 ++ export CUSTOMPROP=/tmp/karaf-0.22.0/etc/custom.properties 02:19:05 ++ CUSTOMPROP=/tmp/karaf-0.22.0/etc/custom.properties 02:19:05 ++ export LOGCONF=/tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:05 ++ LOGCONF=/tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:05 ++ export MEMCONF=/tmp/karaf-0.22.0/bin/setenv 02:19:05 ++ MEMCONF=/tmp/karaf-0.22.0/bin/setenv 02:19:05 ++ export CONTROLLERMEM= 02:19:05 ++ CONTROLLERMEM= 02:19:05 ++ '[' '' = calcium ']' 02:19:05 ++ CLUSTER_SYSTEM=pekko 02:19:05 ++ export AKKACONF=/tmp/karaf-0.22.0/configuration/initial/pekko.conf 02:19:05 ++ AKKACONF=/tmp/karaf-0.22.0/configuration/initial/pekko.conf 02:19:05 ++ export MODULESCONF=/tmp/karaf-0.22.0/configuration/initial/modules.conf 02:19:05 ++ MODULESCONF=/tmp/karaf-0.22.0/configuration/initial/modules.conf 02:19:05 ++ export MODULESHARDSCONF=/tmp/karaf-0.22.0/configuration/initial/module-shards.conf 02:19:05 ++ MODULESHARDSCONF=/tmp/karaf-0.22.0/configuration/initial/module-shards.conf 02:19:05 ++ print_common_env 02:19:05 ++ cat 02:19:05 common-functions environment: 02:19:05 MAVENCONF: /tmp/karaf-0.22.0/etc/org.ops4j.pax.url.mvn.cfg 02:19:05 ACTUALFEATURES: 02:19:05 FEATURESCONF: 
/tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:05 CUSTOMPROP: /tmp/karaf-0.22.0/etc/custom.properties 02:19:05 LOGCONF: /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:05 MEMCONF: /tmp/karaf-0.22.0/bin/setenv 02:19:05 CONTROLLERMEM: 02:19:05 AKKACONF: /tmp/karaf-0.22.0/configuration/initial/pekko.conf 02:19:05 MODULESCONF: /tmp/karaf-0.22.0/configuration/initial/modules.conf 02:19:05 MODULESHARDSCONF: /tmp/karaf-0.22.0/configuration/initial/module-shards.conf 02:19:05 SUITES: 02:19:05 02:19:05 ++ SSH='ssh -t -t' 02:19:05 ++ extra_services_cntl=' dnsmasq.service httpd.service libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service rabbitmq-server.service ' 02:19:05 ++ extra_services_cmp=' libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service ' 02:19:05 Changing to /tmp 02:19:05 Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9021/org/opendaylight/integration/karaf/0.22.0/karaf-0.22.0.zip 02:19:05 + echo 'Changing to /tmp' 02:19:05 + cd /tmp 02:19:05 + echo 'Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9021/org/opendaylight/integration/karaf/0.22.0/karaf-0.22.0.zip' 02:19:05 + wget --progress=dot:mega https://nexus.opendaylight.org/content/repositories//autorelease-9021/org/opendaylight/integration/karaf/0.22.0/karaf-0.22.0.zip 02:19:05 --2025-07-18 02:19:05-- https://nexus.opendaylight.org/content/repositories//autorelease-9021/org/opendaylight/integration/karaf/0.22.0/karaf-0.22.0.zip 02:19:05 Resolving nexus.opendaylight.org (nexus.opendaylight.org)... 199.204.45.87, 2604:e100:1:0:f816:3eff:fe45:48d6 02:19:05 Connecting to nexus.opendaylight.org (nexus.opendaylight.org)|199.204.45.87|:443... connected. 02:19:05 HTTP request sent, awaiting response... 200 OK 02:19:05 Length: 187527106 (179M) [application/zip] 02:19:05 Saving to: ‘karaf-0.22.0.zip’ 02:19:05 02:19:05 0K ........ ........ ........ ........ ........ ........ 1% 78.9M 2s 02:19:05 3072K ........ ........ ........ ........ ........ ........ 3% 146M 2s 02:19:05 6144K ........ ........ ........ ........ ........ ........ 5% 175M 1s 02:19:05 9216K ........ ........ ........ ........ ........ ........ 6% 205M 1s 02:19:05 12288K ........ ........ ........ ........ ........ ........ 8% 229M 1s 02:19:05 15360K ........ ........ ........ ........ ........ ........ 10% 241M 1s 02:19:05 18432K ........ ........ ........ ........ ........ ........ 11% 288M 1s 02:19:05 21504K ........ ........ ........ ........ ........ ........ 13% 280M 1s 02:19:05 24576K ........ ........ ........ ........ ........ ........ 15% 296M 1s 02:19:05 27648K ........ ........ ........ ........ ........ ........ 16% 311M 1s 02:19:05 30720K ........ ........ ........ ........ ........ ........ 18% 292M 1s 02:19:05 33792K ........ ........ ........ ........ ........ ........ 20% 313M 1s 02:19:05 36864K ........ ........ ........ ........ ........ ........ 21% 314M 1s 02:19:05 39936K ........ ........ ........ ........ ........ ........ 23% 337M 1s 02:19:05 43008K ........ ........ ........ ........ ........ ........ 25% 327M 1s 02:19:05 46080K ........ ........ ........ ........ ........ ........ 26% 316M 1s 02:19:05 49152K ........ ........ ........ ........ ........ ........ 28% 335M 1s 02:19:05 52224K ........ ........ ........ ........ ........ ........ 30% 298M 1s 02:19:05 55296K ........ ........ ........ ........ ........ ........ 31% 308M 1s 02:19:05 58368K ........ ........ ........ 
........ ........ ........ 33% 418M 0s 02:19:05 61440K ........ ........ ........ ........ ........ ........ 35% 327M 0s 02:19:05 64512K ........ ........ ........ ........ ........ ........ 36% 329M 0s 02:19:05 67584K ........ ........ ........ ........ ........ ........ 38% 319M 0s 02:19:05 70656K ........ ........ ........ ........ ........ ........ 40% 334M 0s 02:19:05 73728K ........ ........ ........ ........ ........ ........ 41% 340M 0s 02:19:05 76800K ........ ........ ........ ........ ........ ........ 43% 314M 0s 02:19:05 79872K ........ ........ ........ ........ ........ ........ 45% 333M 0s 02:19:05 82944K ........ ........ ........ ........ ........ ........ 46% 315M 0s 02:19:05 86016K ........ ........ ........ ........ ........ ........ 48% 321M 0s 02:19:05 89088K ........ ........ ........ ........ ........ ........ 50% 324M 0s 02:19:05 92160K ........ ........ ........ ........ ........ ........ 52% 328M 0s 02:19:05 95232K ........ ........ ........ ........ ........ ........ 53% 156M 0s 02:19:05 98304K ........ ........ ........ ........ ........ ........ 55% 262M 0s 02:19:05 101376K ........ ........ ........ ........ ........ ........ 57% 325M 0s 02:19:05 104448K ........ ........ ........ ........ ........ ........ 58% 325M 0s 02:19:05 107520K ........ ........ ........ ........ ........ ........ 60% 316M 0s 02:19:05 110592K ........ ........ ........ ........ ........ ........ 62% 302M 0s 02:19:05 113664K ........ ........ ........ ........ ........ ........ 63% 314M 0s 02:19:05 116736K ........ ........ ........ ........ ........ ........ 65% 324M 0s 02:19:05 119808K ........ ........ ........ ........ ........ ........ 67% 322M 0s 02:19:05 122880K ........ ........ ........ ........ ........ ........ 68% 338M 0s 02:19:05 125952K ........ ........ ........ ........ ........ ........ 70% 333M 0s 02:19:05 129024K ........ ........ ........ ........ ........ ........ 72% 324M 0s 02:19:05 132096K ........ ........ ........ ........ ........ ........ 73% 337M 0s 02:19:05 135168K ........ ........ ........ ........ ........ ........ 75% 333M 0s 02:19:05 138240K ........ ........ ........ ........ ........ ........ 77% 266M 0s 02:19:05 141312K ........ ........ ........ ........ ........ ........ 78% 239M 0s 02:19:05 144384K ........ ........ ........ ........ ........ ........ 80% 238M 0s 02:19:05 147456K ........ ........ ........ ........ ........ ........ 82% 227M 0s 02:19:05 150528K ........ ........ ........ ........ ........ ........ 83% 300M 0s 02:19:05 153600K ........ ........ ........ ........ ........ ........ 85% 300M 0s 02:19:05 156672K ........ ........ ........ ........ ........ ........ 87% 296M 0s 02:19:05 159744K ........ ........ ........ ........ ........ ........ 88% 306M 0s 02:19:05 162816K ........ ........ ........ ........ ........ ........ 90% 306M 0s 02:19:05 165888K ........ ........ ........ ........ ........ ........ 92% 302M 0s 02:19:05 168960K ........ ........ ........ ........ ........ ........ 93% 302M 0s 02:19:05 172032K ........ ........ ........ ........ ........ ........ 95% 302M 0s 02:19:05 175104K ........ ........ ........ ........ ........ ........ 97% 292M 0s 02:19:05 178176K ........ ........ ........ ........ ........ ........ 98% 312M 0s 02:19:05 181248K ........ ........ ........ ..... 100% 301M=0.6s 02:19:05 02:19:05 2025-07-18 02:19:05 (276 MB/s) - ‘karaf-0.22.0.zip’ saved [187527106/187527106] 02:19:05 02:19:05 + echo 'Extracting the new controller...' 02:19:05 Extracting the new controller... 
02:19:05 + unzip -q karaf-0.22.0.zip 02:19:07 Adding external repositories... 02:19:07 + echo 'Adding external repositories...' 02:19:07 + sed -ie 's%org.ops4j.pax.url.mvn.repositories=%org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases%g' /tmp/karaf-0.22.0/etc/org.ops4j.pax.url.mvn.cfg 02:19:07 + cat /tmp/karaf-0.22.0/etc/org.ops4j.pax.url.mvn.cfg 02:19:07 ################################################################################ 02:19:07 # 02:19:07 # Licensed to the Apache Software Foundation (ASF) under one or more 02:19:07 # contributor license agreements. See the NOTICE file distributed with 02:19:07 # this work for additional information regarding copyright ownership. 02:19:07 # The ASF licenses this file to You under the Apache License, Version 2.0 02:19:07 # (the "License"); you may not use this file except in compliance with 02:19:07 # the License. You may obtain a copy of the License at 02:19:07 # 02:19:07 # http://www.apache.org/licenses/LICENSE-2.0 02:19:07 # 02:19:07 # Unless required by applicable law or agreed to in writing, software 02:19:07 # distributed under the License is distributed on an "AS IS" BASIS, 02:19:07 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 02:19:07 # See the License for the specific language governing permissions and 02:19:07 # limitations under the License. 02:19:07 # 02:19:07 ################################################################################ 02:19:07 02:19:07 # 02:19:07 # If set to true, the following property will not allow any certificate to be used 02:19:07 # when accessing Maven repositories through SSL 02:19:07 # 02:19:07 #org.ops4j.pax.url.mvn.certificateCheck= 02:19:07 02:19:07 # 02:19:07 # Path to the local Maven settings file. 02:19:07 # The repositories defined in this file will be automatically added to the list 02:19:07 # of default repositories if the 'org.ops4j.pax.url.mvn.repositories' property 02:19:07 # below is not set. 02:19:07 # The following locations are checked for the existence of the settings.xml file 02:19:07 # * 1. looks for the specified url 02:19:07 # * 2. if not found looks for ${user.home}/.m2/settings.xml 02:19:07 # * 3. if not found looks for ${maven.home}/conf/settings.xml 02:19:07 # * 4. if not found looks for ${M2_HOME}/conf/settings.xml 02:19:07 # 02:19:07 #org.ops4j.pax.url.mvn.settings= 02:19:07 02:19:07 # 02:19:07 # Path to the local Maven repository which is used to avoid downloading 02:19:07 # artifacts when they already exist locally. 
02:19:07 # The value of this property will be extracted from the settings.xml file 02:19:07 # above, or defaulted to: 02:19:07 # System.getProperty( "user.home" ) + "/.m2/repository" 02:19:07 # 02:19:07 org.ops4j.pax.url.mvn.localRepository=${karaf.home}/${karaf.default.repository} 02:19:07 02:19:07 # 02:19:07 # Default this to false. It's just weird to use undocumented repos 02:19:07 # 02:19:07 org.ops4j.pax.url.mvn.useFallbackRepositories=false 02:19:07 02:19:07 # 02:19:07 # Uncomment if you don't wanna use the proxy settings 02:19:07 # from the Maven conf/settings.xml file 02:19:07 # 02:19:07 # org.ops4j.pax.url.mvn.proxySupport=false 02:19:07 02:19:07 # 02:19:07 # Comma separated list of repositories scanned when resolving an artifact. 02:19:07 # Those repositories will be checked before iterating through the 02:19:07 # below list of repositories and even before the local repository 02:19:07 # A repository url can be appended with zero or more of the following flags: 02:19:07 # @snapshots : the repository contains snaphots 02:19:07 # @noreleases : the repository does not contain any released artifacts 02:19:07 # 02:19:07 # The following property value will add the system folder as a repo. 02:19:07 # 02:19:07 org.ops4j.pax.url.mvn.defaultRepositories=\ 02:19:07 file:${karaf.home}/${karaf.default.repository}@id=system.repository@snapshots,\ 02:19:07 file:${karaf.data}/kar@id=kar.repository@multi@snapshots,\ 02:19:07 file:${karaf.base}/${karaf.default.repository}@id=child.system.repository@snapshots 02:19:07 02:19:07 # Use the default local repo (e.g.~/.m2/repository) as a "remote" repo 02:19:07 #org.ops4j.pax.url.mvn.defaultLocalRepoAsRemote=false 02:19:07 02:19:07 # 02:19:07 # Comma separated list of repositories scanned when resolving an artifact. 02:19:07 # The default list includes the following repositories: 02:19:07 # http://repo1.maven.org/maven2@id=central 02:19:07 # http://repository.springsource.com/maven/bundles/release@id=spring.ebr 02:19:07 # http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external 02:19:07 # http://zodiac.springsource.com/maven/bundles/release@id=gemini 02:19:07 # http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases 02:19:07 # https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases 02:19:07 # https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 02:19:07 # To add repositories to the default ones, prepend '+' to the list of repositories 02:19:07 # to add. 
02:19:07 # A repository url can be appended with zero or more of the following flags: 02:19:07 # @snapshots : the repository contains snapshots 02:19:07 # @noreleases : the repository does not contain any released artifacts 02:19:07 # @id=repository.id : the id for the repository, just like in the settings.xml this is optional but recommended 02:19:07 # 02:19:07 org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 02:19:07 02:19:07 ### ^^^ No remote repositories. This is the only ODL change compared to Karaf defaults.Configuring the startup features... 02:19:07 + [[ True == \T\r\u\e ]] 02:19:07 + echo 'Configuring the startup features...' 02:19:07 + sed -ie 's/\(featuresBoot=\|featuresBoot =\)/featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer,/g' /tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:07 + FEATURE_TEST_STRING=features-test 02:19:07 + FEATURE_TEST_VERSION=0.22.0 02:19:07 + KARAF_VERSION=karaf4 02:19:07 + [[ integration == \i\n\t\e\g\r\a\t\i\o\n ]] 02:19:07 + sed -ie 's%\(featuresRepositories=\|featuresRepositories =\)%featuresRepositories = mvn:org.opendaylight.integration/features-test/0.22.0/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features,%g' /tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:07 + [[ ! -z '' ]] 02:19:07 + cat /tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:07 ################################################################################ 02:19:07 # 02:19:07 # Licensed to the Apache Software Foundation (ASF) under one or more 02:19:07 # contributor license agreements. See the NOTICE file distributed with 02:19:07 # this work for additional information regarding copyright ownership. 02:19:07 # The ASF licenses this file to You under the Apache License, Version 2.0 02:19:07 # (the "License"); you may not use this file except in compliance with 02:19:07 # the License. You may obtain a copy of the License at 02:19:07 # 02:19:07 # http://www.apache.org/licenses/LICENSE-2.0 02:19:07 # 02:19:07 # Unless required by applicable law or agreed to in writing, software 02:19:07 # distributed under the License is distributed on an "AS IS" BASIS, 02:19:07 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 02:19:07 # See the License for the specific language governing permissions and 02:19:07 # limitations under the License. 
02:19:07 # 02:19:07 ################################################################################ 02:19:07 02:19:07 # 02:19:07 # Comma separated list of features repositories to register by default 02:19:07 # 02:19:07 featuresRepositories = mvn:org.opendaylight.integration/features-test/0.22.0/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features, file:${karaf.etc}/7dcf83ed-3930-47c1-8604-2be667442bd7.xml 02:19:07 02:19:07 # 02:19:07 # Comma separated list of features to install at startup 02:19:07 # 02:19:07 featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer, 6529f430-fdfc-4281-a08e-c1b54a0a72d5 02:19:07 02:19:07 # 02:19:07 # Resource repositories (OBR) that the features resolver can use 02:19:07 # to resolve requirements/capabilities 02:19:07 # 02:19:07 # The format of the resourceRepositories is 02:19:07 # resourceRepositories=[xml:url|json:url],... 02:19:07 # for Instance: 02:19:07 # 02:19:07 #resourceRepositories=xml:http://host/path/to/index.xml 02:19:07 # or 02:19:07 #resourceRepositories=json:http://host/path/to/index.json 02:19:07 # 02:19:07 02:19:07 # 02:19:07 # Defines if the boot features are started in asynchronous mode (in a dedicated thread) 02:19:07 # 02:19:07 featuresBootAsynchronous=false 02:19:07 02:19:07 # 02:19:07 # Service requirements enforcement 02:19:07 # 02:19:07 # By default, the feature resolver checks the service requirements/capabilities of 02:19:07 # bundles for new features (xml schema >= 1.3.0) in order to automatically installs 02:19:07 # the required bundles. 02:19:07 # The following flag can have those values: 02:19:07 # - disable: service requirements are completely ignored 02:19:07 # - default: service requirements are ignored for old features 02:19:07 # - enforce: service requirements are always verified 02:19:07 # 02:19:07 #serviceRequirements=default 02:19:07 02:19:07 # 02:19:07 # Store cfg file for config element in feature 02:19:07 # 02:19:07 #configCfgStore=true 02:19:07 02:19:07 # 02:19:07 # Define if the feature service automatically refresh bundles 02:19:07 # 02:19:07 autoRefresh=true 02:19:07 02:19:07 # 02:19:07 # Configuration of features processing mechanism (overrides, blacklisting, modification of features) 02:19:07 # XML file defines instructions related to features processing 02:19:07 # versions.properties may declare properties to resolve placeholders in XML file 02:19:07 # both files are relative to ${karaf.etc} 02:19:07 # 02:19:07 #featureProcessing=org.apache.karaf.features.xml 02:19:07 #featureProcessingVersions=versions.properties 02:19:07 + configure_karaf_log karaf4 '' 02:19:07 + local -r karaf_version=karaf4 02:19:07 + local -r controllerdebugmap= 02:19:07 + local logapi=log4j 02:19:07 + grep log4j2 /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:07 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 02:19:07 log4j2.rootLogger.level = INFO 02:19:07 #log4j2.rootLogger.type = asyncRoot 02:19:07 #log4j2.rootLogger.includeLocation = false 02:19:07 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 02:19:07 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 02:19:07 log4j2.rootLogger.appenderRef.Console.ref = Console 02:19:07 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 02:19:07 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 02:19:07 
log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 02:19:07 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 02:19:07 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 02:19:07 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 02:19:07 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 02:19:07 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 02:19:07 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 02:19:07 log4j2.logger.spifly.name = org.apache.aries.spifly 02:19:07 log4j2.logger.spifly.level = WARN 02:19:07 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 02:19:07 log4j2.logger.audit.level = INFO 02:19:07 log4j2.logger.audit.additivity = false 02:19:07 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 02:19:07 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 02:19:07 log4j2.appender.console.type = Console 02:19:07 log4j2.appender.console.name = Console 02:19:07 log4j2.appender.console.layout.type = PatternLayout 02:19:07 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 02:19:07 log4j2.appender.rolling.type = RollingRandomAccessFile 02:19:07 log4j2.appender.rolling.name = RollingFile 02:19:07 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 02:19:07 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 02:19:07 #log4j2.appender.rolling.immediateFlush = false 02:19:07 log4j2.appender.rolling.append = true 02:19:07 log4j2.appender.rolling.layout.type = PatternLayout 02:19:07 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 02:19:07 log4j2.appender.rolling.policies.type = Policies 02:19:07 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 02:19:07 log4j2.appender.rolling.policies.size.size = 64MB 02:19:07 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 02:19:07 log4j2.appender.rolling.strategy.max = 7 02:19:07 log4j2.appender.audit.type = RollingRandomAccessFile 02:19:07 log4j2.appender.audit.name = AuditRollingFile 02:19:07 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 02:19:07 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 02:19:07 log4j2.appender.audit.append = true 02:19:07 log4j2.appender.audit.layout.type = PatternLayout 02:19:07 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 02:19:07 log4j2.appender.audit.policies.type = Policies 02:19:07 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 02:19:07 log4j2.appender.audit.policies.size.size = 8MB 02:19:07 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 02:19:07 log4j2.appender.audit.strategy.max = 7 02:19:07 log4j2.appender.osgi.type = PaxOsgi 02:19:07 log4j2.appender.osgi.name = PaxOsgi 02:19:07 log4j2.appender.osgi.filter = * 02:19:07 #log4j2.logger.aether.name = shaded.org.eclipse.aether 02:19:07 #log4j2.logger.aether.level = TRACE 02:19:07 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 02:19:07 #log4j2.logger.http-headers.level = DEBUG 02:19:07 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 02:19:07 #log4j2.logger.maven.level = TRACE 02:19:07 Configuring the karaf log... karaf_version: karaf4, logapi: log4j2 02:19:07 + logapi=log4j2 02:19:07 + echo 'Configuring the karaf log... 
karaf_version: karaf4, logapi: log4j2' 02:19:07 + '[' log4j2 == log4j2 ']' 02:19:07 + sed -ie 's/log4j2.appender.rolling.policies.size.size = 64MB/log4j2.appender.rolling.policies.size.size = 1GB/g' /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:07 + orgmodule=org.opendaylight.yangtools.yang.parser.repo.YangTextSchemaContextResolver 02:19:07 + orgmodule_=org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver 02:19:07 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN' 02:19:07 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN' 02:19:07 + unset IFS 02:19:07 controllerdebugmap: 02:19:07 + echo 'controllerdebugmap: ' 02:19:07 + '[' -n '' ']' 02:19:07 cat /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:07 + echo 'cat /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg' 02:19:07 + cat /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:07 ################################################################################ 02:19:07 # 02:19:07 # Licensed to the Apache Software Foundation (ASF) under one or more 02:19:07 # contributor license agreements. See the NOTICE file distributed with 02:19:07 # this work for additional information regarding copyright ownership. 02:19:07 # The ASF licenses this file to You under the Apache License, Version 2.0 02:19:07 # (the "License"); you may not use this file except in compliance with 02:19:07 # the License. You may obtain a copy of the License at 02:19:07 # 02:19:07 # http://www.apache.org/licenses/LICENSE-2.0 02:19:07 # 02:19:07 # Unless required by applicable law or agreed to in writing, software 02:19:07 # distributed under the License is distributed on an "AS IS" BASIS, 02:19:07 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 02:19:07 # See the License for the specific language governing permissions and 02:19:07 # limitations under the License. 
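The two echo lines above append per-module overrides to the logging configuration; a minimal sketch of that kind of override, with a hypothetical module name and assuming the entries are appended to the same cfg file:

    # Hypothetical module name; log4j2 property-style keys replace dots with
    # underscores, with .name holding the logger (package) name and .level the
    # desired threshold.
    orgmodule=org.opendaylight.openflowplugin
    orgmodule_=${orgmodule//./_}
    {
        echo "log4j2.logger.${orgmodule_}.name = ${orgmodule}"
        echo "log4j2.logger.${orgmodule_}.level = DEBUG"
    } >> /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg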
02:19:07 # 02:19:07 ################################################################################ 02:19:07 02:19:07 # Common pattern layout for appenders 02:19:07 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 02:19:07 02:19:07 # Root logger 02:19:07 log4j2.rootLogger.level = INFO 02:19:07 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 02:19:07 #log4j2.rootLogger.type = asyncRoot 02:19:07 #log4j2.rootLogger.includeLocation = false 02:19:07 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 02:19:07 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 02:19:07 log4j2.rootLogger.appenderRef.Console.ref = Console 02:19:07 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 02:19:07 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 02:19:07 02:19:07 # Filters for logs marked by org.opendaylight.odlparent.Markers 02:19:07 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 02:19:07 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 02:19:07 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 02:19:07 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 02:19:07 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 02:19:07 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 02:19:07 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 02:19:07 02:19:07 # Loggers configuration 02:19:07 02:19:07 # Spifly logger 02:19:07 log4j2.logger.spifly.name = org.apache.aries.spifly 02:19:07 log4j2.logger.spifly.level = WARN 02:19:07 02:19:07 # Security audit logger 02:19:07 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 02:19:07 log4j2.logger.audit.level = INFO 02:19:07 log4j2.logger.audit.additivity = false 02:19:07 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 02:19:07 02:19:07 # Appenders configuration 02:19:07 02:19:07 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 02:19:07 log4j2.appender.console.type = Console 02:19:07 log4j2.appender.console.name = Console 02:19:07 log4j2.appender.console.layout.type = PatternLayout 02:19:07 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 02:19:07 02:19:07 # Rolling file appender 02:19:07 log4j2.appender.rolling.type = RollingRandomAccessFile 02:19:07 log4j2.appender.rolling.name = RollingFile 02:19:07 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 02:19:07 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 02:19:07 # uncomment to not force a disk flush 02:19:07 #log4j2.appender.rolling.immediateFlush = false 02:19:07 log4j2.appender.rolling.append = true 02:19:07 log4j2.appender.rolling.layout.type = PatternLayout 02:19:07 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 02:19:07 log4j2.appender.rolling.policies.type = Policies 02:19:07 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 02:19:07 log4j2.appender.rolling.policies.size.size = 1GB 02:19:07 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 02:19:07 log4j2.appender.rolling.strategy.max = 7 02:19:07 02:19:07 # Audit file appender 02:19:07 log4j2.appender.audit.type = RollingRandomAccessFile 02:19:07 log4j2.appender.audit.name = AuditRollingFile 
02:19:07 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 02:19:07 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 02:19:07 log4j2.appender.audit.append = true 02:19:07 log4j2.appender.audit.layout.type = PatternLayout 02:19:07 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 02:19:07 log4j2.appender.audit.policies.type = Policies 02:19:07 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 02:19:07 log4j2.appender.audit.policies.size.size = 8MB 02:19:07 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 02:19:07 log4j2.appender.audit.strategy.max = 7 02:19:07 02:19:07 # OSGi appender 02:19:07 log4j2.appender.osgi.type = PaxOsgi 02:19:07 log4j2.appender.osgi.name = PaxOsgi 02:19:07 log4j2.appender.osgi.filter = * 02:19:07 02:19:07 # help with identification of maven-related problems with pax-url-aether 02:19:07 #log4j2.logger.aether.name = shaded.org.eclipse.aether 02:19:07 #log4j2.logger.aether.level = TRACE 02:19:07 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 02:19:07 #log4j2.logger.http-headers.level = DEBUG 02:19:07 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 02:19:07 #log4j2.logger.maven.level = TRACE 02:19:07 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 02:19:07 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 02:19:07 + set_java_vars /usr/lib/jvm/java-21-openjdk-amd64 2048m /tmp/karaf-0.22.0/bin/setenv 02:19:07 Configure 02:19:07 java home: /usr/lib/jvm/java-21-openjdk-amd64 02:19:07 max memory: 2048m 02:19:07 + local -r java_home=/usr/lib/jvm/java-21-openjdk-amd64 02:19:07 + local -r controllermem=2048m 02:19:07 + local -r memconf=/tmp/karaf-0.22.0/bin/setenv 02:19:07 + echo Configure 02:19:07 + echo ' java home: /usr/lib/jvm/java-21-openjdk-amd64' 02:19:07 + echo ' max memory: 2048m' 02:19:07 memconf: /tmp/karaf-0.22.0/bin/setenv 02:19:07 + echo ' memconf: /tmp/karaf-0.22.0/bin/setenv' 02:19:07 + sed -ie 's%^# export JAVA_HOME%export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64}%g' /tmp/karaf-0.22.0/bin/setenv 02:19:07 + sed -ie 's/JAVA_MAX_MEM="2048m"/JAVA_MAX_MEM=2048m/g' /tmp/karaf-0.22.0/bin/setenv 02:19:07 cat /tmp/karaf-0.22.0/bin/setenv 02:19:07 + echo 'cat /tmp/karaf-0.22.0/bin/setenv' 02:19:07 + cat /tmp/karaf-0.22.0/bin/setenv 02:19:07 #!/bin/sh 02:19:07 # 02:19:07 # Licensed to the Apache Software Foundation (ASF) under one or more 02:19:07 # contributor license agreements. See the NOTICE file distributed with 02:19:07 # this work for additional information regarding copyright ownership. 02:19:07 # The ASF licenses this file to You under the Apache License, Version 2.0 02:19:07 # (the "License"); you may not use this file except in compliance with 02:19:07 # the License. You may obtain a copy of the License at 02:19:07 # 02:19:07 # http://www.apache.org/licenses/LICENSE-2.0 02:19:07 # 02:19:07 # Unless required by applicable law or agreed to in writing, software 02:19:07 # distributed under the License is distributed on an "AS IS" BASIS, 02:19:07 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 02:19:07 # See the License for the specific language governing permissions and 02:19:07 # limitations under the License. 
02:19:07 # 02:19:07 02:19:07 # 02:19:07 # handle specific scripts; the SCRIPT_NAME is exactly the name of the Karaf 02:19:07 # script: client, instance, shell, start, status, stop, karaf 02:19:07 # 02:19:07 # if [ "${KARAF_SCRIPT}" == "SCRIPT_NAME" ]; then 02:19:07 # Actions go here... 02:19:07 # fi 02:19:07 02:19:07 # 02:19:07 # general settings which should be applied for all scripts go here; please keep 02:19:07 # in mind that it is possible that scripts might be executed more than once, e.g. 02:19:07 # in example of the start script where the start script is executed first and the 02:19:07 # karaf script afterwards. 02:19:07 # 02:19:07 02:19:07 # 02:19:07 # The following section shows the possible configuration options for the default 02:19:07 # karaf scripts 02:19:07 # 02:19:07 export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64} # Location of Java installation 02:19:07 # export JAVA_OPTS # Generic JVM options, for instance, where you can pass the memory configuration 02:19:07 # export JAVA_NON_DEBUG_OPTS # Additional non-debug JVM options 02:19:07 # export EXTRA_JAVA_OPTS # Additional JVM options 02:19:07 # export KARAF_HOME # Karaf home folder 02:19:07 # export KARAF_DATA # Karaf data folder 02:19:07 # export KARAF_BASE # Karaf base folder 02:19:07 # export KARAF_ETC # Karaf etc folder 02:19:07 # export KARAF_LOG # Karaf log folder 02:19:07 # export KARAF_SYSTEM_OPTS # First citizen Karaf options 02:19:07 # export KARAF_OPTS # Additional available Karaf options 02:19:07 # export KARAF_DEBUG # Enable debug mode 02:19:07 # export KARAF_REDIRECT # Enable/set the std/err redirection when using bin/start 02:19:07 # export KARAF_NOROOT # Prevent execution as root if set to true 02:19:07 Set Java version 02:19:07 + echo 'Set Java version' 02:19:07 + sudo /usr/sbin/alternatives --install /usr/bin/java java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 1 02:19:07 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 02:19:07 sudo: a password is required 02:19:07 + sudo /usr/sbin/alternatives --set java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 02:19:07 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 02:19:07 sudo: a password is required 02:19:07 JDK default version ... 02:19:07 + echo 'JDK default version ...' 02:19:07 + java -version 02:19:08 openjdk version "21.0.5" 2024-10-15 02:19:08 OpenJDK Runtime Environment (build 21.0.5+11-Ubuntu-1ubuntu122.04) 02:19:08 OpenJDK 64-Bit Server VM (build 21.0.5+11-Ubuntu-1ubuntu122.04, mixed mode, sharing) 02:19:08 Set JAVA_HOME 02:19:08 + echo 'Set JAVA_HOME' 02:19:08 + export JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 02:19:08 + JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 02:19:08 ++ readlink -e /usr/lib/jvm/java-21-openjdk-amd64/bin/java 02:19:08 Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java 02:19:08 Listing all open ports on controller system... 02:19:08 + JAVA_RESOLVED=/usr/lib/jvm/java-21-openjdk-amd64/bin/java 02:19:08 + echo 'Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java' 02:19:08 + echo 'Listing all open ports on controller system...' 
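The alternatives calls above fail because sudo has no TTY to prompt for a password on this minion, so the job ends up relying on JAVA_HOME alone. A minimal sketch, assuming the same paths, of guarding that step:

    # Illustrative guard: only touch the alternatives database when passwordless
    # sudo is actually available; otherwise rely on JAVA_HOME as exported above.
    if sudo -n true 2>/dev/null; then
        sudo /usr/sbin/alternatives --set java /usr/lib/jvm/java-21-openjdk-amd64/bin/java
    else
        echo "passwordless sudo unavailable; relying on JAVA_HOME only"
    fi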
02:19:08 + netstat -pnatu 02:19:08 /tmp/configuration-script.sh: line 40: netstat: command not found 02:19:08 + '[' -f /tmp/custom_shard_config.txt ']' 02:19:08 Configuring cluster 02:19:08 + echo 'Configuring cluster' 02:19:08 + /tmp/karaf-0.22.0/bin/configure_cluster.sh 2 10.30.170.189 10.30.171.89 10.30.171.50 02:19:08 ################################################ 02:19:08 ## Configure Cluster ## 02:19:08 ################################################ 02:19:08 ERROR: Cluster configurations files not found. Please configure clustering feature. 02:19:08 + echo 'Dump pekko.conf' 02:19:08 Dump pekko.conf 02:19:08 + cat /tmp/karaf-0.22.0/configuration/initial/pekko.conf 02:19:08 cat: /tmp/karaf-0.22.0/configuration/initial/pekko.conf: No such file or directory 02:19:08 Dump modules.conf 02:19:08 + echo 'Dump modules.conf' 02:19:08 + cat /tmp/karaf-0.22.0/configuration/initial/modules.conf 02:19:08 cat: /tmp/karaf-0.22.0/configuration/initial/modules.conf: No such file or directory 02:19:08 Dump module-shards.conf 02:19:08 + echo 'Dump module-shards.conf' 02:19:08 + cat /tmp/karaf-0.22.0/configuration/initial/module-shards.conf 02:19:08 cat: /tmp/karaf-0.22.0/configuration/initial/module-shards.conf: No such file or directory 02:19:08 Configuring member-3 with IP address 10.30.171.50 02:19:08 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:19:08 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:19:09 + source /tmp/common-functions.sh karaf-0.22.0 02:19:09 ++ [[ /tmp/common-functions.sh == \/\t\m\p\/\c\o\n\f\i\g\u\r\a\t\i\o\n\-\s\c\r\i\p\t\.\s\h ]] 02:19:09 common-functions.sh is being sourced 02:19:09 ++ echo 'common-functions.sh is being sourced' 02:19:09 ++ BUNDLEFOLDER=karaf-0.22.0 02:19:09 ++ export MAVENCONF=/tmp/karaf-0.22.0/etc/org.ops4j.pax.url.mvn.cfg 02:19:09 ++ MAVENCONF=/tmp/karaf-0.22.0/etc/org.ops4j.pax.url.mvn.cfg 02:19:09 ++ export FEATURESCONF=/tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:09 ++ FEATURESCONF=/tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:09 ++ export CUSTOMPROP=/tmp/karaf-0.22.0/etc/custom.properties 02:19:09 ++ CUSTOMPROP=/tmp/karaf-0.22.0/etc/custom.properties 02:19:09 ++ export LOGCONF=/tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:09 ++ LOGCONF=/tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:09 ++ export MEMCONF=/tmp/karaf-0.22.0/bin/setenv 02:19:09 ++ MEMCONF=/tmp/karaf-0.22.0/bin/setenv 02:19:09 ++ export CONTROLLERMEM= 02:19:09 ++ CONTROLLERMEM= 02:19:09 ++ '[' '' = calcium ']' 02:19:09 ++ CLUSTER_SYSTEM=pekko 02:19:09 ++ export AKKACONF=/tmp/karaf-0.22.0/configuration/initial/pekko.conf 02:19:09 ++ AKKACONF=/tmp/karaf-0.22.0/configuration/initial/pekko.conf 02:19:09 ++ export MODULESCONF=/tmp/karaf-0.22.0/configuration/initial/modules.conf 02:19:09 ++ MODULESCONF=/tmp/karaf-0.22.0/configuration/initial/modules.conf 02:19:09 ++ export MODULESHARDSCONF=/tmp/karaf-0.22.0/configuration/initial/module-shards.conf 02:19:09 ++ MODULESHARDSCONF=/tmp/karaf-0.22.0/configuration/initial/module-shards.conf 02:19:09 ++ print_common_env 02:19:09 ++ cat 02:19:09 common-functions environment: 02:19:09 MAVENCONF: /tmp/karaf-0.22.0/etc/org.ops4j.pax.url.mvn.cfg 02:19:09 ACTUALFEATURES: 02:19:09 FEATURESCONF: /tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:09 CUSTOMPROP: /tmp/karaf-0.22.0/etc/custom.properties 02:19:09 LOGCONF: /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:09 MEMCONF: /tmp/karaf-0.22.0/bin/setenv 02:19:09 CONTROLLERMEM: 02:19:09 
AKKACONF: /tmp/karaf-0.22.0/configuration/initial/pekko.conf 02:19:09 MODULESCONF: /tmp/karaf-0.22.0/configuration/initial/modules.conf 02:19:09 MODULESHARDSCONF: /tmp/karaf-0.22.0/configuration/initial/module-shards.conf 02:19:09 SUITES: 02:19:09 02:19:09 ++ SSH='ssh -t -t' 02:19:09 ++ extra_services_cntl=' dnsmasq.service httpd.service libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service rabbitmq-server.service ' 02:19:09 ++ extra_services_cmp=' libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service ' 02:19:09 Changing to /tmp 02:19:09 Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9021/org/opendaylight/integration/karaf/0.22.0/karaf-0.22.0.zip 02:19:09 + echo 'Changing to /tmp' 02:19:09 + cd /tmp 02:19:09 + echo 'Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9021/org/opendaylight/integration/karaf/0.22.0/karaf-0.22.0.zip' 02:19:09 + wget --progress=dot:mega https://nexus.opendaylight.org/content/repositories//autorelease-9021/org/opendaylight/integration/karaf/0.22.0/karaf-0.22.0.zip 02:19:09 --2025-07-18 02:19:09-- https://nexus.opendaylight.org/content/repositories//autorelease-9021/org/opendaylight/integration/karaf/0.22.0/karaf-0.22.0.zip 02:19:09 Resolving nexus.opendaylight.org (nexus.opendaylight.org)... 199.204.45.87, 2604:e100:1:0:f816:3eff:fe45:48d6 02:19:09 Connecting to nexus.opendaylight.org (nexus.opendaylight.org)|199.204.45.87|:443... connected. 02:19:09 HTTP request sent, awaiting response... 200 OK 02:19:09 Length: 187527106 (179M) [application/zip] 02:19:09 Saving to: ‘karaf-0.22.0.zip’ 02:19:09 02:19:09 0K ........ ........ ........ ........ ........ ........ 1% 67.8M 3s 02:19:09 3072K ........ ........ ........ ........ ........ ........ 3% 155M 2s 02:19:09 6144K ........ ........ ........ ........ ........ ........ 5% 185M 2s 02:19:09 9216K ........ ........ ........ ........ ........ ........ 6% 210M 1s 02:19:09 12288K ........ ........ ........ ........ ........ ........ 8% 173M 1s 02:19:09 15360K ........ ........ ........ ........ ........ ........ 10% 219M 1s 02:19:09 18432K ........ ........ ........ ........ ........ ........ 11% 247M 1s 02:19:09 21504K ........ ........ ........ ........ ........ ........ 13% 245M 1s 02:19:09 24576K ........ ........ ........ ........ ........ ........ 15% 250M 1s 02:19:09 27648K ........ ........ ........ ........ ........ ........ 16% 275M 1s 02:19:09 30720K ........ ........ ........ ........ ........ ........ 18% 278M 1s 02:19:09 33792K ........ ........ ........ ........ ........ ........ 20% 314M 1s 02:19:09 36864K ........ ........ ........ ........ ........ ........ 21% 333M 1s 02:19:09 39936K ........ ........ ........ ........ ........ ........ 23% 249M 1s 02:19:09 43008K ........ ........ ........ ........ ........ ........ 25% 322M 1s 02:19:09 46080K ........ ........ ........ ........ ........ ........ 26% 301M 1s 02:19:09 49152K ........ ........ ........ ........ ........ ........ 28% 243M 1s 02:19:09 52224K ........ ........ ........ ........ ........ ........ 30% 318M 1s 02:19:09 55296K ........ ........ ........ ........ ........ ........ 31% 310M 1s 02:19:09 58368K ........ ........ ........ ........ ........ ........ 33% 385M 1s 02:19:09 61440K ........ ........ ........ ........ ........ ........ 35% 323M 1s 02:19:09 64512K ........ ........ ........ ........ ........ ........ 36% 313M 0s 02:19:09 67584K ........ ........ ........ ........ ........ 
........ 38% 318M 0s 02:19:09 70656K ........ ........ ........ ........ ........ ........ 40% 341M 0s 02:19:09 73728K ........ ........ ........ ........ ........ ........ 41% 328M 0s 02:19:09 76800K ........ ........ ........ ........ ........ ........ 43% 406M 0s 02:19:09 79872K ........ ........ ........ ........ ........ ........ 45% 391M 0s 02:19:09 82944K ........ ........ ........ ........ ........ ........ 46% 290M 0s 02:19:09 86016K ........ ........ ........ ........ ........ ........ 48% 338M 0s 02:19:09 89088K ........ ........ ........ ........ ........ ........ 50% 365M 0s 02:19:09 92160K ........ ........ ........ ........ ........ ........ 52% 391M 0s 02:19:09 95232K ........ ........ ........ ........ ........ ........ 53% 276M 0s 02:19:09 98304K ........ ........ ........ ........ ........ ........ 55% 325M 0s 02:19:09 101376K ........ ........ ........ ........ ........ ........ 57% 393M 0s 02:19:09 104448K ........ ........ ........ ........ ........ ........ 58% 399M 0s 02:19:09 107520K ........ ........ ........ ........ ........ ........ 60% 253M 0s 02:19:09 110592K ........ ........ ........ ........ ........ ........ 62% 330M 0s 02:19:09 113664K ........ ........ ........ ........ ........ ........ 63% 374M 0s 02:19:09 116736K ........ ........ ........ ........ ........ ........ 65% 394M 0s 02:19:09 119808K ........ ........ ........ ........ ........ ........ 67% 396M 0s 02:19:09 122880K ........ ........ ........ ........ ........ ........ 68% 314M 0s 02:19:09 125952K ........ ........ ........ ........ ........ ........ 70% 285M 0s 02:19:09 129024K ........ ........ ........ ........ ........ ........ 72% 323M 0s 02:19:09 132096K ........ ........ ........ ........ ........ ........ 73% 383M 0s 02:19:09 135168K ........ ........ ........ ........ ........ ........ 75% 392M 0s 02:19:09 138240K ........ ........ ........ ........ ........ ........ 77% 103M 0s 02:19:09 141312K ........ ........ ........ ........ ........ ........ 78% 378M 0s 02:19:09 144384K ........ ........ ........ ........ ........ ........ 80% 403M 0s 02:19:09 147456K ........ ........ ........ ........ ........ ........ 82% 397M 0s 02:19:09 150528K ........ ........ ........ ........ ........ ........ 83% 225M 0s 02:19:09 153600K ........ ........ ........ ........ ........ ........ 85% 326M 0s 02:19:09 156672K ........ ........ ........ ........ ........ ........ 87% 351M 0s 02:19:09 159744K ........ ........ ........ ........ ........ ........ 88% 365M 0s 02:19:09 162816K ........ ........ ........ ........ ........ ........ 90% 341M 0s 02:19:09 165888K ........ ........ ........ ........ ........ ........ 92% 363M 0s 02:19:09 168960K ........ ........ ........ ........ ........ ........ 93% 358M 0s 02:19:09 172032K ........ ........ ........ ........ ........ ........ 95% 353M 0s 02:19:09 175104K ........ ........ ........ ........ ........ ........ 97% 350M 0s 02:19:09 178176K ........ ........ ........ ........ ........ ........ 98% 360M 0s 02:19:09 181248K ........ ........ ........ ..... 100% 340M=0.6s 02:19:09 02:19:09 2025-07-18 02:19:09 (279 MB/s) - ‘karaf-0.22.0.zip’ saved [187527106/187527106] 02:19:09 02:19:09 Extracting the new controller... 02:19:09 + echo 'Extracting the new controller...' 02:19:09 + unzip -q karaf-0.22.0.zip 02:19:11 Adding external repositories... 02:19:11 + echo 'Adding external repositories...' 
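The distribution zip is extracted without an integrity check; a small sketch of verifying it first, assuming the .sha1 companion file Nexus normally publishes alongside the artifact contains just the hex digest:

    # Illustrative only; URL copied from the wget call above.
    DIST_URL=https://nexus.opendaylight.org/content/repositories//autorelease-9021/org/opendaylight/integration/karaf/0.22.0/karaf-0.22.0.zip
    wget -q "${DIST_URL}.sha1" -O karaf-0.22.0.zip.sha1
    echo "$(cat karaf-0.22.0.zip.sha1)  karaf-0.22.0.zip" | sha1sum -c -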
02:19:11 + sed -ie 's%org.ops4j.pax.url.mvn.repositories=%org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases%g' /tmp/karaf-0.22.0/etc/org.ops4j.pax.url.mvn.cfg 02:19:11 + cat /tmp/karaf-0.22.0/etc/org.ops4j.pax.url.mvn.cfg 02:19:11 ################################################################################ 02:19:11 # 02:19:11 # Licensed to the Apache Software Foundation (ASF) under one or more 02:19:11 # contributor license agreements. See the NOTICE file distributed with 02:19:11 # this work for additional information regarding copyright ownership. 02:19:11 # The ASF licenses this file to You under the Apache License, Version 2.0 02:19:11 # (the "License"); you may not use this file except in compliance with 02:19:11 # the License. You may obtain a copy of the License at 02:19:11 # 02:19:11 # http://www.apache.org/licenses/LICENSE-2.0 02:19:11 # 02:19:11 # Unless required by applicable law or agreed to in writing, software 02:19:11 # distributed under the License is distributed on an "AS IS" BASIS, 02:19:11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 02:19:11 # See the License for the specific language governing permissions and 02:19:11 # limitations under the License. 02:19:11 # 02:19:11 ################################################################################ 02:19:11 02:19:11 # 02:19:11 # If set to true, the following property will not allow any certificate to be used 02:19:11 # when accessing Maven repositories through SSL 02:19:11 # 02:19:11 #org.ops4j.pax.url.mvn.certificateCheck= 02:19:11 02:19:11 # 02:19:11 # Path to the local Maven settings file. 02:19:11 # The repositories defined in this file will be automatically added to the list 02:19:11 # of default repositories if the 'org.ops4j.pax.url.mvn.repositories' property 02:19:11 # below is not set. 02:19:11 # The following locations are checked for the existence of the settings.xml file 02:19:11 # * 1. looks for the specified url 02:19:11 # * 2. if not found looks for ${user.home}/.m2/settings.xml 02:19:11 # * 3. if not found looks for ${maven.home}/conf/settings.xml 02:19:11 # * 4. if not found looks for ${M2_HOME}/conf/settings.xml 02:19:11 # 02:19:11 #org.ops4j.pax.url.mvn.settings= 02:19:11 02:19:11 # 02:19:11 # Path to the local Maven repository which is used to avoid downloading 02:19:11 # artifacts when they already exist locally. 02:19:11 # The value of this property will be extracted from the settings.xml file 02:19:11 # above, or defaulted to: 02:19:11 # System.getProperty( "user.home" ) + "/.m2/repository" 02:19:11 # 02:19:11 org.ops4j.pax.url.mvn.localRepository=${karaf.home}/${karaf.default.repository} 02:19:11 02:19:11 # 02:19:11 # Default this to false. 
It's just weird to use undocumented repos 02:19:11 # 02:19:11 org.ops4j.pax.url.mvn.useFallbackRepositories=false 02:19:11 02:19:11 # 02:19:11 # Uncomment if you don't wanna use the proxy settings 02:19:11 # from the Maven conf/settings.xml file 02:19:11 # 02:19:11 # org.ops4j.pax.url.mvn.proxySupport=false 02:19:11 02:19:11 # 02:19:11 # Comma separated list of repositories scanned when resolving an artifact. 02:19:11 # Those repositories will be checked before iterating through the 02:19:11 # below list of repositories and even before the local repository 02:19:11 # A repository url can be appended with zero or more of the following flags: 02:19:11 # @snapshots : the repository contains snaphots 02:19:11 # @noreleases : the repository does not contain any released artifacts 02:19:11 # 02:19:11 # The following property value will add the system folder as a repo. 02:19:11 # 02:19:11 org.ops4j.pax.url.mvn.defaultRepositories=\ 02:19:11 file:${karaf.home}/${karaf.default.repository}@id=system.repository@snapshots,\ 02:19:11 file:${karaf.data}/kar@id=kar.repository@multi@snapshots,\ 02:19:11 file:${karaf.base}/${karaf.default.repository}@id=child.system.repository@snapshots 02:19:11 02:19:11 # Use the default local repo (e.g.~/.m2/repository) as a "remote" repo 02:19:11 #org.ops4j.pax.url.mvn.defaultLocalRepoAsRemote=false 02:19:11 02:19:11 # 02:19:11 # Comma separated list of repositories scanned when resolving an artifact. 02:19:11 # The default list includes the following repositories: 02:19:11 # http://repo1.maven.org/maven2@id=central 02:19:11 # http://repository.springsource.com/maven/bundles/release@id=spring.ebr 02:19:11 # http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external 02:19:11 # http://zodiac.springsource.com/maven/bundles/release@id=gemini 02:19:11 # http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases 02:19:11 # https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases 02:19:11 # https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 02:19:11 # To add repositories to the default ones, prepend '+' to the list of repositories 02:19:11 # to add. 02:19:11 # A repository url can be appended with zero or more of the following flags: 02:19:11 # @snapshots : the repository contains snapshots 02:19:11 # @noreleases : the repository does not contain any released artifacts 02:19:11 # @id=repository.id : the id for the repository, just like in the settings.xml this is optional but recommended 02:19:11 # 02:19:11 org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 02:19:11 02:19:11 ### ^^^ No remote repositories. 
This is the only ODL change compared to Karaf defaults.Configuring the startup features... 02:19:11 + [[ True == \T\r\u\e ]] 02:19:11 + echo 'Configuring the startup features...' 02:19:11 + sed -ie 's/\(featuresBoot=\|featuresBoot =\)/featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer,/g' /tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:11 + FEATURE_TEST_STRING=features-test 02:19:11 + FEATURE_TEST_VERSION=0.22.0 02:19:11 + KARAF_VERSION=karaf4 02:19:11 + [[ integration == \i\n\t\e\g\r\a\t\i\o\n ]] 02:19:11 + sed -ie 's%\(featuresRepositories=\|featuresRepositories =\)%featuresRepositories = mvn:org.opendaylight.integration/features-test/0.22.0/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features,%g' /tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:11 + [[ ! -z '' ]] 02:19:11 + cat /tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:11 ################################################################################ 02:19:11 # 02:19:11 # Licensed to the Apache Software Foundation (ASF) under one or more 02:19:11 # contributor license agreements. See the NOTICE file distributed with 02:19:11 # this work for additional information regarding copyright ownership. 02:19:11 # The ASF licenses this file to You under the Apache License, Version 2.0 02:19:11 # (the "License"); you may not use this file except in compliance with 02:19:11 # the License. You may obtain a copy of the License at 02:19:11 # 02:19:11 # http://www.apache.org/licenses/LICENSE-2.0 02:19:11 # 02:19:11 # Unless required by applicable law or agreed to in writing, software 02:19:11 # distributed under the License is distributed on an "AS IS" BASIS, 02:19:11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 02:19:11 # See the License for the specific language governing permissions and 02:19:11 # limitations under the License. 02:19:11 # 02:19:11 ################################################################################ 02:19:11 02:19:11 # 02:19:11 # Comma separated list of features repositories to register by default 02:19:11 # 02:19:11 featuresRepositories = mvn:org.opendaylight.integration/features-test/0.22.0/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features, file:${karaf.etc}/7dcf83ed-3930-47c1-8604-2be667442bd7.xml 02:19:11 02:19:11 # 02:19:11 # Comma separated list of features to install at startup 02:19:11 # 02:19:11 featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer, 6529f430-fdfc-4281-a08e-c1b54a0a72d5 02:19:11 02:19:11 # 02:19:11 # Resource repositories (OBR) that the features resolver can use 02:19:11 # to resolve requirements/capabilities 02:19:11 # 02:19:11 # The format of the resourceRepositories is 02:19:11 # resourceRepositories=[xml:url|json:url],... 
02:19:11 # for Instance: 02:19:11 # 02:19:11 #resourceRepositories=xml:http://host/path/to/index.xml 02:19:11 # or 02:19:11 #resourceRepositories=json:http://host/path/to/index.json 02:19:11 # 02:19:11 02:19:11 # 02:19:11 # Defines if the boot features are started in asynchronous mode (in a dedicated thread) 02:19:11 # 02:19:11 featuresBootAsynchronous=false 02:19:11 02:19:11 # 02:19:11 # Service requirements enforcement 02:19:11 # 02:19:11 # By default, the feature resolver checks the service requirements/capabilities of 02:19:11 # bundles for new features (xml schema >= 1.3.0) in order to automatically installs 02:19:11 # the required bundles. 02:19:11 # The following flag can have those values: 02:19:11 # - disable: service requirements are completely ignored 02:19:11 # - default: service requirements are ignored for old features 02:19:11 # - enforce: service requirements are always verified 02:19:11 # 02:19:11 #serviceRequirements=default 02:19:11 02:19:11 # 02:19:11 # Store cfg file for config element in feature 02:19:11 # 02:19:11 #configCfgStore=true 02:19:11 02:19:11 # 02:19:11 # Define if the feature service automatically refresh bundles 02:19:11 # 02:19:11 autoRefresh=true 02:19:11 02:19:11 # 02:19:11 # Configuration of features processing mechanism (overrides, blacklisting, modification of features) 02:19:11 # XML file defines instructions related to features processing 02:19:11 # versions.properties may declare properties to resolve placeholders in XML file 02:19:11 # both files are relative to ${karaf.etc} 02:19:11 # 02:19:11 #featureProcessing=org.apache.karaf.features.xml 02:19:11 #featureProcessingVersions=versions.properties 02:19:11 + configure_karaf_log karaf4 '' 02:19:11 + local -r karaf_version=karaf4 02:19:11 + local -r controllerdebugmap= 02:19:11 + local logapi=log4j 02:19:11 + grep log4j2 /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:11 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 02:19:11 log4j2.rootLogger.level = INFO 02:19:11 #log4j2.rootLogger.type = asyncRoot 02:19:11 #log4j2.rootLogger.includeLocation = false 02:19:11 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 02:19:11 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 02:19:11 log4j2.rootLogger.appenderRef.Console.ref = Console 02:19:11 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 02:19:11 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 02:19:11 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 02:19:11 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 02:19:11 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 02:19:11 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 02:19:11 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 02:19:11 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 02:19:11 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 02:19:11 log4j2.logger.spifly.name = org.apache.aries.spifly 02:19:11 log4j2.logger.spifly.level = WARN 02:19:11 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 02:19:11 log4j2.logger.audit.level = INFO 02:19:11 log4j2.logger.audit.additivity = false 02:19:11 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 02:19:11 # Console 
appender not used by default (see log4j2.rootLogger.appenderRefs) 02:19:11 log4j2.appender.console.type = Console 02:19:11 log4j2.appender.console.name = Console 02:19:11 log4j2.appender.console.layout.type = PatternLayout 02:19:11 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 02:19:11 log4j2.appender.rolling.type = RollingRandomAccessFile 02:19:11 log4j2.appender.rolling.name = RollingFile 02:19:11 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 02:19:11 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 02:19:11 #log4j2.appender.rolling.immediateFlush = false 02:19:11 log4j2.appender.rolling.append = true 02:19:11 log4j2.appender.rolling.layout.type = PatternLayout 02:19:11 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 02:19:11 log4j2.appender.rolling.policies.type = Policies 02:19:11 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 02:19:11 log4j2.appender.rolling.policies.size.size = 64MB 02:19:11 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 02:19:11 log4j2.appender.rolling.strategy.max = 7 02:19:11 log4j2.appender.audit.type = RollingRandomAccessFile 02:19:11 log4j2.appender.audit.name = AuditRollingFile 02:19:11 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 02:19:11 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 02:19:11 log4j2.appender.audit.append = true 02:19:11 log4j2.appender.audit.layout.type = PatternLayout 02:19:11 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 02:19:11 log4j2.appender.audit.policies.type = Policies 02:19:11 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 02:19:11 log4j2.appender.audit.policies.size.size = 8MB 02:19:11 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 02:19:11 log4j2.appender.audit.strategy.max = 7 02:19:11 log4j2.appender.osgi.type = PaxOsgi 02:19:11 log4j2.appender.osgi.name = PaxOsgi 02:19:11 log4j2.appender.osgi.filter = * 02:19:11 #log4j2.logger.aether.name = shaded.org.eclipse.aether 02:19:11 #log4j2.logger.aether.level = TRACE 02:19:11 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 02:19:11 #log4j2.logger.http-headers.level = DEBUG 02:19:11 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 02:19:11 #log4j2.logger.maven.level = TRACE 02:19:11 + logapi=log4j2 02:19:11 Configuring the karaf log... karaf_version: karaf4, logapi: log4j2 02:19:11 + echo 'Configuring the karaf log... 
karaf_version: karaf4, logapi: log4j2' 02:19:11 + '[' log4j2 == log4j2 ']' 02:19:11 + sed -ie 's/log4j2.appender.rolling.policies.size.size = 64MB/log4j2.appender.rolling.policies.size.size = 1GB/g' /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:11 + orgmodule=org.opendaylight.yangtools.yang.parser.repo.YangTextSchemaContextResolver 02:19:11 + orgmodule_=org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver 02:19:11 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN' 02:19:11 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN' 02:19:11 controllerdebugmap: 02:19:11 cat /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:11 + unset IFS 02:19:11 + echo 'controllerdebugmap: ' 02:19:11 + '[' -n '' ']' 02:19:11 + echo 'cat /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg' 02:19:11 + cat /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:11 ################################################################################ 02:19:11 # 02:19:11 # Licensed to the Apache Software Foundation (ASF) under one or more 02:19:11 # contributor license agreements. See the NOTICE file distributed with 02:19:11 # this work for additional information regarding copyright ownership. 02:19:11 # The ASF licenses this file to You under the Apache License, Version 2.0 02:19:11 # (the "License"); you may not use this file except in compliance with 02:19:11 # the License. You may obtain a copy of the License at 02:19:11 # 02:19:11 # http://www.apache.org/licenses/LICENSE-2.0 02:19:11 # 02:19:11 # Unless required by applicable law or agreed to in writing, software 02:19:11 # distributed under the License is distributed on an "AS IS" BASIS, 02:19:11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 02:19:11 # See the License for the specific language governing permissions and 02:19:11 # limitations under the License. 
02:19:11 # 02:19:11 ################################################################################ 02:19:11 02:19:11 # Common pattern layout for appenders 02:19:11 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 02:19:11 02:19:11 # Root logger 02:19:11 log4j2.rootLogger.level = INFO 02:19:11 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 02:19:11 #log4j2.rootLogger.type = asyncRoot 02:19:11 #log4j2.rootLogger.includeLocation = false 02:19:11 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 02:19:11 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 02:19:11 log4j2.rootLogger.appenderRef.Console.ref = Console 02:19:11 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 02:19:11 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 02:19:11 02:19:11 # Filters for logs marked by org.opendaylight.odlparent.Markers 02:19:11 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 02:19:11 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 02:19:11 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 02:19:11 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 02:19:11 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 02:19:11 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 02:19:11 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 02:19:11 02:19:11 # Loggers configuration 02:19:11 02:19:11 # Spifly logger 02:19:11 log4j2.logger.spifly.name = org.apache.aries.spifly 02:19:11 log4j2.logger.spifly.level = WARN 02:19:11 02:19:11 # Security audit logger 02:19:11 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 02:19:11 log4j2.logger.audit.level = INFO 02:19:11 log4j2.logger.audit.additivity = false 02:19:11 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 02:19:11 02:19:11 # Appenders configuration 02:19:11 02:19:11 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 02:19:11 log4j2.appender.console.type = Console 02:19:11 log4j2.appender.console.name = Console 02:19:11 log4j2.appender.console.layout.type = PatternLayout 02:19:11 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 02:19:11 02:19:11 # Rolling file appender 02:19:11 log4j2.appender.rolling.type = RollingRandomAccessFile 02:19:11 log4j2.appender.rolling.name = RollingFile 02:19:11 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 02:19:11 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 02:19:11 # uncomment to not force a disk flush 02:19:11 #log4j2.appender.rolling.immediateFlush = false 02:19:11 log4j2.appender.rolling.append = true 02:19:11 log4j2.appender.rolling.layout.type = PatternLayout 02:19:11 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 02:19:11 log4j2.appender.rolling.policies.type = Policies 02:19:11 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 02:19:11 log4j2.appender.rolling.policies.size.size = 1GB 02:19:11 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 02:19:11 log4j2.appender.rolling.strategy.max = 7 02:19:11 02:19:11 # Audit file appender 02:19:11 log4j2.appender.audit.type = RollingRandomAccessFile 02:19:11 log4j2.appender.audit.name = AuditRollingFile 
02:19:11 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 02:19:11 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 02:19:11 log4j2.appender.audit.append = true 02:19:11 log4j2.appender.audit.layout.type = PatternLayout 02:19:11 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 02:19:11 log4j2.appender.audit.policies.type = Policies 02:19:11 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 02:19:11 log4j2.appender.audit.policies.size.size = 8MB 02:19:11 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 02:19:11 log4j2.appender.audit.strategy.max = 7 02:19:11 02:19:11 # OSGi appender 02:19:11 log4j2.appender.osgi.type = PaxOsgi 02:19:11 log4j2.appender.osgi.name = PaxOsgi 02:19:11 log4j2.appender.osgi.filter = * 02:19:11 02:19:11 # help with identification of maven-related problems with pax-url-aether 02:19:11 #log4j2.logger.aether.name = shaded.org.eclipse.aether 02:19:11 #log4j2.logger.aether.level = TRACE 02:19:11 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 02:19:11 #log4j2.logger.http-headers.level = DEBUG 02:19:11 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 02:19:11 #log4j2.logger.maven.level = TRACE 02:19:11 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 02:19:11 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 02:19:11 + set_java_vars /usr/lib/jvm/java-21-openjdk-amd64 2048m /tmp/karaf-0.22.0/bin/setenv 02:19:11 Configure 02:19:11 java home: /usr/lib/jvm/java-21-openjdk-amd64 02:19:11 max memory: 2048m 02:19:11 memconf: /tmp/karaf-0.22.0/bin/setenv 02:19:11 + local -r java_home=/usr/lib/jvm/java-21-openjdk-amd64 02:19:11 + local -r controllermem=2048m 02:19:11 + local -r memconf=/tmp/karaf-0.22.0/bin/setenv 02:19:11 + echo Configure 02:19:11 + echo ' java home: /usr/lib/jvm/java-21-openjdk-amd64' 02:19:11 + echo ' max memory: 2048m' 02:19:11 + echo ' memconf: /tmp/karaf-0.22.0/bin/setenv' 02:19:11 + sed -ie 's%^# export JAVA_HOME%export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64}%g' /tmp/karaf-0.22.0/bin/setenv 02:19:11 + sed -ie 's/JAVA_MAX_MEM="2048m"/JAVA_MAX_MEM=2048m/g' /tmp/karaf-0.22.0/bin/setenv 02:19:11 cat /tmp/karaf-0.22.0/bin/setenv 02:19:11 + echo 'cat /tmp/karaf-0.22.0/bin/setenv' 02:19:11 + cat /tmp/karaf-0.22.0/bin/setenv 02:19:11 #!/bin/sh 02:19:11 # 02:19:11 # Licensed to the Apache Software Foundation (ASF) under one or more 02:19:11 # contributor license agreements. See the NOTICE file distributed with 02:19:11 # this work for additional information regarding copyright ownership. 02:19:11 # The ASF licenses this file to You under the Apache License, Version 2.0 02:19:11 # (the "License"); you may not use this file except in compliance with 02:19:11 # the License. You may obtain a copy of the License at 02:19:11 # 02:19:11 # http://www.apache.org/licenses/LICENSE-2.0 02:19:11 # 02:19:11 # Unless required by applicable law or agreed to in writing, software 02:19:11 # distributed under the License is distributed on an "AS IS" BASIS, 02:19:11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 02:19:11 # See the License for the specific language governing permissions and 02:19:11 # limitations under the License. 
02:19:11 # 02:19:11 02:19:11 # 02:19:11 # handle specific scripts; the SCRIPT_NAME is exactly the name of the Karaf 02:19:11 # script: client, instance, shell, start, status, stop, karaf 02:19:11 # 02:19:11 # if [ "${KARAF_SCRIPT}" == "SCRIPT_NAME" ]; then 02:19:11 # Actions go here... 02:19:11 # fi 02:19:11 02:19:11 # 02:19:11 # general settings which should be applied for all scripts go here; please keep 02:19:11 # in mind that it is possible that scripts might be executed more than once, e.g. 02:19:11 # in example of the start script where the start script is executed first and the 02:19:11 # karaf script afterwards. 02:19:11 # 02:19:11 02:19:11 # 02:19:11 # The following section shows the possible configuration options for the default 02:19:11 # karaf scripts 02:19:11 # 02:19:11 export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64} # Location of Java installation 02:19:11 # export JAVA_OPTS # Generic JVM options, for instance, where you can pass the memory configuration 02:19:11 # export JAVA_NON_DEBUG_OPTS # Additional non-debug JVM options 02:19:11 # export EXTRA_JAVA_OPTS # Additional JVM options 02:19:11 # export KARAF_HOME # Karaf home folder 02:19:11 # export KARAF_DATA # Karaf data folder 02:19:11 # export KARAF_BASE # Karaf base folder 02:19:11 # export KARAF_ETC # Karaf etc folder 02:19:11 # export KARAF_LOG # Karaf log folder 02:19:11 # export KARAF_SYSTEM_OPTS # First citizen Karaf options 02:19:11 # export KARAF_OPTS # Additional available Karaf options 02:19:11 # export KARAF_DEBUG # Enable debug mode 02:19:11 # export KARAF_REDIRECT # Enable/set the std/err redirection when using bin/start 02:19:11 # export KARAF_NOROOT # Prevent execution as root if set to true 02:19:11 Set Java version 02:19:11 + echo 'Set Java version' 02:19:11 + sudo /usr/sbin/alternatives --install /usr/bin/java java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 1 02:19:11 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 02:19:11 sudo: a password is required 02:19:11 + sudo /usr/sbin/alternatives --set java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 02:19:11 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 02:19:11 sudo: a password is required 02:19:11 JDK default version ... 02:19:11 + echo 'JDK default version ...' 02:19:11 + java -version 02:19:11 openjdk version "21.0.5" 2024-10-15 02:19:11 OpenJDK Runtime Environment (build 21.0.5+11-Ubuntu-1ubuntu122.04) 02:19:11 OpenJDK 64-Bit Server VM (build 21.0.5+11-Ubuntu-1ubuntu122.04, mixed mode, sharing) 02:19:11 Set JAVA_HOME 02:19:11 + echo 'Set JAVA_HOME' 02:19:11 + export JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 02:19:11 + JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 02:19:11 ++ readlink -e /usr/lib/jvm/java-21-openjdk-amd64/bin/java 02:19:11 Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java 02:19:11 + JAVA_RESOLVED=/usr/lib/jvm/java-21-openjdk-amd64/bin/java 02:19:11 + echo 'Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java' 02:19:11 Listing all open ports on controller system... 02:19:11 + echo 'Listing all open ports on controller system...' 
02:19:11 + netstat -pnatu 02:19:11 /tmp/configuration-script.sh: line 40: netstat: command not found 02:19:11 + '[' -f /tmp/custom_shard_config.txt ']' 02:19:11 + echo 'Configuring cluster' 02:19:11 Configuring cluster 02:19:11 + /tmp/karaf-0.22.0/bin/configure_cluster.sh 3 10.30.170.189 10.30.171.89 10.30.171.50 02:19:11 ################################################ 02:19:11 ## Configure Cluster ## 02:19:11 ################################################ 02:19:11 ERROR: Cluster configurations files not found. Please configure clustering feature. 02:19:11 Dump pekko.conf 02:19:11 + echo 'Dump pekko.conf' 02:19:11 + cat /tmp/karaf-0.22.0/configuration/initial/pekko.conf 02:19:11 cat: /tmp/karaf-0.22.0/configuration/initial/pekko.conf: No such file or directory 02:19:11 Dump modules.conf 02:19:11 + echo 'Dump modules.conf' 02:19:11 + cat /tmp/karaf-0.22.0/configuration/initial/modules.conf 02:19:11 cat: /tmp/karaf-0.22.0/configuration/initial/modules.conf: No such file or directory 02:19:11 Dump module-shards.conf 02:19:11 + echo 'Dump module-shards.conf' 02:19:11 + cat /tmp/karaf-0.22.0/configuration/initial/module-shards.conf 02:19:11 cat: /tmp/karaf-0.22.0/configuration/initial/module-shards.conf: No such file or directory 02:19:11 Locating config plan to use... 02:19:11 config plan exists!!! 02:19:11 Changing the config plan path... 02:19:11 # Place the suites in run order: 02:19:11 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/scripts/set_akka_debug.sh 02:19:11 Executing /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/scripts/set_akka_debug.sh... 02:19:11 Copying config files to ODL Controller folder 02:19:11 Set AKKA debug on 10.30.170.189 02:19:11 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:19:11 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:19:12 Enable AKKA debug 02:19:12 sed: can't read /tmp/karaf-0.22.0/configuration/initial/pekko.conf: No such file or directory 02:19:12 Dump /tmp/karaf-0.22.0/configuration/initial/pekko.conf 02:19:12 cat: /tmp/karaf-0.22.0/configuration/initial/pekko.conf: No such file or directory 02:19:12 Dump /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:12 ################################################################################ 02:19:12 # 02:19:12 # Licensed to the Apache Software Foundation (ASF) under one or more 02:19:12 # contributor license agreements. See the NOTICE file distributed with 02:19:12 # this work for additional information regarding copyright ownership. 02:19:12 # The ASF licenses this file to You under the Apache License, Version 2.0 02:19:12 # (the "License"); you may not use this file except in compliance with 02:19:12 # the License. You may obtain a copy of the License at 02:19:12 # 02:19:12 # http://www.apache.org/licenses/LICENSE-2.0 02:19:12 # 02:19:12 # Unless required by applicable law or agreed to in writing, software 02:19:12 # distributed under the License is distributed on an "AS IS" BASIS, 02:19:12 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 02:19:12 # See the License for the specific language governing permissions and 02:19:12 # limitations under the License. 
02:19:12 # 02:19:12 ################################################################################ 02:19:12 02:19:12 # Common pattern layout for appenders 02:19:12 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 02:19:12 02:19:12 # Root logger 02:19:12 log4j2.rootLogger.level = INFO 02:19:12 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 02:19:12 #log4j2.rootLogger.type = asyncRoot 02:19:12 #log4j2.rootLogger.includeLocation = false 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 02:19:12 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 02:19:12 log4j2.rootLogger.appenderRef.Console.ref = Console 02:19:12 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 02:19:12 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 02:19:12 02:19:12 # Filters for logs marked by org.opendaylight.odlparent.Markers 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 02:19:12 02:19:12 # Loggers configuration 02:19:12 02:19:12 # Spifly logger 02:19:12 log4j2.logger.spifly.name = org.apache.aries.spifly 02:19:12 log4j2.logger.spifly.level = WARN 02:19:12 02:19:12 # Security audit logger 02:19:12 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 02:19:12 log4j2.logger.audit.level = INFO 02:19:12 log4j2.logger.audit.additivity = false 02:19:12 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 02:19:12 02:19:12 # Appenders configuration 02:19:12 02:19:12 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 02:19:12 log4j2.appender.console.type = Console 02:19:12 log4j2.appender.console.name = Console 02:19:12 log4j2.appender.console.layout.type = PatternLayout 02:19:12 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 02:19:12 02:19:12 # Rolling file appender 02:19:12 log4j2.appender.rolling.type = RollingRandomAccessFile 02:19:12 log4j2.appender.rolling.name = RollingFile 02:19:12 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 02:19:12 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 02:19:12 # uncomment to not force a disk flush 02:19:12 #log4j2.appender.rolling.immediateFlush = false 02:19:12 log4j2.appender.rolling.append = true 02:19:12 log4j2.appender.rolling.layout.type = PatternLayout 02:19:12 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 02:19:12 log4j2.appender.rolling.policies.type = Policies 02:19:12 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 02:19:12 log4j2.appender.rolling.policies.size.size = 1GB 02:19:12 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 02:19:12 log4j2.appender.rolling.strategy.max = 7 02:19:12 02:19:12 # Audit file appender 02:19:12 log4j2.appender.audit.type = RollingRandomAccessFile 02:19:12 log4j2.appender.audit.name = AuditRollingFile 
02:19:12 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 02:19:12 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 02:19:12 log4j2.appender.audit.append = true 02:19:12 log4j2.appender.audit.layout.type = PatternLayout 02:19:12 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 02:19:12 log4j2.appender.audit.policies.type = Policies 02:19:12 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 02:19:12 log4j2.appender.audit.policies.size.size = 8MB 02:19:12 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 02:19:12 log4j2.appender.audit.strategy.max = 7 02:19:12 02:19:12 # OSGi appender 02:19:12 log4j2.appender.osgi.type = PaxOsgi 02:19:12 log4j2.appender.osgi.name = PaxOsgi 02:19:12 log4j2.appender.osgi.filter = * 02:19:12 02:19:12 # help with identification of maven-related problems with pax-url-aether 02:19:12 #log4j2.logger.aether.name = shaded.org.eclipse.aether 02:19:12 #log4j2.logger.aether.level = TRACE 02:19:12 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 02:19:12 #log4j2.logger.http-headers.level = DEBUG 02:19:12 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 02:19:12 #log4j2.logger.maven.level = TRACE 02:19:12 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 02:19:12 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 02:19:12 log4j2.logger.cluster.name=akka.cluster 02:19:12 log4j2.logger.cluster.level=DEBUG 02:19:12 log4j2.logger.remote.name=akka.remote 02:19:12 log4j2.logger.remote.level=DEBUG 02:19:12 Set AKKA debug on 10.30.171.89 02:19:12 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:19:12 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:19:12 Enable AKKA debug 02:19:12 sed: can't read /tmp/karaf-0.22.0/configuration/initial/pekko.conf: No such file or directory 02:19:12 Dump /tmp/karaf-0.22.0/configuration/initial/pekko.conf 02:19:12 cat: /tmp/karaf-0.22.0/configuration/initial/pekko.conf: No such file or directory 02:19:12 Dump /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:12 ################################################################################ 02:19:12 # 02:19:12 # Licensed to the Apache Software Foundation (ASF) under one or more 02:19:12 # contributor license agreements. See the NOTICE file distributed with 02:19:12 # this work for additional information regarding copyright ownership. 02:19:12 # The ASF licenses this file to You under the Apache License, Version 2.0 02:19:12 # (the "License"); you may not use this file except in compliance with 02:19:12 # the License. You may obtain a copy of the License at 02:19:12 # 02:19:12 # http://www.apache.org/licenses/LICENSE-2.0 02:19:12 # 02:19:12 # Unless required by applicable law or agreed to in writing, software 02:19:12 # distributed under the License is distributed on an "AS IS" BASIS, 02:19:12 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 02:19:12 # See the License for the specific language governing permissions and 02:19:12 # limitations under the License. 
02:19:12 # 02:19:12 ################################################################################ 02:19:12 02:19:12 # Common pattern layout for appenders 02:19:12 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 02:19:12 02:19:12 # Root logger 02:19:12 log4j2.rootLogger.level = INFO 02:19:12 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 02:19:12 #log4j2.rootLogger.type = asyncRoot 02:19:12 #log4j2.rootLogger.includeLocation = false 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 02:19:12 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 02:19:12 log4j2.rootLogger.appenderRef.Console.ref = Console 02:19:12 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 02:19:12 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 02:19:12 02:19:12 # Filters for logs marked by org.opendaylight.odlparent.Markers 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 02:19:12 02:19:12 # Loggers configuration 02:19:12 02:19:12 # Spifly logger 02:19:12 log4j2.logger.spifly.name = org.apache.aries.spifly 02:19:12 log4j2.logger.spifly.level = WARN 02:19:12 02:19:12 # Security audit logger 02:19:12 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 02:19:12 log4j2.logger.audit.level = INFO 02:19:12 log4j2.logger.audit.additivity = false 02:19:12 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 02:19:12 02:19:12 # Appenders configuration 02:19:12 02:19:12 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 02:19:12 log4j2.appender.console.type = Console 02:19:12 log4j2.appender.console.name = Console 02:19:12 log4j2.appender.console.layout.type = PatternLayout 02:19:12 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 02:19:12 02:19:12 # Rolling file appender 02:19:12 log4j2.appender.rolling.type = RollingRandomAccessFile 02:19:12 log4j2.appender.rolling.name = RollingFile 02:19:12 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 02:19:12 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 02:19:12 # uncomment to not force a disk flush 02:19:12 #log4j2.appender.rolling.immediateFlush = false 02:19:12 log4j2.appender.rolling.append = true 02:19:12 log4j2.appender.rolling.layout.type = PatternLayout 02:19:12 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 02:19:12 log4j2.appender.rolling.policies.type = Policies 02:19:12 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 02:19:12 log4j2.appender.rolling.policies.size.size = 1GB 02:19:12 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 02:19:12 log4j2.appender.rolling.strategy.max = 7 02:19:12 02:19:12 # Audit file appender 02:19:12 log4j2.appender.audit.type = RollingRandomAccessFile 02:19:12 log4j2.appender.audit.name = AuditRollingFile 
02:19:12 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 02:19:12 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 02:19:12 log4j2.appender.audit.append = true 02:19:12 log4j2.appender.audit.layout.type = PatternLayout 02:19:12 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 02:19:12 log4j2.appender.audit.policies.type = Policies 02:19:12 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 02:19:12 log4j2.appender.audit.policies.size.size = 8MB 02:19:12 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 02:19:12 log4j2.appender.audit.strategy.max = 7 02:19:12 02:19:12 # OSGi appender 02:19:12 log4j2.appender.osgi.type = PaxOsgi 02:19:12 log4j2.appender.osgi.name = PaxOsgi 02:19:12 log4j2.appender.osgi.filter = * 02:19:12 02:19:12 # help with identification of maven-related problems with pax-url-aether 02:19:12 #log4j2.logger.aether.name = shaded.org.eclipse.aether 02:19:12 #log4j2.logger.aether.level = TRACE 02:19:12 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 02:19:12 #log4j2.logger.http-headers.level = DEBUG 02:19:12 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 02:19:12 #log4j2.logger.maven.level = TRACE 02:19:12 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 02:19:12 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 02:19:12 log4j2.logger.cluster.name=akka.cluster 02:19:12 log4j2.logger.cluster.level=DEBUG 02:19:12 log4j2.logger.remote.name=akka.remote 02:19:12 log4j2.logger.remote.level=DEBUG 02:19:12 Set AKKA debug on 10.30.171.50 02:19:12 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:19:12 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:19:12 Enable AKKA debug 02:19:12 sed: can't read /tmp/karaf-0.22.0/configuration/initial/pekko.conf: No such file or directory 02:19:12 Dump /tmp/karaf-0.22.0/configuration/initial/pekko.conf 02:19:12 Dump /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:12 cat: /tmp/karaf-0.22.0/configuration/initial/pekko.conf: No such file or directory 02:19:12 ################################################################################ 02:19:12 # 02:19:12 # Licensed to the Apache Software Foundation (ASF) under one or more 02:19:12 # contributor license agreements. See the NOTICE file distributed with 02:19:12 # this work for additional information regarding copyright ownership. 02:19:12 # The ASF licenses this file to You under the Apache License, Version 2.0 02:19:12 # (the "License"); you may not use this file except in compliance with 02:19:12 # the License. You may obtain a copy of the License at 02:19:12 # 02:19:12 # http://www.apache.org/licenses/LICENSE-2.0 02:19:12 # 02:19:12 # Unless required by applicable law or agreed to in writing, software 02:19:12 # distributed under the License is distributed on an "AS IS" BASIS, 02:19:12 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 02:19:12 # See the License for the specific language governing permissions and 02:19:12 # limitations under the License. 
02:19:12 # 02:19:12 ################################################################################ 02:19:12 02:19:12 # Common pattern layout for appenders 02:19:12 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 02:19:12 02:19:12 # Root logger 02:19:12 log4j2.rootLogger.level = INFO 02:19:12 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 02:19:12 #log4j2.rootLogger.type = asyncRoot 02:19:12 #log4j2.rootLogger.includeLocation = false 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 02:19:12 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 02:19:12 log4j2.rootLogger.appenderRef.Console.ref = Console 02:19:12 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 02:19:12 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 02:19:12 02:19:12 # Filters for logs marked by org.opendaylight.odlparent.Markers 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 02:19:12 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 02:19:12 02:19:12 # Loggers configuration 02:19:12 02:19:12 # Spifly logger 02:19:12 log4j2.logger.spifly.name = org.apache.aries.spifly 02:19:12 log4j2.logger.spifly.level = WARN 02:19:12 02:19:12 # Security audit logger 02:19:12 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 02:19:12 log4j2.logger.audit.level = INFO 02:19:12 log4j2.logger.audit.additivity = false 02:19:12 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 02:19:12 02:19:12 # Appenders configuration 02:19:12 02:19:12 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 02:19:12 log4j2.appender.console.type = Console 02:19:12 log4j2.appender.console.name = Console 02:19:12 log4j2.appender.console.layout.type = PatternLayout 02:19:12 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 02:19:12 02:19:12 # Rolling file appender 02:19:12 log4j2.appender.rolling.type = RollingRandomAccessFile 02:19:12 log4j2.appender.rolling.name = RollingFile 02:19:12 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 02:19:12 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 02:19:12 # uncomment to not force a disk flush 02:19:12 #log4j2.appender.rolling.immediateFlush = false 02:19:12 log4j2.appender.rolling.append = true 02:19:12 log4j2.appender.rolling.layout.type = PatternLayout 02:19:12 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 02:19:12 log4j2.appender.rolling.policies.type = Policies 02:19:12 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 02:19:12 log4j2.appender.rolling.policies.size.size = 1GB 02:19:12 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 02:19:12 log4j2.appender.rolling.strategy.max = 7 02:19:12 02:19:12 # Audit file appender 02:19:12 log4j2.appender.audit.type = RollingRandomAccessFile 02:19:12 log4j2.appender.audit.name = AuditRollingFile 
02:19:12 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 02:19:12 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 02:19:12 log4j2.appender.audit.append = true 02:19:12 log4j2.appender.audit.layout.type = PatternLayout 02:19:12 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 02:19:12 log4j2.appender.audit.policies.type = Policies 02:19:12 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 02:19:12 log4j2.appender.audit.policies.size.size = 8MB 02:19:12 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 02:19:12 log4j2.appender.audit.strategy.max = 7 02:19:12 02:19:12 # OSGi appender 02:19:12 log4j2.appender.osgi.type = PaxOsgi 02:19:12 log4j2.appender.osgi.name = PaxOsgi 02:19:12 log4j2.appender.osgi.filter = * 02:19:12 02:19:12 # help with identification of maven-related problems with pax-url-aether 02:19:12 #log4j2.logger.aether.name = shaded.org.eclipse.aether 02:19:12 #log4j2.logger.aether.level = TRACE 02:19:12 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 02:19:12 #log4j2.logger.http-headers.level = DEBUG 02:19:12 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 02:19:12 #log4j2.logger.maven.level = TRACE 02:19:12 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 02:19:12 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 02:19:12 log4j2.logger.cluster.name=akka.cluster 02:19:12 log4j2.logger.cluster.level=DEBUG 02:19:12 log4j2.logger.remote.name=akka.remote 02:19:12 log4j2.logger.remote.level=DEBUG 02:19:12 Finished running config plans 02:19:12 Starting member-1 with IP address 10.30.170.189 02:19:13 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:19:13 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:19:13 Redirecting karaf console output to karaf_console.log 02:19:13 Starting controller... 02:19:13 start: Redirecting Karaf output to /tmp/karaf-0.22.0/data/log/karaf_console.log 02:19:13 Starting member-2 with IP address 10.30.171.89 02:19:13 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:19:13 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:19:13 Redirecting karaf console output to karaf_console.log 02:19:13 Starting controller... 02:19:13 start: Redirecting Karaf output to /tmp/karaf-0.22.0/data/log/karaf_console.log 02:19:13 Starting member-3 with IP address 10.30.171.50 02:19:13 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:19:14 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:19:14 Redirecting karaf console output to karaf_console.log 02:19:14 Starting controller... 
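None of the three members has /tmp/karaf-0.22.0/configuration/initial/pekko.conf, which is why configure_cluster.sh reports "Cluster configurations files not found" and why the sed/cat steps in set_akka_debug.sh fail on every node; the "Enable AKKA debug" step ends up changing only org.ops4j.pax.logging.cfg (the akka.cluster/akka.remote DEBUG loggers visible in the dumps above), not the actor-system config. A minimal defensive sketch under those assumptions; the sed expression is a placeholder for whatever edit set_akka_debug.sh actually performs:

    PEKKO_CONF=/tmp/karaf-0.22.0/configuration/initial/pekko.conf
    if [ -f "$PEKKO_CONF" ]; then
        # Only attempt the debug-level edit when the file actually exists.
        sed -i 's/loglevel = "INFO"/loglevel = "DEBUG"/' "$PEKKO_CONF"   # placeholder edit
    else
        echo "pekko.conf not found; skipping actor-system debug settings" >&2
    fi
    # Separately, the dumped pax-logging config contains
    #   log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN
    # which looks like the level landed in the .name property, so that override likely has no effect.
    # The usual form would be:
    #   ...YangTextSchemaContextResolver.name  = org.opendaylight.yangtools.yang.parser.repo.YangTextSchemaContextResolver
    #   ...YangTextSchemaContextResolver.level = WARN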
02:19:14 start: Redirecting Karaf output to /tmp/karaf-0.22.0/data/log/karaf_console.log 02:19:14 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins436251332600683377.sh 02:19:14 common-functions.sh is being sourced 02:19:14 common-functions environment: 02:19:14 MAVENCONF: /tmp/karaf-0.22.0/etc/org.ops4j.pax.url.mvn.cfg 02:19:14 ACTUALFEATURES: 02:19:14 FEATURESCONF: /tmp/karaf-0.22.0/etc/org.apache.karaf.features.cfg 02:19:14 CUSTOMPROP: /tmp/karaf-0.22.0/etc/custom.properties 02:19:14 LOGCONF: /tmp/karaf-0.22.0/etc/org.ops4j.pax.logging.cfg 02:19:14 MEMCONF: /tmp/karaf-0.22.0/bin/setenv 02:19:14 CONTROLLERMEM: 2048m 02:19:14 AKKACONF: /tmp/karaf-0.22.0/configuration/initial/pekko.conf 02:19:14 MODULESCONF: /tmp/karaf-0.22.0/configuration/initial/modules.conf 02:19:14 MODULESHARDSCONF: /tmp/karaf-0.22.0/configuration/initial/module-shards.conf 02:19:14 SUITES: 02:19:14 02:19:14 + echo '#################################################' 02:19:14 ################################################# 02:19:14 + echo '## Verify Cluster is UP ##' 02:19:14 ## Verify Cluster is UP ## 02:19:14 + echo '#################################################' 02:19:14 ################################################# 02:19:14 + create_post_startup_script 02:19:14 + cat 02:19:14 + copy_and_run_post_startup_script 02:19:14 + seed_index=1 02:19:14 ++ seq 1 3 02:19:14 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 02:19:14 + CONTROLLERIP=ODL_SYSTEM_1_IP 02:19:14 + echo 'Execute the post startup script on controller 10.30.170.189' 02:19:14 Execute the post startup script on controller 10.30.170.189 02:19:14 + scp /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/post-startup-script.sh 10.30.170.189:/tmp/ 02:19:14 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:19:14 + ssh 10.30.170.189 'bash /tmp/post-startup-script.sh 1' 02:19:14 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:19:14 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:19:19 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:19:24 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:19:29 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:19:34 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:19:39 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:19:44 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:19:49 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:19:54 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:19:59 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:20:04 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:20:09 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:20:14 Waiting up to 3 minutes for controller to come up, checking every 5 seconds... 02:20:19 2025-07-18T02:19:31,776 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 02:20:19 Controller is UP 02:20:19 2025-07-18T02:19:31,776 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 02:20:19 Listing all open ports on controller system... 
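The post-startup script above polls with netstat, which is absent on this image (hence the repeated "command not found"), then waits up to 3 minutes and greps karaf.log for the "System ready" marker before declaring "Controller is UP". A minimal sketch of the same wait using ss from iproute2 instead of netstat, assuming the same log path; the port-8101 check is only an illustrative liveness hint, not taken from the job's script:

    KARAF_LOG=/tmp/karaf-0.22.0/data/log/karaf.log
    for _ in $(seq 1 36); do                      # up to 3 minutes, checking every 5 seconds
        ss -pnatu | grep -q 8101 && break         # Karaf SSH console port as a liveness hint (illustrative)
        sleep 5
    done
    if grep -q "System ready" "$KARAF_LOG"; then
        echo "Controller is UP"
    else
        echo "Controller did not come up in time" >&2
    fi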
02:20:19 /tmp/post-startup-script.sh: line 51: netstat: command not found 02:20:19 looking for "BindException: Address already in use" in log file 02:20:19 looking for "server is unhealthy" in log file 02:20:19 + '[' 1 == 0 ']' 02:20:19 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 02:20:19 + CONTROLLERIP=ODL_SYSTEM_2_IP 02:20:19 + echo 'Execute the post startup script on controller 10.30.171.89' 02:20:19 Execute the post startup script on controller 10.30.171.89 02:20:19 + scp /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/post-startup-script.sh 10.30.171.89:/tmp/ 02:20:19 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:20:20 + ssh 10.30.171.89 'bash /tmp/post-startup-script.sh 2' 02:20:20 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:20:20 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:20:25 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:20:30 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:20:35 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:20:40 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:20:45 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:20:50 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:20:55 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:21:00 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:21:05 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:21:10 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:21:15 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:21:20 Waiting up to 3 minutes for controller to come up, checking every 5 seconds... 02:21:25 2025-07-18T02:19:34,751 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 02:21:25 Controller is UP 02:21:25 2025-07-18T02:19:34,751 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 02:21:25 Listing all open ports on controller system... 02:21:25 /tmp/post-startup-script.sh: line 51: netstat: command not found 02:21:25 looking for "BindException: Address already in use" in log file 02:21:25 looking for "server is unhealthy" in log file 02:21:25 + '[' 2 == 0 ']' 02:21:25 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 02:21:25 + CONTROLLERIP=ODL_SYSTEM_3_IP 02:21:25 + echo 'Execute the post startup script on controller 10.30.171.50' 02:21:25 Execute the post startup script on controller 10.30.171.50 02:21:25 + scp /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/post-startup-script.sh 10.30.171.50:/tmp/ 02:21:25 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:21:26 + ssh 10.30.171.50 'bash /tmp/post-startup-script.sh 3' 02:21:26 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 
02:21:26 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:21:31 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:21:36 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:21:41 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:21:46 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:21:51 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:21:56 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:22:01 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:22:06 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:22:11 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:22:16 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:22:21 /tmp/post-startup-script.sh: line 4: netstat: command not found 02:22:26 Waiting up to 3 minutes for controller to come up, checking every 5 seconds... 02:22:31 2025-07-18T02:19:32,400 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 02:22:31 Controller is UP 02:22:31 2025-07-18T02:19:32,400 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 02:22:31 Listing all open ports on controller system... 02:22:31 /tmp/post-startup-script.sh: line 51: netstat: command not found 02:22:31 looking for "BindException: Address already in use" in log file 02:22:31 looking for "server is unhealthy" in log file 02:22:31 + '[' 0 == 0 ']' 02:22:31 + seed_index=1 02:22:31 + dump_controller_threads 02:22:31 ++ seq 1 3 02:22:31 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 02:22:31 + CONTROLLERIP=ODL_SYSTEM_1_IP 02:22:31 + echo 'Let'\''s take the karaf thread dump' 02:22:31 Let's take the karaf thread dump 02:22:31 + ssh 10.30.170.189 'sudo ps aux' 02:22:31 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:22:31 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/ps_before.log 02:22:31 ++ grep -v grep 02:22:31 ++ tr -s ' ' 02:22:31 ++ cut -f2 '-d ' 02:22:31 + pid=2128 02:22:31 + echo 'karaf main: org.apache.karaf.main.Main, pid:2128' 02:22:31 karaf main: org.apache.karaf.main.Main, pid:2128 02:22:31 + ssh 10.30.170.189 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 2128' 02:22:32 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:22:32 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 02:22:32 + CONTROLLERIP=ODL_SYSTEM_2_IP 02:22:32 + echo 'Let'\''s take the karaf thread dump' 02:22:32 Let's take the karaf thread dump 02:22:32 + ssh 10.30.171.89 'sudo ps aux' 02:22:32 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:22:33 ++ tr -s ' ' 02:22:33 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/ps_before.log 02:22:33 ++ grep -v grep 02:22:33 ++ cut -f2 '-d ' 02:22:33 + pid=2139 02:22:33 + echo 'karaf main: org.apache.karaf.main.Main, pid:2139' 02:22:33 karaf main: org.apache.karaf.main.Main, pid:2139 02:22:33 + ssh 10.30.171.89 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 2139' 02:22:33 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 
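The thread-dump step above captures ps aux from each member into ps_before.log, extracts the PID of the org.apache.karaf.main.Main process, and runs jstack against it. A compact sketch of that extraction, assuming the same ps output file; awk replaces the tr/cut pipeline purely for readability, and $ip is a placeholder for the member address (10.30.170.189 and so on):

    PS_LOG=/w/workspace/openflowplugin-csit-3node-clustering-only-titanium/ps_before.log
    pid=$(grep org.apache.karaf.main.Main "$PS_LOG" | grep -v grep | awk '{print $2}')
    echo "karaf main: org.apache.karaf.main.Main, pid:${pid}"
    ssh "$ip" "/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l ${pid}"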
02:22:33 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 02:22:33 + CONTROLLERIP=ODL_SYSTEM_3_IP 02:22:33 + echo 'Let'\''s take the karaf thread dump' 02:22:33 Let's take the karaf thread dump 02:22:33 + ssh 10.30.171.50 'sudo ps aux' 02:22:33 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:22:34 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/ps_before.log 02:22:34 ++ grep -v grep 02:22:34 ++ tr -s ' ' 02:22:34 ++ cut -f2 '-d ' 02:22:34 + pid=2125 02:22:34 + echo 'karaf main: org.apache.karaf.main.Main, pid:2125' 02:22:34 karaf main: org.apache.karaf.main.Main, pid:2125 02:22:34 + ssh 10.30.171.50 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 2125' 02:22:34 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:22:34 + '[' 0 -gt 0 ']' 02:22:34 + echo 'Generating controller variables...' 02:22:34 Generating controller variables... 02:22:34 ++ seq 1 3 02:22:34 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 02:22:34 + CONTROLLERIP=ODL_SYSTEM_1_IP 02:22:34 + odl_variables=' -v ODL_SYSTEM_1_IP:10.30.170.189' 02:22:34 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 02:22:34 + CONTROLLERIP=ODL_SYSTEM_2_IP 02:22:34 + odl_variables=' -v ODL_SYSTEM_1_IP:10.30.170.189 -v ODL_SYSTEM_2_IP:10.30.171.89' 02:22:34 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 02:22:34 + CONTROLLERIP=ODL_SYSTEM_3_IP 02:22:34 + odl_variables=' -v ODL_SYSTEM_1_IP:10.30.170.189 -v ODL_SYSTEM_2_IP:10.30.171.89 -v ODL_SYSTEM_3_IP:10.30.171.50' 02:22:34 + echo 'Generating mininet variables...' 02:22:34 Generating mininet variables... 02:22:34 ++ seq 1 1 02:22:34 + for i in $(seq 1 "${NUM_TOOLS_SYSTEM}") 02:22:34 + MININETIP=TOOLS_SYSTEM_1_IP 02:22:34 + tools_variables=' -v TOOLS_SYSTEM_1_IP:10.30.171.44' 02:22:34 + get_test_suites SUITES 02:22:34 + local __suite_list=SUITES 02:22:34 + echo 'Locating test plan to use...' 02:22:34 Locating test plan to use... 02:22:34 + testplan_filepath=/w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/testplans/openflowplugin-clustering-titanium.txt 02:22:34 + '[' '!' -f /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/testplans/openflowplugin-clustering-titanium.txt ']' 02:22:34 + testplan_filepath=/w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/testplans/openflowplugin-clustering.txt 02:22:34 + '[' disabled '!=' disabled ']' 02:22:34 + echo 'Changing the testplan path...' 02:22:34 Changing the testplan path... 
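The variable-generation trace above builds the Robot -v ODL_SYSTEM_<n>_IP:<address> arguments by iterating over NUM_ODL_SYSTEM and dereferencing each ODL_SYSTEM_<n>_IP variable. A minimal sketch of that pattern using bash indirect expansion; the addresses shown are the ones from this run and the variables are assumed to be exported by the job:

    NUM_ODL_SYSTEM=3
    ODL_SYSTEM_1_IP=10.30.170.189 ODL_SYSTEM_2_IP=10.30.171.89 ODL_SYSTEM_3_IP=10.30.171.50
    odl_variables=""
    for i in $(seq 1 "${NUM_ODL_SYSTEM}"); do
        CONTROLLERIP="ODL_SYSTEM_${i}_IP"
        # ${!CONTROLLERIP} dereferences the variable whose name is held in CONTROLLERIP.
        odl_variables="${odl_variables} -v ${CONTROLLERIP}:${!CONTROLLERIP}"
    done
    echo "${odl_variables}"
    # -v ODL_SYSTEM_1_IP:10.30.170.189 -v ODL_SYSTEM_2_IP:10.30.171.89 -v ODL_SYSTEM_3_IP:10.30.171.50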
02:22:34 + sed s:integration:/w/workspace/openflowplugin-csit-3node-clustering-only-titanium: /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/testplans/openflowplugin-clustering.txt 02:22:34 + cat testplan.txt 02:22:34 # Place the suites in run order: 02:22:34 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot 02:22:34 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot 02:22:34 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot 02:22:34 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot 02:22:34 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot 02:22:34 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot 02:22:34 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot 02:22:34 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot 02:22:34 + '[' -z '' ']' 02:22:34 ++ grep -E -v '(^[[:space:]]*#|^[[:space:]]*$)' testplan.txt 02:22:34 ++ tr '\012' ' ' 02:22:34 + suite_list='/w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot ' 02:22:34 + eval 'SUITES='\''/w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot 
/w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot '\''' 02:22:34 ++ SUITES='/w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot ' 02:22:34 + echo 'Starting Robot test suites /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot ...' 
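The steps above rewrite the "integration" prefix in the selected test plan to the workspace path, then flatten the non-comment, non-blank lines of testplan.txt into the space-separated SUITES list handed to robot. A minimal sketch of that expansion, assuming the same file names as in the trace:

    WORKSPACE=/w/workspace/openflowplugin-csit-3node-clustering-only-titanium
    testplan=${WORKSPACE}/test/csit/testplans/openflowplugin-clustering.txt
    # Point the suite paths at the checked-out workspace, then drop comments and blank lines.
    sed "s:integration:${WORKSPACE}:" "$testplan" > testplan.txt
    SUITES=$(grep -E -v '(^[[:space:]]*#|^[[:space:]]*$)' testplan.txt | tr '\012' ' ')
    echo "Starting Robot test suites ${SUITES} ..."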
02:22:34 Starting Robot test suites /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot ... 02:22:34 + robot -N openflowplugin-clustering.txt --removekeywords wuks -e exclude -e skip_if_titanium -v BUNDLEFOLDER:karaf-0.22.0 -v BUNDLE_URL:https://nexus.opendaylight.org/content/repositories//autorelease-9021/org/opendaylight/integration/karaf/0.22.0/karaf-0.22.0.zip -v CONTROLLER:10.30.170.189 -v CONTROLLER1:10.30.171.89 -v CONTROLLER2:10.30.171.50 -v CONTROLLER_USER:jenkins -v JAVA_HOME:/usr/lib/jvm/java-21-openjdk-amd64 -v JDKVERSION:openjdk21 -v JENKINS_WORKSPACE:/w/workspace/openflowplugin-csit-3node-clustering-only-titanium -v MININET:10.30.171.44 -v MININET1: -v MININET2: -v MININET_USER:jenkins -v NEXUSURL_PREFIX:https://nexus.opendaylight.org -v NUM_ODL_SYSTEM:3 -v NUM_TOOLS_SYSTEM:1 -v ODL_STREAM:titanium -v ODL_SYSTEM_IP:10.30.170.189 -v ODL_SYSTEM_1_IP:10.30.170.189 -v ODL_SYSTEM_2_IP:10.30.171.89 -v ODL_SYSTEM_3_IP:10.30.171.50 -v ODL_SYSTEM_USER:jenkins -v TOOLS_SYSTEM_IP:10.30.171.44 -v TOOLS_SYSTEM_1_IP:10.30.171.44 -v TOOLS_SYSTEM_USER:jenkins -v USER_HOME:/home/jenkins -v IS_KARAF_APPL:True -v WORKSPACE:/tmp -v ODL_OF_PLUGIN:lithium /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot 02:22:34 ============================================================================== 02:22:34 openflowplugin-clustering.txt 02:22:34 
============================================================================== 02:22:35 openflowplugin-clustering.txt.Cluster HA Owner Failover :: Test suite for C... 02:22:35 ============================================================================== 02:22:39 Check Shards Status Before Fail :: Check Status for all shards in ... | FAIL | 02:22:42 Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttributes(ReadHandler.java:221)\\n\\tat org.jolokia.handler.ReadHandler.fetchAttributes(ReadHa... 02:22:42 [ Message content over the limit has been removed. 
] 02:22:42 ...rvice.jetty.internal.PrioritizedHandlerCollection.handle(PrioritizedHandlerCollection.java:96)\\n\\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 183 (char 598) 02:22:42 ------------------------------------------------------------------------------ 02:22:42 Start Mininet Multiple Connections :: Start mininet tree,2 with co... | PASS | 02:22:51 ------------------------------------------------------------------------------ 02:22:51 Check Entity Owner Status And Find Owner and Successor Before Fail... | FAIL | 02:23:21 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 02:23:21 Lengths are different: 2 != 0 02:23:21 ------------------------------------------------------------------------------ 02:23:21 Reconnect Extra Switches To Successors And Check OVS Connections :... | FAIL | 02:23:21 Variable '@{original_successor_list}' not found. 02:23:21 ------------------------------------------------------------------------------ 02:23:21 Check Network Operational Information Before Fail :: Check devices... | FAIL | 02:23:27 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 5 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}]}]}}' contains 'openflow:1' 11 times, not 21 times. 02:23:27 ------------------------------------------------------------------------------ 02:23:27 Add Configuration In Owner and Verify Before Fail :: Add Flow in O... | FAIL | 02:23:27 Variable '${original_owner}' not found. 02:23:27 ------------------------------------------------------------------------------ 02:23:27 Modify Configuration In Owner and Verify Before Fail :: Modify Flo... | FAIL | 02:23:27 Variable '${original_owner}' not found. 02:23:27 ------------------------------------------------------------------------------ 02:23:27 Delete Configuration In Owner and Verify Before Fail :: Delete Flo... | FAIL | 02:23:27 Variable '${original_owner}' not found. 02:23:27 ------------------------------------------------------------------------------ 02:23:27 Add Configuration In Successor and Verify Before Fail :: Add Flow ... | FAIL | 02:23:27 Variable '${original_successor}' not found. 
02:23:27 ------------------------------------------------------------------------------ 02:23:27 Modify Configuration In Successor and Verify Before Fail :: Modify... | FAIL | 02:23:27 Variable '${original_successor}' not found. 02:23:27 ------------------------------------------------------------------------------ 02:23:27 Delete Configuration In Successor and Verify Before Fail :: Delete... | FAIL | 02:23:27 Variable '${original_successor}' not found. 02:23:27 ------------------------------------------------------------------------------ 02:23:27 Send RPC Add to Owner and Verify Before Fail :: Add Flow in Owner ... | FAIL | 02:23:27 Variable '${original_owner}' not found. 02:23:27 ------------------------------------------------------------------------------ 02:23:27 Send RPC Delete to Owner and Verify Before Fail :: Delete Flow in ... | FAIL | 02:23:27 Variable '${original_owner}' not found. 02:23:27 ------------------------------------------------------------------------------ 02:23:27 Send RPC Add to Successor and Verify Before Fail :: Add Flow in Su... | FAIL | 02:23:27 Variable '${original_successor}' not found. 02:23:27 ------------------------------------------------------------------------------ 02:23:27 Send RPC Delete to Successor and Verify Before Fail :: Delete Flow... | FAIL | 02:23:27 Variable '${original_successor}' not found. 02:23:27 ------------------------------------------------------------------------------ 02:23:27 Modify Network And Verify Before Fail :: Take a link down and veri... | FAIL | 02:23:48 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 20 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","o
pendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}]}]}}' contains 'openflow:1' 11 times, not 16 times. 02:23:48 ------------------------------------------------------------------------------ 02:23:48 Restore Network And Verify Before Fail :: Take the link up and ver... | FAIL | 02:23:59 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 10 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\'
]/node-connector[id=\'openflow:1:1\']"}]}]}]}}' contains 'openflow:1' 11 times, not 21 times. 02:23:59 ------------------------------------------------------------------------------ 02:23:59 Kill Owner Instance :: Kill Owner Instance and verify it is dead | FAIL | 02:23:59 Variable '${original_owner}' not found. 02:23:59 ------------------------------------------------------------------------------ 02:23:59 Check Shards Status After Fail :: Create original cluster list and... | FAIL | 02:23:59 Variable '${new_cluster_list}' not found. 02:23:59 ------------------------------------------------------------------------------ 02:23:59 Check Entity Owner Status And Find Owner and Successor After Fail ... | FAIL | 02:23:59 Variable '${original_successor}' not found. 02:23:59 ------------------------------------------------------------------------------ 02:23:59 Check Network Operational Information After Fail :: Check devices ... | FAIL | 02:23:59 Variable '${new_cluster_list}' not found. 02:23:59 ------------------------------------------------------------------------------ 02:23:59 Add Configuration In Owner and Verify After Fail :: Add Flow in Ow... | FAIL | 02:23:59 Variable '${new_owner}' not found. 02:23:59 ------------------------------------------------------------------------------ 02:23:59 Modify Configuration In Owner and Verify After Fail :: Modify Flow... | FAIL | 02:23:59 Variable '${new_owner}' not found. 02:23:59 ------------------------------------------------------------------------------ 02:23:59 Delete Configuration In Owner and Verify After Fail :: Delete Flow... | FAIL | 02:23:59 Variable '${new_owner}' not found. 02:23:59 ------------------------------------------------------------------------------ 02:23:59 Add Configuration In Successor and Verify After Fail :: Add Flow i... | FAIL | 02:23:59 Variable '${new_successor}' not found. 02:23:59 ------------------------------------------------------------------------------ 02:23:59 Modify Configuration In Successor and Verify After Fail :: Modify ... | FAIL | 02:23:59 Variable '${new_successor}' not found. 02:23:59 ------------------------------------------------------------------------------ 02:23:59 Delete Configuration In Successor and Verify After Fail :: Delete ... | FAIL | 02:23:59 Variable '${new_successor}' not found. 02:23:59 ------------------------------------------------------------------------------ 02:23:59 Send RPC Add to Owner and Verify After Fail :: Add Flow in Owner a... | FAIL | 02:23:59 Variable '${new_owner}' not found. 02:23:59 ------------------------------------------------------------------------------ 02:23:59 Send RPC Delete to Owner and Verify After Fail :: Delete Flow in O... | FAIL | 02:23:59 Variable '${new_owner}' not found. 02:23:59 ------------------------------------------------------------------------------ 02:23:59 Send RPC Add to Successor and Verify After Fail :: Add Flow in Suc... | FAIL | 02:23:59 Variable '${new_successor}' not found. 02:23:59 ------------------------------------------------------------------------------ 02:23:59 Send RPC Delete to Successor and Verify After Fail :: Delete Flow ... | FAIL | 02:23:59 Variable '${new_successor}' not found. 02:23:59 ------------------------------------------------------------------------------ 02:23:59 Modify Network and Verify After Fail :: Take a link down and verif... | FAIL | 02:23:59 Variable '${new_cluster_list}' not found. 
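Note on the "contains 'openflow:1' 11 times, not 16/21 times" failures above: Check_Item_Occurrence_Member_List_Or_All counts how often the switch id shows up in the operational flow:1 topology returned by each member. Below is a minimal sketch of that kind of check; the member address, credentials and RESTCONF path are illustrative assumptions, and the real keyword may build its request and do its counting differently.

    import requests

    MEMBER_IP = "10.30.170.1"   # hypothetical address of one cluster member
    URL = (f"http://{MEMBER_IP}:8181/rests/data/"
           "network-topology:network-topology/topology=flow%3A1?content=nonconfig")

    resp = requests.get(URL, auth=("admin", "admin"), timeout=10)
    count = resp.text.count("openflow:1")    # plain substring count, as the error text suggests
    expected = 21                            # 21 with all links up, 16 with one link down (per the log)
    if count != expected:
        print(f"'openflow:1' found {count} times, not {expected} times")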
02:23:59 ------------------------------------------------------------------------------ 02:23:59 Restore Network and Verify After Fail :: Take the link up and veri... | FAIL | 02:23:59 Variable '${new_cluster_list}' not found. 02:23:59 ------------------------------------------------------------------------------ 02:23:59 Start Old Owner Instance :: Start old Owner Instance and verify it... | FAIL | 02:23:59 This test fails due to https://jira.opendaylight.org/browse/CONTROLLER-1849 02:23:59 02:23:59 Variable '${original_owner}' not found. 02:23:59 ------------------------------------------------------------------------------ 02:23:59 Check Shards Status After Recover :: Create original cluster list ... | FAIL | 02:25:29 Keyword 'ClusterOpenFlow.Check OpenFlow Shards Status' failed after retrying for 1 minute 30 seconds. The last error was: Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.... 02:25:29 [ Message content over the limit has been removed. 
] 02:25:29 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 183 (char 598) 02:25:29 ------------------------------------------------------------------------------ 02:25:29 Check Entity Owner Status After Recover :: Check Entity Owner Stat... | FAIL | 02:26:00 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 02:26:00 Lengths are different: 2 != 0 02:26:00 ------------------------------------------------------------------------------ 02:26:00 Check Network Operational Information After Recover :: Check devic... | FAIL | 02:26:06 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 5 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}]}]}}' contains 'openflow:1' 11 times, not 21 times. 02:26:06 ------------------------------------------------------------------------------ 02:26:06 Add Configuration In Owner and Verify After Recover :: Add Flow in... | FAIL | 02:26:06 Variable '${new_owner}' not found. 02:26:06 ------------------------------------------------------------------------------ 02:26:06 Modify Configuration In Owner and Verify After Recover :: Modify F... | FAIL | 02:26:06 Variable '${new_owner}' not found. 02:26:06 ------------------------------------------------------------------------------ 02:26:06 Delete Configuration In Owner and Verify After Recover :: Delete F... | FAIL | 02:26:06 Variable '${new_owner}' not found. 02:26:06 ------------------------------------------------------------------------------ 02:26:06 Add Configuration In Old Owner and Verify After Recover :: Add Flo... | FAIL | 02:26:06 Variable '${original_owner}' not found. 
02:26:06 ------------------------------------------------------------------------------ 02:26:06 Modify Configuration In Old Owner and Verify After Recover :: Modi... | FAIL | 02:26:06 Variable '${original_owner}' not found. 02:26:06 ------------------------------------------------------------------------------ 02:26:06 Delete Configuration In Old Owner and Verify After Recover :: Dele... | FAIL | 02:26:06 Variable '${original_owner}' not found. 02:26:06 ------------------------------------------------------------------------------ 02:26:06 Send RPC Add to Owner and Verify After Recover :: Add Flow in Owne... | FAIL | 02:26:06 Variable '${new_owner}' not found. 02:26:06 ------------------------------------------------------------------------------ 02:26:06 Send RPC Delete to Owner and Verify After Recover :: Delete Flow i... | FAIL | 02:26:06 Variable '${new_owner}' not found. 02:26:06 ------------------------------------------------------------------------------ 02:26:06 Send RPC Add to Old Owner and Verify After Recover :: Add Flow in ... | FAIL | 02:26:06 Variable '${original_owner}' not found. 02:26:06 ------------------------------------------------------------------------------ 02:26:06 Send RPC Delete to Old Owner and Verify After Recover :: Delete Fl... | FAIL | 02:26:06 Variable '${original_owner}' not found. 02:26:06 ------------------------------------------------------------------------------ 02:26:06 Modify Network and Verify After Recover :: Take a link down and ve... | FAIL | 02:26:27 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 20 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inven
tory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}]}]}}' contains 'openflow:1' 11 times, not 16 times. 02:26:27 ------------------------------------------------------------------------------ 02:26:27 Restore Network and Verify After Recover :: Take the link up and v... | FAIL | 02:26:38 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 10 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'open
flow:1:1\']"}]}]}]}}' contains 'openflow:1' 11 times, not 21 times. 02:26:38 ------------------------------------------------------------------------------ 02:26:38 Stop Mininet and Exit :: Stop mininet and exit connection. | PASS | 02:26:40 ------------------------------------------------------------------------------ 02:26:40 Check No Network Operational Information :: Check device is not in... | PASS | 02:26:40 ------------------------------------------------------------------------------ 02:26:40 openflowplugin-clustering.txt.Cluster HA Owner Failover :: Test su... | FAIL | 02:26:40 51 tests, 3 passed, 48 failed 02:26:40 ============================================================================== 02:26:41 openflowplugin-clustering.txt.Cluster HA Owner Restart :: Test suite for Cl... 02:26:41 ============================================================================== 02:26:44 Check Shards Status Before Stop :: Check Status for all shards in ... | FAIL | 02:26:44 Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttributes(ReadHandler.java:221)\\n\\tat org.jolokia.handler.ReadHandler.fetchAttributes(ReadHa... 02:26:44 [ Message content over the limit has been removed. 
] 02:26:44 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 183 (char 598) 02:26:44 ------------------------------------------------------------------------------ 02:26:44 Start Mininet Multiple Connections :: Start mininet tree,2 with co... | PASS | 02:26:53 ------------------------------------------------------------------------------ 02:26:53 Check Entity Owner Status And Find Owner and Successor Before Stop... | FAIL | 02:27:24 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 02:27:24 Lengths are different: 2 != 0 02:27:24 ------------------------------------------------------------------------------ 02:27:24 Reconnect Extra Switches To Successors And Check OVS Connections :... | FAIL | 02:27:24 Variable '@{original_successor_list}' not found. 02:27:24 ------------------------------------------------------------------------------ 02:27:24 Check Network Operational Information Before Stop :: Check devices... | FAIL | 02:27:29 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 5 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}],"link":[{"link-id":"openflow:1:1","source":{"source-node":"openflow:1","source-tp":"openflow:1:1"},"destination":{"dest-tp":"openflow:2:3","dest-node":"openflow:2"}},{"link-id":"openflow:1:2","source":{"source-node":"openflow:1","source-tp":"openflow:1:2"},"destination":{"dest-tp":"openflow:3:3","dest-node":"openflow:3"}}]}]}}' contains 'openflow:1' 17 times, not 21 times. 02:27:29 ------------------------------------------------------------------------------ 02:27:29 Add Configuration In Owner and Verify Before Stop :: Add Flow in O... | FAIL | 02:27:29 Variable '${original_owner}' not found. 02:27:29 ------------------------------------------------------------------------------ 02:27:29 Modify Configuration In Owner and Verify Before Stop :: Modify Flo... | FAIL | 02:27:29 Variable '${original_owner}' not found. 
02:27:29 ------------------------------------------------------------------------------ 02:27:29 Delete Configuration In Owner and Verify Before Stop :: Delete Flo... | FAIL | 02:27:29 Variable '${original_owner}' not found. 02:27:29 ------------------------------------------------------------------------------ 02:27:29 Add Configuration In Successor and Verify Before Stop :: Add Flow ... | FAIL | 02:27:29 Variable '${original_successor}' not found. 02:27:29 ------------------------------------------------------------------------------ 02:27:29 Modify Configuration In Successor and Verify Before Stop :: Modify... | FAIL | 02:27:29 Variable '${original_successor}' not found. 02:27:29 ------------------------------------------------------------------------------ 02:27:29 Delete Configuration In Successor and Verify Before Stop :: Delete... | FAIL | 02:27:29 Variable '${original_successor}' not found. 02:27:29 ------------------------------------------------------------------------------ 02:27:29 Send RPC Add to Owner and Verify Before Stop :: Add Flow in Owner ... | FAIL | 02:27:29 Variable '${original_owner}' not found. 02:27:29 ------------------------------------------------------------------------------ 02:27:29 Send RPC Delete to Owner and Verify Before Stop :: Delete Flow in ... | FAIL | 02:27:29 Variable '${original_owner}' not found. 02:27:29 ------------------------------------------------------------------------------ 02:27:29 Send RPC Add to Successor and Verify Before Stop :: Add Flow in Su... | FAIL | 02:27:29 Variable '${original_successor}' not found. 02:27:29 ------------------------------------------------------------------------------ 02:27:29 Send RPC Delete to Successor and Verify Before Stop :: Delete Flow... | FAIL | 02:27:29 Variable '${original_successor}' not found. 02:27:29 ------------------------------------------------------------------------------ 02:27:29 Modify Network And Verify Before Stop :: Take a link down and veri... | FAIL | 02:27:50 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 20 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}],"link":[{"link-id":"openflow:1:2","source":{"source-node":"openflow:1","source-tp":"openflow:1:2"},"destination":{"dest-tp":"openflow:3:3","dest-node":"openflow:3"}}]}]}}' contains 'openflow:1' 14 times, not 16 times. 02:27:50 ------------------------------------------------------------------------------ 02:27:50 Restore Network And Verify Before Stop :: Take the link up and ver... | FAIL | 02:28:01 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 10 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}],"link":[{"link-id":"openflow:1:1","source":{"source-node":"openflow:1","source-tp":"openflow:1:1"},"destination":{"dest-tp":"openflow:2:3","dest-node":"openflow:2"}},{"link-id":"openflow:1:2","source":{"source-node":"openflow:1","source-tp":"openflow:1:2"},"destination":{"dest-tp":"openflow:3:3","dest-node":"openflow:3"}}]}]}}' contains 'openflow:1' 17 times, not 21 times. 02:28:01 ------------------------------------------------------------------------------ 02:28:01 Stop Owner Instance :: Stop Owner Instance and verify it is dead | FAIL | 02:28:01 Variable '${original_owner}' not found. 02:28:01 ------------------------------------------------------------------------------ 02:28:01 Check Shards Status After Stop :: Create original cluster list and... | FAIL | 02:28:01 Variable '${new_cluster_list}' not found. 
02:28:01 ------------------------------------------------------------------------------ 02:28:01 Check Entity Owner Status And Find Owner and Successor After Stop ... | FAIL | 02:28:01 Variable '${original_successor}' not found. 02:28:01 ------------------------------------------------------------------------------ 02:28:01 Check Network Operational Information After Stop :: Check devices ... | FAIL | 02:28:01 Variable '${new_cluster_list}' not found. 02:28:01 ------------------------------------------------------------------------------ 02:28:01 Add Configuration In Owner and Verify After Stop :: Add Flow in Ow... | FAIL | 02:28:01 Variable '${new_owner}' not found. 02:28:01 ------------------------------------------------------------------------------ 02:28:01 Modify Configuration In Owner and Verify After Stop :: Modify Flow... | FAIL | 02:28:01 Variable '${new_owner}' not found. 02:28:01 ------------------------------------------------------------------------------ 02:28:01 Delete Configuration In Owner and Verify After Stop :: Delete Flow... | FAIL | 02:28:01 Variable '${new_owner}' not found. 02:28:01 ------------------------------------------------------------------------------ 02:28:01 Add Configuration In Successor and Verify After Stop :: Add Flow i... | FAIL | 02:28:01 Variable '${new_successor}' not found. 02:28:01 ------------------------------------------------------------------------------ 02:28:01 Modify Configuration In Successor and Verify After Stop :: Modify ... | FAIL | 02:28:01 Variable '${new_successor}' not found. 02:28:01 ------------------------------------------------------------------------------ 02:28:01 Delete Configuration In Successor and Verify After Stop :: Delete ... | FAIL | 02:28:01 Variable '${new_successor}' not found. 02:28:01 ------------------------------------------------------------------------------ 02:28:01 Send RPC Add to Owner and Verify After Stop :: Add Flow in Owner a... | FAIL | 02:28:01 Variable '${new_owner}' not found. 02:28:01 ------------------------------------------------------------------------------ 02:28:01 Send RPC Delete to Owner and Verify After Stop :: Delete Flow in O... | FAIL | 02:28:01 Variable '${new_owner}' not found. 02:28:01 ------------------------------------------------------------------------------ 02:28:01 Send RPC Add to Successor and Verify After Stop :: Add Flow in Suc... | FAIL | 02:28:01 Variable '${new_successor}' not found. 02:28:01 ------------------------------------------------------------------------------ 02:28:01 Send RPC Delete to Successor and Verify After Stop :: Delete Flow ... | FAIL | 02:28:01 Variable '${new_successor}' not found. 02:28:01 ------------------------------------------------------------------------------ 02:28:01 Modify Network and Verify After Stop :: Take a link down and verif... | FAIL | 02:28:01 Variable '${new_cluster_list}' not found. 02:28:01 ------------------------------------------------------------------------------ 02:28:01 Restore Network and Verify After Stop :: Take the link up and veri... | FAIL | 02:28:01 Variable '${new_cluster_list}' not found. 02:28:01 ------------------------------------------------------------------------------ 02:28:01 Start Old Owner Instance :: Start old Owner Instance and verify it... | FAIL | 02:28:01 Variable '${original_owner}' not found. 02:28:01 ------------------------------------------------------------------------------ 02:28:01 Check Shards Status After Start :: Create original cluster list an... 
| FAIL | 02:29:31 Keyword 'ClusterOpenFlow.Check OpenFlow Shards Status' failed after retrying for 1 minute 30 seconds. The last error was: Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.... 02:29:31 [ Message content over the limit has been removed. ] 02:29:31 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 183 (char 598) 02:29:31 ------------------------------------------------------------------------------ 02:29:31 Check Entity Owner Status After Start :: Check Entity Owner Status... | FAIL | 02:30:02 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. 
The last error was: Successor list [] is not the came as expected [2, 3] 02:30:02 Lengths are different: 2 != 0 02:30:02 ------------------------------------------------------------------------------ 02:30:02 Check Network Operational Information After Start :: Check devices... | FAIL | 02:30:08 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 5 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}],"link":[{"link-id":"openflow:1:1","source":{"source-node":"openflow:1","source-tp":"openflow:1:1"},"destination":{"dest-tp":"openflow:2:3","dest-node":"openflow:2"}},{"link-id":"openflow:1:2","source":{"source-node":"openflow:1","source-tp":"openflow:1:2"},"destination":{"dest-tp":"openflow:3:3","dest-node":"openflow:3"}}]}]}}' contains 'openflow:1' 17 times, not 21 times. 02:30:08 ------------------------------------------------------------------------------ 02:30:08 Add Configuration In Owner and Verify After Start :: Add Flow in O... 
| FAIL | 02:30:08 Variable '${new_owner}' not found. 02:30:08 ------------------------------------------------------------------------------ 02:30:08 Modify Configuration In Owner and Verify After Start :: Modify Flo... | FAIL | 02:30:08 Variable '${new_owner}' not found. 02:30:08 ------------------------------------------------------------------------------ 02:30:08 Delete Configuration In Owner and Verify After Start :: Delete Flo... | FAIL | 02:30:08 Variable '${new_owner}' not found. 02:30:08 ------------------------------------------------------------------------------ 02:30:08 Add Configuration In Old Owner and Verify After Start :: Add Flow ... | FAIL | 02:30:08 Variable '${original_owner}' not found. 02:30:08 ------------------------------------------------------------------------------ 02:30:08 Modify Configuration In Old Owner and Verify After Start :: Modify... | FAIL | 02:30:08 Variable '${original_owner}' not found. 02:30:08 ------------------------------------------------------------------------------ 02:30:08 Delete Configuration In Old Owner and Verify After Start :: Delete... | FAIL | 02:30:08 Variable '${original_owner}' not found. 02:30:08 ------------------------------------------------------------------------------ 02:30:08 Send RPC Add to Owner and Verify After Start :: Add Flow in Owner ... | FAIL | 02:30:08 Variable '${new_owner}' not found. 02:30:08 ------------------------------------------------------------------------------ 02:30:08 Send RPC Delete to Owner and Verify After Start :: Delete Flow in ... | FAIL | 02:30:08 Variable '${new_owner}' not found. 02:30:08 ------------------------------------------------------------------------------ 02:30:08 Send RPC Add to Old Owner and Verify After Start :: Add Flow in Ow... | FAIL | 02:30:08 Variable '${original_owner}' not found. 02:30:08 ------------------------------------------------------------------------------ 02:30:08 Send RPC Delete to Old Owner and Verify After Start :: Delete Flow... | FAIL | 02:30:08 Variable '${original_owner}' not found. 02:30:08 ------------------------------------------------------------------------------ 02:30:08 Modify Network and Verify After Start :: Take a link down and veri... | FAIL | 02:30:29 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 20 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}],"link":[{"link-id":"openflow:1:2","source":{"source-node":"openflow:1","source-tp":"openflow:1:2"},"destination":{"dest-tp":"openflow:3:3","dest-node":"openflow:3"}}]}]}}' contains 'openflow:1' 14 times, not 16 times. 02:30:29 ------------------------------------------------------------------------------ 02:30:29 Restore Network and Verify After Start :: Take the link up and ver... | FAIL | 02:30:40 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 10 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}],"link":[{"link-id":"openflow:1:1","source":{"source-node":"openflow:1","source-tp":"openflow:1:1"},"destination":{"dest-tp":"openflow:2:3","dest-node":"openflow:2"}},{"link-id":"openflow:1:2","source":{"source-node":"openflow:1","source-tp":"openflow:1:2"},"destination":{"dest-tp":"openflow:3:3","dest-node":"openflow:3"}}]}]}}' contains 'openflow:1' 17 times, not 21 times. 02:30:40 ------------------------------------------------------------------------------ 02:30:40 Stop Mininet and Exit :: Stop mininet and exit connection. | PASS | 02:30:42 ------------------------------------------------------------------------------ 02:30:42 Check No Network Operational Information :: Check device is not in... | PASS | 02:30:42 ------------------------------------------------------------------------------ 02:30:42 openflowplugin-clustering.txt.Cluster HA Owner Restart :: Test sui... 
| FAIL | 02:30:42 51 tests, 3 passed, 48 failed 02:30:42 ============================================================================== 02:30:42 openflowplugin-clustering.txt.Cluster HA Data Recovery Leader Follower Fail... 02:30:42 ============================================================================== 02:30:45 Check Shards Status Before Leader Restart :: Check Status for all ... | FAIL | 02:30:46 Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttributes(ReadHandler.java:221)\\n\\tat org.jolokia.handler.ReadHandler.fetchAttributes(ReadHa... 02:30:46 [ Message content over the limit has been removed. 
] 02:30:46 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 183 (char 598) 02:30:46 ------------------------------------------------------------------------------ 02:30:46 Get inventory Leader Before Leader Restart :: Find leader in the i... | FAIL | 02:30:57 Keyword 'ClusterManagement.Get_Leader_And_Followers_For_Shard' failed after retrying for 10 seconds. The last error was: Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttr... 02:30:57 [ Message content over the limit has been removed. 
] 02:30:57 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 173 (char 568) 02:30:57 ------------------------------------------------------------------------------ 02:30:57 Start Mininet Connect To Follower Node1 :: Start mininet with conn... | FAIL | 02:30:58 Variable '${follower_node_1}' not found. 02:30:58 ------------------------------------------------------------------------------ 02:30:58 Add Flows In Follower Node2 and Verify Before Leader Restart :: Ad... | FAIL | 02:30:58 Variable '${follower_node_2}' not found. 02:30:58 ------------------------------------------------------------------------------ 02:30:58 Stop Mininet Connected To Follower Node1 and Exit :: Stop mininet ... | FAIL | 02:30:59 Variable '${mininet_conn_id}' not found. 02:30:59 ------------------------------------------------------------------------------ 02:30:59 Restart Leader From Cluster Node :: Stop Leader Node and Start it ... | FAIL | 02:30:59 Variable '${inventory_leader}' not found. 02:30:59 ------------------------------------------------------------------------------ 02:30:59 Get inventory Follower After Leader Restart :: Find new Followers ... | FAIL | 02:31:10 Keyword 'ClusterManagement.Get_Leader_And_Followers_For_Shard' failed after retrying for 10 seconds. 
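The JSONDecodeError above is a side effect of the real failure: Jolokia answers 404 with an error document whose "stacktrace" field appears to contain unescaped control characters (the tabs and newlines of the Java stack trace), and Python's json parser rejects control characters inside strings by default, so the Robot keyword cannot even parse the error reply. Because Get_Leader_And_Followers_For_Shard never succeeds, suite variables such as ${follower_node_1}, ${follower_node_2} and ${inventory_leader} are never set, which is why the subsequent test cases fail with "Variable ... not found". A minimal standalone sketch of the parsing behaviour (illustration only, not the suite's code):

    import json

    # A literal tab or newline inside a JSON string value is an unescaped
    # control character; the default (strict) decoder refuses it, as in the
    # "Invalid control character" errors above.
    payload = '{"stacktrace": "InstanceNotFoundException\n\tat ReadHandler.execute(...)", "status": 404}'

    try:
        json.loads(payload)
    except json.JSONDecodeError as exc:
        print("strict parse failed:", exc)      # Invalid control character at: ...

    # strict=False tells the decoder to accept control characters inside strings.
    data = json.loads(payload, strict=False)
    print(data["status"], data["stacktrace"][:35])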
The last error was: Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttr... 02:31:10 [ Message content over the limit has been removed. ] 02:31:10 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 173 (char 568) 02:31:10 ------------------------------------------------------------------------------ 02:31:10 Start Mininet Connect To Old Leader :: Start mininet with connecti... | FAIL | 02:31:10 Variable '${inventory_leader_old}' not found. 02:31:10 ------------------------------------------------------------------------------ 02:31:10 Verify Flows In Switch After Leader Restart :: Verify flows are in... 
| FAIL | 02:31:26 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 15 seconds. The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.170.189:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0?content=nonconfig 02:31:26 ------------------------------------------------------------------------------ 02:31:26 Stop Mininet Connected To Old Leader and Exit :: Stop mininet and ... | FAIL | 02:31:27 Variable '${mininet_conn_id}' not found. 02:31:27 ------------------------------------------------------------------------------ 02:31:27 Restart Follower Node2 :: Stop Follower Node2 and Start it Up, Ver... | FAIL | 02:31:27 Variable '${follower_node_2}' not found. 02:31:27 ------------------------------------------------------------------------------ 02:31:27 Get inventory Follower After Follower Restart :: Find Followers an... | FAIL | 02:31:38 Keyword 'ClusterManagement.Get_Leader_And_Followers_For_Shard' failed after retrying for 10 seconds. The last error was: Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttr... 02:31:38 [ Message content over the limit has been removed. 
] 02:31:38 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 173 (char 568) 02:31:38 ------------------------------------------------------------------------------ 02:31:38 Start Mininet Connect To Leader :: Start mininet with connection t... | FAIL | 02:31:39 Variable '${inventory_leader}' not found. 02:31:39 ------------------------------------------------------------------------------ 02:31:39 Verify Flows In Switch After Follower Restart :: Verify flows are ... | FAIL | 02:31:55 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 15 seconds. The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.170.189:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0?content=nonconfig 02:31:55 ------------------------------------------------------------------------------ 02:31:55 Stop Mininet Connected To Leader and Exit :: Stop mininet Connecte... | FAIL | 02:31:55 Variable '${mininet_conn_id}' not found. 02:31:55 ------------------------------------------------------------------------------ 02:31:55 Restart Full Cluster :: Stop all Cluster Nodes and Start it Up All. | PASS | 02:32:21 ------------------------------------------------------------------------------ 02:32:21 Get inventory Status After Cluster Restart :: Find New Followers a... | FAIL | 02:33:05 Keyword 'ClusterManagement.Get_Leader_And_Followers_For_Shard' failed after retrying for 10 seconds. 
The last error was: Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttr... 02:33:05 [ Message content over the limit has been removed. ] 02:33:05 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 173 (char 568) 02:33:05 ------------------------------------------------------------------------------ 02:33:05 Start Mininet Connect To Follower Node2 After Cluster Restart :: S... | FAIL | 02:33:05 Variable '${follower_node_2}' not found. 02:33:05 ------------------------------------------------------------------------------ 02:33:05 Verify Flows In Switch After Cluster Restart :: Verify flows are i... 
| FAIL | 02:33:21 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 15 seconds. The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.170.189:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0?content=nonconfig 02:33:21 ------------------------------------------------------------------------------ 02:33:21 Delete Flows In Follower Node1 and Verify After Leader Restart :: ... | FAIL | 02:33:22 Variable '${follower_node_1}' not found. 02:33:22 ------------------------------------------------------------------------------ 02:33:22 Stop Mininet Connected To Follower Node2 and Exit After Cluster Re... | FAIL | 02:33:23 Variable '${mininet_conn_id}' not found. 02:33:23 ------------------------------------------------------------------------------ 02:33:23 openflowplugin-clustering.txt.Cluster HA Data Recovery Leader Foll... | FAIL | 02:33:23 21 tests, 1 passed, 20 failed 02:33:23 ============================================================================== 02:33:23 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/libraries/VsctlListParser.py:61: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:33:23 if ctl_ref is not "": 02:33:23 openflowplugin-clustering.txt.010 Group Flows :: Switch connections and clu... 02:33:23 ============================================================================== 02:33:26 Add Groups And Flows :: Add 100 groups 1&2 and flows in every switch. | PASS | 02:33:30 ------------------------------------------------------------------------------ 02:33:30 Start Mininet Multiple Connections :: Start mininet linear with co... | PASS | 02:33:39 ------------------------------------------------------------------------------ 02:33:39 Check Linear Topology :: Check Linear Topology. | FAIL | 02:34:10 Keyword 'ClusterOpenFlow.Check Linear Topology On Member' failed after retrying for 30 seconds. 
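The SyntaxWarning quoted above points at an identity comparison in VsctlListParser.py line 61 (the ":1:" warnings that follow flag the same pattern, apparently in expressions compiled at run time): "is not" compares object identity, not value, so testing a string against the literal "" this way only works by accident of string interning. The intended emptiness test is an equality or truthiness check. A small sketch of the corrected comparison (the helper name and sample values are illustrative):

    # was (VsctlListParser.py:61):  if ctl_ref is not "":
    # 'is not' is an identity test; use equality (or plain truthiness) instead.
    def has_controller(ctl_ref: str) -> bool:
        return ctl_ref != ""        # or: return bool(ctl_ref)

    print(has_controller("tcp:10.30.170.189:6653"))  # True
    print(has_controller(""))                        # False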
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}]}]}]}}' does not contain '"source-tp":"openflow:1:2"' 02:34:10 ------------------------------------------------------------------------------ 02:34:10 Check Stats Are Not Frozen :: Check that duration flow stat is inc... | FAIL | 02:34:40 Keyword 'Check Flow Stats Are Not Frozen' failed after retrying for 30 seconds. The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.170.189:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0/flow=1?content=nonconfig 02:34:40 ------------------------------------------------------------------------------ 02:34:40 Check Flows In Operational DS :: Check Flows in operational DS. | FAIL | 02:34:51 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 10 seconds. The last error was: 3 != 303 02:34:51 ------------------------------------------------------------------------------ 02:34:51 Check Groups In Operational DS :: Check Groups in operational DS. | FAIL | 02:35:02 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. 
The last error was: 0 != 600 02:35:02 ------------------------------------------------------------------------------ 02:35:02 Check Flows In Switch :: Check Flows in switch. | FAIL | 02:35:02 3.0 != 303.0 02:35:02 ------------------------------------------------------------------------------ 02:35:02 Check Entity Owner Status And Find Owner and Successor Before Fail... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:35:03 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:35:03 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:35:34 | FAIL | 02:35:34 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 02:35:34 Lengths are different: 2 != 0 02:35:34 ------------------------------------------------------------------------------ 02:35:34 Disconnect Mininet From Owner :: Disconnect mininet from the owner :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:35:34 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:35:34 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:35:34 | FAIL | 02:35:34 Variable '${original_owner}' not found. 02:35:34 ------------------------------------------------------------------------------ 02:35:34 Check Entity Owner Status And Find Owner and Successor After Fail ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:35:34 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:35:34 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:35:45 | FAIL | 02:35:45 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Variable '${new_cluster_list}' not found. 02:35:45 ------------------------------------------------------------------------------ 02:35:45 Check Switch Moves To New Master :: Check switch s1 is connected t... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:35:45 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:35:45 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:35:45 | FAIL | 02:35:45 Variable '${new_owner}' not found. 02:35:45 ------------------------------------------------------------------------------ 02:35:45 Check Linear Topology After Disconnect :: Check Linear Topology. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:35:45 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:35:45 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:36:15 | FAIL | 02:36:15 Keyword 'ClusterOpenFlow.Check Linear Topology On Member' failed after retrying for 30 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}]}]}]}}' does not contain '"source-tp":"openflow:1:2"' 02:36:15 ------------------------------------------------------------------------------ 02:36:15 Check Stats Are Not Frozen After Disconnect :: Check that duration... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:36:16 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:36:16 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:36:46 | FAIL | 02:36:46 Keyword 'Check Flow Stats Are Not Frozen' failed after retrying for 30 seconds. The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.170.189:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0/flow=1?content=nonconfig 02:36:46 ------------------------------------------------------------------------------ 02:36:46 Remove Flows And Groups After Mininet Is Disconnected :: Remove 1 ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:36:46 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:36:47 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
02:36:47 | PASS | 02:36:47 ------------------------------------------------------------------------------ 02:36:47 Check Flows In Operational DS After Mininet Is Disconnected :: Che... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:36:47 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:36:47 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:37:17 | FAIL | 02:37:17 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: 3 != 300 02:37:17 ------------------------------------------------------------------------------ 02:37:17 Check Groups In Operational DS After Mininet Is Disconnected :: Ch... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:37:17 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:37:17 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:37:28 | FAIL | 02:37:28 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: 0 != 594 02:37:28 ------------------------------------------------------------------------------ 02:37:28 Check Flows In Switch After Mininet Is Disconnected :: Check Flows... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:37:28 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:37:28 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:37:28 | FAIL | 02:37:28 3.0 != 300.0 02:37:28 ------------------------------------------------------------------------------ 02:37:28 Reconnect Mininet To Owner :: Reconnect mininet to switch 1 owner. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:37:29 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:37:29 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:37:29 | FAIL | 02:37:29 Variable '${original_owner_list}' not found. 02:37:29 ------------------------------------------------------------------------------ 02:37:29 Check Entity Owner Status And Find Owner and Successor After Recon... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:37:29 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:37:29 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:37:59 | FAIL | 02:37:59 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 02:37:59 Lengths are different: 2 != 0 02:37:59 ------------------------------------------------------------------------------ 02:37:59 Add Flows And Groups After Owner Reconnect :: Add 1 group type 1&2... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:37:59 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:37:59 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:38:01 | PASS | 02:38:01 ------------------------------------------------------------------------------ 02:38:01 Check Stats Are Not Frozen After Owner Reconnect :: Check that dur... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:38:01 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:38:02 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:38:32 | FAIL | 02:38:32 Keyword 'Check Flow Stats Are Not Frozen' failed after retrying for 30 seconds. 
The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.170.189:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0/flow=1?content=nonconfig 02:38:32 ------------------------------------------------------------------------------ 02:38:32 Check Flows After Owner Reconnect In Operational DS :: Check Flows... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:38:32 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:38:32 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:02 | FAIL | 02:39:02 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: 3 != 303 02:39:02 ------------------------------------------------------------------------------ 02:39:02 Check Groups After Owner Reconnect In Operational DS :: Check Grou... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:03 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:03 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:13 | FAIL | 02:39:13 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: 0 != 600 02:39:13 ------------------------------------------------------------------------------ 02:39:13 Check Flows After Owner Reconnect In Switch :: Check Flows in switch. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:13 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:14 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:14 | FAIL | 02:39:14 3.0 != 303.0 02:39:14 ------------------------------------------------------------------------------ 02:39:14 Check Switches Generate Slave Connection :: Check switches are con... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:14 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:14 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:14 | FAIL | 02:39:14 Variable '${original_owner}' not found. 02:39:14 ------------------------------------------------------------------------------ 02:39:14 Disconnect Mininet From Successor :: Disconnect mininet from the S... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:14 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:14 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:14 | FAIL | 02:39:14 Variable '${new_successor_list}' not found. 02:39:14 ------------------------------------------------------------------------------ 02:39:14 Check Entity Owner Status And Find New Owner and Successor After D... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:14 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:15 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:25 | FAIL | 02:39:25 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Variable '${owner_list}' not found. 02:39:25 ------------------------------------------------------------------------------ 02:39:25 Disconnect Mininet From Current Owner :: Disconnect mininet from t... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:25 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:25 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
02:39:25 | FAIL | 02:39:25 Variable '${current_owner}' not found. 02:39:25 ------------------------------------------------------------------------------ 02:39:25 Check Entity Owner Status And Find Current Owner and Successor Aft... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:25 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:25 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:36 | FAIL | 02:39:36 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Variable '${original_owner_list}' not found. 02:39:36 ------------------------------------------------------------------------------ 02:39:36 Check Switch Moves To Current Master :: Check switch s1 is connect... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:36 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:36 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:36 | FAIL | 02:39:36 Variable '${current_new_owner}' not found. 02:39:36 ------------------------------------------------------------------------------ 02:39:36 Check Linear Topology After Owner Disconnect :: Check Linear Topol... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:36 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:39:36 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:40:06 | FAIL | 02:40:06 Keyword 'ClusterOpenFlow.Check Linear Topology On Member' failed after retrying for 30 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='
openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}]}]}]}}' does not contain '"source-tp":"openflow:1:2"' 02:40:06 ------------------------------------------------------------------------------ 02:40:06 Check Stats Are Not Frozen After Owner Disconnect :: Check that du... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:40:06 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:40:07 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:40:37 | FAIL | 02:40:37 Keyword 'Check Flow Stats Are Not Frozen' failed after retrying for 30 seconds. The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.170.189:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0/flow=1?content=nonconfig 02:40:37 ------------------------------------------------------------------------------ 02:40:37 Remove Flows And Groups After Owner Disconnected :: Remove 1 group... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:40:37 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:40:37 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:40:37 | PASS | 02:40:37 ------------------------------------------------------------------------------ 02:40:37 Check Flows In Operational DS After Owner Disconnected :: Check Fl... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:40:37 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:40:38 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:41:09 | FAIL | 02:41:09 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: 3 != 300 02:41:09 ------------------------------------------------------------------------------ 02:41:09 Check Groups In Operational DS After Owner Disconnected :: Check G... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:41:09 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:41:09 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:41:19 | FAIL | 02:41:19 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: 0 != 594 02:41:19 ------------------------------------------------------------------------------ 02:41:19 Check Flows In Switch After Owner Disconnected :: Check Flows in s... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:41:19 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:41:19 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:41:20 | FAIL | 02:41:20 3.0 != 300.0 02:41:20 ------------------------------------------------------------------------------ 02:41:20 Disconnect Mininet From Cluster :: Disconnect Mininet from Cluster. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:41:20 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:41:20 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:41:20 | FAIL | 02:41:20 Variable '${original_owner_list}' not found. 
02:41:20 ------------------------------------------------------------------------------ 02:41:20 Check No Switches After Disconnect :: Check no switches in topology. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:41:20 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:41:20 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:41:51 | FAIL | 02:41:51 Keyword 'ClusterOpenFlow.Check No Switches On Member' failed after retrying for 30 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}]}]}]}}' contains 'openflow:1' 02:41:51 ------------------------------------------------------------------------------ 02:41:51 Check Switch Is Not Connected :: Check switch s1 is not connected ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:41:51 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:41:51 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:02 | FAIL | 02:42:02 Keyword 'OvsManager.Should Be Disconnected' failed after retrying for 10 seconds. The last error was: Dictionary does not contain key 's1'. 
02:42:02 ------------------------------------------------------------------------------ 02:42:02 Reconnect Mininet To Cluster :: Reconnect mininet to cluster by re... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:02 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:02 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:02 10.30.170.189 02:42:03 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:03 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:03 10.30.171.89 02:42:04 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:04 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:04 10.30.171.50 02:42:05 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:06 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:06 | PASS | 02:42:06 ------------------------------------------------------------------------------ 02:42:06 Check Linear Topology After Mininet Reconnects :: Check Linear Top... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:06 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:06 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:16 | FAIL | 02:42:16 Keyword 'ClusterOpenFlow.Check Linear Topology On Member' failed after retrying for 10 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL'
]"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}]}]}]}}' does not contain '"source-tp":"openflow:1:2"' 02:42:16 ------------------------------------------------------------------------------ 02:42:16 Add Flows And Groups After Mininet Reconnects :: Add 1 group type ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:16 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:17 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:18 | PASS | 02:42:18 ------------------------------------------------------------------------------ 02:42:18 Check Flows In Operational DS After Mininet Reconnects :: Check Fl... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:19 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:19 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:49 | FAIL | 02:42:49 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: 3 != 303 02:42:49 ------------------------------------------------------------------------------ 02:42:49 Check Groups In Operational DS After Mininet Reconnects :: Check G... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:49 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:49 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:42:59 | FAIL | 02:42:59 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: 0 != 600 02:42:59 ------------------------------------------------------------------------------ 02:42:59 Check Flows In Switch After Mininet Reconnects :: Check Flows in s... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:00 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:00 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:00 | FAIL | 02:43:00 3.0 != 303.0 02:43:00 ------------------------------------------------------------------------------ 02:43:00 Check Entity Owner Status And Find Owner and Successor Before Owne... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:00 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:00 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:31 | FAIL | 02:43:31 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 02:43:31 Lengths are different: 2 != 0 02:43:31 ------------------------------------------------------------------------------ 02:43:31 Check Switch Generates Slave Connection Before Owner Stop :: Check... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:31 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:32 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:32 | FAIL | 02:43:32 Variable '${original_successor}' not found. 02:43:32 ------------------------------------------------------------------------------ 02:43:32 Check Shards Status Before Owner Stop :: Check Status for all shar... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:32 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:32 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
02:43:32 | FAIL | 02:43:32 Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttributes(ReadHandler.java:221)\\n\\tat org.jolokia.handler.ReadHandler.fetchAttributes(ReadHa... 02:43:32 [ Message content over the limit has been removed. ] 02:43:32 ...rvice.jetty.internal.PrioritizedHandlerCollection.handle(PrioritizedHandlerCollection.java:96)\\n\\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 183 (char 598) 02:43:32 ------------------------------------------------------------------------------ 02:43:32 Stop Owner Instance :: Stop Owner Instance and verify it is shutdown :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:32 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
02:43:33 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:33 | FAIL | 02:43:33 Variable '${original_owner}' not found. 02:43:33 ------------------------------------------------------------------------------ 02:43:33 Check Shards Status After Stop :: Check Status for all shards in O... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:33 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:33 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:33 | FAIL | 02:43:33 Variable '${new_cluster_list}' not found. 02:43:33 ------------------------------------------------------------------------------ 02:43:33 Check Entity Owner Status And Find Owner and Successor After Stop ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:33 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:33 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:33 | FAIL | 02:43:33 Variable '${original_successor}' not found. 02:43:33 ------------------------------------------------------------------------------ 02:43:33 Check Stats Are Not Frozen After Owner Stop :: Check that duration... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:33 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:43:33 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:44:04 | FAIL | 02:44:04 Keyword 'Check Flow Stats Are Not Frozen' failed after retrying for 30 seconds. The last error was: Variable '${new_owner}' not found. 02:44:04 ------------------------------------------------------------------------------ 02:44:04 Remove Configuration In Owner and Verify After Owner Stop :: Remov... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:44:04 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:44:04 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:44:04 | FAIL | 02:44:04 Variable '${new_owner}' not found. 02:44:04 ------------------------------------------------------------------------------ 02:44:04 Check Flows After Owner Stop In Operational DS :: Check Flows in O... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:44:04 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:44:04 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:44:35 | FAIL | 02:44:35 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: Variable '${new_owner}' not found. 02:44:35 ------------------------------------------------------------------------------ 02:44:35 Check Groups After Owner Stop In Operational DS :: Check Groups in... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:44:35 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:44:35 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:44:45 | FAIL | 02:44:45 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: Variable '${new_owner}' not found. 02:44:45 ------------------------------------------------------------------------------ 02:44:45 Check Flows In Switch After Owner Stop :: Check Flows in switch. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:44:45 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:44:45 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
02:44:45 | FAIL | 02:44:45 3.0 != 300.0 02:44:45 ------------------------------------------------------------------------------ 02:44:45 Start Old Owner Instance :: Start old Owner Instance and verify it... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:44:46 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:44:46 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:44:46 | FAIL | 02:44:46 Variable '${original_owner}' not found. 02:44:46 ------------------------------------------------------------------------------ 02:44:46 Check Entity Owner Status And Find Owner and Successor After Start... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:44:46 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:44:46 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:45:17 | FAIL | 02:45:17 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 02:45:17 Lengths are different: 2 != 0 02:45:17 ------------------------------------------------------------------------------ 02:45:17 Check Linear Topology After Owner Restart :: Check Linear Topology. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:45:17 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:45:17 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:45:27 | FAIL | 02:45:27 Keyword 'ClusterOpenFlow.Check Linear Topology On Member' failed after retrying for 10 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']","termination-point":[{"tp-id":"openflow
:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}]}]}]}}' does not contain '"source-tp":"openflow:1:2"' 02:45:27 ------------------------------------------------------------------------------ 02:45:27 Add Configuration In Owner and Verify After Owner Restart :: Add 1... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:45:28 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:45:28 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:45:28 | FAIL | 02:45:28 Variable '${new_owner}' not found. 02:45:28 ------------------------------------------------------------------------------ 02:45:28 Check Stats Are Not Frozen After Owner Restart :: Check that durat... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:45:28 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:45:28 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:45:28 | FAIL | 02:45:28 Variable '${new_owner}' not found. 02:45:28 ------------------------------------------------------------------------------ 02:45:28 Check Flows In Operational DS After Owner Restart :: Check Flows i... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:45:28 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:45:28 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:45:58 | FAIL | 02:45:58 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: 3 != 303 02:45:58 ------------------------------------------------------------------------------ 02:45:58 Check Groups In Operational DS After Owner Restart :: Check Groups... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:45:58 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:45:59 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:09 | FAIL | 02:46:09 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: 0 != 600 02:46:09 ------------------------------------------------------------------------------ 02:46:09 Check Flows In Switch After Owner Restart :: Check Flows in switch. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:09 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:09 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:09 | FAIL | 02:46:09 3.0 != 303.0 02:46:09 ------------------------------------------------------------------------------ 02:46:09 Restart Cluster :: Stop and Start cluster. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:09 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:10 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:11 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:11 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:12 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:12 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
02:46:13 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:13 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:13 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:13 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:16 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:16 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:18 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:18 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:18 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:18 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:19 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:19 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:19 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:19 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:19 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:19 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:20 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:20 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:20 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:20 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:21 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:21 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:21 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:21 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:45 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:45 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:45 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:45 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:46 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:46 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:46 | PASS | 02:46:46 ------------------------------------------------------------------------------ 02:46:46 Check Linear Topology After Controller Restarts :: Check Linear To... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:56 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:46:56 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:47:06 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:47:07 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:47:07 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:47:17 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:47:18 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:47:18 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:52:20 | FAIL | 02:52:20 Keyword 'ClusterOpenFlow.Check Linear Topology On Member' failed after retrying for 5 minutes. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']"},{"node-id":"openflow:3","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']"},{"node-id":"openflow:1","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']"}]}]}}' does not contain '"source-tp":"openflow:1:2"' 02:52:20 ------------------------------------------------------------------------------ 02:52:20 Check Stats Are Not Frozen After Cluster Restart :: Check that dur... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:52:20 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:52:20 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:52:51 | FAIL | 02:52:51 Keyword 'Check Flow Stats Are Not Frozen' failed after retrying for 30 seconds. The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.170.189:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0/flow=1?content=nonconfig 02:52:51 ------------------------------------------------------------------------------ 02:52:51 Check Flows In Operational DS After Controller Restarts :: Check F... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:52:51 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:52:51 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
02:53:22 | FAIL | 02:53:22 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: 3 != 303 02:53:22 ------------------------------------------------------------------------------ 02:53:22 Check Groups In Operational DS After Controller Restarts :: Check ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:53:22 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:53:22 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:53:33 | FAIL | 02:53:33 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: 342 != 600 02:53:33 ------------------------------------------------------------------------------ 02:53:33 Check Flows In Switch After Controller Restarts :: Check Flows in ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:53:33 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:53:33 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:53:34 | FAIL | 02:53:34 3.0 != 303.0 02:53:34 ------------------------------------------------------------------------------ 02:53:34 Stop Mininet :: Stop Mininet. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:53:34 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:53:34 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:53:34 | PASS | 02:53:34 ------------------------------------------------------------------------------ 02:53:34 Check No Switches :: Check no switches in topology. | PASS | 02:53:35 ------------------------------------------------------------------------------ 02:53:37 openflowplugin-clustering.txt.010 Group Flows :: Switch connection... | FAIL | 02:53:37 72 tests, 10 passed, 62 failed 02:53:37 ============================================================================== 02:53:38 openflowplugin-clustering.txt.010 Switch Disconnect :: Test suite for entit... 02:53:38 ============================================================================== 02:53:42 Switches To Be Connected To All Nodes :: Initial check for correct... | FAIL | 02:53:42 Parent suite setup failed: 02:53:42 Dictionary does not contain key 's1'. 02:53:42 ------------------------------------------------------------------------------ 02:53:42 Reconnecting Switch s1 | FAIL | 02:53:42 Parent suite setup failed: 02:53:42 Dictionary does not contain key 's1'. 02:53:42 ------------------------------------------------------------------------------ 02:53:42 Switches Still Be Connected To All Nodes | FAIL | 02:53:42 Parent suite setup failed: 02:53:42 Dictionary does not contain key 's1'. 02:53:42 ------------------------------------------------------------------------------ 02:53:42 openflowplugin-clustering.txt.010 Switch Disconnect :: Test suite ... | FAIL | 02:53:42 Suite setup failed: 02:53:42 Dictionary does not contain key 's1'. 02:53:42 02:53:42 3 tests, 0 passed, 3 failed 02:53:42 ============================================================================== 02:53:42 openflowplugin-clustering.txt.020 Cluster Node Failure :: Test suite for en... 02:53:42 ============================================================================== 02:53:47 Switches To Be Connected To All Nodes :: Initial check for correct... | FAIL | 02:53:47 Parent suite setup failed: 02:53:47 Dictionary does not contain key 's1'. 
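The "Check Shards Status Before Owner Stop" failure earlier in this run ("JSONDecodeError: Invalid control character at: line 8 column 183") is json.loads() in its default strict mode rejecting raw control characters inside a JSON string value; the Jolokia error body evaluated there quotes a Java stacktrace, and once literal newline/tab bytes reach the parser the parse fails. A minimal sketch of that failure mode, using a hypothetical payload rather than the suite's own code; strict=False is the stdlib's documented escape hatch:

import json

# Hypothetical payload: a JSON string value containing raw newline/tab bytes,
# similar to the Jolokia "stacktrace" field quoted in the failure above.
payload = '{"stacktrace": "InstanceNotFoundException\n\tat Some.method(Some.java:42)"}'

try:
    json.loads(payload)                     # default strict=True
except json.JSONDecodeError as err:
    print("strict parse failed:", err)      # Invalid control character at: ...

# strict=False tells the decoder to accept control characters inside strings.
data = json.loads(payload, strict=False)
print(data["stacktrace"].splitlines()[0])   # InstanceNotFoundException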
02:53:47 ------------------------------------------------------------------------------ 02:53:47 Restarting Owner Of Switch s1 | FAIL | 02:53:47 Parent suite setup failed: 02:53:47 Dictionary does not contain key 's1'. 02:53:47 ------------------------------------------------------------------------------ 02:53:47 Switches Still Be Connected To All Nodes | FAIL | 02:53:47 Parent suite setup failed: 02:53:47 Dictionary does not contain key 's1'. 02:53:47 ------------------------------------------------------------------------------ 02:53:47 openflowplugin-clustering.txt.020 Cluster Node Failure :: Test sui... | FAIL | 02:53:47 Suite setup failed: 02:53:47 Dictionary does not contain key 's1'. 02:53:47 02:53:47 3 tests, 0 passed, 3 failed 02:53:47 ============================================================================== 02:53:47 openflowplugin-clustering.txt.030 Cluster Sync Problems :: Test suite for e... 02:53:47 ============================================================================== 02:53:49 Start Mininet To All Nodes | FAIL | 02:53:51 Dictionary does not contain key 's1'. 02:53:51 ------------------------------------------------------------------------------ 02:53:51 Switches To Be Connected To All Nodes :: Initial check for correct... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:53:51 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:53:51 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:54:07 | FAIL | 02:54:07 Keyword 'Check All Switches Connected To All Cluster Nodes' failed after retrying 15 times. The last error was: Dictionary does not contain key 's1'. 02:54:07 ------------------------------------------------------------------------------ 02:54:07 Isolating Owner Of Switch s1 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:54:07 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:54:07 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:54:38 | FAIL | 02:54:38 This test fails due to https://bugs.opendaylight.org/show_bug.cgi?id=6177 02:54:38 02:54:38 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Could not parse owner and candidates for device openflow:1 02:54:38 ------------------------------------------------------------------------------ 02:54:38 Switches Still Be Connected To All Nodes :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:54:39 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:54:39 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:54:54 | FAIL | 02:54:54 This test fails due to https://bugs.opendaylight.org/show_bug.cgi?id=6177 02:54:54 02:54:54 Keyword 'Check All Switches Connected To All Cluster Nodes' failed after retrying 15 times. The last error was: Dictionary does not contain key 's1'. 02:54:54 ------------------------------------------------------------------------------ 02:54:54 Stop Mininet And Verify No Owners :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:54:54 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:54:55 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 02:55:09 | FAIL | 02:55:09 This test fails due to https://bugs.opendaylight.org/show_bug.cgi?id=6177 02:55:09 02:55:09 Keyword 'Check No Device Owners In Controller' failed after retrying 15 times. The last error was: Dictionary does not contain key '1'. 
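The ':1: SyntaxWarning: "is not" with a literal' lines repeated throughout the run appear to be CPython 3.8+ flagging identity comparisons against literals in expression strings compiled at run time; they are compile-time noise rather than test failures. A minimal sketch of the pattern and the value-comparison fix the warning suggests, using illustrative names rather than the suite's actual expressions:

import warnings

# CPython 3.8+ warns when a literal is compared with "is"/"is not";
# compiling a one-line expression string is enough to trigger it.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    compile("resp is not 200", "<evaluated>", "eval")
print(caught[0].message)        # "is not" with a literal. Did you mean "!="?

# The fix the warning suggests: compare values, not object identity.
resp = 404
print(eval("resp != 200"))      # True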
02:55:09 ------------------------------------------------------------------------------ 02:55:11 openflowplugin-clustering.txt.030 Cluster Sync Problems :: Test su... | FAIL | 02:55:11 5 tests, 0 passed, 5 failed 02:55:11 ============================================================================== 02:55:11 openflowplugin-clustering.txt.9145 :: Switch connections and cluster are re... 02:55:11 ============================================================================== 02:55:11 Start Mininet Multiple Connections :: Start mininet linear with co... | PASS | 02:55:19 ------------------------------------------------------------------------------ 02:55:19 Check Entity Owner Status And Find Owner and Successor :: Check En... | FAIL | 02:55:50 This test fails due to https://bugs.opendaylight.org/show_bug.cgi?id=9145 02:55:50 02:55:50 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 02:55:50 Lengths are different: 2 != 0 02:55:50 ------------------------------------------------------------------------------ 02:55:50 Stop Mininet :: Stop Mininet. | PASS | 02:55:50 ------------------------------------------------------------------------------ 02:55:50 openflowplugin-clustering.txt.9145 :: Switch connections and clust... | FAIL | 02:55:50 3 tests, 2 passed, 1 failed 02:55:50 ============================================================================== 02:55:50 openflowplugin-clustering.txt | FAIL | 02:55:50 209 tests, 19 passed, 190 failed 02:55:50 ============================================================================== 02:55:50 Output: /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/output.xml 02:55:58 Log: /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/log.html 02:55:58 Report: /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/report.html 02:55:58 + true 02:55:58 + echo 'Examining the files in data/log and checking filesize' 02:55:58 Examining the files in data/log and checking filesize 02:55:58 + ssh 10.30.170.189 'ls -altr /tmp/karaf-0.22.0/data/log/' 02:55:58 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:55:58 total 1516 02:55:58 drwxrwxr-x 2 jenkins jenkins 4096 Jul 18 02:19 . 02:55:58 -rw-rw-r-- 1 jenkins jenkins 1720 Jul 18 02:19 karaf_console.log 02:55:58 drwxrwxr-x 9 jenkins jenkins 4096 Jul 18 02:19 .. 02:55:58 -rw-rw-r-- 1 jenkins jenkins 1537025 Jul 18 02:55 karaf.log 02:55:58 + ssh 10.30.170.189 'du -hs /tmp/karaf-0.22.0/data/log/*' 02:55:58 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:55:59 1.5M /tmp/karaf-0.22.0/data/log/karaf.log 02:55:59 4.0K /tmp/karaf-0.22.0/data/log/karaf_console.log 02:55:59 + ssh 10.30.171.89 'ls -altr /tmp/karaf-0.22.0/data/log/' 02:55:59 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:55:59 total 1112 02:55:59 drwxrwxr-x 2 jenkins jenkins 4096 Jul 18 02:19 . 02:55:59 -rw-rw-r-- 1 jenkins jenkins 1720 Jul 18 02:19 karaf_console.log 02:55:59 drwxrwxr-x 9 jenkins jenkins 4096 Jul 18 02:19 .. 02:55:59 -rw-rw-r-- 1 jenkins jenkins 1126322 Jul 18 02:55 karaf.log 02:55:59 + ssh 10.30.171.89 'du -hs /tmp/karaf-0.22.0/data/log/*' 02:55:59 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 
02:55:59 1.1M /tmp/karaf-0.22.0/data/log/karaf.log 02:55:59 4.0K /tmp/karaf-0.22.0/data/log/karaf_console.log 02:55:59 + ssh 10.30.171.50 'ls -altr /tmp/karaf-0.22.0/data/log/' 02:55:59 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:55:59 total 1204 02:55:59 drwxrwxr-x 2 jenkins jenkins 4096 Jul 18 02:19 . 02:55:59 -rw-rw-r-- 1 jenkins jenkins 1720 Jul 18 02:19 karaf_console.log 02:55:59 drwxrwxr-x 9 jenkins jenkins 4096 Jul 18 02:19 .. 02:55:59 -rw-rw-r-- 1 jenkins jenkins 1218784 Jul 18 02:55 karaf.log 02:55:59 + ssh 10.30.171.50 'du -hs /tmp/karaf-0.22.0/data/log/*' 02:55:59 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:55:59 1.2M /tmp/karaf-0.22.0/data/log/karaf.log 02:55:59 4.0K /tmp/karaf-0.22.0/data/log/karaf_console.log 02:55:59 + set +e 02:55:59 ++ seq 1 3 02:55:59 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 02:55:59 + CONTROLLERIP=ODL_SYSTEM_1_IP 02:55:59 + echo 'Let'\''s take the karaf thread dump again' 02:55:59 Let's take the karaf thread dump again 02:55:59 + ssh 10.30.170.189 'sudo ps aux' 02:55:59 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:56:00 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/ps_after.log 02:56:00 ++ grep -v grep 02:56:00 ++ tr -s ' ' 02:56:00 ++ cut -f2 '-d ' 02:56:00 + pid=6453 02:56:00 + echo 'karaf main: org.apache.karaf.main.Main, pid:6453' 02:56:00 karaf main: org.apache.karaf.main.Main, pid:6453 02:56:00 + ssh 10.30.170.189 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 6453' 02:56:00 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:56:00 + echo 'killing karaf process...' 02:56:00 killing karaf process... 02:56:00 + ssh 10.30.170.189 bash -c 'ps axf | grep karaf | grep -v grep | awk '\''{print "kill -9 " $1}'\'' | sh' 02:56:00 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:56:00 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 02:56:00 + CONTROLLERIP=ODL_SYSTEM_2_IP 02:56:00 + echo 'Let'\''s take the karaf thread dump again' 02:56:00 Let's take the karaf thread dump again 02:56:00 + ssh 10.30.171.89 'sudo ps aux' 02:56:00 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:56:01 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/ps_after.log 02:56:01 ++ grep -v grep 02:56:01 ++ tr -s ' ' 02:56:01 ++ cut -f2 '-d ' 02:56:01 + pid=5712 02:56:01 + echo 'karaf main: org.apache.karaf.main.Main, pid:5712' 02:56:01 karaf main: org.apache.karaf.main.Main, pid:5712 02:56:01 + ssh 10.30.171.89 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 5712' 02:56:01 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:56:01 + echo 'killing karaf process...' 02:56:01 killing karaf process... 02:56:01 + ssh 10.30.171.89 bash -c 'ps axf | grep karaf | grep -v grep | awk '\''{print "kill -9 " $1}'\'' | sh' 02:56:01 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:56:01 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 02:56:01 + CONTROLLERIP=ODL_SYSTEM_3_IP 02:56:01 + echo 'Let'\''s take the karaf thread dump again' 02:56:01 Let's take the karaf thread dump again 02:56:01 + ssh 10.30.171.50 'sudo ps aux' 02:56:01 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 
02:56:02 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/ps_after.log 02:56:02 ++ tr -s ' ' 02:56:02 ++ grep -v grep 02:56:02 ++ cut -f2 '-d ' 02:56:02 + pid=5942 02:56:02 + echo 'karaf main: org.apache.karaf.main.Main, pid:5942' 02:56:02 karaf main: org.apache.karaf.main.Main, pid:5942 02:56:02 + ssh 10.30.171.50 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 5942' 02:56:02 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:56:02 + echo 'killing karaf process...' 02:56:02 killing karaf process... 02:56:02 + ssh 10.30.171.50 bash -c 'ps axf | grep karaf | grep -v grep | awk '\''{print "kill -9 " $1}'\'' | sh' 02:56:02 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:56:02 + sleep 5 02:56:07 ++ seq 1 3 02:56:07 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 02:56:07 + CONTROLLERIP=ODL_SYSTEM_1_IP 02:56:07 + echo 'Compressing karaf.log 1' 02:56:07 Compressing karaf.log 1 02:56:07 + ssh 10.30.170.189 gzip --best /tmp/karaf-0.22.0/data/log/karaf.log 02:56:08 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:56:08 + echo 'Fetching compressed karaf.log 1' 02:56:08 Fetching compressed karaf.log 1 02:56:08 + scp 10.30.170.189:/tmp/karaf-0.22.0/data/log/karaf.log.gz odl1_karaf.log.gz 02:56:08 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:56:08 + ssh 10.30.170.189 rm -f /tmp/karaf-0.22.0/data/log/karaf.log.gz 02:56:08 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:56:08 + scp 10.30.170.189:/tmp/karaf-0.22.0/data/log/karaf_console.log odl1_karaf_console.log 02:56:08 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:56:08 + ssh 10.30.170.189 rm -f /tmp/karaf-0.22.0/data/log/karaf_console.log 02:56:08 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:56:09 + echo 'Fetch GC logs' 02:56:09 Fetch GC logs 02:56:09 + mkdir -p gclogs-1 02:56:09 + scp '10.30.170.189:/tmp/karaf-0.22.0/data/log/*.log' gclogs-1/ 02:56:09 Warning: Permanently added '10.30.170.189' (ECDSA) to the list of known hosts. 02:56:09 scp: /tmp/karaf-0.22.0/data/log/*.log: No such file or directory 02:56:09 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 02:56:09 + CONTROLLERIP=ODL_SYSTEM_2_IP 02:56:09 + echo 'Compressing karaf.log 2' 02:56:09 Compressing karaf.log 2 02:56:09 + ssh 10.30.171.89 gzip --best /tmp/karaf-0.22.0/data/log/karaf.log 02:56:09 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:56:09 + echo 'Fetching compressed karaf.log 2' 02:56:09 Fetching compressed karaf.log 2 02:56:09 + scp 10.30.171.89:/tmp/karaf-0.22.0/data/log/karaf.log.gz odl2_karaf.log.gz 02:56:09 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:56:09 + ssh 10.30.171.89 rm -f /tmp/karaf-0.22.0/data/log/karaf.log.gz 02:56:09 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:56:09 + scp 10.30.171.89:/tmp/karaf-0.22.0/data/log/karaf_console.log odl2_karaf_console.log 02:56:09 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:56:10 + ssh 10.30.171.89 rm -f /tmp/karaf-0.22.0/data/log/karaf_console.log 02:56:10 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 
02:56:10 + echo 'Fetch GC logs' 02:56:10 Fetch GC logs 02:56:10 + mkdir -p gclogs-2 02:56:10 + scp '10.30.171.89:/tmp/karaf-0.22.0/data/log/*.log' gclogs-2/ 02:56:10 Warning: Permanently added '10.30.171.89' (ECDSA) to the list of known hosts. 02:56:10 scp: /tmp/karaf-0.22.0/data/log/*.log: No such file or directory 02:56:10 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 02:56:10 + CONTROLLERIP=ODL_SYSTEM_3_IP 02:56:10 + echo 'Compressing karaf.log 3' 02:56:10 Compressing karaf.log 3 02:56:10 + ssh 10.30.171.50 gzip --best /tmp/karaf-0.22.0/data/log/karaf.log 02:56:10 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:56:11 + echo 'Fetching compressed karaf.log 3' 02:56:11 Fetching compressed karaf.log 3 02:56:11 + scp 10.30.171.50:/tmp/karaf-0.22.0/data/log/karaf.log.gz odl3_karaf.log.gz 02:56:11 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:56:11 + ssh 10.30.171.50 rm -f /tmp/karaf-0.22.0/data/log/karaf.log.gz 02:56:11 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:56:11 + scp 10.30.171.50:/tmp/karaf-0.22.0/data/log/karaf_console.log odl3_karaf_console.log 02:56:11 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:56:12 + ssh 10.30.171.50 rm -f /tmp/karaf-0.22.0/data/log/karaf_console.log 02:56:12 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:56:12 + echo 'Fetch GC logs' 02:56:12 Fetch GC logs 02:56:12 + mkdir -p gclogs-3 02:56:12 + scp '10.30.171.50:/tmp/karaf-0.22.0/data/log/*.log' gclogs-3/ 02:56:12 Warning: Permanently added '10.30.171.50' (ECDSA) to the list of known hosts. 02:56:12 scp: /tmp/karaf-0.22.0/data/log/*.log: No such file or directory 02:56:12 + echo 'Examine copied files' 02:56:12 Examine copied files 02:56:12 + ls -lt 02:56:12 total 120092 02:56:12 drwxrwxr-x. 2 jenkins jenkins 6 Jul 18 02:56 gclogs-3 02:56:12 -rw-rw-r--. 1 jenkins jenkins 1720 Jul 18 02:56 odl3_karaf_console.log 02:56:12 -rw-rw-r--. 1 jenkins jenkins 96325 Jul 18 02:56 odl3_karaf.log.gz 02:56:12 drwxrwxr-x. 2 jenkins jenkins 6 Jul 18 02:56 gclogs-2 02:56:12 -rw-rw-r--. 1 jenkins jenkins 1720 Jul 18 02:56 odl2_karaf_console.log 02:56:12 -rw-rw-r--. 1 jenkins jenkins 93348 Jul 18 02:56 odl2_karaf.log.gz 02:56:12 drwxrwxr-x. 2 jenkins jenkins 6 Jul 18 02:56 gclogs-1 02:56:12 -rw-rw-r--. 1 jenkins jenkins 1720 Jul 18 02:56 odl1_karaf_console.log 02:56:12 -rw-rw-r--. 1 jenkins jenkins 103260 Jul 18 02:56 odl1_karaf.log.gz 02:56:12 -rw-rw-r--. 1 jenkins jenkins 134177 Jul 18 02:56 karaf_3_5942_threads_after.log 02:56:12 -rw-rw-r--. 1 jenkins jenkins 13528 Jul 18 02:56 ps_after.log 02:56:12 -rw-rw-r--. 1 jenkins jenkins 135924 Jul 18 02:56 karaf_2_5712_threads_after.log 02:56:12 -rw-rw-r--. 1 jenkins jenkins 148185 Jul 18 02:56 karaf_1_6453_threads_after.log 02:56:12 -rw-rw-r--. 1 jenkins jenkins 287260 Jul 18 02:55 report.html 02:56:12 -rw-rw-r--. 1 jenkins jenkins 2782047 Jul 18 02:55 log.html 02:56:12 -rw-rw-r--. 1 jenkins jenkins 118780359 Jul 18 02:55 output.xml 02:56:12 -rw-rw-r--. 1 jenkins jenkins 1180 Jul 18 02:22 testplan.txt 02:56:12 -rw-rw-r--. 1 jenkins jenkins 95204 Jul 18 02:22 karaf_3_2125_threads_before.log 02:56:12 -rw-rw-r--. 1 jenkins jenkins 13614 Jul 18 02:22 ps_before.log 02:56:12 -rw-rw-r--. 1 jenkins jenkins 93501 Jul 18 02:22 karaf_2_2139_threads_before.log 02:56:12 -rw-rw-r--. 1 jenkins jenkins 94362 Jul 18 02:22 karaf_1_2128_threads_before.log 02:56:12 -rw-rw-r--. 
1 jenkins jenkins 3043 Jul 18 02:19 post-startup-script.sh 02:56:12 -rw-rw-r--. 1 jenkins jenkins 1160 Jul 18 02:19 set_akka_debug.sh 02:56:12 -rw-rw-r--. 1 jenkins jenkins 133 Jul 18 02:19 configplan.txt 02:56:12 -rw-rw-r--. 1 jenkins jenkins 225 Jul 18 02:19 startup-script.sh 02:56:12 -rw-rw-r--. 1 jenkins jenkins 3278 Jul 18 02:19 configuration-script.sh 02:56:12 -rw-rw-r--. 1 jenkins jenkins 266 Jul 18 02:18 detect_variables.env 02:56:12 -rw-rw-r--. 1 jenkins jenkins 92 Jul 18 02:18 set_variables.env 02:56:12 -rw-rw-r--. 1 jenkins jenkins 355 Jul 18 02:18 slave_addresses.txt 02:56:12 -rw-rw-r--. 1 jenkins jenkins 570 Jul 18 02:18 requirements.txt 02:56:12 -rw-rw-r--. 1 jenkins jenkins 26 Jul 18 02:18 env.properties 02:56:12 -rw-rw-r--. 1 jenkins jenkins 334 Jul 18 02:15 stack-parameters.yaml 02:56:12 drwxrwxr-x. 7 jenkins jenkins 4096 Jul 18 02:14 test 02:56:12 drwxrwxr-x. 2 jenkins jenkins 6 Jul 18 02:14 test@tmp 02:56:12 + true 02:56:12 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/sh /tmp/jenkins17159955790739196158.sh 02:56:12 Cleaning up Robot installation... 02:56:12 $ ssh-agent -k 02:56:12 unset SSH_AUTH_SOCK; 02:56:12 unset SSH_AGENT_PID; 02:56:12 echo Agent pid 5299 killed; 02:56:12 [ssh-agent] Stopped. 02:56:12 Recording plot data 02:56:13 Robot results publisher started... 02:56:13 INFO: Checking test criticality is deprecated and will be dropped in a future release! 02:56:13 -Parsing output xml: 02:56:15 Done! 02:56:15 -Copying log files to build dir: 02:56:18 Done! 02:56:18 -Assigning results to build: 02:56:18 Done! 02:56:18 -Checking thresholds: 02:56:18 Done! 02:56:18 Done publishing Robot results. 02:56:18 Build step 'Publish Robot Framework test results' changed build result to UNSTABLE 02:56:18 [PostBuildScript] - [INFO] Executing post build scripts. 02:56:19 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins12110613135598200545.sh 02:56:19 Archiving csit artifacts 02:56:19 mv: cannot stat '*_1.png': No such file or directory 02:56:19 mv: cannot stat '/tmp/odl1_*': No such file or directory 02:56:19 mv: cannot stat '*_2.png': No such file or directory 02:56:19 mv: cannot stat '/tmp/odl2_*': No such file or directory 02:56:19 mv: cannot stat '*_3.png': No such file or directory 02:56:19 mv: cannot stat '/tmp/odl3_*': No such file or directory 02:56:19 % Total % Received % Xferd Average Speed Time Time Time Current 02:56:19 Dload Upload Total Spent Left Speed 02:56:19 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 1544k 0 1544k 0 0 5984k 0 --:--:-- --:--:-- --:--:-- 5961k 100 7208k 0 7208k 0 0 5734k 0 --:--:-- 0:00:01 --:--:-- 5729k 100 9.9M 0 9.9M 0 0 5430k 0 --:--:-- 0:00:01 --:--:-- 5430k 02:56:20 Archive: robot-plugin.zip 02:56:20 inflating: ./archives/robot-plugin/log.html 02:56:20 inflating: ./archives/robot-plugin/output.xml 02:56:21 inflating: ./archives/robot-plugin/report.html 02:56:21 mv: cannot stat '*.log.gz': No such file or directory 02:56:21 mv: cannot stat '*.csv': No such file or directory 02:56:21 mv: cannot stat '*.png': No such file or directory 02:56:21 [PostBuildScript] - [INFO] Executing post build scripts. 02:56:21 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins814630645740548028.sh 02:56:21 [PostBuildScript] - [INFO] Executing post build scripts. 02:56:21 [EnvInject] - Injecting environment variables from a build step. 
02:56:21 [EnvInject] - Injecting as environment variables the properties content 02:56:21 OS_CLOUD=vex 02:56:21 OS_STACK_NAME=releng-openflowplugin-csit-3node-clustering-only-titanium-313 02:56:21 02:56:21 [EnvInject] - Variables injected successfully. 02:56:21 provisioning config files... 02:56:21 copy managed file [clouds-yaml] to file:/home/jenkins/.config/openstack/clouds.yaml 02:56:21 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins14126876613086143478.sh 02:56:21 ---> openstack-stack-delete.sh 02:56:21 Setup pyenv: 02:56:22 system 02:56:22 3.8.13 02:56:22 3.9.13 02:56:22 3.10.13 02:56:22 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 02:56:22 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-zBC1 from file:/tmp/.os_lf_venv 02:56:24 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 02:56:24 lftools 0.37.13 requires urllib3<2.1.0, but you have urllib3 2.5.0 which is incompatible. 02:56:24 lf-activate-venv(): INFO: Installing: lftools[openstack] kubernetes python-heatclient python-openstackclient 02:56:42 lf-activate-venv(): INFO: Adding /tmp/venv-zBC1/bin to PATH 02:56:42 INFO: Retrieving stack cost for: releng-openflowplugin-csit-3node-clustering-only-titanium-313 02:56:47 DEBUG: Successfully retrieved stack cost: total: 0.38999999999999996 02:56:59 INFO: Deleting stack releng-openflowplugin-csit-3node-clustering-only-titanium-313 02:56:59 Successfully deleted stack releng-openflowplugin-csit-3node-clustering-only-titanium-313 02:56:59 [PostBuildScript] - [INFO] Executing post build scripts. 02:56:59 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins2356529691930064116.sh 02:56:59 ---> sysstat.sh 02:56:59 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins17498575748418027022.sh 02:56:59 ---> package-listing.sh 02:56:59 ++ tr '[:upper:]' '[:lower:]' 02:56:59 ++ facter osfamily 02:57:00 + OS_FAMILY=redhat 02:57:00 + workspace=/w/workspace/openflowplugin-csit-3node-clustering-only-titanium 02:57:00 + START_PACKAGES=/tmp/packages_start.txt 02:57:00 + END_PACKAGES=/tmp/packages_end.txt 02:57:00 + DIFF_PACKAGES=/tmp/packages_diff.txt 02:57:00 + PACKAGES=/tmp/packages_start.txt 02:57:00 + '[' /w/workspace/openflowplugin-csit-3node-clustering-only-titanium ']' 02:57:00 + PACKAGES=/tmp/packages_end.txt 02:57:00 + case "${OS_FAMILY}" in 02:57:00 + rpm -qa 02:57:00 + sort 02:57:00 + '[' -f /tmp/packages_start.txt ']' 02:57:00 + '[' -f /tmp/packages_end.txt ']' 02:57:00 + diff /tmp/packages_start.txt /tmp/packages_end.txt 02:57:00 + '[' /w/workspace/openflowplugin-csit-3node-clustering-only-titanium ']' 02:57:00 + mkdir -p /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/archives/ 02:57:00 + cp -f /tmp/packages_diff.txt /tmp/packages_end.txt /tmp/packages_start.txt /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/archives/ 02:57:00 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins2586600889851718268.sh 02:57:00 ---> capture-instance-metadata.sh 02:57:00 Setup pyenv: 02:57:01 system 02:57:01 3.8.13 02:57:01 3.9.13 02:57:01 3.10.13 02:57:01 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 02:57:01 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-zBC1 from file:/tmp/.os_lf_venv 02:57:03 lf-activate-venv(): INFO: Installing: lftools 02:57:13 
lf-activate-venv(): INFO: Adding /tmp/venv-zBC1/bin to PATH 02:57:13 INFO: Running in OpenStack, capturing instance metadata 02:57:14 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins1740915155104354807.sh 02:57:14 provisioning config files... 02:57:14 Could not find credentials [logs] for openflowplugin-csit-3node-clustering-only-titanium #313 02:57:14 copy managed file [jenkins-log-archives-settings] to file:/w/workspace/openflowplugin-csit-3node-clustering-only-titanium@tmp/config8426643914822233344tmp 02:57:14 Regular expression run condition: Expression=[^.*logs-s3.*], Label=[odl-logs-s3-cloudfront-index] 02:57:14 Run condition [Regular expression match] enabling perform for step [Provide Configuration files] 02:57:14 provisioning config files... 02:57:14 copy managed file [jenkins-s3-log-ship] to file:/home/jenkins/.aws/credentials 02:57:14 [EnvInject] - Injecting environment variables from a build step. 02:57:14 [EnvInject] - Injecting as environment variables the properties content 02:57:14 SERVER_ID=logs 02:57:14 02:57:14 [EnvInject] - Variables injected successfully. 02:57:14 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins9400168801884147212.sh 02:57:14 ---> create-netrc.sh 02:57:15 WARN: Log server credential not found. 02:57:15 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins12464037549377030996.sh 02:57:15 ---> python-tools-install.sh 02:57:15 Setup pyenv: 02:57:15 system 02:57:15 3.8.13 02:57:15 3.9.13 02:57:15 3.10.13 02:57:15 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 02:57:15 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-zBC1 from file:/tmp/.os_lf_venv 02:57:17 lf-activate-venv(): INFO: Installing: lftools 02:57:27 lf-activate-venv(): INFO: Adding /tmp/venv-zBC1/bin to PATH 02:57:27 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins11558803494173978623.sh 02:57:27 ---> sudo-logs.sh 02:57:27 Archiving 'sudo' log.. 02:57:28 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins14108039962107556167.sh 02:57:28 ---> job-cost.sh 02:57:28 Setup pyenv: 02:57:28 system 02:57:28 3.8.13 02:57:28 3.9.13 02:57:28 3.10.13 02:57:28 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 02:57:28 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-zBC1 from file:/tmp/.os_lf_venv 02:57:30 lf-activate-venv(): INFO: Installing: zipp==1.1.0 python-openstackclient urllib3~=1.26.15 02:57:38 lf-activate-venv(): INFO: Adding /tmp/venv-zBC1/bin to PATH 02:57:38 DEBUG: total: 0.38999999999999996 02:57:38 INFO: Retrieving Stack Cost... 
02:57:39 INFO: Retrieving Pricing Info for: v3-standard-2 02:57:39 INFO: Archiving Costs 02:57:39 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash -l /tmp/jenkins1374851622896999362.sh 02:57:39 ---> logs-deploy.sh 02:57:39 Setup pyenv: 02:57:39 system 02:57:39 3.8.13 02:57:39 3.9.13 02:57:39 3.10.13 02:57:39 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 02:57:39 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-zBC1 from file:/tmp/.os_lf_venv 02:57:41 lf-activate-venv(): INFO: Installing: lftools 02:57:55 lf-activate-venv(): INFO: Adding /tmp/venv-zBC1/bin to PATH 02:57:55 WARNING: Nexus logging server not set 02:57:55 INFO: S3 path logs/releng/vex-yul-odl-jenkins-1/openflowplugin-csit-3node-clustering-only-titanium/313/ 02:57:55 INFO: archiving logs to S3 02:57:56 ---> uname -a: 02:57:56 Linux prd-centos8-robot-2c-8g-43664.novalocal 4.18.0-553.5.1.el8.x86_64 #1 SMP Tue May 21 05:46:01 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux 02:57:56 02:57:56 02:57:56 ---> lscpu: 02:57:56 Architecture: x86_64 02:57:56 CPU op-mode(s): 32-bit, 64-bit 02:57:56 Byte Order: Little Endian 02:57:56 CPU(s): 2 02:57:56 On-line CPU(s) list: 0,1 02:57:56 Thread(s) per core: 1 02:57:56 Core(s) per socket: 1 02:57:56 Socket(s): 2 02:57:56 NUMA node(s): 1 02:57:56 Vendor ID: AuthenticAMD 02:57:56 CPU family: 23 02:57:56 Model: 49 02:57:56 Model name: AMD EPYC-Rome Processor 02:57:56 Stepping: 0 02:57:56 CPU MHz: 2800.000 02:57:56 BogoMIPS: 5600.00 02:57:56 Virtualization: AMD-V 02:57:56 Hypervisor vendor: KVM 02:57:56 Virtualization type: full 02:57:56 L1d cache: 32K 02:57:56 L1i cache: 32K 02:57:56 L2 cache: 512K 02:57:56 L3 cache: 16384K 02:57:56 NUMA node0 CPU(s): 0,1 02:57:56 Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 xsaves clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities 02:57:56 02:57:56 02:57:56 ---> nproc: 02:57:56 2 02:57:56 02:57:56 02:57:56 ---> df -h: 02:57:56 Filesystem Size Used Avail Use% Mounted on 02:57:56 devtmpfs 3.8G 0 3.8G 0% /dev 02:57:56 tmpfs 3.8G 0 3.8G 0% /dev/shm 02:57:56 tmpfs 3.8G 17M 3.8G 1% /run 02:57:56 tmpfs 3.8G 0 3.8G 0% /sys/fs/cgroup 02:57:56 /dev/vda1 40G 8.5G 32G 22% / 02:57:56 tmpfs 770M 0 770M 0% /run/user/1001 02:57:56 02:57:56 02:57:56 ---> free -m: 02:57:56 total used free shared buff/cache available 02:57:56 Mem: 7697 599 4676 19 2422 6799 02:57:56 Swap: 1023 0 1023 02:57:56 02:57:56 02:57:56 ---> ip addr: 02:57:56 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 02:57:56 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 02:57:56 inet 127.0.0.1/8 scope host lo 02:57:56 valid_lft forever preferred_lft forever 02:57:56 inet6 ::1/128 scope host 02:57:56 valid_lft forever preferred_lft forever 02:57:56 2: eth0: mtu 1458 qdisc mq state UP group default qlen 1000 02:57:56 link/ether fa:16:3e:11:25:f3 brd ff:ff:ff:ff:ff:ff 02:57:56 altname enp0s3 02:57:56 altname ens3 02:57:56 inet 10.30.171.238/23 brd 10.30.171.255 scope global dynamic noprefixroute eth0 02:57:56 valid_lft 
83724sec preferred_lft 83724sec 02:57:56 inet6 fe80::f816:3eff:fe11:25f3/64 scope link 02:57:56 valid_lft forever preferred_lft forever 02:57:56 02:57:56 02:57:56 ---> sar -b -r -n DEV: 02:57:56 Linux 4.18.0-553.5.1.el8.x86_64 (centos-stream-8-robot-7d7a37eb-bc14-4dd6-9530-dc22c5eae738.noval) 07/18/2025 _x86_64_ (2 CPU) 02:57:56 02:57:56 02:13:15 LINUX RESTART (2 CPU) 02:57:56 02:57:56 02:14:03 AM tps rtps wtps bread/s bwrtn/s 02:57:56 02:15:01 AM 39.06 8.32 30.73 1470.67 3065.72 02:57:56 02:16:01 AM 97.17 0.22 96.95 11.33 8463.43 02:57:56 02:17:01 AM 39.74 0.68 39.05 51.58 6387.02 02:57:56 02:18:01 AM 24.34 0.38 23.96 64.25 1845.20 02:57:56 02:19:01 AM 59.29 6.97 52.32 1292.98 3338.99 02:57:56 02:20:01 AM 27.46 0.00 27.46 0.00 5997.42 02:57:56 02:21:01 AM 2.30 0.00 2.30 0.00 66.08 02:57:56 02:22:01 AM 0.20 0.00 0.20 0.00 2.80 02:57:56 02:23:01 AM 0.27 0.02 0.25 0.13 6.85 02:57:56 02:24:01 AM 1.98 0.00 1.98 0.00 163.44 02:57:56 02:25:01 AM 0.32 0.00 0.32 0.00 289.95 02:57:56 02:26:01 AM 0.35 0.00 0.35 0.00 279.20 02:57:56 02:27:01 AM 0.17 0.00 0.17 0.00 87.17 02:57:56 02:28:01 AM 0.30 0.00 0.30 0.00 83.44 02:57:56 02:29:01 AM 1.62 0.03 1.58 0.27 369.92 02:57:56 02:30:01 AM 0.30 0.00 0.30 0.00 183.31 02:57:56 02:31:01 AM 0.22 0.00 0.22 0.00 105.77 02:57:56 02:32:01 AM 0.45 0.00 0.45 0.00 215.56 02:57:56 02:33:01 AM 9.01 5.98 3.03 576.74 122.13 02:57:56 02:34:01 AM 0.72 0.00 0.72 0.00 188.07 02:57:56 02:35:01 AM 0.32 0.00 0.32 0.00 71.09 02:57:56 02:36:01 AM 0.15 0.00 0.15 0.00 100.12 02:57:56 02:37:01 AM 0.20 0.00 0.20 0.00 77.07 02:57:56 02:38:01 AM 0.28 0.00 0.28 0.00 165.11 02:57:56 02:39:01 AM 0.32 0.00 0.32 0.00 29.75 02:57:56 02:40:01 AM 0.25 0.00 0.25 0.00 187.02 02:57:56 02:41:01 AM 0.28 0.00 0.28 0.00 42.95 02:57:56 02:42:01 AM 0.37 0.00 0.37 0.00 181.84 02:57:56 02:43:01 AM 0.22 0.00 0.22 0.00 152.24 02:57:56 02:44:01 AM 0.37 0.00 0.37 0.00 132.19 02:57:56 02:45:01 AM 0.30 0.00 0.30 0.00 7.05 02:57:56 02:46:01 AM 0.30 0.00 0.30 0.00 126.11 02:57:56 02:47:01 AM 0.22 0.00 0.22 0.00 112.41 02:57:56 02:48:01 AM 0.17 0.00 0.17 0.00 22.78 02:57:56 02:49:01 AM 0.27 0.00 0.27 0.00 27.81 02:57:56 02:50:01 AM 0.25 0.00 0.25 0.00 26.01 02:57:56 02:51:01 AM 0.28 0.00 0.28 0.00 23.76 02:57:56 02:52:01 AM 0.30 0.00 0.30 0.00 30.06 02:57:56 02:53:01 AM 0.28 0.00 0.28 0.00 116.61 02:57:56 02:54:01 AM 0.33 0.00 0.33 0.00 302.33 02:57:56 02:55:01 AM 0.43 0.00 0.43 0.00 58.15 02:57:56 02:56:01 AM 0.25 0.00 0.25 0.00 52.17 02:57:56 02:57:01 AM 17.64 0.42 17.23 47.72 5073.29 02:57:56 Average: 7.63 0.53 7.10 80.68 890.90 02:57:56 02:57:56 02:14:03 AM kbmemfree kbavail kbmemused %memused kbbuffers kbcached kbcommit %commit kbactive kbinact kbdirty 02:57:56 02:15:01 AM 5632460 7118008 2249964 28.54 2688 1680312 637348 7.14 178420 1795536 22484 02:57:56 02:16:01 AM 5244988 7038604 2637436 33.46 2688 1970692 667668 7.48 187372 2108740 129060 02:57:56 02:17:01 AM 5213228 7058324 2669196 33.86 2688 2021080 647848 7.25 208728 2114504 42716 02:57:56 02:18:01 AM 5267668 7106872 2614756 33.17 2688 2015344 602560 6.75 219076 2050096 8 02:57:56 02:19:01 AM 5001500 7086584 2880924 36.55 2688 2253012 600044 6.72 262740 2246808 169068 02:57:56 02:20:01 AM 4995860 7080652 2886564 36.62 2688 2252984 616388 6.90 262768 2246368 8 02:57:56 02:21:01 AM 4996572 7081372 2885852 36.61 2688 2252984 616388 6.90 262772 2245968 8 02:57:56 02:22:01 AM 4996404 7081208 2886020 36.61 2688 2252988 616388 6.90 262772 2245968 4 02:57:56 02:23:01 AM 4950664 7036620 2931760 37.19 2688 2254156 701952 7.86 262900 2291184 1152 02:57:56 
02:24:01 AM 4940596 7029080 2941828 37.32 2688 2256668 701952 7.86 262900 2301184 688 02:57:56 02:25:01 AM 4913852 7012904 2968572 37.66 2688 2267276 701952 7.86 262900 2326928 2628 02:57:56 02:26:01 AM 4894356 6999988 2988068 37.91 2688 2273840 717652 8.04 262900 2347324 860 02:57:56 02:27:01 AM 4888660 6996304 2993764 37.98 2688 2275844 757224 8.48 262900 2353128 300 02:57:56 02:28:01 AM 4885700 6995652 2996724 38.02 2688 2278184 778184 8.71 262900 2355432 228 02:57:56 02:29:01 AM 4874496 6995092 3007928 38.16 2688 2288860 757176 8.48 263056 2366348 756 02:57:56 02:30:01 AM 4867908 6994856 3014516 38.24 2688 2295140 763128 8.54 263056 2373244 1608 02:57:56 02:31:01 AM 4863640 6995176 3018784 38.30 2688 2299740 763128 8.54 263056 2377744 3076 02:57:56 02:32:01 AM 4859564 6995332 3022860 38.35 2688 2303940 779524 8.73 263140 2382000 956 02:57:56 02:33:01 AM 4842200 6997104 3040224 38.57 2688 2323004 714500 8.00 265772 2396540 1212 02:57:56 02:34:01 AM 4835552 6994876 3046872 38.65 2688 2327460 747516 8.37 265828 2402864 396 02:57:56 02:35:01 AM 4831472 6994028 3050952 38.71 2688 2330648 747516 8.37 265828 2406032 1564 02:57:56 02:36:01 AM 4831008 6995140 3051416 38.71 2688 2332272 747516 8.37 265828 2407608 212 02:57:56 02:37:01 AM 4827512 6994124 3054912 38.76 2688 2334728 747516 8.37 265828 2410084 392 02:57:56 02:38:01 AM 4823480 6995108 3058944 38.81 2688 2339716 747516 8.37 265828 2415212 484 02:57:56 02:39:01 AM 4818072 6993600 3064352 38.88 2688 2343616 747516 8.37 265828 2419056 3644 02:57:56 02:40:01 AM 4816472 6994508 3065952 38.90 2688 2346188 761248 8.52 265828 2421824 644 02:57:56 02:41:01 AM 4812912 6994092 3069512 38.94 2688 2349304 761248 8.52 265828 2424648 2564 02:57:56 02:42:01 AM 4809928 6994460 3072496 38.98 2688 2352636 761248 8.52 265828 2428332 544 02:57:56 02:43:01 AM 4802912 6993928 3079512 39.07 2688 2359084 761248 8.52 265828 2434976 2468 02:57:56 02:44:01 AM 4802360 6994748 3080064 39.08 2688 2360456 761248 8.52 265828 2436072 24 02:57:56 02:45:01 AM 4801680 6994776 3080744 39.08 2688 2361160 761248 8.52 265828 2436832 584 02:57:56 02:46:01 AM 4796344 6994524 3086080 39.15 2688 2366248 761248 8.52 265828 2441796 1968 02:57:56 02:47:01 AM 4795260 6995144 3087164 39.17 2688 2367936 725212 8.12 265828 2443444 344 02:57:56 02:48:01 AM 4794336 6994716 3088088 39.18 2688 2368456 725212 8.12 265828 2444220 252 02:57:56 02:49:01 AM 4793764 6994816 3088660 39.18 2688 2369120 725212 8.12 265828 2444804 192 02:57:56 02:50:01 AM 4792912 6994628 3089512 39.19 2688 2369764 715456 8.01 265828 2445524 168 02:57:56 02:51:01 AM 4792660 6995040 3089764 39.20 2688 2370432 715456 8.01 265828 2445952 208 02:57:56 02:52:01 AM 4792120 6995128 3090304 39.20 2688 2371076 715456 8.01 265828 2446580 80 02:57:56 02:53:01 AM 4787448 6993792 3094976 39.26 2688 2374428 715456 8.01 265828 2450028 4 02:57:56 02:54:01 AM 4775748 6991960 3106676 39.41 2688 2384240 788796 8.83 265976 2462328 800 02:57:56 02:55:01 AM 4774172 6991800 3108252 39.43 2688 2385648 788796 8.83 265984 2463668 688 02:57:56 02:56:01 AM 4858892 7080952 3023532 38.36 2688 2390068 609848 6.83 266316 2379904 3604 02:57:56 02:57:01 AM 4848572 7018584 3033852 38.49 2688 2343648 642296 7.19 575268 2105192 24232 02:57:56 Average: 4896416 7015679 2986008 37.88 2688 2288707 716791 8.03 265819 2347489 9834 02:57:56 02:57:56 02:14:03 AM IFACE rxpck/s txpck/s rxkB/s txkB/s rxcmp/s txcmp/s rxmcst/s %ifutil 02:57:56 02:15:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 02:57:56 02:15:01 AM eth0 151.79 100.38 668.28 37.61 0.00 0.00 0.00 
02:57:56 02:16:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:16:01 AM eth0 81.01 64.95 851.07 9.91 0.00 0.00 0.00 0.00
02:57:56 02:17:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:17:01 AM eth0 42.12 27.87 562.65 3.57 0.00 0.00 0.00 0.00
02:57:56 02:18:01 AM lo 0.07 0.07 0.01 0.01 0.00 0.00 0.00 0.00
02:57:56 02:18:01 AM eth0 15.54 14.76 5.58 4.59 0.00 0.00 0.00 0.00
02:57:56 02:19:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:19:01 AM eth0 44.03 39.04 318.45 14.03 0.00 0.00 0.00 0.00
02:57:56 02:20:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:20:01 AM eth0 440.99 321.81 85.97 72.94 0.00 0.00 0.00 0.00
02:57:56 02:21:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:21:01 AM eth0 3.78 2.00 0.52 0.43 0.00 0.00 0.00 0.00
02:57:56 02:22:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:22:01 AM eth0 4.03 1.72 0.49 0.38 0.00 0.00 0.00 0.00
02:57:56 02:23:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:23:01 AM eth0 49.89 34.26 13.65 3.94 0.00 0.00 0.00 0.00
02:57:56 02:24:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:24:01 AM eth0 8.28 8.68 9.84 1.60 0.00 0.00 0.00 0.00
02:57:56 02:25:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:25:01 AM eth0 11.55 11.81 49.62 1.63 0.00 0.00 0.00 0.00
02:57:56 02:26:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:26:01 AM eth0 13.66 12.75 31.24 1.77 0.00 0.00 0.00 0.00
02:57:56 02:27:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:27:01 AM eth0 24.56 22.73 7.48 3.71 0.00 0.00 0.00 0.00
02:57:56 02:28:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:28:01 AM eth0 7.46 7.80 8.16 1.27 0.00 0.00 0.00 0.00
02:57:56 02:29:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:29:01 AM eth0 13.83 11.96 50.49 1.89 0.00 0.00 0.00 0.00
02:57:56 02:30:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:30:01 AM eth0 10.43 11.10 26.29 1.62 0.00 0.00 0.00 0.00
02:57:56 02:31:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:31:01 AM eth0 83.69 81.49 30.74 6.66 0.00 0.00 0.00 0.00
02:57:56 02:32:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:32:01 AM eth0 78.76 78.67 28.99 6.06 0.00 0.00 0.00 0.00
02:57:56 02:33:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:33:01 AM eth0 11.23 9.48 10.14 1.74 0.00 0.00 0.00 0.00
02:57:56 02:34:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:34:01 AM eth0 75.54 75.72 16.02 14.55 0.00 0.00 0.00 0.00
02:57:56 02:35:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:35:01 AM eth0 22.81 23.66 43.70 1.97 0.00 0.00 0.00 0.00
02:57:56 02:36:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:36:01 AM eth0 44.80 46.68 9.28 3.72 0.00 0.00 0.00 0.00
02:57:56 02:37:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:37:01 AM eth0 24.71 25.01 30.96 2.15 0.00 0.00 0.00 0.00
02:57:56 02:38:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:38:01 AM eth0 43.09 45.32 57.85 3.78 0.00 0.00 0.00 0.00
02:57:56 02:39:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:39:01 AM eth0 20.25 20.58 56.32 1.82 0.00 0.00 0.00 0.00
02:57:56 02:40:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:40:01 AM eth0 62.25 62.78 35.64 4.76 0.00 0.00 0.00 0.00
02:57:56 02:41:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:41:01 AM eth0 29.34 26.49 44.41 2.25 0.00 0.00 0.00 0.00
02:57:56 02:42:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:42:01 AM eth0 41.25 37.69 43.17 3.21 0.00 0.00 0.00 0.00
02:57:56 02:43:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:43:01 AM eth0 58.56 58.84 88.01 7.86 0.00 0.00 0.00 0.00
02:57:56 02:44:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:44:01 AM eth0 39.71 41.58 7.61 3.44 0.00 0.00 0.00 0.00
02:57:56 02:45:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:45:01 AM eth0 40.07 40.95 6.48 3.09 0.00 0.00 0.00 0.00
02:57:56 02:46:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:46:01 AM eth0 39.23 40.63 69.29 3.27 0.00 0.00 0.00 0.00
02:57:56 02:47:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:47:01 AM eth0 22.66 21.35 21.18 2.62 0.00 0.00 0.00 0.00
02:57:56 02:48:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:48:01 AM eth0 2.37 2.10 1.77 0.55 0.00 0.00 0.00 0.00
02:57:56 02:49:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:49:01 AM eth0 0.82 1.08 1.27 0.27 0.00 0.00 0.00 0.00
02:57:56 02:50:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:50:01 AM eth0 0.93 1.03 1.30 0.22 0.00 0.00 0.00 0.00
02:57:56 02:51:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:51:01 AM eth0 0.80 0.97 1.20 0.21 0.00 0.00 0.00 0.00
02:57:56 02:52:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:52:01 AM eth0 0.90 1.17 1.24 0.22 0.00 0.00 0.00 0.00
02:57:56 02:53:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:53:01 AM eth0 20.24 21.01 48.33 1.77 0.00 0.00 0.00 0.00
02:57:56 02:54:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:54:01 AM eth0 94.23 90.55 155.12 7.99 0.00 0.00 0.00 0.00
02:57:56 02:55:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:55:01 AM eth0 43.42 32.06 7.49 2.68 0.00 0.00 0.00 0.00
02:57:56 02:56:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:56:01 AM eth0 41.64 25.56 11.30 3.56 0.00 0.00 0.00 0.00
02:57:56 02:57:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 02:57:01 AM eth0 139.25 92.90 264.21 326.27 0.00 0.00 0.00 0.00
02:57:56 Average: lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
02:57:56 Average: eth0 46.56 39.46 87.52 13.41 0.00 0.00 0.00 0.00
02:57:56
02:57:56
02:57:56 ---> sar -P ALL:
02:57:56 Linux 4.18.0-553.5.1.el8.x86_64 (centos-stream-8-robot-7d7a37eb-bc14-4dd6-9530-dc22c5eae738.noval) 07/18/2025 _x86_64_ (2 CPU)
02:57:56
02:57:56 02:13:15 LINUX RESTART (2 CPU)
02:57:56
02:57:56 02:14:03 AM CPU %user %nice %system %iowait %steal %idle
02:57:56 02:15:01 AM all 14.95 0.00 2.53 19.24 0.08 63.20
02:57:56 02:15:01 AM 0 11.38 0.00 2.32 17.96 0.07 68.27
02:57:56 02:15:01 AM 1 18.52 0.00 2.74 20.53 0.09 58.13
02:57:56 02:16:01 AM all 34.71 0.00 4.98 4.05 0.10 56.17
02:57:56 02:16:01 AM 0 30.26 0.00 4.75 5.11 0.10 59.78
02:57:56 02:16:01 AM 1 39.16 0.00 5.20 2.98 0.10 52.55
02:57:56 02:17:01 AM all 11.67 0.00 2.04 10.56 0.06 75.67
02:57:56 02:17:01 AM 0 5.62 0.00 1.69 10.51 0.03 82.14
02:57:56 02:17:01 AM 1 17.73 0.00 2.40 10.61 0.08 69.18
02:57:56 02:18:01 AM all 10.81 0.00 1.57 0.19 0.05 87.38
02:57:56 02:18:01 AM 0 9.90 0.00 1.34 0.20 0.05 88.51
02:57:56 02:18:01 AM 1 11.72 0.00 1.80 0.18 0.05 86.24
02:57:56 02:19:01 AM all 32.98 0.00 5.67 1.39 0.12 59.84
02:57:56 02:19:01 AM 0 18.40 0.00 4.97 1.27 0.12 75.25
02:57:56 02:19:01 AM 1 47.60 0.00 6.37 1.52 0.12 44.39
02:57:56 02:20:01 AM all 1.27 0.00 1.11 0.64 0.03 96.95
02:57:56 02:20:01 AM 0 1.11 0.00 1.11 0.37 0.02 97.40
02:57:56 02:20:01 AM 1 1.44 0.00 1.12 0.91 0.03 96.50
02:57:56 02:21:01 AM all 0.38 0.00 0.11 0.02 0.03 99.47
02:57:56 02:21:01 AM 0 0.17 0.00 0.15 0.00 0.03 99.65
02:57:56 02:21:01 AM 1 0.58 0.00 0.07 0.03 0.02 99.30
02:57:56 02:22:01 AM all 0.34 0.00 0.11 0.00 0.03 99.52
02:57:56 02:22:01 AM 0 0.38 0.00 0.13 0.00 0.02 99.47
02:57:56 02:22:01 AM 1 0.30 0.00 0.08 0.00 0.03 99.58
02:57:56 02:23:01 AM all 4.58 0.00 0.47 0.00 0.05 94.90
02:57:56 02:23:01 AM 0 4.97 0.00 0.50 0.00 0.05 94.48
02:57:56 02:23:01 AM 1 4.19 0.00 0.44 0.00 0.05 95.32
02:57:56 02:24:01 AM all 4.49 0.00 0.38 0.01 0.06 95.07
02:57:56 02:24:01 AM 0 2.86 0.00 0.27 0.00 0.05 96.82
02:57:56 02:24:01 AM 1 6.12 0.00 0.48 0.02 0.07 93.31
02:57:56 02:25:01 AM all 13.47 0.00 0.49 0.01 0.07 85.97
02:57:56 02:25:01 AM 0 7.48 0.00 0.32 0.00 0.05 92.15
02:57:56 02:25:01 AM 1 19.44 0.00 0.67 0.02 0.08 79.79
02:57:56
02:57:56 02:25:01 AM CPU %user %nice %system %iowait %steal %idle
02:57:56 02:26:01 AM all 9.52 0.00 0.62 0.02 0.07 89.78
02:57:56 02:26:01 AM 0 6.29 0.00 0.54 0.03 0.07 93.07
02:57:56 02:26:01 AM 1 12.75 0.00 0.70 0.00 0.07 86.48
02:57:56 02:27:01 AM all 5.99 0.00 0.41 0.00 0.05 93.55
02:57:56 02:27:01 AM 0 4.72 0.00 0.38 0.00 0.05 94.85
02:57:56 02:27:01 AM 1 7.26 0.00 0.44 0.00 0.05 92.26
02:57:56 02:28:01 AM all 4.03 0.00 0.36 0.00 0.05 95.56
02:57:56 02:28:01 AM 0 2.21 0.00 0.33 0.00 0.05 97.41
02:57:56 02:28:01 AM 1 5.85 0.00 0.38 0.00 0.05 93.72
02:57:56 02:29:01 AM all 13.88 0.00 0.69 0.02 0.08 85.34
02:57:56 02:29:01 AM 0 5.45 0.00 0.64 0.03 0.07 93.81
02:57:56 02:29:01 AM 1 22.30 0.00 0.75 0.00 0.08 76.87
02:57:56 02:30:01 AM all 9.93 0.00 0.63 0.01 0.07 89.37
02:57:56 02:30:01 AM 0 11.96 0.00 0.72 0.02 0.07 87.24
02:57:56 02:30:01 AM 1 7.91 0.00 0.53 0.00 0.07 91.49
02:57:56 02:31:01 AM all 7.59 0.00 1.12 0.00 0.08 91.21
02:57:56 02:31:01 AM 0 6.68 0.00 1.11 0.00 0.08 92.12
02:57:56 02:31:01 AM 1 8.49 0.00 1.14 0.00 0.07 90.30
02:57:56 02:32:01 AM all 7.53 0.00 1.37 0.02 0.06 91.03
02:57:56 02:32:01 AM 0 3.00 0.00 0.55 0.03 0.05 96.36
02:57:56 02:32:01 AM 1 12.06 0.00 2.18 0.00 0.07 85.70
02:57:56 02:33:01 AM all 8.34 0.26 1.00 0.13 0.06 90.21
02:57:56 02:33:01 AM 0 2.66 0.00 0.30 0.02 0.05 96.97
02:57:56 02:33:01 AM 1 14.28 0.54 1.73 0.24 0.07 83.14
02:57:56 02:34:01 AM all 8.38 0.00 0.67 0.00 0.07 90.88
02:57:56 02:34:01 AM 0 8.84 0.00 0.49 0.00 0.05 90.62
02:57:56 02:34:01 AM 1 7.92 0.00 0.86 0.00 0.08 91.14
02:57:56 02:35:01 AM all 1.82 0.00 0.20 0.00 0.04 97.93
02:57:56 02:35:01 AM 0 1.74 0.00 0.18 0.00 0.03 98.04
02:57:56 02:35:01 AM 1 1.91 0.00 0.22 0.00 0.05 97.82
02:57:56 02:36:01 AM all 4.50 0.00 0.54 0.00 0.06 94.90
02:57:56 02:36:01 AM 0 2.81 0.00 0.30 0.00 0.07 96.82
02:57:56 02:36:01 AM 1 6.18 0.00 0.79 0.00 0.05 92.98
02:57:56
02:57:56 02:36:01 AM CPU %user %nice %system %iowait %steal %idle
02:57:56 02:37:01 AM all 1.87 0.00 0.20 0.00 0.06 97.88
02:57:56 02:37:01 AM 0 2.26 0.00 0.20 0.00 0.07 97.47
02:57:56 02:37:01 AM 1 1.47 0.00 0.20 0.00 0.05 98.28
02:57:56 02:38:01 AM all 5.62 0.00 0.72 0.01 0.06 93.59
02:57:56 02:38:01 AM 0 5.63 0.00 0.55 0.00 0.05 93.77
02:57:56 02:38:01 AM 1 5.61 0.00 0.89 0.02 0.07 93.42
02:57:56 02:39:01 AM all 1.65 0.00 0.20 0.00 0.04 98.11
02:57:56 02:39:01 AM 0 1.89 0.00 0.18 0.00 0.05 97.87
02:57:56 02:39:01 AM 1 1.40 0.00 0.22 0.00 0.03 98.34
02:57:56 02:40:01 AM all 2.83 0.00 0.31 0.01 0.05 96.80
02:57:56 02:40:01 AM 0 1.93 0.00 0.34 0.00 0.03 97.70
02:57:56 02:40:01 AM 1 3.74 0.00 0.29 0.02 0.07 95.89
02:57:56 02:41:01 AM all 1.71 0.00 0.22 0.00 0.05 98.03
02:57:56 02:41:01 AM 0 1.52 0.00 0.22 0.00 0.05 98.21
02:57:56 02:41:01 AM 1 1.89 0.00 0.22 0.00 0.05 97.84
02:57:56 02:42:01 AM all 2.71 0.00 0.28 0.01 0.04 96.96
02:57:56 02:42:01 AM 0 3.52 0.00 0.22 0.00 0.05 96.21
02:57:56 02:42:01 AM 1 1.91 0.00 0.33 0.02 0.03 97.71
02:57:56 02:43:01 AM all 4.93 0.00 0.39 0.01 0.05 94.63
02:57:56 02:43:01 AM 0 1.64 0.00 0.45 0.02 0.05 97.84
02:57:56 02:43:01 AM 1 8.21 0.00 0.32 0.00 0.05 91.42
02:57:56 02:44:01 AM all 4.25 0.00 0.60 0.00 0.05 95.10
02:57:56 02:44:01 AM 0 3.23 0.00 0.55 0.00 0.03 96.18
02:57:56 02:44:01 AM 1 5.26 0.00 0.65 0.00 0.07 94.02
02:57:56 02:45:01 AM all 2.93 0.00 0.44 0.01 0.06 96.56
02:57:56 02:45:01 AM 0 2.58 0.00 0.49 0.02 0.05 96.87
02:57:56 02:45:01 AM 1 3.28 0.00 0.40 0.00 0.07 96.25
02:57:56 02:46:01 AM all 3.16 0.00 0.47 0.00 0.05 96.32
02:57:56 02:46:01 AM 0 4.42 0.00 0.62 0.00 0.05 94.91
02:57:56 02:46:01 AM 1 1.91 0.00 0.32 0.00 0.05 97.72
02:57:56 02:47:01 AM all 5.93 0.00 0.48 0.00 0.05 93.53
02:57:56 02:47:01 AM 0 2.36 0.00 0.25 0.00 0.03 97.36
02:57:56 02:47:01 AM 1 9.58 0.00 0.72 0.00 0.07 89.63
02:57:56
02:57:56 02:47:01 AM CPU %user %nice %system %iowait %steal %idle
02:57:56 02:48:01 AM all 2.93 0.00 0.25 0.00 0.05 96.76
02:57:56 02:48:01 AM 0 0.87 0.00 0.17 0.00 0.05 98.91
02:57:56 02:48:01 AM 1 5.05 0.00 0.34 0.00 0.05 94.56
02:57:56 02:49:01 AM all 1.20 0.00 0.13 0.00 0.04 98.63
02:57:56 02:49:01 AM 0 1.41 0.00 0.15 0.00 0.05 98.39
02:57:56 02:49:01 AM 1 0.99 0.00 0.12 0.00 0.03 98.86
02:57:56 02:50:01 AM all 1.24 0.00 0.13 0.00 0.05 98.57
02:57:56 02:50:01 AM 0 1.09 0.00 0.13 0.00 0.05 98.73
02:57:56 02:50:01 AM 1 1.40 0.00 0.13 0.00 0.05 98.41
02:57:56 02:51:01 AM all 1.18 0.00 0.13 0.00 0.04 98.65
02:57:56 02:51:01 AM 0 0.82 0.00 0.13 0.00 0.03 99.01
02:57:56 02:51:01 AM 1 1.54 0.00 0.12 0.00 0.05 98.29
02:57:56 02:52:01 AM all 1.15 0.00 0.14 0.00 0.03 98.67
02:57:56 02:52:01 AM 0 1.76 0.00 0.15 0.00 0.03 98.06
02:57:56 02:52:01 AM 1 0.55 0.00 0.13 0.00 0.03 99.28
02:57:56 02:53:01 AM all 1.49 0.00 0.22 0.00 0.04 98.25
02:57:56 02:53:01 AM 0 1.07 0.00 0.23 0.00 0.05 98.64
02:57:56 02:53:01 AM 1 1.91 0.00 0.20 0.00 0.03 97.86
02:57:56 02:54:01 AM all 6.24 0.00 0.71 0.02 0.06 92.98
02:57:56 02:54:01 AM 0 8.87 0.00 0.93 0.00 0.05 90.16
02:57:56 02:54:01 AM 1 3.62 0.00 0.49 0.03 0.07 95.79
02:57:56 02:55:01 AM all 4.21 0.00 0.83 0.00 0.07 94.89
02:57:56 02:55:01 AM 0 4.73 0.00 0.70 0.00 0.07 94.50
02:57:56 02:55:01 AM 1 3.70 0.00 0.95 0.00 0.07 95.28
02:57:56 02:56:01 AM all 11.68 0.00 0.84 0.00 0.06 87.43
02:57:56 02:56:01 AM 0 2.95 0.00 0.60 0.00 0.05 96.40
02:57:56 02:56:01 AM 1 20.38 0.00 1.07 0.00 0.07 78.48
02:57:56 02:57:01 AM all 23.28 0.00 3.28 0.25 0.07 73.12
02:57:56 02:57:01 AM 0 19.69 0.00 3.23 0.18 0.07 76.83
02:57:56 02:57:01 AM 1 26.86 0.00 3.34 0.32 0.07 69.41
02:57:56 Average: all 7.05 0.01 0.88 0.84 0.06 91.16
02:57:56 Average: 0 5.19 0.00 0.78 0.82 0.05 93.16
02:57:56 Average: 1 8.92 0.01 0.99 0.86 0.06 89.16
02:57:56
02:57:56
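The summary lines above (the "Average:" rows of the sar -P ALL report) are the values usually checked when comparing CSIT runs. A minimal sketch of how they could be pulled out of a saved console log follows; it assumes the log has been downloaded locally (the file name console.log and the function name cpu_averages are illustrative, not part of the job itself), and it only targets the six-column CPU table, not the sar -b, memory, or network sections.

# Minimal sketch: extract the sar -P ALL "Average:" rows from a saved
# Jenkins console log. Assumes lines may carry a "HH:MM:SS" console
# timestamp prefix; "console.log" is an illustrative file name.
import re

AVG = re.compile(
    r"^(?:\d{2}:\d{2}:\d{2}\s+)?Average:\s+(all|\d+)\s+"   # optional timestamp, then CPU id
    r"([\d.]+)\s+([\d.]+)\s+([\d.]+)\s+([\d.]+)\s+([\d.]+)\s+([\d.]+)$"
)

def cpu_averages(path="console.log"):
    """Return {cpu: metrics} from the six-column sar -P ALL Average rows."""
    results = {}
    with open(path) as fh:
        for line in fh:
            match = AVG.match(line.strip())
            if match:
                cpu, user, nice, system, iowait, steal, idle = match.groups()
                results[cpu] = {
                    "user": float(user), "nice": float(nice),
                    "system": float(system), "iowait": float(iowait),
                    "steal": float(steal), "idle": float(idle),
                }
    return results

if __name__ == "__main__":
    for cpu, stats in cpu_averages().items():
        print(cpu, stats)

The anchored final column ($) is what keeps the pattern from matching the wider memory and network Average rows, so no extra section tracking is needed for a quick check like this.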