04:50:56 Started by upstream project "integration-distribution-test-titanium" build number 417 04:50:56 originally caused by: 04:50:56 Started by upstream project "autorelease-release-titanium-mvn39-openjdk21" build number 384 04:50:56 originally caused by: 04:50:56 Started by timer 04:50:56 Running as SYSTEM 04:50:56 [EnvInject] - Loading node environment variables. 04:50:56 Building remotely on prd-centos8-robot-2c-8g-52087 (centos8-robot-2c-8g) in workspace /w/workspace/openflowplugin-csit-3node-clustering-only-titanium 04:50:57 [ssh-agent] Looking for ssh-agent implementation... 04:50:57 [ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine) 04:50:57 $ ssh-agent 04:50:57 SSH_AUTH_SOCK=/tmp/ssh-nLagsWFoKZgg/agent.5292 04:50:57 SSH_AGENT_PID=5293 04:50:57 [ssh-agent] Started. 04:50:57 Running ssh-add (command line suppressed) 04:50:57 Identity added: /w/workspace/openflowplugin-csit-3node-clustering-only-titanium@tmp/private_key_10962718145675783846.key (/w/workspace/openflowplugin-csit-3node-clustering-only-titanium@tmp/private_key_10962718145675783846.key) 04:50:57 [ssh-agent] Using credentials jenkins (Release Engineering Jenkins Key) 04:50:57 The recommended git tool is: NONE 04:50:59 using credential opendaylight-jenkins-ssh 04:50:59 Wiping out workspace first. 04:50:59 Cloning the remote Git repository 04:50:59 Cloning repository git://devvexx.opendaylight.org/mirror/integration/test 04:50:59 > git init /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test # timeout=10 04:51:00 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/integration/test 04:51:00 > git --version # timeout=10 04:51:00 > git --version # 'git version 2.43.0' 04:51:00 using GIT_SSH to set credentials Release Engineering Jenkins Key 04:51:00 [INFO] Currently running in a labeled security context 04:51:00 [INFO] Currently SELinux is 'enforcing' on the host 04:51:00 > /usr/bin/chcon --type=ssh_home_t /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test@tmp/jenkins-gitclient-ssh15562521934281933246.key 04:51:00 Verifying host key using known hosts file 04:51:00 You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification. 
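Note: the clone itself goes over the git:// protocol, so the "known_hosts file does not exist" warning above comes from Jenkins' SSH host key verification strategy rather than from this fetch. A minimal sketch of pre-seeding known_hosts on a build agent is shown below; GIT_SSH_HOST is a hypothetical placeholder, not a host taken from this job, and whether key-scanning is permitted from the agent is an assumption.

  # Sketch only: pre-seed known_hosts so the 'Known hosts file' strategy has something to verify against.
  GIT_SSH_HOST=example-gerrit-host.example.org   # hypothetical host, replace as appropriate
  mkdir -p ~/.ssh && chmod 700 ~/.ssh
  ssh-keyscan -t ed25519,ecdsa,rsa "${GIT_SSH_HOST}" >> ~/.ssh/known_hosts
  chmod 600 ~/.ssh/known_hosts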
04:51:00 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/integration/test +refs/heads/*:refs/remotes/origin/* # timeout=10 04:51:06 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/integration/test # timeout=10 04:51:06 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10 04:51:06 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/integration/test # timeout=10 04:51:06 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/integration/test 04:51:06 using GIT_SSH to set credentials Release Engineering Jenkins Key 04:51:06 [INFO] Currently running in a labeled security context 04:51:06 [INFO] Currently SELinux is 'enforcing' on the host 04:51:06 > /usr/bin/chcon --type=ssh_home_t /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test@tmp/jenkins-gitclient-ssh12095199926812569517.key 04:51:06 Verifying host key using known hosts file 04:51:06 You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification. 04:51:06 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/integration/test master # timeout=10 04:51:07 > git rev-parse FETCH_HEAD^{commit} # timeout=10 04:51:07 Checking out Revision 9e7a2f1bec76f24ac7173c3a00f09ed1af208887 (origin/master) 04:51:07 > git config core.sparsecheckout # timeout=10 04:51:07 > git checkout -f 9e7a2f1bec76f24ac7173c3a00f09ed1af208887 # timeout=10 04:51:07 Commit message: "Add pekko templates" 04:51:07 > git rev-parse FETCH_HEAD^{commit} # timeout=10 04:51:07 > git rev-list --no-walk e12906d887353b3b6c7ca6e293959c75cf9a8409 # timeout=10 04:51:08 No emails were triggered. 04:51:08 provisioning config files... 04:51:08 copy managed file [npmrc] to file:/home/jenkins/.npmrc 04:51:08 copy managed file [pipconf] to file:/home/jenkins/.config/pip/pip.conf 04:51:08 copy managed file [clouds-yaml] to file:/home/jenkins/.config/openstack/clouds.yaml 04:51:08 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins5760007592330502153.sh 04:51:08 ---> python-tools-install.sh 04:51:08 Setup pyenv: 04:51:08 system 04:51:08 * 3.8.13 (set by /opt/pyenv/version) 04:51:08 * 3.9.13 (set by /opt/pyenv/version) 04:51:08 * 3.10.13 (set by /opt/pyenv/version) 04:51:08 * 3.11.7 (set by /opt/pyenv/version) 04:51:13 lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-l3Cx 04:51:13 lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv 04:51:17 lf-activate-venv(): INFO: Installing: lftools 04:51:41 lf-activate-venv(): INFO: Adding /tmp/venv-l3Cx/bin to PATH 04:51:41 Generating Requirements File 04:52:02 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 04:52:02 httplib2 0.31.0 requires pyparsing<4,>=3.0.4, but you have pyparsing 2.4.7 which is incompatible. 
04:52:02 Python 3.11.7 04:52:03 pip 25.2 from /tmp/venv-l3Cx/lib/python3.11/site-packages/pip (python 3.11) 04:52:03 appdirs==1.4.4 04:52:03 argcomplete==3.6.2 04:52:03 aspy.yaml==1.3.0 04:52:03 attrs==25.3.0 04:52:03 autopage==0.5.2 04:52:03 beautifulsoup4==4.13.5 04:52:03 boto3==1.40.30 04:52:03 botocore==1.40.30 04:52:03 bs4==0.0.2 04:52:03 cachetools==5.5.2 04:52:03 certifi==2025.8.3 04:52:03 cffi==2.0.0 04:52:03 cfgv==3.4.0 04:52:03 chardet==5.2.0 04:52:03 charset-normalizer==3.4.3 04:52:03 click==8.2.1 04:52:03 cliff==4.11.0 04:52:03 cmd2==2.7.0 04:52:03 cryptography==3.3.2 04:52:03 debtcollector==3.0.0 04:52:03 decorator==5.2.1 04:52:03 defusedxml==0.7.1 04:52:03 Deprecated==1.2.18 04:52:03 distlib==0.4.0 04:52:03 dnspython==2.8.0 04:52:03 docker==7.1.0 04:52:03 dogpile.cache==1.4.1 04:52:03 durationpy==0.10 04:52:03 email-validator==2.3.0 04:52:03 filelock==3.19.1 04:52:03 future==1.0.0 04:52:03 gitdb==4.0.12 04:52:03 GitPython==3.1.45 04:52:03 google-auth==2.40.3 04:52:03 httplib2==0.31.0 04:52:03 identify==2.6.14 04:52:03 idna==3.10 04:52:03 importlib-resources==1.5.0 04:52:03 iso8601==2.1.0 04:52:03 Jinja2==3.1.6 04:52:03 jmespath==1.0.1 04:52:03 jsonpatch==1.33 04:52:03 jsonpointer==3.0.0 04:52:03 jsonschema==4.25.1 04:52:03 jsonschema-specifications==2025.9.1 04:52:03 keystoneauth1==5.12.0 04:52:03 kubernetes==33.1.0 04:52:03 lftools==0.37.13 04:52:03 lxml==6.0.1 04:52:03 markdown-it-py==4.0.0 04:52:03 MarkupSafe==3.0.2 04:52:03 mdurl==0.1.2 04:52:03 msgpack==1.1.1 04:52:03 multi_key_dict==2.0.3 04:52:03 munch==4.0.0 04:52:03 netaddr==1.3.0 04:52:03 niet==1.4.2 04:52:03 nodeenv==1.9.1 04:52:03 oauth2client==4.1.3 04:52:03 oauthlib==3.3.1 04:52:03 openstacksdk==4.7.1 04:52:03 os-service-types==1.8.0 04:52:03 osc-lib==4.2.0 04:52:03 oslo.config==10.0.0 04:52:03 oslo.context==6.1.0 04:52:03 oslo.i18n==6.6.0 04:52:03 oslo.log==7.2.1 04:52:03 oslo.serialization==5.8.0 04:52:03 oslo.utils==9.1.0 04:52:03 packaging==25.0 04:52:03 pbr==7.0.1 04:52:03 platformdirs==4.4.0 04:52:03 prettytable==3.16.0 04:52:03 psutil==7.0.0 04:52:03 pyasn1==0.6.1 04:52:03 pyasn1_modules==0.4.2 04:52:03 pycparser==2.23 04:52:03 pygerrit2==2.0.15 04:52:03 PyGithub==2.8.1 04:52:03 Pygments==2.19.2 04:52:03 PyJWT==2.10.1 04:52:03 PyNaCl==1.6.0 04:52:03 pyparsing==2.4.7 04:52:03 pyperclip==1.9.0 04:52:03 pyrsistent==0.20.0 04:52:03 python-cinderclient==9.8.0 04:52:03 python-dateutil==2.9.0.post0 04:52:03 python-heatclient==4.3.0 04:52:03 python-jenkins==1.8.3 04:52:03 python-keystoneclient==5.7.0 04:52:03 python-magnumclient==4.9.0 04:52:03 python-openstackclient==8.2.0 04:52:03 python-swiftclient==4.8.0 04:52:03 PyYAML==6.0.2 04:52:03 referencing==0.36.2 04:52:03 requests==2.32.5 04:52:03 requests-oauthlib==2.0.0 04:52:03 requestsexceptions==1.4.0 04:52:03 rfc3986==2.0.0 04:52:03 rich==14.1.0 04:52:03 rich-argparse==1.7.1 04:52:03 rpds-py==0.27.1 04:52:03 rsa==4.9.1 04:52:03 ruamel.yaml==0.18.15 04:52:03 ruamel.yaml.clib==0.2.12 04:52:03 s3transfer==0.14.0 04:52:03 simplejson==3.20.1 04:52:03 six==1.17.0 04:52:03 smmap==5.0.2 04:52:03 soupsieve==2.8 04:52:03 stevedore==5.5.0 04:52:03 tabulate==0.9.0 04:52:03 toml==0.10.2 04:52:03 tomlkit==0.13.3 04:52:03 tqdm==4.67.1 04:52:03 typing_extensions==4.15.0 04:52:03 tzdata==2025.2 04:52:03 urllib3==1.26.20 04:52:03 virtualenv==20.34.0 04:52:03 wcwidth==0.2.13 04:52:03 websocket-client==1.8.0 04:52:03 wrapt==1.17.3 04:52:03 xdg==6.0.0 04:52:03 xmltodict==1.0.0 04:52:03 yq==3.4.3 04:52:03 [EnvInject] - Injecting environment variables from a build step. 
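Note: the resolver warning printed at 04:52:02 (httplib2 0.31.0 requiring pyparsing<4,>=3.0.4 against the installed pyparsing 2.4.7) does not fail the build; it is only reported while the requirements file is generated. A minimal sketch of listing, and optionally resolving, the same conflict by hand is below; it assumes the venv path /tmp/venv-l3Cx printed above still exists on the agent, and the explicit upgrade is an illustration rather than something this job performs.

  # Sketch only: list all conflicting pins in the job's venv.
  /tmp/venv-l3Cx/bin/python -m pip check
  # Possible manual remedy (assumption, not part of the job): satisfy httplib2's pyparsing pin.
  /tmp/venv-l3Cx/bin/python -m pip install --upgrade 'pyparsing>=3.0.4,<4'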
04:52:03 [EnvInject] - Injecting as environment variables the properties content 04:52:03 OS_STACK_TEMPLATE=csit-2-instance-type.yaml 04:52:03 OS_CLOUD=vex 04:52:03 OS_STACK_NAME=releng-openflowplugin-csit-3node-clustering-only-titanium-406 04:52:03 OS_STACK_TEMPLATE_DIR=openstack-hot 04:52:03 04:52:03 [EnvInject] - Variables injected successfully. 04:52:03 provisioning config files... 04:52:03 copy managed file [clouds-yaml] to file:/home/jenkins/.config/openstack/clouds.yaml 04:52:03 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins8035193616318396150.sh 04:52:03 ---> Create parameters file for OpenStack HOT 04:52:03 OpenStack Heat parameters generated 04:52:03 ----------------------------------- 04:52:03 parameters: 04:52:03 vm_0_count: '3' 04:52:03 vm_0_flavor: 'v3-standard-4' 04:52:03 vm_0_image: 'ZZCI - Ubuntu 22.04 - builder - x86_64 - 20250201-010426.857' 04:52:03 vm_1_count: '1' 04:52:03 vm_1_flavor: 'v3-standard-2' 04:52:03 vm_1_image: 'ZZCI - Ubuntu 22.04 - mininet-ovs-217 - x86_64 - 20250201-060151.911' 04:52:03 04:52:03 job_name: '62057-406' 04:52:03 silo: 'releng' 04:52:03 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash -l /tmp/jenkins910167425470045593.sh 04:52:03 ---> Create HEAT stack 04:52:03 + source /home/jenkins/lf-env.sh 04:52:03 + lf-activate-venv --python python3 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient yq 04:52:03 ++ mktemp -d /tmp/venv-XXXX 04:52:03 + lf_venv=/tmp/venv-AIRk 04:52:03 + local venv_file=/tmp/.os_lf_venv 04:52:03 + local python=python3 04:52:03 + local options 04:52:03 + local set_path=true 04:52:03 + local install_args= 04:52:03 ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --python python3 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient yq 04:52:03 + options=' --python '\''python3'\'' -- '\''lftools[openstack]'\'' '\''kubernetes'\'' '\''niet'\'' '\''python-heatclient'\'' '\''python-openstackclient'\'' '\''python-magnumclient'\'' '\''yq'\''' 04:52:03 + eval set -- ' --python '\''python3'\'' -- '\''lftools[openstack]'\'' '\''kubernetes'\'' '\''niet'\'' '\''python-heatclient'\'' '\''python-openstackclient'\'' '\''python-magnumclient'\'' '\''yq'\''' 04:52:03 ++ set -- --python python3 -- 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient yq 04:52:03 + true 04:52:03 + case $1 in 04:52:03 + python=python3 04:52:03 + shift 2 04:52:03 + true 04:52:03 + case $1 in 04:52:03 + shift 04:52:03 + break 04:52:03 + case $python in 04:52:03 + local pkg_list= 04:52:03 + [[ -d /opt/pyenv ]] 04:52:03 + echo 'Setup pyenv:' 04:52:03 Setup pyenv: 04:52:03 + export PYENV_ROOT=/opt/pyenv 04:52:03 + PYENV_ROOT=/opt/pyenv 04:52:03 + export PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 04:52:03 + PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 04:52:03 + pyenv versions 04:52:04 system 04:52:04 3.8.13 04:52:04 3.9.13 04:52:04 3.10.13 04:52:04 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 04:52:04 + command -v pyenv 04:52:04 ++ pyenv init - --no-rehash 04:52:04 + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH); 04:52:04 for i in ${!paths[@]}; do 04:52:04 if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" 
]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\''; 04:52:04 fi; done; 04:52:04 echo "${paths[*]}"'\'')" 04:52:04 export PATH="/opt/pyenv/shims:${PATH}" 04:52:04 export PYENV_SHELL=bash 04:52:04 source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\'' 04:52:04 pyenv() { 04:52:04 local command 04:52:04 command="${1:-}" 04:52:04 if [ "$#" -gt 0 ]; then 04:52:04 shift 04:52:04 fi 04:52:04 04:52:04 case "$command" in 04:52:04 rehash|shell) 04:52:04 eval "$(pyenv "sh-$command" "$@")" 04:52:04 ;; 04:52:04 *) 04:52:04 command pyenv "$command" "$@" 04:52:04 ;; 04:52:04 esac 04:52:04 }' 04:52:04 +++ bash --norc -ec 'IFS=:; paths=($PATH); 04:52:04 for i in ${!paths[@]}; do 04:52:04 if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\''; 04:52:04 fi; done; 04:52:04 echo "${paths[*]}"' 04:52:04 ++ PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 04:52:04 ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 04:52:04 ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 04:52:04 ++ export PYENV_SHELL=bash 04:52:04 ++ PYENV_SHELL=bash 04:52:04 ++ source /opt/pyenv/libexec/../completions/pyenv.bash 04:52:04 +++ complete -F _pyenv pyenv 04:52:04 ++ lf-pyver python3 04:52:04 ++ local py_version_xy=python3 04:52:04 ++ local py_version_xyz= 04:52:04 ++ awk '{ print $1 }' 04:52:04 ++ pyenv versions 04:52:04 ++ local command 04:52:04 ++ command=versions 04:52:04 ++ '[' 1 -gt 0 ']' 04:52:04 ++ shift 04:52:04 ++ case "$command" in 04:52:04 ++ command pyenv versions 04:52:04 ++ pyenv versions 04:52:04 ++ sed 's/^[ *]* //' 04:52:04 ++ grep -E '^[0-9.]*[0-9]$' 04:52:04 ++ [[ ! 
-s /tmp/.pyenv_versions ]] 04:52:04 +++ grep '^3' /tmp/.pyenv_versions 04:52:04 +++ sort -V 04:52:04 +++ tail -n 1 04:52:04 ++ py_version_xyz=3.11.7 04:52:04 ++ [[ -z 3.11.7 ]] 04:52:04 ++ echo 3.11.7 04:52:04 ++ return 0 04:52:04 + pyenv local 3.11.7 04:52:04 + local command 04:52:04 + command=local 04:52:04 + '[' 2 -gt 0 ']' 04:52:04 + shift 04:52:04 + case "$command" in 04:52:04 + command pyenv local 3.11.7 04:52:04 + pyenv local 3.11.7 04:52:04 + for arg in "$@" 04:52:04 + case $arg in 04:52:04 + pkg_list+='lftools[openstack] ' 04:52:04 + for arg in "$@" 04:52:04 + case $arg in 04:52:04 + pkg_list+='kubernetes ' 04:52:04 + for arg in "$@" 04:52:04 + case $arg in 04:52:04 + pkg_list+='niet ' 04:52:04 + for arg in "$@" 04:52:04 + case $arg in 04:52:04 + pkg_list+='python-heatclient ' 04:52:04 + for arg in "$@" 04:52:04 + case $arg in 04:52:04 + pkg_list+='python-openstackclient ' 04:52:04 + for arg in "$@" 04:52:04 + case $arg in 04:52:04 + pkg_list+='python-magnumclient ' 04:52:04 + for arg in "$@" 04:52:04 + case $arg in 04:52:04 + pkg_list+='yq ' 04:52:04 + [[ -f /tmp/.os_lf_venv ]] 04:52:04 ++ cat /tmp/.os_lf_venv 04:52:04 + lf_venv=/tmp/venv-l3Cx 04:52:04 + echo 'lf-activate-venv(): INFO: Reuse venv:/tmp/venv-l3Cx from' file:/tmp/.os_lf_venv 04:52:04 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-l3Cx from file:/tmp/.os_lf_venv 04:52:04 + /tmp/venv-l3Cx/bin/python3 -m pip install --upgrade --quiet pip 'setuptools<66' virtualenv 04:52:06 + [[ -z lftools[openstack] kubernetes niet python-heatclient python-openstackclient python-magnumclient yq ]] 04:52:06 + echo 'lf-activate-venv(): INFO: Installing: lftools[openstack] kubernetes niet python-heatclient python-openstackclient python-magnumclient yq ' 04:52:06 lf-activate-venv(): INFO: Installing: lftools[openstack] kubernetes niet python-heatclient python-openstackclient python-magnumclient yq 04:52:06 + /tmp/venv-l3Cx/bin/python3 -m pip install --upgrade --quiet --upgrade-strategy eager 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient yq 04:52:24 + type python3 04:52:24 + true 04:52:24 + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-l3Cx/bin to PATH' 04:52:24 lf-activate-venv(): INFO: Adding /tmp/venv-l3Cx/bin to PATH 04:52:24 + PATH=/tmp/venv-l3Cx/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 04:52:24 + return 0 04:52:24 + openstack --os-cloud vex limits show --absolute 04:52:26 +--------------------------+---------+ 04:52:26 | Name | Value | 04:52:26 +--------------------------+---------+ 04:52:26 | maxTotalInstances | -1 | 04:52:26 | maxTotalCores | 450 | 04:52:26 | maxTotalRAMSize | 1000000 | 04:52:26 | maxServerMeta | 128 | 04:52:26 | maxImageMeta | 128 | 04:52:26 | maxPersonality | 5 | 04:52:26 | maxPersonalitySize | 10240 | 04:52:26 | maxTotalKeypairs | 100 | 04:52:26 | maxServerGroups | 10 | 04:52:26 | maxServerGroupMembers | 10 | 04:52:26 | maxTotalFloatingIps | -1 | 04:52:26 | maxSecurityGroups | -1 | 04:52:26 | maxSecurityGroupRules | -1 | 04:52:26 | totalRAMUsed | 729088 | 04:52:26 | totalCoresUsed | 178 | 04:52:26 | totalInstancesUsed | 60 | 04:52:26 | totalFloatingIpsUsed | 0 | 04:52:26 | totalSecurityGroupsUsed | 0 | 04:52:26 | totalServerGroupsUsed | 0 | 04:52:26 | maxTotalVolumes | -1 | 04:52:26 | maxTotalSnapshots | 10 | 04:52:26 | maxTotalVolumeGigabytes | 4096 | 04:52:26 | maxTotalBackups | 10 | 04:52:26 | maxTotalBackupGigabytes | 1000 | 04:52:26 | 
totalVolumesUsed | 3 | 04:52:26 | totalGigabytesUsed | 60 | 04:52:26 | totalSnapshotsUsed | 0 | 04:52:26 | totalBackupsUsed | 0 | 04:52:26 | totalBackupGigabytesUsed | 0 | 04:52:26 +--------------------------+---------+ 04:52:26 + pushd /opt/ciman/openstack-hot 04:52:26 /opt/ciman/openstack-hot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium 04:52:26 + lftools openstack --os-cloud vex stack create releng-openflowplugin-csit-3node-clustering-only-titanium-406 csit-2-instance-type.yaml /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/stack-parameters.yaml 04:52:54 Creating stack releng-openflowplugin-csit-3node-clustering-only-titanium-406 04:52:54 Waiting to initialize infrastructure... 04:52:54 Stack initialization successful. 04:52:54 ------------------------------------ 04:52:54 Stack Details 04:52:54 ------------------------------------ 04:52:54 {'added': None, 04:52:54 'capabilities': [], 04:52:54 'created_at': '2025-09-13T04:52:29Z', 04:52:54 'deleted': None, 04:52:54 'deleted_at': None, 04:52:54 'description': 'No description', 04:52:54 'environment': None, 04:52:54 'environment_files': None, 04:52:54 'files': None, 04:52:54 'files_container': None, 04:52:54 'id': '36366a9a-c627-4fb9-a21e-b71e813ff412', 04:52:54 'is_rollback_disabled': True, 04:52:54 'links': [{'href': 'https://orchestration.public.mtl1.vexxhost.net/v1/12c36e260d8e4bb2913965203b1b491f/stacks/releng-openflowplugin-csit-3node-clustering-only-titanium-406/36366a9a-c627-4fb9-a21e-b71e813ff412', 04:52:54 'rel': 'self'}], 04:52:54 'location': Munch({'cloud': 'vex', 'region_name': 'ca-ymq-1', 'zone': None, 'project': Munch({'id': '12c36e260d8e4bb2913965203b1b491f', 'name': '61975f2c-7c17-4d69-82fa-c3ae420ad6fd', 'domain_id': None, 'domain_name': 'Default'})}), 04:52:54 'name': 'releng-openflowplugin-csit-3node-clustering-only-titanium-406', 04:52:54 'notification_topics': [], 04:52:54 'outputs': [{'description': 'IP addresses of the 1st vm types', 04:52:54 'output_key': 'vm_0_ips', 04:52:54 'output_value': ['10.30.170.73', 04:52:54 '10.30.171.201', 04:52:54 '10.30.170.175']}, 04:52:54 {'description': 'IP addresses of the 2nd vm types', 04:52:54 'output_key': 'vm_1_ips', 04:52:54 'output_value': ['10.30.171.2']}], 04:52:54 'owner_id': ****, 04:52:54 'parameters': {'OS::project_id': '12c36e260d8e4bb2913965203b1b491f', 04:52:54 'OS::stack_id': '36366a9a-c627-4fb9-a21e-b71e813ff412', 04:52:54 'OS::stack_name': 'releng-openflowplugin-csit-3node-clustering-only-titanium-406', 04:52:54 'job_name': '62057-406', 04:52:54 'silo': 'releng', 04:52:54 'vm_0_count': '3', 04:52:54 'vm_0_flavor': 'v3-standard-4', 04:52:54 'vm_0_image': 'ZZCI - Ubuntu 22.04 - builder - x86_64 - ' 04:52:54 '20250201-010426.857', 04:52:54 'vm_1_count': '1', 04:52:54 'vm_1_flavor': 'v3-standard-2', 04:52:54 'vm_1_image': 'ZZCI - Ubuntu 22.04 - mininet-ovs-217 - x86_64 ' 04:52:54 '- 20250201-060151.911'}, 04:52:54 'parent_id': None, 04:52:54 'replaced': None, 04:52:54 'status': 'CREATE_COMPLETE', 04:52:54 'status_reason': 'Stack CREATE completed successfully', 04:52:54 'tags': [], 04:52:54 'template': None, 04:52:54 'template_description': 'No description', 04:52:54 'template_url': None, 04:52:54 'timeout_mins': 15, 04:52:54 'unchanged': None, 04:52:54 'updated': None, 04:52:54 'updated_at': None, 04:52:54 'user_project_id': '311d8c7e88e24e419852de288f8c3811'} 04:52:54 ------------------------------------ 04:52:54 + popd 04:52:54 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium 04:52:54 
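Note: the stack outputs above (vm_0_ips, vm_1_ips) are what the later steps use to address the builder and mininet nodes. A minimal sketch of reading them back with the same cloud and stack name, using the jq filter this job itself runs further down in this log:

  # Sketch only: print every IP recorded in the vm_*_ips stack outputs.
  openstack --os-cloud vex stack show -f json -c outputs \
      releng-openflowplugin-csit-3node-clustering-only-titanium-406 \
    | jq -r '.outputs[] | select(.output_key | match("^vm_[0-9]+_ips$")) | .output_value | .[]'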
[openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash -l /tmp/jenkins14543885209083928644.sh 04:52:54 ---> Copy SSH public keys to CSIT lab 04:52:54 Setup pyenv: 04:52:55 system 04:52:55 3.8.13 04:52:55 3.9.13 04:52:55 3.10.13 04:52:55 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 04:52:55 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-l3Cx from file:/tmp/.os_lf_venv 04:52:57 lf-activate-venv(): INFO: Installing: lftools[openstack] kubernetes python-heatclient python-openstackclient 04:53:20 lf-activate-venv(): INFO: Adding /tmp/venv-l3Cx/bin to PATH 04:53:22 SSH not responding on 10.30.170.73. Retrying in 10 seconds... 04:53:22 SSH not responding on 10.30.171.201. Retrying in 10 seconds... 04:53:23 Warning: Permanently added '10.30.171.2' (ECDSA) to the list of known hosts. 04:53:24 releng-62057-406-1-mininet-ovs-217-0 04:53:24 Successfully copied public keys to slave 10.30.171.2 04:53:24 Process 6515 ready. 04:53:24 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 04:53:24 releng-62057-406-0-builder-2 04:53:24 Successfully copied public keys to slave 10.30.170.175 04:53:32 Ping to 10.30.170.73 successful. 04:53:32 Ping to 10.30.171.201 successful. 04:53:33 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 04:53:34 releng-62057-406-0-builder-0 04:53:34 Successfully copied public keys to slave 10.30.170.73 04:53:34 Process 6516 ready. 04:53:34 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 04:53:34 releng-62057-406-0-builder-1 04:53:34 Successfully copied public keys to slave 10.30.171.201 04:53:34 Process 6517 ready. 04:53:34 Process 6518 ready. 04:53:34 SSH ready on all stack servers. 04:53:34 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash -l /tmp/jenkins16724180209623694070.sh 04:53:35 Setup pyenv: 04:53:35 system 04:53:35 3.8.13 04:53:35 3.9.13 04:53:35 3.10.13 04:53:35 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 04:53:39 lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-jtl7 04:53:39 lf-activate-venv(): INFO: Save venv in file: /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.robot_venv 04:53:43 lf-activate-venv(): INFO: Installing: setuptools wheel 04:53:44 lf-activate-venv(): INFO: Adding /tmp/venv-jtl7/bin to PATH 04:53:44 + echo 'Installing Python Requirements' 04:53:44 Installing Python Requirements 04:53:44 + cat 04:53:44 + python -m pip install -r requirements.txt 04:53:45 Looking in indexes: https://nexus3.opendaylight.org/repository/PyPi/simple 04:53:45 Collecting docker-py (from -r requirements.txt (line 1)) 04:53:45 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/docker-py/1.10.6/docker_py-1.10.6-py2.py3-none-any.whl (50 kB) 04:53:45 Collecting ipaddr (from -r requirements.txt (line 2)) 04:53:45 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/ipaddr/2.2.0/ipaddr-2.2.0.tar.gz (26 kB) 04:53:45 Preparing metadata (setup.py): started 04:53:45 Preparing metadata (setup.py): finished with status 'done' 04:53:45 Collecting netaddr (from -r requirements.txt (line 3)) 04:53:45 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/netaddr/1.3.0/netaddr-1.3.0-py3-none-any.whl (2.3 MB) 04:53:45 Collecting netifaces (from -r requirements.txt (line 4)) 04:53:45 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/netifaces/0.11.0/netifaces-0.11.0.tar.gz (30 kB) 
04:53:45 Preparing metadata (setup.py): started 04:53:45 Preparing metadata (setup.py): finished with status 'done' 04:53:45 Collecting pyhocon (from -r requirements.txt (line 5)) 04:53:45 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pyhocon/0.3.61/pyhocon-0.3.61-py3-none-any.whl (25 kB) 04:53:45 Collecting requests (from -r requirements.txt (line 6)) 04:53:46 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/requests/2.32.5/requests-2.32.5-py3-none-any.whl (64 kB) 04:53:46 Collecting robotframework (from -r requirements.txt (line 7)) 04:53:46 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework/7.3.2/robotframework-7.3.2-py3-none-any.whl (795 kB) 04:53:46 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 795.1/795.1 kB 20.5 MB/s 0:00:00 04:53:46 Collecting robotframework-httplibrary (from -r requirements.txt (line 8)) 04:53:46 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-httplibrary/0.4.2/robotframework-httplibrary-0.4.2.tar.gz (9.1 kB) 04:53:46 Preparing metadata (setup.py): started 04:53:46 Preparing metadata (setup.py): finished with status 'done' 04:53:46 Collecting robotframework-requests==0.9.7 (from -r requirements.txt (line 9)) 04:53:46 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-requests/0.9.7/robotframework_requests-0.9.7-py3-none-any.whl (21 kB) 04:53:46 Collecting robotframework-selenium2library (from -r requirements.txt (line 10)) 04:53:46 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-selenium2library/3.0.0/robotframework_selenium2library-3.0.0-py2.py3-none-any.whl (6.2 kB) 04:53:46 Collecting robotframework-sshlibrary==3.8.0 (from -r requirements.txt (line 11)) 04:53:46 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-sshlibrary/3.8.0/robotframework-sshlibrary-3.8.0.tar.gz (51 kB) 04:53:46 Preparing metadata (setup.py): started 04:53:46 Preparing metadata (setup.py): finished with status 'done' 04:53:46 Collecting scapy (from -r requirements.txt (line 12)) 04:53:46 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/scapy/2.6.1/scapy-2.6.1-py3-none-any.whl (2.4 MB) 04:53:46 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.4/2.4 MB 29.0 MB/s 0:00:00 04:53:46 Collecting jsonpath-rw (from -r requirements.txt (line 15)) 04:53:46 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/jsonpath-rw/1.4.0/jsonpath-rw-1.4.0.tar.gz (13 kB) 04:53:46 Preparing metadata (setup.py): started 04:53:46 Preparing metadata (setup.py): finished with status 'done' 04:53:47 Collecting elasticsearch (from -r requirements.txt (line 18)) 04:53:47 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch/9.1.1/elasticsearch-9.1.1-py3-none-any.whl (937 kB) 04:53:47 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 937.5/937.5 kB 20.6 MB/s 0:00:00 04:53:47 Collecting elasticsearch-dsl (from -r requirements.txt (line 19)) 04:53:47 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch-dsl/8.18.0/elasticsearch_dsl-8.18.0-py3-none-any.whl (10 kB) 04:53:47 Collecting pyangbind (from -r requirements.txt (line 22)) 04:53:47 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pyangbind/0.8.6/pyangbind-0.8.6-py3-none-any.whl (52 kB) 04:53:47 Collecting isodate (from -r requirements.txt (line 25)) 04:53:47 Downloading 
https://nexus3.opendaylight.org/repository/PyPi/packages/isodate/0.7.2/isodate-0.7.2-py3-none-any.whl (22 kB) 04:53:47 Collecting jmespath (from -r requirements.txt (line 28)) 04:53:47 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/jmespath/1.0.1/jmespath-1.0.1-py3-none-any.whl (20 kB) 04:53:47 Collecting jsonpatch (from -r requirements.txt (line 31)) 04:53:47 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/jsonpatch/1.33/jsonpatch-1.33-py2.py3-none-any.whl (12 kB) 04:53:47 Collecting paramiko>=1.15.3 (from robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 04:53:47 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/paramiko/4.0.0/paramiko-4.0.0-py3-none-any.whl (223 kB) 04:53:47 Collecting scp>=0.13.0 (from robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 04:53:47 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/scp/0.15.0/scp-0.15.0-py2.py3-none-any.whl (8.8 kB) 04:53:47 Collecting docker-pycreds>=0.2.1 (from docker-py->-r requirements.txt (line 1)) 04:53:47 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/docker-pycreds/0.4.0/docker_pycreds-0.4.0-py2.py3-none-any.whl (9.0 kB) 04:53:47 Collecting six>=1.4.0 (from docker-py->-r requirements.txt (line 1)) 04:53:47 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/six/1.17.0/six-1.17.0-py2.py3-none-any.whl (11 kB) 04:53:47 Collecting websocket-client>=0.32.0 (from docker-py->-r requirements.txt (line 1)) 04:53:47 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/websocket-client/1.8.0/websocket_client-1.8.0-py3-none-any.whl (58 kB) 04:53:47 Collecting pyparsing<4,>=2 (from pyhocon->-r requirements.txt (line 5)) 04:53:47 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/pyparsing/3.2.3/pyparsing-3.2.3-py3-none-any.whl (111 kB) 04:53:47 Collecting charset_normalizer<4,>=2 (from requests->-r requirements.txt (line 6)) 04:53:47 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/charset-normalizer/3.4.3/charset_normalizer-3.4.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl (150 kB) 04:53:47 Collecting idna<4,>=2.5 (from requests->-r requirements.txt (line 6)) 04:53:47 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/idna/3.10/idna-3.10-py3-none-any.whl (70 kB) 04:53:47 Collecting urllib3<3,>=1.21.1 (from requests->-r requirements.txt (line 6)) 04:53:47 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/urllib3/2.5.0/urllib3-2.5.0-py3-none-any.whl (129 kB) 04:53:47 Collecting certifi>=2017.4.17 (from requests->-r requirements.txt (line 6)) 04:53:47 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/certifi/2025.8.3/certifi-2025.8.3-py3-none-any.whl (161 kB) 04:53:48 Collecting webtest>=2.0 (from robotframework-httplibrary->-r requirements.txt (line 8)) 04:53:48 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/webtest/3.0.6/webtest-3.0.6-py3-none-any.whl (32 kB) 04:53:48 Collecting jsonpointer (from robotframework-httplibrary->-r requirements.txt (line 8)) 04:53:48 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/jsonpointer/3.0.0/jsonpointer-3.0.0-py2.py3-none-any.whl (7.6 kB) 04:53:48 Collecting robotframework-seleniumlibrary>=3.0.0 (from robotframework-selenium2library->-r requirements.txt (line 10)) 04:53:48 Downloading 
https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-seleniumlibrary/6.7.1/robotframework_seleniumlibrary-6.7.1-py2.py3-none-any.whl (104 kB) 04:53:48 Collecting ply (from jsonpath-rw->-r requirements.txt (line 15)) 04:53:48 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/ply/3.11/ply-3.11-py2.py3-none-any.whl (49 kB) 04:53:48 Collecting decorator (from jsonpath-rw->-r requirements.txt (line 15)) 04:53:48 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/decorator/5.2.1/decorator-5.2.1-py3-none-any.whl (9.2 kB) 04:53:48 Collecting elastic-transport<10,>=9.1.0 (from elasticsearch->-r requirements.txt (line 18)) 04:53:48 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elastic-transport/9.1.0/elastic_transport-9.1.0-py3-none-any.whl (65 kB) 04:53:48 Collecting python-dateutil (from elasticsearch->-r requirements.txt (line 18)) 04:53:48 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/python-dateutil/2.9.0.post0/python_dateutil-2.9.0.post0-py2.py3-none-any.whl (229 kB) 04:53:48 Collecting typing-extensions (from elasticsearch->-r requirements.txt (line 18)) 04:53:48 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/typing-extensions/4.15.0/typing_extensions-4.15.0-py3-none-any.whl (44 kB) 04:53:48 Collecting elasticsearch (from -r requirements.txt (line 18)) 04:53:48 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch/8.19.1/elasticsearch-8.19.1-py3-none-any.whl (940 kB) 04:53:48 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 940.5/940.5 kB 12.9 MB/s 0:00:00 04:53:48 INFO: pip is looking at multiple versions of elasticsearch-dsl to determine which version is compatible with other requirements. This could take a while. 
04:53:48 Collecting elasticsearch-dsl (from -r requirements.txt (line 19)) 04:53:48 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch-dsl/8.17.1/elasticsearch_dsl-8.17.1-py3-none-any.whl (158 kB) 04:53:48 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch-dsl/8.17.0/elasticsearch_dsl-8.17.0-py3-none-any.whl (158 kB) 04:53:48 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch-dsl/8.16.0/elasticsearch_dsl-8.16.0-py3-none-any.whl (158 kB) 04:53:48 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch-dsl/8.15.4/elasticsearch_dsl-8.15.4-py3-none-any.whl (104 kB) 04:53:48 Collecting elastic-transport<9,>=8.15.1 (from elasticsearch->-r requirements.txt (line 18)) 04:53:48 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elastic-transport/8.17.1/elastic_transport-8.17.1-py3-none-any.whl (64 kB) 04:53:48 Collecting pyang (from pyangbind->-r requirements.txt (line 22)) 04:53:48 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pyang/2.7.1/pyang-2.7.1-py2.py3-none-any.whl (598 kB) 04:53:48 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 598.5/598.5 kB 23.9 MB/s 0:00:00 04:53:49 Collecting lxml (from pyangbind->-r requirements.txt (line 22)) 04:53:49 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/lxml/6.0.1/lxml-6.0.1-cp311-cp311-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl (5.2 MB) 04:53:49 Collecting regex (from pyangbind->-r requirements.txt (line 22)) 04:53:49 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/regex/2025.9.1/regex-2025.9.1-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl (798 kB) 04:53:50 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 799.0/799.0 kB 13.8 MB/s 0:00:00 04:53:50 Collecting enum34 (from pyangbind->-r requirements.txt (line 22)) 04:53:50 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/enum34/1.1.10/enum34-1.1.10-py3-none-any.whl (11 kB) 04:53:50 Collecting bcrypt>=3.2 (from paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 04:53:50 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/bcrypt/4.3.0/bcrypt-4.3.0-cp39-abi3-manylinux_2_28_x86_64.whl (284 kB) 04:53:50 Collecting cryptography>=3.3 (from paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 04:53:50 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/cryptography/45.0.7/cryptography-45.0.7-cp311-abi3-manylinux_2_28_x86_64.whl (4.5 MB) 04:53:50 Collecting invoke>=2.0 (from paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 04:53:50 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/invoke/2.2.0/invoke-2.2.0-py3-none-any.whl (160 kB) 04:53:50 Collecting pynacl>=1.5 (from paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 04:53:50 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/pynacl/1.6.0/pynacl-1.6.0-cp38-abi3-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl (1.4 MB) 04:53:51 Collecting cffi>=1.14 (from cryptography>=3.3->paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 04:53:51 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/cffi/2.0.0/cffi-2.0.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (215 kB) 04:53:51 Collecting pycparser (from 
cffi>=1.14->cryptography>=3.3->paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 04:53:51 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/pycparser/2.23/pycparser-2.23-py3-none-any.whl (118 kB) 04:53:51 Collecting selenium>=4.3.0 (from robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:53:51 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/selenium/4.35.0/selenium-4.35.0-py3-none-any.whl (9.6 MB) 04:53:51 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 9.6/9.6 MB 68.6 MB/s 0:00:00 04:53:51 Collecting robotframework-pythonlibcore>=4.4.1 (from robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:53:51 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-pythonlibcore/4.4.1/robotframework_pythonlibcore-4.4.1-py2.py3-none-any.whl (12 kB) 04:53:51 Collecting click>=8.0 (from robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:53:51 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/click/8.2.1/click-8.2.1-py3-none-any.whl (102 kB) 04:53:51 Collecting trio~=0.30.0 (from selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:53:51 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/trio/0.30.0/trio-0.30.0-py3-none-any.whl (499 kB) 04:53:51 Collecting trio-websocket~=0.12.2 (from selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:53:51 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/trio-websocket/0.12.2/trio_websocket-0.12.2-py3-none-any.whl (21 kB) 04:53:51 Collecting typing-extensions (from elasticsearch-dsl->-r requirements.txt (line 19)) 04:53:51 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/typing-extensions/4.14.1/typing_extensions-4.14.1-py3-none-any.whl (43 kB) 04:53:51 Collecting attrs>=23.2.0 (from trio~=0.30.0->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:53:51 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/attrs/25.3.0/attrs-25.3.0-py3-none-any.whl (63 kB) 04:53:51 Collecting sortedcontainers (from trio~=0.30.0->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:53:51 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/sortedcontainers/2.4.0/sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB) 04:53:51 Collecting outcome (from trio~=0.30.0->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:53:51 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/outcome/1.3.0.post0/outcome-1.3.0.post0-py2.py3-none-any.whl (10 kB) 04:53:51 Collecting sniffio>=1.3.0 (from trio~=0.30.0->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:53:51 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/sniffio/1.3.1/sniffio-1.3.1-py3-none-any.whl (10 kB) 04:53:51 Collecting wsproto>=0.14 (from trio-websocket~=0.12.2->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:53:51 Downloading 
https://nexus3.opendaylight.org/repository/PyPi/packages/wsproto/1.2.0/wsproto-1.2.0-py3-none-any.whl (24 kB) 04:53:51 Collecting pysocks!=1.5.7,<2.0,>=1.5.6 (from urllib3[socks]<3.0,>=2.5.0->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:53:51 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pysocks/1.7.1/PySocks-1.7.1-py3-none-any.whl (16 kB) 04:53:51 Collecting WebOb>=1.2 (from webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) 04:53:51 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/webob/1.8.9/WebOb-1.8.9-py2.py3-none-any.whl (115 kB) 04:53:51 Collecting waitress>=3.0.2 (from webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) 04:53:51 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/waitress/3.0.2/waitress-3.0.2-py3-none-any.whl (56 kB) 04:53:53 Collecting beautifulsoup4 (from webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) 04:53:53 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/beautifulsoup4/4.13.5/beautifulsoup4-4.13.5-py3-none-any.whl (105 kB) 04:53:53 Collecting h11<1,>=0.9.0 (from wsproto>=0.14->trio-websocket~=0.12.2->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:53:53 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/h11/0.16.0/h11-0.16.0-py3-none-any.whl (37 kB) 04:53:53 Collecting soupsieve>1.2 (from beautifulsoup4->webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) 04:53:53 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/soupsieve/2.8/soupsieve-2.8-py3-none-any.whl (36 kB) 04:53:53 Building wheels for collected packages: robotframework-sshlibrary, ipaddr, netifaces, robotframework-httplibrary, jsonpath-rw 04:53:53 DEPRECATION: Building 'robotframework-sshlibrary' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'robotframework-sshlibrary'. Discussion can be found at https://github.com/pypa/pip/issues/6334 04:53:53 Building wheel for robotframework-sshlibrary (setup.py): started 04:53:53 Building wheel for robotframework-sshlibrary (setup.py): finished with status 'done' 04:53:53 Created wheel for robotframework-sshlibrary: filename=robotframework_sshlibrary-3.8.0-py3-none-any.whl size=55205 sha256=82156247f1359fe4b3d6e096dd6d5e205bd4a82abcda8e0f57b510d9370632fc 04:53:53 Stored in directory: /home/jenkins/.cache/pip/wheels/f7/c9/b3/a977b7bcc410d45ae27d240df3d00a12585509180e373ecccc 04:53:53 DEPRECATION: Building 'ipaddr' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'ipaddr'. 
Discussion can be found at https://github.com/pypa/pip/issues/6334 04:53:53 Building wheel for ipaddr (setup.py): started 04:53:53 Building wheel for ipaddr (setup.py): finished with status 'done' 04:53:53 Created wheel for ipaddr: filename=ipaddr-2.2.0-py3-none-any.whl size=18353 sha256=baa55b6b9dbf25a8e534289dfce9509b84b20b96de2ce6928ab42a0163d6016e 04:53:53 Stored in directory: /home/jenkins/.cache/pip/wheels/dc/6c/04/da2d847fa8d45c59af3e1d83e2acc29cb8adcbaf04c0898dbf 04:53:53 DEPRECATION: Building 'netifaces' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'netifaces'. Discussion can be found at https://github.com/pypa/pip/issues/6334 04:53:53 Building wheel for netifaces (setup.py): started 04:53:55 Building wheel for netifaces (setup.py): finished with status 'done' 04:53:55 Created wheel for netifaces: filename=netifaces-0.11.0-cp311-cp311-linux_x86_64.whl size=41082 sha256=f55760937ccf918bd966c897785419867c84c0436e976157cb3a98235f291fa9 04:53:55 Stored in directory: /home/jenkins/.cache/pip/wheels/f8/18/88/e61d54b995bea304bdb1d040a92b72228a1bf72ca2a3eba7c9 04:53:55 DEPRECATION: Building 'robotframework-httplibrary' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'robotframework-httplibrary'. Discussion can be found at https://github.com/pypa/pip/issues/6334 04:53:55 Building wheel for robotframework-httplibrary (setup.py): started 04:53:55 Building wheel for robotframework-httplibrary (setup.py): finished with status 'done' 04:53:55 Created wheel for robotframework-httplibrary: filename=robotframework_httplibrary-0.4.2-py3-none-any.whl size=10014 sha256=f7131eacef27ca97659b9b579b53242fb634bb918b6e94ca84ee9579860712e8 04:53:55 Stored in directory: /home/jenkins/.cache/pip/wheels/aa/bc/0d/9a20dd51effef392aae2733cb4c7b66c6fa29fca33d88b57ed 04:53:55 DEPRECATION: Building 'jsonpath-rw' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'jsonpath-rw'. 
Discussion can be found at https://github.com/pypa/pip/issues/6334 04:53:55 Building wheel for jsonpath-rw (setup.py): started 04:53:55 Building wheel for jsonpath-rw (setup.py): finished with status 'done' 04:53:55 Created wheel for jsonpath-rw: filename=jsonpath_rw-1.4.0-py3-none-any.whl size=15176 sha256=1310c6e68f4126be3d7a0e347a1595aba6e36a1233954e9079b7011f3b26953b 04:53:55 Stored in directory: /home/jenkins/.cache/pip/wheels/f1/54/63/9a8da38cefae13755097b36cc852decc25d8ef69c37d58d4eb 04:53:55 Successfully built robotframework-sshlibrary ipaddr netifaces robotframework-httplibrary jsonpath-rw 04:53:55 Installing collected packages: sortedcontainers, ply, netifaces, ipaddr, enum34, websocket-client, WebOb, waitress, urllib3, typing-extensions, soupsieve, sniffio, six, scapy, robotframework-pythonlibcore, robotframework, regex, pysocks, pyparsing, pycparser, netaddr, lxml, jsonpointer, jmespath, isodate, invoke, idna, h11, decorator, click, charset_normalizer, certifi, bcrypt, attrs, wsproto, requests, python-dateutil, pyhocon, pyang, outcome, jsonpath-rw, jsonpatch, elastic-transport, docker-pycreds, cffi, beautifulsoup4, webtest, trio, robotframework-requests, pynacl, pyangbind, elasticsearch, docker-py, cryptography, trio-websocket, robotframework-httplibrary, paramiko, elasticsearch-dsl, selenium, scp, robotframework-sshlibrary, robotframework-seleniumlibrary, robotframework-selenium2library 04:54:02 04:54:02 Successfully installed WebOb-1.8.9 attrs-25.3.0 bcrypt-4.3.0 beautifulsoup4-4.13.5 certifi-2025.8.3 cffi-2.0.0 charset_normalizer-3.4.3 click-8.2.1 cryptography-45.0.7 decorator-5.2.1 docker-py-1.10.6 docker-pycreds-0.4.0 elastic-transport-8.17.1 elasticsearch-8.19.1 elasticsearch-dsl-8.15.4 enum34-1.1.10 h11-0.16.0 idna-3.10 invoke-2.2.0 ipaddr-2.2.0 isodate-0.7.2 jmespath-1.0.1 jsonpatch-1.33 jsonpath-rw-1.4.0 jsonpointer-3.0.0 lxml-6.0.1 netaddr-1.3.0 netifaces-0.11.0 outcome-1.3.0.post0 paramiko-4.0.0 ply-3.11 pyang-2.7.1 pyangbind-0.8.6 pycparser-2.23 pyhocon-0.3.61 pynacl-1.6.0 pyparsing-3.2.3 pysocks-1.7.1 python-dateutil-2.9.0.post0 regex-2025.9.1 requests-2.32.5 robotframework-7.3.2 robotframework-httplibrary-0.4.2 robotframework-pythonlibcore-4.4.1 robotframework-requests-0.9.7 robotframework-selenium2library-3.0.0 robotframework-seleniumlibrary-6.7.1 robotframework-sshlibrary-3.8.0 scapy-2.6.1 scp-0.15.0 selenium-4.35.0 six-1.17.0 sniffio-1.3.1 sortedcontainers-2.4.0 soupsieve-2.8 trio-0.30.0 trio-websocket-0.12.2 typing-extensions-4.14.1 urllib3-2.5.0 waitress-3.0.2 websocket-client-1.8.0 webtest-3.0.6 wsproto-1.2.0 04:54:02 + pip freeze 04:54:03 attrs==25.3.0 04:54:03 bcrypt==4.3.0 04:54:03 beautifulsoup4==4.13.5 04:54:03 certifi==2025.8.3 04:54:03 cffi==2.0.0 04:54:03 charset-normalizer==3.4.3 04:54:03 click==8.2.1 04:54:03 cryptography==45.0.7 04:54:03 decorator==5.2.1 04:54:03 distlib==0.4.0 04:54:03 docker-py==1.10.6 04:54:03 docker-pycreds==0.4.0 04:54:03 elastic-transport==8.17.1 04:54:03 elasticsearch==8.19.1 04:54:03 elasticsearch-dsl==8.15.4 04:54:03 enum34==1.1.10 04:54:03 filelock==3.19.1 04:54:03 h11==0.16.0 04:54:03 idna==3.10 04:54:03 invoke==2.2.0 04:54:03 ipaddr==2.2.0 04:54:03 isodate==0.7.2 04:54:03 jmespath==1.0.1 04:54:03 jsonpatch==1.33 04:54:03 jsonpath-rw==1.4.0 04:54:03 jsonpointer==3.0.0 04:54:03 lxml==6.0.1 04:54:03 netaddr==1.3.0 04:54:03 netifaces==0.11.0 04:54:03 outcome==1.3.0.post0 04:54:03 paramiko==4.0.0 04:54:03 platformdirs==4.4.0 04:54:03 ply==3.11 04:54:03 pyang==2.7.1 04:54:03 pyangbind==0.8.6 04:54:03 pycparser==2.23 04:54:03 
pyhocon==0.3.61 04:54:03 PyNaCl==1.6.0 04:54:03 pyparsing==3.2.3 04:54:03 PySocks==1.7.1 04:54:03 python-dateutil==2.9.0.post0 04:54:03 regex==2025.9.1 04:54:03 requests==2.32.5 04:54:03 robotframework==7.3.2 04:54:03 robotframework-httplibrary==0.4.2 04:54:03 robotframework-pythonlibcore==4.4.1 04:54:03 robotframework-requests==0.9.7 04:54:03 robotframework-selenium2library==3.0.0 04:54:03 robotframework-seleniumlibrary==6.7.1 04:54:03 robotframework-sshlibrary==3.8.0 04:54:03 scapy==2.6.1 04:54:03 scp==0.15.0 04:54:03 selenium==4.35.0 04:54:03 six==1.17.0 04:54:03 sniffio==1.3.1 04:54:03 sortedcontainers==2.4.0 04:54:03 soupsieve==2.8 04:54:03 trio==0.30.0 04:54:03 trio-websocket==0.12.2 04:54:03 typing_extensions==4.14.1 04:54:03 urllib3==2.5.0 04:54:03 virtualenv==20.34.0 04:54:03 waitress==3.0.2 04:54:03 WebOb==1.8.9 04:54:03 websocket-client==1.8.0 04:54:03 WebTest==3.0.6 04:54:03 wsproto==1.2.0 04:54:06 [EnvInject] - Injecting environment variables from a build step. 04:54:06 [EnvInject] - Injecting as environment variables the properties file path 'env.properties' 04:54:06 [EnvInject] - Variables injected successfully. 04:54:06 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash -l /tmp/jenkins11986776861445691661.sh 04:54:06 Setup pyenv: 04:54:07 system 04:54:07 3.8.13 04:54:07 3.9.13 04:54:07 3.10.13 04:54:07 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 04:54:07 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-l3Cx from file:/tmp/.os_lf_venv 04:54:09 lf-activate-venv(): INFO: Installing: python-heatclient python-openstackclient yq 04:54:16 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 04:54:16 lftools 0.37.13 requires urllib3<2.1.0, but you have urllib3 2.5.0 which is incompatible. 04:54:16 lf-activate-venv(): INFO: Adding /tmp/venv-l3Cx/bin to PATH 04:54:16 + ODL_SYSTEM=() 04:54:16 + TOOLS_SYSTEM=() 04:54:16 + OPENSTACK_SYSTEM=() 04:54:16 + OPENSTACK_CONTROLLERS=() 04:54:16 + mapfile -t ADDR 04:54:16 ++ openstack stack show -f json -c outputs releng-openflowplugin-csit-3node-clustering-only-titanium-406 04:54:16 ++ jq -r '.outputs[] | select(.output_key | match("^vm_[0-9]+_ips$")) | .output_value | .[]' 04:54:18 + for i in "${ADDR[@]}" 04:54:18 ++ ssh 10.30.170.73 hostname -s 04:54:18 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 04:54:18 + REMHOST=releng-62057-406-0-builder-0 04:54:18 + case ${REMHOST} in 04:54:18 + ODL_SYSTEM=("${ODL_SYSTEM[@]}" "${i}") 04:54:18 + for i in "${ADDR[@]}" 04:54:18 ++ ssh 10.30.171.201 hostname -s 04:54:18 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 04:54:18 + REMHOST=releng-62057-406-0-builder-1 04:54:18 + case ${REMHOST} in 04:54:18 + ODL_SYSTEM=("${ODL_SYSTEM[@]}" "${i}") 04:54:18 + for i in "${ADDR[@]}" 04:54:18 ++ ssh 10.30.170.175 hostname -s 04:54:18 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 04:54:18 + REMHOST=releng-62057-406-0-builder-2 04:54:18 + case ${REMHOST} in 04:54:18 + ODL_SYSTEM=("${ODL_SYSTEM[@]}" "${i}") 04:54:18 + for i in "${ADDR[@]}" 04:54:18 ++ ssh 10.30.171.2 hostname -s 04:54:19 Warning: Permanently added '10.30.171.2' (ECDSA) to the list of known hosts. 
04:54:19 + REMHOST=releng-62057-406-1-mininet-ovs-217-0 04:54:19 + case ${REMHOST} in 04:54:19 + TOOLS_SYSTEM=("${TOOLS_SYSTEM[@]}" "${i}") 04:54:19 + echo NUM_ODL_SYSTEM=3 04:54:19 + echo NUM_TOOLS_SYSTEM=1 04:54:19 + '[' '' == yes ']' 04:54:19 + NUM_OPENSTACK_SYSTEM=0 04:54:19 + echo NUM_OPENSTACK_SYSTEM=0 04:54:19 + '[' 0 -eq 2 ']' 04:54:19 + echo ODL_SYSTEM_IP=10.30.170.73 04:54:19 ++ seq 0 2 04:54:19 + for i in $(seq 0 $(( ${#ODL_SYSTEM[@]} - 1 ))) 04:54:19 + echo ODL_SYSTEM_1_IP=10.30.170.73 04:54:19 + for i in $(seq 0 $(( ${#ODL_SYSTEM[@]} - 1 ))) 04:54:19 + echo ODL_SYSTEM_2_IP=10.30.171.201 04:54:19 + for i in $(seq 0 $(( ${#ODL_SYSTEM[@]} - 1 ))) 04:54:19 + echo ODL_SYSTEM_3_IP=10.30.170.175 04:54:19 + echo TOOLS_SYSTEM_IP=10.30.171.2 04:54:19 ++ seq 0 0 04:54:19 + for i in $(seq 0 $(( ${#TOOLS_SYSTEM[@]} - 1 ))) 04:54:19 + echo TOOLS_SYSTEM_1_IP=10.30.171.2 04:54:19 + openstack_index=0 04:54:19 + NUM_OPENSTACK_CONTROL_NODES=1 04:54:19 + echo NUM_OPENSTACK_CONTROL_NODES=1 04:54:19 ++ seq 0 0 04:54:19 + for i in $(seq 0 $((NUM_OPENSTACK_CONTROL_NODES - 1))) 04:54:19 + echo OPENSTACK_CONTROL_NODE_1_IP= 04:54:19 + NUM_OPENSTACK_COMPUTE_NODES=-1 04:54:19 + echo NUM_OPENSTACK_COMPUTE_NODES=-1 04:54:19 + '[' -1 -ge 2 ']' 04:54:19 ++ seq 0 -2 04:54:19 + NUM_OPENSTACK_HAPROXY_NODES=0 04:54:19 + echo NUM_OPENSTACK_HAPROXY_NODES=0 04:54:19 ++ seq 0 -1 04:54:19 + echo 'Contents of slave_addresses.txt:' 04:54:19 Contents of slave_addresses.txt: 04:54:19 + cat slave_addresses.txt 04:54:19 NUM_ODL_SYSTEM=3 04:54:19 NUM_TOOLS_SYSTEM=1 04:54:19 NUM_OPENSTACK_SYSTEM=0 04:54:19 ODL_SYSTEM_IP=10.30.170.73 04:54:19 ODL_SYSTEM_1_IP=10.30.170.73 04:54:19 ODL_SYSTEM_2_IP=10.30.171.201 04:54:19 ODL_SYSTEM_3_IP=10.30.170.175 04:54:19 TOOLS_SYSTEM_IP=10.30.171.2 04:54:19 TOOLS_SYSTEM_1_IP=10.30.171.2 04:54:19 NUM_OPENSTACK_CONTROL_NODES=1 04:54:19 OPENSTACK_CONTROL_NODE_1_IP= 04:54:19 NUM_OPENSTACK_COMPUTE_NODES=-1 04:54:19 NUM_OPENSTACK_HAPROXY_NODES=0 04:54:19 [EnvInject] - Injecting environment variables from a build step. 04:54:19 [EnvInject] - Injecting as environment variables the properties file path 'slave_addresses.txt' 04:54:19 [EnvInject] - Variables injected successfully. 04:54:19 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/sh /tmp/jenkins6115873680285583309.sh 04:54:19 Preparing for JRE Version 21 04:54:19 Karaf artifact is karaf 04:54:19 Karaf project is integration 04:54:19 Java home is /usr/lib/jvm/java-21-openjdk-amd64 04:54:19 [EnvInject] - Injecting environment variables from a build step. 04:54:19 [EnvInject] - Injecting as environment variables the properties file path 'set_variables.env' 04:54:19 [EnvInject] - Variables injected successfully. 04:54:19 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins1247689088044178177.sh 04:54:19 Distribution bundle URL is https://nexus.opendaylight.org/content/repositories//autorelease-9182/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 04:54:19 Distribution bundle is karaf-0.22.1.zip 04:54:19 Distribution bundle version is 0.22.1 04:54:19 Distribution folder is karaf-0.22.1 04:54:19 Nexus prefix is https://nexus.opendaylight.org 04:54:19 [EnvInject] - Injecting environment variables from a build step. 04:54:19 [EnvInject] - Injecting as environment variables the properties file path 'detect_variables.env' 04:54:19 [EnvInject] - Variables injected successfully. 
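Note: the "Distribution bundle ..." values above all follow from the bundle URL. A minimal sketch of deriving them with plain parameter expansion; BUNDLEFOLDER matches the variable used by common-functions.sh later in this log, while BUNDLE, BUNDLE_VERSION and NEXUSURL_PREFIX are illustrative names, and the real job script may compute these values differently.

  # Sketch only: derive the distribution values from the bundle URL.
  BUNDLE_URL='https://nexus.opendaylight.org/content/repositories//autorelease-9182/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip'
  BUNDLE="${BUNDLE_URL##*/}"                   # karaf-0.22.1.zip
  BUNDLEFOLDER="${BUNDLE%.zip}"                # karaf-0.22.1
  BUNDLE_VERSION="${BUNDLEFOLDER#karaf-}"      # 0.22.1
  NEXUSURL_PREFIX="${BUNDLE_URL%%/content/*}"  # https://nexus.opendaylight.org
  echo "Distribution bundle is ${BUNDLE}, version ${BUNDLE_VERSION}"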
04:54:19 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash -l /tmp/jenkins2011525381722721904.sh 04:54:19 Setup pyenv: 04:54:19 system 04:54:19 3.8.13 04:54:19 3.9.13 04:54:19 3.10.13 04:54:19 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 04:54:19 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-l3Cx from file:/tmp/.os_lf_venv 04:54:21 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 04:54:21 lftools 0.37.13 requires urllib3<2.1.0, but you have urllib3 2.5.0 which is incompatible. 04:54:21 lf-activate-venv(): INFO: Installing: python-heatclient python-openstackclient 04:54:27 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 04:54:27 lftools 0.37.13 requires urllib3<2.1.0, but you have urllib3 2.5.0 which is incompatible. 04:54:28 lf-activate-venv(): INFO: Adding /tmp/venv-l3Cx/bin to PATH 04:54:28 Copying common-functions.sh to /tmp 04:54:29 Copying common-functions.sh to 10.30.170.73:/tmp 04:54:29 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 04:54:30 Copying common-functions.sh to 10.30.171.201:/tmp 04:54:30 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 04:54:30 Copying common-functions.sh to 10.30.170.175:/tmp 04:54:30 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 04:54:31 Copying common-functions.sh to 10.30.171.2:/tmp 04:54:31 Warning: Permanently added '10.30.171.2' (ECDSA) to the list of known hosts. 04:54:31 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins11001405144771700182.sh 04:54:31 common-functions.sh is being sourced 04:54:31 common-functions environment: 04:54:31 MAVENCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 04:54:31 ACTUALFEATURES: 04:54:31 FEATURESCONF: /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:31 CUSTOMPROP: /tmp/karaf-0.22.1/etc/custom.properties 04:54:31 LOGCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:31 MEMCONF: /tmp/karaf-0.22.1/bin/setenv 04:54:31 CONTROLLERMEM: 2048m 04:54:31 AKKACONF: /tmp/karaf-0.22.1/configuration/initial/pekko.conf 04:54:31 MODULESCONF: /tmp/karaf-0.22.1/configuration/initial/modules.conf 04:54:31 MODULESHARDSCONF: /tmp/karaf-0.22.1/configuration/initial/module-shards.conf 04:54:31 SUITES: 04:54:31 04:54:31 ################################################# 04:54:31 ## Configure Cluster and Start ## 04:54:31 ################################################# 04:54:31 ACTUALFEATURES: odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer 04:54:31 SPACE_SEPARATED_FEATURES: odl-infrautils-ready odl-jolokia odl-openflowplugin-flow-services-rest odl-openflowplugin-app-table-miss-enforcer 04:54:31 Locating script plan to use... 04:54:31 Finished running script plans 04:54:31 Configuring member-1 with IP address 10.30.170.73 04:54:31 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 04:54:31 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 
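The step above stages common-functions.sh on every VM; the rest of the log is the per-member configuration running remotely (the member-1 trace that follows executes as /tmp/configuration-script.sh and sources /tmp/common-functions.sh with the bundle folder and stream as arguments). Roughly, and with the copy/invoke mechanism assumed rather than shown in the log, the fan-out looks like this:

    # Sketch: stage the helper on all nodes, then configure each controller member in turn.
    # IPs and file names are taken from this run; the loop and the ssh invocation are assumed.
    controller_ips=(10.30.170.73 10.30.171.201 10.30.170.175)
    tools_ips=(10.30.171.2)

    for ip in "${controller_ips[@]}" "${tools_ips[@]}"; do
        scp /tmp/common-functions.sh "${ip}:/tmp/common-functions.sh"
    done

    member=1
    for ip in "${controller_ips[@]}"; do
        echo "Configuring member-${member} with IP address ${ip}"
        scp configuration-script.sh "${ip}:/tmp/configuration-script.sh"
        # The remote script sources /tmp/common-functions.sh karaf-0.22.1 titanium internally.
        ssh "${ip}" "bash /tmp/configuration-script.sh"
        member=$((member + 1))
    done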
04:54:32 + source /tmp/common-functions.sh karaf-0.22.1 titanium 04:54:32 ++ [[ /tmp/common-functions.sh == \/\t\m\p\/\c\o\n\f\i\g\u\r\a\t\i\o\n\-\s\c\r\i\p\t\.\s\h ]] 04:54:32 common-functions.sh is being sourced 04:54:32 ++ echo 'common-functions.sh is being sourced' 04:54:32 ++ BUNDLEFOLDER=karaf-0.22.1 04:54:32 ++ DISTROSTREAM=titanium 04:54:32 ++ export MAVENCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 04:54:32 ++ MAVENCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 04:54:32 ++ export FEATURESCONF=/tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:32 ++ FEATURESCONF=/tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:32 ++ export CUSTOMPROP=/tmp/karaf-0.22.1/etc/custom.properties 04:54:32 ++ CUSTOMPROP=/tmp/karaf-0.22.1/etc/custom.properties 04:54:32 ++ export LOGCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:32 ++ LOGCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:32 ++ export MEMCONF=/tmp/karaf-0.22.1/bin/setenv 04:54:32 ++ MEMCONF=/tmp/karaf-0.22.1/bin/setenv 04:54:32 ++ export CONTROLLERMEM= 04:54:32 ++ CONTROLLERMEM= 04:54:32 ++ case "${DISTROSTREAM}" in 04:54:32 ++ CLUSTER_SYSTEM=pekko 04:54:32 ++ export AKKACONF=/tmp/karaf-0.22.1/configuration/initial/pekko.conf 04:54:32 ++ AKKACONF=/tmp/karaf-0.22.1/configuration/initial/pekko.conf 04:54:32 ++ export MODULESCONF=/tmp/karaf-0.22.1/configuration/initial/modules.conf 04:54:32 ++ MODULESCONF=/tmp/karaf-0.22.1/configuration/initial/modules.conf 04:54:32 ++ export MODULESHARDSCONF=/tmp/karaf-0.22.1/configuration/initial/module-shards.conf 04:54:32 ++ MODULESHARDSCONF=/tmp/karaf-0.22.1/configuration/initial/module-shards.conf 04:54:32 ++ print_common_env 04:54:32 ++ cat 04:54:32 common-functions environment: 04:54:32 MAVENCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 04:54:32 ACTUALFEATURES: 04:54:32 FEATURESCONF: /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:32 CUSTOMPROP: /tmp/karaf-0.22.1/etc/custom.properties 04:54:32 LOGCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:32 MEMCONF: /tmp/karaf-0.22.1/bin/setenv 04:54:32 CONTROLLERMEM: 04:54:32 AKKACONF: /tmp/karaf-0.22.1/configuration/initial/pekko.conf 04:54:32 MODULESCONF: /tmp/karaf-0.22.1/configuration/initial/modules.conf 04:54:32 MODULESHARDSCONF: /tmp/karaf-0.22.1/configuration/initial/module-shards.conf 04:54:32 SUITES: 04:54:32 04:54:32 ++ SSH='ssh -t -t' 04:54:32 ++ extra_services_cntl=' dnsmasq.service httpd.service libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service rabbitmq-server.service ' 04:54:32 ++ extra_services_cmp=' libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service ' 04:54:32 Changing to /tmp 04:54:32 Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9182/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 04:54:32 + echo 'Changing to /tmp' 04:54:32 + cd /tmp 04:54:32 + echo 'Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9182/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip' 04:54:32 + wget --progress=dot:mega https://nexus.opendaylight.org/content/repositories//autorelease-9182/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 04:54:32 --2025-09-13 04:54:32-- https://nexus.opendaylight.org/content/repositories//autorelease-9182/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 04:54:32 Resolving nexus.opendaylight.org (nexus.opendaylight.org)... 
199.204.45.87, 2604:e100:1:0:f816:3eff:fe45:48d6 04:54:32 Connecting to nexus.opendaylight.org (nexus.opendaylight.org)|199.204.45.87|:443... connected. 04:54:32 HTTP request sent, awaiting response... 200 OK 04:54:32 Length: 236634504 (226M) [application/zip] 04:54:32 Saving to: ‘karaf-0.22.1.zip’ 04:54:32 04:54:32 0K ........ ........ ........ ........ ........ ........ 1% 66.9M 3s 04:54:32 3072K ........ ........ ........ ........ ........ ........ 2% 115M 3s 04:54:32 6144K ........ ........ ........ ........ ........ ........ 3% 131M 2s 04:54:32 9216K ........ ........ ........ ........ ........ ........ 5% 159M 2s 04:54:32 12288K ........ ........ ........ ........ ........ ........ 6% 161M 2s 04:54:32 15360K ........ ........ ........ ........ ........ ........ 7% 188M 2s 04:54:32 18432K ........ ........ ........ ........ ........ ........ 9% 190M 2s 04:54:32 21504K ........ ........ ........ ........ ........ ........ 10% 201M 1s 04:54:32 24576K ........ ........ ........ ........ ........ ........ 11% 194M 1s 04:54:32 27648K ........ ........ ........ ........ ........ ........ 13% 182M 1s 04:54:32 30720K ........ ........ ........ ........ ........ ........ 14% 198M 1s 04:54:32 33792K ........ ........ ........ ........ ........ ........ 15% 236M 1s 04:54:32 36864K ........ ........ ........ ........ ........ ........ 17% 246M 1s 04:54:32 39936K ........ ........ ........ ........ ........ ........ 18% 177M 1s 04:54:32 43008K ........ ........ ........ ........ ........ ........ 19% 200M 1s 04:54:32 46080K ........ ........ ........ ........ ........ ........ 21% 180M 1s 04:54:32 49152K ........ ........ ........ ........ ........ ........ 22% 210M 1s 04:54:32 52224K ........ ........ ........ ........ ........ ........ 23% 220M 1s 04:54:32 55296K ........ ........ ........ ........ ........ ........ 25% 232M 1s 04:54:32 58368K ........ ........ ........ ........ ........ ........ 26% 248M 1s 04:54:32 61440K ........ ........ ........ ........ ........ ........ 27% 219M 1s 04:54:32 64512K ........ ........ ........ ........ ........ ........ 29% 203M 1s 04:54:32 67584K ........ ........ ........ ........ ........ ........ 30% 172M 1s 04:54:32 70656K ........ ........ ........ ........ ........ ........ 31% 176M 1s 04:54:32 73728K ........ ........ ........ ........ ........ ........ 33% 153M 1s 04:54:32 76800K ........ ........ ........ ........ ........ ........ 34% 188M 1s 04:54:32 79872K ........ ........ ........ ........ ........ ........ 35% 230M 1s 04:54:32 82944K ........ ........ ........ ........ ........ ........ 37% 212M 1s 04:54:32 86016K ........ ........ ........ ........ ........ ........ 38% 189M 1s 04:54:32 89088K ........ ........ ........ ........ ........ ........ 39% 169M 1s 04:54:32 92160K ........ ........ ........ ........ ........ ........ 41% 195M 1s 04:54:32 95232K ........ ........ ........ ........ ........ ........ 42% 237M 1s 04:54:32 98304K ........ ........ ........ ........ ........ ........ 43% 202M 1s 04:54:32 101376K ........ ........ ........ ........ ........ ........ 45% 126M 1s 04:54:32 104448K ........ ........ ........ ........ ........ ........ 46% 170M 1s 04:54:32 107520K ........ ........ ........ ........ ........ ........ 47% 214M 1s 04:54:32 110592K ........ ........ ........ ........ ........ ........ 49% 237M 1s 04:54:32 113664K ........ ........ ........ ........ ........ ........ 50% 200M 1s 04:54:32 116736K ........ ........ ........ ........ ........ ........ 51% 164M 1s 04:54:32 119808K ........ ........ ........ ........ 
........ ........ 53% 142M 1s 04:54:32 122880K ........ ........ ........ ........ ........ ........ 54% 154M 1s 04:54:32 125952K ........ ........ ........ ........ ........ ........ 55% 98.3M 1s 04:54:32 129024K ........ ........ ........ ........ ........ ........ 57% 134M 1s 04:54:32 132096K ........ ........ ........ ........ ........ ........ 58% 206M 1s 04:54:32 135168K ........ ........ ........ ........ ........ ........ 59% 137M 1s 04:54:32 138240K ........ ........ ........ ........ ........ ........ 61% 127M 1s 04:54:32 141312K ........ ........ ........ ........ ........ ........ 62% 167M 0s 04:54:32 144384K ........ ........ ........ ........ ........ ........ 63% 164M 0s 04:54:33 147456K ........ ........ ........ ........ ........ ........ 65% 164M 0s 04:54:33 150528K ........ ........ ........ ........ ........ ........ 66% 165M 0s 04:54:33 153600K ........ ........ ........ ........ ........ ........ 67% 178M 0s 04:54:33 156672K ........ ........ ........ ........ ........ ........ 69% 163M 0s 04:54:33 159744K ........ ........ ........ ........ ........ ........ 70% 68.9M 0s 04:54:33 162816K ........ ........ ........ ........ ........ ........ 71% 190M 0s 04:54:33 165888K ........ ........ ........ ........ ........ ........ 73% 175M 0s 04:54:33 168960K ........ ........ ........ ........ ........ ........ 74% 144M 0s 04:54:33 172032K ........ ........ ........ ........ ........ ........ 75% 96.2M 0s 04:54:33 175104K ........ ........ ........ ........ ........ ........ 77% 136M 0s 04:54:33 178176K ........ ........ ........ ........ ........ ........ 78% 139M 0s 04:54:33 181248K ........ ........ ........ ........ ........ ........ 79% 191M 0s 04:54:33 184320K ........ ........ ........ ........ ........ ........ 81% 132M 0s 04:54:33 187392K ........ ........ ........ ........ ........ ........ 82% 126M 0s 04:54:33 190464K ........ ........ ........ ........ ........ ........ 83% 143M 0s 04:54:33 193536K ........ ........ ........ ........ ........ ........ 85% 77.7M 0s 04:54:33 196608K ........ ........ ........ ........ ........ ........ 86% 163M 0s 04:54:33 199680K ........ ........ ........ ........ ........ ........ 87% 170M 0s 04:54:33 202752K ........ ........ ........ ........ ........ ........ 89% 111M 0s 04:54:33 205824K ........ ........ ........ ........ ........ ........ 90% 179M 0s 04:54:33 208896K ........ ........ ........ ........ ........ ........ 91% 79.9M 0s 04:54:33 211968K ........ ........ ........ ........ ........ ........ 93% 193M 0s 04:54:33 215040K ........ ........ ........ ........ ........ ........ 94% 145M 0s 04:54:33 218112K ........ ........ ........ ........ ........ ........ 95% 143M 0s 04:54:33 221184K ........ ........ ........ ........ ........ ........ 97% 130M 0s 04:54:33 224256K ........ ........ ........ ........ ........ ........ 98% 102M 0s 04:54:33 227328K ........ ........ ........ ........ ........ ........ 99% 93.9M 0s 04:54:33 230400K ........ .. 100% 111M=1.5s 04:54:33 04:54:33 2025-09-13 04:54:33 (153 MB/s) - ‘karaf-0.22.1.zip’ saved [236634504/236634504] 04:54:33 04:54:33 Extracting the new controller... 04:54:33 + echo 'Extracting the new controller...' 04:54:33 + unzip -q karaf-0.22.1.zip 04:54:35 Adding external repositories... 04:54:35 + echo 'Adding external repositories...' 
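The "Adding external repositories..." step traced next rewrites org.ops4j.pax.url.mvn.cfg so pax-url-mvn resolves against the OpenDaylight Nexus mirrors, and a little later the same sed -ie pattern prepends the CSIT feature set to featuresBoot. A condensed restatement of those two edits, with the repository list shortened here to the two ODL entries (the full list appears verbatim in the trace below):

    # Sketch: the two in-place edits the configuration script performs on the unpacked distribution.
    bundlefolder=karaf-0.22.1
    actualfeatures="odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer"
    # Shortened for readability; the job also appends central, springsource, apache and sonatype repos.
    odl_repos="https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror"

    # Populate the (empty) remote repository list with the ODL Nexus mirrors.
    sed -ie "s%org.ops4j.pax.url.mvn.repositories=%org.ops4j.pax.url.mvn.repositories=${odl_repos}%g" \
        "/tmp/${bundlefolder}/etc/org.ops4j.pax.url.mvn.cfg"

    # Prepend the CSIT features to featuresBoot so Karaf installs them at startup.
    sed -ie "s/\(featuresBoot=\|featuresBoot =\)/featuresBoot = ${actualfeatures},/g" \
        "/tmp/${bundlefolder}/etc/org.apache.karaf.features.cfg"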
04:54:35 + sed -ie 's%org.ops4j.pax.url.mvn.repositories=%org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases%g' /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 04:54:35 + cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 04:54:35 ################################################################################ 04:54:35 # 04:54:35 # Licensed to the Apache Software Foundation (ASF) under one or more 04:54:35 # contributor license agreements. See the NOTICE file distributed with 04:54:35 # this work for additional information regarding copyright ownership. 04:54:35 # The ASF licenses this file to You under the Apache License, Version 2.0 04:54:35 # (the "License"); you may not use this file except in compliance with 04:54:35 # the License. You may obtain a copy of the License at 04:54:35 # 04:54:35 # http://www.apache.org/licenses/LICENSE-2.0 04:54:35 # 04:54:35 # Unless required by applicable law or agreed to in writing, software 04:54:35 # distributed under the License is distributed on an "AS IS" BASIS, 04:54:35 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:54:35 # See the License for the specific language governing permissions and 04:54:35 # limitations under the License. 04:54:35 # 04:54:35 ################################################################################ 04:54:35 04:54:35 # 04:54:35 # If set to true, the following property will not allow any certificate to be used 04:54:35 # when accessing Maven repositories through SSL 04:54:35 # 04:54:35 #org.ops4j.pax.url.mvn.certificateCheck= 04:54:35 04:54:35 # 04:54:35 # Path to the local Maven settings file. 04:54:35 # The repositories defined in this file will be automatically added to the list 04:54:35 # of default repositories if the 'org.ops4j.pax.url.mvn.repositories' property 04:54:35 # below is not set. 04:54:35 # The following locations are checked for the existence of the settings.xml file 04:54:35 # * 1. looks for the specified url 04:54:35 # * 2. if not found looks for ${user.home}/.m2/settings.xml 04:54:35 # * 3. if not found looks for ${maven.home}/conf/settings.xml 04:54:35 # * 4. if not found looks for ${M2_HOME}/conf/settings.xml 04:54:35 # 04:54:35 #org.ops4j.pax.url.mvn.settings= 04:54:35 04:54:35 # 04:54:35 # Path to the local Maven repository which is used to avoid downloading 04:54:35 # artifacts when they already exist locally. 04:54:35 # The value of this property will be extracted from the settings.xml file 04:54:35 # above, or defaulted to: 04:54:35 # System.getProperty( "user.home" ) + "/.m2/repository" 04:54:35 # 04:54:35 org.ops4j.pax.url.mvn.localRepository=${karaf.home}/${karaf.default.repository} 04:54:35 04:54:35 # 04:54:35 # Default this to false. 
It's just weird to use undocumented repos 04:54:35 # 04:54:35 org.ops4j.pax.url.mvn.useFallbackRepositories=false 04:54:35 04:54:35 # 04:54:35 # Uncomment if you don't wanna use the proxy settings 04:54:35 # from the Maven conf/settings.xml file 04:54:35 # 04:54:35 # org.ops4j.pax.url.mvn.proxySupport=false 04:54:35 04:54:35 # 04:54:35 # Comma separated list of repositories scanned when resolving an artifact. 04:54:35 # Those repositories will be checked before iterating through the 04:54:35 # below list of repositories and even before the local repository 04:54:35 # A repository url can be appended with zero or more of the following flags: 04:54:35 # @snapshots : the repository contains snaphots 04:54:35 # @noreleases : the repository does not contain any released artifacts 04:54:35 # 04:54:35 # The following property value will add the system folder as a repo. 04:54:35 # 04:54:35 org.ops4j.pax.url.mvn.defaultRepositories=\ 04:54:35 file:${karaf.home}/${karaf.default.repository}@id=system.repository@snapshots,\ 04:54:35 file:${karaf.data}/kar@id=kar.repository@multi@snapshots,\ 04:54:35 file:${karaf.base}/${karaf.default.repository}@id=child.system.repository@snapshots 04:54:35 04:54:35 # Use the default local repo (e.g.~/.m2/repository) as a "remote" repo 04:54:35 #org.ops4j.pax.url.mvn.defaultLocalRepoAsRemote=false 04:54:35 04:54:35 # 04:54:35 # Comma separated list of repositories scanned when resolving an artifact. 04:54:35 # The default list includes the following repositories: 04:54:35 # http://repo1.maven.org/maven2@id=central 04:54:35 # http://repository.springsource.com/maven/bundles/release@id=spring.ebr 04:54:35 # http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external 04:54:35 # http://zodiac.springsource.com/maven/bundles/release@id=gemini 04:54:35 # http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases 04:54:35 # https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases 04:54:35 # https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 04:54:35 # To add repositories to the default ones, prepend '+' to the list of repositories 04:54:35 # to add. 04:54:35 # A repository url can be appended with zero or more of the following flags: 04:54:35 # @snapshots : the repository contains snapshots 04:54:35 # @noreleases : the repository does not contain any released artifacts 04:54:35 # @id=repository.id : the id for the repository, just like in the settings.xml this is optional but recommended 04:54:35 # 04:54:35 org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 04:54:35 04:54:35 ### ^^^ No remote repositories. 
This is the only ODL change compared to Karaf defaults.+ [[ True == \T\r\u\e ]] 04:54:35 Configuring the startup features... 04:54:35 + echo 'Configuring the startup features...' 04:54:35 + sed -ie 's/\(featuresBoot=\|featuresBoot =\)/featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer,/g' /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:35 + FEATURE_TEST_STRING=features-test 04:54:35 + FEATURE_TEST_VERSION=0.22.1 04:54:35 + KARAF_VERSION=karaf4 04:54:35 + [[ integration == \i\n\t\e\g\r\a\t\i\o\n ]] 04:54:35 + sed -ie 's%\(featuresRepositories=\|featuresRepositories =\)%featuresRepositories = mvn:org.opendaylight.integration/features-test/0.22.1/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features,%g' /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:35 + [[ ! -z '' ]] 04:54:35 + cat /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:35 ################################################################################ 04:54:35 # 04:54:35 # Licensed to the Apache Software Foundation (ASF) under one or more 04:54:35 # contributor license agreements. See the NOTICE file distributed with 04:54:35 # this work for additional information regarding copyright ownership. 04:54:35 # The ASF licenses this file to You under the Apache License, Version 2.0 04:54:35 # (the "License"); you may not use this file except in compliance with 04:54:35 # the License. You may obtain a copy of the License at 04:54:35 # 04:54:35 # http://www.apache.org/licenses/LICENSE-2.0 04:54:35 # 04:54:35 # Unless required by applicable law or agreed to in writing, software 04:54:35 # distributed under the License is distributed on an "AS IS" BASIS, 04:54:35 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:54:35 # See the License for the specific language governing permissions and 04:54:35 # limitations under the License. 04:54:35 # 04:54:35 ################################################################################ 04:54:35 04:54:35 # 04:54:35 # Comma separated list of features repositories to register by default 04:54:35 # 04:54:35 featuresRepositories = mvn:org.opendaylight.integration/features-test/0.22.1/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features, file:${karaf.etc}/d7b3d213-29db-4534-93a0-031e5065da16.xml 04:54:35 04:54:35 # 04:54:35 # Comma separated list of features to install at startup 04:54:35 # 04:54:35 featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer, 510b7ed9-e8dc-40d4-b97c-ab7891511bec 04:54:35 04:54:35 # 04:54:35 # Resource repositories (OBR) that the features resolver can use 04:54:35 # to resolve requirements/capabilities 04:54:35 # 04:54:35 # The format of the resourceRepositories is 04:54:35 # resourceRepositories=[xml:url|json:url],... 
04:54:35 # for Instance: 04:54:35 # 04:54:35 #resourceRepositories=xml:http://host/path/to/index.xml 04:54:35 # or 04:54:35 #resourceRepositories=json:http://host/path/to/index.json 04:54:35 # 04:54:35 04:54:35 # 04:54:35 # Defines if the boot features are started in asynchronous mode (in a dedicated thread) 04:54:35 # 04:54:35 featuresBootAsynchronous=false 04:54:35 04:54:35 # 04:54:35 # Service requirements enforcement 04:54:35 # 04:54:35 # By default, the feature resolver checks the service requirements/capabilities of 04:54:35 # bundles for new features (xml schema >= 1.3.0) in order to automatically installs 04:54:35 # the required bundles. 04:54:35 # The following flag can have those values: 04:54:35 # - disable: service requirements are completely ignored 04:54:35 # - default: service requirements are ignored for old features 04:54:35 # - enforce: service requirements are always verified 04:54:35 # 04:54:35 #serviceRequirements=default 04:54:35 04:54:35 # 04:54:35 # Store cfg file for config element in feature 04:54:35 # 04:54:35 #configCfgStore=true 04:54:35 04:54:35 # 04:54:35 # Define if the feature service automatically refresh bundles 04:54:35 # 04:54:35 autoRefresh=true 04:54:35 04:54:35 # 04:54:35 # Configuration of features processing mechanism (overrides, blacklisting, modification of features) 04:54:35 # XML file defines instructions related to features processing 04:54:35 # versions.properties may declare properties to resolve placeholders in XML file 04:54:35 # both files are relative to ${karaf.etc} 04:54:35 # 04:54:35 #featureProcessing=org.apache.karaf.features.xml 04:54:35 #featureProcessingVersions=versions.properties 04:54:35 + configure_karaf_log karaf4 '' 04:54:35 + local -r karaf_version=karaf4 04:54:35 + local -r controllerdebugmap= 04:54:35 + local logapi=log4j 04:54:35 + grep log4j2 /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:35 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 04:54:35 log4j2.rootLogger.level = INFO 04:54:35 #log4j2.rootLogger.type = asyncRoot 04:54:35 #log4j2.rootLogger.includeLocation = false 04:54:35 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 04:54:35 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 04:54:35 log4j2.rootLogger.appenderRef.Console.ref = Console 04:54:35 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 04:54:35 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 04:54:35 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 04:54:35 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 04:54:35 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 04:54:35 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 04:54:35 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 04:54:35 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 04:54:35 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 04:54:35 log4j2.logger.spifly.name = org.apache.aries.spifly 04:54:35 log4j2.logger.spifly.level = WARN 04:54:35 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 04:54:35 log4j2.logger.audit.level = INFO 04:54:35 log4j2.logger.audit.additivity = false 04:54:35 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 04:54:35 # Console 
appender not used by default (see log4j2.rootLogger.appenderRefs) 04:54:35 log4j2.appender.console.type = Console 04:54:35 log4j2.appender.console.name = Console 04:54:35 log4j2.appender.console.layout.type = PatternLayout 04:54:35 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 04:54:35 log4j2.appender.rolling.type = RollingRandomAccessFile 04:54:35 log4j2.appender.rolling.name = RollingFile 04:54:35 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 04:54:35 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 04:54:35 #log4j2.appender.rolling.immediateFlush = false 04:54:35 log4j2.appender.rolling.append = true 04:54:35 log4j2.appender.rolling.layout.type = PatternLayout 04:54:35 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 04:54:35 log4j2.appender.rolling.policies.type = Policies 04:54:35 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 04:54:35 log4j2.appender.rolling.policies.size.size = 64MB 04:54:35 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 04:54:35 log4j2.appender.rolling.strategy.max = 7 04:54:35 log4j2.appender.audit.type = RollingRandomAccessFile 04:54:35 log4j2.appender.audit.name = AuditRollingFile 04:54:35 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 04:54:35 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 04:54:35 log4j2.appender.audit.append = true 04:54:35 log4j2.appender.audit.layout.type = PatternLayout 04:54:35 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 04:54:35 log4j2.appender.audit.policies.type = Policies 04:54:35 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 04:54:35 log4j2.appender.audit.policies.size.size = 8MB 04:54:35 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 04:54:35 log4j2.appender.audit.strategy.max = 7 04:54:35 log4j2.appender.osgi.type = PaxOsgi 04:54:35 log4j2.appender.osgi.name = PaxOsgi 04:54:35 log4j2.appender.osgi.filter = * 04:54:35 #log4j2.logger.aether.name = shaded.org.eclipse.aether 04:54:35 #log4j2.logger.aether.level = TRACE 04:54:35 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 04:54:35 #log4j2.logger.http-headers.level = DEBUG 04:54:35 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 04:54:35 #log4j2.logger.maven.level = TRACE 04:54:35 Configuring the karaf log... karaf_version: karaf4, logapi: log4j2 04:54:35 + logapi=log4j2 04:54:35 + echo 'Configuring the karaf log... 
karaf_version: karaf4, logapi: log4j2' 04:54:35 + '[' log4j2 == log4j2 ']' 04:54:35 + sed -ie 's/log4j2.appender.rolling.policies.size.size = 64MB/log4j2.appender.rolling.policies.size.size = 1GB/g' /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:35 + orgmodule=org.opendaylight.yangtools.yang.parser.repo.YangTextSchemaContextResolver 04:54:35 + orgmodule_=org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver 04:54:35 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN' 04:54:35 controllerdebugmap: 04:54:35 cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:35 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN' 04:54:35 + unset IFS 04:54:35 + echo 'controllerdebugmap: ' 04:54:35 + '[' -n '' ']' 04:54:35 + echo 'cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg' 04:54:35 + cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:35 ################################################################################ 04:54:35 # 04:54:35 # Licensed to the Apache Software Foundation (ASF) under one or more 04:54:35 # contributor license agreements. See the NOTICE file distributed with 04:54:35 # this work for additional information regarding copyright ownership. 04:54:35 # The ASF licenses this file to You under the Apache License, Version 2.0 04:54:35 # (the "License"); you may not use this file except in compliance with 04:54:35 # the License. You may obtain a copy of the License at 04:54:35 # 04:54:35 # http://www.apache.org/licenses/LICENSE-2.0 04:54:35 # 04:54:35 # Unless required by applicable law or agreed to in writing, software 04:54:35 # distributed under the License is distributed on an "AS IS" BASIS, 04:54:35 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:54:35 # See the License for the specific language governing permissions and 04:54:35 # limitations under the License. 
04:54:35 # 04:54:35 ################################################################################ 04:54:35 04:54:35 # Common pattern layout for appenders 04:54:35 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 04:54:35 04:54:35 # Root logger 04:54:35 log4j2.rootLogger.level = INFO 04:54:35 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 04:54:35 #log4j2.rootLogger.type = asyncRoot 04:54:35 #log4j2.rootLogger.includeLocation = false 04:54:35 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 04:54:35 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 04:54:35 log4j2.rootLogger.appenderRef.Console.ref = Console 04:54:35 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 04:54:35 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 04:54:35 04:54:35 # Filters for logs marked by org.opendaylight.odlparent.Markers 04:54:35 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 04:54:35 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 04:54:35 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 04:54:35 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 04:54:35 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 04:54:35 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 04:54:35 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 04:54:35 04:54:35 # Loggers configuration 04:54:35 04:54:35 # Spifly logger 04:54:35 log4j2.logger.spifly.name = org.apache.aries.spifly 04:54:35 log4j2.logger.spifly.level = WARN 04:54:35 04:54:35 # Security audit logger 04:54:35 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 04:54:35 log4j2.logger.audit.level = INFO 04:54:35 log4j2.logger.audit.additivity = false 04:54:35 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 04:54:35 04:54:35 # Appenders configuration 04:54:35 04:54:35 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 04:54:35 log4j2.appender.console.type = Console 04:54:35 log4j2.appender.console.name = Console 04:54:35 log4j2.appender.console.layout.type = PatternLayout 04:54:35 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 04:54:35 04:54:35 # Rolling file appender 04:54:35 log4j2.appender.rolling.type = RollingRandomAccessFile 04:54:35 log4j2.appender.rolling.name = RollingFile 04:54:35 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 04:54:35 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 04:54:35 # uncomment to not force a disk flush 04:54:35 #log4j2.appender.rolling.immediateFlush = false 04:54:35 log4j2.appender.rolling.append = true 04:54:35 log4j2.appender.rolling.layout.type = PatternLayout 04:54:35 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 04:54:35 log4j2.appender.rolling.policies.type = Policies 04:54:35 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 04:54:35 log4j2.appender.rolling.policies.size.size = 1GB 04:54:35 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 04:54:35 log4j2.appender.rolling.strategy.max = 7 04:54:35 04:54:35 # Audit file appender 04:54:35 log4j2.appender.audit.type = RollingRandomAccessFile 04:54:35 log4j2.appender.audit.name = AuditRollingFile 
04:54:35 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 04:54:35 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 04:54:35 log4j2.appender.audit.append = true 04:54:35 log4j2.appender.audit.layout.type = PatternLayout 04:54:35 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 04:54:35 log4j2.appender.audit.policies.type = Policies 04:54:35 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 04:54:35 log4j2.appender.audit.policies.size.size = 8MB 04:54:35 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 04:54:35 log4j2.appender.audit.strategy.max = 7 04:54:35 04:54:35 # OSGi appender 04:54:35 log4j2.appender.osgi.type = PaxOsgi 04:54:35 log4j2.appender.osgi.name = PaxOsgi 04:54:35 log4j2.appender.osgi.filter = * 04:54:35 04:54:35 # help with identification of maven-related problems with pax-url-aether 04:54:35 #log4j2.logger.aether.name = shaded.org.eclipse.aether 04:54:35 #log4j2.logger.aether.level = TRACE 04:54:35 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 04:54:35 #log4j2.logger.http-headers.level = DEBUG 04:54:35 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 04:54:35 #log4j2.logger.maven.level = TRACE 04:54:35 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 04:54:35 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 04:54:35 Configure 04:54:35 + set_java_vars /usr/lib/jvm/java-21-openjdk-amd64 2048m /tmp/karaf-0.22.1/bin/setenv 04:54:35 + local -r java_home=/usr/lib/jvm/java-21-openjdk-amd64 04:54:35 + local -r controllermem=2048m 04:54:35 + local -r memconf=/tmp/karaf-0.22.1/bin/setenv 04:54:35 + echo Configure 04:54:35 java home: /usr/lib/jvm/java-21-openjdk-amd64 04:54:35 max memory: 2048m 04:54:35 memconf: /tmp/karaf-0.22.1/bin/setenv 04:54:35 + echo ' java home: /usr/lib/jvm/java-21-openjdk-amd64' 04:54:35 + echo ' max memory: 2048m' 04:54:35 + echo ' memconf: /tmp/karaf-0.22.1/bin/setenv' 04:54:35 + sed -ie 's%^# export JAVA_HOME%export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64}%g' /tmp/karaf-0.22.1/bin/setenv 04:54:35 + sed -ie 's/JAVA_MAX_MEM="2048m"/JAVA_MAX_MEM=2048m/g' /tmp/karaf-0.22.1/bin/setenv 04:54:35 cat /tmp/karaf-0.22.1/bin/setenv 04:54:35 + echo 'cat /tmp/karaf-0.22.1/bin/setenv' 04:54:35 + cat /tmp/karaf-0.22.1/bin/setenv 04:54:35 #!/bin/sh 04:54:35 # 04:54:35 # Licensed to the Apache Software Foundation (ASF) under one or more 04:54:35 # contributor license agreements. See the NOTICE file distributed with 04:54:35 # this work for additional information regarding copyright ownership. 04:54:35 # The ASF licenses this file to You under the Apache License, Version 2.0 04:54:35 # (the "License"); you may not use this file except in compliance with 04:54:35 # the License. You may obtain a copy of the License at 04:54:35 # 04:54:35 # http://www.apache.org/licenses/LICENSE-2.0 04:54:35 # 04:54:35 # Unless required by applicable law or agreed to in writing, software 04:54:35 # distributed under the License is distributed on an "AS IS" BASIS, 04:54:35 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:54:35 # See the License for the specific language governing permissions and 04:54:35 # limitations under the License. 
04:54:35 # 04:54:35 04:54:35 # 04:54:35 # handle specific scripts; the SCRIPT_NAME is exactly the name of the Karaf 04:54:35 # script: client, instance, shell, start, status, stop, karaf 04:54:35 # 04:54:35 # if [ "${KARAF_SCRIPT}" == "SCRIPT_NAME" ]; then 04:54:35 # Actions go here... 04:54:35 # fi 04:54:35 04:54:35 # 04:54:35 # general settings which should be applied for all scripts go here; please keep 04:54:35 # in mind that it is possible that scripts might be executed more than once, e.g. 04:54:35 # in example of the start script where the start script is executed first and the 04:54:35 # karaf script afterwards. 04:54:35 # 04:54:35 04:54:35 # 04:54:35 # The following section shows the possible configuration options for the default 04:54:35 # karaf scripts 04:54:35 # 04:54:35 export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64} # Location of Java installation 04:54:35 # export JAVA_OPTS # Generic JVM options, for instance, where you can pass the memory configuration 04:54:35 # export JAVA_NON_DEBUG_OPTS # Additional non-debug JVM options 04:54:35 # export EXTRA_JAVA_OPTS # Additional JVM options 04:54:35 # export KARAF_HOME # Karaf home folder 04:54:35 # export KARAF_DATA # Karaf data folder 04:54:35 # export KARAF_BASE # Karaf base folder 04:54:35 # export KARAF_ETC # Karaf etc folder 04:54:35 # export KARAF_LOG # Karaf log folder 04:54:35 # export KARAF_SYSTEM_OPTS # First citizen Karaf options 04:54:35 # export KARAF_OPTS # Additional available Karaf options 04:54:35 # export KARAF_DEBUG # Enable debug mode 04:54:35 # export KARAF_REDIRECT # Enable/set the std/err redirection when using bin/start 04:54:35 # export KARAF_NOROOT # Prevent execution as root if set to true 04:54:35 Set Java version 04:54:35 + echo 'Set Java version' 04:54:35 + sudo /usr/sbin/alternatives --install /usr/bin/java java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 1 04:54:35 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 04:54:35 sudo: a password is required 04:54:35 + sudo /usr/sbin/alternatives --set java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:54:35 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 04:54:35 sudo: a password is required 04:54:35 JDK default version ... 04:54:35 + echo 'JDK default version ...' 04:54:35 + java -version 04:54:36 openjdk version "21.0.5" 2024-10-15 04:54:36 OpenJDK Runtime Environment (build 21.0.5+11-Ubuntu-1ubuntu122.04) 04:54:36 OpenJDK 64-Bit Server VM (build 21.0.5+11-Ubuntu-1ubuntu122.04, mixed mode, sharing) 04:54:36 Set JAVA_HOME 04:54:36 + echo 'Set JAVA_HOME' 04:54:36 + export JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 04:54:36 + JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 04:54:36 ++ readlink -e /usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:54:36 Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:54:36 Listing all open ports on controller system... 04:54:36 + JAVA_RESOLVED=/usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:54:36 + echo 'Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java' 04:54:36 + echo 'Listing all open ports on controller system...' 
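Two more local edits complete the node preparation seen above: configure_karaf_log raises the rolling karaf.log size limit from 64MB to 1GB (and appends a yangtools parser logger override, visible in the dumped file), and set_java_vars points bin/setenv at the requested JDK and heap size. The sudo alternatives calls fail without a password, but the subsequent java -version already reports OpenJDK 21, so the system default matches JAVA_HOME on this node. A parameterized restatement of the sed edits from the trace:

    # Sketch: the log-size and setenv edits performed by configure_karaf_log / set_java_vars above.
    logconf=/tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg
    memconf=/tmp/karaf-0.22.1/bin/setenv
    java_home=/usr/lib/jvm/java-21-openjdk-amd64
    controllermem=2048m

    # Let karaf.log grow to 1GB per rolled file instead of the 64MB default.
    sed -ie 's/log4j2.appender.rolling.policies.size.size = 64MB/log4j2.appender.rolling.policies.size.size = 1GB/g' "${logconf}"

    # Uncomment JAVA_HOME in setenv and point it at the requested JDK (still overridable at runtime).
    sed -ie "s%^# export JAVA_HOME%export JAVA_HOME=\${JAVA_HOME:-${java_home}}%g" "${memconf}"

    # Set the maximum heap for the controller JVM.
    sed -ie "s/JAVA_MAX_MEM=\"2048m\"/JAVA_MAX_MEM=${controllermem}/g" "${memconf}"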
04:54:36 + netstat -pnatu 04:54:36 /tmp/configuration-script.sh: line 40: netstat: command not found 04:54:36 Configuring cluster 04:54:36 + '[' -f /tmp/custom_shard_config.txt ']' 04:54:36 + echo 'Configuring cluster' 04:54:36 + /tmp/karaf-0.22.1/bin/configure_cluster.sh 1 10.30.170.73 10.30.171.201 10.30.170.175 04:54:36 ################################################ 04:54:36 ## Configure Cluster ## 04:54:36 ################################################ 04:54:36 ERROR: Cluster configurations files not found. Please configure clustering feature. 04:54:36 Dump pekko.conf 04:54:36 + echo 'Dump pekko.conf' 04:54:36 + cat /tmp/karaf-0.22.1/configuration/initial/pekko.conf 04:54:36 cat: /tmp/karaf-0.22.1/configuration/initial/pekko.conf: No such file or directory 04:54:36 Dump modules.conf 04:54:36 + echo 'Dump modules.conf' 04:54:36 + cat /tmp/karaf-0.22.1/configuration/initial/modules.conf 04:54:36 cat: /tmp/karaf-0.22.1/configuration/initial/modules.conf: No such file or directory 04:54:36 Dump module-shards.conf 04:54:36 + echo 'Dump module-shards.conf' 04:54:36 + cat /tmp/karaf-0.22.1/configuration/initial/module-shards.conf 04:54:36 cat: /tmp/karaf-0.22.1/configuration/initial/module-shards.conf: No such file or directory 04:54:36 Configuring member-2 with IP address 10.30.171.201 04:54:36 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 04:54:36 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 04:54:36 + source /tmp/common-functions.sh karaf-0.22.1 titanium 04:54:36 ++ [[ /tmp/common-functions.sh == \/\t\m\p\/\c\o\n\f\i\g\u\r\a\t\i\o\n\-\s\c\r\i\p\t\.\s\h ]] 04:54:36 ++ echo 'common-functions.sh is being sourced' 04:54:36 common-functions.sh is being sourced 04:54:36 ++ BUNDLEFOLDER=karaf-0.22.1 04:54:36 ++ DISTROSTREAM=titanium 04:54:36 ++ export MAVENCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 04:54:36 ++ MAVENCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 04:54:36 ++ export FEATURESCONF=/tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:36 ++ FEATURESCONF=/tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:36 ++ export CUSTOMPROP=/tmp/karaf-0.22.1/etc/custom.properties 04:54:36 ++ CUSTOMPROP=/tmp/karaf-0.22.1/etc/custom.properties 04:54:36 ++ export LOGCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:36 ++ LOGCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:36 ++ export MEMCONF=/tmp/karaf-0.22.1/bin/setenv 04:54:36 ++ MEMCONF=/tmp/karaf-0.22.1/bin/setenv 04:54:36 ++ export CONTROLLERMEM= 04:54:36 ++ CONTROLLERMEM= 04:54:36 ++ case "${DISTROSTREAM}" in 04:54:36 ++ CLUSTER_SYSTEM=pekko 04:54:36 ++ export AKKACONF=/tmp/karaf-0.22.1/configuration/initial/pekko.conf 04:54:36 ++ AKKACONF=/tmp/karaf-0.22.1/configuration/initial/pekko.conf 04:54:36 ++ export MODULESCONF=/tmp/karaf-0.22.1/configuration/initial/modules.conf 04:54:36 ++ MODULESCONF=/tmp/karaf-0.22.1/configuration/initial/modules.conf 04:54:36 ++ export MODULESHARDSCONF=/tmp/karaf-0.22.1/configuration/initial/module-shards.conf 04:54:36 ++ MODULESHARDSCONF=/tmp/karaf-0.22.1/configuration/initial/module-shards.conf 04:54:36 ++ print_common_env 04:54:36 ++ cat 04:54:36 common-functions environment: 04:54:36 MAVENCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 04:54:36 ACTUALFEATURES: 04:54:36 FEATURESCONF: /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:36 CUSTOMPROP: /tmp/karaf-0.22.1/etc/custom.properties 04:54:36 LOGCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:36 MEMCONF: 
/tmp/karaf-0.22.1/bin/setenv 04:54:36 CONTROLLERMEM: 04:54:36 AKKACONF: /tmp/karaf-0.22.1/configuration/initial/pekko.conf 04:54:36 MODULESCONF: /tmp/karaf-0.22.1/configuration/initial/modules.conf 04:54:36 MODULESHARDSCONF: /tmp/karaf-0.22.1/configuration/initial/module-shards.conf 04:54:36 SUITES: 04:54:36 04:54:36 ++ SSH='ssh -t -t' 04:54:36 ++ extra_services_cntl=' dnsmasq.service httpd.service libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service rabbitmq-server.service ' 04:54:36 ++ extra_services_cmp=' libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service ' 04:54:36 Changing to /tmp 04:54:36 Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9182/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 04:54:36 + echo 'Changing to /tmp' 04:54:36 + cd /tmp 04:54:36 + echo 'Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9182/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip' 04:54:36 + wget --progress=dot:mega https://nexus.opendaylight.org/content/repositories//autorelease-9182/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 04:54:36 --2025-09-13 04:54:36-- https://nexus.opendaylight.org/content/repositories//autorelease-9182/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 04:54:36 Resolving nexus.opendaylight.org (nexus.opendaylight.org)... 199.204.45.87, 2604:e100:1:0:f816:3eff:fe45:48d6 04:54:36 Connecting to nexus.opendaylight.org (nexus.opendaylight.org)|199.204.45.87|:443... connected. 04:54:36 HTTP request sent, awaiting response... 200 OK 04:54:36 Length: 236634504 (226M) [application/zip] 04:54:36 Saving to: ‘karaf-0.22.1.zip’ 04:54:36 04:54:36 0K ........ ........ ........ ........ ........ ........ 1% 66.4M 3s 04:54:36 3072K ........ ........ ........ ........ ........ ........ 2% 86.7M 3s 04:54:36 6144K ........ ........ ........ ........ ........ ........ 3% 141M 2s 04:54:36 9216K ........ ........ ........ ........ ........ ........ 5% 158M 2s 04:54:36 12288K ........ ........ ........ ........ ........ ........ 6% 125M 2s 04:54:36 15360K ........ ........ ........ ........ ........ ........ 7% 184M 2s 04:54:36 18432K ........ ........ ........ ........ ........ ........ 9% 215M 2s 04:54:36 21504K ........ ........ ........ ........ ........ ........ 10% 247M 2s 04:54:37 24576K ........ ........ ........ ........ ........ ........ 11% 167M 2s 04:54:37 27648K ........ ........ ........ ........ ........ ........ 13% 221M 1s 04:54:37 30720K ........ ........ ........ ........ ........ ........ 14% 217M 1s 04:54:37 33792K ........ ........ ........ ........ ........ ........ 15% 205M 1s 04:54:37 36864K ........ ........ ........ ........ ........ ........ 17% 121M 1s 04:54:37 39936K ........ ........ ........ ........ ........ ........ 18% 208M 1s 04:54:37 43008K ........ ........ ........ ........ ........ ........ 19% 178M 1s 04:54:37 46080K ........ ........ ........ ........ ........ ........ 21% 164M 1s 04:54:37 49152K ........ ........ ........ ........ ........ ........ 22% 157M 1s 04:54:37 52224K ........ ........ ........ ........ ........ ........ 23% 125M 1s 04:54:37 55296K ........ ........ ........ ........ ........ ........ 25% 132M 1s 04:54:37 58368K ........ ........ ........ ........ ........ ........ 26% 291M 1s 04:54:37 61440K ........ ........ ........ ........ ........ ........ 27% 284M 1s 04:54:37 64512K ........ ........ ........ ........ ........ ........ 
29% 278M 1s 04:54:37 67584K ........ ........ ........ ........ ........ ........ 30% 313M 1s 04:54:37 70656K ........ ........ ........ ........ ........ ........ 31% 311M 1s 04:54:37 73728K ........ ........ ........ ........ ........ ........ 33% 324M 1s 04:54:37 76800K ........ ........ ........ ........ ........ ........ 34% 297M 1s 04:54:37 79872K ........ ........ ........ ........ ........ ........ 35% 311M 1s 04:54:37 82944K ........ ........ ........ ........ ........ ........ 37% 338M 1s 04:54:37 86016K ........ ........ ........ ........ ........ ........ 38% 321M 1s 04:54:37 89088K ........ ........ ........ ........ ........ ........ 39% 346M 1s 04:54:37 92160K ........ ........ ........ ........ ........ ........ 41% 336M 1s 04:54:37 95232K ........ ........ ........ ........ ........ ........ 42% 345M 1s 04:54:37 98304K ........ ........ ........ ........ ........ ........ 43% 340M 1s 04:54:37 101376K ........ ........ ........ ........ ........ ........ 45% 341M 1s 04:54:37 104448K ........ ........ ........ ........ ........ ........ 46% 325M 1s 04:54:37 107520K ........ ........ ........ ........ ........ ........ 47% 333M 1s 04:54:37 110592K ........ ........ ........ ........ ........ ........ 49% 324M 1s 04:54:37 113664K ........ ........ ........ ........ ........ ........ 50% 334M 1s 04:54:37 116736K ........ ........ ........ ........ ........ ........ 51% 335M 1s 04:54:37 119808K ........ ........ ........ ........ ........ ........ 53% 228M 1s 04:54:37 122880K ........ ........ ........ ........ ........ ........ 54% 182M 1s 04:54:37 125952K ........ ........ ........ ........ ........ ........ 55% 234M 0s 04:54:37 129024K ........ ........ ........ ........ ........ ........ 57% 207M 0s 04:54:37 132096K ........ ........ ........ ........ ........ ........ 58% 181M 0s 04:54:37 135168K ........ ........ ........ ........ ........ ........ 59% 278M 0s 04:54:37 138240K ........ ........ ........ ........ ........ ........ 61% 365M 0s 04:54:37 141312K ........ ........ ........ ........ ........ ........ 62% 279M 0s 04:54:37 144384K ........ ........ ........ ........ ........ ........ 63% 303M 0s 04:54:37 147456K ........ ........ ........ ........ ........ ........ 65% 309M 0s 04:54:37 150528K ........ ........ ........ ........ ........ ........ 66% 301M 0s 04:54:37 153600K ........ ........ ........ ........ ........ ........ 67% 311M 0s 04:54:37 156672K ........ ........ ........ ........ ........ ........ 69% 310M 0s 04:54:37 159744K ........ ........ ........ ........ ........ ........ 70% 300M 0s 04:54:37 162816K ........ ........ ........ ........ ........ ........ 71% 320M 0s 04:54:37 165888K ........ ........ ........ ........ ........ ........ 73% 315M 0s 04:54:37 168960K ........ ........ ........ ........ ........ ........ 74% 329M 0s 04:54:37 172032K ........ ........ ........ ........ ........ ........ 75% 333M 0s 04:54:37 175104K ........ ........ ........ ........ ........ ........ 77% 339M 0s 04:54:37 178176K ........ ........ ........ ........ ........ ........ 78% 316M 0s 04:54:37 181248K ........ ........ ........ ........ ........ ........ 79% 254M 0s 04:54:37 184320K ........ ........ ........ ........ ........ ........ 81% 338M 0s 04:54:37 187392K ........ ........ ........ ........ ........ ........ 82% 297M 0s 04:54:37 190464K ........ ........ ........ ........ ........ ........ 83% 291M 0s 04:54:37 193536K ........ ........ ........ ........ ........ ........ 85% 284M 0s 04:54:37 196608K ........ ........ ........ ........ ........ 
........ 86% 315M 0s 04:54:37 199680K ........ ........ ........ ........ ........ ........ 87% 296M 0s 04:54:37 202752K ........ ........ ........ ........ ........ ........ 89% 279M 0s 04:54:37 205824K ........ ........ ........ ........ ........ ........ 90% 293M 0s 04:54:37 208896K ........ ........ ........ ........ ........ ........ 91% 259M 0s 04:54:37 211968K ........ ........ ........ ........ ........ ........ 93% 285M 0s 04:54:37 215040K ........ ........ ........ ........ ........ ........ 94% 241M 0s 04:54:37 218112K ........ ........ ........ ........ ........ ........ 95% 290M 0s 04:54:37 221184K ........ ........ ........ ........ ........ ........ 97% 279M 0s 04:54:37 224256K ........ ........ ........ ........ ........ ........ 98% 282M 0s 04:54:37 227328K ........ ........ ........ ........ ........ ........ 99% 264M 0s 04:54:37 230400K ........ .. 100% 294M=1.0s 04:54:37 04:54:37 2025-09-13 04:54:37 (234 MB/s) - ‘karaf-0.22.1.zip’ saved [236634504/236634504] 04:54:37 04:54:37 Extracting the new controller... 04:54:37 + echo 'Extracting the new controller...' 04:54:37 + unzip -q karaf-0.22.1.zip 04:54:39 Adding external repositories... 04:54:39 + echo 'Adding external repositories...' 04:54:39 + sed -ie 's%org.ops4j.pax.url.mvn.repositories=%org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases%g' /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 04:54:39 + cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 04:54:39 ################################################################################ 04:54:39 # 04:54:39 # Licensed to the Apache Software Foundation (ASF) under one or more 04:54:39 # contributor license agreements. See the NOTICE file distributed with 04:54:39 # this work for additional information regarding copyright ownership. 04:54:39 # The ASF licenses this file to You under the Apache License, Version 2.0 04:54:39 # (the "License"); you may not use this file except in compliance with 04:54:39 # the License. You may obtain a copy of the License at 04:54:39 # 04:54:39 # http://www.apache.org/licenses/LICENSE-2.0 04:54:39 # 04:54:39 # Unless required by applicable law or agreed to in writing, software 04:54:39 # distributed under the License is distributed on an "AS IS" BASIS, 04:54:39 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:54:39 # See the License for the specific language governing permissions and 04:54:39 # limitations under the License. 
04:54:39 # 04:54:39 ################################################################################ 04:54:39 04:54:39 # 04:54:39 # If set to true, the following property will not allow any certificate to be used 04:54:39 # when accessing Maven repositories through SSL 04:54:39 # 04:54:39 #org.ops4j.pax.url.mvn.certificateCheck= 04:54:39 04:54:39 # 04:54:39 # Path to the local Maven settings file. 04:54:39 # The repositories defined in this file will be automatically added to the list 04:54:39 # of default repositories if the 'org.ops4j.pax.url.mvn.repositories' property 04:54:39 # below is not set. 04:54:39 # The following locations are checked for the existence of the settings.xml file 04:54:39 # * 1. looks for the specified url 04:54:39 # * 2. if not found looks for ${user.home}/.m2/settings.xml 04:54:39 # * 3. if not found looks for ${maven.home}/conf/settings.xml 04:54:39 # * 4. if not found looks for ${M2_HOME}/conf/settings.xml 04:54:39 # 04:54:39 #org.ops4j.pax.url.mvn.settings= 04:54:39 04:54:39 # 04:54:39 # Path to the local Maven repository which is used to avoid downloading 04:54:39 # artifacts when they already exist locally. 04:54:39 # The value of this property will be extracted from the settings.xml file 04:54:39 # above, or defaulted to: 04:54:39 # System.getProperty( "user.home" ) + "/.m2/repository" 04:54:39 # 04:54:39 org.ops4j.pax.url.mvn.localRepository=${karaf.home}/${karaf.default.repository} 04:54:39 04:54:39 # 04:54:39 # Default this to false. It's just weird to use undocumented repos 04:54:39 # 04:54:39 org.ops4j.pax.url.mvn.useFallbackRepositories=false 04:54:39 04:54:39 # 04:54:39 # Uncomment if you don't wanna use the proxy settings 04:54:39 # from the Maven conf/settings.xml file 04:54:39 # 04:54:39 # org.ops4j.pax.url.mvn.proxySupport=false 04:54:39 04:54:39 # 04:54:39 # Comma separated list of repositories scanned when resolving an artifact. 04:54:39 # Those repositories will be checked before iterating through the 04:54:39 # below list of repositories and even before the local repository 04:54:39 # A repository url can be appended with zero or more of the following flags: 04:54:39 # @snapshots : the repository contains snaphots 04:54:39 # @noreleases : the repository does not contain any released artifacts 04:54:39 # 04:54:39 # The following property value will add the system folder as a repo. 04:54:39 # 04:54:39 org.ops4j.pax.url.mvn.defaultRepositories=\ 04:54:39 file:${karaf.home}/${karaf.default.repository}@id=system.repository@snapshots,\ 04:54:39 file:${karaf.data}/kar@id=kar.repository@multi@snapshots,\ 04:54:39 file:${karaf.base}/${karaf.default.repository}@id=child.system.repository@snapshots 04:54:39 04:54:39 # Use the default local repo (e.g.~/.m2/repository) as a "remote" repo 04:54:39 #org.ops4j.pax.url.mvn.defaultLocalRepoAsRemote=false 04:54:39 04:54:39 # 04:54:39 # Comma separated list of repositories scanned when resolving an artifact. 
04:54:39 # The default list includes the following repositories: 04:54:39 # http://repo1.maven.org/maven2@id=central 04:54:39 # http://repository.springsource.com/maven/bundles/release@id=spring.ebr 04:54:39 # http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external 04:54:39 # http://zodiac.springsource.com/maven/bundles/release@id=gemini 04:54:39 # http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases 04:54:39 # https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases 04:54:39 # https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 04:54:39 # To add repositories to the default ones, prepend '+' to the list of repositories 04:54:39 # to add. 04:54:39 # A repository url can be appended with zero or more of the following flags: 04:54:39 # @snapshots : the repository contains snapshots 04:54:39 # @noreleases : the repository does not contain any released artifacts 04:54:39 # @id=repository.id : the id for the repository, just like in the settings.xml this is optional but recommended 04:54:39 # 04:54:39 org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 04:54:39 04:54:39 ### ^^^ No remote repositories. This is the only ODL change compared to Karaf defaults.Configuring the startup features... 04:54:39 + [[ True == \T\r\u\e ]] 04:54:39 + echo 'Configuring the startup features...' 04:54:39 + sed -ie 's/\(featuresBoot=\|featuresBoot =\)/featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer,/g' /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:39 ################################################################################ 04:54:39 # 04:54:39 # Licensed to the Apache Software Foundation (ASF) under one or more 04:54:39 # contributor license agreements. See the NOTICE file distributed with 04:54:39 # this work for additional information regarding copyright ownership. 04:54:39 # The ASF licenses this file to You under the Apache License, Version 2.0 04:54:39 # (the "License"); you may not use this file except in compliance with 04:54:39 # the License. You may obtain a copy of the License at 04:54:39 # 04:54:39 # http://www.apache.org/licenses/LICENSE-2.0 04:54:39 # 04:54:39 # Unless required by applicable law or agreed to in writing, software 04:54:39 # distributed under the License is distributed on an "AS IS" BASIS, 04:54:39 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:54:39 # See the License for the specific language governing permissions and 04:54:39 # limitations under the License. 
04:54:39 # 04:54:39 ################################################################################ 04:54:39 04:54:39 # 04:54:39 # Comma separated list of features repositories to register by default 04:54:39 # 04:54:39 featuresRepositories = mvn:org.opendaylight.integration/features-test/0.22.1/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features, file:${karaf.etc}/d7b3d213-29db-4534-93a0-031e5065da16.xml 04:54:39 04:54:39 # 04:54:39 # Comma separated list of features to install at startup 04:54:39 # 04:54:39 featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer, 510b7ed9-e8dc-40d4-b97c-ab7891511bec 04:54:39 04:54:39 # 04:54:39 # Resource repositories (OBR) that the features resolver can use 04:54:39 # to resolve requirements/capabilities 04:54:39 # 04:54:39 # The format of the resourceRepositories is 04:54:39 # resourceRepositories=[xml:url|json:url],... 04:54:39 # for Instance: 04:54:39 # 04:54:39 #resourceRepositories=xml:http://host/path/to/index.xml 04:54:39 # or 04:54:39 #resourceRepositories=json:http://host/path/to/index.json 04:54:39 # 04:54:39 04:54:39 # 04:54:39 # Defines if the boot features are started in asynchronous mode (in a dedicated thread) 04:54:39 # 04:54:39 featuresBootAsynchronous=false 04:54:39 04:54:39 # 04:54:39 # Service requirements enforcement 04:54:39 # 04:54:39 # By default, the feature resolver checks the service requirements/capabilities of 04:54:39 # bundles for new features (xml schema >= 1.3.0) in order to automatically installs 04:54:39 # the required bundles. 04:54:39 # The following flag can have those values: 04:54:39 # - disable: service requirements are completely ignored 04:54:39 # - default: service requirements are ignored for old features 04:54:39 # - enforce: service requirements are always verified 04:54:39 # 04:54:39 #serviceRequirements=default 04:54:39 04:54:39 # 04:54:39 # Store cfg file for config element in feature 04:54:39 # 04:54:39 #configCfgStore=true 04:54:39 04:54:39 # 04:54:39 # Define if the feature service automatically refresh bundles 04:54:39 # 04:54:39 autoRefresh=true 04:54:39 04:54:39 # 04:54:39 # Configuration of features processing mechanism (overrides, blacklisting, modification of features) 04:54:39 # XML file defines instructions related to features processing 04:54:39 # versions.properties may declare properties to resolve placeholders in XML file 04:54:39 # both files are relative to ${karaf.etc} 04:54:39 # 04:54:39 #featureProcessing=org.apache.karaf.features.xml 04:54:39 #featureProcessingVersions=versions.properties 04:54:39 + FEATURE_TEST_STRING=features-test 04:54:39 + FEATURE_TEST_VERSION=0.22.1 04:54:39 + KARAF_VERSION=karaf4 04:54:39 + [[ integration == \i\n\t\e\g\r\a\t\i\o\n ]] 04:54:39 + sed -ie 's%\(featuresRepositories=\|featuresRepositories =\)%featuresRepositories = mvn:org.opendaylight.integration/features-test/0.22.1/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features,%g' /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:39 + [[ ! 
-z '' ]] 04:54:39 + cat /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:39 + configure_karaf_log karaf4 '' 04:54:39 + local -r karaf_version=karaf4 04:54:39 + local -r controllerdebugmap= 04:54:39 + local logapi=log4j 04:54:39 + grep log4j2 /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:39 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 04:54:39 log4j2.rootLogger.level = INFO 04:54:39 #log4j2.rootLogger.type = asyncRoot 04:54:39 #log4j2.rootLogger.includeLocation = false 04:54:39 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 04:54:39 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 04:54:39 log4j2.rootLogger.appenderRef.Console.ref = Console 04:54:39 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 04:54:39 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 04:54:39 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 04:54:39 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 04:54:39 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 04:54:39 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 04:54:39 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 04:54:39 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 04:54:39 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 04:54:39 log4j2.logger.spifly.name = org.apache.aries.spifly 04:54:39 log4j2.logger.spifly.level = WARN 04:54:39 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 04:54:39 log4j2.logger.audit.level = INFO 04:54:39 log4j2.logger.audit.additivity = false 04:54:39 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 04:54:39 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 04:54:39 log4j2.appender.console.type = Console 04:54:39 log4j2.appender.console.name = Console 04:54:39 log4j2.appender.console.layout.type = PatternLayout 04:54:39 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 04:54:39 log4j2.appender.rolling.type = RollingRandomAccessFile 04:54:39 log4j2.appender.rolling.name = RollingFile 04:54:39 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 04:54:39 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 04:54:39 #log4j2.appender.rolling.immediateFlush = false 04:54:39 log4j2.appender.rolling.append = true 04:54:39 log4j2.appender.rolling.layout.type = PatternLayout 04:54:39 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 04:54:39 log4j2.appender.rolling.policies.type = Policies 04:54:39 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 04:54:39 log4j2.appender.rolling.policies.size.size = 64MB 04:54:39 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 04:54:39 log4j2.appender.rolling.strategy.max = 7 04:54:39 log4j2.appender.audit.type = RollingRandomAccessFile 04:54:39 log4j2.appender.audit.name = AuditRollingFile 04:54:39 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 04:54:39 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 04:54:39 log4j2.appender.audit.append = true 04:54:39 log4j2.appender.audit.layout.type = PatternLayout 04:54:39 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 04:54:39 log4j2.appender.audit.policies.type = 
Policies 04:54:39 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 04:54:39 log4j2.appender.audit.policies.size.size = 8MB 04:54:39 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 04:54:39 log4j2.appender.audit.strategy.max = 7 04:54:39 log4j2.appender.osgi.type = PaxOsgi 04:54:39 log4j2.appender.osgi.name = PaxOsgi 04:54:39 log4j2.appender.osgi.filter = * 04:54:39 #log4j2.logger.aether.name = shaded.org.eclipse.aether 04:54:39 #log4j2.logger.aether.level = TRACE 04:54:39 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 04:54:39 #log4j2.logger.http-headers.level = DEBUG 04:54:39 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 04:54:39 #log4j2.logger.maven.level = TRACE 04:54:39 + logapi=log4j2 04:54:39 + echo 'Configuring the karaf log... karaf_version: karaf4, logapi: log4j2' 04:54:39 Configuring the karaf log... karaf_version: karaf4, logapi: log4j2 04:54:39 + '[' log4j2 == log4j2 ']' 04:54:39 + sed -ie 's/log4j2.appender.rolling.policies.size.size = 64MB/log4j2.appender.rolling.policies.size.size = 1GB/g' /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:39 + orgmodule=org.opendaylight.yangtools.yang.parser.repo.YangTextSchemaContextResolver 04:54:39 + orgmodule_=org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver 04:54:39 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN' 04:54:39 controllerdebugmap: 04:54:39 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN' 04:54:39 + unset IFS 04:54:39 + echo 'controllerdebugmap: ' 04:54:39 cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:39 + '[' -n '' ']' 04:54:39 + echo 'cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg' 04:54:39 + cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:39 ################################################################################ 04:54:39 # 04:54:39 # Licensed to the Apache Software Foundation (ASF) under one or more 04:54:39 # contributor license agreements. See the NOTICE file distributed with 04:54:39 # this work for additional information regarding copyright ownership. 04:54:39 # The ASF licenses this file to You under the Apache License, Version 2.0 04:54:39 # (the "License"); you may not use this file except in compliance with 04:54:39 # the License. You may obtain a copy of the License at 04:54:39 # 04:54:39 # http://www.apache.org/licenses/LICENSE-2.0 04:54:39 # 04:54:39 # Unless required by applicable law or agreed to in writing, software 04:54:39 # distributed under the License is distributed on an "AS IS" BASIS, 04:54:39 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:54:39 # See the License for the specific language governing permissions and 04:54:39 # limitations under the License. 
04:54:39 # 04:54:39 ################################################################################ 04:54:39 04:54:39 # Common pattern layout for appenders 04:54:39 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 04:54:39 04:54:39 # Root logger 04:54:39 log4j2.rootLogger.level = INFO 04:54:39 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 04:54:39 #log4j2.rootLogger.type = asyncRoot 04:54:39 #log4j2.rootLogger.includeLocation = false 04:54:39 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 04:54:39 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 04:54:39 log4j2.rootLogger.appenderRef.Console.ref = Console 04:54:39 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 04:54:39 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 04:54:39 04:54:39 # Filters for logs marked by org.opendaylight.odlparent.Markers 04:54:39 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 04:54:39 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 04:54:39 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 04:54:39 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 04:54:39 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 04:54:39 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 04:54:39 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 04:54:39 04:54:39 # Loggers configuration 04:54:39 04:54:39 # Spifly logger 04:54:39 log4j2.logger.spifly.name = org.apache.aries.spifly 04:54:39 log4j2.logger.spifly.level = WARN 04:54:39 04:54:39 # Security audit logger 04:54:39 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 04:54:39 log4j2.logger.audit.level = INFO 04:54:39 log4j2.logger.audit.additivity = false 04:54:39 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 04:54:39 04:54:39 # Appenders configuration 04:54:39 04:54:39 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 04:54:39 log4j2.appender.console.type = Console 04:54:39 log4j2.appender.console.name = Console 04:54:39 log4j2.appender.console.layout.type = PatternLayout 04:54:39 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 04:54:39 04:54:39 # Rolling file appender 04:54:39 log4j2.appender.rolling.type = RollingRandomAccessFile 04:54:39 log4j2.appender.rolling.name = RollingFile 04:54:39 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 04:54:39 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 04:54:39 # uncomment to not force a disk flush 04:54:39 #log4j2.appender.rolling.immediateFlush = false 04:54:39 log4j2.appender.rolling.append = true 04:54:39 log4j2.appender.rolling.layout.type = PatternLayout 04:54:39 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 04:54:39 log4j2.appender.rolling.policies.type = Policies 04:54:39 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 04:54:39 log4j2.appender.rolling.policies.size.size = 1GB 04:54:39 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 04:54:39 log4j2.appender.rolling.strategy.max = 7 04:54:39 04:54:39 # Audit file appender 04:54:39 log4j2.appender.audit.type = RollingRandomAccessFile 04:54:39 log4j2.appender.audit.name = AuditRollingFile 
04:54:39 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 04:54:39 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 04:54:39 log4j2.appender.audit.append = true 04:54:39 log4j2.appender.audit.layout.type = PatternLayout 04:54:39 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 04:54:39 log4j2.appender.audit.policies.type = Policies 04:54:39 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 04:54:39 log4j2.appender.audit.policies.size.size = 8MB 04:54:39 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 04:54:39 log4j2.appender.audit.strategy.max = 7 04:54:39 04:54:39 # OSGi appender 04:54:39 log4j2.appender.osgi.type = PaxOsgi 04:54:39 log4j2.appender.osgi.name = PaxOsgi 04:54:39 log4j2.appender.osgi.filter = * 04:54:39 04:54:39 # help with identification of maven-related problems with pax-url-aether 04:54:39 #log4j2.logger.aether.name = shaded.org.eclipse.aether 04:54:39 #log4j2.logger.aether.level = TRACE 04:54:39 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 04:54:39 #log4j2.logger.http-headers.level = DEBUG 04:54:39 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 04:54:39 #log4j2.logger.maven.level = TRACE 04:54:39 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 04:54:39 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 04:54:39 + set_java_vars /usr/lib/jvm/java-21-openjdk-amd64 2048m /tmp/karaf-0.22.1/bin/setenv 04:54:39 + local -r java_home=/usr/lib/jvm/java-21-openjdk-amd64 04:54:39 + local -r controllermem=2048m 04:54:39 + local -r memconf=/tmp/karaf-0.22.1/bin/setenv 04:54:39 Configure 04:54:39 java home: /usr/lib/jvm/java-21-openjdk-amd64 04:54:39 max memory: 2048m 04:54:39 memconf: /tmp/karaf-0.22.1/bin/setenv 04:54:39 + echo Configure 04:54:39 + echo ' java home: /usr/lib/jvm/java-21-openjdk-amd64' 04:54:39 + echo ' max memory: 2048m' 04:54:39 + echo ' memconf: /tmp/karaf-0.22.1/bin/setenv' 04:54:39 + sed -ie 's%^# export JAVA_HOME%export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64}%g' /tmp/karaf-0.22.1/bin/setenv 04:54:39 + sed -ie 's/JAVA_MAX_MEM="2048m"/JAVA_MAX_MEM=2048m/g' /tmp/karaf-0.22.1/bin/setenv 04:54:39 cat /tmp/karaf-0.22.1/bin/setenv 04:54:39 + echo 'cat /tmp/karaf-0.22.1/bin/setenv' 04:54:39 + cat /tmp/karaf-0.22.1/bin/setenv 04:54:39 #!/bin/sh 04:54:39 # 04:54:39 # Licensed to the Apache Software Foundation (ASF) under one or more 04:54:39 # contributor license agreements. See the NOTICE file distributed with 04:54:39 # this work for additional information regarding copyright ownership. 04:54:39 # The ASF licenses this file to You under the Apache License, Version 2.0 04:54:39 # (the "License"); you may not use this file except in compliance with 04:54:39 # the License. You may obtain a copy of the License at 04:54:39 # 04:54:39 # http://www.apache.org/licenses/LICENSE-2.0 04:54:39 # 04:54:39 # Unless required by applicable law or agreed to in writing, software 04:54:39 # distributed under the License is distributed on an "AS IS" BASIS, 04:54:39 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:54:39 # See the License for the specific language governing permissions and 04:54:39 # limitations under the License. 
04:54:39 # 04:54:39 04:54:39 # 04:54:39 # handle specific scripts; the SCRIPT_NAME is exactly the name of the Karaf 04:54:39 # script: client, instance, shell, start, status, stop, karaf 04:54:39 # 04:54:39 # if [ "${KARAF_SCRIPT}" == "SCRIPT_NAME" ]; then 04:54:39 # Actions go here... 04:54:39 # fi 04:54:39 04:54:39 # 04:54:39 # general settings which should be applied for all scripts go here; please keep 04:54:39 # in mind that it is possible that scripts might be executed more than once, e.g. 04:54:39 # in example of the start script where the start script is executed first and the 04:54:39 # karaf script afterwards. 04:54:39 # 04:54:39 04:54:39 # 04:54:39 # The following section shows the possible configuration options for the default 04:54:39 # karaf scripts 04:54:39 # 04:54:39 export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64} # Location of Java installation 04:54:39 # export JAVA_OPTS # Generic JVM options, for instance, where you can pass the memory configuration 04:54:39 # export JAVA_NON_DEBUG_OPTS # Additional non-debug JVM options 04:54:39 # export EXTRA_JAVA_OPTS # Additional JVM options 04:54:39 # export KARAF_HOME # Karaf home folder 04:54:39 # export KARAF_DATA # Karaf data folder 04:54:39 # export KARAF_BASE # Karaf base folder 04:54:39 # export KARAF_ETC # Karaf etc folder 04:54:39 # export KARAF_LOG # Karaf log folder 04:54:39 # export KARAF_SYSTEM_OPTS # First citizen Karaf options 04:54:39 # export KARAF_OPTS # Additional available Karaf options 04:54:39 # export KARAF_DEBUG # Enable debug mode 04:54:39 # export KARAF_REDIRECT # Enable/set the std/err redirection when using bin/start 04:54:39 # export KARAF_NOROOT # Prevent execution as root if set to true 04:54:39 Set Java version 04:54:39 + echo 'Set Java version' 04:54:39 + sudo /usr/sbin/alternatives --install /usr/bin/java java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 1 04:54:39 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 04:54:39 sudo: a password is required 04:54:39 + sudo /usr/sbin/alternatives --set java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:54:39 JDK default version ... 04:54:39 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 04:54:39 sudo: a password is required 04:54:39 + echo 'JDK default version ...' 04:54:39 + java -version 04:54:40 openjdk version "21.0.5" 2024-10-15 04:54:40 OpenJDK Runtime Environment (build 21.0.5+11-Ubuntu-1ubuntu122.04) 04:54:40 OpenJDK 64-Bit Server VM (build 21.0.5+11-Ubuntu-1ubuntu122.04, mixed mode, sharing) 04:54:40 Set JAVA_HOME 04:54:40 + echo 'Set JAVA_HOME' 04:54:40 + export JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 04:54:40 + JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 04:54:40 ++ readlink -e /usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:54:40 Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:54:40 Listing all open ports on controller system... 04:54:40 + JAVA_RESOLVED=/usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:54:40 + echo 'Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java' 04:54:40 + echo 'Listing all open ports on controller system...' 
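Editor's note: the block above rewrites bin/setenv so that JAVA_HOME is pinned to the JDK 21 install and the requested heap cap is applied, then resolves the java binary with readlink. A minimal self-contained sketch of that step is below; the helper name set_java_vars_sketch and its warning message are illustrative only, while the sed expressions mirror the ones traced in the log (written here as `sed -i -e`, since with GNU sed the `-ie` form used above is read as `-i` with backup suffix `e` and leaves a stray `setenve` file behind).

  # Sketch only: mirrors the set_java_vars step traced above on bin/setenv.
  # Paths and values are the ones from this run; error handling is illustrative.
  set_java_vars_sketch() {
      local -r java_home="$1"      # e.g. /usr/lib/jvm/java-21-openjdk-amd64
      local -r controllermem="$2"  # e.g. 2048m
      local -r memconf="$3"        # e.g. /tmp/karaf-0.22.1/bin/setenv

      # Un-comment and pin JAVA_HOME, keeping any value already in the environment.
      sed -i -e "s%^# export JAVA_HOME%export JAVA_HOME=\${JAVA_HOME:-${java_home}}%" "${memconf}"
      # Apply the requested heap cap (the traced run strips the quotes around 2048m).
      sed -i -e "s/JAVA_MAX_MEM=\"${controllermem}\"/JAVA_MAX_MEM=${controllermem}/" "${memconf}"

      # Same sanity check as the script: resolve the java binary JAVA_HOME points at.
      readlink -e "${java_home}/bin/java" || echo "WARNING: no java binary under ${java_home}"
  }

  set_java_vars_sketch /usr/lib/jvm/java-21-openjdk-amd64 2048m /tmp/karaf-0.22.1/bin/setenv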
04:54:40 + netstat -pnatu 04:54:40 /tmp/configuration-script.sh: line 40: netstat: command not found 04:54:40 Configuring cluster 04:54:40 + '[' -f /tmp/custom_shard_config.txt ']' 04:54:40 + echo 'Configuring cluster' 04:54:40 + /tmp/karaf-0.22.1/bin/configure_cluster.sh 2 10.30.170.73 10.30.171.201 10.30.170.175 04:54:40 ################################################ 04:54:40 ## Configure Cluster ## 04:54:40 ################################################ 04:54:40 ERROR: Cluster configurations files not found. Please configure clustering feature. 04:54:40 Dump pekko.conf 04:54:40 + echo 'Dump pekko.conf' 04:54:40 + cat /tmp/karaf-0.22.1/configuration/initial/pekko.conf 04:54:40 cat: /tmp/karaf-0.22.1/configuration/initial/pekko.conf: No such file or directory 04:54:40 Dump modules.conf 04:54:40 + echo 'Dump modules.conf' 04:54:40 + cat /tmp/karaf-0.22.1/configuration/initial/modules.conf 04:54:40 cat: /tmp/karaf-0.22.1/configuration/initial/modules.conf: No such file or directory 04:54:40 Dump module-shards.conf 04:54:40 + echo 'Dump module-shards.conf' 04:54:40 + cat /tmp/karaf-0.22.1/configuration/initial/module-shards.conf 04:54:40 cat: /tmp/karaf-0.22.1/configuration/initial/module-shards.conf: No such file or directory 04:54:40 Configuring member-3 with IP address 10.30.170.175 04:54:40 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 04:54:40 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 04:54:40 + source /tmp/common-functions.sh karaf-0.22.1 titanium 04:54:40 ++ [[ /tmp/common-functions.sh == \/\t\m\p\/\c\o\n\f\i\g\u\r\a\t\i\o\n\-\s\c\r\i\p\t\.\s\h ]] 04:54:40 common-functions.sh is being sourced 04:54:40 ++ echo 'common-functions.sh is being sourced' 04:54:40 ++ BUNDLEFOLDER=karaf-0.22.1 04:54:40 ++ DISTROSTREAM=titanium 04:54:40 ++ export MAVENCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 04:54:40 ++ MAVENCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 04:54:40 ++ export FEATURESCONF=/tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:40 ++ FEATURESCONF=/tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:40 ++ export CUSTOMPROP=/tmp/karaf-0.22.1/etc/custom.properties 04:54:40 ++ CUSTOMPROP=/tmp/karaf-0.22.1/etc/custom.properties 04:54:40 ++ export LOGCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:40 ++ LOGCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:40 ++ export MEMCONF=/tmp/karaf-0.22.1/bin/setenv 04:54:40 ++ MEMCONF=/tmp/karaf-0.22.1/bin/setenv 04:54:40 ++ export CONTROLLERMEM= 04:54:40 ++ CONTROLLERMEM= 04:54:40 ++ case "${DISTROSTREAM}" in 04:54:40 ++ CLUSTER_SYSTEM=pekko 04:54:40 ++ export AKKACONF=/tmp/karaf-0.22.1/configuration/initial/pekko.conf 04:54:40 ++ AKKACONF=/tmp/karaf-0.22.1/configuration/initial/pekko.conf 04:54:40 ++ export MODULESCONF=/tmp/karaf-0.22.1/configuration/initial/modules.conf 04:54:40 ++ MODULESCONF=/tmp/karaf-0.22.1/configuration/initial/modules.conf 04:54:40 ++ export MODULESHARDSCONF=/tmp/karaf-0.22.1/configuration/initial/module-shards.conf 04:54:40 ++ MODULESHARDSCONF=/tmp/karaf-0.22.1/configuration/initial/module-shards.conf 04:54:40 ++ print_common_env 04:54:40 ++ cat 04:54:40 common-functions environment: 04:54:40 MAVENCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 04:54:40 ACTUALFEATURES: 04:54:40 FEATURESCONF: /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:40 CUSTOMPROP: /tmp/karaf-0.22.1/etc/custom.properties 04:54:40 LOGCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:40 MEMCONF: 
/tmp/karaf-0.22.1/bin/setenv 04:54:40 CONTROLLERMEM: 04:54:40 AKKACONF: /tmp/karaf-0.22.1/configuration/initial/pekko.conf 04:54:40 MODULESCONF: /tmp/karaf-0.22.1/configuration/initial/modules.conf 04:54:40 MODULESHARDSCONF: /tmp/karaf-0.22.1/configuration/initial/module-shards.conf 04:54:40 SUITES: 04:54:40 04:54:40 ++ SSH='ssh -t -t' 04:54:40 ++ extra_services_cntl=' dnsmasq.service httpd.service libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service rabbitmq-server.service ' 04:54:40 ++ extra_services_cmp=' libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service ' 04:54:40 Changing to /tmp 04:54:40 Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9182/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 04:54:40 + echo 'Changing to /tmp' 04:54:40 + cd /tmp 04:54:40 + echo 'Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9182/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip' 04:54:40 + wget --progress=dot:mega https://nexus.opendaylight.org/content/repositories//autorelease-9182/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 04:54:40 --2025-09-13 04:54:40-- https://nexus.opendaylight.org/content/repositories//autorelease-9182/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 04:54:40 Resolving nexus.opendaylight.org (nexus.opendaylight.org)... 199.204.45.87, 2604:e100:1:0:f816:3eff:fe45:48d6 04:54:40 Connecting to nexus.opendaylight.org (nexus.opendaylight.org)|199.204.45.87|:443... connected. 04:54:40 HTTP request sent, awaiting response... 200 OK 04:54:40 Length: 236634504 (226M) [application/zip] 04:54:40 Saving to: ‘karaf-0.22.1.zip’ 04:54:40 04:54:40 0K ........ ........ ........ ........ ........ ........ 1% 89.2M 2s 04:54:40 3072K ........ ........ ........ ........ ........ ........ 2% 144M 2s 04:54:40 6144K ........ ........ ........ ........ ........ ........ 3% 169M 2s 04:54:40 9216K ........ ........ ........ ........ ........ ........ 5% 136M 2s 04:54:40 12288K ........ ........ ........ ........ ........ ........ 6% 227M 2s 04:54:40 15360K ........ ........ ........ ........ ........ ........ 7% 231M 1s 04:54:40 18432K ........ ........ ........ ........ ........ ........ 9% 232M 1s 04:54:40 21504K ........ ........ ........ ........ ........ ........ 10% 269M 1s 04:54:40 24576K ........ ........ ........ ........ ........ ........ 11% 247M 1s 04:54:40 27648K ........ ........ ........ ........ ........ ........ 13% 288M 1s 04:54:40 30720K ........ ........ ........ ........ ........ ........ 14% 257M 1s 04:54:40 33792K ........ ........ ........ ........ ........ ........ 15% 304M 1s 04:54:40 36864K ........ ........ ........ ........ ........ ........ 17% 331M 1s 04:54:40 39936K ........ ........ ........ ........ ........ ........ 18% 300M 1s 04:54:40 43008K ........ ........ ........ ........ ........ ........ 19% 287M 1s 04:54:40 46080K ........ ........ ........ ........ ........ ........ 21% 304M 1s 04:54:40 49152K ........ ........ ........ ........ ........ ........ 22% 329M 1s 04:54:40 52224K ........ ........ ........ ........ ........ ........ 23% 300M 1s 04:54:40 55296K ........ ........ ........ ........ ........ ........ 25% 265M 1s 04:54:40 58368K ........ ........ ........ ........ ........ ........ 26% 323M 1s 04:54:40 61440K ........ ........ ........ ........ ........ ........ 27% 301M 1s 04:54:40 64512K ........ ........ ........ ........ ........ ........ 
29% 314M 1s  …  85% 394M 0s   04:54:40–04:54:41  67584K–196608K  (wget progress dots condensed; throughput 248M–406M/s)
........ 86% 296M 0s 04:54:41 199680K ........ ........ ........ ........ ........ ........ 87% 491M 0s 04:54:41 202752K ........ ........ ........ ........ ........ ........ 89% 406M 0s 04:54:41 205824K ........ ........ ........ ........ ........ ........ 90% 378M 0s 04:54:41 208896K ........ ........ ........ ........ ........ ........ 91% 379M 0s 04:54:41 211968K ........ ........ ........ ........ ........ ........ 93% 382M 0s 04:54:41 215040K ........ ........ ........ ........ ........ ........ 94% 304M 0s 04:54:41 218112K ........ ........ ........ ........ ........ ........ 95% 354M 0s 04:54:41 221184K ........ ........ ........ ........ ........ ........ 97% 349M 0s 04:54:41 224256K ........ ........ ........ ........ ........ ........ 98% 499M 0s 04:54:41 227328K ........ ........ ........ ........ ........ ........ 99% 426M 0s 04:54:41 230400K ........ .. 100% 364M=0.8s 04:54:41 04:54:41 2025-09-13 04:54:41 (301 MB/s) - ‘karaf-0.22.1.zip’ saved [236634504/236634504] 04:54:41 04:54:41 Extracting the new controller... 04:54:41 + echo 'Extracting the new controller...' 04:54:41 + unzip -q karaf-0.22.1.zip 04:54:43 Adding external repositories... 04:54:43 + echo 'Adding external repositories...' 04:54:43 + sed -ie 's%org.ops4j.pax.url.mvn.repositories=%org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases%g' /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 04:54:43 + cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 04:54:43 ################################################################################ 04:54:43 # 04:54:43 # Licensed to the Apache Software Foundation (ASF) under one or more 04:54:43 # contributor license agreements. See the NOTICE file distributed with 04:54:43 # this work for additional information regarding copyright ownership. 04:54:43 # The ASF licenses this file to You under the Apache License, Version 2.0 04:54:43 # (the "License"); you may not use this file except in compliance with 04:54:43 # the License. You may obtain a copy of the License at 04:54:43 # 04:54:43 # http://www.apache.org/licenses/LICENSE-2.0 04:54:43 # 04:54:43 # Unless required by applicable law or agreed to in writing, software 04:54:43 # distributed under the License is distributed on an "AS IS" BASIS, 04:54:43 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:54:43 # See the License for the specific language governing permissions and 04:54:43 # limitations under the License. 
04:54:43 # 04:54:43 ################################################################################ 04:54:43 04:54:43 # 04:54:43 # If set to true, the following property will not allow any certificate to be used 04:54:43 # when accessing Maven repositories through SSL 04:54:43 # 04:54:43 #org.ops4j.pax.url.mvn.certificateCheck= 04:54:43 04:54:43 # 04:54:43 # Path to the local Maven settings file. 04:54:43 # The repositories defined in this file will be automatically added to the list 04:54:43 # of default repositories if the 'org.ops4j.pax.url.mvn.repositories' property 04:54:43 # below is not set. 04:54:43 # The following locations are checked for the existence of the settings.xml file 04:54:43 # * 1. looks for the specified url 04:54:43 # * 2. if not found looks for ${user.home}/.m2/settings.xml 04:54:43 # * 3. if not found looks for ${maven.home}/conf/settings.xml 04:54:43 # * 4. if not found looks for ${M2_HOME}/conf/settings.xml 04:54:43 # 04:54:43 #org.ops4j.pax.url.mvn.settings= 04:54:43 04:54:43 # 04:54:43 # Path to the local Maven repository which is used to avoid downloading 04:54:43 # artifacts when they already exist locally. 04:54:43 # The value of this property will be extracted from the settings.xml file 04:54:43 # above, or defaulted to: 04:54:43 # System.getProperty( "user.home" ) + "/.m2/repository" 04:54:43 # 04:54:43 org.ops4j.pax.url.mvn.localRepository=${karaf.home}/${karaf.default.repository} 04:54:43 04:54:43 # 04:54:43 # Default this to false. It's just weird to use undocumented repos 04:54:43 # 04:54:43 org.ops4j.pax.url.mvn.useFallbackRepositories=false 04:54:43 04:54:43 # 04:54:43 # Uncomment if you don't wanna use the proxy settings 04:54:43 # from the Maven conf/settings.xml file 04:54:43 # 04:54:43 # org.ops4j.pax.url.mvn.proxySupport=false 04:54:43 04:54:43 # 04:54:43 # Comma separated list of repositories scanned when resolving an artifact. 04:54:43 # Those repositories will be checked before iterating through the 04:54:43 # below list of repositories and even before the local repository 04:54:43 # A repository url can be appended with zero or more of the following flags: 04:54:43 # @snapshots : the repository contains snaphots 04:54:43 # @noreleases : the repository does not contain any released artifacts 04:54:43 # 04:54:43 # The following property value will add the system folder as a repo. 04:54:43 # 04:54:43 org.ops4j.pax.url.mvn.defaultRepositories=\ 04:54:43 file:${karaf.home}/${karaf.default.repository}@id=system.repository@snapshots,\ 04:54:43 file:${karaf.data}/kar@id=kar.repository@multi@snapshots,\ 04:54:43 file:${karaf.base}/${karaf.default.repository}@id=child.system.repository@snapshots 04:54:43 04:54:43 # Use the default local repo (e.g.~/.m2/repository) as a "remote" repo 04:54:43 #org.ops4j.pax.url.mvn.defaultLocalRepoAsRemote=false 04:54:43 04:54:43 # 04:54:43 # Comma separated list of repositories scanned when resolving an artifact. 
04:54:43 # The default list includes the following repositories: 04:54:43 # http://repo1.maven.org/maven2@id=central 04:54:43 # http://repository.springsource.com/maven/bundles/release@id=spring.ebr 04:54:43 # http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external 04:54:43 # http://zodiac.springsource.com/maven/bundles/release@id=gemini 04:54:43 # http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases 04:54:43 # https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases 04:54:43 # https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 04:54:43 # To add repositories to the default ones, prepend '+' to the list of repositories 04:54:43 # to add. 04:54:43 # A repository url can be appended with zero or more of the following flags: 04:54:43 # @snapshots : the repository contains snapshots 04:54:43 # @noreleases : the repository does not contain any released artifacts 04:54:43 # @id=repository.id : the id for the repository, just like in the settings.xml this is optional but recommended 04:54:43 # 04:54:43 org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 04:54:43 04:54:43 ### ^^^ No remote repositories. This is the only ODL change compared to Karaf defaults.Configuring the startup features... 04:54:43 + [[ True == \T\r\u\e ]] 04:54:43 + echo 'Configuring the startup features...' 04:54:43 + sed -ie 's/\(featuresBoot=\|featuresBoot =\)/featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer,/g' /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:43 + FEATURE_TEST_STRING=features-test 04:54:43 + FEATURE_TEST_VERSION=0.22.1 04:54:43 + KARAF_VERSION=karaf4 04:54:43 + [[ integration == \i\n\t\e\g\r\a\t\i\o\n ]] 04:54:43 + sed -ie 's%\(featuresRepositories=\|featuresRepositories =\)%featuresRepositories = mvn:org.opendaylight.integration/features-test/0.22.1/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features,%g' /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:43 + [[ ! -z '' ]] 04:54:43 + cat /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:43 ################################################################################ 04:54:43 # 04:54:43 # Licensed to the Apache Software Foundation (ASF) under one or more 04:54:43 # contributor license agreements. See the NOTICE file distributed with 04:54:43 # this work for additional information regarding copyright ownership. 04:54:43 # The ASF licenses this file to You under the Apache License, Version 2.0 04:54:43 # (the "License"); you may not use this file except in compliance with 04:54:43 # the License. 
You may obtain a copy of the License at 04:54:43 # 04:54:43 # http://www.apache.org/licenses/LICENSE-2.0 04:54:43 # 04:54:43 # Unless required by applicable law or agreed to in writing, software 04:54:43 # distributed under the License is distributed on an "AS IS" BASIS, 04:54:43 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:54:43 # See the License for the specific language governing permissions and 04:54:43 # limitations under the License. 04:54:43 # 04:54:43 ################################################################################ 04:54:43 04:54:43 # 04:54:43 # Comma separated list of features repositories to register by default 04:54:43 # 04:54:43 featuresRepositories = mvn:org.opendaylight.integration/features-test/0.22.1/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features, file:${karaf.etc}/d7b3d213-29db-4534-93a0-031e5065da16.xml 04:54:43 04:54:43 # 04:54:43 # Comma separated list of features to install at startup 04:54:43 # 04:54:43 featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer, 510b7ed9-e8dc-40d4-b97c-ab7891511bec 04:54:43 04:54:43 # 04:54:43 # Resource repositories (OBR) that the features resolver can use 04:54:43 # to resolve requirements/capabilities 04:54:43 # 04:54:43 # The format of the resourceRepositories is 04:54:43 # resourceRepositories=[xml:url|json:url],... 04:54:43 # for Instance: 04:54:43 # 04:54:43 #resourceRepositories=xml:http://host/path/to/index.xml 04:54:43 # or 04:54:43 #resourceRepositories=json:http://host/path/to/index.json 04:54:43 # 04:54:43 04:54:43 # 04:54:43 # Defines if the boot features are started in asynchronous mode (in a dedicated thread) 04:54:43 # 04:54:43 featuresBootAsynchronous=false 04:54:43 04:54:43 # 04:54:43 # Service requirements enforcement 04:54:43 # 04:54:43 # By default, the feature resolver checks the service requirements/capabilities of 04:54:43 # bundles for new features (xml schema >= 1.3.0) in order to automatically installs 04:54:43 # the required bundles. 
04:54:43 # The following flag can have those values: 04:54:43 # - disable: service requirements are completely ignored 04:54:43 # - default: service requirements are ignored for old features 04:54:43 # - enforce: service requirements are always verified 04:54:43 # 04:54:43 #serviceRequirements=default 04:54:43 04:54:43 # 04:54:43 # Store cfg file for config element in feature 04:54:43 # 04:54:43 #configCfgStore=true 04:54:43 04:54:43 # 04:54:43 # Define if the feature service automatically refresh bundles 04:54:43 # 04:54:43 autoRefresh=true 04:54:43 04:54:43 # 04:54:43 # Configuration of features processing mechanism (overrides, blacklisting, modification of features) 04:54:43 # XML file defines instructions related to features processing 04:54:43 # versions.properties may declare properties to resolve placeholders in XML file 04:54:43 # both files are relative to ${karaf.etc} 04:54:43 # 04:54:43 #featureProcessing=org.apache.karaf.features.xml 04:54:43 #featureProcessingVersions=versions.properties 04:54:43 + configure_karaf_log karaf4 '' 04:54:43 + local -r karaf_version=karaf4 04:54:43 + local -r controllerdebugmap= 04:54:43 + local logapi=log4j 04:54:43 + grep log4j2 /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:43 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 04:54:43 log4j2.rootLogger.level = INFO 04:54:43 #log4j2.rootLogger.type = asyncRoot 04:54:43 #log4j2.rootLogger.includeLocation = false 04:54:43 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 04:54:43 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 04:54:43 log4j2.rootLogger.appenderRef.Console.ref = Console 04:54:43 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 04:54:43 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 04:54:43 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 04:54:43 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 04:54:43 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 04:54:43 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 04:54:43 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 04:54:43 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 04:54:43 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 04:54:43 log4j2.logger.spifly.name = org.apache.aries.spifly 04:54:43 log4j2.logger.spifly.level = WARN 04:54:43 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 04:54:43 log4j2.logger.audit.level = INFO 04:54:43 log4j2.logger.audit.additivity = false 04:54:43 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 04:54:43 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 04:54:43 log4j2.appender.console.type = Console 04:54:43 log4j2.appender.console.name = Console 04:54:43 log4j2.appender.console.layout.type = PatternLayout 04:54:43 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 04:54:43 log4j2.appender.rolling.type = RollingRandomAccessFile 04:54:43 log4j2.appender.rolling.name = RollingFile 04:54:43 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 04:54:43 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 04:54:43 #log4j2.appender.rolling.immediateFlush = false 04:54:43 log4j2.appender.rolling.append = true 
04:54:43 log4j2.appender.rolling.layout.type = PatternLayout 04:54:43 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 04:54:43 log4j2.appender.rolling.policies.type = Policies 04:54:43 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 04:54:43 log4j2.appender.rolling.policies.size.size = 64MB 04:54:43 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 04:54:43 log4j2.appender.rolling.strategy.max = 7 04:54:43 log4j2.appender.audit.type = RollingRandomAccessFile 04:54:43 log4j2.appender.audit.name = AuditRollingFile 04:54:43 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 04:54:43 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 04:54:43 log4j2.appender.audit.append = true 04:54:43 log4j2.appender.audit.layout.type = PatternLayout 04:54:43 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 04:54:43 log4j2.appender.audit.policies.type = Policies 04:54:43 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 04:54:43 log4j2.appender.audit.policies.size.size = 8MB 04:54:43 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 04:54:43 log4j2.appender.audit.strategy.max = 7 04:54:43 log4j2.appender.osgi.type = PaxOsgi 04:54:43 log4j2.appender.osgi.name = PaxOsgi 04:54:43 log4j2.appender.osgi.filter = * 04:54:43 #log4j2.logger.aether.name = shaded.org.eclipse.aether 04:54:43 #log4j2.logger.aether.level = TRACE 04:54:43 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 04:54:43 #log4j2.logger.http-headers.level = DEBUG 04:54:43 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 04:54:43 #log4j2.logger.maven.level = TRACE 04:54:43 + logapi=log4j2 04:54:43 Configuring the karaf log... karaf_version: karaf4, logapi: log4j2 04:54:43 + echo 'Configuring the karaf log... karaf_version: karaf4, logapi: log4j2' 04:54:43 + '[' log4j2 == log4j2 ']' 04:54:43 + sed -ie 's/log4j2.appender.rolling.policies.size.size = 64MB/log4j2.appender.rolling.policies.size.size = 1GB/g' /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:43 + orgmodule=org.opendaylight.yangtools.yang.parser.repo.YangTextSchemaContextResolver 04:54:43 + orgmodule_=org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver 04:54:43 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN' 04:54:43 controllerdebugmap: 04:54:43 cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:43 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN' 04:54:43 + unset IFS 04:54:43 + echo 'controllerdebugmap: ' 04:54:43 + '[' -n '' ']' 04:54:43 + echo 'cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg' 04:54:43 + cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:43 ################################################################################ 04:54:43 # 04:54:43 # Licensed to the Apache Software Foundation (ASF) under one or more 04:54:43 # contributor license agreements. See the NOTICE file distributed with 04:54:43 # this work for additional information regarding copyright ownership. 04:54:43 # The ASF licenses this file to You under the Apache License, Version 2.0 04:54:43 # (the "License"); you may not use this file except in compliance with 04:54:43 # the License. 
You may obtain a copy of the License at 04:54:43 # 04:54:43 # http://www.apache.org/licenses/LICENSE-2.0 04:54:43 # 04:54:43 # Unless required by applicable law or agreed to in writing, software 04:54:43 # distributed under the License is distributed on an "AS IS" BASIS, 04:54:43 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:54:43 # See the License for the specific language governing permissions and 04:54:43 # limitations under the License. 04:54:43 # 04:54:43 ################################################################################ 04:54:43 04:54:43 # Common pattern layout for appenders 04:54:43 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 04:54:43 04:54:43 # Root logger 04:54:43 log4j2.rootLogger.level = INFO 04:54:43 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 04:54:43 #log4j2.rootLogger.type = asyncRoot 04:54:43 #log4j2.rootLogger.includeLocation = false 04:54:43 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 04:54:43 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 04:54:43 log4j2.rootLogger.appenderRef.Console.ref = Console 04:54:43 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 04:54:43 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 04:54:43 04:54:43 # Filters for logs marked by org.opendaylight.odlparent.Markers 04:54:43 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 04:54:43 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 04:54:43 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 04:54:43 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 04:54:43 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 04:54:43 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 04:54:43 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 04:54:43 04:54:43 # Loggers configuration 04:54:43 04:54:43 # Spifly logger 04:54:43 log4j2.logger.spifly.name = org.apache.aries.spifly 04:54:43 log4j2.logger.spifly.level = WARN 04:54:43 04:54:43 # Security audit logger 04:54:43 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 04:54:43 log4j2.logger.audit.level = INFO 04:54:43 log4j2.logger.audit.additivity = false 04:54:43 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 04:54:43 04:54:43 # Appenders configuration 04:54:43 04:54:43 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 04:54:43 log4j2.appender.console.type = Console 04:54:43 log4j2.appender.console.name = Console 04:54:43 log4j2.appender.console.layout.type = PatternLayout 04:54:43 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 04:54:43 04:54:43 # Rolling file appender 04:54:43 log4j2.appender.rolling.type = RollingRandomAccessFile 04:54:43 log4j2.appender.rolling.name = RollingFile 04:54:43 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 04:54:43 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 04:54:43 # uncomment to not force a disk flush 04:54:43 #log4j2.appender.rolling.immediateFlush = false 04:54:43 log4j2.appender.rolling.append = true 04:54:43 log4j2.appender.rolling.layout.type = PatternLayout 04:54:43 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 
04:54:43 log4j2.appender.rolling.policies.type = Policies 04:54:43 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 04:54:43 log4j2.appender.rolling.policies.size.size = 1GB 04:54:43 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 04:54:43 log4j2.appender.rolling.strategy.max = 7 04:54:43 04:54:43 # Audit file appender 04:54:43 log4j2.appender.audit.type = RollingRandomAccessFile 04:54:43 log4j2.appender.audit.name = AuditRollingFile 04:54:43 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 04:54:43 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 04:54:43 log4j2.appender.audit.append = true 04:54:43 log4j2.appender.audit.layout.type = PatternLayout 04:54:43 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 04:54:43 log4j2.appender.audit.policies.type = Policies 04:54:43 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 04:54:43 log4j2.appender.audit.policies.size.size = 8MB 04:54:43 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 04:54:43 log4j2.appender.audit.strategy.max = 7 04:54:43 04:54:43 # OSGi appender 04:54:43 log4j2.appender.osgi.type = PaxOsgi 04:54:43 log4j2.appender.osgi.name = PaxOsgi 04:54:43 log4j2.appender.osgi.filter = * 04:54:43 04:54:43 # help with identification of maven-related problems with pax-url-aether 04:54:43 #log4j2.logger.aether.name = shaded.org.eclipse.aether 04:54:43 #log4j2.logger.aether.level = TRACE 04:54:43 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 04:54:43 #log4j2.logger.http-headers.level = DEBUG 04:54:43 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 04:54:43 #log4j2.logger.maven.level = TRACE 04:54:43 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 04:54:43 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 04:54:43 Configure 04:54:43 java home: /usr/lib/jvm/java-21-openjdk-amd64 04:54:43 max memory: 2048m 04:54:43 memconf: /tmp/karaf-0.22.1/bin/setenv 04:54:43 + set_java_vars /usr/lib/jvm/java-21-openjdk-amd64 2048m /tmp/karaf-0.22.1/bin/setenv 04:54:43 + local -r java_home=/usr/lib/jvm/java-21-openjdk-amd64 04:54:43 + local -r controllermem=2048m 04:54:43 + local -r memconf=/tmp/karaf-0.22.1/bin/setenv 04:54:43 + echo Configure 04:54:43 + echo ' java home: /usr/lib/jvm/java-21-openjdk-amd64' 04:54:43 + echo ' max memory: 2048m' 04:54:43 + echo ' memconf: /tmp/karaf-0.22.1/bin/setenv' 04:54:43 + sed -ie 's%^# export JAVA_HOME%export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64}%g' /tmp/karaf-0.22.1/bin/setenv 04:54:43 + sed -ie 's/JAVA_MAX_MEM="2048m"/JAVA_MAX_MEM=2048m/g' /tmp/karaf-0.22.1/bin/setenv 04:54:43 cat /tmp/karaf-0.22.1/bin/setenv 04:54:43 + echo 'cat /tmp/karaf-0.22.1/bin/setenv' 04:54:43 + cat /tmp/karaf-0.22.1/bin/setenv 04:54:43 #!/bin/sh 04:54:43 # 04:54:43 # Licensed to the Apache Software Foundation (ASF) under one or more 04:54:43 # contributor license agreements. See the NOTICE file distributed with 04:54:43 # this work for additional information regarding copyright ownership. 04:54:43 # The ASF licenses this file to You under the Apache License, Version 2.0 04:54:43 # (the "License"); you may not use this file except in compliance with 04:54:43 # the License. 
You may obtain a copy of the License at 04:54:43 # 04:54:43 # http://www.apache.org/licenses/LICENSE-2.0 04:54:43 # 04:54:43 # Unless required by applicable law or agreed to in writing, software 04:54:43 # distributed under the License is distributed on an "AS IS" BASIS, 04:54:43 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:54:43 # See the License for the specific language governing permissions and 04:54:43 # limitations under the License. 04:54:43 # 04:54:43 04:54:43 # 04:54:43 # handle specific scripts; the SCRIPT_NAME is exactly the name of the Karaf 04:54:43 # script: client, instance, shell, start, status, stop, karaf 04:54:43 # 04:54:43 # if [ "${KARAF_SCRIPT}" == "SCRIPT_NAME" ]; then 04:54:43 # Actions go here... 04:54:43 # fi 04:54:43 04:54:43 # 04:54:43 # general settings which should be applied for all scripts go here; please keep 04:54:43 # in mind that it is possible that scripts might be executed more than once, e.g. 04:54:43 # in example of the start script where the start script is executed first and the 04:54:43 # karaf script afterwards. 04:54:43 # 04:54:43 04:54:43 # 04:54:43 # The following section shows the possible configuration options for the default 04:54:43 # karaf scripts 04:54:43 # 04:54:43 export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64} # Location of Java installation 04:54:43 # export JAVA_OPTS # Generic JVM options, for instance, where you can pass the memory configuration 04:54:43 # export JAVA_NON_DEBUG_OPTS # Additional non-debug JVM options 04:54:43 # export EXTRA_JAVA_OPTS # Additional JVM options 04:54:43 # export KARAF_HOME # Karaf home folder 04:54:43 # export KARAF_DATA # Karaf data folder 04:54:43 # export KARAF_BASE # Karaf base folder 04:54:43 # export KARAF_ETC # Karaf etc folder 04:54:43 # export KARAF_LOG # Karaf log folder 04:54:43 # export KARAF_SYSTEM_OPTS # First citizen Karaf options 04:54:43 # export KARAF_OPTS # Additional available Karaf options 04:54:43 # export KARAF_DEBUG # Enable debug mode 04:54:43 # export KARAF_REDIRECT # Enable/set the std/err redirection when using bin/start 04:54:43 # export KARAF_NOROOT # Prevent execution as root if set to true 04:54:43 Set Java version 04:54:43 + echo 'Set Java version' 04:54:43 + sudo /usr/sbin/alternatives --install /usr/bin/java java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 1 04:54:43 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 04:54:43 sudo: a password is required 04:54:43 + sudo /usr/sbin/alternatives --set java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:54:43 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 04:54:43 sudo: a password is required 04:54:43 JDK default version ... 04:54:43 + echo 'JDK default version ...' 
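The two alternatives calls above fail because the build runs non-interactively and passwordless sudo is not available on this VM; the job simply proceeds and relies on JAVA_HOME instead. A minimal sketch of how such a step could be guarded in a CI script, assuming the same JDK path (the sudo -n probe and the fallback branch are illustrative, not taken from the actual job scripts):

#!/bin/bash
# Hypothetical guard: only try to register the java alternative when
# passwordless sudo actually works; otherwise rely on JAVA_HOME alone.
set -eu

java_home="/usr/lib/jvm/java-21-openjdk-amd64"

if sudo -n true 2>/dev/null; then
    # Non-interactive sudo is allowed: register and select the alternative.
    sudo /usr/sbin/alternatives --install /usr/bin/java java "${java_home}/bin/java" 1
    sudo /usr/sbin/alternatives --set java "${java_home}/bin/java"
else
    # No TTY and no NOPASSWD rule: skip alternatives and export JAVA_HOME,
    # which Karaf's bin/setenv already honours.
    echo "passwordless sudo unavailable; relying on JAVA_HOME=${java_home}" >&2
fi

export JAVA_HOME="${java_home}"
"${JAVA_HOME}/bin/java" -version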
04:54:43 + java -version 04:54:43 openjdk version "21.0.5" 2024-10-15 04:54:43 OpenJDK Runtime Environment (build 21.0.5+11-Ubuntu-1ubuntu122.04) 04:54:43 OpenJDK 64-Bit Server VM (build 21.0.5+11-Ubuntu-1ubuntu122.04, mixed mode, sharing) 04:54:43 Set JAVA_HOME 04:54:43 + echo 'Set JAVA_HOME' 04:54:43 + export JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 04:54:43 + JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 04:54:43 ++ readlink -e /usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:54:43 + JAVA_RESOLVED=/usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:54:43 + echo 'Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java' 04:54:43 Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:54:43 Listing all open ports on controller system... 04:54:43 + echo 'Listing all open ports on controller system...' 04:54:43 + netstat -pnatu 04:54:43 /tmp/configuration-script.sh: line 40: netstat: command not found 04:54:43 Configuring cluster 04:54:43 + '[' -f /tmp/custom_shard_config.txt ']' 04:54:43 + echo 'Configuring cluster' 04:54:43 + /tmp/karaf-0.22.1/bin/configure_cluster.sh 3 10.30.170.73 10.30.171.201 10.30.170.175 04:54:43 ################################################ 04:54:43 ## Configure Cluster ## 04:54:43 ################################################ 04:54:43 ERROR: Cluster configurations files not found. Please configure clustering feature. 04:54:43 Dump pekko.conf 04:54:43 + echo 'Dump pekko.conf' 04:54:43 + cat /tmp/karaf-0.22.1/configuration/initial/pekko.conf 04:54:43 cat: /tmp/karaf-0.22.1/configuration/initial/pekko.conf: No such file or directory 04:54:43 Dump modules.conf 04:54:43 + echo 'Dump modules.conf' 04:54:43 + cat /tmp/karaf-0.22.1/configuration/initial/modules.conf 04:54:43 cat: /tmp/karaf-0.22.1/configuration/initial/modules.conf: No such file or directory 04:54:43 Dump module-shards.conf 04:54:43 + echo 'Dump module-shards.conf' 04:54:43 + cat /tmp/karaf-0.22.1/configuration/initial/module-shards.conf 04:54:43 cat: /tmp/karaf-0.22.1/configuration/initial/module-shards.conf: No such file or directory 04:54:43 Locating config plan to use... 04:54:43 config plan exists!!! 04:54:43 Changing the config plan path... 04:54:43 # Place the suites in run order: 04:54:43 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/scripts/set_akka_debug.sh 04:54:43 Executing /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/scripts/set_akka_debug.sh... 04:54:43 Copying config files to ODL Controller folder 04:54:43 Set AKKA/PEKKO debug on 10.30.170.73 04:54:43 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 04:54:44 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 04:54:44 Enable AKKA/PEKKO debug 04:54:44 sed: can't read /tmp/karaf-0.22.1/configuration/initial/pekko.conf: No such file or directory 04:54:44 Dump /tmp/karaf-0.22.1/configuration/initial/pekko.conf 04:54:44 cat: /tmp/karaf-0.22.1/configuration/initial/pekko.conf: No such file or directory 04:54:44 Dump /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:44 ################################################################################ 04:54:44 # 04:54:44 # Licensed to the Apache Software Foundation (ASF) under one or more 04:54:44 # contributor license agreements. See the NOTICE file distributed with 04:54:44 # this work for additional information regarding copyright ownership. 
04:54:44 # The ASF licenses this file to You under the Apache License, Version 2.0 04:54:44 # (the "License"); you may not use this file except in compliance with 04:54:44 # the License. You may obtain a copy of the License at 04:54:44 # 04:54:44 # http://www.apache.org/licenses/LICENSE-2.0 04:54:44 # 04:54:44 # Unless required by applicable law or agreed to in writing, software 04:54:44 # distributed under the License is distributed on an "AS IS" BASIS, 04:54:44 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:54:44 # See the License for the specific language governing permissions and 04:54:44 # limitations under the License. 04:54:44 # 04:54:44 ################################################################################ 04:54:44 04:54:44 # Common pattern layout for appenders 04:54:44 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 04:54:44 04:54:44 # Root logger 04:54:44 log4j2.rootLogger.level = INFO 04:54:44 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 04:54:44 #log4j2.rootLogger.type = asyncRoot 04:54:44 #log4j2.rootLogger.includeLocation = false 04:54:44 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 04:54:44 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 04:54:44 log4j2.rootLogger.appenderRef.Console.ref = Console 04:54:44 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 04:54:44 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 04:54:44 04:54:44 # Filters for logs marked by org.opendaylight.odlparent.Markers 04:54:44 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 04:54:44 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 04:54:44 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 04:54:44 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 04:54:44 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 04:54:44 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 04:54:44 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 04:54:44 04:54:44 # Loggers configuration 04:54:44 04:54:44 # Spifly logger 04:54:44 log4j2.logger.spifly.name = org.apache.aries.spifly 04:54:44 log4j2.logger.spifly.level = WARN 04:54:44 04:54:44 # Security audit logger 04:54:44 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 04:54:44 log4j2.logger.audit.level = INFO 04:54:44 log4j2.logger.audit.additivity = false 04:54:44 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 04:54:44 04:54:44 # Appenders configuration 04:54:44 04:54:44 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 04:54:44 log4j2.appender.console.type = Console 04:54:44 log4j2.appender.console.name = Console 04:54:44 log4j2.appender.console.layout.type = PatternLayout 04:54:44 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 04:54:44 04:54:44 # Rolling file appender 04:54:44 log4j2.appender.rolling.type = RollingRandomAccessFile 04:54:44 log4j2.appender.rolling.name = RollingFile 04:54:44 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 04:54:44 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 04:54:44 # uncomment to not force a disk flush 04:54:44 
#log4j2.appender.rolling.immediateFlush = false 04:54:44 log4j2.appender.rolling.append = true 04:54:44 log4j2.appender.rolling.layout.type = PatternLayout 04:54:44 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 04:54:44 log4j2.appender.rolling.policies.type = Policies 04:54:44 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 04:54:44 log4j2.appender.rolling.policies.size.size = 1GB 04:54:44 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 04:54:44 log4j2.appender.rolling.strategy.max = 7 04:54:44 04:54:44 # Audit file appender 04:54:44 log4j2.appender.audit.type = RollingRandomAccessFile 04:54:44 log4j2.appender.audit.name = AuditRollingFile 04:54:44 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 04:54:44 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 04:54:44 log4j2.appender.audit.append = true 04:54:44 log4j2.appender.audit.layout.type = PatternLayout 04:54:44 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 04:54:44 log4j2.appender.audit.policies.type = Policies 04:54:44 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 04:54:44 log4j2.appender.audit.policies.size.size = 8MB 04:54:44 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 04:54:44 log4j2.appender.audit.strategy.max = 7 04:54:44 04:54:44 # OSGi appender 04:54:44 log4j2.appender.osgi.type = PaxOsgi 04:54:44 log4j2.appender.osgi.name = PaxOsgi 04:54:44 log4j2.appender.osgi.filter = * 04:54:44 04:54:44 # help with identification of maven-related problems with pax-url-aether 04:54:44 #log4j2.logger.aether.name = shaded.org.eclipse.aether 04:54:44 #log4j2.logger.aether.level = TRACE 04:54:44 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 04:54:44 #log4j2.logger.http-headers.level = DEBUG 04:54:44 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 04:54:44 #log4j2.logger.maven.level = TRACE 04:54:44 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 04:54:44 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 04:54:44 log4j2.logger.cluster.name=akka.cluster 04:54:44 log4j2.logger.cluster.level=DEBUG 04:54:44 log4j2.logger.remote.name=akka.remote 04:54:44 log4j2.logger.remote.level=DEBUG 04:54:44 Set AKKA/PEKKO debug on 10.30.171.201 04:54:44 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 04:54:44 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 04:54:45 Enable AKKA/PEKKO debug 04:54:45 sed: can't read /tmp/karaf-0.22.1/configuration/initial/pekko.conf: No such file or directory 04:54:45 Dump /tmp/karaf-0.22.1/configuration/initial/pekko.conf 04:54:45 cat: /tmp/karaf-0.22.1/configuration/initial/pekko.conf: No such file or directory 04:54:45 Dump /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:45 ################################################################################ 04:54:45 # 04:54:45 # Licensed to the Apache Software Foundation (ASF) under one or more 04:54:45 # contributor license agreements. See the NOTICE file distributed with 04:54:45 # this work for additional information regarding copyright ownership. 04:54:45 # The ASF licenses this file to You under the Apache License, Version 2.0 04:54:45 # (the "License"); you may not use this file except in compliance with 04:54:45 # the License. 
You may obtain a copy of the License at 04:54:45 # 04:54:45 # http://www.apache.org/licenses/LICENSE-2.0 04:54:45 # 04:54:45 # Unless required by applicable law or agreed to in writing, software 04:54:45 # distributed under the License is distributed on an "AS IS" BASIS, 04:54:45 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:54:45 # See the License for the specific language governing permissions and 04:54:45 # limitations under the License. 04:54:45 # 04:54:45 ################################################################################ 04:54:45 04:54:45 # Common pattern layout for appenders 04:54:45 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 04:54:45 04:54:45 # Root logger 04:54:45 log4j2.rootLogger.level = INFO 04:54:45 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 04:54:45 #log4j2.rootLogger.type = asyncRoot 04:54:45 #log4j2.rootLogger.includeLocation = false 04:54:45 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 04:54:45 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 04:54:45 log4j2.rootLogger.appenderRef.Console.ref = Console 04:54:45 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 04:54:45 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 04:54:45 04:54:45 # Filters for logs marked by org.opendaylight.odlparent.Markers 04:54:45 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 04:54:45 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 04:54:45 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 04:54:45 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 04:54:45 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 04:54:45 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 04:54:45 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 04:54:45 04:54:45 # Loggers configuration 04:54:45 04:54:45 # Spifly logger 04:54:45 log4j2.logger.spifly.name = org.apache.aries.spifly 04:54:45 log4j2.logger.spifly.level = WARN 04:54:45 04:54:45 # Security audit logger 04:54:45 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 04:54:45 log4j2.logger.audit.level = INFO 04:54:45 log4j2.logger.audit.additivity = false 04:54:45 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 04:54:45 04:54:45 # Appenders configuration 04:54:45 04:54:45 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 04:54:45 log4j2.appender.console.type = Console 04:54:45 log4j2.appender.console.name = Console 04:54:45 log4j2.appender.console.layout.type = PatternLayout 04:54:45 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 04:54:45 04:54:45 # Rolling file appender 04:54:45 log4j2.appender.rolling.type = RollingRandomAccessFile 04:54:45 log4j2.appender.rolling.name = RollingFile 04:54:45 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 04:54:45 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 04:54:45 # uncomment to not force a disk flush 04:54:45 #log4j2.appender.rolling.immediateFlush = false 04:54:45 log4j2.appender.rolling.append = true 04:54:45 log4j2.appender.rolling.layout.type = PatternLayout 04:54:45 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 
04:54:45 log4j2.appender.rolling.policies.type = Policies 04:54:45 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 04:54:45 log4j2.appender.rolling.policies.size.size = 1GB 04:54:45 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 04:54:45 log4j2.appender.rolling.strategy.max = 7 04:54:45 04:54:45 # Audit file appender 04:54:45 log4j2.appender.audit.type = RollingRandomAccessFile 04:54:45 log4j2.appender.audit.name = AuditRollingFile 04:54:45 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 04:54:45 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 04:54:45 log4j2.appender.audit.append = true 04:54:45 log4j2.appender.audit.layout.type = PatternLayout 04:54:45 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 04:54:45 log4j2.appender.audit.policies.type = Policies 04:54:45 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 04:54:45 log4j2.appender.audit.policies.size.size = 8MB 04:54:45 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 04:54:45 log4j2.appender.audit.strategy.max = 7 04:54:45 04:54:45 # OSGi appender 04:54:45 log4j2.appender.osgi.type = PaxOsgi 04:54:45 log4j2.appender.osgi.name = PaxOsgi 04:54:45 log4j2.appender.osgi.filter = * 04:54:45 04:54:45 # help with identification of maven-related problems with pax-url-aether 04:54:45 #log4j2.logger.aether.name = shaded.org.eclipse.aether 04:54:45 #log4j2.logger.aether.level = TRACE 04:54:45 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 04:54:45 #log4j2.logger.http-headers.level = DEBUG 04:54:45 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 04:54:45 #log4j2.logger.maven.level = TRACE 04:54:45 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 04:54:45 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 04:54:45 log4j2.logger.cluster.name=akka.cluster 04:54:45 log4j2.logger.cluster.level=DEBUG 04:54:45 log4j2.logger.remote.name=akka.remote 04:54:45 log4j2.logger.remote.level=DEBUG 04:54:45 Set AKKA/PEKKO debug on 10.30.170.175 04:54:45 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 04:54:45 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 04:54:45 Enable AKKA/PEKKO debug 04:54:45 sed: can't read /tmp/karaf-0.22.1/configuration/initial/pekko.conf: No such file or directory 04:54:45 Dump /tmp/karaf-0.22.1/configuration/initial/pekko.conf 04:54:45 cat: /tmp/karaf-0.22.1/configuration/initial/pekko.conf: No such file or directory 04:54:45 Dump /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:45 ################################################################################ 04:54:45 # 04:54:45 # Licensed to the Apache Software Foundation (ASF) under one or more 04:54:45 # contributor license agreements. See the NOTICE file distributed with 04:54:45 # this work for additional information regarding copyright ownership. 04:54:45 # The ASF licenses this file to You under the Apache License, Version 2.0 04:54:45 # (the "License"); you may not use this file except in compliance with 04:54:45 # the License. 
You may obtain a copy of the License at 04:54:45 # 04:54:45 # http://www.apache.org/licenses/LICENSE-2.0 04:54:45 # 04:54:45 # Unless required by applicable law or agreed to in writing, software 04:54:45 # distributed under the License is distributed on an "AS IS" BASIS, 04:54:45 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:54:45 # See the License for the specific language governing permissions and 04:54:45 # limitations under the License. 04:54:45 # 04:54:45 ################################################################################ 04:54:45 04:54:45 # Common pattern layout for appenders 04:54:45 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 04:54:45 04:54:45 # Root logger 04:54:45 log4j2.rootLogger.level = INFO 04:54:45 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 04:54:45 #log4j2.rootLogger.type = asyncRoot 04:54:45 #log4j2.rootLogger.includeLocation = false 04:54:45 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 04:54:45 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 04:54:45 log4j2.rootLogger.appenderRef.Console.ref = Console 04:54:45 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 04:54:45 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 04:54:45 04:54:45 # Filters for logs marked by org.opendaylight.odlparent.Markers 04:54:45 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 04:54:45 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 04:54:45 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 04:54:45 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 04:54:45 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 04:54:45 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 04:54:45 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 04:54:45 04:54:45 # Loggers configuration 04:54:45 04:54:45 # Spifly logger 04:54:45 log4j2.logger.spifly.name = org.apache.aries.spifly 04:54:45 log4j2.logger.spifly.level = WARN 04:54:45 04:54:45 # Security audit logger 04:54:45 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 04:54:45 log4j2.logger.audit.level = INFO 04:54:45 log4j2.logger.audit.additivity = false 04:54:45 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 04:54:45 04:54:45 # Appenders configuration 04:54:45 04:54:45 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 04:54:45 log4j2.appender.console.type = Console 04:54:45 log4j2.appender.console.name = Console 04:54:45 log4j2.appender.console.layout.type = PatternLayout 04:54:45 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 04:54:45 04:54:45 # Rolling file appender 04:54:45 log4j2.appender.rolling.type = RollingRandomAccessFile 04:54:45 log4j2.appender.rolling.name = RollingFile 04:54:45 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 04:54:45 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 04:54:45 # uncomment to not force a disk flush 04:54:45 #log4j2.appender.rolling.immediateFlush = false 04:54:45 log4j2.appender.rolling.append = true 04:54:45 log4j2.appender.rolling.layout.type = PatternLayout 04:54:45 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 
04:54:45 log4j2.appender.rolling.policies.type = Policies 04:54:45 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 04:54:45 log4j2.appender.rolling.policies.size.size = 1GB 04:54:45 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 04:54:45 log4j2.appender.rolling.strategy.max = 7 04:54:45 04:54:45 # Audit file appender 04:54:45 log4j2.appender.audit.type = RollingRandomAccessFile 04:54:45 log4j2.appender.audit.name = AuditRollingFile 04:54:45 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 04:54:45 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 04:54:45 log4j2.appender.audit.append = true 04:54:45 log4j2.appender.audit.layout.type = PatternLayout 04:54:45 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 04:54:45 log4j2.appender.audit.policies.type = Policies 04:54:45 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 04:54:45 log4j2.appender.audit.policies.size.size = 8MB 04:54:45 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 04:54:45 log4j2.appender.audit.strategy.max = 7 04:54:45 04:54:45 # OSGi appender 04:54:45 log4j2.appender.osgi.type = PaxOsgi 04:54:45 log4j2.appender.osgi.name = PaxOsgi 04:54:45 log4j2.appender.osgi.filter = * 04:54:45 04:54:45 # help with identification of maven-related problems with pax-url-aether 04:54:45 #log4j2.logger.aether.name = shaded.org.eclipse.aether 04:54:45 #log4j2.logger.aether.level = TRACE 04:54:45 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 04:54:45 #log4j2.logger.http-headers.level = DEBUG 04:54:45 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 04:54:45 #log4j2.logger.maven.level = TRACE 04:54:45 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 04:54:45 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 04:54:45 log4j2.logger.cluster.name=akka.cluster 04:54:45 log4j2.logger.cluster.level=DEBUG 04:54:45 log4j2.logger.remote.name=akka.remote 04:54:45 log4j2.logger.remote.level=DEBUG 04:54:45 Finished running config plans 04:54:45 Starting member-1 with IP address 10.30.170.73 04:54:45 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 04:54:45 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 04:54:45 Redirecting karaf console output to karaf_console.log 04:54:45 Starting controller... 04:54:45 start: Redirecting Karaf output to /tmp/karaf-0.22.1/data/log/karaf_console.log 04:54:45 Starting member-2 with IP address 10.30.171.201 04:54:46 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 04:54:46 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 04:54:46 Redirecting karaf console output to karaf_console.log 04:54:46 Starting controller... 04:54:46 start: Redirecting Karaf output to /tmp/karaf-0.22.1/data/log/karaf_console.log 04:54:46 Starting member-3 with IP address 10.30.170.175 04:54:46 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 04:54:46 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 04:54:46 Redirecting karaf console output to karaf_console.log 04:54:46 Starting controller... 
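The set_akka_debug.sh step above could not edit configuration/initial/pekko.conf (the file does not exist in this karaf-0.22.1 layout), so only the pax-logging lines were appended, and they target the legacy akka.cluster / akka.remote logger names. A hedged sketch of that append step which also covers the Pekko package names; everything beyond the four akka.* lines visible in the dumps above is an assumption (in particular the org.apache.pekko.* logger names), not taken from the job scripts:

#!/bin/bash
# Sketch: enable cluster/remoting DEBUG logging via pax-logging, covering both
# the legacy akka.* logger names (as appended above) and assumed Pekko equivalents.
set -eu

LOGCONF="/tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg"

[ -f "${LOGCONF}" ] || { echo "pax-logging config not found: ${LOGCONF}" >&2; exit 1; }

cat >> "${LOGCONF}" <<'EOF'
log4j2.logger.cluster.name = akka.cluster
log4j2.logger.cluster.level = DEBUG
log4j2.logger.remote.name = akka.remote
log4j2.logger.remote.level = DEBUG
# Assumed Pekko logger names; the fork logs under the org.apache.pekko package.
log4j2.logger.pekkocluster.name = org.apache.pekko.cluster
log4j2.logger.pekkocluster.level = DEBUG
log4j2.logger.pekkoremote.name = org.apache.pekko.remote
log4j2.logger.pekkoremote.level = DEBUG
EOF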
04:54:46 start: Redirecting Karaf output to /tmp/karaf-0.22.1/data/log/karaf_console.log 04:54:46 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins154459794594816838.sh 04:54:46 common-functions.sh is being sourced 04:54:46 common-functions environment: 04:54:46 MAVENCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 04:54:46 ACTUALFEATURES: 04:54:46 FEATURESCONF: /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 04:54:46 CUSTOMPROP: /tmp/karaf-0.22.1/etc/custom.properties 04:54:46 LOGCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 04:54:46 MEMCONF: /tmp/karaf-0.22.1/bin/setenv 04:54:46 CONTROLLERMEM: 2048m 04:54:46 AKKACONF: /tmp/karaf-0.22.1/configuration/initial/pekko.conf 04:54:46 MODULESCONF: /tmp/karaf-0.22.1/configuration/initial/modules.conf 04:54:46 MODULESHARDSCONF: /tmp/karaf-0.22.1/configuration/initial/module-shards.conf 04:54:46 SUITES: 04:54:46 04:54:46 + echo '#################################################' 04:54:46 ################################################# 04:54:46 + echo '## Verify Cluster is UP ##' 04:54:46 ## Verify Cluster is UP ## 04:54:46 + echo '#################################################' 04:54:46 ################################################# 04:54:46 + create_post_startup_script 04:54:46 + cat 04:54:46 + copy_and_run_post_startup_script 04:54:46 + seed_index=1 04:54:46 ++ seq 1 3 04:54:46 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 04:54:46 + CONTROLLERIP=ODL_SYSTEM_1_IP 04:54:46 + echo 'Execute the post startup script on controller 10.30.170.73' 04:54:46 Execute the post startup script on controller 10.30.170.73 04:54:46 + scp /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/post-startup-script.sh 10.30.170.73:/tmp/ 04:54:46 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 04:54:47 + ssh 10.30.170.73 'bash /tmp/post-startup-script.sh 1' 04:54:47 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 04:54:47 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:54:52 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:54:57 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:55:02 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:55:07 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:55:12 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:55:17 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:55:22 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:55:27 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:55:32 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:55:37 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:55:42 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:55:47 Waiting up to 3 minutes for controller to come up, checking every 5 seconds... 04:55:52 2025-09-13T04:55:03,883 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 04:55:52 Controller is UP 04:55:52 2025-09-13T04:55:03,883 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 04:55:52 Listing all open ports on controller system... 
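The post-startup script polls each member until the infrautils "System ready" marker shows up in karaf.log; the repeated "netstat: command not found" lines come only from the port-listing part of that loop, not from the readiness check itself. A minimal sketch of the same wait pattern, assuming the default data/log location; the timeout and interval mirror the "up to 3 minutes ... every 5 seconds" message above, the rest is illustrative:

#!/bin/bash
# Sketch: wait until the controller reports readiness, otherwise fail fast.
set -eu

KARAF_LOG="/tmp/karaf-0.22.1/data/log/karaf.log"
MAX_WAIT=180   # seconds, matching "up to 3 minutes"
INTERVAL=5     # seconds, matching "checking every 5 seconds"

elapsed=0
until grep -q "System ready" "${KARAF_LOG}" 2>/dev/null; do
    if [ "${elapsed}" -ge "${MAX_WAIT}" ]; then
        echo "Controller did not come up within ${MAX_WAIT}s" >&2
        exit 1
    fi
    sleep "${INTERVAL}"
    elapsed=$((elapsed + INTERVAL))
done
echo "Controller is UP"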
04:55:52 /tmp/post-startup-script.sh: line 51: netstat: command not found 04:55:52 looking for "BindException: Address already in use" in log file 04:55:52 looking for "server is unhealthy" in log file 04:55:52 + '[' 1 == 0 ']' 04:55:52 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 04:55:52 + CONTROLLERIP=ODL_SYSTEM_2_IP 04:55:52 + echo 'Execute the post startup script on controller 10.30.171.201' 04:55:52 Execute the post startup script on controller 10.30.171.201 04:55:52 + scp /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/post-startup-script.sh 10.30.171.201:/tmp/ 04:55:52 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 04:55:52 + ssh 10.30.171.201 'bash /tmp/post-startup-script.sh 2' 04:55:53 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 04:55:53 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:55:58 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:56:03 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:56:08 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:56:13 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:56:18 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:56:23 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:56:28 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:56:33 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:56:38 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:56:43 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:56:48 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:56:53 Waiting up to 3 minutes for controller to come up, checking every 5 seconds... 04:56:58 2025-09-13T04:55:07,187 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 04:56:58 Controller is UP 04:56:58 2025-09-13T04:55:07,187 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 04:56:58 Listing all open ports on controller system... 04:56:58 /tmp/post-startup-script.sh: line 51: netstat: command not found 04:56:58 looking for "BindException: Address already in use" in log file 04:56:58 looking for "server is unhealthy" in log file 04:56:58 + '[' 2 == 0 ']' 04:56:58 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 04:56:58 + CONTROLLERIP=ODL_SYSTEM_3_IP 04:56:58 + echo 'Execute the post startup script on controller 10.30.170.175' 04:56:58 Execute the post startup script on controller 10.30.170.175 04:56:58 + scp /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/post-startup-script.sh 10.30.170.175:/tmp/ 04:56:58 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 04:56:58 + ssh 10.30.170.175 'bash /tmp/post-startup-script.sh 3' 04:56:58 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 
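After the readiness wait, the script scans karaf.log for two known failure signatures ("BindException: Address already in use" and "server is unhealthy") and tries to list open ports, which fails because net-tools is not installed on these VMs. A sketch of that check using ss from iproute2 in place of netstat; the structure is illustrative, and only the grep patterns and log path are taken from the output above:

#!/bin/bash
# Sketch: post-startup health checks against the Karaf log, with ss replacing
# the missing netstat for the port listing.
set -eu

KARAF_LOG="/tmp/karaf-0.22.1/data/log/karaf.log"

echo "Listing all open ports on controller system..."
ss -tunap 2>/dev/null || echo "ss not available either; skipping port listing" >&2

for pattern in "BindException: Address already in use" "server is unhealthy"; do
    echo "looking for \"${pattern}\" in log file"
    if grep -q "${pattern}" "${KARAF_LOG}"; then
        echo "Failure signature found: ${pattern}" >&2
        exit 1
    fi
done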
04:56:58 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:57:03 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:57:08 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:57:13 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:57:18 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:57:23 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:57:28 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:57:33 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:57:38 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:57:44 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:57:49 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:57:54 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:57:59 Waiting up to 3 minutes for controller to come up, checking every 5 seconds... 04:58:04 2025-09-13T04:55:05,100 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 04:58:04 Controller is UP 04:58:04 2025-09-13T04:55:05,100 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 04:58:04 Listing all open ports on controller system... 04:58:04 /tmp/post-startup-script.sh: line 51: netstat: command not found 04:58:04 looking for "BindException: Address already in use" in log file 04:58:04 looking for "server is unhealthy" in log file 04:58:04 + '[' 0 == 0 ']' 04:58:04 + seed_index=1 04:58:04 + dump_controller_threads 04:58:04 ++ seq 1 3 04:58:04 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 04:58:04 + CONTROLLERIP=ODL_SYSTEM_1_IP 04:58:04 + echo 'Let'\''s take the karaf thread dump' 04:58:04 Let's take the karaf thread dump 04:58:04 + ssh 10.30.170.73 'sudo ps aux' 04:58:04 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 04:58:04 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/ps_before.log 04:58:04 ++ grep -v grep 04:58:04 ++ cut -f2 '-d ' 04:58:04 ++ tr -s ' ' 04:58:04 + pid=2124 04:58:04 + echo 'karaf main: org.apache.karaf.main.Main, pid:2124' 04:58:04 karaf main: org.apache.karaf.main.Main, pid:2124 04:58:04 + ssh 10.30.170.73 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 2124' 04:58:04 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 04:58:05 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 04:58:05 + CONTROLLERIP=ODL_SYSTEM_2_IP 04:58:05 + echo 'Let'\''s take the karaf thread dump' 04:58:05 Let's take the karaf thread dump 04:58:05 + ssh 10.30.171.201 'sudo ps aux' 04:58:05 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 04:58:05 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/ps_before.log 04:58:05 ++ grep -v grep 04:58:05 ++ tr -s ' ' 04:58:05 ++ cut -f2 '-d ' 04:58:05 + pid=2122 04:58:05 + echo 'karaf main: org.apache.karaf.main.Main, pid:2122' 04:58:05 karaf main: org.apache.karaf.main.Main, pid:2122 04:58:05 + ssh 10.30.171.201 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 2122' 04:58:05 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 
04:58:06 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 04:58:06 + CONTROLLERIP=ODL_SYSTEM_3_IP 04:58:06 + echo 'Let'\''s take the karaf thread dump' 04:58:06 Let's take the karaf thread dump 04:58:06 + ssh 10.30.170.175 'sudo ps aux' 04:58:06 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 04:58:06 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/ps_before.log 04:58:06 ++ grep -v grep 04:58:06 ++ tr -s ' ' 04:58:06 ++ cut -f2 '-d ' 04:58:06 + pid=2119 04:58:06 + echo 'karaf main: org.apache.karaf.main.Main, pid:2119' 04:58:06 karaf main: org.apache.karaf.main.Main, pid:2119 04:58:06 + ssh 10.30.170.175 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 2119' 04:58:06 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 04:58:07 + '[' 0 -gt 0 ']' 04:58:07 + echo 'Generating controller variables...' 04:58:07 Generating controller variables... 04:58:07 ++ seq 1 3 04:58:07 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 04:58:07 + CONTROLLERIP=ODL_SYSTEM_1_IP 04:58:07 + odl_variables=' -v ODL_SYSTEM_1_IP:10.30.170.73' 04:58:07 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 04:58:07 + CONTROLLERIP=ODL_SYSTEM_2_IP 04:58:07 + odl_variables=' -v ODL_SYSTEM_1_IP:10.30.170.73 -v ODL_SYSTEM_2_IP:10.30.171.201' 04:58:07 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 04:58:07 + CONTROLLERIP=ODL_SYSTEM_3_IP 04:58:07 + odl_variables=' -v ODL_SYSTEM_1_IP:10.30.170.73 -v ODL_SYSTEM_2_IP:10.30.171.201 -v ODL_SYSTEM_3_IP:10.30.170.175' 04:58:07 + echo 'Generating mininet variables...' 04:58:07 Generating mininet variables... 04:58:07 ++ seq 1 1 04:58:07 + for i in $(seq 1 "${NUM_TOOLS_SYSTEM}") 04:58:07 + MININETIP=TOOLS_SYSTEM_1_IP 04:58:07 + tools_variables=' -v TOOLS_SYSTEM_1_IP:10.30.171.2' 04:58:07 + get_test_suites SUITES 04:58:07 + local __suite_list=SUITES 04:58:07 + echo 'Locating test plan to use...' 04:58:07 Locating test plan to use... 04:58:07 + testplan_filepath=/w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/testplans/openflowplugin-clustering-titanium.txt 04:58:07 + '[' '!' -f /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/testplans/openflowplugin-clustering-titanium.txt ']' 04:58:07 + testplan_filepath=/w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/testplans/openflowplugin-clustering.txt 04:58:07 + '[' disabled '!=' disabled ']' 04:58:07 + echo 'Changing the testplan path...' 04:58:07 Changing the testplan path... 
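The "karaf thread dump" pass above captures a jstack of the org.apache.karaf.main.Main process on every member before the tests start, deriving the PID from a saved ps aux listing. A condensed sketch of that pattern, assuming the same JDK path and that pgrep is available on the remote hosts (the actual job greps ps_before.log rather than calling pgrep):

#!/bin/bash
# Sketch: capture a jstack thread dump of the Karaf process on each cluster member.
set -eu

JAVA_HOME="/usr/lib/jvm/java-21-openjdk-amd64"
ODL_IPS=("10.30.170.73" "10.30.171.201" "10.30.170.175")

for ip in "${ODL_IPS[@]}"; do
    # Locate the Karaf main process; pgrep -f stands in for the job's
    # grep over a saved 'ps aux' listing.
    pid=$(ssh "${ip}" "pgrep -f org.apache.karaf.main.Main | head -n1")
    echo "karaf main: org.apache.karaf.main.Main, pid:${pid} on ${ip}"
    ssh "${ip}" "${JAVA_HOME}/bin/jstack -l ${pid}" > "jstack_${ip}.log"
done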
04:58:07 + sed s:integration:/w/workspace/openflowplugin-csit-3node-clustering-only-titanium: /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/testplans/openflowplugin-clustering.txt 04:58:07 + cat testplan.txt 04:58:07 # Place the suites in run order: 04:58:07 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot 04:58:07 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot 04:58:07 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot 04:58:07 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot 04:58:07 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot 04:58:07 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot 04:58:07 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot 04:58:07 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot 04:58:07 + '[' -z '' ']' 04:58:07 ++ grep -E -v '(^[[:space:]]*#|^[[:space:]]*$)' testplan.txt 04:58:07 ++ tr '\012' ' ' 04:58:07 + suite_list='/w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot ' 04:58:07 + eval 'SUITES='\''/w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot 
/w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot '\''' 04:58:07 ++ SUITES='/w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot ' 04:58:07 + echo 'Starting Robot test suites /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot ...' 
04:58:07 Starting Robot test suites /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot ... 04:58:07 + robot -N openflowplugin-clustering.txt --removekeywords wuks -e exclude -e skip_if_titanium -v BUNDLEFOLDER:karaf-0.22.1 -v BUNDLE_URL:https://nexus.opendaylight.org/content/repositories//autorelease-9182/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip -v CONTROLLER:10.30.170.73 -v CONTROLLER1:10.30.171.201 -v CONTROLLER2:10.30.170.175 -v CONTROLLER_USER:jenkins -v JAVA_HOME:/usr/lib/jvm/java-21-openjdk-amd64 -v JDKVERSION:openjdk21 -v JENKINS_WORKSPACE:/w/workspace/openflowplugin-csit-3node-clustering-only-titanium -v MININET:10.30.171.2 -v MININET1: -v MININET2: -v MININET_USER:jenkins -v NEXUSURL_PREFIX:https://nexus.opendaylight.org -v NUM_ODL_SYSTEM:3 -v NUM_TOOLS_SYSTEM:1 -v ODL_STREAM:titanium -v ODL_SYSTEM_IP:10.30.170.73 -v ODL_SYSTEM_1_IP:10.30.170.73 -v ODL_SYSTEM_2_IP:10.30.171.201 -v ODL_SYSTEM_3_IP:10.30.170.175 -v ODL_SYSTEM_USER:jenkins -v TOOLS_SYSTEM_IP:10.30.171.2 -v TOOLS_SYSTEM_1_IP:10.30.171.2 -v TOOLS_SYSTEM_USER:jenkins -v USER_HOME:/home/jenkins -v IS_KARAF_APPL:True -v WORKSPACE:/tmp -v ODL_OF_PLUGIN:lithium /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot 04:58:07 ============================================================================== 04:58:07 openflowplugin-clustering.txt 04:58:07 
============================================================================== 04:58:08 openflowplugin-clustering.txt.Cluster HA Owner Failover :: Test suite for C... 04:58:08 ============================================================================== 04:58:12 Check Shards Status Before Fail :: Check Status for all shards in ... | FAIL | 04:58:15 Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttributes(ReadHandler.java:221)\\n\\tat org.jolokia.handler.ReadHandler.fetchAttributes(ReadHa... 04:58:15 [ Message content over the limit has been removed. 
] 04:58:15 ...rvice.jetty.internal.PrioritizedHandlerCollection.handle(PrioritizedHandlerCollection.java:96)\\n\\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 183 (char 598) 04:58:15 ------------------------------------------------------------------------------ 04:58:15 Start Mininet Multiple Connections :: Start mininet tree,2 with co... | PASS | 04:58:25 ------------------------------------------------------------------------------ 04:58:25 Check Entity Owner Status And Find Owner and Successor Before Fail... | FAIL | 04:58:56 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 04:58:56 Lengths are different: 2 != 0 04:58:56 ------------------------------------------------------------------------------ 04:58:56 Reconnect Extra Switches To Successors And Check OVS Connections :... | FAIL | 04:58:56 Variable '@{original_successor_list}' not found. 04:58:56 ------------------------------------------------------------------------------ 04:58:56 Check Network Operational Information Before Fail :: Check devices... | FAIL | 04:59:02 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 5 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}]}]}}' contains 'openflow:1' 11 times, not 21 times. 04:59:02 ------------------------------------------------------------------------------ 04:59:02 Add Configuration In Owner and Verify Before Fail :: Add Flow in O... | FAIL | 04:59:02 Variable '${original_owner}' not found. 04:59:02 ------------------------------------------------------------------------------ 04:59:02 Modify Configuration In Owner and Verify Before Fail :: Modify Flo... | FAIL | 04:59:02 Variable '${original_owner}' not found. 04:59:02 ------------------------------------------------------------------------------ 04:59:02 Delete Configuration In Owner and Verify Before Fail :: Delete Flo... | FAIL | 04:59:02 Variable '${original_owner}' not found. 04:59:02 ------------------------------------------------------------------------------ 04:59:02 Add Configuration In Successor and Verify Before Fail :: Add Flow ... | FAIL | 04:59:02 Variable '${original_successor}' not found. 
04:59:02 ------------------------------------------------------------------------------ 04:59:02 Modify Configuration In Successor and Verify Before Fail :: Modify... | FAIL | 04:59:02 Variable '${original_successor}' not found. 04:59:02 ------------------------------------------------------------------------------ 04:59:02 Delete Configuration In Successor and Verify Before Fail :: Delete... | FAIL | 04:59:02 Variable '${original_successor}' not found. 04:59:02 ------------------------------------------------------------------------------ 04:59:02 Send RPC Add to Owner and Verify Before Fail :: Add Flow in Owner ... | FAIL | 04:59:02 Variable '${original_owner}' not found. 04:59:02 ------------------------------------------------------------------------------ 04:59:02 Send RPC Delete to Owner and Verify Before Fail :: Delete Flow in ... | FAIL | 04:59:02 Variable '${original_owner}' not found. 04:59:02 ------------------------------------------------------------------------------ 04:59:02 Send RPC Add to Successor and Verify Before Fail :: Add Flow in Su... | FAIL | 04:59:02 Variable '${original_successor}' not found. 04:59:02 ------------------------------------------------------------------------------ 04:59:02 Send RPC Delete to Successor and Verify Before Fail :: Delete Flow... | FAIL | 04:59:02 Variable '${original_successor}' not found. 04:59:02 ------------------------------------------------------------------------------ 04:59:02 Modify Network And Verify Before Fail :: Take a link down and veri... | FAIL | 04:59:23 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 20 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","o
pendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}]}]}}' contains 'openflow:1' 11 times, not 16 times. 04:59:23 ------------------------------------------------------------------------------ 04:59:23 Restore Network And Verify Before Fail :: Take the link up and ver... | FAIL | 04:59:34 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 10 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\'
]/node-connector[id=\'openflow:1:1\']"}]}]}]}}' contains 'openflow:1' 11 times, not 21 times. 04:59:34 ------------------------------------------------------------------------------ 04:59:34 Kill Owner Instance :: Kill Owner Instance and verify it is dead | FAIL | 04:59:34 Variable '${original_owner}' not found. 04:59:34 ------------------------------------------------------------------------------ 04:59:34 Check Shards Status After Fail :: Create original cluster list and... | FAIL | 04:59:34 Variable '${new_cluster_list}' not found. 04:59:34 ------------------------------------------------------------------------------ 04:59:34 Check Entity Owner Status And Find Owner and Successor After Fail ... | FAIL | 04:59:34 Variable '${original_successor}' not found. 04:59:34 ------------------------------------------------------------------------------ 04:59:34 Check Network Operational Information After Fail :: Check devices ... | FAIL | 04:59:34 Variable '${new_cluster_list}' not found. 04:59:34 ------------------------------------------------------------------------------ 04:59:34 Add Configuration In Owner and Verify After Fail :: Add Flow in Ow... | FAIL | 04:59:34 Variable '${new_owner}' not found. 04:59:34 ------------------------------------------------------------------------------ 04:59:34 Modify Configuration In Owner and Verify After Fail :: Modify Flow... | FAIL | 04:59:34 Variable '${new_owner}' not found. 04:59:34 ------------------------------------------------------------------------------ 04:59:34 Delete Configuration In Owner and Verify After Fail :: Delete Flow... | FAIL | 04:59:34 Variable '${new_owner}' not found. 04:59:34 ------------------------------------------------------------------------------ 04:59:34 Add Configuration In Successor and Verify After Fail :: Add Flow i... | FAIL | 04:59:34 Variable '${new_successor}' not found. 04:59:34 ------------------------------------------------------------------------------ 04:59:34 Modify Configuration In Successor and Verify After Fail :: Modify ... | FAIL | 04:59:34 Variable '${new_successor}' not found. 04:59:34 ------------------------------------------------------------------------------ 04:59:34 Delete Configuration In Successor and Verify After Fail :: Delete ... | FAIL | 04:59:34 Variable '${new_successor}' not found. 04:59:34 ------------------------------------------------------------------------------ 04:59:34 Send RPC Add to Owner and Verify After Fail :: Add Flow in Owner a... | FAIL | 04:59:34 Variable '${new_owner}' not found. 04:59:34 ------------------------------------------------------------------------------ 04:59:34 Send RPC Delete to Owner and Verify After Fail :: Delete Flow in O... | FAIL | 04:59:34 Variable '${new_owner}' not found. 04:59:34 ------------------------------------------------------------------------------ 04:59:34 Send RPC Add to Successor and Verify After Fail :: Add Flow in Suc... | FAIL | 04:59:34 Variable '${new_successor}' not found. 04:59:34 ------------------------------------------------------------------------------ 04:59:34 Send RPC Delete to Successor and Verify After Fail :: Delete Flow ... | FAIL | 04:59:34 Variable '${new_successor}' not found. 04:59:34 ------------------------------------------------------------------------------ 04:59:34 Modify Network and Verify After Fail :: Take a link down and verif... | FAIL | 04:59:34 Variable '${new_cluster_list}' not found. 
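Most of the failures that follow are not independent: ${original_owner}, ${original_successor}, ${new_owner} and ${new_cluster_list} are suite variables that only the "Check Entity Owner Status And Find Owner and Successor" test sets, so once that test fails, every later test that references them aborts at its first step with "Variable ... not found" instead of producing a functional result. The shape of that cascade, as an illustrative Python sketch only (names and values invented, not the suite's actual keywords):

    # Illustration of the failure cascade, not the suite's real code.
    suite_vars: dict[str, int] = {}

    def find_owner_and_successors() -> None:
        successors: list[int] = []  # discovery came back empty, as in the log
        if sorted(successors) != [2, 3]:
            raise AssertionError(f"Successor list {successors} is not the same as expected [2, 3]")
        suite_vars["original_owner"] = 1          # only set on success
        suite_vars["original_successor"] = successors[0]

    def add_flow_in_owner() -> None:
        # A missing key here plays the role of Robot's "Variable '${original_owner}' not found".
        owner = suite_vars["original_owner"]
        print(f"would add the flow through member {owner}")

    try:
        find_owner_and_successors()
    except AssertionError as err:
        print(f"discovery test failed: {err}")

    try:
        add_flow_in_owner()
    except KeyError as missing:
        print(f"dependent test aborts immediately; suite variable {missing} was never set")
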
04:59:34 ------------------------------------------------------------------------------ 04:59:34 Restore Network and Verify After Fail :: Take the link up and veri... | FAIL | 04:59:34 Variable '${new_cluster_list}' not found. 04:59:34 ------------------------------------------------------------------------------ 04:59:34 Start Old Owner Instance :: Start old Owner Instance and verify it... | FAIL | 04:59:34 This test fails due to https://jira.opendaylight.org/browse/CONTROLLER-1849 04:59:34 04:59:34 Variable '${original_owner}' not found. 04:59:34 ------------------------------------------------------------------------------ 04:59:34 Check Shards Status After Recover :: Create original cluster list ... | FAIL | 05:01:05 Keyword 'ClusterOpenFlow.Check OpenFlow Shards Status' failed after retrying for 1 minute 30 seconds. The last error was: Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.... 05:01:05 [ Message content over the limit has been removed. 
] 05:01:05 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 183 (char 598) 05:01:05 ------------------------------------------------------------------------------ 05:01:05 Check Entity Owner Status After Recover :: Check Entity Owner Stat... | FAIL | 05:01:36 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 05:01:36 Lengths are different: 2 != 0 05:01:36 ------------------------------------------------------------------------------ 05:01:36 Check Network Operational Information After Recover :: Check devic... | FAIL | 05:01:41 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 5 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}]}]}}' contains 'openflow:1' 11 times, not 21 times. 05:01:41 ------------------------------------------------------------------------------ 05:01:41 Add Configuration In Owner and Verify After Recover :: Add Flow in... | FAIL | 05:01:41 Variable '${new_owner}' not found. 05:01:41 ------------------------------------------------------------------------------ 05:01:41 Modify Configuration In Owner and Verify After Recover :: Modify F... | FAIL | 05:01:41 Variable '${new_owner}' not found. 05:01:41 ------------------------------------------------------------------------------ 05:01:41 Delete Configuration In Owner and Verify After Recover :: Delete F... | FAIL | 05:01:41 Variable '${new_owner}' not found. 05:01:41 ------------------------------------------------------------------------------ 05:01:41 Add Configuration In Old Owner and Verify After Recover :: Add Flo... | FAIL | 05:01:41 Variable '${original_owner}' not found. 
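The "Check Shards Status" failures are compounded by a parsing problem: the Jolokia error document quoted above carries raw tab and newline characters inside its "stacktrace" string, and json.loads rejects unescaped control characters in string values by default, which is exactly the "Invalid control character at: line 8 column 183" above. A minimal reproduction with a stand-in payload (not the full response); whether the check should parse with strict=False or simply treat the 404 as retryable is a separate question:

    import json

    # Stand-in for the Jolokia error document: the string value contains a
    # literal tab character, like the quoted stacktrace does.
    payload = '{"error_type": "javax.management.InstanceNotFoundException", "stacktrace": "\tat java.management/..."}'

    try:
        json.loads(payload)
    except json.JSONDecodeError as err:
        print(f"strict parse fails: {err}")  # "Invalid control character at: ..."

    # strict=False lets the decoder accept control characters inside strings.
    document = json.loads(payload, strict=False)
    print(document["error_type"])
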
05:01:41 ------------------------------------------------------------------------------ 05:01:41 Modify Configuration In Old Owner and Verify After Recover :: Modi... | FAIL | 05:01:41 Variable '${original_owner}' not found. 05:01:41 ------------------------------------------------------------------------------ 05:01:41 Delete Configuration In Old Owner and Verify After Recover :: Dele... | FAIL | 05:01:41 Variable '${original_owner}' not found. 05:01:41 ------------------------------------------------------------------------------ 05:01:41 Send RPC Add to Owner and Verify After Recover :: Add Flow in Owne... | FAIL | 05:01:41 Variable '${new_owner}' not found. 05:01:41 ------------------------------------------------------------------------------ 05:01:41 Send RPC Delete to Owner and Verify After Recover :: Delete Flow i... | FAIL | 05:01:41 Variable '${new_owner}' not found. 05:01:41 ------------------------------------------------------------------------------ 05:01:41 Send RPC Add to Old Owner and Verify After Recover :: Add Flow in ... | FAIL | 05:01:41 Variable '${original_owner}' not found. 05:01:41 ------------------------------------------------------------------------------ 05:01:41 Send RPC Delete to Old Owner and Verify After Recover :: Delete Fl... | FAIL | 05:01:41 Variable '${original_owner}' not found. 05:01:41 ------------------------------------------------------------------------------ 05:01:41 Modify Network and Verify After Recover :: Take a link down and ve... | FAIL | 05:02:03 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 20 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inven
tory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}]}]}}' contains 'openflow:1' 11 times, not 16 times. 05:02:03 ------------------------------------------------------------------------------ 05:02:03 Restore Network and Verify After Recover :: Take the link up and v... | FAIL | 05:02:14 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 10 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'open
flow:1:1\']"}]}]}]}}' contains 'openflow:1' 11 times, not 21 times. 05:02:14 ------------------------------------------------------------------------------ 05:02:14 Stop Mininet and Exit :: Stop mininet and exit connection. | PASS | 05:02:16 ------------------------------------------------------------------------------ 05:02:16 Check No Network Operational Information :: Check device is not in... | PASS | 05:02:17 ------------------------------------------------------------------------------ 05:02:17 openflowplugin-clustering.txt.Cluster HA Owner Failover :: Test su... | FAIL | 05:02:17 51 tests, 3 passed, 48 failed 05:02:17 ============================================================================== 05:02:17 openflowplugin-clustering.txt.Cluster HA Owner Restart :: Test suite for Cl... 05:02:17 ============================================================================== 05:02:20 Check Shards Status Before Stop :: Check Status for all shards in ... | FAIL | 05:02:20 Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttributes(ReadHandler.java:221)\\n\\tat org.jolokia.handler.ReadHandler.fetchAttributes(ReadHa... 05:02:20 [ Message content over the limit has been removed. 
] 05:02:20 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 183 (char 598) 05:02:20 ------------------------------------------------------------------------------ 05:02:20 Start Mininet Multiple Connections :: Start mininet tree,2 with co... | PASS | 05:02:30 ------------------------------------------------------------------------------ 05:02:30 Check Entity Owner Status And Find Owner and Successor Before Stop... | FAIL | 05:03:01 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 05:03:01 Lengths are different: 2 != 0 05:03:01 ------------------------------------------------------------------------------ 05:03:01 Reconnect Extra Switches To Successors And Check OVS Connections :... | FAIL | 05:03:01 Variable '@{original_successor_list}' not found. 05:03:01 ------------------------------------------------------------------------------ 05:03:01 Check Network Operational Information Before Stop :: Check devices... | FAIL | 05:03:07 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 5 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}],"link":[{"link-id":"openflow:1:2","source":{"source-node":"openflow:1","source-tp":"openflow:1:2"},"destination":{"dest-tp":"openflow:3:3","dest-node":"openflow:3"}}]}]}}' contains 'openflow:1' 14 times, not 21 times. 05:03:07 ------------------------------------------------------------------------------ 05:03:07 Add Configuration In Owner and Verify Before Stop :: Add Flow in O... | FAIL | 05:03:07 Variable '${original_owner}' not found. 05:03:07 ------------------------------------------------------------------------------ 05:03:07 Modify Configuration In Owner and Verify Before Stop :: Modify Flo... | FAIL | 05:03:07 Variable '${original_owner}' not found. 05:03:07 ------------------------------------------------------------------------------ 05:03:07 Delete Configuration In Owner and Verify Before Stop :: Delete Flo... | FAIL | 05:03:07 Variable '${original_owner}' not found. 
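The condition behind those shard checks is that a Jolokia read of the member-2 inventory-operational shard MBean answers with InstanceNotFoundException and "status": 404, i.e. the shard MBean is not registered on that member at the time of the check. A sketch of that read; the Jolokia /read URL form is standard, but the host, port and credentials below are assumptions, not values from this job:

    import json
    import requests

    # Assumed member address and credentials, for illustration only.
    MEMBER = "http://127.0.0.1:8181"
    AUTH = ("admin", "admin")
    MBEAN = ("org.opendaylight.controller:Category=Shards,"
             "name=member-2-shard-inventory-operational,"
             "type=DistributedOperationalDatastore")

    response = requests.get(f"{MEMBER}/jolokia/read/{MBEAN}", auth=AUTH, timeout=10)
    # Parse with strict=False so an error document containing raw control
    # characters (as seen above) does not break the check itself.
    document = json.loads(response.text, strict=False)

    if document.get("status") == 404:
        # Jolokia reports errors in-band; 404 here means the MBean is not registered.
        raise RuntimeError(document.get("error", "shard MBean not found"))
    print(document["value"])  # full attribute map of the shard MBean
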
05:03:07 ------------------------------------------------------------------------------ 05:03:07 Add Configuration In Successor and Verify Before Stop :: Add Flow ... | FAIL | 05:03:07 Variable '${original_successor}' not found. 05:03:07 ------------------------------------------------------------------------------ 05:03:07 Modify Configuration In Successor and Verify Before Stop :: Modify... | FAIL | 05:03:07 Variable '${original_successor}' not found. 05:03:07 ------------------------------------------------------------------------------ 05:03:07 Delete Configuration In Successor and Verify Before Stop :: Delete... | FAIL | 05:03:07 Variable '${original_successor}' not found. 05:03:07 ------------------------------------------------------------------------------ 05:03:07 Send RPC Add to Owner and Verify Before Stop :: Add Flow in Owner ... | FAIL | 05:03:07 Variable '${original_owner}' not found. 05:03:07 ------------------------------------------------------------------------------ 05:03:07 Send RPC Delete to Owner and Verify Before Stop :: Delete Flow in ... | FAIL | 05:03:07 Variable '${original_owner}' not found. 05:03:07 ------------------------------------------------------------------------------ 05:03:07 Send RPC Add to Successor and Verify Before Stop :: Add Flow in Su... | FAIL | 05:03:07 Variable '${original_successor}' not found. 05:03:07 ------------------------------------------------------------------------------ 05:03:07 Send RPC Delete to Successor and Verify Before Stop :: Delete Flow... | FAIL | 05:03:07 Variable '${original_successor}' not found. 05:03:07 ------------------------------------------------------------------------------ 05:03:07 Modify Network And Verify Before Stop :: Take a link down and veri... | FAIL | 05:03:27 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 20 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}],"link":[{"link-id":"openflow:1:2","source":{"source-node":"openflow:1","source-tp":"openflow:1:2"},"destination":{"dest-tp":"openflow:3:3","dest-node":"openflow:3"}}]}]}}' contains 'openflow:1' 14 times, not 16 times. 05:03:27 ------------------------------------------------------------------------------ 05:03:27 Restore Network And Verify Before Stop :: Take the link up and ver... | FAIL | 05:03:38 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 10 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}],"link":[{"link-id":"openflow:1:2","source":{"source-node":"openflow:1","source-tp":"openflow:1:2"},"destination":{"dest-tp":"openflow:3:3","dest-node":"openflow:3"}}]}]}}' contains 'openflow:1' 14 times, not 21 times. 05:03:38 ------------------------------------------------------------------------------ 05:03:38 Stop Owner Instance :: Stop Owner Instance and verify it is dead | FAIL | 05:03:38 Variable '${original_owner}' not found. 05:03:38 ------------------------------------------------------------------------------ 05:03:38 Check Shards Status After Stop :: Create original cluster list and... | FAIL | 05:03:38 Variable '${new_cluster_list}' not found. 05:03:38 ------------------------------------------------------------------------------ 05:03:38 Check Entity Owner Status And Find Owner and Successor After Stop ... | FAIL | 05:03:38 Variable '${original_successor}' not found. 
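The link-down and link-up checks above expect 16 and 21 occurrences respectively, while the payload keeps yielding 14: the openflow:1 node, its termination points and a single surviving inter-switch link. Counting structurally over the parsed document states the same intent more directly; a small sketch over a trimmed version of the payload quoted above (abbreviated sample, invented helper name, same structure as in the log):

    import json

    # Trimmed version of the operational topology quoted above: one node with
    # two termination points and the single remaining inter-switch link.
    sample = json.loads("""
    {"network-topology:network-topology": {"topology": [{
        "topology-id": "flow:1",
        "node": [{"node-id": "openflow:1",
                  "termination-point": [{"tp-id": "openflow:1:LOCAL"},
                                        {"tp-id": "openflow:1:2"}]}],
        "link": [{"link-id": "openflow:1:2",
                  "source": {"source-node": "openflow:1", "source-tp": "openflow:1:2"},
                  "destination": {"dest-node": "openflow:3", "dest-tp": "openflow:3:3"}}]
    }]}}
    """)

    topology = sample["network-topology:network-topology"]["topology"][0]

    def links_touching(node_id: str) -> int:
        # Count link entries whose source or destination is the given node,
        # rather than counting substring hits in the serialized JSON.
        return sum(
            1
            for link in topology.get("link", [])
            if node_id in (link["source"]["source-node"], link["destination"]["dest-node"])
        )

    print(links_touching("openflow:1"))  # 1 for this trimmed sample
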
05:03:38 ------------------------------------------------------------------------------ 05:03:38 Check Network Operational Information After Stop :: Check devices ... | FAIL | 05:03:38 Variable '${new_cluster_list}' not found. 05:03:38 ------------------------------------------------------------------------------ 05:03:38 Add Configuration In Owner and Verify After Stop :: Add Flow in Ow... | FAIL | 05:03:38 Variable '${new_owner}' not found. 05:03:38 ------------------------------------------------------------------------------ 05:03:38 Modify Configuration In Owner and Verify After Stop :: Modify Flow... | FAIL | 05:03:38 Variable '${new_owner}' not found. 05:03:38 ------------------------------------------------------------------------------ 05:03:38 Delete Configuration In Owner and Verify After Stop :: Delete Flow... | FAIL | 05:03:38 Variable '${new_owner}' not found. 05:03:38 ------------------------------------------------------------------------------ 05:03:38 Add Configuration In Successor and Verify After Stop :: Add Flow i... | FAIL | 05:03:38 Variable '${new_successor}' not found. 05:03:38 ------------------------------------------------------------------------------ 05:03:38 Modify Configuration In Successor and Verify After Stop :: Modify ... | FAIL | 05:03:38 Variable '${new_successor}' not found. 05:03:38 ------------------------------------------------------------------------------ 05:03:38 Delete Configuration In Successor and Verify After Stop :: Delete ... | FAIL | 05:03:38 Variable '${new_successor}' not found. 05:03:38 ------------------------------------------------------------------------------ 05:03:38 Send RPC Add to Owner and Verify After Stop :: Add Flow in Owner a... | FAIL | 05:03:38 Variable '${new_owner}' not found. 05:03:38 ------------------------------------------------------------------------------ 05:03:38 Send RPC Delete to Owner and Verify After Stop :: Delete Flow in O... | FAIL | 05:03:38 Variable '${new_owner}' not found. 05:03:38 ------------------------------------------------------------------------------ 05:03:38 Send RPC Add to Successor and Verify After Stop :: Add Flow in Suc... | FAIL | 05:03:38 Variable '${new_successor}' not found. 05:03:38 ------------------------------------------------------------------------------ 05:03:38 Send RPC Delete to Successor and Verify After Stop :: Delete Flow ... | FAIL | 05:03:38 Variable '${new_successor}' not found. 05:03:38 ------------------------------------------------------------------------------ 05:03:38 Modify Network and Verify After Stop :: Take a link down and verif... | FAIL | 05:03:38 Variable '${new_cluster_list}' not found. 05:03:38 ------------------------------------------------------------------------------ 05:03:38 Restore Network and Verify After Stop :: Take the link up and veri... | FAIL | 05:03:38 Variable '${new_cluster_list}' not found. 05:03:38 ------------------------------------------------------------------------------ 05:03:38 Start Old Owner Instance :: Start old Owner Instance and verify it... | FAIL | 05:03:38 Variable '${original_owner}' not found. 05:03:38 ------------------------------------------------------------------------------ 05:03:38 Check Shards Status After Start :: Create original cluster list an... | FAIL | 05:05:09 Keyword 'ClusterOpenFlow.Check OpenFlow Shards Status' failed after retrying for 1 minute 30 seconds. 
The last error was: Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.... 05:05:09 [ Message content over the limit has been removed. ] 05:05:09 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 183 (char 598) 05:05:09 ------------------------------------------------------------------------------ 05:05:09 Check Entity Owner Status After Start :: Check Entity Owner Status... | FAIL | 05:05:40 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. 
The last error was: Successor list [] is not the came as expected [2, 3] 05:05:40 Lengths are different: 2 != 0 05:05:40 ------------------------------------------------------------------------------ 05:05:40 Check Network Operational Information After Start :: Check devices... | FAIL | 05:05:45 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 5 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}],"link":[{"link-id":"openflow:1:2","source":{"source-node":"openflow:1","source-tp":"openflow:1:2"},"destination":{"dest-tp":"openflow:3:3","dest-node":"openflow:3"}}]}]}}' contains 'openflow:1' 14 times, not 21 times. 05:05:45 ------------------------------------------------------------------------------ 05:05:45 Add Configuration In Owner and Verify After Start :: Add Flow in O... | FAIL | 05:05:45 Variable '${new_owner}' not found. 
05:05:45 ------------------------------------------------------------------------------ 05:05:45 Modify Configuration In Owner and Verify After Start :: Modify Flo... | FAIL | 05:05:45 Variable '${new_owner}' not found. 05:05:45 ------------------------------------------------------------------------------ 05:05:45 Delete Configuration In Owner and Verify After Start :: Delete Flo... | FAIL | 05:05:45 Variable '${new_owner}' not found. 05:05:45 ------------------------------------------------------------------------------ 05:05:45 Add Configuration In Old Owner and Verify After Start :: Add Flow ... | FAIL | 05:05:45 Variable '${original_owner}' not found. 05:05:45 ------------------------------------------------------------------------------ 05:05:45 Modify Configuration In Old Owner and Verify After Start :: Modify... | FAIL | 05:05:45 Variable '${original_owner}' not found. 05:05:45 ------------------------------------------------------------------------------ 05:05:45 Delete Configuration In Old Owner and Verify After Start :: Delete... | FAIL | 05:05:45 Variable '${original_owner}' not found. 05:05:45 ------------------------------------------------------------------------------ 05:05:45 Send RPC Add to Owner and Verify After Start :: Add Flow in Owner ... | FAIL | 05:05:45 Variable '${new_owner}' not found. 05:05:45 ------------------------------------------------------------------------------ 05:05:45 Send RPC Delete to Owner and Verify After Start :: Delete Flow in ... | FAIL | 05:05:45 Variable '${new_owner}' not found. 05:05:45 ------------------------------------------------------------------------------ 05:05:45 Send RPC Add to Old Owner and Verify After Start :: Add Flow in Ow... | FAIL | 05:05:45 Variable '${original_owner}' not found. 05:05:45 ------------------------------------------------------------------------------ 05:05:45 Send RPC Delete to Old Owner and Verify After Start :: Delete Flow... | FAIL | 05:05:45 Variable '${original_owner}' not found. 05:05:45 ------------------------------------------------------------------------------ 05:05:45 Modify Network and Verify After Start :: Take a link down and veri... | FAIL | 05:06:06 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 20 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}],"link":[{"link-id":"openflow:1:2","source":{"source-node":"openflow:1","source-tp":"openflow:1:2"},"destination":{"dest-tp":"openflow:3:3","dest-node":"openflow:3"}}]}]}}' contains 'openflow:1' 14 times, not 16 times. 05:06:06 ------------------------------------------------------------------------------ 05:06:06 Restore Network and Verify After Start :: Take the link up and ver... | FAIL | 05:06:17 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 10 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}],"link":[{"link-id":"openflow:1:2","source":{"source-node":"openflow:1","source-tp":"openflow:1:2"},"destination":{"dest-tp":"openflow:3:3","dest-node":"openflow:3"}}]}]}}' contains 'openflow:1' 14 times, not 21 times. 05:06:17 ------------------------------------------------------------------------------ 05:06:17 Stop Mininet and Exit :: Stop mininet and exit connection. | PASS | 05:06:19 ------------------------------------------------------------------------------ 05:06:19 Check No Network Operational Information :: Check device is not in... | PASS | 05:06:19 ------------------------------------------------------------------------------ 05:06:19 openflowplugin-clustering.txt.Cluster HA Owner Restart :: Test sui... | FAIL | 05:06:19 51 tests, 3 passed, 48 failed 05:06:19 ============================================================================== 05:06:19 openflowplugin-clustering.txt.Cluster HA Data Recovery Leader Follower Fail... 
05:06:19 ============================================================================== 05:06:22 Check Shards Status Before Leader Restart :: Check Status for all ... | FAIL | 05:06:23 Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttributes(ReadHandler.java:221)\\n\\tat org.jolokia.handler.ReadHandler.fetchAttributes(ReadHa... 05:06:23 [ Message content over the limit has been removed. ] 05:06:23 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 183 (char 598) 05:06:23 ------------------------------------------------------------------------------ 05:06:23 Get inventory Leader Before Leader Restart :: Find leader in the i... 
| FAIL | 05:06:34 Keyword 'ClusterManagement.Get_Leader_And_Followers_For_Shard' failed after retrying for 10 seconds. The last error was: Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttr... 05:06:34 [ Message content over the limit has been removed. ] 05:06:34 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 173 (char 568) 05:06:34 ------------------------------------------------------------------------------ 05:06:34 Start Mininet Connect To Follower Node1 :: Start mininet with conn... | FAIL | 05:06:34 Variable '${follower_node_1}' not found. 
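[Note on the recurring "JSONDecodeError: Invalid control character" failures above: they occur when json.loads() is handed a Jolokia 404 error body whose "stacktrace" field carries raw tab/newline characters, which the default strict parser rejects. A minimal standalone sketch of that behaviour, using a made-up stand-in payload rather than the real response:]

    import json

    # Hypothetical Jolokia-style error body: the "stacktrace" value contains a raw
    # tab character, which json.loads() rejects by default.
    payload = '{"error": "InstanceNotFoundException", "stacktrace": "at Foo.bar()\tat Baz.qux()", "status": 404}'

    try:
        json.loads(payload)                       # raises JSONDecodeError: Invalid control character
    except json.JSONDecodeError as exc:
        print("strict parse failed:", exc)

    data = json.loads(payload, strict=False)      # strict=False allows control chars inside strings
    print(data["status"])                         # -> 404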
05:06:34 ------------------------------------------------------------------------------ 05:06:34 Add Flows In Follower Node2 and Verify Before Leader Restart :: Ad... | FAIL | 05:06:35 Variable '${follower_node_2}' not found. 05:06:35 ------------------------------------------------------------------------------ 05:06:35 Stop Mininet Connected To Follower Node1 and Exit :: Stop mininet ... | FAIL | 05:06:35 Variable '${mininet_conn_id}' not found. 05:06:35 ------------------------------------------------------------------------------ 05:06:35 Restart Leader From Cluster Node :: Stop Leader Node and Start it ... | FAIL | 05:06:36 Variable '${inventory_leader}' not found. 05:06:36 ------------------------------------------------------------------------------ 05:06:36 Get inventory Follower After Leader Restart :: Find new Followers ... | FAIL | 05:06:46 Keyword 'ClusterManagement.Get_Leader_And_Followers_For_Shard' failed after retrying for 10 seconds. The last error was: Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttr... 05:06:46 [ Message content over the limit has been removed. 
] 05:06:46 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 173 (char 568) 05:06:46 ------------------------------------------------------------------------------ 05:06:46 Start Mininet Connect To Old Leader :: Start mininet with connecti... | FAIL | 05:06:47 Variable '${inventory_leader_old}' not found. 05:06:47 ------------------------------------------------------------------------------ 05:06:47 Verify Flows In Switch After Leader Restart :: Verify flows are in... | FAIL | 05:07:03 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 15 seconds. The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.170.73:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0?content=nonconfig 05:07:03 ------------------------------------------------------------------------------ 05:07:03 Stop Mininet Connected To Old Leader and Exit :: Stop mininet and ... | FAIL | 05:07:03 Variable '${mininet_conn_id}' not found. 05:07:03 ------------------------------------------------------------------------------ 05:07:03 Restart Follower Node2 :: Stop Follower Node2 and Start it Up, Ver... | FAIL | 05:07:04 Variable '${follower_node_2}' not found. 05:07:04 ------------------------------------------------------------------------------ 05:07:04 Get inventory Follower After Follower Restart :: Find Followers an... | FAIL | 05:07:14 Keyword 'ClusterManagement.Get_Leader_And_Followers_For_Shard' failed after retrying for 10 seconds. 
The last error was: Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttr... 05:07:14 [ Message content over the limit has been removed. ] 05:07:14 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 173 (char 568) 05:07:14 ------------------------------------------------------------------------------ 05:07:14 Start Mininet Connect To Leader :: Start mininet with connection t... | FAIL | 05:07:15 Variable '${inventory_leader}' not found. 05:07:15 ------------------------------------------------------------------------------ 05:07:15 Verify Flows In Switch After Follower Restart :: Verify flows are ... 
| FAIL | 05:07:31 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 15 seconds. The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.170.73:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0?content=nonconfig 05:07:31 ------------------------------------------------------------------------------ 05:07:31 Stop Mininet Connected To Leader and Exit :: Stop mininet Connecte... | FAIL | 05:07:31 Variable '${mininet_conn_id}' not found. 05:07:31 ------------------------------------------------------------------------------ 05:07:31 Restart Full Cluster :: Stop all Cluster Nodes and Start it Up All. | PASS | 05:08:07 ------------------------------------------------------------------------------ 05:08:07 Get inventory Status After Cluster Restart :: Find New Followers a... | FAIL | 05:08:50 Keyword 'ClusterManagement.Get_Leader_And_Followers_For_Shard' failed after retrying for 10 seconds. The last error was: Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttr... 05:08:50 [ Message content over the limit has been removed. 
] 05:08:50 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 173 (char 568) 05:08:50 ------------------------------------------------------------------------------ 05:08:50 Start Mininet Connect To Follower Node2 After Cluster Restart :: S... | FAIL | 05:08:51 Variable '${follower_node_2}' not found. 05:08:51 ------------------------------------------------------------------------------ 05:08:51 Verify Flows In Switch After Cluster Restart :: Verify flows are i... | FAIL | 05:09:07 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 15 seconds. The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.170.73:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0?content=nonconfig 05:09:07 ------------------------------------------------------------------------------ 05:09:07 Delete Flows In Follower Node1 and Verify After Leader Restart :: ... | FAIL | 05:09:07 Variable '${follower_node_1}' not found. 05:09:07 ------------------------------------------------------------------------------ 05:09:07 Stop Mininet Connected To Follower Node2 and Exit After Cluster Re... | FAIL | 05:09:08 Variable '${mininet_conn_id}' not found. 05:09:08 ------------------------------------------------------------------------------ 05:09:08 openflowplugin-clustering.txt.Cluster HA Data Recovery Leader Foll... | FAIL | 05:09:08 21 tests, 1 passed, 20 failed 05:09:08 ============================================================================== 05:09:08 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/libraries/VsctlListParser.py:61: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:09:08 if ctl_ref is not "": 05:09:08 openflowplugin-clustering.txt.010 Group Flows :: Switch connections and clu... 05:09:08 ============================================================================== 05:09:11 Add Groups And Flows :: Add 100 groups 1&2 and flows in every switch. 
| PASS | 05:09:15 ------------------------------------------------------------------------------ 05:09:15 Start Mininet Multiple Connections :: Start mininet linear with co... | PASS | 05:09:24 ------------------------------------------------------------------------------ 05:09:24 Check Linear Topology :: Check Linear Topology. | FAIL | 05:09:55 Keyword 'ClusterOpenFlow.Check Linear Topology On Member' failed after retrying for 30 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}]}],"link":[{"link-id":"openflow:2:2","source":{"source-node":"openflow:2","source-tp":"openflow:2:2"},"destination":{"dest-tp":"openflow:1:2","dest-node":"openflow:1"}},{"link-id":"openflow:2:3","source":{"source-node":"openflow:2","source-tp":"openflow:2:3"},"destination":{"dest-tp":"openflow:3:2","dest-node":"openflow:3"}}]}]}}' does not contain '"source-tp":"openflow:1:2"' 05:09:55 ------------------------------------------------------------------------------ 05:09:55 Check Stats Are Not Frozen :: Check that duration flow stat is inc... | PASS | 05:10:01 ------------------------------------------------------------------------------ 05:10:01 Check Flows In Operational DS :: Check Flows in operational DS. 
| FAIL | 05:10:12 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 10 seconds. The last error was: 203 != 303 05:10:12 ------------------------------------------------------------------------------ 05:10:12 Check Groups In Operational DS :: Check Groups in operational DS. | FAIL | 05:10:23 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: 400 != 600 05:10:23 ------------------------------------------------------------------------------ 05:10:23 Check Flows In Switch :: Check Flows in switch. | FAIL | 05:10:23 203.0 != 303.0 05:10:23 ------------------------------------------------------------------------------ 05:10:23 Check Entity Owner Status And Find Owner and Successor Before Fail... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:10:23 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:10:24 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:10:55 | FAIL | 05:10:55 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 05:10:55 Lengths are different: 2 != 0 05:10:55 ------------------------------------------------------------------------------ 05:10:55 Disconnect Mininet From Owner :: Disconnect mininet from the owner :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:10:55 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:10:55 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:10:55 | FAIL | 05:10:55 Variable '${original_owner}' not found. 05:10:55 ------------------------------------------------------------------------------ 05:10:55 Check Entity Owner Status And Find Owner and Successor After Fail ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:10:55 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:10:55 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:11:06 | FAIL | 05:11:06 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Variable '${new_cluster_list}' not found. 05:11:06 ------------------------------------------------------------------------------ 05:11:06 Check Switch Moves To New Master :: Check switch s1 is connected t... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:11:06 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:11:06 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:11:06 | FAIL | 05:11:06 Variable '${new_owner}' not found. 05:11:06 ------------------------------------------------------------------------------ 05:11:06 Check Linear Topology After Disconnect :: Check Linear Topology. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:11:06 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:11:06 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:11:37 | FAIL | 05:11:37 Keyword 'ClusterOpenFlow.Check Linear Topology On Member' failed after retrying for 30 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}]}],"link":[{"link-id":"openflow:2:2","source":{"source-node":"openflow:2","source-tp":"openflow:2:2"},"destination":{"dest-tp":"openflow:1:2","dest-node":"openflow:1"}},{"link-id":"openflow:2:3","source":{"source-node":"openflow:2","source-tp":"openflow:2:3"},"destination":{"dest-tp":"openflow:3:2","dest-node":"openflow:3"}}]}]}}' does not contain '"source-tp":"openflow:1:2"' 05:11:37 ------------------------------------------------------------------------------ 05:11:37 Check Stats Are Not Frozen After Disconnect :: Check that duration... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:11:37 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:11:38 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:11:43 | PASS | 05:11:43 ------------------------------------------------------------------------------ 05:11:43 Remove Flows And Groups After Mininet Is Disconnected :: Remove 1 ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:11:43 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:11:43 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
05:11:43 | PASS | 05:11:43 ------------------------------------------------------------------------------ 05:11:43 Check Flows In Operational DS After Mininet Is Disconnected :: Che... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:11:43 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:11:43 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:12:14 | FAIL | 05:12:14 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: 201 != 300 05:12:14 ------------------------------------------------------------------------------ 05:12:14 Check Groups In Operational DS After Mininet Is Disconnected :: Ch... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:12:15 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:12:15 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:12:25 | FAIL | 05:12:25 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: 396 != 594 05:12:25 ------------------------------------------------------------------------------ 05:12:25 Check Flows In Switch After Mininet Is Disconnected :: Check Flows... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:12:26 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:12:26 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:12:26 | FAIL | 05:12:26 201.0 != 300.0 05:12:26 ------------------------------------------------------------------------------ 05:12:26 Reconnect Mininet To Owner :: Reconnect mininet to switch 1 owner. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:12:26 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:12:26 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:12:26 | FAIL | 05:12:26 Variable '${original_owner_list}' not found. 05:12:26 ------------------------------------------------------------------------------ 05:12:26 Check Entity Owner Status And Find Owner and Successor After Recon... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:12:26 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:12:27 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:12:57 | FAIL | 05:12:57 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 05:12:57 Lengths are different: 2 != 0 05:12:57 ------------------------------------------------------------------------------ 05:12:57 Add Flows And Groups After Owner Reconnect :: Add 1 group type 1&2... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:12:57 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:12:57 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:12:59 | PASS | 05:12:59 ------------------------------------------------------------------------------ 05:12:59 Check Stats Are Not Frozen After Owner Reconnect :: Check that dur... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:12:59 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:12:59 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
05:13:05 | PASS | 05:13:05 ------------------------------------------------------------------------------ 05:13:05 Check Flows After Owner Reconnect In Operational DS :: Check Flows... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:05 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:05 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:36 | FAIL | 05:13:36 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: 203 != 303 05:13:36 ------------------------------------------------------------------------------ 05:13:36 Check Groups After Owner Reconnect In Operational DS :: Check Grou... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:37 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:37 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:47 | FAIL | 05:13:47 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: 400 != 600 05:13:47 ------------------------------------------------------------------------------ 05:13:47 Check Flows After Owner Reconnect In Switch :: Check Flows in switch. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:48 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:48 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:48 | FAIL | 05:13:48 203.0 != 303.0 05:13:48 ------------------------------------------------------------------------------ 05:13:48 Check Switches Generate Slave Connection :: Check switches are con... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:48 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:48 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:48 | FAIL | 05:13:48 Variable '${original_owner}' not found. 05:13:48 ------------------------------------------------------------------------------ 05:13:48 Disconnect Mininet From Successor :: Disconnect mininet from the S... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:48 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:48 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:48 | FAIL | 05:13:48 Variable '${new_successor_list}' not found. 05:13:48 ------------------------------------------------------------------------------ 05:13:48 Check Entity Owner Status And Find New Owner and Successor After D... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:49 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:49 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:59 | FAIL | 05:13:59 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Variable '${owner_list}' not found. 05:13:59 ------------------------------------------------------------------------------ 05:13:59 Disconnect Mininet From Current Owner :: Disconnect mininet from t... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:59 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:59 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:59 | FAIL | 05:13:59 Variable '${current_owner}' not found. 05:13:59 ------------------------------------------------------------------------------ 05:13:59 Check Entity Owner Status And Find Current Owner and Successor Aft... 
:1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:13:59 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:14:00 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:14:10 | FAIL | 05:14:10 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Variable '${original_owner_list}' not found. 05:14:10 ------------------------------------------------------------------------------ 05:14:10 Check Switch Moves To Current Master :: Check switch s1 is connect... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:14:10 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:14:10 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:14:10 | FAIL | 05:14:10 Variable '${current_new_owner}' not found. 05:14:10 ------------------------------------------------------------------------------ 05:14:10 Check Linear Topology After Owner Disconnect :: Check Linear Topol... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:14:10 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:14:10 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:14:40 | FAIL | 05:14:40 Keyword 'ClusterOpenFlow.Check Linear Topology On Member' failed after retrying for 30 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","openday
light-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}]}],"link":[{"link-id":"openflow:2:2","source":{"source-node":"openflow:2","source-tp":"openflow:2:2"},"destination":{"dest-tp":"openflow:1:2","dest-node":"openflow:1"}},{"link-id":"openflow:2:3","source":{"source-node":"openflow:2","source-tp":"openflow:2:3"},"destination":{"dest-tp":"openflow:3:2","dest-node":"openflow:3"}}]}]}}' does not contain '"source-tp":"openflow:1:2"' 05:14:40 ------------------------------------------------------------------------------ 05:14:40 Check Stats Are Not Frozen After Owner Disconnect :: Check that du... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:14:41 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:14:41 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:14:46 | PASS | 05:14:46 ------------------------------------------------------------------------------ 05:14:46 Remove Flows And Groups After Owner Disconnected :: Remove 1 group... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:14:46 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:14:46 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:14:46 | PASS | 05:14:46 ------------------------------------------------------------------------------ 05:14:46 Check Flows In Operational DS After Owner Disconnected :: Check Fl... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:14:46 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:14:47 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:15:18 | FAIL | 05:15:18 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: 201 != 300 05:15:18 ------------------------------------------------------------------------------ 05:15:18 Check Groups In Operational DS After Owner Disconnected :: Check G... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:15:18 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:15:18 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:15:28 | FAIL | 05:15:28 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: 396 != 594 05:15:28 ------------------------------------------------------------------------------ 05:15:28 Check Flows In Switch After Owner Disconnected :: Check Flows in s... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:15:29 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:15:29 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:15:29 | FAIL | 05:15:29 201.0 != 300.0 05:15:29 ------------------------------------------------------------------------------ 05:15:29 Disconnect Mininet From Cluster :: Disconnect Mininet from Cluster. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:15:29 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:15:29 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:15:29 | FAIL | 05:15:29 Variable '${original_owner_list}' not found. 05:15:29 ------------------------------------------------------------------------------ 05:15:29 Check No Switches After Disconnect :: Check no switches in topology. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:15:29 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
05:15:29 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:00 | FAIL | 05:16:00 Keyword 'ClusterOpenFlow.Check No Switches On Member' failed after retrying for 30 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}]}],"link":[{"link-id":"openflow:2:2","source":{"source-node":"openflow:2","source-tp":"openflow:2:2"},"destination":{"dest-tp":"openflow:1:2","dest-node":"openflow:1"}},{"link-id":"openflow:2:3","source":{"source-node":"openflow:2","source-tp":"openflow:2:3"},"destination":{"dest-tp":"openflow:3:2","dest-node":"openflow:3"}}]}]}}' contains 'openflow:1' 05:16:00 ------------------------------------------------------------------------------ 05:16:00 Check Switch Is Not Connected :: Check switch s1 is not connected ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:00 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:00 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:11 | FAIL | 05:16:11 Keyword 'OvsManager.Should Be Disconnected' failed after retrying for 10 seconds. The last error was: Dictionary does not contain key 's1'. 
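[Note on the topology/flow-count failures ("contains 'openflow:1' 14 times, not 16 times", "203 != 303", etc.): they all follow the same poll-and-count pattern against RESTCONF. The sketch below is not the actual ClusterManagement/ClusterOpenFlow keyword, only a minimal Python rendering of the pattern under assumed defaults (admin/admin credentials, 3 s poll interval); the example URL and expected count are illustrative values taken from this run.]

    import time
    import requests  # assumption: available in the test venv, as in the Robot libraries

    def wait_for_occurrences(url, item, expected, timeout=30, interval=3, auth=("admin", "admin")):
        """Poll a RESTCONF resource until `item` appears exactly `expected` times in the body."""
        deadline = time.time() + timeout
        while True:
            body = requests.get(url, auth=auth).text
            count = body.count(item)
            if count == expected:
                return body
            if time.time() >= deadline:
                raise AssertionError(
                    f"'{body}' contains '{item}' {count} times, not {expected} times.")
            time.sleep(interval)

    # Illustrative call mirroring the failed "Check Linear Topology" style checks:
    # wait_for_occurrences(
    #     "http://10.30.170.73:8181/rests/data/network-topology:network-topology?content=nonconfig",
    #     "openflow:1", 16)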
05:16:11 ------------------------------------------------------------------------------ 05:16:11 Reconnect Mininet To Cluster :: Reconnect mininet to cluster by re... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:11 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:11 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:11 10.30.170.73 05:16:12 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:13 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:13 10.30.171.201 05:16:13 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:14 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:14 10.30.170.175 05:16:14 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:15 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:15 | PASS | 05:16:15 ------------------------------------------------------------------------------ 05:16:15 Check Linear Topology After Mininet Reconnects :: Check Linear Top... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:15 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:15 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:25 | FAIL | 05:16:25 Keyword 'ClusterOpenFlow.Check Linear Topology On Member' failed after retrying for 10 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL
']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}]}],"link":[{"link-id":"openflow:2:2","source":{"source-node":"openflow:2","source-tp":"openflow:2:2"},"destination":{"dest-tp":"openflow:1:2","dest-node":"openflow:1"}},{"link-id":"openflow:2:3","source":{"source-node":"openflow:2","source-tp":"openflow:2:3"},"destination":{"dest-tp":"openflow:3:2","dest-node":"openflow:3"}}]}]}}' does not contain '"source-tp":"openflow:1:2"' 05:16:25 ------------------------------------------------------------------------------ 05:16:25 Add Flows And Groups After Mininet Reconnects :: Add 1 group type ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:25 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:26 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:27 | PASS | 05:16:27 ------------------------------------------------------------------------------ 05:16:27 Check Flows In Operational DS After Mininet Reconnects :: Check Fl... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:27 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:28 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:58 | FAIL | 05:16:58 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: 203 != 303 05:16:58 ------------------------------------------------------------------------------ 05:16:58 Check Groups In Operational DS After Mininet Reconnects :: Check G... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:59 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:16:59 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:10 | FAIL | 05:17:10 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: 400 != 600 05:17:10 ------------------------------------------------------------------------------ 05:17:10 Check Flows In Switch After Mininet Reconnects :: Check Flows in s... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:10 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:10 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:10 | FAIL | 05:17:10 203.0 != 303.0 05:17:10 ------------------------------------------------------------------------------ 05:17:10 Check Entity Owner Status And Find Owner and Successor Before Owne... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:10 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:10 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:41 | FAIL | 05:17:41 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 05:17:41 Lengths are different: 2 != 0 05:17:41 ------------------------------------------------------------------------------ 05:17:41 Check Switch Generates Slave Connection Before Owner Stop :: Check... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:41 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:41 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:41 | FAIL | 05:17:41 Variable '${original_successor}' not found. 
05:17:41 ------------------------------------------------------------------------------ 05:17:41 Check Shards Status Before Owner Stop :: Check Status for all shar... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:42 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:42 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:42 | FAIL | 05:17:42 Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttributes(ReadHandler.java:221)\\n\\tat org.jolokia.handler.ReadHandler.fetchAttributes(ReadHa... 05:17:42 [ Message content over the limit has been removed. 
] 05:17:42 ...rvice.jetty.internal.PrioritizedHandlerCollection.handle(PrioritizedHandlerCollection.java:96)\\n\\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 183 (char 598) 05:17:42 ------------------------------------------------------------------------------ 05:17:42 Stop Owner Instance :: Stop Owner Instance and verify it is shutdown :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:42 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:42 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:42 | FAIL | 05:17:42 Variable '${original_owner}' not found. 05:17:42 ------------------------------------------------------------------------------ 05:17:42 Check Shards Status After Stop :: Check Status for all shards in O... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:43 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:43 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:43 | FAIL | 05:17:43 Variable '${new_cluster_list}' not found. 05:17:43 ------------------------------------------------------------------------------ 05:17:43 Check Entity Owner Status And Find Owner and Successor After Stop ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:43 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:43 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:43 | FAIL | 05:17:43 Variable '${original_successor}' not found. 05:17:43 ------------------------------------------------------------------------------ 05:17:43 Check Stats Are Not Frozen After Owner Stop :: Check that duration... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:43 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:17:43 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:18:14 | FAIL | 05:18:14 Keyword 'Check Flow Stats Are Not Frozen' failed after retrying for 30 seconds. The last error was: Variable '${new_owner}' not found. 
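The 'Check Shards Status Before Owner Stop' failure above ends in a JSONDecodeError because the keyword evaluates `json.loads('''...''')` over a Jolokia error payload whose "stacktrace" string carries raw control characters, and by default the json module rejects unescaped control characters inside strings ("Invalid control character at ..."). A minimal sketch of that failure mode and the `strict=False` escape hatch; the stand-in payload below is invented for illustration, not the real Jolokia response.

```python
# Sketch of the JSONDecodeError seen in "Check Shards Status Before Owner Stop".
# A raw newline or tab inside a JSON string value is rejected by json.loads()
# unless strict=False is passed (the payload here is a stand-in, not the real
# Jolokia error document).
import json

payload = ('{"error_type": "javax.management.InstanceNotFoundException", '
           '"stacktrace": "line one\n\tat Something.java:1"}')

try:
    json.loads(payload)                        # default strict=True rejects raw \n and \t
except json.JSONDecodeError as exc:
    print("strict parse failed:", exc)         # "Invalid control character at: ..."

parsed = json.loads(payload, strict=False)     # tolerate control characters inside strings
print(parsed["error_type"])
```

Whether the suite should tolerate such a payload or treat the underlying 404 (the shard MBean is simply not registered on that member) as the real failure is a separate question; the sketch only shows why the Evaluate step itself blows up.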
05:18:14 ------------------------------------------------------------------------------ 05:18:14 Remove Configuration In Owner and Verify After Owner Stop :: Remov... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:18:14 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:18:14 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:18:14 | FAIL | 05:18:14 Variable '${new_owner}' not found. 05:18:14 ------------------------------------------------------------------------------ 05:18:14 Check Flows After Owner Stop In Operational DS :: Check Flows in O... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:18:14 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:18:14 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:18:45 | FAIL | 05:18:45 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: Variable '${new_owner}' not found. 05:18:45 ------------------------------------------------------------------------------ 05:18:45 Check Groups After Owner Stop In Operational DS :: Check Groups in... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:18:45 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:18:45 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:18:55 | FAIL | 05:18:55 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: Variable '${new_owner}' not found. 05:18:55 ------------------------------------------------------------------------------ 05:18:55 Check Flows In Switch After Owner Stop :: Check Flows in switch. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:18:57 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:18:58 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:18:58 | FAIL | 05:18:58 203.0 != 300.0 05:18:58 ------------------------------------------------------------------------------ 05:18:58 Start Old Owner Instance :: Start old Owner Instance and verify it... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:18:58 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:18:58 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:18:58 | FAIL | 05:18:58 Variable '${original_owner}' not found. 05:18:58 ------------------------------------------------------------------------------ 05:18:58 Check Entity Owner Status And Find Owner and Successor After Start... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:18:58 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:18:58 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:19:29 | FAIL | 05:19:29 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 05:19:29 Lengths are different: 2 != 0 05:19:29 ------------------------------------------------------------------------------ 05:19:29 Check Linear Topology After Owner Restart :: Check Linear Topology. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:19:30 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:19:30 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
05:19:40 | FAIL | 05:19:40 Keyword 'ClusterOpenFlow.Check Linear Topology On Member' failed after retrying for 10 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}]}],"link":[{"link-id":"openflow:2:2","source":{"source-node":"openflow:2","source-tp":"openflow:2:2"},"destination":{"dest-tp":"openflow:1:2","dest-node":"openflow:1"}},{"link-id":"openflow:2:3","source":{"source-node":"openflow:2","source-tp":"openflow:2:3"},"destination":{"dest-tp":"openflow:3:2","dest-node":"openflow:3"}}]}]}}' does not contain '"source-tp":"openflow:1:2"' 05:19:40 ------------------------------------------------------------------------------ 05:19:40 Add Configuration In Owner and Verify After Owner Restart :: Add 1... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:19:40 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:19:40 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:19:40 | FAIL | 05:19:40 Variable '${new_owner}' not found. 05:19:40 ------------------------------------------------------------------------------ 05:19:40 Check Stats Are Not Frozen After Owner Restart :: Check that durat... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
05:19:41 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:19:41 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:19:41 | FAIL | 05:19:41 Variable '${new_owner}' not found. 05:19:41 ------------------------------------------------------------------------------ 05:19:41 Check Flows In Operational DS After Owner Restart :: Check Flows i... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:19:41 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:19:41 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:12 | FAIL | 05:20:12 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: 203 != 303 05:20:12 ------------------------------------------------------------------------------ 05:20:12 Check Groups In Operational DS After Owner Restart :: Check Groups... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:12 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:12 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:23 | FAIL | 05:20:23 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: 400 != 600 05:20:23 ------------------------------------------------------------------------------ 05:20:23 Check Flows In Switch After Owner Restart :: Check Flows in switch. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:23 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:23 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:23 | FAIL | 05:20:23 203.0 != 303.0 05:20:23 ------------------------------------------------------------------------------ 05:20:23 Restart Cluster :: Stop and Start cluster. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:24 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:24 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:25 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:25 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:26 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:26 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:27 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:27 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:27 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:27 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:30 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:30 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:32 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:32 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:33 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:33 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:33 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:33 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:33 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:33 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:34 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:34 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:34 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
05:20:34 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:34 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:34 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:35 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:35 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:35 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:35 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:59 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:20:59 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:21:01 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:21:01 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:21:01 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:21:01 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:21:01 | PASS | 05:21:01 ------------------------------------------------------------------------------ 05:21:01 Check Linear Topology After Controller Restarts :: Check Linear To... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:21:11 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:21:11 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:21:21 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:21:21 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:21:22 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:21:32 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:21:32 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:21:32 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:26:34 | FAIL | 05:26:34 Keyword 'ClusterOpenFlow.Check Linear Topology On Member' failed after retrying for 5 minutes. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}]}],"link":[{"link-id":"openflow:2:3","source":{"source-node":"openflow:2","source-tp":"openflow:2:3"},"destination":{"dest-tp":"openflow:3:2","dest-node":"openflow:3"}}]}]}}' does not contain '"source-tp":"openflow:1:2"' 05:26:34 ------------------------------------------------------------------------------ 05:26:34 Check Stats Are Not Frozen After Cluster Restart :: Check that dur... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:26:34 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:26:35 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:27:05 | FAIL | 05:27:05 Keyword 'Check Flow Stats Are Not Frozen' failed after retrying for 30 seconds. The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.170.73:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0/flow=1?content=nonconfig 05:27:05 ------------------------------------------------------------------------------ 05:27:05 Check Flows In Operational DS After Controller Restarts :: Check F... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:27:05 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
05:27:06 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:27:36 | FAIL | 05:27:36 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: 103 != 303 05:27:36 ------------------------------------------------------------------------------ 05:27:36 Check Groups In Operational DS After Controller Restarts :: Check ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:27:37 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:27:37 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:27:48 | FAIL | 05:27:48 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: 372 != 600 05:27:48 ------------------------------------------------------------------------------ 05:27:48 Check Flows In Switch After Controller Restarts :: Check Flows in ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:27:48 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:27:48 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:27:48 | FAIL | 05:27:48 103.0 != 303.0 05:27:48 ------------------------------------------------------------------------------ 05:27:48 Stop Mininet :: Stop Mininet. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:27:48 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:27:48 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:27:49 | PASS | 05:27:49 ------------------------------------------------------------------------------ 05:27:49 Check No Switches :: Check no switches in topology. | PASS | 05:27:50 ------------------------------------------------------------------------------ 05:27:52 openflowplugin-clustering.txt.010 Group Flows :: Switch connection... | FAIL | 05:27:52 72 tests, 14 passed, 58 failed 05:27:52 ============================================================================== 05:27:52 openflowplugin-clustering.txt.010 Switch Disconnect :: Test suite for entit... 05:27:52 ============================================================================== 05:27:56 Switches To Be Connected To All Nodes :: Initial check for correct... | FAIL | 05:27:56 Parent suite setup failed: 05:27:56 Dictionary does not contain key 's1'. 05:27:56 ------------------------------------------------------------------------------ 05:27:56 Reconnecting Switch s1 | FAIL | 05:27:56 Parent suite setup failed: 05:27:56 Dictionary does not contain key 's1'. 05:27:56 ------------------------------------------------------------------------------ 05:27:56 Switches Still Be Connected To All Nodes | FAIL | 05:27:56 Parent suite setup failed: 05:27:56 Dictionary does not contain key 's1'. 05:27:56 ------------------------------------------------------------------------------ 05:27:57 openflowplugin-clustering.txt.010 Switch Disconnect :: Test suite ... | FAIL | 05:27:57 Suite setup failed: 05:27:57 Dictionary does not contain key 's1'. 05:27:57 05:27:57 3 tests, 0 passed, 3 failed 05:27:57 ============================================================================== 05:27:57 openflowplugin-clustering.txt.020 Cluster Node Failure :: Test suite for en... 05:27:57 ============================================================================== 05:28:01 Switches To Be Connected To All Nodes :: Initial check for correct... | FAIL | 05:28:01 Parent suite setup failed: 05:28:01 Dictionary does not contain key 's1'. 
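The whole 'Switch Disconnect' suite, and the two suites after it, die in setup with "Dictionary does not contain key 's1'". The setup evidently builds a mapping from switch name to the cluster members that report it, and because s1 never appears in any member's inventory the later keywords fail on the bare dictionary lookup rather than on an explicit connectivity assertion. A rough sketch of that failure mode follows; how the mapping is actually built is not visible in this log, so the function below is an assumption, not the suite's keyword.

```python
# Rough sketch of the "Dictionary does not contain key 's1'" failure mode.
# switch_owners maps switch name -> cluster members that report the switch;
# how the real keyword populates it from the controllers' inventory is assumed.
from typing import Dict, Set

def check_switch_on_all_nodes(switch_owners: Dict[str, Set[int]],
                              switch: str, members: Set[int]) -> None:
    if switch not in switch_owners:
        # This is effectively what the suite setup hits: the lookup itself fails
        # because the switch never appeared in any member's inventory.
        raise KeyError(f"Dictionary does not contain key '{switch}'.")
    missing = members - switch_owners[switch]
    if missing:
        raise AssertionError(f"{switch} not connected to members {sorted(missing)}")

# With an empty mapping (no switches ever connected) the lookup fails immediately:
try:
    check_switch_on_all_nodes({}, "s1", {1, 2, 3})
except KeyError as exc:
    print(exc.args[0])
```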
05:28:01 ------------------------------------------------------------------------------ 05:28:01 Restarting Owner Of Switch s1 | FAIL | 05:28:01 Parent suite setup failed: 05:28:01 Dictionary does not contain key 's1'. 05:28:01 ------------------------------------------------------------------------------ 05:28:01 Switches Still Be Connected To All Nodes | FAIL | 05:28:01 Parent suite setup failed: 05:28:01 Dictionary does not contain key 's1'. 05:28:01 ------------------------------------------------------------------------------ 05:28:01 openflowplugin-clustering.txt.020 Cluster Node Failure :: Test sui... | FAIL | 05:28:01 Suite setup failed: 05:28:01 Dictionary does not contain key 's1'. 05:28:01 05:28:01 3 tests, 0 passed, 3 failed 05:28:01 ============================================================================== 05:28:01 openflowplugin-clustering.txt.030 Cluster Sync Problems :: Test suite for e... 05:28:01 ============================================================================== 05:28:03 Start Mininet To All Nodes | FAIL | 05:28:05 Dictionary does not contain key 's1'. 05:28:05 ------------------------------------------------------------------------------ 05:28:05 Switches To Be Connected To All Nodes :: Initial check for correct... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:28:05 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:28:05 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:28:21 | FAIL | 05:28:21 Keyword 'Check All Switches Connected To All Cluster Nodes' failed after retrying 15 times. The last error was: Dictionary does not contain key 's1'. 05:28:21 ------------------------------------------------------------------------------ 05:28:21 Isolating Owner Of Switch s1 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:28:21 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:28:21 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:28:52 | FAIL | 05:28:52 This test fails due to https://bugs.opendaylight.org/show_bug.cgi?id=6177 05:28:52 05:28:52 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Could not parse owner and candidates for device openflow:1 05:28:52 ------------------------------------------------------------------------------ 05:28:52 Switches Still Be Connected To All Nodes :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:28:52 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:28:52 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:29:08 | FAIL | 05:29:08 This test fails due to https://bugs.opendaylight.org/show_bug.cgi?id=6177 05:29:08 05:29:08 Keyword 'Check All Switches Connected To All Cluster Nodes' failed after retrying 15 times. The last error was: Dictionary does not contain key 's1'. 05:29:08 ------------------------------------------------------------------------------ 05:29:08 Stop Mininet And Verify No Owners :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:29:08 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:29:08 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:29:23 | FAIL | 05:29:23 This test fails due to https://bugs.opendaylight.org/show_bug.cgi?id=6177 05:29:23 05:29:23 Keyword 'Check No Device Owners In Controller' failed after retrying 15 times. The last error was: Dictionary does not contain key '1'. 
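Several ownership checks in this run fail with an empty successor list where [2, 3] was expected ("Lengths are different: 2 != 0"). For a three-node cluster the expected successors for a device are simply all member indices other than the owner, compared against the candidate list parsed from the controller's entity-ownership data, which here comes back empty. A small sketch of that comparison, assuming members 1..3; the keyword's actual parsing of the ownership data is not shown in this log.

```python
# Sketch of the owner/successor comparison behind the repeated
# "Successor list [] ... expected [2, 3]" failures in this run.
# Expected successors are all members except the owner; "reported" stands in
# for whatever the controller's entity-ownership data yields.
from typing import List

def expected_successors(owner: int, members: List[int]) -> List[int]:
    return [m for m in members if m != owner]

def verify_successors(owner: int, reported: List[int], members: List[int]) -> None:
    expected = expected_successors(owner, members)
    if sorted(reported) != sorted(expected):
        raise AssertionError(
            f"Successor list {reported} is not the same as expected {expected}\n"
            f"Lengths are different: {len(expected)} != {len(reported)}"
        )

members = [1, 2, 3]
verify_successors(owner=1, reported=[2, 3], members=members)       # passes
try:
    verify_successors(owner=1, reported=[], members=members)       # empty, as in this run
except AssertionError as exc:
    print(exc)
```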
05:29:23 ------------------------------------------------------------------------------ 05:29:24 openflowplugin-clustering.txt.030 Cluster Sync Problems :: Test su... | FAIL | 05:29:24 5 tests, 0 passed, 5 failed 05:29:24 ============================================================================== 05:29:25 openflowplugin-clustering.txt.9145 :: Switch connections and cluster are re... 05:29:25 ============================================================================== 05:29:25 Start Mininet Multiple Connections :: Start mininet linear with co... | PASS | 05:29:33 ------------------------------------------------------------------------------ 05:29:33 Check Entity Owner Status And Find Owner and Successor :: Check En... | FAIL | 05:30:04 This test fails due to https://bugs.opendaylight.org/show_bug.cgi?id=9145 05:30:04 05:30:04 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 05:30:04 Lengths are different: 2 != 0 05:30:04 ------------------------------------------------------------------------------ 05:30:04 Stop Mininet :: Stop Mininet. | PASS | 05:30:04 ------------------------------------------------------------------------------ 05:30:04 openflowplugin-clustering.txt.9145 :: Switch connections and clust... | FAIL | 05:30:04 3 tests, 2 passed, 1 failed 05:30:04 ============================================================================== 05:30:04 openflowplugin-clustering.txt | FAIL | 05:30:04 209 tests, 23 passed, 186 failed 05:30:04 ============================================================================== 05:30:04 Output: /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/output.xml 05:30:12 Log: /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/log.html 05:30:12 Report: /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/report.html 05:30:12 + true 05:30:12 + echo 'Examining the files in data/log and checking filesize' 05:30:12 Examining the files in data/log and checking filesize 05:30:12 + ssh 10.30.170.73 'ls -altr /tmp/karaf-0.22.1/data/log/' 05:30:12 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 05:30:12 total 1368 05:30:12 drwxrwxr-x 2 jenkins jenkins 4096 Sep 13 04:54 . 05:30:12 -rw-rw-r-- 1 jenkins jenkins 1720 Sep 13 04:54 karaf_console.log 05:30:12 drwxrwxr-x 9 jenkins jenkins 4096 Sep 13 04:55 .. 05:30:12 -rw-rw-r-- 1 jenkins jenkins 1384754 Sep 13 05:30 karaf.log 05:30:12 + ssh 10.30.170.73 'du -hs /tmp/karaf-0.22.1/data/log/*' 05:30:12 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 05:30:13 1.4M /tmp/karaf-0.22.1/data/log/karaf.log 05:30:13 4.0K /tmp/karaf-0.22.1/data/log/karaf_console.log 05:30:13 + ssh 10.30.171.201 'ls -altr /tmp/karaf-0.22.1/data/log/' 05:30:13 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 05:30:13 total 1308 05:30:13 drwxrwxr-x 2 jenkins jenkins 4096 Sep 13 04:54 . 05:30:13 -rw-rw-r-- 1 jenkins jenkins 1720 Sep 13 04:54 karaf_console.log 05:30:13 drwxrwxr-x 9 jenkins jenkins 4096 Sep 13 04:55 .. 05:30:13 -rw-rw-r-- 1 jenkins jenkins 1324547 Sep 13 05:30 karaf.log 05:30:13 + ssh 10.30.171.201 'du -hs /tmp/karaf-0.22.1/data/log/*' 05:30:13 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 
05:30:13 1.3M /tmp/karaf-0.22.1/data/log/karaf.log 05:30:13 4.0K /tmp/karaf-0.22.1/data/log/karaf_console.log 05:30:13 + ssh 10.30.170.175 'ls -altr /tmp/karaf-0.22.1/data/log/' 05:30:13 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 05:30:13 total 1168 05:30:13 drwxrwxr-x 2 jenkins jenkins 4096 Sep 13 04:54 . 05:30:13 -rw-rw-r-- 1 jenkins jenkins 1720 Sep 13 04:54 karaf_console.log 05:30:13 drwxrwxr-x 9 jenkins jenkins 4096 Sep 13 04:55 .. 05:30:13 -rw-rw-r-- 1 jenkins jenkins 1175916 Sep 13 05:30 karaf.log 05:30:13 + ssh 10.30.170.175 'du -hs /tmp/karaf-0.22.1/data/log/*' 05:30:13 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 05:30:13 1.2M /tmp/karaf-0.22.1/data/log/karaf.log 05:30:13 4.0K /tmp/karaf-0.22.1/data/log/karaf_console.log 05:30:13 + set +e 05:30:13 ++ seq 1 3 05:30:13 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 05:30:13 + CONTROLLERIP=ODL_SYSTEM_1_IP 05:30:13 + echo 'Let'\''s take the karaf thread dump again' 05:30:13 Let's take the karaf thread dump again 05:30:13 + ssh 10.30.170.73 'sudo ps aux' 05:30:14 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 05:30:14 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/ps_after.log 05:30:14 ++ cut -f2 '-d ' 05:30:14 ++ grep -v grep 05:30:14 ++ tr -s ' ' 05:30:14 + pid=6402 05:30:14 + echo 'karaf main: org.apache.karaf.main.Main, pid:6402' 05:30:14 karaf main: org.apache.karaf.main.Main, pid:6402 05:30:14 + ssh 10.30.170.73 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 6402' 05:30:14 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 05:30:14 + echo 'killing karaf process...' 05:30:14 killing karaf process... 05:30:14 + ssh 10.30.170.73 bash -c 'ps axf | grep karaf | grep -v grep | awk '\''{print "kill -9 " $1}'\'' | sh' 05:30:14 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 05:30:15 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 05:30:15 + CONTROLLERIP=ODL_SYSTEM_2_IP 05:30:15 + echo 'Let'\''s take the karaf thread dump again' 05:30:15 Let's take the karaf thread dump again 05:30:15 + ssh 10.30.171.201 'sudo ps aux' 05:30:15 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 05:30:15 ++ tr -s ' ' 05:30:15 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/ps_after.log 05:30:15 ++ grep -v grep 05:30:15 ++ cut -f2 '-d ' 05:30:15 + pid=6028 05:30:15 + echo 'karaf main: org.apache.karaf.main.Main, pid:6028' 05:30:15 karaf main: org.apache.karaf.main.Main, pid:6028 05:30:15 + ssh 10.30.171.201 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 6028' 05:30:15 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 05:30:15 + echo 'killing karaf process...' 05:30:15 killing karaf process... 05:30:15 + ssh 10.30.171.201 bash -c 'ps axf | grep karaf | grep -v grep | awk '\''{print "kill -9 " $1}'\'' | sh' 05:30:15 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 05:30:15 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 05:30:15 + CONTROLLERIP=ODL_SYSTEM_3_IP 05:30:15 + echo 'Let'\''s take the karaf thread dump again' 05:30:15 Let's take the karaf thread dump again 05:30:15 + ssh 10.30.170.175 'sudo ps aux' 05:30:16 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 
05:30:16 ++ tr -s ' ' 05:30:16 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/ps_after.log 05:30:16 ++ grep -v grep 05:30:16 ++ cut -f2 '-d ' 05:30:16 + pid=5965 05:30:16 + echo 'karaf main: org.apache.karaf.main.Main, pid:5965' 05:30:16 karaf main: org.apache.karaf.main.Main, pid:5965 05:30:16 + ssh 10.30.170.175 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 5965' 05:30:16 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 05:30:16 + echo 'killing karaf process...' 05:30:16 killing karaf process... 05:30:16 + ssh 10.30.170.175 bash -c 'ps axf | grep karaf | grep -v grep | awk '\''{print "kill -9 " $1}'\'' | sh' 05:30:16 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 05:30:16 + sleep 5 05:30:21 ++ seq 1 3 05:30:21 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 05:30:21 + CONTROLLERIP=ODL_SYSTEM_1_IP 05:30:21 + echo 'Compressing karaf.log 1' 05:30:21 Compressing karaf.log 1 05:30:21 + ssh 10.30.170.73 gzip --best /tmp/karaf-0.22.1/data/log/karaf.log 05:30:22 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 05:30:22 + echo 'Fetching compressed karaf.log 1' 05:30:22 Fetching compressed karaf.log 1 05:30:22 + scp 10.30.170.73:/tmp/karaf-0.22.1/data/log/karaf.log.gz odl1_karaf.log.gz 05:30:22 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 05:30:22 + ssh 10.30.170.73 rm -f /tmp/karaf-0.22.1/data/log/karaf.log.gz 05:30:22 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 05:30:22 + scp 10.30.170.73:/tmp/karaf-0.22.1/data/log/karaf_console.log odl1_karaf_console.log 05:30:22 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 05:30:23 + ssh 10.30.170.73 rm -f /tmp/karaf-0.22.1/data/log/karaf_console.log 05:30:23 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 05:30:23 + echo 'Fetch GC logs' 05:30:23 Fetch GC logs 05:30:23 + mkdir -p gclogs-1 05:30:23 + scp '10.30.170.73:/tmp/karaf-0.22.1/data/log/*.log' gclogs-1/ 05:30:23 Warning: Permanently added '10.30.170.73' (ECDSA) to the list of known hosts. 05:30:23 scp: /tmp/karaf-0.22.1/data/log/*.log: No such file or directory 05:30:23 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 05:30:23 + CONTROLLERIP=ODL_SYSTEM_2_IP 05:30:23 + echo 'Compressing karaf.log 2' 05:30:23 Compressing karaf.log 2 05:30:23 + ssh 10.30.171.201 gzip --best /tmp/karaf-0.22.1/data/log/karaf.log 05:30:23 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 05:30:23 + echo 'Fetching compressed karaf.log 2' 05:30:23 Fetching compressed karaf.log 2 05:30:23 + scp 10.30.171.201:/tmp/karaf-0.22.1/data/log/karaf.log.gz odl2_karaf.log.gz 05:30:23 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 05:30:23 + ssh 10.30.171.201 rm -f /tmp/karaf-0.22.1/data/log/karaf.log.gz 05:30:23 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 05:30:24 + scp 10.30.171.201:/tmp/karaf-0.22.1/data/log/karaf_console.log odl2_karaf_console.log 05:30:24 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 05:30:24 + ssh 10.30.171.201 rm -f /tmp/karaf-0.22.1/data/log/karaf_console.log 05:30:24 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 
05:30:24 + echo 'Fetch GC logs' 05:30:24 Fetch GC logs 05:30:24 + mkdir -p gclogs-2 05:30:24 + scp '10.30.171.201:/tmp/karaf-0.22.1/data/log/*.log' gclogs-2/ 05:30:24 Warning: Permanently added '10.30.171.201' (ECDSA) to the list of known hosts. 05:30:24 scp: /tmp/karaf-0.22.1/data/log/*.log: No such file or directory 05:30:24 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 05:30:24 + CONTROLLERIP=ODL_SYSTEM_3_IP 05:30:24 + echo 'Compressing karaf.log 3' 05:30:24 Compressing karaf.log 3 05:30:24 + ssh 10.30.170.175 gzip --best /tmp/karaf-0.22.1/data/log/karaf.log 05:30:25 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 05:30:25 + echo 'Fetching compressed karaf.log 3' 05:30:25 Fetching compressed karaf.log 3 05:30:25 + scp 10.30.170.175:/tmp/karaf-0.22.1/data/log/karaf.log.gz odl3_karaf.log.gz 05:30:25 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 05:30:25 + ssh 10.30.170.175 rm -f /tmp/karaf-0.22.1/data/log/karaf.log.gz 05:30:25 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 05:30:25 + scp 10.30.170.175:/tmp/karaf-0.22.1/data/log/karaf_console.log odl3_karaf_console.log 05:30:25 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 05:30:26 + ssh 10.30.170.175 rm -f /tmp/karaf-0.22.1/data/log/karaf_console.log 05:30:26 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 05:30:26 + echo 'Fetch GC logs' 05:30:26 Fetch GC logs 05:30:26 + mkdir -p gclogs-3 05:30:26 + scp '10.30.170.175:/tmp/karaf-0.22.1/data/log/*.log' gclogs-3/ 05:30:26 Warning: Permanently added '10.30.170.175' (ECDSA) to the list of known hosts. 05:30:26 scp: /tmp/karaf-0.22.1/data/log/*.log: No such file or directory 05:30:26 + echo 'Examine copied files' 05:30:26 Examine copied files 05:30:26 + ls -lt 05:30:26 total 184044 05:30:26 drwxrwxr-x. 2 jenkins jenkins 6 Sep 13 05:30 gclogs-3 05:30:26 -rw-rw-r--. 1 jenkins jenkins 1720 Sep 13 05:30 odl3_karaf_console.log 05:30:26 -rw-rw-r--. 1 jenkins jenkins 95953 Sep 13 05:30 odl3_karaf.log.gz 05:30:26 drwxrwxr-x. 2 jenkins jenkins 6 Sep 13 05:30 gclogs-2 05:30:26 -rw-rw-r--. 1 jenkins jenkins 1720 Sep 13 05:30 odl2_karaf_console.log 05:30:26 -rw-rw-r--. 1 jenkins jenkins 100187 Sep 13 05:30 odl2_karaf.log.gz 05:30:26 drwxrwxr-x. 2 jenkins jenkins 6 Sep 13 05:30 gclogs-1 05:30:26 -rw-rw-r--. 1 jenkins jenkins 1720 Sep 13 05:30 odl1_karaf_console.log 05:30:26 -rw-rw-r--. 1 jenkins jenkins 101395 Sep 13 05:30 odl1_karaf.log.gz 05:30:26 -rw-rw-r--. 1 jenkins jenkins 135862 Sep 13 05:30 karaf_3_5965_threads_after.log 05:30:26 -rw-rw-r--. 1 jenkins jenkins 13588 Sep 13 05:30 ps_after.log 05:30:26 -rw-rw-r--. 1 jenkins jenkins 136079 Sep 13 05:30 karaf_2_6028_threads_after.log 05:30:26 -rw-rw-r--. 1 jenkins jenkins 148992 Sep 13 05:30 karaf_1_6402_threads_after.log 05:30:26 -rw-rw-r--. 1 jenkins jenkins 287697 Sep 13 05:30 report.html 05:30:26 -rw-rw-r--. 1 jenkins jenkins 2940768 Sep 13 05:30 log.html 05:30:26 -rw-rw-r--. 1 jenkins jenkins 184101367 Sep 13 05:30 output.xml 05:30:26 -rw-rw-r--. 1 jenkins jenkins 1180 Sep 13 04:58 testplan.txt 05:30:26 -rw-rw-r--. 1 jenkins jenkins 96899 Sep 13 04:58 karaf_3_2119_threads_before.log 05:30:26 -rw-rw-r--. 1 jenkins jenkins 16195 Sep 13 04:58 ps_before.log 05:30:26 -rw-rw-r--. 1 jenkins jenkins 95208 Sep 13 04:58 karaf_2_2122_threads_before.log 05:30:26 -rw-rw-r--. 1 jenkins jenkins 95189 Sep 13 04:58 karaf_1_2124_threads_before.log 05:30:26 -rw-rw-r--. 
1 jenkins jenkins 3043 Sep 13 04:54 post-startup-script.sh 05:30:26 -rw-rw-r--. 1 jenkins jenkins 1183 Sep 13 04:54 set_akka_debug.sh 05:30:26 -rw-rw-r--. 1 jenkins jenkins 133 Sep 13 04:54 configplan.txt 05:30:26 -rw-rw-r--. 1 jenkins jenkins 225 Sep 13 04:54 startup-script.sh 05:30:26 -rw-rw-r--. 1 jenkins jenkins 3290 Sep 13 04:54 configuration-script.sh 05:30:26 -rw-rw-r--. 1 jenkins jenkins 266 Sep 13 04:54 detect_variables.env 05:30:26 -rw-rw-r--. 1 jenkins jenkins 92 Sep 13 04:54 set_variables.env 05:30:26 -rw-rw-r--. 1 jenkins jenkins 353 Sep 13 04:54 slave_addresses.txt 05:30:26 -rw-rw-r--. 1 jenkins jenkins 570 Sep 13 04:53 requirements.txt 05:30:26 -rw-rw-r--. 1 jenkins jenkins 26 Sep 13 04:53 env.properties 05:30:26 -rw-rw-r--. 1 jenkins jenkins 334 Sep 13 04:52 stack-parameters.yaml 05:30:26 drwxrwxr-x. 7 jenkins jenkins 4096 Sep 13 04:51 test 05:30:26 drwxrwxr-x. 2 jenkins jenkins 6 Sep 13 04:51 test@tmp 05:30:26 + true 05:30:26 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/sh /tmp/jenkins1362356276926531920.sh 05:30:26 Cleaning up Robot installation... 05:30:26 $ ssh-agent -k 05:30:26 unset SSH_AUTH_SOCK; 05:30:26 unset SSH_AGENT_PID; 05:30:26 echo Agent pid 5293 killed; 05:30:26 [ssh-agent] Stopped. 05:30:27 Recording plot data 05:30:27 Robot results publisher started... 05:30:27 INFO: Checking test criticality is deprecated and will be dropped in a future release! 05:30:27 -Parsing output xml: 05:30:29 Done! 05:30:29 -Copying log files to build dir: 05:30:34 Done! 05:30:34 -Assigning results to build: 05:30:34 Done! 05:30:34 -Checking thresholds: 05:30:34 Done! 05:30:34 Done publishing Robot results. 05:30:34 Build step 'Publish Robot Framework test results' changed build result to UNSTABLE 05:30:34 [PostBuildScript] - [INFO] Executing post build scripts. 05:30:34 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins3593421028158055093.sh 05:30:34 Archiving csit artifacts 05:30:34 mv: cannot stat '*_1.png': No such file or directory 05:30:34 mv: cannot stat '/tmp/odl1_*': No such file or directory 05:30:34 mv: cannot stat '*_2.png': No such file or directory 05:30:34 mv: cannot stat '/tmp/odl2_*': No such file or directory 05:30:34 mv: cannot stat '*_3.png': No such file or directory 05:30:34 mv: cannot stat '/tmp/odl3_*': No such file or directory 05:30:34 % Total % Received % Xferd Average Speed Time Time Time Current 05:30:34 Dload Upload Total Spent Left Speed 05:30:34 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 264k 0 264k 0 0 1306k 0 --:--:-- --:--:-- --:--:-- 1306k 100 6440k 0 6440k 0 0 5348k 0 --:--:-- 0:00:01 --:--:-- 5344k 100 10.1M 0 10.1M 0 0 4732k 0 --:--:-- 0:00:02 --:--:-- 4730k 100 12.0M 0 12.0M 0 0 4623k 0 --:--:-- 0:00:02 --:--:-- 4621k 05:30:37 Archive: robot-plugin.zip 05:30:37 inflating: ./archives/robot-plugin/log.html 05:30:37 inflating: ./archives/robot-plugin/output.xml 05:30:38 inflating: ./archives/robot-plugin/report.html 05:30:38 mv: cannot stat '*.log.gz': No such file or directory 05:30:38 mv: cannot stat '*.csv': No such file or directory 05:30:38 mv: cannot stat '*.png': No such file or directory 05:30:38 [PostBuildScript] - [INFO] Executing post build scripts. 05:30:38 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins7993006329866354451.sh 05:30:38 [PostBuildScript] - [INFO] Executing post build scripts. 05:30:38 [EnvInject] - Injecting environment variables from a build step. 
05:30:38 [EnvInject] - Injecting as environment variables the properties content 05:30:38 OS_CLOUD=vex 05:30:38 OS_STACK_NAME=releng-openflowplugin-csit-3node-clustering-only-titanium-406 05:30:38 05:30:38 [EnvInject] - Variables injected successfully. 05:30:38 provisioning config files... 05:30:38 copy managed file [clouds-yaml] to file:/home/jenkins/.config/openstack/clouds.yaml 05:30:38 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins9677365012717913488.sh 05:30:38 ---> openstack-stack-delete.sh 05:30:38 Setup pyenv: 05:30:38 system 05:30:38 3.8.13 05:30:38 3.9.13 05:30:38 3.10.13 05:30:38 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 05:30:39 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-l3Cx from file:/tmp/.os_lf_venv 05:30:40 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 05:30:40 lftools 0.37.13 requires urllib3<2.1.0, but you have urllib3 2.5.0 which is incompatible. 05:30:41 lf-activate-venv(): INFO: Installing: lftools[openstack] kubernetes python-heatclient python-openstackclient 05:30:59 lf-activate-venv(): INFO: Adding /tmp/venv-l3Cx/bin to PATH 05:30:59 INFO: Retrieving stack cost for: releng-openflowplugin-csit-3node-clustering-only-titanium-406 05:31:04 DEBUG: Successfully retrieved stack cost: total: 0.38999999999999996 05:31:16 INFO: Deleting stack releng-openflowplugin-csit-3node-clustering-only-titanium-406 05:31:16 Successfully deleted stack releng-openflowplugin-csit-3node-clustering-only-titanium-406 05:31:16 [PostBuildScript] - [INFO] Executing post build scripts. 05:31:16 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins11009268263236933845.sh 05:31:16 ---> sysstat.sh 05:31:16 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins16491338274777195100.sh 05:31:16 ---> package-listing.sh 05:31:16 ++ facter osfamily 05:31:16 ++ tr '[:upper:]' '[:lower:]' 05:31:16 + OS_FAMILY=redhat 05:31:16 + workspace=/w/workspace/openflowplugin-csit-3node-clustering-only-titanium 05:31:16 + START_PACKAGES=/tmp/packages_start.txt 05:31:16 + END_PACKAGES=/tmp/packages_end.txt 05:31:16 + DIFF_PACKAGES=/tmp/packages_diff.txt 05:31:16 + PACKAGES=/tmp/packages_start.txt 05:31:16 + '[' /w/workspace/openflowplugin-csit-3node-clustering-only-titanium ']' 05:31:16 + PACKAGES=/tmp/packages_end.txt 05:31:16 + case "${OS_FAMILY}" in 05:31:16 + rpm -qa 05:31:16 + sort 05:31:17 + '[' -f /tmp/packages_start.txt ']' 05:31:17 + '[' -f /tmp/packages_end.txt ']' 05:31:17 + diff /tmp/packages_start.txt /tmp/packages_end.txt 05:31:17 + '[' /w/workspace/openflowplugin-csit-3node-clustering-only-titanium ']' 05:31:17 + mkdir -p /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/archives/ 05:31:17 + cp -f /tmp/packages_diff.txt /tmp/packages_end.txt /tmp/packages_start.txt /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/archives/ 05:31:17 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins6326048019814573508.sh 05:31:17 ---> capture-instance-metadata.sh 05:31:17 Setup pyenv: 05:31:17 system 05:31:17 3.8.13 05:31:17 3.9.13 05:31:17 3.10.13 05:31:17 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 05:31:17 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-l3Cx from file:/tmp/.os_lf_venv 05:31:19 lf-activate-venv(): INFO: Installing: lftools 05:31:29 
lf-activate-venv(): INFO: Adding /tmp/venv-l3Cx/bin to PATH 05:31:29 INFO: Running in OpenStack, capturing instance metadata 05:31:29 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins15873546438018841387.sh 05:31:29 provisioning config files... 05:31:30 Could not find credentials [logs] for openflowplugin-csit-3node-clustering-only-titanium #406 05:31:30 copy managed file [jenkins-log-archives-settings] to file:/w/workspace/openflowplugin-csit-3node-clustering-only-titanium@tmp/config5075790470039744120tmp 05:31:30 Regular expression run condition: Expression=[^.*logs-s3.*], Label=[odl-logs-s3-cloudfront-index] 05:31:30 Run condition [Regular expression match] enabling perform for step [Provide Configuration files] 05:31:30 provisioning config files... 05:31:30 copy managed file [jenkins-s3-log-ship] to file:/home/jenkins/.aws/credentials 05:31:30 [EnvInject] - Injecting environment variables from a build step. 05:31:30 [EnvInject] - Injecting as environment variables the properties content 05:31:30 SERVER_ID=logs 05:31:30 05:31:30 [EnvInject] - Variables injected successfully. 05:31:30 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins1828919650267431782.sh 05:31:30 ---> create-netrc.sh 05:31:30 WARN: Log server credential not found. 05:31:30 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins15913017651238598488.sh 05:31:30 ---> python-tools-install.sh 05:31:30 Setup pyenv: 05:31:30 system 05:31:30 3.8.13 05:31:30 3.9.13 05:31:30 3.10.13 05:31:30 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 05:31:30 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-l3Cx from file:/tmp/.os_lf_venv 05:31:32 lf-activate-venv(): INFO: Installing: lftools 05:31:42 lf-activate-venv(): INFO: Adding /tmp/venv-l3Cx/bin to PATH 05:31:42 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins10850242040733322237.sh 05:31:42 ---> sudo-logs.sh 05:31:42 Archiving 'sudo' log.. 05:31:42 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins12755391204848765405.sh 05:31:42 ---> job-cost.sh 05:31:42 Setup pyenv: 05:31:42 system 05:31:42 3.8.13 05:31:42 3.9.13 05:31:42 3.10.13 05:31:42 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 05:31:43 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-l3Cx from file:/tmp/.os_lf_venv 05:31:44 lf-activate-venv(): INFO: Installing: zipp==1.1.0 python-openstackclient urllib3~=1.26.15 05:31:52 lf-activate-venv(): INFO: Adding /tmp/venv-l3Cx/bin to PATH 05:31:52 DEBUG: total: 0.38999999999999996 05:31:52 INFO: Retrieving Stack Cost... 
05:31:52 INFO: Retrieving Pricing Info for: v3-standard-2 05:31:52 INFO: Archiving Costs 05:31:52 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash -l /tmp/jenkins5208220347922393.sh 05:31:52 ---> logs-deploy.sh 05:31:52 Setup pyenv: 05:31:53 system 05:31:53 3.8.13 05:31:53 3.9.13 05:31:53 3.10.13 05:31:53 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 05:31:53 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-l3Cx from file:/tmp/.os_lf_venv 05:31:55 lf-activate-venv(): INFO: Installing: lftools 05:32:05 lf-activate-venv(): INFO: Adding /tmp/venv-l3Cx/bin to PATH 05:32:05 WARNING: Nexus logging server not set 05:32:05 INFO: S3 path logs/releng/vex-yul-odl-jenkins-1/openflowplugin-csit-3node-clustering-only-titanium/406/ 05:32:05 INFO: archiving logs to S3 05:32:06 ---> uname -a: 05:32:06 Linux prd-centos8-robot-2c-8g-52087.novalocal 4.18.0-553.5.1.el8.x86_64 #1 SMP Tue May 21 05:46:01 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux 05:32:06 05:32:06 05:32:06 ---> lscpu: 05:32:06 Architecture: x86_64 05:32:06 CPU op-mode(s): 32-bit, 64-bit 05:32:06 Byte Order: Little Endian 05:32:06 CPU(s): 2 05:32:06 On-line CPU(s) list: 0,1 05:32:06 Thread(s) per core: 1 05:32:06 Core(s) per socket: 1 05:32:06 Socket(s): 2 05:32:06 NUMA node(s): 1 05:32:06 Vendor ID: AuthenticAMD 05:32:06 CPU family: 23 05:32:06 Model: 49 05:32:06 Model name: AMD EPYC-Rome Processor 05:32:06 Stepping: 0 05:32:06 CPU MHz: 2800.000 05:32:06 BogoMIPS: 5600.00 05:32:06 Virtualization: AMD-V 05:32:06 Hypervisor vendor: KVM 05:32:06 Virtualization type: full 05:32:06 L1d cache: 32K 05:32:06 L1i cache: 32K 05:32:06 L2 cache: 512K 05:32:06 L3 cache: 16384K 05:32:06 NUMA node0 CPU(s): 0,1 05:32:06 Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 xsaves clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities 05:32:06 05:32:06 05:32:06 ---> nproc: 05:32:06 2 05:32:06 05:32:06 05:32:06 ---> df -h: 05:32:06 Filesystem Size Used Avail Use% Mounted on 05:32:06 devtmpfs 3.8G 0 3.8G 0% /dev 05:32:06 tmpfs 3.8G 0 3.8G 0% /dev/shm 05:32:06 tmpfs 3.8G 17M 3.8G 1% /run 05:32:06 tmpfs 3.8G 0 3.8G 0% /sys/fs/cgroup 05:32:06 /dev/vda1 40G 8.7G 32G 22% / 05:32:06 tmpfs 770M 0 770M 0% /run/user/1001 05:32:06 05:32:06 05:32:06 ---> free -m: 05:32:06 total used free shared buff/cache available 05:32:06 Mem: 7697 660 4482 19 2554 6738 05:32:06 Swap: 1023 0 1023 05:32:06 05:32:06 05:32:06 ---> ip addr: 05:32:06 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 05:32:06 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 05:32:06 inet 127.0.0.1/8 scope host lo 05:32:06 valid_lft forever preferred_lft forever 05:32:06 inet6 ::1/128 scope host 05:32:06 valid_lft forever preferred_lft forever 05:32:06 2: eth0: mtu 1458 qdisc mq state UP group default qlen 1000 05:32:06 link/ether fa:16:3e:4b:d9:c2 brd ff:ff:ff:ff:ff:ff 05:32:06 altname enp0s3 05:32:06 altname ens3 05:32:06 inet 10.30.171.215/23 brd 10.30.171.255 scope global dynamic noprefixroute eth0 05:32:06 valid_lft 
83881sec preferred_lft 83881sec 05:32:06 inet6 fe80::f816:3eff:fe4b:d9c2/64 scope link 05:32:06 valid_lft forever preferred_lft forever 05:32:06 05:32:06 05:32:06 ---> sar -b -r -n DEV: 05:32:06 Linux 4.18.0-553.5.1.el8.x86_64 (centos-stream-8-robot-7d7a37eb-bc14-4dd6-9530-dc22c5eae738.noval) 09/13/2025 _x86_64_ (2 CPU) 05:32:06 05:32:06 04:50:04 LINUX RESTART (2 CPU) 05:32:06 05:32:06 04:51:01 AM tps rtps wtps bread/s bwrtn/s 05:32:06 04:52:01 AM 90.25 6.13 84.12 1089.50 6780.66 05:32:06 04:53:01 AM 79.39 0.75 78.64 53.46 10963.99 05:32:06 04:54:01 AM 46.32 7.10 39.22 1309.96 2346.83 05:32:06 04:55:01 AM 53.60 0.25 53.35 47.05 7714.05 05:32:06 04:56:01 AM 5.27 0.00 5.27 0.00 231.01 05:32:06 04:57:01 AM 0.28 0.00 0.28 0.00 4.30 05:32:06 04:58:01 AM 0.17 0.00 0.17 0.00 1.18 05:32:06 04:59:01 AM 2.18 0.22 1.97 11.73 113.50 05:32:06 05:00:01 AM 0.37 0.00 0.37 0.00 117.95 05:32:06 05:01:01 AM 0.70 0.02 0.68 0.13 369.95 05:32:06 05:02:01 AM 0.50 0.07 0.43 1.73 204.53 05:32:06 05:03:01 AM 0.28 0.00 0.28 0.00 76.69 05:32:06 05:04:01 AM 0.17 0.00 0.17 0.00 48.79 05:32:06 05:05:01 AM 0.47 0.03 0.43 0.27 342.22 05:32:06 05:06:01 AM 1.55 0.00 1.55 0.00 311.31 05:32:06 05:07:01 AM 0.17 0.00 0.17 0.00 80.37 05:32:06 05:08:01 AM 0.27 0.03 0.23 1.87 177.34 05:32:06 05:09:01 AM 0.18 0.00 0.18 0.00 24.70 05:32:06 05:10:01 AM 0.40 0.00 0.40 0.00 189.69 05:32:06 05:11:01 AM 1.10 0.32 0.78 6.93 336.07 05:32:06 05:12:01 AM 1.00 0.00 1.00 0.00 200.45 05:32:06 05:13:01 AM 0.38 0.00 0.38 0.00 462.80 05:32:06 05:14:01 AM 0.33 0.00 0.33 0.00 320.63 05:32:06 05:15:01 AM 0.35 0.00 0.35 0.00 260.63 05:32:06 05:16:01 AM 0.68 0.00 0.68 0.00 566.36 05:32:06 05:17:01 AM 0.32 0.00 0.32 0.00 287.17 05:32:06 05:18:01 AM 0.30 0.00 0.30 0.00 368.69 05:32:06 05:19:01 AM 0.15 0.00 0.15 0.00 5.25 05:32:06 05:20:01 AM 0.25 0.00 0.25 0.00 157.43 05:32:06 05:21:01 AM 0.65 0.00 0.65 0.00 456.87 05:32:06 05:22:01 AM 0.25 0.00 0.25 0.00 14.03 05:32:06 05:23:01 AM 0.22 0.00 0.22 0.00 26.56 05:32:06 05:24:01 AM 0.22 0.00 0.22 0.00 35.19 05:32:06 05:25:01 AM 0.25 0.00 0.25 0.00 12.95 05:32:06 05:26:01 AM 0.33 0.00 0.33 0.00 30.21 05:32:06 05:27:01 AM 0.15 0.00 0.15 0.00 24.40 05:32:06 05:28:01 AM 0.45 0.00 0.45 0.00 456.73 05:32:06 05:29:01 AM 0.40 0.00 0.40 0.00 89.12 05:32:06 05:30:01 AM 0.32 0.00 0.32 0.00 34.18 05:32:06 05:31:01 AM 15.35 0.32 15.04 36.86 778.08 05:32:06 05:32:01 AM 42.45 6.43 36.02 628.72 7771.21 05:32:06 Average: 8.50 0.53 7.97 77.77 1043.81 05:32:06 05:32:06 04:51:01 AM kbmemfree kbavail kbmemused %memused kbbuffers kbcached kbcommit %commit kbactive kbinact kbdirty 05:32:06 04:52:01 AM 5211116 6997988 2671316 33.89 2688 1965520 737252 8.25 195056 2144740 241548 05:32:06 04:53:01 AM 5218568 7056060 2663864 33.79 2688 2013800 658648 7.37 208220 2108612 8880 05:32:06 04:54:01 AM 4895804 6983408 2986628 37.89 2688 2257024 755852 8.46 281004 2337268 181504 05:32:06 04:55:01 AM 4924168 7016760 2958264 37.53 2688 2259996 683708 7.66 262844 2314668 44 05:32:06 04:56:01 AM 4925840 7018216 2956592 37.51 2688 2259996 683708 7.66 262848 2314396 16 05:32:06 04:57:01 AM 4926100 7018480 2956332 37.51 2688 2260000 683708 7.66 262848 2314336 8 05:32:06 04:58:01 AM 4926144 7018524 2956288 37.50 2688 2260000 683708 7.66 262848 2314336 4 05:32:06 04:59:01 AM 4873152 6968184 3009280 38.18 2688 2262684 781652 8.75 262980 2366172 1044 05:32:06 05:00:01 AM 4860776 6961760 3021656 38.33 2688 2268628 805060 9.01 262980 2378420 3580 05:32:06 05:01:01 AM 4827704 6939400 3054728 38.75 2688 2279344 821296 9.20 262980 2410672 3408 05:32:06 
05:32:06 05:02:01 AM 4820596 6935104 3061836 38.84 2688 2282196 821296 9.20 263108 2418680 204
05:32:06 05:03:01 AM 4816004 6932596 3066428 38.90 2688 2284216 821296 9.20 263108 2423488 4
05:32:06 05:04:01 AM 4809080 6931576 3073352 38.99 2688 2290148 821296 9.20 263108 2429516 4508
05:32:06 05:05:01 AM 4798748 6931876 3083684 39.12 2688 2300772 844308 9.45 263276 2439884 4948
05:32:06 05:06:01 AM 4795528 6932116 3086904 39.16 2688 2304232 844308 9.45 263276 2443448 44
05:32:06 05:07:01 AM 4791216 6933072 3091216 39.22 2688 2309508 844308 9.45 263360 2448000 2968
05:32:06 05:08:01 AM 4788700 6933260 3093732 39.25 2688 2312160 802000 8.98 263552 2450560 272
05:32:06 05:09:01 AM 4785900 6932888 3096532 39.28 2688 2314644 802000 8.98 263780 2452824 2076
05:32:06 05:10:01 AM 4781528 6932468 3100904 39.34 2688 2318548 855888 9.58 263832 2456908 432
05:32:06 05:11:01 AM 4769612 6930596 3112820 39.49 2688 2328584 855888 9.58 263964 2468720 500
05:32:06 05:12:01 AM 4758796 6927408 3123636 39.63 2688 2336268 855888 9.58 263964 2478524 2772
05:32:06 05:13:01 AM 4752808 6932560 3129624 39.70 2688 2347368 855888 9.58 263964 2485504 16
05:32:06 05:14:01 AM 4725144 6921520 3157288 40.05 2688 2363968 855888 9.58 263964 2513352 2976
05:32:06 05:15:01 AM 4728236 6931056 3154196 40.02 2688 2370460 855888 9.58 263964 2509268 5784
05:32:06 05:16:01 AM 4708684 6923080 3173748 40.26 2688 2382008 855888 9.58 263968 2529756 560
05:32:06 05:17:01 AM 4693364 6922236 3189068 40.46 2688 2396420 855888 9.58 263964 2544596 6396
05:32:06 05:18:01 AM 4699012 6932540 3183420 40.39 2688 2401072 855888 9.58 263964 2539456 16
05:32:06 05:19:01 AM 4698480 6932360 3183952 40.39 2688 2401428 855888 9.58 263964 2539864 268
05:32:06 05:20:01 AM 4685736 6928352 3196696 40.55 2688 2410220 851744 9.54 263964 2551184 4376
05:32:06 05:21:01 AM 4682840 6934944 3199592 40.59 2688 2419624 813016 9.10 263964 2556024 264
05:32:06 05:22:01 AM 4681984 6934492 3200448 40.60 2688 2420060 813016 9.10 263964 2556328 384
05:32:06 05:23:01 AM 4681072 6934284 3201360 40.61 2688 2420736 813016 9.10 263964 2557184 340
05:32:06 05:24:01 AM 4680720 6934568 3201712 40.62 2688 2421412 813016 9.10 263964 2557616 4
05:32:06 05:25:01 AM 4679920 6934496 3202512 40.63 2688 2422116 813016 9.10 263964 2558344 404
05:32:06 05:26:01 AM 4679300 6934572 3203132 40.64 2688 2422788 813016 9.10 263964 2559108 332
05:32:06 05:27:01 AM 4678856 6934740 3203576 40.64 2688 2423404 813016 9.10 263964 2559652 248
05:32:06 05:28:01 AM 4659380 6929932 3223052 40.89 2688 2438048 849848 9.52 264112 2578668 1268
05:32:06 05:29:01 AM 4655140 6927284 3227292 40.94 2688 2439632 866296 9.70 264120 2582556 376
05:32:06 05:30:01 AM 4654528 6927860 3227904 40.95 2688 2440824 885944 9.92 264120 2583440 656
05:32:06 05:31:01 AM 4621260 6909000 3261172 41.37 2688 2459932 756828 8.47 480756 2417044 219220
05:32:06 05:32:01 AM 4583768 6892372 3298664 41.85 2688 2481480 778400 8.72 642596 2297912 27056
05:32:06 Average: 4778910 6945951 3103522 39.37 2688 2335397 808157 9.05 275564 2451001 17798
05:32:06
05:32:06 04:51:01 AM IFACE rxpck/s txpck/s rxkB/s txkB/s rxcmp/s txcmp/s rxmcst/s %ifutil
05:32:06 04:52:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 04:52:01 AM eth0 155.30 110.60 1321.34 22.86 0.00 0.00 0.00 0.00
05:32:06 04:53:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 04:53:01 AM eth0 64.32 39.08 586.97 6.75 0.00 0.00 0.00 0.00
05:32:06 04:54:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 04:54:01 AM eth0 42.55 37.35 331.22 8.90 0.00 0.00 0.00 0.00
05:32:06 04:55:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 04:55:01 AM eth0 447.17 347.75 103.79 86.87 0.00 0.00 0.00 0.00
05:32:06 04:56:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 04:56:01 AM eth0 2.50 2.03 0.49 0.50 0.00 0.00 0.00 0.00
05:32:06 04:57:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 04:57:01 AM eth0 3.02 2.32 0.55 0.45 0.00 0.00 0.00 0.00
05:32:06 04:58:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 04:58:01 AM eth0 1.25 0.70 0.16 0.12 0.00 0.00 0.00 0.00
05:32:06 04:59:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 04:59:01 AM eth0 46.65 35.85 19.69 4.51 0.00 0.00 0.00 0.00
05:32:06 05:00:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:00:01 AM eth0 11.06 8.12 24.84 1.52 0.00 0.00 0.00 0.00
05:32:06 05:01:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:01:01 AM eth0 13.24 11.11 50.54 1.84 0.00 0.00 0.00 0.00
05:32:06 05:02:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:02:01 AM eth0 9.93 9.25 11.13 1.45 0.00 0.00 0.00 0.00
05:32:06 05:03:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:03:01 AM eth0 26.97 25.72 6.97 3.75 0.00 0.00 0.00 0.00
05:32:06 05:04:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:04:01 AM eth0 7.70 8.20 27.40 1.57 0.00 0.00 0.00 0.00
05:32:06 05:05:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:05:01 AM eth0 14.83 11.31 50.49 1.61 0.00 0.00 0.00 0.00
05:32:06 05:06:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:06:01 AM eth0 9.33 10.31 14.48 1.61 0.00 0.00 0.00 0.00
05:32:06 05:07:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:07:01 AM eth0 98.63 99.40 34.63 7.80 0.00 0.00 0.00 0.00
05:32:06 05:08:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:08:01 AM eth0 66.11 65.44 19.48 5.37 0.00 0.00 0.00 0.00
05:32:06 05:09:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:09:01 AM eth0 27.10 26.18 15.21 2.34 0.00 0.00 0.00 0.00
05:32:06 05:10:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:10:01 AM eth0 68.29 68.51 11.92 13.95 0.00 0.00 0.00 0.00
05:32:06 05:11:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:11:01 AM eth0 48.75 50.52 147.03 4.01 0.00 0.00 0.00 0.00
05:32:06 05:12:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:12:01 AM eth0 46.33 45.38 115.42 3.47 0.00 0.00 0.00 0.00
05:32:06 05:13:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:13:01 AM eth0 63.78 63.45 160.67 4.99 0.00 0.00 0.00 0.00
05:32:06 05:14:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:14:01 AM eth0 81.15 77.55 272.54 5.53 0.00 0.00 0.00 0.00
05:32:06 05:15:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:15:01 AM eth0 43.15 42.29 96.20 3.24 0.00 0.00 0.00 0.00
05:32:06 05:16:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:16:01 AM eth0 50.77 47.46 179.37 3.70 0.00 0.00 0.00 0.00
05:32:06 05:17:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:17:01 AM eth0 63.66 56.82 220.34 4.63 0.00 0.00 0.00 0.00
05:32:06 05:18:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:18:01 AM eth0 59.70 60.78 61.44 4.80 0.00 0.00 0.00 0.00
05:32:06 05:19:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:19:01 AM eth0 37.86 37.89 6.37 2.91 0.00 0.00 0.00 0.00
05:32:06 05:20:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:20:01 AM eth0 43.40 43.00 127.61 3.49 0.00 0.00 0.00 0.00
05:32:06 05:21:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:21:01 AM eth0 42.24 37.63 148.67 3.81 0.00 0.00 0.00 0.00
05:32:06 05:22:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:22:01 AM eth0 3.17 2.58 1.67 0.46 0.00 0.00 0.00 0.00
05:32:06 05:23:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:23:01 AM eth0 1.02 1.13 1.52 0.40 0.00 0.00 0.00 0.00
05:32:06 05:24:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:24:01 AM eth0 0.82 1.07 1.32 0.22 0.00 0.00 0.00 0.00
05:32:06 05:25:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:25:01 AM eth0 0.78 0.97 1.27 0.21 0.00 0.00 0.00 0.00
05:32:06 05:26:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:26:01 AM eth0 0.95 1.22 1.41 0.28 0.00 0.00 0.00 0.00
05:32:06 05:27:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:27:01 AM eth0 8.95 9.35 2.04 0.94 0.00 0.00 0.00 0.00
05:32:06 05:28:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:28:01 AM eth0 78.94 78.02 234.90 6.73 0.00 0.00 0.00 0.00
05:32:06 05:29:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:29:01 AM eth0 73.14 52.11 12.95 4.58 0.00 0.00 0.00 0.00
05:32:06 05:30:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:30:01 AM eth0 30.13 20.33 5.38 2.35 0.00 0.00 0.00 0.00
05:32:06 05:31:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:31:01 AM eth0 175.73 115.42 307.11 364.07 0.00 0.00 0.00 0.00
05:32:06 05:32:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 05:32:01 AM eth0 19.81 19.13 16.21 8.86 0.00 0.00 0.00 0.00
05:32:06 Average: lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:32:06 Average: eth0 50.99 43.50 115.93 14.83 0.00 0.00 0.00 0.00
05:32:06
05:32:06
05:32:06 ---> sar -P ALL:
05:32:06 Linux 4.18.0-553.5.1.el8.x86_64 (centos-stream-8-robot-7d7a37eb-bc14-4dd6-9530-dc22c5eae738.noval) 09/13/2025 _x86_64_ (2 CPU)
05:32:06
05:32:06 04:50:04 LINUX RESTART (2 CPU)
05:32:06
05:32:06 04:51:01 AM CPU %user %nice %system %iowait %steal %idle
05:32:06 04:52:01 AM all 36.66 0.00 5.57 1.67 0.12 55.99
05:32:06 04:52:01 AM 0 48.69 0.00 6.73 2.13 0.12 42.33
05:32:06 04:52:01 AM 1 24.62 0.00 4.40 1.20 0.12 69.66
05:32:06 04:53:01 AM all 17.77 0.00 3.16 3.55 0.08 75.43
05:32:06 04:53:01 AM 0 23.87 0.00 3.62 3.82 0.08 68.60
05:32:06 04:53:01 AM 1 11.66 0.00 2.71 3.27 0.08 82.27
05:32:06 04:54:01 AM all 27.59 0.00 3.97 1.00 0.10 67.34
05:32:06 04:54:01 AM 0 35.76 0.00 4.72 1.17 0.08 58.26
05:32:06 04:54:01 AM 1 19.43 0.00 3.22 0.83 0.12 76.40
05:32:06 04:55:01 AM all 16.54 0.00 3.68 1.19 0.08 78.51
05:32:06 04:55:01 AM 0 15.53 0.00 3.55 1.19 0.10 79.63
05:32:06 04:55:01 AM 1 17.56 0.00 3.80 1.19 0.07 77.38
05:32:06 04:56:01 AM all 0.39 0.00 0.12 0.02 0.05 99.42
05:32:06 04:56:01 AM 0 0.60 0.00 0.08 0.00 0.05 99.27
05:32:06 04:56:01 AM 1 0.18 0.00 0.15 0.03 0.05 99.58
05:32:06 04:57:01 AM all 0.27 0.00 0.09 0.00 0.03 99.62
05:32:06 04:57:01 AM 0 0.43 0.00 0.08 0.00 0.03 99.45
05:32:06 04:57:01 AM 1 0.10 0.00 0.10 0.00 0.02 99.78
05:32:06 04:58:01 AM all 0.33 0.00 0.06 0.00 0.04 99.57
05:32:06 04:58:01 AM 0 0.58 0.00 0.07 0.00 0.05 99.30
05:32:06 04:58:01 AM 1 0.07 0.00 0.05 0.00 0.03 99.85
05:32:06 04:59:01 AM all 6.65 0.00 0.71 0.01 0.08 92.54
05:32:06 04:59:01 AM 0 6.75 0.00 0.64 0.02 0.10 92.50
05:32:06 04:59:01 AM 1 6.56 0.00 0.79 0.00 0.07 92.59
05:32:06 05:00:01 AM all 7.64 0.00 0.33 0.00 0.09 91.94
05:32:06 05:00:01 AM 0 14.02 0.00 0.42 0.00 0.10 85.46
05:32:06 05:00:01 AM 1 1.25 0.00 0.23 0.00 0.08 98.43
05:32:06 05:01:01 AM all 13.11 0.00 0.53 0.05 0.09 86.22
05:32:06 05:01:01 AM 0 7.80 0.00 0.39 0.10 0.08 91.63
05:32:06 05:01:01 AM 1 18.41 0.00 0.67 0.00 0.10 80.82
05:32:06 05:02:01 AM all 4.82 0.00 0.44 0.03 0.08 94.63
05:32:06 05:02:01 AM 0 4.52 0.00 0.52 0.07 0.07 94.83
05:32:06 05:02:01 AM 1 5.11 0.00 0.37 0.00 0.08 94.44
05:32:06
05:32:06 05:02:01 AM CPU %user %nice %system %iowait %steal %idle
05:32:06 05:03:01 AM all 6.38 0.00 0.53 0.00 0.08 93.01
05:32:06 05:03:01 AM 0 6.82 0.00 0.63 0.00 0.08 92.46
05:32:06 05:03:01 AM 1 5.95 0.00 0.42 0.00 0.07 93.57
05:32:06 05:04:01 AM all 7.15 0.00 0.29 0.03 0.08 92.45
05:32:06 05:04:01 AM 0 11.18 0.00 0.40 0.00 0.10 88.32
05:32:06 05:04:01 AM 1 3.11 0.00 0.18 0.05 0.07 96.59
05:32:06 05:05:01 AM all 13.25 0.00 0.59 0.06 0.09 86.01
05:32:06 05:05:01 AM 0 8.46 0.00 0.55 0.03 0.08 90.87
05:32:06 05:05:01 AM 1 18.04 0.00 0.63 0.08 0.10 81.14
05:32:06 05:06:01 AM all 5.59 0.00 0.51 0.03 0.09 93.78
05:32:06 05:06:01 AM 0 5.91 0.00 0.58 0.02 0.10 93.39
05:32:06 05:06:01 AM 1 5.26 0.00 0.43 0.05 0.08 94.17
05:32:06 05:07:01 AM all 8.82 0.00 1.22 0.00 0.08 89.87
05:32:06 05:07:01 AM 0 11.77 0.00 1.59 0.00 0.08 86.55
05:32:06 05:07:01 AM 1 5.87 0.00 0.86 0.00 0.08 93.19
05:32:06 05:08:01 AM all 6.37 0.00 0.76 0.03 0.08 92.77
05:32:06 05:08:01 AM 0 2.02 0.00 0.43 0.00 0.07 97.47
05:32:06 05:08:01 AM 1 10.73 0.00 1.09 0.05 0.08 88.05
05:32:06 05:09:01 AM all 8.04 0.00 0.85 0.00 0.07 91.03
05:32:06 05:09:01 AM 0 12.97 0.00 1.27 0.00 0.07 85.69
05:32:06 05:09:01 AM 1 3.31 0.00 0.45 0.00 0.07 96.17
05:32:06 05:10:01 AM all 7.23 0.00 0.40 0.03 0.10 92.24
05:32:06 05:10:01 AM 0 11.69 0.00 0.45 0.05 0.08 87.72
05:32:06 05:10:01 AM 1 2.78 0.00 0.35 0.00 0.12 96.75
05:32:06 05:11:01 AM all 4.22 0.00 0.57 0.06 0.08 95.08
05:32:06 05:11:01 AM 0 5.35 0.00 0.75 0.10 0.07 93.73
05:32:06 05:11:01 AM 1 3.08 0.00 0.39 0.02 0.08 96.43
05:32:06 05:12:01 AM all 2.34 0.00 0.23 0.01 0.08 97.33
05:32:06 05:12:01 AM 0 1.69 0.00 0.25 0.02 0.08 97.96
05:32:06 05:12:01 AM 1 3.00 0.00 0.22 0.00 0.08 96.70
05:32:06 05:13:01 AM all 5.76 0.00 0.66 0.05 0.08 93.45
05:32:06 05:13:01 AM 0 8.54 0.00 0.90 0.00 0.08 90.47
05:32:06 05:13:01 AM 1 2.97 0.00 0.42 0.10 0.08 96.43
05:32:06
05:32:06 05:13:01 AM CPU %user %nice %system %iowait %steal %idle
05:32:06 05:14:01 AM all 2.54 0.00 0.29 0.06 0.08 97.02
05:32:06 05:14:01 AM 0 2.69 0.00 0.33 0.10 0.08 96.79
05:32:06 05:14:01 AM 1 2.39 0.00 0.25 0.02 0.08 97.25
05:32:06 05:15:01 AM all 2.23 0.00 0.22 0.01 0.08 97.46
05:32:06 05:15:01 AM 0 3.28 0.00 0.20 0.00 0.08 96.44
05:32:06 05:15:01 AM 1 1.19 0.00 0.23 0.02 0.07 98.49
05:32:06 05:16:01 AM all 2.13 0.00 0.23 0.06 0.08 97.50
05:32:06 05:16:01 AM 0 1.92 0.00 0.25 0.12 0.07 97.64
05:32:06 05:16:01 AM 1 2.35 0.00 0.22 0.00 0.08 97.35
05:32:06 05:17:01 AM all 5.06 0.00 0.32 0.04 0.08 94.49
05:32:06 05:17:01 AM 0 6.65 0.00 0.35 0.00 0.08 92.92
05:32:06 05:17:01 AM 1 3.48 0.00 0.28 0.08 0.08 96.07
05:32:06 05:18:01 AM all 4.58 0.00 0.62 0.05 0.08 94.67
05:32:06 05:18:01 AM 0 5.31 0.00 0.69 0.00 0.08 93.92
05:32:06 05:18:01 AM 1 3.85 0.00 0.55 0.10 0.08 95.41
05:32:06 05:19:01 AM all 2.16 0.00 0.28 0.00 0.08 97.48
05:32:06 05:19:01 AM 0 1.76 0.00 0.27 0.00 0.08 97.89
05:32:06 05:19:01 AM 1 2.57 0.00 0.30 0.00 0.07 97.06
05:32:06 05:20:01 AM all 3.86 0.00 0.59 0.04 0.08 95.42
05:32:06 05:20:01 AM 0 3.16 0.00 0.52 0.08 0.07 96.17
05:32:06 05:20:01 AM 1 4.57 0.00 0.67 0.00 0.10 94.66
05:32:06 05:21:01 AM all 4.66 0.00 0.42 0.06 0.09 94.77
05:32:06 05:21:01 AM 0 3.78 0.00 0.47 0.12 0.08 95.55
05:32:06 05:21:01 AM 1 5.54 0.00 0.37 0.00 0.10 94.00
05:32:06 05:22:01 AM all 4.32 0.00 0.32 0.00 0.08 95.29
05:32:06 05:22:01 AM 0 7.85 0.00 0.52 0.00 0.07 91.56
05:32:06 05:22:01 AM 1 0.92 0.00 0.12 0.00 0.08 98.88
05:32:06 05:23:01 AM all 1.18 0.00 0.09 0.00 0.08 98.65
05:32:06 05:23:01 AM 0 0.80 0.00 0.07 0.00 0.07 99.06
05:32:06 05:23:01 AM 1 1.55 0.00 0.12 0.00 0.10 98.23
05:32:06 05:24:01 AM all 1.14 0.00 0.10 0.00 0.08 98.67
05:32:06 05:24:01 AM 0 0.73 0.00 0.10 0.00 0.07 99.10
05:32:06 05:24:01 AM 1 1.55 0.00 0.10 0.00 0.10 98.25
05:32:06
05:32:06 05:24:01 AM CPU %user %nice %system %iowait %steal %idle
05:32:06 05:25:01 AM all 1.15 0.00 0.08 0.00 0.10 98.66
05:32:06 05:25:01 AM 0 0.70 0.00 0.08 0.00 0.07 99.15
05:32:06 05:25:01 AM 1 1.60 0.00 0.08 0.00 0.13 98.18
05:32:06 05:26:01 AM all 1.07 0.00 0.12 0.00 0.09 98.72
05:32:06 05:26:01 AM 0 0.68 0.00 0.13 0.00 0.08 99.10
05:32:06 05:26:01 AM 1 1.45 0.00 0.10 0.00 0.10 98.35
05:32:06 05:27:01 AM all 1.24 0.00 0.13 0.00 0.08 98.55
05:32:06 05:27:01 AM 0 0.70 0.00 0.13 0.00 0.07 99.10
05:32:06 05:27:01 AM 1 1.77 0.00 0.13 0.00 0.10 97.99
05:32:06 05:28:01 AM all 4.87 0.00 0.53 0.03 0.08 94.48
05:32:06 05:28:01 AM 0 6.40 0.00 0.42 0.00 0.08 93.10
05:32:06 05:28:01 AM 1 3.34 0.00 0.64 0.07 0.08 95.87
05:32:06 05:29:01 AM all 4.82 0.00 0.99 0.00 0.08 94.11
05:32:06 05:29:01 AM 0 7.17 0.00 1.28 0.00 0.08 91.47
05:32:06 05:29:01 AM 1 2.48 0.00 0.70 0.00 0.07 96.75
05:32:06 05:30:01 AM all 5.09 0.00 0.53 0.00 0.10 94.28
05:32:06 05:30:01 AM 0 7.45 0.00 0.62 0.00 0.10 91.83
05:32:06 05:30:01 AM 1 2.73 0.00 0.44 0.00 0.10 96.73
05:32:06 05:31:01 AM all 28.88 0.00 2.89 0.31 0.15 67.77
05:32:06 05:31:01 AM 0 37.19 0.00 3.49 0.40 0.20 58.72
05:32:06 05:31:01 AM 1 20.56 0.00 2.29 0.22 0.10 76.84
05:32:06 05:32:01 AM all 31.28 0.23 4.10 1.25 0.08 63.06
05:32:06 05:32:01 AM 0 39.17 0.35 4.14 1.75 0.08 54.51
05:32:06 05:32:01 AM 1 23.39 0.10 4.06 0.75 0.08 71.61
05:32:06 Average: all 7.79 0.01 0.93 0.24 0.08 90.95
05:32:06 Average: 0 9.33 0.01 1.04 0.28 0.08 89.26
05:32:06 Average: 1 6.26 0.00 0.82 0.20 0.08 92.64
05:32:06
05:32:06
05:32:06