03:16:14 Started by upstream project "integration-distribution-test-titanium" build number 380 03:16:14 originally caused by: 03:16:14 Started by upstream project "autorelease-release-titanium-mvn39-openjdk21" build number 363 03:16:14 originally caused by: 03:16:14 Started by timer 03:16:14 Running as SYSTEM 03:16:14 [EnvInject] - Loading node environment variables. 03:16:14 Building remotely on prd-centos8-robot-2c-8g-3389 (centos8-robot-2c-8g) in workspace /w/workspace/openflowplugin-csit-3node-clustering-only-titanium 03:16:15 [ssh-agent] Looking for ssh-agent implementation... 03:16:15 [ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine) 03:16:15 $ ssh-agent 03:16:15 SSH_AUTH_SOCK=/tmp/ssh-T7Qpvh6PgJU5/agent.5298 03:16:15 SSH_AGENT_PID=5299 03:16:15 [ssh-agent] Started. 03:16:15 Running ssh-add (command line suppressed) 03:16:15 Identity added: /w/workspace/openflowplugin-csit-3node-clustering-only-titanium@tmp/private_key_8314801731179723805.key (/w/workspace/openflowplugin-csit-3node-clustering-only-titanium@tmp/private_key_8314801731179723805.key) 03:16:15 [ssh-agent] Using credentials jenkins (Release Engineering Jenkins Key) 03:16:15 The recommended git tool is: NONE 03:16:17 using credential opendaylight-jenkins-ssh 03:16:17 Wiping out workspace first. 03:16:17 Cloning the remote Git repository 03:16:17 Cloning repository git://devvexx.opendaylight.org/mirror/integration/test 03:16:17 > git init /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test # timeout=10 03:16:17 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/integration/test 03:16:17 > git --version # timeout=10 03:16:17 > git --version # 'git version 2.43.0' 03:16:17 using GIT_SSH to set credentials Release Engineering Jenkins Key 03:16:17 [INFO] Currently running in a labeled security context 03:16:17 [INFO] Currently SELinux is 'enforcing' on the host 03:16:17 > /usr/bin/chcon --type=ssh_home_t /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test@tmp/jenkins-gitclient-ssh9157758949300299407.key 03:16:17 Verifying host key using known hosts file 03:16:18 You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification. 
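The ssh-agent setup above follows the usual start-agent/load-key pattern; a minimal sketch (the key path is a placeholder, not the job's actual credential file):

  eval "$(ssh-agent -s)"           # start the agent; exports SSH_AUTH_SOCK and SSH_AGENT_PID
  ssh-add /path/to/private_key     # load the build key so later ssh/git calls can use it
  trap 'ssh-agent -k' EXIT         # stop the agent when the shell exits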
03:16:18 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/integration/test +refs/heads/*:refs/remotes/origin/* # timeout=10 03:16:20 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/integration/test # timeout=10 03:16:20 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10 03:16:21 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/integration/test # timeout=10 03:16:21 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/integration/test 03:16:21 using GIT_SSH to set credentials Release Engineering Jenkins Key 03:16:21 [INFO] Currently running in a labeled security context 03:16:21 [INFO] Currently SELinux is 'enforcing' on the host 03:16:21 > /usr/bin/chcon --type=ssh_home_t /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test@tmp/jenkins-gitclient-ssh7710703309038189582.key 03:16:21 Verifying host key using known hosts file 03:16:21 You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification. 03:16:21 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/integration/test master # timeout=10 03:16:21 > git rev-parse FETCH_HEAD^{commit} # timeout=10 03:16:21 Checking out Revision 9e7a2f1bec76f24ac7173c3a00f09ed1af208887 (origin/master) 03:16:21 > git config core.sparsecheckout # timeout=10 03:16:21 > git checkout -f 9e7a2f1bec76f24ac7173c3a00f09ed1af208887 # timeout=10 03:16:21 Commit message: "Add pekko templates" 03:16:21 > git rev-parse FETCH_HEAD^{commit} # timeout=10 03:16:22 > git rev-list --no-walk e12906d887353b3b6c7ca6e293959c75cf9a8409 # timeout=10 03:16:22 No emails were triggered. 03:16:22 provisioning config files... 
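The checkout above reduces to a short sequence that can be replayed by hand; a sketch using the repository URL and the revision resolved from FETCH_HEAD in this log:

  git init test && cd test
  git fetch --tags --force git://devvexx.opendaylight.org/mirror/integration/test master
  git checkout -f 9e7a2f1bec76f24ac7173c3a00f09ed1af208887   # detached checkout of the fetched revision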
03:16:22 copy managed file [npmrc] to file:/home/jenkins/.npmrc 03:16:22 copy managed file [pipconf] to file:/home/jenkins/.config/pip/pip.conf 03:16:22 copy managed file [clouds-yaml] to file:/home/jenkins/.config/openstack/clouds.yaml 03:16:22 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins18317244437160796412.sh 03:16:22 ---> python-tools-install.sh 03:16:22 Setup pyenv: 03:16:22 system 03:16:22 * 3.8.13 (set by /opt/pyenv/version) 03:16:22 * 3.9.13 (set by /opt/pyenv/version) 03:16:22 * 3.10.13 (set by /opt/pyenv/version) 03:16:22 * 3.11.7 (set by /opt/pyenv/version) 03:16:27 lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-0MxD 03:16:27 lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv 03:16:31 lf-activate-venv(): INFO: Installing: lftools 03:16:55 lf-activate-venv(): INFO: Adding /tmp/venv-0MxD/bin to PATH 03:16:55 Generating Requirements File 03:17:18 Python 3.11.7 03:17:18 pip 25.2 from /tmp/venv-0MxD/lib/python3.11/site-packages/pip (python 3.11) 03:17:18 appdirs==1.4.4 03:17:18 argcomplete==3.6.2 03:17:18 aspy.yaml==1.3.0 03:17:18 attrs==25.3.0 03:17:18 autopage==0.5.2 03:17:18 beautifulsoup4==4.13.4 03:17:18 boto3==1.40.16 03:17:18 botocore==1.40.16 03:17:18 bs4==0.0.2 03:17:18 cachetools==5.5.2 03:17:18 certifi==2025.8.3 03:17:18 cffi==1.17.1 03:17:18 cfgv==3.4.0 03:17:18 chardet==5.2.0 03:17:18 charset-normalizer==3.4.3 03:17:18 click==8.2.1 03:17:18 cliff==4.11.0 03:17:18 cmd2==2.7.0 03:17:18 cryptography==3.3.2 03:17:18 debtcollector==3.0.0 03:17:18 decorator==5.2.1 03:17:18 defusedxml==0.7.1 03:17:18 Deprecated==1.2.18 03:17:18 distlib==0.4.0 03:17:18 dnspython==2.7.0 03:17:18 docker==7.1.0 03:17:18 dogpile.cache==1.4.0 03:17:18 durationpy==0.10 03:17:18 email_validator==2.2.0 03:17:18 filelock==3.19.1 03:17:18 future==1.0.0 03:17:18 gitdb==4.0.12 03:17:18 GitPython==3.1.45 03:17:18 google-auth==2.40.3 03:17:18 httplib2==0.22.0 03:17:18 identify==2.6.13 03:17:18 idna==3.10 03:17:18 importlib-resources==1.5.0 03:17:18 iso8601==2.1.0 03:17:18 Jinja2==3.1.6 03:17:18 jmespath==1.0.1 03:17:18 jsonpatch==1.33 03:17:18 jsonpointer==3.0.0 03:17:18 jsonschema==4.25.1 03:17:18 jsonschema-specifications==2025.4.1 03:17:18 keystoneauth1==5.12.0 03:17:18 kubernetes==33.1.0 03:17:18 lftools==0.37.13 03:17:18 lxml==6.0.1 03:17:18 markdown-it-py==4.0.0 03:17:18 MarkupSafe==3.0.2 03:17:18 mdurl==0.1.2 03:17:18 msgpack==1.1.1 03:17:18 multi_key_dict==2.0.3 03:17:18 munch==4.0.0 03:17:18 netaddr==1.3.0 03:17:18 niet==1.4.2 03:17:18 nodeenv==1.9.1 03:17:18 oauth2client==4.1.3 03:17:18 oauthlib==3.3.1 03:17:18 openstacksdk==4.7.0 03:17:18 os-client-config==2.3.0 03:17:18 os-service-types==1.8.0 03:17:18 osc-lib==4.2.0 03:17:18 oslo.config==10.0.0 03:17:18 oslo.context==6.0.0 03:17:18 oslo.i18n==6.5.1 03:17:18 oslo.log==7.2.0 03:17:18 oslo.serialization==5.7.0 03:17:18 oslo.utils==9.0.0 03:17:18 packaging==25.0 03:17:18 pbr==7.0.1 03:17:18 platformdirs==4.3.8 03:17:18 prettytable==3.16.0 03:17:18 psutil==7.0.0 03:17:18 pyasn1==0.6.1 03:17:18 pyasn1_modules==0.4.2 03:17:18 pycparser==2.22 03:17:18 pygerrit2==2.0.15 03:17:18 PyGithub==2.7.0 03:17:18 Pygments==2.19.2 03:17:18 PyJWT==2.10.1 03:17:18 PyNaCl==1.5.0 03:17:18 pyparsing==2.4.7 03:17:18 pyperclip==1.9.0 03:17:18 pyrsistent==0.20.0 03:17:18 python-cinderclient==9.7.0 03:17:18 python-dateutil==2.9.0.post0 03:17:18 python-heatclient==4.3.0 03:17:18 python-jenkins==1.8.3 03:17:18 python-keystoneclient==5.6.0 03:17:18 python-magnumclient==4.8.1 03:17:18 python-openstackclient==8.2.0 
03:17:18 python-swiftclient==4.8.0 03:17:18 PyYAML==6.0.2 03:17:18 referencing==0.36.2 03:17:18 requests==2.32.5 03:17:18 requests-oauthlib==2.0.0 03:17:18 requestsexceptions==1.4.0 03:17:18 rfc3986==2.0.0 03:17:18 rich==14.1.0 03:17:18 rich-argparse==1.7.1 03:17:18 rpds-py==0.27.0 03:17:18 rsa==4.9.1 03:17:18 ruamel.yaml==0.18.15 03:17:18 ruamel.yaml.clib==0.2.12 03:17:18 s3transfer==0.13.1 03:17:18 simplejson==3.20.1 03:17:18 six==1.17.0 03:17:18 smmap==5.0.2 03:17:18 soupsieve==2.7 03:17:18 stevedore==5.4.1 03:17:18 tabulate==0.9.0 03:17:18 toml==0.10.2 03:17:18 tomlkit==0.13.3 03:17:18 tqdm==4.67.1 03:17:18 typing_extensions==4.14.1 03:17:18 tzdata==2025.2 03:17:18 urllib3==1.26.20 03:17:18 virtualenv==20.34.0 03:17:18 wcwidth==0.2.13 03:17:18 websocket-client==1.8.0 03:17:18 wrapt==1.17.3 03:17:18 xdg==6.0.0 03:17:18 xmltodict==0.14.2 03:17:18 yq==3.4.3 03:17:19 [EnvInject] - Injecting environment variables from a build step. 03:17:19 [EnvInject] - Injecting as environment variables the properties content 03:17:19 OS_STACK_TEMPLATE=csit-2-instance-type.yaml 03:17:19 OS_CLOUD=vex 03:17:19 OS_STACK_NAME=releng-openflowplugin-csit-3node-clustering-only-titanium-373 03:17:19 OS_STACK_TEMPLATE_DIR=openstack-hot 03:17:19 03:17:19 [EnvInject] - Variables injected successfully. 03:17:19 provisioning config files... 03:17:19 copy managed file [clouds-yaml] to file:/home/jenkins/.config/openstack/clouds.yaml 03:17:19 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins16609532724828418272.sh 03:17:19 ---> Create parameters file for OpenStack HOT 03:17:19 OpenStack Heat parameters generated 03:17:19 ----------------------------------- 03:17:19 parameters: 03:17:19 vm_0_count: '3' 03:17:19 vm_0_flavor: 'v3-standard-4' 03:17:19 vm_0_image: 'ZZCI - Ubuntu 22.04 - builder - x86_64 - 20250201-010426.857' 03:17:19 vm_1_count: '1' 03:17:19 vm_1_flavor: 'v3-standard-2' 03:17:19 vm_1_image: 'ZZCI - Ubuntu 22.04 - mininet-ovs-217 - x86_64 - 20250201-060151.911' 03:17:19 03:17:19 job_name: '62057-373' 03:17:19 silo: 'releng' 03:17:19 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash -l /tmp/jenkins15077465924850330129.sh 03:17:19 ---> Create HEAT stack 03:17:19 + source /home/jenkins/lf-env.sh 03:17:19 + lf-activate-venv --python python3 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient yq 03:17:19 ++ mktemp -d /tmp/venv-XXXX 03:17:19 + lf_venv=/tmp/venv-kml8 03:17:19 + local venv_file=/tmp/.os_lf_venv 03:17:19 + local python=python3 03:17:19 + local options 03:17:19 + local set_path=true 03:17:19 + local install_args= 03:17:19 ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --python python3 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient yq 03:17:19 + options=' --python '\''python3'\'' -- '\''lftools[openstack]'\'' '\''kubernetes'\'' '\''niet'\'' '\''python-heatclient'\'' '\''python-openstackclient'\'' '\''python-magnumclient'\'' '\''yq'\''' 03:17:19 + eval set -- ' --python '\''python3'\'' -- '\''lftools[openstack]'\'' '\''kubernetes'\'' '\''niet'\'' '\''python-heatclient'\'' '\''python-openstackclient'\'' '\''python-magnumclient'\'' '\''yq'\''' 03:17:19 ++ set -- --python python3 -- 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient yq 03:17:19 + true 03:17:19 + case $1 in 03:17:19 + python=python3 03:17:19 + shift 2 03:17:19 + true 03:17:19 + case $1 in 03:17:19 + shift 
03:17:19 + break 03:17:19 + case $python in 03:17:19 + local pkg_list= 03:17:19 + [[ -d /opt/pyenv ]] 03:17:19 + echo 'Setup pyenv:' 03:17:19 Setup pyenv: 03:17:19 + export PYENV_ROOT=/opt/pyenv 03:17:19 + PYENV_ROOT=/opt/pyenv 03:17:19 + export PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 03:17:19 + PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 03:17:19 + pyenv versions 03:17:19 system 03:17:19 3.8.13 03:17:19 3.9.13 03:17:19 3.10.13 03:17:19 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 03:17:19 + command -v pyenv 03:17:19 ++ pyenv init - --no-rehash 03:17:19 + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH); 03:17:19 for i in ${!paths[@]}; do 03:17:19 if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\''; 03:17:19 fi; done; 03:17:19 echo "${paths[*]}"'\'')" 03:17:19 export PATH="/opt/pyenv/shims:${PATH}" 03:17:19 export PYENV_SHELL=bash 03:17:19 source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\'' 03:17:19 pyenv() { 03:17:19 local command 03:17:19 command="${1:-}" 03:17:19 if [ "$#" -gt 0 ]; then 03:17:19 shift 03:17:19 fi 03:17:19 03:17:19 case "$command" in 03:17:19 rehash|shell) 03:17:19 eval "$(pyenv "sh-$command" "$@")" 03:17:19 ;; 03:17:19 *) 03:17:19 command pyenv "$command" "$@" 03:17:19 ;; 03:17:19 esac 03:17:19 }' 03:17:19 +++ bash --norc -ec 'IFS=:; paths=($PATH); 03:17:19 for i in ${!paths[@]}; do 03:17:19 if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\''; 03:17:19 fi; done; 03:17:19 echo "${paths[*]}"' 03:17:19 ++ PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 03:17:19 ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 03:17:19 ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 03:17:19 ++ export PYENV_SHELL=bash 03:17:19 ++ PYENV_SHELL=bash 03:17:19 ++ source /opt/pyenv/libexec/../completions/pyenv.bash 03:17:19 +++ complete -F _pyenv pyenv 03:17:19 ++ lf-pyver python3 03:17:19 ++ local py_version_xy=python3 03:17:19 ++ local py_version_xyz= 03:17:19 ++ awk '{ print $1 }' 03:17:19 ++ pyenv versions 03:17:19 ++ local command 03:17:19 ++ command=versions 03:17:19 ++ '[' 1 -gt 0 ']' 03:17:19 ++ shift 03:17:19 ++ case "$command" in 03:17:19 ++ command pyenv versions 03:17:19 ++ pyenv versions 03:17:19 ++ sed 's/^[ *]* //' 03:17:19 ++ grep -E '^[0-9.]*[0-9]$' 03:17:19 ++ [[ ! 
-s /tmp/.pyenv_versions ]] 03:17:19 +++ tail -n 1 03:17:19 +++ grep '^3' /tmp/.pyenv_versions 03:17:19 +++ sort -V 03:17:19 ++ py_version_xyz=3.11.7 03:17:19 ++ [[ -z 3.11.7 ]] 03:17:19 ++ echo 3.11.7 03:17:19 ++ return 0 03:17:19 + pyenv local 3.11.7 03:17:19 + local command 03:17:19 + command=local 03:17:19 + '[' 2 -gt 0 ']' 03:17:19 + shift 03:17:19 + case "$command" in 03:17:19 + command pyenv local 3.11.7 03:17:19 + pyenv local 3.11.7 03:17:19 + for arg in "$@" 03:17:19 + case $arg in 03:17:19 + pkg_list+='lftools[openstack] ' 03:17:19 + for arg in "$@" 03:17:19 + case $arg in 03:17:19 + pkg_list+='kubernetes ' 03:17:19 + for arg in "$@" 03:17:19 + case $arg in 03:17:19 + pkg_list+='niet ' 03:17:19 + for arg in "$@" 03:17:19 + case $arg in 03:17:19 + pkg_list+='python-heatclient ' 03:17:19 + for arg in "$@" 03:17:19 + case $arg in 03:17:19 + pkg_list+='python-openstackclient ' 03:17:19 + for arg in "$@" 03:17:19 + case $arg in 03:17:19 + pkg_list+='python-magnumclient ' 03:17:19 + for arg in "$@" 03:17:19 + case $arg in 03:17:19 + pkg_list+='yq ' 03:17:19 + [[ -f /tmp/.os_lf_venv ]] 03:17:19 ++ cat /tmp/.os_lf_venv 03:17:19 + lf_venv=/tmp/venv-0MxD 03:17:19 + echo 'lf-activate-venv(): INFO: Reuse venv:/tmp/venv-0MxD from' file:/tmp/.os_lf_venv 03:17:19 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-0MxD from file:/tmp/.os_lf_venv 03:17:19 + /tmp/venv-0MxD/bin/python3 -m pip install --upgrade --quiet pip 'setuptools<66' virtualenv 03:17:21 + [[ -z lftools[openstack] kubernetes niet python-heatclient python-openstackclient python-magnumclient yq ]] 03:17:21 + echo 'lf-activate-venv(): INFO: Installing: lftools[openstack] kubernetes niet python-heatclient python-openstackclient python-magnumclient yq ' 03:17:21 lf-activate-venv(): INFO: Installing: lftools[openstack] kubernetes niet python-heatclient python-openstackclient python-magnumclient yq 03:17:21 + /tmp/venv-0MxD/bin/python3 -m pip install --upgrade --quiet --upgrade-strategy eager 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient yq 03:17:39 + type python3 03:17:39 + true 03:17:39 + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-0MxD/bin to PATH' 03:17:39 lf-activate-venv(): INFO: Adding /tmp/venv-0MxD/bin to PATH 03:17:39 + PATH=/tmp/venv-0MxD/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 03:17:39 + return 0 03:17:39 + openstack --os-cloud vex limits show --absolute 03:17:41 +--------------------------+---------+ 03:17:41 | Name | Value | 03:17:41 +--------------------------+---------+ 03:17:41 | maxTotalInstances | -1 | 03:17:41 | maxTotalCores | 450 | 03:17:41 | maxTotalRAMSize | 1000000 | 03:17:41 | maxServerMeta | 128 | 03:17:41 | maxImageMeta | 128 | 03:17:41 | maxPersonality | 5 | 03:17:41 | maxPersonalitySize | 10240 | 03:17:41 | maxTotalKeypairs | 100 | 03:17:41 | maxServerGroups | 10 | 03:17:41 | maxServerGroupMembers | 10 | 03:17:41 | maxTotalFloatingIps | -1 | 03:17:41 | maxSecurityGroups | -1 | 03:17:41 | maxSecurityGroupRules | -1 | 03:17:41 | totalRAMUsed | 778240 | 03:17:41 | totalCoresUsed | 190 | 03:17:41 | totalInstancesUsed | 66 | 03:17:41 | totalFloatingIpsUsed | 0 | 03:17:41 | totalSecurityGroupsUsed | 0 | 03:17:41 | totalServerGroupsUsed | 0 | 03:17:41 | maxTotalVolumes | -1 | 03:17:41 | maxTotalSnapshots | 10 | 03:17:41 | maxTotalVolumeGigabytes | 4096 | 03:17:41 | maxTotalBackups | 10 | 03:17:41 | maxTotalBackupGigabytes | 1000 | 03:17:41 | 
totalVolumesUsed | 3 | 03:17:41 | totalGigabytesUsed | 60 | 03:17:41 | totalSnapshotsUsed | 0 | 03:17:41 | totalBackupsUsed | 0 | 03:17:41 | totalBackupGigabytesUsed | 0 | 03:17:41 +--------------------------+---------+ 03:17:41 + pushd /opt/ciman/openstack-hot 03:17:41 /opt/ciman/openstack-hot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium 03:17:41 + lftools openstack --os-cloud vex stack create releng-openflowplugin-csit-3node-clustering-only-titanium-373 csit-2-instance-type.yaml /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/stack-parameters.yaml 03:18:08 Creating stack releng-openflowplugin-csit-3node-clustering-only-titanium-373 03:18:08 Waiting to initialize infrastructure... 03:18:08 Stack initialization successful. 03:18:08 ------------------------------------ 03:18:08 Stack Details 03:18:08 ------------------------------------ 03:18:08 {'added': None, 03:18:08 'capabilities': [], 03:18:08 'created_at': '2025-08-23T03:17:44Z', 03:18:08 'deleted': None, 03:18:08 'deleted_at': None, 03:18:08 'description': 'No description', 03:18:08 'environment': None, 03:18:08 'environment_files': None, 03:18:08 'files': None, 03:18:08 'files_container': None, 03:18:08 'id': '937c837f-687b-4286-815f-252419556e77', 03:18:08 'is_rollback_disabled': True, 03:18:08 'links': [{'href': 'https://orchestration.public.mtl1.vexxhost.net/v1/12c36e260d8e4bb2913965203b1b491f/stacks/releng-openflowplugin-csit-3node-clustering-only-titanium-373/937c837f-687b-4286-815f-252419556e77', 03:18:08 'rel': 'self'}], 03:18:08 'location': Munch({'cloud': 'vex', 'region_name': 'ca-ymq-1', 'zone': None, 'project': Munch({'id': '12c36e260d8e4bb2913965203b1b491f', 'name': '61975f2c-7c17-4d69-82fa-c3ae420ad6fd', 'domain_id': None, 'domain_name': 'Default'})}), 03:18:08 'name': 'releng-openflowplugin-csit-3node-clustering-only-titanium-373', 03:18:08 'notification_topics': [], 03:18:08 'outputs': [{'description': 'IP addresses of the 2nd vm types', 03:18:08 'output_key': 'vm_1_ips', 03:18:08 'output_value': ['10.30.171.150']}, 03:18:08 {'description': 'IP addresses of the 1st vm types', 03:18:08 'output_key': 'vm_0_ips', 03:18:08 'output_value': ['10.30.171.230', 03:18:08 '10.30.171.111', 03:18:08 '10.30.171.29']}], 03:18:08 'owner_id': ****, 03:18:08 'parameters': {'OS::project_id': '12c36e260d8e4bb2913965203b1b491f', 03:18:08 'OS::stack_id': '937c837f-687b-4286-815f-252419556e77', 03:18:08 'OS::stack_name': 'releng-openflowplugin-csit-3node-clustering-only-titanium-373', 03:18:08 'job_name': '62057-373', 03:18:08 'silo': 'releng', 03:18:08 'vm_0_count': '3', 03:18:08 'vm_0_flavor': 'v3-standard-4', 03:18:08 'vm_0_image': 'ZZCI - Ubuntu 22.04 - builder - x86_64 - ' 03:18:08 '20250201-010426.857', 03:18:08 'vm_1_count': '1', 03:18:08 'vm_1_flavor': 'v3-standard-2', 03:18:08 'vm_1_image': 'ZZCI - Ubuntu 22.04 - mininet-ovs-217 - x86_64 ' 03:18:08 '- 20250201-060151.911'}, 03:18:08 'parent_id': None, 03:18:08 'replaced': None, 03:18:08 'status': 'CREATE_COMPLETE', 03:18:08 'status_reason': 'Stack CREATE completed successfully', 03:18:08 'tags': [], 03:18:08 'template': None, 03:18:08 'template_description': 'No description', 03:18:08 'template_url': None, 03:18:08 'timeout_mins': 15, 03:18:08 'unchanged': None, 03:18:08 'updated': None, 03:18:08 'updated_at': None, 03:18:08 'user_project_id': 'dfe791fe1aee4b5880111b59eb1611a2'} 03:18:08 ------------------------------------ 03:18:08 + popd 03:18:08 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium 03:18:08 
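The vm_0_ips / vm_1_ips outputs printed in the stack details above are what the later steps consume; they can be re-read at any time with the same CLI and jq filter this job uses further down in the log:

  openstack --os-cloud vex stack show -f json -c outputs \
      releng-openflowplugin-csit-3node-clustering-only-titanium-373 \
    | jq -r '.outputs[] | select(.output_key | match("^vm_[0-9]+_ips$")) | .output_value | .[]'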
[openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash -l /tmp/jenkins10467500184721444034.sh 03:18:08 ---> Copy SSH public keys to CSIT lab 03:18:08 Setup pyenv: 03:18:08 system 03:18:08 3.8.13 03:18:08 3.9.13 03:18:08 3.10.13 03:18:08 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 03:18:08 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-0MxD from file:/tmp/.os_lf_venv 03:18:10 lf-activate-venv(): INFO: Installing: lftools[openstack] kubernetes python-heatclient python-openstackclient 03:18:25 lf-activate-venv(): INFO: Adding /tmp/venv-0MxD/bin to PATH 03:18:27 SSH not responding on 10.30.171.111. Retrying in 10 seconds... 03:18:28 SSH not responding on 10.30.171.150. Retrying in 10 seconds... 03:18:28 SSH not responding on 10.30.171.230. Retrying in 10 seconds... 03:18:28 SSH not responding on 10.30.171.29. Retrying in 10 seconds... 03:18:38 Ping to 10.30.171.111 successful. 03:18:38 Ping to 10.30.171.150 successful. 03:18:38 Ping to 10.30.171.230 successful. 03:18:38 Ping to 10.30.171.29 successful. 03:18:39 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:18:39 releng-62057-373-0-builder-0 03:18:39 Successfully copied public keys to slave 10.30.171.230 03:18:39 Process 6518 ready. 03:18:40 SSH not responding on 10.30.171.29. Retrying in 10 seconds... 03:18:40 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:18:40 Warning: Permanently added '10.30.171.150' (ECDSA) to the list of known hosts. 03:18:40 releng-62057-373-0-builder-1 03:18:40 Successfully copied public keys to slave 10.30.171.111 03:18:40 Process 6519 ready. 03:18:41 releng-62057-373-1-mininet-ovs-217-0 03:18:41 Successfully copied public keys to slave 10.30.171.150 03:18:50 Ping to 10.30.171.29 successful. 03:18:51 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:18:51 releng-62057-373-0-builder-2 03:18:51 Successfully copied public keys to slave 10.30.171.29 03:18:51 Process 6520 ready. 03:18:51 Process 6521 ready. 03:18:51 SSH ready on all stack servers. 
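The retry chatter above (ping check, then SSH with a 10-second backoff) corresponds to a simple wait-until-ready loop; a sketch assuming a plain ssh connectivity probe rather than the actual releng script (the key-copy step itself is omitted here):

  for ip in 10.30.171.230 10.30.171.111 10.30.171.29 10.30.171.150; do
    until ssh -o StrictHostKeyChecking=no -o ConnectTimeout=5 "${ip}" hostname -s; do
      echo "SSH not responding on ${ip}. Retrying in 10 seconds..."
      ping -c 1 "${ip}" > /dev/null && echo "Ping to ${ip} successful."
      sleep 10
    done
    echo "SSH ready on ${ip}."
  done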
03:18:51 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash -l /tmp/jenkins18102066141062343186.sh 03:18:51 Setup pyenv: 03:18:52 system 03:18:52 3.8.13 03:18:52 3.9.13 03:18:52 3.10.13 03:18:52 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 03:18:56 lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-etDj 03:18:56 lf-activate-venv(): INFO: Save venv in file: /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.robot_venv 03:19:00 lf-activate-venv(): INFO: Installing: setuptools wheel 03:19:01 lf-activate-venv(): INFO: Adding /tmp/venv-etDj/bin to PATH 03:19:01 + echo 'Installing Python Requirements' 03:19:01 Installing Python Requirements 03:19:01 + cat 03:19:01 + python -m pip install -r requirements.txt 03:19:02 Looking in indexes: https://nexus3.opendaylight.org/repository/PyPi/simple 03:19:02 Collecting docker-py (from -r requirements.txt (line 1)) 03:19:02 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/docker-py/1.10.6/docker_py-1.10.6-py2.py3-none-any.whl (50 kB) 03:19:02 Collecting ipaddr (from -r requirements.txt (line 2)) 03:19:02 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/ipaddr/2.2.0/ipaddr-2.2.0.tar.gz (26 kB) 03:19:02 Preparing metadata (setup.py): started 03:19:02 Preparing metadata (setup.py): finished with status 'done' 03:19:02 Collecting netaddr (from -r requirements.txt (line 3)) 03:19:02 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/netaddr/1.3.0/netaddr-1.3.0-py3-none-any.whl (2.3 MB) 03:19:02 Collecting netifaces (from -r requirements.txt (line 4)) 03:19:02 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/netifaces/0.11.0/netifaces-0.11.0.tar.gz (30 kB) 03:19:02 Preparing metadata (setup.py): started 03:19:02 Preparing metadata (setup.py): finished with status 'done' 03:19:02 Collecting pyhocon (from -r requirements.txt (line 5)) 03:19:02 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pyhocon/0.3.61/pyhocon-0.3.61-py3-none-any.whl (25 kB) 03:19:02 Collecting requests (from -r requirements.txt (line 6)) 03:19:02 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/requests/2.32.5/requests-2.32.5-py3-none-any.whl (64 kB) 03:19:02 Collecting robotframework (from -r requirements.txt (line 7)) 03:19:02 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework/7.3.2/robotframework-7.3.2-py3-none-any.whl (795 kB) 03:19:03 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 795.1/795.1 kB 18.1 MB/s 0:00:00 03:19:03 Collecting robotframework-httplibrary (from -r requirements.txt (line 8)) 03:19:03 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-httplibrary/0.4.2/robotframework-httplibrary-0.4.2.tar.gz (9.1 kB) 03:19:03 Preparing metadata (setup.py): started 03:19:03 Preparing metadata (setup.py): finished with status 'done' 03:19:03 Collecting robotframework-requests==0.9.7 (from -r requirements.txt (line 9)) 03:19:03 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-requests/0.9.7/robotframework_requests-0.9.7-py3-none-any.whl (21 kB) 03:19:03 Collecting robotframework-selenium2library (from -r requirements.txt (line 10)) 03:19:03 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-selenium2library/3.0.0/robotframework_selenium2library-3.0.0-py2.py3-none-any.whl (6.2 kB) 03:19:03 Collecting robotframework-sshlibrary==3.8.0 (from -r 
requirements.txt (line 11)) 03:19:03 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-sshlibrary/3.8.0/robotframework-sshlibrary-3.8.0.tar.gz (51 kB) 03:19:03 Preparing metadata (setup.py): started 03:19:03 Preparing metadata (setup.py): finished with status 'done' 03:19:03 Collecting scapy (from -r requirements.txt (line 12)) 03:19:03 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/scapy/2.6.1/scapy-2.6.1-py3-none-any.whl (2.4 MB) 03:19:03 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.4/2.4 MB 55.7 MB/s 0:00:00 03:19:03 Collecting jsonpath-rw (from -r requirements.txt (line 15)) 03:19:03 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/jsonpath-rw/1.4.0/jsonpath-rw-1.4.0.tar.gz (13 kB) 03:19:03 Preparing metadata (setup.py): started 03:19:03 Preparing metadata (setup.py): finished with status 'done' 03:19:03 Collecting elasticsearch (from -r requirements.txt (line 18)) 03:19:03 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch/9.1.0/elasticsearch-9.1.0-py3-none-any.whl (929 kB) 03:19:03 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 929.5/929.5 kB 29.5 MB/s 0:00:00 03:19:04 Collecting elasticsearch-dsl (from -r requirements.txt (line 19)) 03:19:04 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch-dsl/8.18.0/elasticsearch_dsl-8.18.0-py3-none-any.whl (10 kB) 03:19:04 Collecting pyangbind (from -r requirements.txt (line 22)) 03:19:04 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pyangbind/0.8.6/pyangbind-0.8.6-py3-none-any.whl (52 kB) 03:19:04 Collecting isodate (from -r requirements.txt (line 25)) 03:19:04 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/isodate/0.7.2/isodate-0.7.2-py3-none-any.whl (22 kB) 03:19:04 Collecting jmespath (from -r requirements.txt (line 28)) 03:19:04 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/jmespath/1.0.1/jmespath-1.0.1-py3-none-any.whl (20 kB) 03:19:04 Collecting jsonpatch (from -r requirements.txt (line 31)) 03:19:04 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/jsonpatch/1.33/jsonpatch-1.33-py2.py3-none-any.whl (12 kB) 03:19:04 Collecting paramiko>=1.15.3 (from robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 03:19:04 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/paramiko/4.0.0/paramiko-4.0.0-py3-none-any.whl (223 kB) 03:19:04 Collecting scp>=0.13.0 (from robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 03:19:04 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/scp/0.15.0/scp-0.15.0-py2.py3-none-any.whl (8.8 kB) 03:19:04 Collecting docker-pycreds>=0.2.1 (from docker-py->-r requirements.txt (line 1)) 03:19:04 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/docker-pycreds/0.4.0/docker_pycreds-0.4.0-py2.py3-none-any.whl (9.0 kB) 03:19:04 Collecting six>=1.4.0 (from docker-py->-r requirements.txt (line 1)) 03:19:04 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/six/1.17.0/six-1.17.0-py2.py3-none-any.whl (11 kB) 03:19:04 Collecting websocket-client>=0.32.0 (from docker-py->-r requirements.txt (line 1)) 03:19:04 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/websocket-client/1.8.0/websocket_client-1.8.0-py3-none-any.whl (58 kB) 03:19:04 Collecting pyparsing<4,>=2 (from pyhocon->-r requirements.txt (line 5)) 03:19:04 Using cached 
https://nexus3.opendaylight.org/repository/PyPi/packages/pyparsing/3.2.3/pyparsing-3.2.3-py3-none-any.whl (111 kB) 03:19:04 Collecting charset_normalizer<4,>=2 (from requests->-r requirements.txt (line 6)) 03:19:04 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/charset-normalizer/3.4.3/charset_normalizer-3.4.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl (150 kB) 03:19:04 Collecting idna<4,>=2.5 (from requests->-r requirements.txt (line 6)) 03:19:04 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/idna/3.10/idna-3.10-py3-none-any.whl (70 kB) 03:19:04 Collecting urllib3<3,>=1.21.1 (from requests->-r requirements.txt (line 6)) 03:19:04 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/urllib3/2.5.0/urllib3-2.5.0-py3-none-any.whl (129 kB) 03:19:04 Collecting certifi>=2017.4.17 (from requests->-r requirements.txt (line 6)) 03:19:04 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/certifi/2025.8.3/certifi-2025.8.3-py3-none-any.whl (161 kB) 03:19:04 Collecting webtest>=2.0 (from robotframework-httplibrary->-r requirements.txt (line 8)) 03:19:04 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/webtest/3.0.6/webtest-3.0.6-py3-none-any.whl (32 kB) 03:19:04 Collecting jsonpointer (from robotframework-httplibrary->-r requirements.txt (line 8)) 03:19:04 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/jsonpointer/3.0.0/jsonpointer-3.0.0-py2.py3-none-any.whl (7.6 kB) 03:19:05 Collecting robotframework-seleniumlibrary>=3.0.0 (from robotframework-selenium2library->-r requirements.txt (line 10)) 03:19:05 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-seleniumlibrary/6.7.1/robotframework_seleniumlibrary-6.7.1-py2.py3-none-any.whl (104 kB) 03:19:05 Collecting ply (from jsonpath-rw->-r requirements.txt (line 15)) 03:19:05 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/ply/3.11/ply-3.11-py2.py3-none-any.whl (49 kB) 03:19:05 Collecting decorator (from jsonpath-rw->-r requirements.txt (line 15)) 03:19:05 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/decorator/5.2.1/decorator-5.2.1-py3-none-any.whl (9.2 kB) 03:19:05 Collecting elastic-transport<10,>=9.1.0 (from elasticsearch->-r requirements.txt (line 18)) 03:19:05 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elastic-transport/9.1.0/elastic_transport-9.1.0-py3-none-any.whl (65 kB) 03:19:05 Collecting python-dateutil (from elasticsearch->-r requirements.txt (line 18)) 03:19:05 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/python-dateutil/2.9.0.post0/python_dateutil-2.9.0.post0-py2.py3-none-any.whl (229 kB) 03:19:05 Collecting typing-extensions (from elasticsearch->-r requirements.txt (line 18)) 03:19:05 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/typing-extensions/4.14.1/typing_extensions-4.14.1-py3-none-any.whl (43 kB) 03:19:05 Collecting elasticsearch (from -r requirements.txt (line 18)) 03:19:05 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch/8.19.0/elasticsearch-8.19.0-py3-none-any.whl (926 kB) 03:19:05 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 926.9/926.9 kB 35.7 MB/s 0:00:00 03:19:05 INFO: pip is looking at multiple versions of elasticsearch-dsl to determine which version is compatible with other requirements. This could take a while. 
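The resolver note above means pip is backtracking through elasticsearch-dsl releases to find one compatible with the elasticsearch version it just selected; the picks it eventually settles on appear in the "Successfully installed" line later in this log, and pre-pinning them would make this step deterministic (versions copied from this run, not a general recommendation):

  python -m pip install 'elasticsearch==8.19.0' 'elasticsearch-dsl==8.15.4'   # resolver picks from this build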
03:19:05 Collecting elasticsearch-dsl (from -r requirements.txt (line 19)) 03:19:05 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch-dsl/8.17.1/elasticsearch_dsl-8.17.1-py3-none-any.whl (158 kB) 03:19:05 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch-dsl/8.17.0/elasticsearch_dsl-8.17.0-py3-none-any.whl (158 kB) 03:19:05 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch-dsl/8.16.0/elasticsearch_dsl-8.16.0-py3-none-any.whl (158 kB) 03:19:05 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch-dsl/8.15.4/elasticsearch_dsl-8.15.4-py3-none-any.whl (104 kB) 03:19:05 Collecting elastic-transport<9,>=8.15.1 (from elasticsearch->-r requirements.txt (line 18)) 03:19:05 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elastic-transport/8.17.1/elastic_transport-8.17.1-py3-none-any.whl (64 kB) 03:19:05 Collecting pyang (from pyangbind->-r requirements.txt (line 22)) 03:19:05 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pyang/2.6.1/pyang-2.6.1-py2.py3-none-any.whl (594 kB) 03:19:05 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 594.7/594.7 kB 31.1 MB/s 0:00:00 03:19:05 Collecting lxml (from pyangbind->-r requirements.txt (line 22)) 03:19:05 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/lxml/6.0.1/lxml-6.0.1-cp311-cp311-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl (5.2 MB) 03:19:06 Collecting regex (from pyangbind->-r requirements.txt (line 22)) 03:19:06 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/regex/2025.7.34/regex-2025.7.34-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl (798 kB) 03:19:06 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 798.9/798.9 kB 28.7 MB/s 0:00:00 03:19:06 Collecting enum34 (from pyangbind->-r requirements.txt (line 22)) 03:19:06 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/enum34/1.1.10/enum34-1.1.10-py3-none-any.whl (11 kB) 03:19:06 Collecting bcrypt>=3.2 (from paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 03:19:06 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/bcrypt/4.3.0/bcrypt-4.3.0-cp39-abi3-manylinux_2_28_x86_64.whl (284 kB) 03:19:07 Collecting cryptography>=3.3 (from paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 03:19:07 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/cryptography/45.0.6/cryptography-45.0.6-cp311-abi3-manylinux_2_28_x86_64.whl (4.5 MB) 03:19:07 Collecting invoke>=2.0 (from paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 03:19:07 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/invoke/2.2.0/invoke-2.2.0-py3-none-any.whl (160 kB) 03:19:07 Collecting pynacl>=1.5 (from paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 03:19:07 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/pynacl/1.5.0/PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl (856 kB) 03:19:07 Collecting cffi>=1.14 (from cryptography>=3.3->paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 03:19:07 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/cffi/1.17.1/cffi-1.17.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (467 kB) 03:19:07 Collecting pycparser (from 
cffi>=1.14->cryptography>=3.3->paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 03:19:07 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/pycparser/2.22/pycparser-2.22-py3-none-any.whl (117 kB) 03:19:07 Collecting selenium>=4.3.0 (from robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 03:19:07 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/selenium/4.35.0/selenium-4.35.0-py3-none-any.whl (9.6 MB) 03:19:07 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 9.6/9.6 MB 81.0 MB/s 0:00:00 03:19:07 Collecting robotframework-pythonlibcore>=4.4.1 (from robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 03:19:07 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-pythonlibcore/4.4.1/robotframework_pythonlibcore-4.4.1-py2.py3-none-any.whl (12 kB) 03:19:08 Collecting click>=8.0 (from robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 03:19:08 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/click/8.2.1/click-8.2.1-py3-none-any.whl (102 kB) 03:19:08 Collecting trio~=0.30.0 (from selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 03:19:08 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/trio/0.30.0/trio-0.30.0-py3-none-any.whl (499 kB) 03:19:08 Collecting trio-websocket~=0.12.2 (from selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 03:19:08 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/trio-websocket/0.12.2/trio_websocket-0.12.2-py3-none-any.whl (21 kB) 03:19:08 Collecting attrs>=23.2.0 (from trio~=0.30.0->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 03:19:08 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/attrs/25.3.0/attrs-25.3.0-py3-none-any.whl (63 kB) 03:19:08 Collecting sortedcontainers (from trio~=0.30.0->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 03:19:08 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/sortedcontainers/2.4.0/sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB) 03:19:08 Collecting outcome (from trio~=0.30.0->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 03:19:08 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/outcome/1.3.0.post0/outcome-1.3.0.post0-py2.py3-none-any.whl (10 kB) 03:19:08 Collecting sniffio>=1.3.0 (from trio~=0.30.0->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 03:19:08 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/sniffio/1.3.1/sniffio-1.3.1-py3-none-any.whl (10 kB) 03:19:08 Collecting wsproto>=0.14 (from trio-websocket~=0.12.2->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 03:19:08 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/wsproto/1.2.0/wsproto-1.2.0-py3-none-any.whl (24 kB) 03:19:08 Collecting pysocks!=1.5.7,<2.0,>=1.5.6 (from 
urllib3[socks]<3.0,>=2.5.0->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 03:19:08 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pysocks/1.7.1/PySocks-1.7.1-py3-none-any.whl (16 kB) 03:19:08 Collecting WebOb>=1.2 (from webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) 03:19:08 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/webob/1.8.9/WebOb-1.8.9-py2.py3-none-any.whl (115 kB) 03:19:08 Collecting waitress>=3.0.2 (from webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) 03:19:08 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/waitress/3.0.2/waitress-3.0.2-py3-none-any.whl (56 kB) 03:19:08 Collecting beautifulsoup4 (from webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) 03:19:08 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/beautifulsoup4/4.13.4/beautifulsoup4-4.13.4-py3-none-any.whl (187 kB) 03:19:08 Collecting h11<1,>=0.9.0 (from wsproto>=0.14->trio-websocket~=0.12.2->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 03:19:08 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/h11/0.16.0/h11-0.16.0-py3-none-any.whl (37 kB) 03:19:08 Collecting soupsieve>1.2 (from beautifulsoup4->webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) 03:19:08 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/soupsieve/2.7/soupsieve-2.7-py3-none-any.whl (36 kB) 03:19:08 Building wheels for collected packages: robotframework-sshlibrary, ipaddr, netifaces, robotframework-httplibrary, jsonpath-rw 03:19:08 DEPRECATION: Building 'robotframework-sshlibrary' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'robotframework-sshlibrary'. Discussion can be found at https://github.com/pypa/pip/issues/6334 03:19:08 Building wheel for robotframework-sshlibrary (setup.py): started 03:19:09 Building wheel for robotframework-sshlibrary (setup.py): finished with status 'done' 03:19:09 Created wheel for robotframework-sshlibrary: filename=robotframework_sshlibrary-3.8.0-py3-none-any.whl size=55205 sha256=f2527f1f1a2b809d9738296fded5293ec5da4c523a711d49e04846534da47e73 03:19:09 Stored in directory: /home/jenkins/.cache/pip/wheels/f7/c9/b3/a977b7bcc410d45ae27d240df3d00a12585509180e373ecccc 03:19:09 DEPRECATION: Building 'ipaddr' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'ipaddr'. 
Discussion can be found at https://github.com/pypa/pip/issues/6334 03:19:09 Building wheel for ipaddr (setup.py): started 03:19:09 Building wheel for ipaddr (setup.py): finished with status 'done' 03:19:09 Created wheel for ipaddr: filename=ipaddr-2.2.0-py3-none-any.whl size=18353 sha256=5137a9fda90eff4fa2d51e4f8193e96da946f98c6912027de5556367c711a524 03:19:09 Stored in directory: /home/jenkins/.cache/pip/wheels/dc/6c/04/da2d847fa8d45c59af3e1d83e2acc29cb8adcbaf04c0898dbf 03:19:09 DEPRECATION: Building 'netifaces' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'netifaces'. Discussion can be found at https://github.com/pypa/pip/issues/6334 03:19:09 Building wheel for netifaces (setup.py): started 03:19:11 Building wheel for netifaces (setup.py): finished with status 'done' 03:19:11 Created wheel for netifaces: filename=netifaces-0.11.0-cp311-cp311-linux_x86_64.whl size=41080 sha256=4f33bacf2a6033a022a456cd96c5755270837abbba509682c983f9dc8e4d0ae5 03:19:11 Stored in directory: /home/jenkins/.cache/pip/wheels/f8/18/88/e61d54b995bea304bdb1d040a92b72228a1bf72ca2a3eba7c9 03:19:11 DEPRECATION: Building 'robotframework-httplibrary' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'robotframework-httplibrary'. Discussion can be found at https://github.com/pypa/pip/issues/6334 03:19:11 Building wheel for robotframework-httplibrary (setup.py): started 03:19:11 Building wheel for robotframework-httplibrary (setup.py): finished with status 'done' 03:19:11 Created wheel for robotframework-httplibrary: filename=robotframework_httplibrary-0.4.2-py3-none-any.whl size=10014 sha256=b0306d94b5b17b793a919d654dcabcc5f97e4d0ba01788aea62be3b4e6449332 03:19:11 Stored in directory: /home/jenkins/.cache/pip/wheels/aa/bc/0d/9a20dd51effef392aae2733cb4c7b66c6fa29fca33d88b57ed 03:19:11 DEPRECATION: Building 'jsonpath-rw' using the legacy setup.py bdist_wheel mechanism, which will be removed in a future version. pip 25.3 will enforce this behaviour change. A possible replacement is to use the standardized build interface by setting the `--use-pep517` option, (possibly combined with `--no-build-isolation`), or adding a `pyproject.toml` file to the source tree of 'jsonpath-rw'. 
Discussion can be found at https://github.com/pypa/pip/issues/6334 03:19:11 Building wheel for jsonpath-rw (setup.py): started 03:19:12 Building wheel for jsonpath-rw (setup.py): finished with status 'done' 03:19:12 Created wheel for jsonpath-rw: filename=jsonpath_rw-1.4.0-py3-none-any.whl size=15176 sha256=13033466ade5742ad872ced7c0966e9c7c4a288ecaa72703b201200fcfab6145 03:19:12 Stored in directory: /home/jenkins/.cache/pip/wheels/f1/54/63/9a8da38cefae13755097b36cc852decc25d8ef69c37d58d4eb 03:19:12 Successfully built robotframework-sshlibrary ipaddr netifaces robotframework-httplibrary jsonpath-rw 03:19:12 Installing collected packages: sortedcontainers, ply, netifaces, ipaddr, enum34, websocket-client, WebOb, waitress, urllib3, typing-extensions, soupsieve, sniffio, six, scapy, robotframework-pythonlibcore, robotframework, regex, pysocks, pyparsing, pycparser, netaddr, lxml, jsonpointer, jmespath, isodate, invoke, idna, h11, decorator, click, charset_normalizer, certifi, bcrypt, attrs, wsproto, requests, python-dateutil, pyhocon, pyang, outcome, jsonpath-rw, jsonpatch, elastic-transport, docker-pycreds, cffi, beautifulsoup4, webtest, trio, robotframework-requests, pynacl, pyangbind, elasticsearch, docker-py, cryptography, trio-websocket, robotframework-httplibrary, paramiko, elasticsearch-dsl, selenium, scp, robotframework-sshlibrary, robotframework-seleniumlibrary, robotframework-selenium2library 03:19:18 03:19:18 Successfully installed WebOb-1.8.9 attrs-25.3.0 bcrypt-4.3.0 beautifulsoup4-4.13.4 certifi-2025.8.3 cffi-1.17.1 charset_normalizer-3.4.3 click-8.2.1 cryptography-45.0.6 decorator-5.2.1 docker-py-1.10.6 docker-pycreds-0.4.0 elastic-transport-8.17.1 elasticsearch-8.19.0 elasticsearch-dsl-8.15.4 enum34-1.1.10 h11-0.16.0 idna-3.10 invoke-2.2.0 ipaddr-2.2.0 isodate-0.7.2 jmespath-1.0.1 jsonpatch-1.33 jsonpath-rw-1.4.0 jsonpointer-3.0.0 lxml-6.0.1 netaddr-1.3.0 netifaces-0.11.0 outcome-1.3.0.post0 paramiko-4.0.0 ply-3.11 pyang-2.6.1 pyangbind-0.8.6 pycparser-2.22 pyhocon-0.3.61 pynacl-1.5.0 pyparsing-3.2.3 pysocks-1.7.1 python-dateutil-2.9.0.post0 regex-2025.7.34 requests-2.32.5 robotframework-7.3.2 robotframework-httplibrary-0.4.2 robotframework-pythonlibcore-4.4.1 robotframework-requests-0.9.7 robotframework-selenium2library-3.0.0 robotframework-seleniumlibrary-6.7.1 robotframework-sshlibrary-3.8.0 scapy-2.6.1 scp-0.15.0 selenium-4.35.0 six-1.17.0 sniffio-1.3.1 sortedcontainers-2.4.0 soupsieve-2.7 trio-0.30.0 trio-websocket-0.12.2 typing-extensions-4.14.1 urllib3-2.5.0 waitress-3.0.2 websocket-client-1.8.0 webtest-3.0.6 wsproto-1.2.0 03:19:19 + pip freeze 03:19:19 attrs==25.3.0 03:19:19 bcrypt==4.3.0 03:19:19 beautifulsoup4==4.13.4 03:19:19 certifi==2025.8.3 03:19:19 cffi==1.17.1 03:19:19 charset-normalizer==3.4.3 03:19:19 click==8.2.1 03:19:19 cryptography==45.0.6 03:19:19 decorator==5.2.1 03:19:19 distlib==0.4.0 03:19:19 docker-py==1.10.6 03:19:19 docker-pycreds==0.4.0 03:19:19 elastic-transport==8.17.1 03:19:19 elasticsearch==8.19.0 03:19:19 elasticsearch-dsl==8.15.4 03:19:19 enum34==1.1.10 03:19:19 filelock==3.19.1 03:19:19 h11==0.16.0 03:19:19 idna==3.10 03:19:19 invoke==2.2.0 03:19:19 ipaddr==2.2.0 03:19:19 isodate==0.7.2 03:19:19 jmespath==1.0.1 03:19:19 jsonpatch==1.33 03:19:19 jsonpath-rw==1.4.0 03:19:19 jsonpointer==3.0.0 03:19:19 lxml==6.0.1 03:19:19 netaddr==1.3.0 03:19:19 netifaces==0.11.0 03:19:19 outcome==1.3.0.post0 03:19:19 paramiko==4.0.0 03:19:19 platformdirs==4.3.8 03:19:19 ply==3.11 03:19:19 pyang==2.6.1 03:19:19 pyangbind==0.8.6 03:19:19 pycparser==2.22 
03:19:19 pyhocon==0.3.61 03:19:19 PyNaCl==1.5.0 03:19:19 pyparsing==3.2.3 03:19:19 PySocks==1.7.1 03:19:19 python-dateutil==2.9.0.post0 03:19:19 regex==2025.7.34 03:19:19 requests==2.32.5 03:19:19 robotframework==7.3.2 03:19:19 robotframework-httplibrary==0.4.2 03:19:19 robotframework-pythonlibcore==4.4.1 03:19:19 robotframework-requests==0.9.7 03:19:19 robotframework-selenium2library==3.0.0 03:19:19 robotframework-seleniumlibrary==6.7.1 03:19:19 robotframework-sshlibrary==3.8.0 03:19:19 scapy==2.6.1 03:19:19 scp==0.15.0 03:19:19 selenium==4.35.0 03:19:19 six==1.17.0 03:19:19 sniffio==1.3.1 03:19:19 sortedcontainers==2.4.0 03:19:19 soupsieve==2.7 03:19:19 trio==0.30.0 03:19:19 trio-websocket==0.12.2 03:19:19 typing_extensions==4.14.1 03:19:19 urllib3==2.5.0 03:19:19 virtualenv==20.34.0 03:19:19 waitress==3.0.2 03:19:19 WebOb==1.8.9 03:19:19 websocket-client==1.8.0 03:19:19 WebTest==3.0.6 03:19:19 wsproto==1.2.0 03:19:19 [EnvInject] - Injecting environment variables from a build step. 03:19:19 [EnvInject] - Injecting as environment variables the properties file path 'env.properties' 03:19:19 [EnvInject] - Variables injected successfully. 03:19:19 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash -l /tmp/jenkins13104609542133549813.sh 03:19:19 Setup pyenv: 03:19:19 system 03:19:19 3.8.13 03:19:19 3.9.13 03:19:19 3.10.13 03:19:19 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 03:19:20 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-0MxD from file:/tmp/.os_lf_venv 03:19:22 lf-activate-venv(): INFO: Installing: python-heatclient python-openstackclient yq 03:19:28 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 03:19:28 lftools 0.37.13 requires urllib3<2.1.0, but you have urllib3 2.5.0 which is incompatible. 03:19:29 lf-activate-venv(): INFO: Adding /tmp/venv-0MxD/bin to PATH 03:19:29 + ODL_SYSTEM=() 03:19:29 + TOOLS_SYSTEM=() 03:19:29 + OPENSTACK_SYSTEM=() 03:19:29 + OPENSTACK_CONTROLLERS=() 03:19:29 + mapfile -t ADDR 03:19:29 ++ jq -r '.outputs[] | select(.output_key | match("^vm_[0-9]+_ips$")) | .output_value | .[]' 03:19:29 ++ openstack stack show -f json -c outputs releng-openflowplugin-csit-3node-clustering-only-titanium-373 03:19:30 + for i in "${ADDR[@]}" 03:19:30 ++ ssh 10.30.171.150 hostname -s 03:19:30 Warning: Permanently added '10.30.171.150' (ECDSA) to the list of known hosts. 03:19:31 + REMHOST=releng-62057-373-1-mininet-ovs-217-0 03:19:31 + case ${REMHOST} in 03:19:31 + TOOLS_SYSTEM=("${TOOLS_SYSTEM[@]}" "${i}") 03:19:31 + for i in "${ADDR[@]}" 03:19:31 ++ ssh 10.30.171.230 hostname -s 03:19:31 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:19:31 + REMHOST=releng-62057-373-0-builder-0 03:19:31 + case ${REMHOST} in 03:19:31 + ODL_SYSTEM=("${ODL_SYSTEM[@]}" "${i}") 03:19:31 + for i in "${ADDR[@]}" 03:19:31 ++ ssh 10.30.171.111 hostname -s 03:19:31 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:19:31 + REMHOST=releng-62057-373-0-builder-1 03:19:31 + case ${REMHOST} in 03:19:31 + ODL_SYSTEM=("${ODL_SYSTEM[@]}" "${i}") 03:19:31 + for i in "${ADDR[@]}" 03:19:31 ++ ssh 10.30.171.29 hostname -s 03:19:31 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 
03:19:31 + REMHOST=releng-62057-373-0-builder-2 03:19:31 + case ${REMHOST} in 03:19:31 + ODL_SYSTEM=("${ODL_SYSTEM[@]}" "${i}") 03:19:31 + echo NUM_ODL_SYSTEM=3 03:19:31 + echo NUM_TOOLS_SYSTEM=1 03:19:31 + '[' '' == yes ']' 03:19:31 + NUM_OPENSTACK_SYSTEM=0 03:19:31 + echo NUM_OPENSTACK_SYSTEM=0 03:19:31 + '[' 0 -eq 2 ']' 03:19:31 + echo ODL_SYSTEM_IP=10.30.171.230 03:19:31 ++ seq 0 2 03:19:31 + for i in $(seq 0 $(( ${#ODL_SYSTEM[@]} - 1 ))) 03:19:31 + echo ODL_SYSTEM_1_IP=10.30.171.230 03:19:31 + for i in $(seq 0 $(( ${#ODL_SYSTEM[@]} - 1 ))) 03:19:31 + echo ODL_SYSTEM_2_IP=10.30.171.111 03:19:31 + for i in $(seq 0 $(( ${#ODL_SYSTEM[@]} - 1 ))) 03:19:31 + echo ODL_SYSTEM_3_IP=10.30.171.29 03:19:31 + echo TOOLS_SYSTEM_IP=10.30.171.150 03:19:31 ++ seq 0 0 03:19:31 + for i in $(seq 0 $(( ${#TOOLS_SYSTEM[@]} - 1 ))) 03:19:31 + echo TOOLS_SYSTEM_1_IP=10.30.171.150 03:19:31 + openstack_index=0 03:19:31 + NUM_OPENSTACK_CONTROL_NODES=1 03:19:31 + echo NUM_OPENSTACK_CONTROL_NODES=1 03:19:31 ++ seq 0 0 03:19:31 + for i in $(seq 0 $((NUM_OPENSTACK_CONTROL_NODES - 1))) 03:19:31 + echo OPENSTACK_CONTROL_NODE_1_IP= 03:19:31 + NUM_OPENSTACK_COMPUTE_NODES=-1 03:19:31 + echo NUM_OPENSTACK_COMPUTE_NODES=-1 03:19:31 + '[' -1 -ge 2 ']' 03:19:31 ++ seq 0 -2 03:19:31 + NUM_OPENSTACK_HAPROXY_NODES=0 03:19:31 + echo NUM_OPENSTACK_HAPROXY_NODES=0 03:19:31 ++ seq 0 -1 03:19:31 + echo 'Contents of slave_addresses.txt:' 03:19:31 Contents of slave_addresses.txt: 03:19:31 + cat slave_addresses.txt 03:19:31 NUM_ODL_SYSTEM=3 03:19:31 NUM_TOOLS_SYSTEM=1 03:19:31 NUM_OPENSTACK_SYSTEM=0 03:19:31 ODL_SYSTEM_IP=10.30.171.230 03:19:31 ODL_SYSTEM_1_IP=10.30.171.230 03:19:31 ODL_SYSTEM_2_IP=10.30.171.111 03:19:31 ODL_SYSTEM_3_IP=10.30.171.29 03:19:31 TOOLS_SYSTEM_IP=10.30.171.150 03:19:31 TOOLS_SYSTEM_1_IP=10.30.171.150 03:19:31 NUM_OPENSTACK_CONTROL_NODES=1 03:19:31 OPENSTACK_CONTROL_NODE_1_IP= 03:19:31 NUM_OPENSTACK_COMPUTE_NODES=-1 03:19:31 NUM_OPENSTACK_HAPROXY_NODES=0 03:19:31 [EnvInject] - Injecting environment variables from a build step. 03:19:31 [EnvInject] - Injecting as environment variables the properties file path 'slave_addresses.txt' 03:19:32 [EnvInject] - Variables injected successfully. 03:19:32 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/sh /tmp/jenkins7612248716978754264.sh 03:19:32 Preparing for JRE Version 21 03:19:32 Karaf artifact is karaf 03:19:32 Karaf project is integration 03:19:32 Java home is /usr/lib/jvm/java-21-openjdk-amd64 03:19:32 [EnvInject] - Injecting environment variables from a build step. 03:19:32 [EnvInject] - Injecting as environment variables the properties file path 'set_variables.env' 03:19:32 [EnvInject] - Variables injected successfully. 03:19:32 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins3631431826925179649.sh 03:19:32 Distribution bundle URL is https://nexus.opendaylight.org/content/repositories//autorelease-9133/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 03:19:32 Distribution bundle is karaf-0.22.1.zip 03:19:32 Distribution bundle version is 0.22.1 03:19:32 Distribution folder is karaf-0.22.1 03:19:32 Nexus prefix is https://nexus.opendaylight.org 03:19:32 [EnvInject] - Injecting environment variables from a build step. 03:19:32 [EnvInject] - Injecting as environment variables the properties file path 'detect_variables.env' 03:19:32 [EnvInject] - Variables injected successfully. 
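The classification loop traced above (ssh to each stack IP, read the short hostname, sort builders into the ODL list and everything else into the tools list, then write slave_addresses.txt) condenses to roughly the following; the case patterns are assumed from the hostnames in this log, not copied from the real script:

  ODL_SYSTEM=()
  TOOLS_SYSTEM=()
  for ip in "${ADDR[@]}"; do                    # ADDR is populated by the jq query over the stack outputs
    REMHOST=$(ssh "${ip}" hostname -s)
    case ${REMHOST} in
      *builder*) ODL_SYSTEM+=("${ip}") ;;       # assumed pattern: releng-62057-373-0-builder-N
      *)         TOOLS_SYSTEM+=("${ip}") ;;     # e.g. the mininet-ovs node
    esac
  done
  {
    echo "NUM_ODL_SYSTEM=${#ODL_SYSTEM[@]}"
    echo "NUM_TOOLS_SYSTEM=${#TOOLS_SYSTEM[@]}"
    for i in "${!ODL_SYSTEM[@]}";   do echo "ODL_SYSTEM_$((i + 1))_IP=${ODL_SYSTEM[$i]}"; done
    for i in "${!TOOLS_SYSTEM[@]}"; do echo "TOOLS_SYSTEM_$((i + 1))_IP=${TOOLS_SYSTEM[$i]}"; done
  } > slave_addresses.txt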
03:19:32 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash -l /tmp/jenkins431391249707254624.sh 03:19:32 Setup pyenv: 03:19:32 system 03:19:32 3.8.13 03:19:32 3.9.13 03:19:32 3.10.13 03:19:32 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 03:19:32 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-0MxD from file:/tmp/.os_lf_venv 03:19:34 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 03:19:34 lftools 0.37.13 requires urllib3<2.1.0, but you have urllib3 2.5.0 which is incompatible. 03:19:34 lf-activate-venv(): INFO: Installing: python-heatclient python-openstackclient 03:19:39 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 03:19:39 lftools 0.37.13 requires urllib3<2.1.0, but you have urllib3 2.5.0 which is incompatible. 03:19:40 lf-activate-venv(): INFO: Adding /tmp/venv-0MxD/bin to PATH 03:19:40 Copying common-functions.sh to /tmp 03:19:41 Copying common-functions.sh to 10.30.171.150:/tmp 03:19:41 Warning: Permanently added '10.30.171.150' (ECDSA) to the list of known hosts. 03:19:42 Copying common-functions.sh to 10.30.171.230:/tmp 03:19:42 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:19:42 Copying common-functions.sh to 10.30.171.111:/tmp 03:19:43 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:19:43 Copying common-functions.sh to 10.30.171.29:/tmp 03:19:43 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:19:43 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins17296219814432629524.sh 03:19:43 common-functions.sh is being sourced 03:19:43 common-functions environment: 03:19:43 MAVENCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 03:19:43 ACTUALFEATURES: 03:19:43 FEATURESCONF: /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:43 CUSTOMPROP: /tmp/karaf-0.22.1/etc/custom.properties 03:19:43 LOGCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:43 MEMCONF: /tmp/karaf-0.22.1/bin/setenv 03:19:43 CONTROLLERMEM: 2048m 03:19:43 AKKACONF: /tmp/karaf-0.22.1/configuration/initial/pekko.conf 03:19:43 MODULESCONF: /tmp/karaf-0.22.1/configuration/initial/modules.conf 03:19:43 MODULESHARDSCONF: /tmp/karaf-0.22.1/configuration/initial/module-shards.conf 03:19:43 SUITES: 03:19:43 03:19:43 ################################################# 03:19:43 ## Configure Cluster and Start ## 03:19:43 ################################################# 03:19:43 ACTUALFEATURES: odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer 03:19:43 SPACE_SEPARATED_FEATURES: odl-infrautils-ready odl-jolokia odl-openflowplugin-flow-services-rest odl-openflowplugin-app-table-miss-enforcer 03:19:43 Locating script plan to use... 03:19:43 Finished running script plans 03:19:43 Configuring member-1 with IP address 10.30.171.230 03:19:44 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:19:44 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 
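common-functions.sh has just been copied to /tmp on all four nodes, and the trace below shows member-1 being configured remotely. A rough sketch of that fan-out, assuming plain scp/ssh; the per-member script name /tmp/configuration-script.sh is taken from later in the trace, and exactly how it is generated and invoked is not shown here, so that part is an assumption:

    # Sketch: distribute the shared helper, then configure each controller member.
    NODES=("10.30.171.150" "10.30.171.230" "10.30.171.111" "10.30.171.29")  # from slave_addresses.txt
    for ip in "${NODES[@]}"; do
        echo "Copying common-functions.sh to ${ip}:/tmp"
        scp /tmp/common-functions.sh "${ip}:/tmp/"
    done

    ODL_SYSTEM=("10.30.171.230" "10.30.171.111" "10.30.171.29")
    for i in "${!ODL_SYSTEM[@]}"; do
        echo "Configuring member-$((i + 1)) with IP address ${ODL_SYSTEM[i]}"
        # Assumption: a generated per-member script is pushed and executed over ssh.
        scp configuration-script.sh "${ODL_SYSTEM[i]}:/tmp/configuration-script.sh"
        ssh "${ODL_SYSTEM[i]}" 'bash /tmp/configuration-script.sh'
    done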
03:19:44 + source /tmp/common-functions.sh karaf-0.22.1 titanium 03:19:44 ++ [[ /tmp/common-functions.sh == \/\t\m\p\/\c\o\n\f\i\g\u\r\a\t\i\o\n\-\s\c\r\i\p\t\.\s\h ]] 03:19:44 common-functions.sh is being sourced 03:19:44 ++ echo 'common-functions.sh is being sourced' 03:19:44 ++ BUNDLEFOLDER=karaf-0.22.1 03:19:44 ++ DISTROSTREAM=titanium 03:19:44 ++ export MAVENCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 03:19:44 ++ MAVENCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 03:19:44 ++ export FEATURESCONF=/tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:44 ++ FEATURESCONF=/tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:44 ++ export CUSTOMPROP=/tmp/karaf-0.22.1/etc/custom.properties 03:19:44 ++ CUSTOMPROP=/tmp/karaf-0.22.1/etc/custom.properties 03:19:44 ++ export LOGCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:44 ++ LOGCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:44 ++ export MEMCONF=/tmp/karaf-0.22.1/bin/setenv 03:19:44 ++ MEMCONF=/tmp/karaf-0.22.1/bin/setenv 03:19:44 ++ export CONTROLLERMEM= 03:19:44 ++ CONTROLLERMEM= 03:19:44 ++ case "${DISTROSTREAM}" in 03:19:44 ++ CLUSTER_SYSTEM=pekko 03:19:44 ++ export AKKACONF=/tmp/karaf-0.22.1/configuration/initial/pekko.conf 03:19:44 ++ AKKACONF=/tmp/karaf-0.22.1/configuration/initial/pekko.conf 03:19:44 ++ export MODULESCONF=/tmp/karaf-0.22.1/configuration/initial/modules.conf 03:19:44 ++ MODULESCONF=/tmp/karaf-0.22.1/configuration/initial/modules.conf 03:19:44 ++ export MODULESHARDSCONF=/tmp/karaf-0.22.1/configuration/initial/module-shards.conf 03:19:44 ++ MODULESHARDSCONF=/tmp/karaf-0.22.1/configuration/initial/module-shards.conf 03:19:44 ++ print_common_env 03:19:44 ++ cat 03:19:44 common-functions environment: 03:19:44 MAVENCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 03:19:44 ACTUALFEATURES: 03:19:44 FEATURESCONF: /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:44 CUSTOMPROP: /tmp/karaf-0.22.1/etc/custom.properties 03:19:44 LOGCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:44 MEMCONF: /tmp/karaf-0.22.1/bin/setenv 03:19:44 CONTROLLERMEM: 03:19:44 AKKACONF: /tmp/karaf-0.22.1/configuration/initial/pekko.conf 03:19:44 MODULESCONF: /tmp/karaf-0.22.1/configuration/initial/modules.conf 03:19:44 MODULESHARDSCONF: /tmp/karaf-0.22.1/configuration/initial/module-shards.conf 03:19:44 SUITES: 03:19:44 03:19:44 ++ SSH='ssh -t -t' 03:19:44 ++ extra_services_cntl=' dnsmasq.service httpd.service libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service rabbitmq-server.service ' 03:19:44 ++ extra_services_cmp=' libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service ' 03:19:44 Changing to /tmp 03:19:44 Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9133/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 03:19:44 + echo 'Changing to /tmp' 03:19:44 + cd /tmp 03:19:44 + echo 'Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9133/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip' 03:19:44 + wget --progress=dot:mega https://nexus.opendaylight.org/content/repositories//autorelease-9133/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 03:19:44 --2025-08-23 03:19:44-- https://nexus.opendaylight.org/content/repositories//autorelease-9133/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 03:19:44 Resolving nexus.opendaylight.org (nexus.opendaylight.org)... 
199.204.45.87, 2604:e100:1:0:f816:3eff:fe45:48d6 03:19:44 Connecting to nexus.opendaylight.org (nexus.opendaylight.org)|199.204.45.87|:443... connected. 03:19:44 HTTP request sent, awaiting response... 200 OK 03:19:44 Length: 236634156 (226M) [application/zip] 03:19:44 Saving to: ‘karaf-0.22.1.zip’ 03:19:44
[wget dot-progress output elided; 0K .. 230400K, 1% .. 100%, 318M=0.9s]
03:19:45 2025-08-23 03:19:45 (256 MB/s) - ‘karaf-0.22.1.zip’ saved [236634156/236634156] 03:19:45 03:19:45 Extracting the new controller... 03:19:45 + echo 'Extracting the new controller...' 03:19:45 + unzip -q karaf-0.22.1.zip 03:19:47 Adding external repositories... 03:19:47 + echo 'Adding external repositories...'
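The sed one-liner that follows packs the whole repository list into a single substitution, which makes it hard to review. A functionally equivalent sketch, with the list (copied from the trace) split into an array purely for readability; note the trace uses sed -ie, which on GNU sed also leaves an 'e'-suffixed backup of the file:

    # Sketch: point Karaf's pax-url-mvn resolver at the OpenDaylight Nexus mirrors
    # plus the usual upstream repositories.
    MAVENCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg
    ODL_NEXUS=https://nexus.opendaylight.org/content/repositories

    repos=(
        "${ODL_NEXUS}/opendaylight.snapshot@id=opendaylight-snapshot@snapshots"
        "${ODL_NEXUS}/public@id=opendaylight-mirror"
        "http://repo1.maven.org/maven2@id=central"
        "http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release"
        "http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external"
        "http://zodiac.springsource.com/maven/bundles/release@id=gemini"
        "http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases"
        "https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases"
        "https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases"
    )

    # Join with ", " to match the comma-separated format the config file expects.
    REPO_LIST=$(printf '%s, ' "${repos[@]}")
    REPO_LIST=${REPO_LIST%, }

    # Insert the list right after the (empty) repositories key, as the one-liner below does.
    sed -i "s%org.ops4j.pax.url.mvn.repositories=%org.ops4j.pax.url.mvn.repositories=${REPO_LIST}%g" "${MAVENCONF}"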
03:19:47 + sed -ie 's%org.ops4j.pax.url.mvn.repositories=%org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases%g' /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 03:19:47 + cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 03:19:47 ################################################################################ 03:19:47 # 03:19:47 # Licensed to the Apache Software Foundation (ASF) under one or more 03:19:47 # contributor license agreements. See the NOTICE file distributed with 03:19:47 # this work for additional information regarding copyright ownership. 03:19:47 # The ASF licenses this file to You under the Apache License, Version 2.0 03:19:47 # (the "License"); you may not use this file except in compliance with 03:19:47 # the License. You may obtain a copy of the License at 03:19:47 # 03:19:47 # http://www.apache.org/licenses/LICENSE-2.0 03:19:47 # 03:19:47 # Unless required by applicable law or agreed to in writing, software 03:19:47 # distributed under the License is distributed on an "AS IS" BASIS, 03:19:47 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 03:19:47 # See the License for the specific language governing permissions and 03:19:47 # limitations under the License. 03:19:47 # 03:19:47 ################################################################################ 03:19:47 03:19:47 # 03:19:47 # If set to true, the following property will not allow any certificate to be used 03:19:47 # when accessing Maven repositories through SSL 03:19:47 # 03:19:47 #org.ops4j.pax.url.mvn.certificateCheck= 03:19:47 03:19:47 # 03:19:47 # Path to the local Maven settings file. 03:19:47 # The repositories defined in this file will be automatically added to the list 03:19:47 # of default repositories if the 'org.ops4j.pax.url.mvn.repositories' property 03:19:47 # below is not set. 03:19:47 # The following locations are checked for the existence of the settings.xml file 03:19:47 # * 1. looks for the specified url 03:19:47 # * 2. if not found looks for ${user.home}/.m2/settings.xml 03:19:47 # * 3. if not found looks for ${maven.home}/conf/settings.xml 03:19:47 # * 4. if not found looks for ${M2_HOME}/conf/settings.xml 03:19:47 # 03:19:47 #org.ops4j.pax.url.mvn.settings= 03:19:47 03:19:47 # 03:19:47 # Path to the local Maven repository which is used to avoid downloading 03:19:47 # artifacts when they already exist locally. 03:19:47 # The value of this property will be extracted from the settings.xml file 03:19:47 # above, or defaulted to: 03:19:47 # System.getProperty( "user.home" ) + "/.m2/repository" 03:19:47 # 03:19:47 org.ops4j.pax.url.mvn.localRepository=${karaf.home}/${karaf.default.repository} 03:19:47 03:19:47 # 03:19:47 # Default this to false. 
It's just weird to use undocumented repos 03:19:47 # 03:19:47 org.ops4j.pax.url.mvn.useFallbackRepositories=false 03:19:47 03:19:47 # 03:19:47 # Uncomment if you don't wanna use the proxy settings 03:19:47 # from the Maven conf/settings.xml file 03:19:47 # 03:19:47 # org.ops4j.pax.url.mvn.proxySupport=false 03:19:47 03:19:47 # 03:19:47 # Comma separated list of repositories scanned when resolving an artifact. 03:19:47 # Those repositories will be checked before iterating through the 03:19:47 # below list of repositories and even before the local repository 03:19:47 # A repository url can be appended with zero or more of the following flags: 03:19:47 # @snapshots : the repository contains snaphots 03:19:47 # @noreleases : the repository does not contain any released artifacts 03:19:47 # 03:19:47 # The following property value will add the system folder as a repo. 03:19:47 # 03:19:47 org.ops4j.pax.url.mvn.defaultRepositories=\ 03:19:47 file:${karaf.home}/${karaf.default.repository}@id=system.repository@snapshots,\ 03:19:47 file:${karaf.data}/kar@id=kar.repository@multi@snapshots,\ 03:19:47 file:${karaf.base}/${karaf.default.repository}@id=child.system.repository@snapshots 03:19:47 03:19:47 # Use the default local repo (e.g.~/.m2/repository) as a "remote" repo 03:19:47 #org.ops4j.pax.url.mvn.defaultLocalRepoAsRemote=false 03:19:47 03:19:47 # 03:19:47 # Comma separated list of repositories scanned when resolving an artifact. 03:19:47 # The default list includes the following repositories: 03:19:47 # http://repo1.maven.org/maven2@id=central 03:19:47 # http://repository.springsource.com/maven/bundles/release@id=spring.ebr 03:19:47 # http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external 03:19:47 # http://zodiac.springsource.com/maven/bundles/release@id=gemini 03:19:47 # http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases 03:19:47 # https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases 03:19:47 # https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 03:19:47 # To add repositories to the default ones, prepend '+' to the list of repositories 03:19:47 # to add. 03:19:47 # A repository url can be appended with zero or more of the following flags: 03:19:47 # @snapshots : the repository contains snapshots 03:19:47 # @noreleases : the repository does not contain any released artifacts 03:19:47 # @id=repository.id : the id for the repository, just like in the settings.xml this is optional but recommended 03:19:47 # 03:19:47 org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 03:19:47 03:19:47 ### ^^^ No remote repositories. 
This is the only ODL change compared to Karaf defaults.Configuring the startup features... 03:19:47 + [[ True == \T\r\u\e ]] 03:19:47 + echo 'Configuring the startup features...' 03:19:47 + sed -ie 's/\(featuresBoot=\|featuresBoot =\)/featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer,/g' /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:47 + FEATURE_TEST_STRING=features-test 03:19:47 + FEATURE_TEST_VERSION=0.22.1 03:19:47 + KARAF_VERSION=karaf4 03:19:47 + [[ integration == \i\n\t\e\g\r\a\t\i\o\n ]] 03:19:47 + sed -ie 's%\(featuresRepositories=\|featuresRepositories =\)%featuresRepositories = mvn:org.opendaylight.integration/features-test/0.22.1/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features,%g' /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:47 + [[ ! -z '' ]] 03:19:47 + cat /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:47 ################################################################################ 03:19:47 # 03:19:47 # Licensed to the Apache Software Foundation (ASF) under one or more 03:19:47 # contributor license agreements. See the NOTICE file distributed with 03:19:47 # this work for additional information regarding copyright ownership. 03:19:47 # The ASF licenses this file to You under the Apache License, Version 2.0 03:19:47 # (the "License"); you may not use this file except in compliance with 03:19:47 # the License. You may obtain a copy of the License at 03:19:47 # 03:19:47 # http://www.apache.org/licenses/LICENSE-2.0 03:19:47 # 03:19:47 # Unless required by applicable law or agreed to in writing, software 03:19:47 # distributed under the License is distributed on an "AS IS" BASIS, 03:19:47 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 03:19:47 # See the License for the specific language governing permissions and 03:19:47 # limitations under the License. 03:19:47 # 03:19:47 ################################################################################ 03:19:47 03:19:47 # 03:19:47 # Comma separated list of features repositories to register by default 03:19:47 # 03:19:47 featuresRepositories = mvn:org.opendaylight.integration/features-test/0.22.1/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features, file:${karaf.etc}/660b995b-9997-42c5-9b03-155bcd1db9f9.xml 03:19:47 03:19:47 # 03:19:47 # Comma separated list of features to install at startup 03:19:47 # 03:19:47 featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer, daa896e6-53a8-4216-a40d-9155d1754fac 03:19:47 03:19:47 # 03:19:47 # Resource repositories (OBR) that the features resolver can use 03:19:47 # to resolve requirements/capabilities 03:19:47 # 03:19:47 # The format of the resourceRepositories is 03:19:47 # resourceRepositories=[xml:url|json:url],... 
03:19:47 # for Instance: 03:19:47 # 03:19:47 #resourceRepositories=xml:http://host/path/to/index.xml 03:19:47 # or 03:19:47 #resourceRepositories=json:http://host/path/to/index.json 03:19:47 # 03:19:47 03:19:47 # 03:19:47 # Defines if the boot features are started in asynchronous mode (in a dedicated thread) 03:19:47 # 03:19:47 featuresBootAsynchronous=false 03:19:47 03:19:47 # 03:19:47 # Service requirements enforcement 03:19:47 # 03:19:47 # By default, the feature resolver checks the service requirements/capabilities of 03:19:47 # bundles for new features (xml schema >= 1.3.0) in order to automatically installs 03:19:47 # the required bundles. 03:19:47 # The following flag can have those values: 03:19:47 # - disable: service requirements are completely ignored 03:19:47 # - default: service requirements are ignored for old features 03:19:47 # - enforce: service requirements are always verified 03:19:47 # 03:19:47 #serviceRequirements=default 03:19:47 03:19:47 # 03:19:47 # Store cfg file for config element in feature 03:19:47 # 03:19:47 #configCfgStore=true 03:19:47 03:19:47 # 03:19:47 # Define if the feature service automatically refresh bundles 03:19:47 # 03:19:47 autoRefresh=true 03:19:47 03:19:47 # 03:19:47 # Configuration of features processing mechanism (overrides, blacklisting, modification of features) 03:19:47 # XML file defines instructions related to features processing 03:19:47 # versions.properties may declare properties to resolve placeholders in XML file 03:19:47 # both files are relative to ${karaf.etc} 03:19:47 # 03:19:47 #featureProcessing=org.apache.karaf.features.xml 03:19:47 #featureProcessingVersions=versions.properties 03:19:47 + configure_karaf_log karaf4 '' 03:19:47 + local -r karaf_version=karaf4 03:19:47 + local -r controllerdebugmap= 03:19:47 + local logapi=log4j 03:19:47 + grep log4j2 /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:47 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 03:19:47 log4j2.rootLogger.level = INFO 03:19:47 #log4j2.rootLogger.type = asyncRoot 03:19:47 #log4j2.rootLogger.includeLocation = false 03:19:47 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 03:19:47 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 03:19:47 log4j2.rootLogger.appenderRef.Console.ref = Console 03:19:47 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 03:19:47 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 03:19:47 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 03:19:47 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 03:19:47 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 03:19:47 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 03:19:47 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 03:19:47 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 03:19:47 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 03:19:47 log4j2.logger.spifly.name = org.apache.aries.spifly 03:19:47 log4j2.logger.spifly.level = WARN 03:19:47 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 03:19:47 log4j2.logger.audit.level = INFO 03:19:47 log4j2.logger.audit.additivity = false 03:19:47 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 03:19:47 # Console 
appender not used by default (see log4j2.rootLogger.appenderRefs) 03:19:47 log4j2.appender.console.type = Console 03:19:47 log4j2.appender.console.name = Console 03:19:47 log4j2.appender.console.layout.type = PatternLayout 03:19:47 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 03:19:47 log4j2.appender.rolling.type = RollingRandomAccessFile 03:19:47 log4j2.appender.rolling.name = RollingFile 03:19:47 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 03:19:47 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 03:19:47 #log4j2.appender.rolling.immediateFlush = false 03:19:47 log4j2.appender.rolling.append = true 03:19:47 log4j2.appender.rolling.layout.type = PatternLayout 03:19:47 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 03:19:47 log4j2.appender.rolling.policies.type = Policies 03:19:47 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 03:19:47 log4j2.appender.rolling.policies.size.size = 64MB 03:19:47 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 03:19:47 log4j2.appender.rolling.strategy.max = 7 03:19:47 log4j2.appender.audit.type = RollingRandomAccessFile 03:19:47 log4j2.appender.audit.name = AuditRollingFile 03:19:47 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 03:19:47 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 03:19:47 log4j2.appender.audit.append = true 03:19:47 log4j2.appender.audit.layout.type = PatternLayout 03:19:47 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 03:19:47 log4j2.appender.audit.policies.type = Policies 03:19:47 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 03:19:47 log4j2.appender.audit.policies.size.size = 8MB 03:19:47 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 03:19:47 log4j2.appender.audit.strategy.max = 7 03:19:47 log4j2.appender.osgi.type = PaxOsgi 03:19:47 log4j2.appender.osgi.name = PaxOsgi 03:19:47 log4j2.appender.osgi.filter = * 03:19:47 #log4j2.logger.aether.name = shaded.org.eclipse.aether 03:19:47 #log4j2.logger.aether.level = TRACE 03:19:47 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 03:19:47 #log4j2.logger.http-headers.level = DEBUG 03:19:47 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 03:19:47 #log4j2.logger.maven.level = TRACE 03:19:47 Configuring the karaf log... karaf_version: karaf4, logapi: log4j2 03:19:47 + logapi=log4j2 03:19:47 + echo 'Configuring the karaf log... 
karaf_version: karaf4, logapi: log4j2' 03:19:47 + '[' log4j2 == log4j2 ']' 03:19:47 + sed -ie 's/log4j2.appender.rolling.policies.size.size = 64MB/log4j2.appender.rolling.policies.size.size = 1GB/g' /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:47 controllerdebugmap: 03:19:47 + orgmodule=org.opendaylight.yangtools.yang.parser.repo.YangTextSchemaContextResolver 03:19:47 + orgmodule_=org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver 03:19:47 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN' 03:19:47 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN' 03:19:47 + unset IFS 03:19:47 + echo 'controllerdebugmap: ' 03:19:47 + '[' -n '' ']' 03:19:47 cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:47 + echo 'cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg' 03:19:47 + cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:47 ################################################################################ 03:19:47 # 03:19:47 # Licensed to the Apache Software Foundation (ASF) under one or more 03:19:47 # contributor license agreements. See the NOTICE file distributed with 03:19:47 # this work for additional information regarding copyright ownership. 03:19:47 # The ASF licenses this file to You under the Apache License, Version 2.0 03:19:47 # (the "License"); you may not use this file except in compliance with 03:19:47 # the License. You may obtain a copy of the License at 03:19:47 # 03:19:47 # http://www.apache.org/licenses/LICENSE-2.0 03:19:47 # 03:19:47 # Unless required by applicable law or agreed to in writing, software 03:19:47 # distributed under the License is distributed on an "AS IS" BASIS, 03:19:47 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 03:19:47 # See the License for the specific language governing permissions and 03:19:47 # limitations under the License. 
03:19:47 # 03:19:47 ################################################################################ 03:19:47 03:19:47 # Common pattern layout for appenders 03:19:47 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 03:19:47 03:19:47 # Root logger 03:19:47 log4j2.rootLogger.level = INFO 03:19:47 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 03:19:47 #log4j2.rootLogger.type = asyncRoot 03:19:47 #log4j2.rootLogger.includeLocation = false 03:19:47 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 03:19:47 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 03:19:47 log4j2.rootLogger.appenderRef.Console.ref = Console 03:19:47 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 03:19:47 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 03:19:47 03:19:47 # Filters for logs marked by org.opendaylight.odlparent.Markers 03:19:47 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 03:19:47 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 03:19:47 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 03:19:47 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 03:19:47 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 03:19:47 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 03:19:47 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 03:19:47 03:19:47 # Loggers configuration 03:19:47 03:19:47 # Spifly logger 03:19:47 log4j2.logger.spifly.name = org.apache.aries.spifly 03:19:47 log4j2.logger.spifly.level = WARN 03:19:47 03:19:47 # Security audit logger 03:19:47 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 03:19:47 log4j2.logger.audit.level = INFO 03:19:47 log4j2.logger.audit.additivity = false 03:19:47 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 03:19:47 03:19:47 # Appenders configuration 03:19:47 03:19:47 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 03:19:47 log4j2.appender.console.type = Console 03:19:47 log4j2.appender.console.name = Console 03:19:47 log4j2.appender.console.layout.type = PatternLayout 03:19:47 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 03:19:47 03:19:47 # Rolling file appender 03:19:47 log4j2.appender.rolling.type = RollingRandomAccessFile 03:19:47 log4j2.appender.rolling.name = RollingFile 03:19:47 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 03:19:47 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 03:19:47 # uncomment to not force a disk flush 03:19:47 #log4j2.appender.rolling.immediateFlush = false 03:19:47 log4j2.appender.rolling.append = true 03:19:47 log4j2.appender.rolling.layout.type = PatternLayout 03:19:47 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 03:19:47 log4j2.appender.rolling.policies.type = Policies 03:19:47 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 03:19:47 log4j2.appender.rolling.policies.size.size = 1GB 03:19:47 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 03:19:47 log4j2.appender.rolling.strategy.max = 7 03:19:47 03:19:47 # Audit file appender 03:19:47 log4j2.appender.audit.type = RollingRandomAccessFile 03:19:47 log4j2.appender.audit.name = AuditRollingFile 
03:19:47 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 03:19:47 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 03:19:47 log4j2.appender.audit.append = true 03:19:47 log4j2.appender.audit.layout.type = PatternLayout 03:19:47 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 03:19:47 log4j2.appender.audit.policies.type = Policies 03:19:47 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 03:19:47 log4j2.appender.audit.policies.size.size = 8MB 03:19:47 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 03:19:47 log4j2.appender.audit.strategy.max = 7 03:19:47 03:19:47 # OSGi appender 03:19:47 log4j2.appender.osgi.type = PaxOsgi 03:19:47 log4j2.appender.osgi.name = PaxOsgi 03:19:47 log4j2.appender.osgi.filter = * 03:19:47 03:19:47 # help with identification of maven-related problems with pax-url-aether 03:19:47 #log4j2.logger.aether.name = shaded.org.eclipse.aether 03:19:47 #log4j2.logger.aether.level = TRACE 03:19:47 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 03:19:47 #log4j2.logger.http-headers.level = DEBUG 03:19:47 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 03:19:47 #log4j2.logger.maven.level = TRACE 03:19:47 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 03:19:47 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 03:19:47 + set_java_vars /usr/lib/jvm/java-21-openjdk-amd64 2048m /tmp/karaf-0.22.1/bin/setenv 03:19:47 Configure 03:19:47 java home: /usr/lib/jvm/java-21-openjdk-amd64 03:19:47 + local -r java_home=/usr/lib/jvm/java-21-openjdk-amd64 03:19:47 + local -r controllermem=2048m 03:19:47 + local -r memconf=/tmp/karaf-0.22.1/bin/setenv 03:19:47 + echo Configure 03:19:47 + echo ' java home: /usr/lib/jvm/java-21-openjdk-amd64' 03:19:47 max memory: 2048m 03:19:47 memconf: /tmp/karaf-0.22.1/bin/setenv 03:19:47 + echo ' max memory: 2048m' 03:19:47 + echo ' memconf: /tmp/karaf-0.22.1/bin/setenv' 03:19:47 + sed -ie 's%^# export JAVA_HOME%export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64}%g' /tmp/karaf-0.22.1/bin/setenv 03:19:47 + sed -ie 's/JAVA_MAX_MEM="2048m"/JAVA_MAX_MEM=2048m/g' /tmp/karaf-0.22.1/bin/setenv 03:19:47 cat /tmp/karaf-0.22.1/bin/setenv 03:19:47 + echo 'cat /tmp/karaf-0.22.1/bin/setenv' 03:19:47 + cat /tmp/karaf-0.22.1/bin/setenv 03:19:47 #!/bin/sh 03:19:47 # 03:19:47 # Licensed to the Apache Software Foundation (ASF) under one or more 03:19:47 # contributor license agreements. See the NOTICE file distributed with 03:19:47 # this work for additional information regarding copyright ownership. 03:19:47 # The ASF licenses this file to You under the Apache License, Version 2.0 03:19:47 # (the "License"); you may not use this file except in compliance with 03:19:47 # the License. You may obtain a copy of the License at 03:19:47 # 03:19:47 # http://www.apache.org/licenses/LICENSE-2.0 03:19:47 # 03:19:47 # Unless required by applicable law or agreed to in writing, software 03:19:47 # distributed under the License is distributed on an "AS IS" BASIS, 03:19:47 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 03:19:47 # See the License for the specific language governing permissions and 03:19:47 # limitations under the License. 
03:19:47 # 03:19:47 03:19:47 # 03:19:47 # handle specific scripts; the SCRIPT_NAME is exactly the name of the Karaf 03:19:47 # script: client, instance, shell, start, status, stop, karaf 03:19:47 # 03:19:47 # if [ "${KARAF_SCRIPT}" == "SCRIPT_NAME" ]; then 03:19:47 # Actions go here... 03:19:47 # fi 03:19:47 03:19:47 # 03:19:47 # general settings which should be applied for all scripts go here; please keep 03:19:47 # in mind that it is possible that scripts might be executed more than once, e.g. 03:19:47 # in example of the start script where the start script is executed first and the 03:19:47 # karaf script afterwards. 03:19:47 # 03:19:47 03:19:47 # 03:19:47 # The following section shows the possible configuration options for the default 03:19:47 # karaf scripts 03:19:47 # 03:19:47 export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64} # Location of Java installation 03:19:47 # export JAVA_OPTS # Generic JVM options, for instance, where you can pass the memory configuration 03:19:47 # export JAVA_NON_DEBUG_OPTS # Additional non-debug JVM options 03:19:47 # export EXTRA_JAVA_OPTS # Additional JVM options 03:19:47 # export KARAF_HOME # Karaf home folder 03:19:47 # export KARAF_DATA # Karaf data folder 03:19:47 # export KARAF_BASE # Karaf base folder 03:19:47 # export KARAF_ETC # Karaf etc folder 03:19:47 # export KARAF_LOG # Karaf log folder 03:19:47 # export KARAF_SYSTEM_OPTS # First citizen Karaf options 03:19:47 # export KARAF_OPTS # Additional available Karaf options 03:19:47 # export KARAF_DEBUG # Enable debug mode 03:19:47 # export KARAF_REDIRECT # Enable/set the std/err redirection when using bin/start 03:19:47 # export KARAF_NOROOT # Prevent execution as root if set to true 03:19:47 Set Java version 03:19:47 + echo 'Set Java version' 03:19:47 + sudo /usr/sbin/alternatives --install /usr/bin/java java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 1 03:19:47 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 03:19:47 sudo: a password is required 03:19:47 + sudo /usr/sbin/alternatives --set java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 03:19:47 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 03:19:47 sudo: a password is required 03:19:47 JDK default version ... 03:19:47 + echo 'JDK default version ...' 03:19:47 + java -version 03:19:47 openjdk version "21.0.5" 2024-10-15 03:19:47 OpenJDK Runtime Environment (build 21.0.5+11-Ubuntu-1ubuntu122.04) 03:19:47 OpenJDK 64-Bit Server VM (build 21.0.5+11-Ubuntu-1ubuntu122.04, mixed mode, sharing) 03:19:47 Set JAVA_HOME 03:19:47 + echo 'Set JAVA_HOME' 03:19:47 + export JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 03:19:47 + JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 03:19:47 ++ readlink -e /usr/lib/jvm/java-21-openjdk-amd64/bin/java 03:19:47 + JAVA_RESOLVED=/usr/lib/jvm/java-21-openjdk-amd64/bin/java 03:19:47 Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java 03:19:47 + echo 'Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java' 03:19:47 Listing all open ports on controller system... 03:19:47 + echo 'Listing all open ports on controller system...' 
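The netstat call that follows fails because the image used here does not ship net-tools. ss from iproute2 reports the same socket information, so a drop-in for that step of the configuration script could look like this (a sketch, not the job's actual code):

    # Sketch: list all open TCP/UDP sockets with owning processes, with a fallback
    # for images that lack the legacy net-tools package.
    if command -v netstat >/dev/null 2>&1; then
        netstat -pnatu
    else
        ss -pnatu   # -p processes, -n numeric, -a all sockets, -t TCP, -u UDP
    fi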
03:19:47 + netstat -pnatu 03:19:47 /tmp/configuration-script.sh: line 40: netstat: command not found 03:19:47 Configuring cluster 03:19:47 + '[' -f /tmp/custom_shard_config.txt ']' 03:19:47 + echo 'Configuring cluster' 03:19:47 + /tmp/karaf-0.22.1/bin/configure_cluster.sh 1 10.30.171.230 10.30.171.111 10.30.171.29 03:19:47 ################################################ 03:19:47 ## Configure Cluster ## 03:19:47 ################################################ 03:19:47 ERROR: Cluster configurations files not found. Please configure clustering feature. 03:19:47 Dump pekko.conf 03:19:47 + echo 'Dump pekko.conf' 03:19:47 + cat /tmp/karaf-0.22.1/configuration/initial/pekko.conf 03:19:47 cat: /tmp/karaf-0.22.1/configuration/initial/pekko.conf: No such file or directory 03:19:47 Dump modules.conf 03:19:47 + echo 'Dump modules.conf' 03:19:47 + cat /tmp/karaf-0.22.1/configuration/initial/modules.conf 03:19:47 cat: /tmp/karaf-0.22.1/configuration/initial/modules.conf: No such file or directory 03:19:47 Dump module-shards.conf 03:19:47 + echo 'Dump module-shards.conf' 03:19:47 + cat /tmp/karaf-0.22.1/configuration/initial/module-shards.conf 03:19:47 cat: /tmp/karaf-0.22.1/configuration/initial/module-shards.conf: No such file or directory 03:19:47 Configuring member-2 with IP address 10.30.171.111 03:19:47 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:19:48 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:19:48 + source /tmp/common-functions.sh karaf-0.22.1 titanium 03:19:48 ++ [[ /tmp/common-functions.sh == \/\t\m\p\/\c\o\n\f\i\g\u\r\a\t\i\o\n\-\s\c\r\i\p\t\.\s\h ]] 03:19:48 common-functions.sh is being sourced 03:19:48 ++ echo 'common-functions.sh is being sourced' 03:19:48 ++ BUNDLEFOLDER=karaf-0.22.1 03:19:48 ++ DISTROSTREAM=titanium 03:19:48 ++ export MAVENCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 03:19:48 ++ MAVENCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 03:19:48 ++ export FEATURESCONF=/tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:48 ++ FEATURESCONF=/tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:48 ++ export CUSTOMPROP=/tmp/karaf-0.22.1/etc/custom.properties 03:19:48 ++ CUSTOMPROP=/tmp/karaf-0.22.1/etc/custom.properties 03:19:48 ++ export LOGCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:48 ++ LOGCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:48 ++ export MEMCONF=/tmp/karaf-0.22.1/bin/setenv 03:19:48 ++ MEMCONF=/tmp/karaf-0.22.1/bin/setenv 03:19:48 ++ export CONTROLLERMEM= 03:19:48 ++ CONTROLLERMEM= 03:19:48 ++ case "${DISTROSTREAM}" in 03:19:48 ++ CLUSTER_SYSTEM=pekko 03:19:48 ++ export AKKACONF=/tmp/karaf-0.22.1/configuration/initial/pekko.conf 03:19:48 ++ AKKACONF=/tmp/karaf-0.22.1/configuration/initial/pekko.conf 03:19:48 ++ export MODULESCONF=/tmp/karaf-0.22.1/configuration/initial/modules.conf 03:19:48 ++ MODULESCONF=/tmp/karaf-0.22.1/configuration/initial/modules.conf 03:19:48 ++ export MODULESHARDSCONF=/tmp/karaf-0.22.1/configuration/initial/module-shards.conf 03:19:48 ++ MODULESHARDSCONF=/tmp/karaf-0.22.1/configuration/initial/module-shards.conf 03:19:48 ++ print_common_env 03:19:48 ++ cat 03:19:48 common-functions environment: 03:19:48 MAVENCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 03:19:48 ACTUALFEATURES: 03:19:48 FEATURESCONF: /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:48 CUSTOMPROP: /tmp/karaf-0.22.1/etc/custom.properties 03:19:48 LOGCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:48 MEMCONF: 
/tmp/karaf-0.22.1/bin/setenv 03:19:48 CONTROLLERMEM: 03:19:48 AKKACONF: /tmp/karaf-0.22.1/configuration/initial/pekko.conf 03:19:48 MODULESCONF: /tmp/karaf-0.22.1/configuration/initial/modules.conf 03:19:48 MODULESHARDSCONF: /tmp/karaf-0.22.1/configuration/initial/module-shards.conf 03:19:48 SUITES: 03:19:48 03:19:48 ++ SSH='ssh -t -t' 03:19:48 ++ extra_services_cntl=' dnsmasq.service httpd.service libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service rabbitmq-server.service ' 03:19:48 ++ extra_services_cmp=' libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service ' 03:19:48 Changing to /tmp 03:19:48 Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9133/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 03:19:48 + echo 'Changing to /tmp' 03:19:48 + cd /tmp 03:19:48 + echo 'Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9133/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip' 03:19:48 + wget --progress=dot:mega https://nexus.opendaylight.org/content/repositories//autorelease-9133/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 03:19:48 --2025-08-23 03:19:48-- https://nexus.opendaylight.org/content/repositories//autorelease-9133/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 03:19:48 Resolving nexus.opendaylight.org (nexus.opendaylight.org)... 199.204.45.87, 2604:e100:1:0:f816:3eff:fe45:48d6 03:19:48 Connecting to nexus.opendaylight.org (nexus.opendaylight.org)|199.204.45.87|:443... connected. 03:19:48 HTTP request sent, awaiting response... 200 OK 03:19:48 Length: 236634156 (226M) [application/zip] 03:19:48 Saving to: ‘karaf-0.22.1.zip’ 03:19:48
[wget dot-progress output elided; 0K .. 230400K, 1% .. 100%, 221M=1.0s]
03:19:49 2025-08-23 03:19:49 (226 MB/s) - ‘karaf-0.22.1.zip’ saved [236634156/236634156] 03:19:49 03:19:49 Extracting the new controller... 03:19:49 + echo 'Extracting the new controller...' 03:19:49 + unzip -q karaf-0.22.1.zip 03:19:51 Adding external repositories... 03:19:51 + echo 'Adding external repositories...' 03:19:51 + sed -ie 's%org.ops4j.pax.url.mvn.repositories=%org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases%g' /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 03:19:51 + cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 03:19:51 ################################################################################ 03:19:51 # 03:19:51 # Licensed to the Apache Software Foundation (ASF) under one or more 03:19:51 # contributor license agreements. See the NOTICE file distributed with 03:19:51 # this work for additional information regarding copyright ownership. 03:19:51 # The ASF licenses this file to You under the Apache License, Version 2.0 03:19:51 # (the "License"); you may not use this file except in compliance with 03:19:51 # the License. You may obtain a copy of the License at 03:19:51 # 03:19:51 # http://www.apache.org/licenses/LICENSE-2.0 03:19:51 # 03:19:51 # Unless required by applicable law or agreed to in writing, software 03:19:51 # distributed under the License is distributed on an "AS IS" BASIS, 03:19:51 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 03:19:51 # See the License for the specific language governing permissions and 03:19:51 # limitations under the License.
03:19:51 # 03:19:51 ################################################################################ 03:19:51 03:19:51 # 03:19:51 # If set to true, the following property will not allow any certificate to be used 03:19:51 # when accessing Maven repositories through SSL 03:19:51 # 03:19:51 #org.ops4j.pax.url.mvn.certificateCheck= 03:19:51 03:19:51 # 03:19:51 # Path to the local Maven settings file. 03:19:51 # The repositories defined in this file will be automatically added to the list 03:19:51 # of default repositories if the 'org.ops4j.pax.url.mvn.repositories' property 03:19:51 # below is not set. 03:19:51 # The following locations are checked for the existence of the settings.xml file 03:19:51 # * 1. looks for the specified url 03:19:51 # * 2. if not found looks for ${user.home}/.m2/settings.xml 03:19:51 # * 3. if not found looks for ${maven.home}/conf/settings.xml 03:19:51 # * 4. if not found looks for ${M2_HOME}/conf/settings.xml 03:19:51 # 03:19:51 #org.ops4j.pax.url.mvn.settings= 03:19:51 03:19:51 # 03:19:51 # Path to the local Maven repository which is used to avoid downloading 03:19:51 # artifacts when they already exist locally. 03:19:51 # The value of this property will be extracted from the settings.xml file 03:19:51 # above, or defaulted to: 03:19:51 # System.getProperty( "user.home" ) + "/.m2/repository" 03:19:51 # 03:19:51 org.ops4j.pax.url.mvn.localRepository=${karaf.home}/${karaf.default.repository} 03:19:51 03:19:51 # 03:19:51 # Default this to false. It's just weird to use undocumented repos 03:19:51 # 03:19:51 org.ops4j.pax.url.mvn.useFallbackRepositories=false 03:19:51 03:19:51 # 03:19:51 # Uncomment if you don't wanna use the proxy settings 03:19:51 # from the Maven conf/settings.xml file 03:19:51 # 03:19:51 # org.ops4j.pax.url.mvn.proxySupport=false 03:19:51 03:19:51 # 03:19:51 # Comma separated list of repositories scanned when resolving an artifact. 03:19:51 # Those repositories will be checked before iterating through the 03:19:51 # below list of repositories and even before the local repository 03:19:51 # A repository url can be appended with zero or more of the following flags: 03:19:51 # @snapshots : the repository contains snaphots 03:19:51 # @noreleases : the repository does not contain any released artifacts 03:19:51 # 03:19:51 # The following property value will add the system folder as a repo. 03:19:51 # 03:19:51 org.ops4j.pax.url.mvn.defaultRepositories=\ 03:19:51 file:${karaf.home}/${karaf.default.repository}@id=system.repository@snapshots,\ 03:19:51 file:${karaf.data}/kar@id=kar.repository@multi@snapshots,\ 03:19:51 file:${karaf.base}/${karaf.default.repository}@id=child.system.repository@snapshots 03:19:51 03:19:51 # Use the default local repo (e.g.~/.m2/repository) as a "remote" repo 03:19:51 #org.ops4j.pax.url.mvn.defaultLocalRepoAsRemote=false 03:19:51 03:19:51 # 03:19:51 # Comma separated list of repositories scanned when resolving an artifact. 
03:19:51 # The default list includes the following repositories: 03:19:51 # http://repo1.maven.org/maven2@id=central 03:19:51 # http://repository.springsource.com/maven/bundles/release@id=spring.ebr 03:19:51 # http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external 03:19:51 # http://zodiac.springsource.com/maven/bundles/release@id=gemini 03:19:51 # http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases 03:19:51 # https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases 03:19:51 # https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 03:19:51 # To add repositories to the default ones, prepend '+' to the list of repositories 03:19:51 # to add. 03:19:51 # A repository url can be appended with zero or more of the following flags: 03:19:51 # @snapshots : the repository contains snapshots 03:19:51 # @noreleases : the repository does not contain any released artifacts 03:19:51 # @id=repository.id : the id for the repository, just like in the settings.xml this is optional but recommended 03:19:51 # 03:19:51 org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 03:19:51 03:19:51 ### ^^^ No remote repositories. This is the only ODL change compared to Karaf defaults.Configuring the startup features... 03:19:51 + [[ True == \T\r\u\e ]] 03:19:51 + echo 'Configuring the startup features...' 03:19:51 + sed -ie 's/\(featuresBoot=\|featuresBoot =\)/featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer,/g' /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:51 + FEATURE_TEST_STRING=features-test 03:19:51 + FEATURE_TEST_VERSION=0.22.1 03:19:51 + KARAF_VERSION=karaf4 03:19:51 + [[ integration == \i\n\t\e\g\r\a\t\i\o\n ]] 03:19:51 + sed -ie 's%\(featuresRepositories=\|featuresRepositories =\)%featuresRepositories = mvn:org.opendaylight.integration/features-test/0.22.1/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features,%g' /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:51 + [[ ! -z '' ]] 03:19:51 + cat /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:51 ################################################################################ 03:19:51 # 03:19:51 # Licensed to the Apache Software Foundation (ASF) under one or more 03:19:51 # contributor license agreements. See the NOTICE file distributed with 03:19:51 # this work for additional information regarding copyright ownership. 03:19:51 # The ASF licenses this file to You under the Apache License, Version 2.0 03:19:51 # (the "License"); you may not use this file except in compliance with 03:19:51 # the License. 
You may obtain a copy of the License at 03:19:51 # 03:19:51 # http://www.apache.org/licenses/LICENSE-2.0 03:19:51 # 03:19:51 # Unless required by applicable law or agreed to in writing, software 03:19:51 # distributed under the License is distributed on an "AS IS" BASIS, 03:19:51 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 03:19:51 # See the License for the specific language governing permissions and 03:19:51 # limitations under the License. 03:19:51 # 03:19:51 ################################################################################ 03:19:51 03:19:51 # 03:19:51 # Comma separated list of features repositories to register by default 03:19:51 # 03:19:51 featuresRepositories = mvn:org.opendaylight.integration/features-test/0.22.1/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features, file:${karaf.etc}/660b995b-9997-42c5-9b03-155bcd1db9f9.xml 03:19:51 03:19:51 # 03:19:51 # Comma separated list of features to install at startup 03:19:51 # 03:19:51 featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer, daa896e6-53a8-4216-a40d-9155d1754fac 03:19:51 03:19:51 # 03:19:51 # Resource repositories (OBR) that the features resolver can use 03:19:51 # to resolve requirements/capabilities 03:19:51 # 03:19:51 # The format of the resourceRepositories is 03:19:51 # resourceRepositories=[xml:url|json:url],... 03:19:51 # for Instance: 03:19:51 # 03:19:51 #resourceRepositories=xml:http://host/path/to/index.xml 03:19:51 # or 03:19:51 #resourceRepositories=json:http://host/path/to/index.json 03:19:51 # 03:19:51 03:19:51 # 03:19:51 # Defines if the boot features are started in asynchronous mode (in a dedicated thread) 03:19:51 # 03:19:51 featuresBootAsynchronous=false 03:19:51 03:19:51 # 03:19:51 # Service requirements enforcement 03:19:51 # 03:19:51 # By default, the feature resolver checks the service requirements/capabilities of 03:19:51 # bundles for new features (xml schema >= 1.3.0) in order to automatically installs 03:19:51 # the required bundles. 
03:19:51 # The following flag can have those values: 03:19:51 # - disable: service requirements are completely ignored 03:19:51 # - default: service requirements are ignored for old features 03:19:51 # - enforce: service requirements are always verified 03:19:51 # 03:19:51 #serviceRequirements=default 03:19:51 03:19:51 # 03:19:51 # Store cfg file for config element in feature 03:19:51 # 03:19:51 #configCfgStore=true 03:19:51 03:19:51 # 03:19:51 # Define if the feature service automatically refresh bundles 03:19:51 # 03:19:51 autoRefresh=true 03:19:51 03:19:51 # 03:19:51 # Configuration of features processing mechanism (overrides, blacklisting, modification of features) 03:19:51 # XML file defines instructions related to features processing 03:19:51 # versions.properties may declare properties to resolve placeholders in XML file 03:19:51 # both files are relative to ${karaf.etc} 03:19:51 # 03:19:51 #featureProcessing=org.apache.karaf.features.xml 03:19:51 #featureProcessingVersions=versions.properties 03:19:51 + configure_karaf_log karaf4 '' 03:19:51 + local -r karaf_version=karaf4 03:19:51 + local -r controllerdebugmap= 03:19:51 + local logapi=log4j 03:19:51 + grep log4j2 /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:51 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 03:19:51 log4j2.rootLogger.level = INFO 03:19:51 #log4j2.rootLogger.type = asyncRoot 03:19:51 #log4j2.rootLogger.includeLocation = false 03:19:51 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 03:19:51 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 03:19:51 log4j2.rootLogger.appenderRef.Console.ref = Console 03:19:51 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 03:19:51 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 03:19:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 03:19:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 03:19:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 03:19:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 03:19:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 03:19:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 03:19:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 03:19:51 log4j2.logger.spifly.name = org.apache.aries.spifly 03:19:51 log4j2.logger.spifly.level = WARN 03:19:51 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 03:19:51 log4j2.logger.audit.level = INFO 03:19:51 log4j2.logger.audit.additivity = false 03:19:51 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 03:19:51 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 03:19:51 log4j2.appender.console.type = Console 03:19:51 log4j2.appender.console.name = Console 03:19:51 log4j2.appender.console.layout.type = PatternLayout 03:19:51 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 03:19:51 log4j2.appender.rolling.type = RollingRandomAccessFile 03:19:51 log4j2.appender.rolling.name = RollingFile 03:19:51 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 03:19:51 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 03:19:51 #log4j2.appender.rolling.immediateFlush = false 03:19:51 log4j2.appender.rolling.append = true 
03:19:51 log4j2.appender.rolling.layout.type = PatternLayout 03:19:51 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 03:19:51 log4j2.appender.rolling.policies.type = Policies 03:19:51 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 03:19:51 log4j2.appender.rolling.policies.size.size = 64MB 03:19:51 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 03:19:51 log4j2.appender.rolling.strategy.max = 7 03:19:51 log4j2.appender.audit.type = RollingRandomAccessFile 03:19:51 log4j2.appender.audit.name = AuditRollingFile 03:19:51 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 03:19:51 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 03:19:51 log4j2.appender.audit.append = true 03:19:51 log4j2.appender.audit.layout.type = PatternLayout 03:19:51 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 03:19:51 log4j2.appender.audit.policies.type = Policies 03:19:51 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 03:19:51 log4j2.appender.audit.policies.size.size = 8MB 03:19:51 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 03:19:51 log4j2.appender.audit.strategy.max = 7 03:19:51 log4j2.appender.osgi.type = PaxOsgi 03:19:51 log4j2.appender.osgi.name = PaxOsgi 03:19:51 log4j2.appender.osgi.filter = * 03:19:51 #log4j2.logger.aether.name = shaded.org.eclipse.aether 03:19:51 #log4j2.logger.aether.level = TRACE 03:19:51 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 03:19:51 #log4j2.logger.http-headers.level = DEBUG 03:19:51 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 03:19:51 #log4j2.logger.maven.level = TRACE 03:19:51 Configuring the karaf log... karaf_version: karaf4, logapi: log4j2 03:19:51 + logapi=log4j2 03:19:51 + echo 'Configuring the karaf log... karaf_version: karaf4, logapi: log4j2' 03:19:51 + '[' log4j2 == log4j2 ']' 03:19:51 + sed -ie 's/log4j2.appender.rolling.policies.size.size = 64MB/log4j2.appender.rolling.policies.size.size = 1GB/g' /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:51 + orgmodule=org.opendaylight.yangtools.yang.parser.repo.YangTextSchemaContextResolver 03:19:51 + orgmodule_=org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver 03:19:51 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN' 03:19:51 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN' 03:19:51 controllerdebugmap: 03:19:51 cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:51 + unset IFS 03:19:51 + echo 'controllerdebugmap: ' 03:19:51 + '[' -n '' ']' 03:19:51 + echo 'cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg' 03:19:51 + cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:51 ################################################################################ 03:19:51 # 03:19:51 # Licensed to the Apache Software Foundation (ASF) under one or more 03:19:51 # contributor license agreements. See the NOTICE file distributed with 03:19:51 # this work for additional information regarding copyright ownership. 03:19:51 # The ASF licenses this file to You under the Apache License, Version 2.0 03:19:51 # (the "License"); you may not use this file except in compliance with 03:19:51 # the License. 
You may obtain a copy of the License at 03:19:51 # 03:19:51 # http://www.apache.org/licenses/LICENSE-2.0 03:19:51 # 03:19:51 # Unless required by applicable law or agreed to in writing, software 03:19:51 # distributed under the License is distributed on an "AS IS" BASIS, 03:19:51 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 03:19:51 # See the License for the specific language governing permissions and 03:19:51 # limitations under the License. 03:19:51 # 03:19:51 ################################################################################ 03:19:51 03:19:51 # Common pattern layout for appenders 03:19:51 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 03:19:51 03:19:51 # Root logger 03:19:51 log4j2.rootLogger.level = INFO 03:19:51 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 03:19:51 #log4j2.rootLogger.type = asyncRoot 03:19:51 #log4j2.rootLogger.includeLocation = false 03:19:51 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 03:19:51 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 03:19:51 log4j2.rootLogger.appenderRef.Console.ref = Console 03:19:51 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 03:19:51 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 03:19:51 03:19:51 # Filters for logs marked by org.opendaylight.odlparent.Markers 03:19:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 03:19:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 03:19:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 03:19:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 03:19:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 03:19:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 03:19:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 03:19:51 03:19:51 # Loggers configuration 03:19:51 03:19:51 # Spifly logger 03:19:51 log4j2.logger.spifly.name = org.apache.aries.spifly 03:19:51 log4j2.logger.spifly.level = WARN 03:19:51 03:19:51 # Security audit logger 03:19:51 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 03:19:51 log4j2.logger.audit.level = INFO 03:19:51 log4j2.logger.audit.additivity = false 03:19:51 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 03:19:51 03:19:51 # Appenders configuration 03:19:51 03:19:51 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 03:19:51 log4j2.appender.console.type = Console 03:19:51 log4j2.appender.console.name = Console 03:19:51 log4j2.appender.console.layout.type = PatternLayout 03:19:51 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 03:19:51 03:19:51 # Rolling file appender 03:19:51 log4j2.appender.rolling.type = RollingRandomAccessFile 03:19:51 log4j2.appender.rolling.name = RollingFile 03:19:51 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 03:19:51 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 03:19:51 # uncomment to not force a disk flush 03:19:51 #log4j2.appender.rolling.immediateFlush = false 03:19:51 log4j2.appender.rolling.append = true 03:19:51 log4j2.appender.rolling.layout.type = PatternLayout 03:19:51 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 
03:19:51 log4j2.appender.rolling.policies.type = Policies 03:19:51 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 03:19:51 log4j2.appender.rolling.policies.size.size = 1GB 03:19:51 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 03:19:51 log4j2.appender.rolling.strategy.max = 7 03:19:51 03:19:51 # Audit file appender 03:19:51 log4j2.appender.audit.type = RollingRandomAccessFile 03:19:51 log4j2.appender.audit.name = AuditRollingFile 03:19:51 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 03:19:51 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 03:19:51 log4j2.appender.audit.append = true 03:19:51 log4j2.appender.audit.layout.type = PatternLayout 03:19:51 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 03:19:51 log4j2.appender.audit.policies.type = Policies 03:19:51 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 03:19:51 log4j2.appender.audit.policies.size.size = 8MB 03:19:51 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 03:19:51 log4j2.appender.audit.strategy.max = 7 03:19:51 03:19:51 # OSGi appender 03:19:51 log4j2.appender.osgi.type = PaxOsgi 03:19:51 log4j2.appender.osgi.name = PaxOsgi 03:19:51 log4j2.appender.osgi.filter = * 03:19:51 03:19:51 # help with identification of maven-related problems with pax-url-aether 03:19:51 #log4j2.logger.aether.name = shaded.org.eclipse.aether 03:19:51 #log4j2.logger.aether.level = TRACE 03:19:51 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 03:19:51 #log4j2.logger.http-headers.level = DEBUG 03:19:51 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 03:19:51 #log4j2.logger.maven.level = TRACE 03:19:51 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 03:19:51 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 03:19:51 + set_java_vars /usr/lib/jvm/java-21-openjdk-amd64 2048m /tmp/karaf-0.22.1/bin/setenv 03:19:51 Configure 03:19:51 java home: /usr/lib/jvm/java-21-openjdk-amd64 03:19:51 + local -r java_home=/usr/lib/jvm/java-21-openjdk-amd64 03:19:51 + local -r controllermem=2048m 03:19:51 + local -r memconf=/tmp/karaf-0.22.1/bin/setenv 03:19:51 + echo Configure 03:19:51 + echo ' java home: /usr/lib/jvm/java-21-openjdk-amd64' 03:19:51 + echo ' max memory: 2048m' 03:19:51 max memory: 2048m 03:19:51 memconf: /tmp/karaf-0.22.1/bin/setenv 03:19:51 + echo ' memconf: /tmp/karaf-0.22.1/bin/setenv' 03:19:51 + sed -ie 's%^# export JAVA_HOME%export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64}%g' /tmp/karaf-0.22.1/bin/setenv 03:19:51 + sed -ie 's/JAVA_MAX_MEM="2048m"/JAVA_MAX_MEM=2048m/g' /tmp/karaf-0.22.1/bin/setenv 03:19:51 cat /tmp/karaf-0.22.1/bin/setenv 03:19:51 + echo 'cat /tmp/karaf-0.22.1/bin/setenv' 03:19:51 + cat /tmp/karaf-0.22.1/bin/setenv 03:19:51 #!/bin/sh 03:19:51 # 03:19:51 # Licensed to the Apache Software Foundation (ASF) under one or more 03:19:51 # contributor license agreements. See the NOTICE file distributed with 03:19:51 # this work for additional information regarding copyright ownership. 03:19:51 # The ASF licenses this file to You under the Apache License, Version 2.0 03:19:51 # (the "License"); you may not use this file except in compliance with 03:19:51 # the License. 
You may obtain a copy of the License at 03:19:51 # 03:19:51 # http://www.apache.org/licenses/LICENSE-2.0 03:19:51 # 03:19:51 # Unless required by applicable law or agreed to in writing, software 03:19:51 # distributed under the License is distributed on an "AS IS" BASIS, 03:19:51 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 03:19:51 # See the License for the specific language governing permissions and 03:19:51 # limitations under the License. 03:19:51 # 03:19:51 03:19:51 # 03:19:51 # handle specific scripts; the SCRIPT_NAME is exactly the name of the Karaf 03:19:51 # script: client, instance, shell, start, status, stop, karaf 03:19:51 # 03:19:51 # if [ "${KARAF_SCRIPT}" == "SCRIPT_NAME" ]; then 03:19:51 # Actions go here... 03:19:51 # fi 03:19:51 03:19:51 # 03:19:51 # general settings which should be applied for all scripts go here; please keep 03:19:51 # in mind that it is possible that scripts might be executed more than once, e.g. 03:19:51 # in example of the start script where the start script is executed first and the 03:19:51 # karaf script afterwards. 03:19:51 # 03:19:51 03:19:51 # 03:19:51 # The following section shows the possible configuration options for the default 03:19:51 # karaf scripts 03:19:51 # 03:19:51 export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64} # Location of Java installation 03:19:51 # export JAVA_OPTS # Generic JVM options, for instance, where you can pass the memory configuration 03:19:51 # export JAVA_NON_DEBUG_OPTS # Additional non-debug JVM options 03:19:51 # export EXTRA_JAVA_OPTS # Additional JVM options 03:19:51 # export KARAF_HOME # Karaf home folder 03:19:51 # export KARAF_DATA # Karaf data folder 03:19:51 # export KARAF_BASE # Karaf base folder 03:19:51 # export KARAF_ETC # Karaf etc folder 03:19:51 # export KARAF_LOG # Karaf log folder 03:19:51 # export KARAF_SYSTEM_OPTS # First citizen Karaf options 03:19:51 # export KARAF_OPTS # Additional available Karaf options 03:19:51 # export KARAF_DEBUG # Enable debug mode 03:19:51 # export KARAF_REDIRECT # Enable/set the std/err redirection when using bin/start 03:19:51 # export KARAF_NOROOT # Prevent execution as root if set to true 03:19:51 Set Java version 03:19:51 + echo 'Set Java version' 03:19:51 + sudo /usr/sbin/alternatives --install /usr/bin/java java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 1 03:19:51 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 03:19:51 sudo: a password is required 03:19:51 + sudo /usr/sbin/alternatives --set java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 03:19:51 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 03:19:51 sudo: a password is required 03:19:51 JDK default version ... 03:19:51 + echo 'JDK default version ...' 
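Note on the "Set Java version" step above: both alternatives calls fail because the session has no TTY and sudo cannot prompt for a password ("sudo: a terminal is required to read the password"). The run still proceeds because JAVA_HOME is exported in bin/setenv. The snippet below is only a sketch, not part of the original configuration script; it assumes the Jenkins user may or may not have passwordless sudo on the controller VM and degrades gracefully when it does not.

    #!/bin/bash
    # Hypothetical guard (not from the original script): only touch the
    # system-wide 'java' alternative when sudo can run non-interactively.
    JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64}
    if sudo -n true 2>/dev/null; then
        sudo /usr/sbin/alternatives --install /usr/bin/java java "${JAVA_HOME}/bin/java" 1
        sudo /usr/sbin/alternatives --set java "${JAVA_HOME}/bin/java"
    else
        echo "No passwordless sudo; relying on JAVA_HOME=${JAVA_HOME} exported in bin/setenv" >&2
    fi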
03:19:51 + java -version 03:19:51 openjdk version "21.0.5" 2024-10-15 03:19:51 OpenJDK Runtime Environment (build 21.0.5+11-Ubuntu-1ubuntu122.04) 03:19:51 OpenJDK 64-Bit Server VM (build 21.0.5+11-Ubuntu-1ubuntu122.04, mixed mode, sharing) 03:19:51 Set JAVA_HOME 03:19:51 + echo 'Set JAVA_HOME' 03:19:51 + export JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 03:19:51 + JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 03:19:51 ++ readlink -e /usr/lib/jvm/java-21-openjdk-amd64/bin/java 03:19:51 + JAVA_RESOLVED=/usr/lib/jvm/java-21-openjdk-amd64/bin/java 03:19:51 + echo 'Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java' 03:19:51 Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java 03:19:51 Listing all open ports on controller system... 03:19:51 + echo 'Listing all open ports on controller system...' 03:19:51 + netstat -pnatu 03:19:51 /tmp/configuration-script.sh: line 40: netstat: command not found 03:19:51 Configuring cluster 03:19:51 + '[' -f /tmp/custom_shard_config.txt ']' 03:19:51 + echo 'Configuring cluster' 03:19:51 + /tmp/karaf-0.22.1/bin/configure_cluster.sh 2 10.30.171.230 10.30.171.111 10.30.171.29 03:19:51 ################################################ 03:19:51 ## Configure Cluster ## 03:19:51 ################################################ 03:19:51 ERROR: Cluster configurations files not found. Please configure clustering feature. 03:19:51 Dump pekko.conf 03:19:51 + echo 'Dump pekko.conf' 03:19:51 + cat /tmp/karaf-0.22.1/configuration/initial/pekko.conf 03:19:51 cat: /tmp/karaf-0.22.1/configuration/initial/pekko.conf: No such file or directory 03:19:51 Dump modules.conf 03:19:51 + echo 'Dump modules.conf' 03:19:51 + cat /tmp/karaf-0.22.1/configuration/initial/modules.conf 03:19:51 cat: /tmp/karaf-0.22.1/configuration/initial/modules.conf: No such file or directory 03:19:51 Dump module-shards.conf 03:19:51 + echo 'Dump module-shards.conf' 03:19:51 + cat /tmp/karaf-0.22.1/configuration/initial/module-shards.conf 03:19:51 cat: /tmp/karaf-0.22.1/configuration/initial/module-shards.conf: No such file or directory 03:19:51 Configuring member-3 with IP address 10.30.171.29 03:19:51 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:19:52 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 
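Note on the "Configuring cluster" step above: configure_cluster.sh ends with "ERROR: Cluster configurations files not found" and every cat of configuration/initial/{pekko.conf,modules.conf,module-shards.conf} fails, because those files do not exist in the freshly unzipped distribution at this point; netstat is also absent on the controller VM ("netstat: command not found"). The following is a minimal sketch, not the original script: paths are taken from the log above, and ss (the iproute2 replacement for netstat) is assumed to be installed.

    #!/bin/bash
    # Sketch only: tolerate hosts without net-tools and skip dumps of
    # cluster configuration files that have not been created yet.
    if command -v netstat >/dev/null 2>&1; then
        netstat -pnatu
    else
        ss -pnatu          # iproute2 equivalent of 'netstat -pnatu'
    fi

    for conf in pekko.conf modules.conf module-shards.conf; do
        f="/tmp/karaf-0.22.1/configuration/initial/${conf}"
        if [ -f "${f}" ]; then
            echo "Dump ${conf}"
            cat "${f}"
        else
            echo "${conf} not present yet; skipping dump" >&2
        fi
    done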
03:19:52 + source /tmp/common-functions.sh karaf-0.22.1 titanium 03:19:52 ++ [[ /tmp/common-functions.sh == \/\t\m\p\/\c\o\n\f\i\g\u\r\a\t\i\o\n\-\s\c\r\i\p\t\.\s\h ]] 03:19:52 common-functions.sh is being sourced 03:19:52 ++ echo 'common-functions.sh is being sourced' 03:19:52 ++ BUNDLEFOLDER=karaf-0.22.1 03:19:52 ++ DISTROSTREAM=titanium 03:19:52 ++ export MAVENCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 03:19:52 ++ MAVENCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 03:19:52 ++ export FEATURESCONF=/tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:52 ++ FEATURESCONF=/tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:52 ++ export CUSTOMPROP=/tmp/karaf-0.22.1/etc/custom.properties 03:19:52 ++ CUSTOMPROP=/tmp/karaf-0.22.1/etc/custom.properties 03:19:52 ++ export LOGCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:52 ++ LOGCONF=/tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:52 ++ export MEMCONF=/tmp/karaf-0.22.1/bin/setenv 03:19:52 ++ MEMCONF=/tmp/karaf-0.22.1/bin/setenv 03:19:52 ++ export CONTROLLERMEM= 03:19:52 ++ CONTROLLERMEM= 03:19:52 ++ case "${DISTROSTREAM}" in 03:19:52 ++ CLUSTER_SYSTEM=pekko 03:19:52 ++ export AKKACONF=/tmp/karaf-0.22.1/configuration/initial/pekko.conf 03:19:52 ++ AKKACONF=/tmp/karaf-0.22.1/configuration/initial/pekko.conf 03:19:52 ++ export MODULESCONF=/tmp/karaf-0.22.1/configuration/initial/modules.conf 03:19:52 ++ MODULESCONF=/tmp/karaf-0.22.1/configuration/initial/modules.conf 03:19:52 ++ export MODULESHARDSCONF=/tmp/karaf-0.22.1/configuration/initial/module-shards.conf 03:19:52 ++ MODULESHARDSCONF=/tmp/karaf-0.22.1/configuration/initial/module-shards.conf 03:19:52 ++ print_common_env 03:19:52 ++ cat 03:19:52 common-functions environment: 03:19:52 MAVENCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 03:19:52 ACTUALFEATURES: 03:19:52 FEATURESCONF: /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:52 CUSTOMPROP: /tmp/karaf-0.22.1/etc/custom.properties 03:19:52 LOGCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:52 MEMCONF: /tmp/karaf-0.22.1/bin/setenv 03:19:52 CONTROLLERMEM: 03:19:52 AKKACONF: /tmp/karaf-0.22.1/configuration/initial/pekko.conf 03:19:52 MODULESCONF: /tmp/karaf-0.22.1/configuration/initial/modules.conf 03:19:52 MODULESHARDSCONF: /tmp/karaf-0.22.1/configuration/initial/module-shards.conf 03:19:52 SUITES: 03:19:52 03:19:52 ++ SSH='ssh -t -t' 03:19:52 ++ extra_services_cntl=' dnsmasq.service httpd.service libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service rabbitmq-server.service ' 03:19:52 ++ extra_services_cmp=' libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service ' 03:19:52 Changing to /tmp 03:19:52 Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9133/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 03:19:52 + echo 'Changing to /tmp' 03:19:52 + cd /tmp 03:19:52 + echo 'Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9133/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip' 03:19:52 + wget --progress=dot:mega https://nexus.opendaylight.org/content/repositories//autorelease-9133/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 03:19:52 --2025-08-23 03:19:52-- https://nexus.opendaylight.org/content/repositories//autorelease-9133/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip 03:19:52 Resolving nexus.opendaylight.org (nexus.opendaylight.org)... 
199.204.45.87, 2604:e100:1:0:f816:3eff:fe45:48d6 03:19:52 Connecting to nexus.opendaylight.org (nexus.opendaylight.org)|199.204.45.87|:443... connected. 03:19:52 HTTP request sent, awaiting response... 200 OK 03:19:52 Length: 236634156 (226M) [application/zip] 03:19:52 Saving to: ‘karaf-0.22.1.zip’ 03:19:52 03:19:52 0K ........ ........ ........ ........ ........ ........ 1% 57.9M 4s 03:19:52 3072K ........ ........ ........ ........ ........ ........ 2% 85.8M 3s 03:19:52 6144K ........ ........ ........ ........ ........ ........ 3% 149M 3s 03:19:52 9216K ........ ........ ........ ........ ........ ........ 5% 145M 2s 03:19:52 12288K ........ ........ ........ ........ ........ ........ 6% 159M 2s 03:19:52 15360K ........ ........ ........ ........ ........ ........ 7% 163M 2s 03:19:52 18432K ........ ........ ........ ........ ........ ........ 9% 126M 2s 03:19:52 21504K ........ ........ ........ ........ ........ ........ 10% 164M 2s 03:19:52 24576K ........ ........ ........ ........ ........ ........ 11% 199M 2s 03:19:52 27648K ........ ........ ........ ........ ........ ........ 13% 196M 2s 03:19:52 30720K ........ ........ ........ ........ ........ ........ 14% 204M 1s 03:19:52 33792K ........ ........ ........ ........ ........ ........ 15% 259M 1s 03:19:52 36864K ........ ........ ........ ........ ........ ........ 17% 230M 1s 03:19:52 39936K ........ ........ ........ ........ ........ ........ 18% 273M 1s 03:19:52 43008K ........ ........ ........ ........ ........ ........ 19% 274M 1s 03:19:52 46080K ........ ........ ........ ........ ........ ........ 21% 266M 1s 03:19:52 49152K ........ ........ ........ ........ ........ ........ 22% 244M 1s 03:19:52 52224K ........ ........ ........ ........ ........ ........ 23% 229M 1s 03:19:52 55296K ........ ........ ........ ........ ........ ........ 25% 225M 1s 03:19:52 58368K ........ ........ ........ ........ ........ ........ 26% 277M 1s 03:19:52 61440K ........ ........ ........ ........ ........ ........ 27% 243M 1s 03:19:52 64512K ........ ........ ........ ........ ........ ........ 29% 240M 1s 03:19:52 67584K ........ ........ ........ ........ ........ ........ 30% 273M 1s 03:19:52 70656K ........ ........ ........ ........ ........ ........ 31% 253M 1s 03:19:52 73728K ........ ........ ........ ........ ........ ........ 33% 254M 1s 03:19:52 76800K ........ ........ ........ ........ ........ ........ 34% 270M 1s 03:19:52 79872K ........ ........ ........ ........ ........ ........ 35% 241M 1s 03:19:52 82944K ........ ........ ........ ........ ........ ........ 37% 239M 1s 03:19:52 86016K ........ ........ ........ ........ ........ ........ 38% 264M 1s 03:19:52 89088K ........ ........ ........ ........ ........ ........ 39% 271M 1s 03:19:52 92160K ........ ........ ........ ........ ........ ........ 41% 253M 1s 03:19:52 95232K ........ ........ ........ ........ ........ ........ 42% 214M 1s 03:19:52 98304K ........ ........ ........ ........ ........ ........ 43% 273M 1s 03:19:52 101376K ........ ........ ........ ........ ........ ........ 45% 261M 1s 03:19:52 104448K ........ ........ ........ ........ ........ ........ 46% 260M 1s 03:19:52 107520K ........ ........ ........ ........ ........ ........ 47% 257M 1s 03:19:52 110592K ........ ........ ........ ........ ........ ........ 49% 256M 1s 03:19:52 113664K ........ ........ ........ ........ ........ ........ 50% 266M 1s 03:19:52 116736K ........ ........ ........ ........ ........ ........ 51% 264M 1s 03:19:52 119808K ........ ........ ........ ........ 
........ ........ 53% 274M 1s 03:19:53 122880K ........ ........ ........ ........ ........ ........ 54% 281M 1s 03:19:53 125952K ........ ........ ........ ........ ........ ........ 55% 269M 0s 03:19:53 129024K ........ ........ ........ ........ ........ ........ 57% 270M 0s 03:19:53 132096K ........ ........ ........ ........ ........ ........ 58% 269M 0s 03:19:53 135168K ........ ........ ........ ........ ........ ........ 59% 256M 0s 03:19:53 138240K ........ ........ ........ ........ ........ ........ 61% 271M 0s 03:19:53 141312K ........ ........ ........ ........ ........ ........ 62% 272M 0s 03:19:53 144384K ........ ........ ........ ........ ........ ........ 63% 258M 0s 03:19:53 147456K ........ ........ ........ ........ ........ ........ 65% 265M 0s 03:19:53 150528K ........ ........ ........ ........ ........ ........ 66% 272M 0s 03:19:53 153600K ........ ........ ........ ........ ........ ........ 67% 276M 0s 03:19:53 156672K ........ ........ ........ ........ ........ ........ 69% 243M 0s 03:19:53 159744K ........ ........ ........ ........ ........ ........ 70% 266M 0s 03:19:53 162816K ........ ........ ........ ........ ........ ........ 71% 246M 0s 03:19:53 165888K ........ ........ ........ ........ ........ ........ 73% 264M 0s 03:19:53 168960K ........ ........ ........ ........ ........ ........ 74% 277M 0s 03:19:53 172032K ........ ........ ........ ........ ........ ........ 75% 270M 0s 03:19:53 175104K ........ ........ ........ ........ ........ ........ 77% 289M 0s 03:19:53 178176K ........ ........ ........ ........ ........ ........ 78% 279M 0s 03:19:53 181248K ........ ........ ........ ........ ........ ........ 79% 273M 0s 03:19:53 184320K ........ ........ ........ ........ ........ ........ 81% 276M 0s 03:19:53 187392K ........ ........ ........ ........ ........ ........ 82% 282M 0s 03:19:53 190464K ........ ........ ........ ........ ........ ........ 83% 267M 0s 03:19:53 193536K ........ ........ ........ ........ ........ ........ 85% 268M 0s 03:19:53 196608K ........ ........ ........ ........ ........ ........ 86% 269M 0s 03:19:53 199680K ........ ........ ........ ........ ........ ........ 87% 253M 0s 03:19:53 202752K ........ ........ ........ ........ ........ ........ 89% 266M 0s 03:19:53 205824K ........ ........ ........ ........ ........ ........ 90% 244M 0s 03:19:53 208896K ........ ........ ........ ........ ........ ........ 91% 257M 0s 03:19:53 211968K ........ ........ ........ ........ ........ ........ 93% 269M 0s 03:19:53 215040K ........ ........ ........ ........ ........ ........ 94% 265M 0s 03:19:53 218112K ........ ........ ........ ........ ........ ........ 95% 271M 0s 03:19:53 221184K ........ ........ ........ ........ ........ ........ 97% 274M 0s 03:19:53 224256K ........ ........ ........ ........ ........ ........ 98% 289M 0s 03:19:53 227328K ........ ........ ........ ........ ........ ........ 99% 286M 0s 03:19:53 230400K ........ .. 100% 313M=1.0s 03:19:53 03:19:53 2025-08-23 03:19:53 (228 MB/s) - ‘karaf-0.22.1.zip’ saved [236634156/236634156] 03:19:53 03:19:53 Extracting the new controller... 03:19:53 + echo 'Extracting the new controller...' 03:19:53 + unzip -q karaf-0.22.1.zip 03:19:55 Adding external repositories... 03:19:55 + echo 'Adding external repositories...' 
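Note on the download/extract step above: both passes rely solely on wget's exit status before running unzip -q. A purely illustrative addition (not in the original script; the file name is taken from the log) would be to test the archive before extracting it:

    #!/bin/bash
    # Illustrative sketch: verify the downloaded distribution before extraction.
    ZIP=karaf-0.22.1.zip
    if unzip -tq "${ZIP}" >/dev/null; then
        unzip -q "${ZIP}"
    else
        echo "${ZIP} failed the integrity test" >&2
        exit 1
    fi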
03:19:55 + sed -ie 's%org.ops4j.pax.url.mvn.repositories=%org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases%g' /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 03:19:55 + cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 03:19:55 ################################################################################ 03:19:55 # 03:19:55 # Licensed to the Apache Software Foundation (ASF) under one or more 03:19:55 # contributor license agreements. See the NOTICE file distributed with 03:19:55 # this work for additional information regarding copyright ownership. 03:19:55 # The ASF licenses this file to You under the Apache License, Version 2.0 03:19:55 # (the "License"); you may not use this file except in compliance with 03:19:55 # the License. You may obtain a copy of the License at 03:19:55 # 03:19:55 # http://www.apache.org/licenses/LICENSE-2.0 03:19:55 # 03:19:55 # Unless required by applicable law or agreed to in writing, software 03:19:55 # distributed under the License is distributed on an "AS IS" BASIS, 03:19:55 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 03:19:55 # See the License for the specific language governing permissions and 03:19:55 # limitations under the License. 03:19:55 # 03:19:55 ################################################################################ 03:19:55 03:19:55 # 03:19:55 # If set to true, the following property will not allow any certificate to be used 03:19:55 # when accessing Maven repositories through SSL 03:19:55 # 03:19:55 #org.ops4j.pax.url.mvn.certificateCheck= 03:19:55 03:19:55 # 03:19:55 # Path to the local Maven settings file. 03:19:55 # The repositories defined in this file will be automatically added to the list 03:19:55 # of default repositories if the 'org.ops4j.pax.url.mvn.repositories' property 03:19:55 # below is not set. 03:19:55 # The following locations are checked for the existence of the settings.xml file 03:19:55 # * 1. looks for the specified url 03:19:55 # * 2. if not found looks for ${user.home}/.m2/settings.xml 03:19:55 # * 3. if not found looks for ${maven.home}/conf/settings.xml 03:19:55 # * 4. if not found looks for ${M2_HOME}/conf/settings.xml 03:19:55 # 03:19:55 #org.ops4j.pax.url.mvn.settings= 03:19:55 03:19:55 # 03:19:55 # Path to the local Maven repository which is used to avoid downloading 03:19:55 # artifacts when they already exist locally. 03:19:55 # The value of this property will be extracted from the settings.xml file 03:19:55 # above, or defaulted to: 03:19:55 # System.getProperty( "user.home" ) + "/.m2/repository" 03:19:55 # 03:19:55 org.ops4j.pax.url.mvn.localRepository=${karaf.home}/${karaf.default.repository} 03:19:55 03:19:55 # 03:19:55 # Default this to false. 
It's just weird to use undocumented repos 03:19:55 # 03:19:55 org.ops4j.pax.url.mvn.useFallbackRepositories=false 03:19:55 03:19:55 # 03:19:55 # Uncomment if you don't wanna use the proxy settings 03:19:55 # from the Maven conf/settings.xml file 03:19:55 # 03:19:55 # org.ops4j.pax.url.mvn.proxySupport=false 03:19:55 03:19:55 # 03:19:55 # Comma separated list of repositories scanned when resolving an artifact. 03:19:55 # Those repositories will be checked before iterating through the 03:19:55 # below list of repositories and even before the local repository 03:19:55 # A repository url can be appended with zero or more of the following flags: 03:19:55 # @snapshots : the repository contains snaphots 03:19:55 # @noreleases : the repository does not contain any released artifacts 03:19:55 # 03:19:55 # The following property value will add the system folder as a repo. 03:19:55 # 03:19:55 org.ops4j.pax.url.mvn.defaultRepositories=\ 03:19:55 file:${karaf.home}/${karaf.default.repository}@id=system.repository@snapshots,\ 03:19:55 file:${karaf.data}/kar@id=kar.repository@multi@snapshots,\ 03:19:55 file:${karaf.base}/${karaf.default.repository}@id=child.system.repository@snapshots 03:19:55 03:19:55 # Use the default local repo (e.g.~/.m2/repository) as a "remote" repo 03:19:55 #org.ops4j.pax.url.mvn.defaultLocalRepoAsRemote=false 03:19:55 03:19:55 # 03:19:55 # Comma separated list of repositories scanned when resolving an artifact. 03:19:55 # The default list includes the following repositories: 03:19:55 # http://repo1.maven.org/maven2@id=central 03:19:55 # http://repository.springsource.com/maven/bundles/release@id=spring.ebr 03:19:55 # http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external 03:19:55 # http://zodiac.springsource.com/maven/bundles/release@id=gemini 03:19:55 # http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases 03:19:55 # https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases 03:19:55 # https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 03:19:55 # To add repositories to the default ones, prepend '+' to the list of repositories 03:19:55 # to add. 03:19:55 # A repository url can be appended with zero or more of the following flags: 03:19:55 # @snapshots : the repository contains snapshots 03:19:55 # @noreleases : the repository does not contain any released artifacts 03:19:55 # @id=repository.id : the id for the repository, just like in the settings.xml this is optional but recommended 03:19:55 # 03:19:55 org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 03:19:55 03:19:55 ### ^^^ No remote repositories. 
This is the only ODL change compared to Karaf defaults.Configuring the startup features... 03:19:55 + [[ True == \T\r\u\e ]] 03:19:55 + echo 'Configuring the startup features...' 03:19:55 + sed -ie 's/\(featuresBoot=\|featuresBoot =\)/featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer,/g' /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:55 + FEATURE_TEST_STRING=features-test 03:19:55 + FEATURE_TEST_VERSION=0.22.1 03:19:55 + KARAF_VERSION=karaf4 03:19:55 + [[ integration == \i\n\t\e\g\r\a\t\i\o\n ]] 03:19:55 + sed -ie 's%\(featuresRepositories=\|featuresRepositories =\)%featuresRepositories = mvn:org.opendaylight.integration/features-test/0.22.1/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features,%g' /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:55 + [[ ! -z '' ]] 03:19:55 + cat /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:55 ################################################################################ 03:19:55 # 03:19:55 # Licensed to the Apache Software Foundation (ASF) under one or more 03:19:55 # contributor license agreements. See the NOTICE file distributed with 03:19:55 # this work for additional information regarding copyright ownership. 03:19:55 # The ASF licenses this file to You under the Apache License, Version 2.0 03:19:55 # (the "License"); you may not use this file except in compliance with 03:19:55 # the License. You may obtain a copy of the License at 03:19:55 # 03:19:55 # http://www.apache.org/licenses/LICENSE-2.0 03:19:55 # 03:19:55 # Unless required by applicable law or agreed to in writing, software 03:19:55 # distributed under the License is distributed on an "AS IS" BASIS, 03:19:55 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 03:19:55 # See the License for the specific language governing permissions and 03:19:55 # limitations under the License. 03:19:55 # 03:19:55 ################################################################################ 03:19:55 03:19:55 # 03:19:55 # Comma separated list of features repositories to register by default 03:19:55 # 03:19:55 featuresRepositories = mvn:org.opendaylight.integration/features-test/0.22.1/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features, file:${karaf.etc}/660b995b-9997-42c5-9b03-155bcd1db9f9.xml 03:19:55 03:19:55 # 03:19:55 # Comma separated list of features to install at startup 03:19:55 # 03:19:55 featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer, daa896e6-53a8-4216-a40d-9155d1754fac 03:19:55 03:19:55 # 03:19:55 # Resource repositories (OBR) that the features resolver can use 03:19:55 # to resolve requirements/capabilities 03:19:55 # 03:19:55 # The format of the resourceRepositories is 03:19:55 # resourceRepositories=[xml:url|json:url],... 
03:19:55 # for Instance: 03:19:55 # 03:19:55 #resourceRepositories=xml:http://host/path/to/index.xml 03:19:55 # or 03:19:55 #resourceRepositories=json:http://host/path/to/index.json 03:19:55 # 03:19:55 03:19:55 # 03:19:55 # Defines if the boot features are started in asynchronous mode (in a dedicated thread) 03:19:55 # 03:19:55 featuresBootAsynchronous=false 03:19:55 03:19:55 # 03:19:55 # Service requirements enforcement 03:19:55 # 03:19:55 # By default, the feature resolver checks the service requirements/capabilities of 03:19:55 # bundles for new features (xml schema >= 1.3.0) in order to automatically installs 03:19:55 # the required bundles. 03:19:55 # The following flag can have those values: 03:19:55 # - disable: service requirements are completely ignored 03:19:55 # - default: service requirements are ignored for old features 03:19:55 # - enforce: service requirements are always verified 03:19:55 # 03:19:55 #serviceRequirements=default 03:19:55 03:19:55 # 03:19:55 # Store cfg file for config element in feature 03:19:55 # 03:19:55 #configCfgStore=true 03:19:55 03:19:55 # 03:19:55 # Define if the feature service automatically refresh bundles 03:19:55 # 03:19:55 autoRefresh=true 03:19:55 03:19:55 # 03:19:55 # Configuration of features processing mechanism (overrides, blacklisting, modification of features) 03:19:55 # XML file defines instructions related to features processing 03:19:55 # versions.properties may declare properties to resolve placeholders in XML file 03:19:55 # both files are relative to ${karaf.etc} 03:19:55 # 03:19:55 #featureProcessing=org.apache.karaf.features.xml 03:19:55 #featureProcessingVersions=versions.properties 03:19:55 + configure_karaf_log karaf4 '' 03:19:55 + local -r karaf_version=karaf4 03:19:55 + local -r controllerdebugmap= 03:19:55 + local logapi=log4j 03:19:55 + grep log4j2 /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:55 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 03:19:55 log4j2.rootLogger.level = INFO 03:19:55 #log4j2.rootLogger.type = asyncRoot 03:19:55 #log4j2.rootLogger.includeLocation = false 03:19:55 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 03:19:55 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 03:19:55 log4j2.rootLogger.appenderRef.Console.ref = Console 03:19:55 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 03:19:55 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 03:19:55 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 03:19:55 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 03:19:55 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 03:19:55 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 03:19:55 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 03:19:55 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 03:19:55 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 03:19:55 log4j2.logger.spifly.name = org.apache.aries.spifly 03:19:55 log4j2.logger.spifly.level = WARN 03:19:55 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 03:19:55 log4j2.logger.audit.level = INFO 03:19:55 log4j2.logger.audit.additivity = false 03:19:55 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 03:19:55 # Console 
appender not used by default (see log4j2.rootLogger.appenderRefs) 03:19:55 log4j2.appender.console.type = Console 03:19:55 log4j2.appender.console.name = Console 03:19:55 log4j2.appender.console.layout.type = PatternLayout 03:19:55 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 03:19:55 log4j2.appender.rolling.type = RollingRandomAccessFile 03:19:55 log4j2.appender.rolling.name = RollingFile 03:19:55 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 03:19:55 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 03:19:55 #log4j2.appender.rolling.immediateFlush = false 03:19:55 log4j2.appender.rolling.append = true 03:19:55 log4j2.appender.rolling.layout.type = PatternLayout 03:19:55 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 03:19:55 log4j2.appender.rolling.policies.type = Policies 03:19:55 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 03:19:55 log4j2.appender.rolling.policies.size.size = 64MB 03:19:55 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 03:19:55 log4j2.appender.rolling.strategy.max = 7 03:19:55 log4j2.appender.audit.type = RollingRandomAccessFile 03:19:55 log4j2.appender.audit.name = AuditRollingFile 03:19:55 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 03:19:55 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 03:19:55 log4j2.appender.audit.append = true 03:19:55 log4j2.appender.audit.layout.type = PatternLayout 03:19:55 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 03:19:55 log4j2.appender.audit.policies.type = Policies 03:19:55 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 03:19:55 log4j2.appender.audit.policies.size.size = 8MB 03:19:55 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 03:19:55 log4j2.appender.audit.strategy.max = 7 03:19:55 log4j2.appender.osgi.type = PaxOsgi 03:19:55 log4j2.appender.osgi.name = PaxOsgi 03:19:55 log4j2.appender.osgi.filter = * 03:19:55 #log4j2.logger.aether.name = shaded.org.eclipse.aether 03:19:55 #log4j2.logger.aether.level = TRACE 03:19:55 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 03:19:55 #log4j2.logger.http-headers.level = DEBUG 03:19:55 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 03:19:55 #log4j2.logger.maven.level = TRACE 03:19:55 + logapi=log4j2 03:19:55 + echo 'Configuring the karaf log... karaf_version: karaf4, logapi: log4j2' 03:19:55 Configuring the karaf log... 
karaf_version: karaf4, logapi: log4j2 03:19:55 + '[' log4j2 == log4j2 ']' 03:19:55 + sed -ie 's/log4j2.appender.rolling.policies.size.size = 64MB/log4j2.appender.rolling.policies.size.size = 1GB/g' /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:55 + orgmodule=org.opendaylight.yangtools.yang.parser.repo.YangTextSchemaContextResolver 03:19:55 + orgmodule_=org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver 03:19:55 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN' 03:19:55 controllerdebugmap: 03:19:55 cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:55 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN' 03:19:55 + unset IFS 03:19:55 + echo 'controllerdebugmap: ' 03:19:55 + '[' -n '' ']' 03:19:55 + echo 'cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg' 03:19:55 + cat /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:55 ################################################################################ 03:19:55 # 03:19:55 # Licensed to the Apache Software Foundation (ASF) under one or more 03:19:55 # contributor license agreements. See the NOTICE file distributed with 03:19:55 # this work for additional information regarding copyright ownership. 03:19:55 # The ASF licenses this file to You under the Apache License, Version 2.0 03:19:55 # (the "License"); you may not use this file except in compliance with 03:19:55 # the License. You may obtain a copy of the License at 03:19:55 # 03:19:55 # http://www.apache.org/licenses/LICENSE-2.0 03:19:55 # 03:19:55 # Unless required by applicable law or agreed to in writing, software 03:19:55 # distributed under the License is distributed on an "AS IS" BASIS, 03:19:55 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 03:19:55 # See the License for the specific language governing permissions and 03:19:55 # limitations under the License. 
03:19:55 # 03:19:55 ################################################################################ 03:19:55 03:19:55 # Common pattern layout for appenders 03:19:55 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 03:19:55 03:19:55 # Root logger 03:19:55 log4j2.rootLogger.level = INFO 03:19:55 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 03:19:55 #log4j2.rootLogger.type = asyncRoot 03:19:55 #log4j2.rootLogger.includeLocation = false 03:19:55 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 03:19:55 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 03:19:55 log4j2.rootLogger.appenderRef.Console.ref = Console 03:19:55 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 03:19:55 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 03:19:55 03:19:55 # Filters for logs marked by org.opendaylight.odlparent.Markers 03:19:55 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 03:19:55 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 03:19:55 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 03:19:55 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 03:19:55 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 03:19:55 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 03:19:55 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 03:19:55 03:19:55 # Loggers configuration 03:19:55 03:19:55 # Spifly logger 03:19:55 log4j2.logger.spifly.name = org.apache.aries.spifly 03:19:55 log4j2.logger.spifly.level = WARN 03:19:55 03:19:55 # Security audit logger 03:19:55 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 03:19:55 log4j2.logger.audit.level = INFO 03:19:55 log4j2.logger.audit.additivity = false 03:19:55 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 03:19:55 03:19:55 # Appenders configuration 03:19:55 03:19:55 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 03:19:55 log4j2.appender.console.type = Console 03:19:55 log4j2.appender.console.name = Console 03:19:55 log4j2.appender.console.layout.type = PatternLayout 03:19:55 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 03:19:55 03:19:55 # Rolling file appender 03:19:55 log4j2.appender.rolling.type = RollingRandomAccessFile 03:19:55 log4j2.appender.rolling.name = RollingFile 03:19:55 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 03:19:55 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 03:19:55 # uncomment to not force a disk flush 03:19:55 #log4j2.appender.rolling.immediateFlush = false 03:19:55 log4j2.appender.rolling.append = true 03:19:55 log4j2.appender.rolling.layout.type = PatternLayout 03:19:55 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 03:19:55 log4j2.appender.rolling.policies.type = Policies 03:19:55 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 03:19:55 log4j2.appender.rolling.policies.size.size = 1GB 03:19:55 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 03:19:55 log4j2.appender.rolling.strategy.max = 7 03:19:55 03:19:55 # Audit file appender 03:19:55 log4j2.appender.audit.type = RollingRandomAccessFile 03:19:55 log4j2.appender.audit.name = AuditRollingFile 
03:19:55 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 03:19:55 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 03:19:55 log4j2.appender.audit.append = true 03:19:55 log4j2.appender.audit.layout.type = PatternLayout 03:19:55 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 03:19:55 log4j2.appender.audit.policies.type = Policies 03:19:55 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 03:19:55 log4j2.appender.audit.policies.size.size = 8MB 03:19:55 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 03:19:55 log4j2.appender.audit.strategy.max = 7 03:19:55 03:19:55 # OSGi appender 03:19:55 log4j2.appender.osgi.type = PaxOsgi 03:19:55 log4j2.appender.osgi.name = PaxOsgi 03:19:55 log4j2.appender.osgi.filter = * 03:19:55 03:19:55 # help with identification of maven-related problems with pax-url-aether 03:19:55 #log4j2.logger.aether.name = shaded.org.eclipse.aether 03:19:55 #log4j2.logger.aether.level = TRACE 03:19:55 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 03:19:55 #log4j2.logger.http-headers.level = DEBUG 03:19:55 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 03:19:55 #log4j2.logger.maven.level = TRACE 03:19:55 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 03:19:55 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 03:19:55 Configure 03:19:55 + set_java_vars /usr/lib/jvm/java-21-openjdk-amd64 2048m /tmp/karaf-0.22.1/bin/setenv 03:19:55 + local -r java_home=/usr/lib/jvm/java-21-openjdk-amd64 03:19:55 + local -r controllermem=2048m 03:19:55 + local -r memconf=/tmp/karaf-0.22.1/bin/setenv 03:19:55 + echo Configure 03:19:55 java home: /usr/lib/jvm/java-21-openjdk-amd64 03:19:55 max memory: 2048m 03:19:55 memconf: /tmp/karaf-0.22.1/bin/setenv 03:19:55 + echo ' java home: /usr/lib/jvm/java-21-openjdk-amd64' 03:19:55 + echo ' max memory: 2048m' 03:19:55 + echo ' memconf: /tmp/karaf-0.22.1/bin/setenv' 03:19:55 + sed -ie 's%^# export JAVA_HOME%export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64}%g' /tmp/karaf-0.22.1/bin/setenv 03:19:55 + sed -ie 's/JAVA_MAX_MEM="2048m"/JAVA_MAX_MEM=2048m/g' /tmp/karaf-0.22.1/bin/setenv 03:19:55 cat /tmp/karaf-0.22.1/bin/setenv 03:19:55 + echo 'cat /tmp/karaf-0.22.1/bin/setenv' 03:19:55 + cat /tmp/karaf-0.22.1/bin/setenv 03:19:55 #!/bin/sh 03:19:55 # 03:19:55 # Licensed to the Apache Software Foundation (ASF) under one or more 03:19:55 # contributor license agreements. See the NOTICE file distributed with 03:19:55 # this work for additional information regarding copyright ownership. 03:19:55 # The ASF licenses this file to You under the Apache License, Version 2.0 03:19:55 # (the "License"); you may not use this file except in compliance with 03:19:55 # the License. You may obtain a copy of the License at 03:19:55 # 03:19:55 # http://www.apache.org/licenses/LICENSE-2.0 03:19:55 # 03:19:55 # Unless required by applicable law or agreed to in writing, software 03:19:55 # distributed under the License is distributed on an "AS IS" BASIS, 03:19:55 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 03:19:55 # See the License for the specific language governing permissions and 03:19:55 # limitations under the License. 
03:19:55 # 03:19:55 03:19:55 # 03:19:55 # handle specific scripts; the SCRIPT_NAME is exactly the name of the Karaf 03:19:55 # script: client, instance, shell, start, status, stop, karaf 03:19:55 # 03:19:55 # if [ "${KARAF_SCRIPT}" == "SCRIPT_NAME" ]; then 03:19:55 # Actions go here... 03:19:55 # fi 03:19:55 03:19:55 # 03:19:55 # general settings which should be applied for all scripts go here; please keep 03:19:55 # in mind that it is possible that scripts might be executed more than once, e.g. 03:19:55 # in example of the start script where the start script is executed first and the 03:19:55 # karaf script afterwards. 03:19:55 # 03:19:55 03:19:55 # 03:19:55 # The following section shows the possible configuration options for the default 03:19:55 # karaf scripts 03:19:55 # 03:19:55 export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64} # Location of Java installation 03:19:55 # export JAVA_OPTS # Generic JVM options, for instance, where you can pass the memory configuration 03:19:55 # export JAVA_NON_DEBUG_OPTS # Additional non-debug JVM options 03:19:55 # export EXTRA_JAVA_OPTS # Additional JVM options 03:19:55 # export KARAF_HOME # Karaf home folder 03:19:55 # export KARAF_DATA # Karaf data folder 03:19:55 # export KARAF_BASE # Karaf base folder 03:19:55 # export KARAF_ETC # Karaf etc folder 03:19:55 # export KARAF_LOG # Karaf log folder 03:19:55 # export KARAF_SYSTEM_OPTS # First citizen Karaf options 03:19:55 # export KARAF_OPTS # Additional available Karaf options 03:19:55 # export KARAF_DEBUG # Enable debug mode 03:19:55 # export KARAF_REDIRECT # Enable/set the std/err redirection when using bin/start 03:19:55 # export KARAF_NOROOT # Prevent execution as root if set to true 03:19:55 Set Java version 03:19:55 + echo 'Set Java version' 03:19:55 + sudo /usr/sbin/alternatives --install /usr/bin/java java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 1 03:19:55 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 03:19:55 sudo: a password is required 03:19:55 + sudo /usr/sbin/alternatives --set java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 03:19:55 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 03:19:55 sudo: a password is required 03:19:55 JDK default version ... 03:19:55 + echo 'JDK default version ...' 03:19:55 + java -version 03:19:55 openjdk version "21.0.5" 2024-10-15 03:19:55 OpenJDK Runtime Environment (build 21.0.5+11-Ubuntu-1ubuntu122.04) 03:19:55 OpenJDK 64-Bit Server VM (build 21.0.5+11-Ubuntu-1ubuntu122.04, mixed mode, sharing) 03:19:55 Set JAVA_HOME 03:19:55 + echo 'Set JAVA_HOME' 03:19:55 + export JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 03:19:55 + JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 03:19:55 ++ readlink -e /usr/lib/jvm/java-21-openjdk-amd64/bin/java 03:19:55 Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java 03:19:55 + JAVA_RESOLVED=/usr/lib/jvm/java-21-openjdk-amd64/bin/java 03:19:55 + echo 'Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java' 03:19:55 Listing all open ports on controller system... 03:19:55 + echo 'Listing all open ports on controller system...' 
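The setenv edits traced above use the form "sed -ie 'expr' file". With GNU sed the suffix is glued to -i, so "-ie" means "edit in place and keep a backup with suffix e", which works but leaves a stray /tmp/karaf-0.22.1/bin/setenve behind. A minimal sketch of the same two substitutions with the flags separated so no backup is written; this mirrors the expressions shown in the trace and is illustrative only, not the job's own script:

# In-place edit with no backup; -e introduces the sed expression explicitly.
sed -i -e 's%^# export JAVA_HOME%export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64}%g' /tmp/karaf-0.22.1/bin/setenv
sed -i -e 's/JAVA_MAX_MEM="2048m"/JAVA_MAX_MEM=2048m/g' /tmp/karaf-0.22.1/bin/setenv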
03:19:55 + netstat -pnatu 03:19:55 /tmp/configuration-script.sh: line 40: netstat: command not found 03:19:55 Configuring cluster 03:19:55 + '[' -f /tmp/custom_shard_config.txt ']' 03:19:55 + echo 'Configuring cluster' 03:19:55 + /tmp/karaf-0.22.1/bin/configure_cluster.sh 3 10.30.171.230 10.30.171.111 10.30.171.29 03:19:55 ################################################ 03:19:55 ## Configure Cluster ## 03:19:55 ################################################ 03:19:55 ERROR: Cluster configurations files not found. Please configure clustering feature. 03:19:55 Dump pekko.conf 03:19:55 + echo 'Dump pekko.conf' 03:19:55 + cat /tmp/karaf-0.22.1/configuration/initial/pekko.conf 03:19:55 cat: /tmp/karaf-0.22.1/configuration/initial/pekko.conf: No such file or directory 03:19:55 Dump modules.conf 03:19:55 + echo 'Dump modules.conf' 03:19:55 + cat /tmp/karaf-0.22.1/configuration/initial/modules.conf 03:19:55 Dump module-shards.conf 03:19:55 cat: /tmp/karaf-0.22.1/configuration/initial/modules.conf: No such file or directory 03:19:55 + echo 'Dump module-shards.conf' 03:19:55 + cat /tmp/karaf-0.22.1/configuration/initial/module-shards.conf 03:19:55 cat: /tmp/karaf-0.22.1/configuration/initial/module-shards.conf: No such file or directory 03:19:55 Locating config plan to use... 03:19:55 config plan exists!!! 03:19:55 Changing the config plan path... 03:19:55 # Place the suites in run order: 03:19:55 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/scripts/set_akka_debug.sh 03:19:55 Executing /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/scripts/set_akka_debug.sh... 03:19:55 Copying config files to ODL Controller folder 03:19:55 Set AKKA/PEKKO debug on 10.30.171.230 03:19:55 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:19:56 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:19:56 Enable AKKA/PEKKO debug 03:19:56 sed: can't read /tmp/karaf-0.22.1/configuration/initial/pekko.conf: No such file or directory 03:19:56 Dump /tmp/karaf-0.22.1/configuration/initial/pekko.conf 03:19:56 cat: /tmp/karaf-0.22.1/configuration/initial/pekko.conf: No such file or directory 03:19:56 Dump /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:56 ################################################################################ 03:19:56 # 03:19:56 # Licensed to the Apache Software Foundation (ASF) under one or more 03:19:56 # contributor license agreements. See the NOTICE file distributed with 03:19:56 # this work for additional information regarding copyright ownership. 03:19:56 # The ASF licenses this file to You under the Apache License, Version 2.0 03:19:56 # (the "License"); you may not use this file except in compliance with 03:19:56 # the License. You may obtain a copy of the License at 03:19:56 # 03:19:56 # http://www.apache.org/licenses/LICENSE-2.0 03:19:56 # 03:19:56 # Unless required by applicable law or agreed to in writing, software 03:19:56 # distributed under the License is distributed on an "AS IS" BASIS, 03:19:56 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 03:19:56 # See the License for the specific language governing permissions and 03:19:56 # limitations under the License. 
03:19:56 # 03:19:56 ################################################################################ 03:19:56 03:19:56 # Common pattern layout for appenders 03:19:56 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 03:19:56 03:19:56 # Root logger 03:19:56 log4j2.rootLogger.level = INFO 03:19:56 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 03:19:56 #log4j2.rootLogger.type = asyncRoot 03:19:56 #log4j2.rootLogger.includeLocation = false 03:19:56 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 03:19:56 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 03:19:56 log4j2.rootLogger.appenderRef.Console.ref = Console 03:19:56 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 03:19:56 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 03:19:56 03:19:56 # Filters for logs marked by org.opendaylight.odlparent.Markers 03:19:56 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 03:19:56 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 03:19:56 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 03:19:56 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 03:19:56 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 03:19:56 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 03:19:56 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 03:19:56 03:19:56 # Loggers configuration 03:19:56 03:19:56 # Spifly logger 03:19:56 log4j2.logger.spifly.name = org.apache.aries.spifly 03:19:56 log4j2.logger.spifly.level = WARN 03:19:56 03:19:56 # Security audit logger 03:19:56 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 03:19:56 log4j2.logger.audit.level = INFO 03:19:56 log4j2.logger.audit.additivity = false 03:19:56 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 03:19:56 03:19:56 # Appenders configuration 03:19:56 03:19:56 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 03:19:56 log4j2.appender.console.type = Console 03:19:56 log4j2.appender.console.name = Console 03:19:56 log4j2.appender.console.layout.type = PatternLayout 03:19:56 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 03:19:56 03:19:56 # Rolling file appender 03:19:56 log4j2.appender.rolling.type = RollingRandomAccessFile 03:19:56 log4j2.appender.rolling.name = RollingFile 03:19:56 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 03:19:56 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 03:19:56 # uncomment to not force a disk flush 03:19:56 #log4j2.appender.rolling.immediateFlush = false 03:19:56 log4j2.appender.rolling.append = true 03:19:56 log4j2.appender.rolling.layout.type = PatternLayout 03:19:56 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 03:19:56 log4j2.appender.rolling.policies.type = Policies 03:19:56 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 03:19:56 log4j2.appender.rolling.policies.size.size = 1GB 03:19:56 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 03:19:56 log4j2.appender.rolling.strategy.max = 7 03:19:56 03:19:56 # Audit file appender 03:19:56 log4j2.appender.audit.type = RollingRandomAccessFile 03:19:56 log4j2.appender.audit.name = AuditRollingFile 
03:19:56 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 03:19:56 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 03:19:56 log4j2.appender.audit.append = true 03:19:56 log4j2.appender.audit.layout.type = PatternLayout 03:19:56 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 03:19:56 log4j2.appender.audit.policies.type = Policies 03:19:56 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 03:19:56 log4j2.appender.audit.policies.size.size = 8MB 03:19:56 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 03:19:56 log4j2.appender.audit.strategy.max = 7 03:19:56 03:19:56 # OSGi appender 03:19:56 log4j2.appender.osgi.type = PaxOsgi 03:19:56 log4j2.appender.osgi.name = PaxOsgi 03:19:56 log4j2.appender.osgi.filter = * 03:19:56 03:19:56 # help with identification of maven-related problems with pax-url-aether 03:19:56 #log4j2.logger.aether.name = shaded.org.eclipse.aether 03:19:56 #log4j2.logger.aether.level = TRACE 03:19:56 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 03:19:56 #log4j2.logger.http-headers.level = DEBUG 03:19:56 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 03:19:56 #log4j2.logger.maven.level = TRACE 03:19:56 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 03:19:56 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 03:19:56 log4j2.logger.cluster.name=akka.cluster 03:19:56 log4j2.logger.cluster.level=DEBUG 03:19:56 log4j2.logger.remote.name=akka.remote 03:19:56 log4j2.logger.remote.level=DEBUG 03:19:56 Set AKKA/PEKKO debug on 10.30.171.111 03:19:56 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:19:56 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:19:56 Enable AKKA/PEKKO debug 03:19:56 sed: can't read /tmp/karaf-0.22.1/configuration/initial/pekko.conf: No such file or directory 03:19:56 Dump /tmp/karaf-0.22.1/configuration/initial/pekko.conf 03:19:56 cat: /tmp/karaf-0.22.1/configuration/initial/pekko.conf: No such file or directory 03:19:56 Dump /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:56 ################################################################################ 03:19:56 # 03:19:56 # Licensed to the Apache Software Foundation (ASF) under one or more 03:19:56 # contributor license agreements. See the NOTICE file distributed with 03:19:56 # this work for additional information regarding copyright ownership. 03:19:56 # The ASF licenses this file to You under the Apache License, Version 2.0 03:19:56 # (the "License"); you may not use this file except in compliance with 03:19:56 # the License. You may obtain a copy of the License at 03:19:56 # 03:19:56 # http://www.apache.org/licenses/LICENSE-2.0 03:19:56 # 03:19:56 # Unless required by applicable law or agreed to in writing, software 03:19:56 # distributed under the License is distributed on an "AS IS" BASIS, 03:19:56 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 03:19:56 # See the License for the specific language governing permissions and 03:19:56 # limitations under the License. 
03:19:56 # 03:19:56 ################################################################################ 03:19:56 03:19:56 # Common pattern layout for appenders 03:19:56 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 03:19:56 03:19:56 # Root logger 03:19:56 log4j2.rootLogger.level = INFO 03:19:56 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 03:19:56 #log4j2.rootLogger.type = asyncRoot 03:19:56 #log4j2.rootLogger.includeLocation = false 03:19:56 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 03:19:56 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 03:19:56 log4j2.rootLogger.appenderRef.Console.ref = Console 03:19:56 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 03:19:56 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 03:19:56 03:19:56 # Filters for logs marked by org.opendaylight.odlparent.Markers 03:19:56 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 03:19:56 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 03:19:56 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 03:19:56 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 03:19:56 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 03:19:56 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 03:19:56 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 03:19:56 03:19:56 # Loggers configuration 03:19:56 03:19:56 # Spifly logger 03:19:56 log4j2.logger.spifly.name = org.apache.aries.spifly 03:19:56 log4j2.logger.spifly.level = WARN 03:19:56 03:19:56 # Security audit logger 03:19:56 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 03:19:56 log4j2.logger.audit.level = INFO 03:19:56 log4j2.logger.audit.additivity = false 03:19:56 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 03:19:56 03:19:56 # Appenders configuration 03:19:56 03:19:56 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 03:19:56 log4j2.appender.console.type = Console 03:19:56 log4j2.appender.console.name = Console 03:19:56 log4j2.appender.console.layout.type = PatternLayout 03:19:56 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 03:19:56 03:19:56 # Rolling file appender 03:19:56 log4j2.appender.rolling.type = RollingRandomAccessFile 03:19:56 log4j2.appender.rolling.name = RollingFile 03:19:56 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 03:19:56 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 03:19:56 # uncomment to not force a disk flush 03:19:56 #log4j2.appender.rolling.immediateFlush = false 03:19:56 log4j2.appender.rolling.append = true 03:19:56 log4j2.appender.rolling.layout.type = PatternLayout 03:19:56 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 03:19:56 log4j2.appender.rolling.policies.type = Policies 03:19:56 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 03:19:56 log4j2.appender.rolling.policies.size.size = 1GB 03:19:56 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 03:19:56 log4j2.appender.rolling.strategy.max = 7 03:19:56 03:19:56 # Audit file appender 03:19:56 log4j2.appender.audit.type = RollingRandomAccessFile 03:19:56 log4j2.appender.audit.name = AuditRollingFile 
03:19:56 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 03:19:56 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 03:19:56 log4j2.appender.audit.append = true 03:19:56 log4j2.appender.audit.layout.type = PatternLayout 03:19:56 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 03:19:56 log4j2.appender.audit.policies.type = Policies 03:19:56 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 03:19:56 log4j2.appender.audit.policies.size.size = 8MB 03:19:56 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 03:19:56 log4j2.appender.audit.strategy.max = 7 03:19:56 03:19:56 # OSGi appender 03:19:56 log4j2.appender.osgi.type = PaxOsgi 03:19:56 log4j2.appender.osgi.name = PaxOsgi 03:19:56 log4j2.appender.osgi.filter = * 03:19:56 03:19:56 # help with identification of maven-related problems with pax-url-aether 03:19:56 #log4j2.logger.aether.name = shaded.org.eclipse.aether 03:19:56 #log4j2.logger.aether.level = TRACE 03:19:56 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 03:19:56 #log4j2.logger.http-headers.level = DEBUG 03:19:56 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 03:19:56 #log4j2.logger.maven.level = TRACE 03:19:56 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 03:19:56 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 03:19:56 log4j2.logger.cluster.name=akka.cluster 03:19:56 log4j2.logger.cluster.level=DEBUG 03:19:56 log4j2.logger.remote.name=akka.remote 03:19:56 log4j2.logger.remote.level=DEBUG 03:19:56 Set AKKA/PEKKO debug on 10.30.171.29 03:19:56 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:19:57 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:19:57 Enable AKKA/PEKKO debug 03:19:57 sed: can't read /tmp/karaf-0.22.1/configuration/initial/pekko.conf: No such file or directory 03:19:57 Dump /tmp/karaf-0.22.1/configuration/initial/pekko.conf 03:19:57 cat: /tmp/karaf-0.22.1/configuration/initial/pekko.conf: No such file or directory 03:19:57 Dump /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:57 ################################################################################ 03:19:57 # 03:19:57 # Licensed to the Apache Software Foundation (ASF) under one or more 03:19:57 # contributor license agreements. See the NOTICE file distributed with 03:19:57 # this work for additional information regarding copyright ownership. 03:19:57 # The ASF licenses this file to You under the Apache License, Version 2.0 03:19:57 # (the "License"); you may not use this file except in compliance with 03:19:57 # the License. You may obtain a copy of the License at 03:19:57 # 03:19:57 # http://www.apache.org/licenses/LICENSE-2.0 03:19:57 # 03:19:57 # Unless required by applicable law or agreed to in writing, software 03:19:57 # distributed under the License is distributed on an "AS IS" BASIS, 03:19:57 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 03:19:57 # See the License for the specific language governing permissions and 03:19:57 # limitations under the License. 
03:19:57 # 03:19:57 ################################################################################ 03:19:57 03:19:57 # Common pattern layout for appenders 03:19:57 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 03:19:57 03:19:57 # Root logger 03:19:57 log4j2.rootLogger.level = INFO 03:19:57 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 03:19:57 #log4j2.rootLogger.type = asyncRoot 03:19:57 #log4j2.rootLogger.includeLocation = false 03:19:57 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 03:19:57 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 03:19:57 log4j2.rootLogger.appenderRef.Console.ref = Console 03:19:57 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 03:19:57 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 03:19:57 03:19:57 # Filters for logs marked by org.opendaylight.odlparent.Markers 03:19:57 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 03:19:57 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 03:19:57 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 03:19:57 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 03:19:57 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 03:19:57 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 03:19:57 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 03:19:57 03:19:57 # Loggers configuration 03:19:57 03:19:57 # Spifly logger 03:19:57 log4j2.logger.spifly.name = org.apache.aries.spifly 03:19:57 log4j2.logger.spifly.level = WARN 03:19:57 03:19:57 # Security audit logger 03:19:57 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 03:19:57 log4j2.logger.audit.level = INFO 03:19:57 log4j2.logger.audit.additivity = false 03:19:57 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 03:19:57 03:19:57 # Appenders configuration 03:19:57 03:19:57 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 03:19:57 log4j2.appender.console.type = Console 03:19:57 log4j2.appender.console.name = Console 03:19:57 log4j2.appender.console.layout.type = PatternLayout 03:19:57 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 03:19:57 03:19:57 # Rolling file appender 03:19:57 log4j2.appender.rolling.type = RollingRandomAccessFile 03:19:57 log4j2.appender.rolling.name = RollingFile 03:19:57 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 03:19:57 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 03:19:57 # uncomment to not force a disk flush 03:19:57 #log4j2.appender.rolling.immediateFlush = false 03:19:57 log4j2.appender.rolling.append = true 03:19:57 log4j2.appender.rolling.layout.type = PatternLayout 03:19:57 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 03:19:57 log4j2.appender.rolling.policies.type = Policies 03:19:57 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 03:19:57 log4j2.appender.rolling.policies.size.size = 1GB 03:19:57 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 03:19:57 log4j2.appender.rolling.strategy.max = 7 03:19:57 03:19:57 # Audit file appender 03:19:57 log4j2.appender.audit.type = RollingRandomAccessFile 03:19:57 log4j2.appender.audit.name = AuditRollingFile 
03:19:57 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 03:19:57 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 03:19:57 log4j2.appender.audit.append = true 03:19:57 log4j2.appender.audit.layout.type = PatternLayout 03:19:57 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 03:19:57 log4j2.appender.audit.policies.type = Policies 03:19:57 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 03:19:57 log4j2.appender.audit.policies.size.size = 8MB 03:19:57 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 03:19:57 log4j2.appender.audit.strategy.max = 7 03:19:57 03:19:57 # OSGi appender 03:19:57 log4j2.appender.osgi.type = PaxOsgi 03:19:57 log4j2.appender.osgi.name = PaxOsgi 03:19:57 log4j2.appender.osgi.filter = * 03:19:57 03:19:57 # help with identification of maven-related problems with pax-url-aether 03:19:57 #log4j2.logger.aether.name = shaded.org.eclipse.aether 03:19:57 #log4j2.logger.aether.level = TRACE 03:19:57 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 03:19:57 #log4j2.logger.http-headers.level = DEBUG 03:19:57 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 03:19:57 #log4j2.logger.maven.level = TRACE 03:19:57 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 03:19:57 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 03:19:57 log4j2.logger.cluster.name=akka.cluster 03:19:57 log4j2.logger.cluster.level=DEBUG 03:19:57 log4j2.logger.remote.name=akka.remote 03:19:57 log4j2.logger.remote.level=DEBUG 03:19:57 Finished running config plans 03:19:57 Starting member-1 with IP address 10.30.171.230 03:19:57 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:19:57 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:19:57 Redirecting karaf console output to karaf_console.log 03:19:57 Starting controller... 03:19:57 start: Redirecting Karaf output to /tmp/karaf-0.22.1/data/log/karaf_console.log 03:19:57 Starting member-2 with IP address 10.30.171.111 03:19:57 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:19:58 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:19:58 Redirecting karaf console output to karaf_console.log 03:19:58 Starting controller... 03:19:58 start: Redirecting Karaf output to /tmp/karaf-0.22.1/data/log/karaf_console.log 03:19:58 Starting member-3 with IP address 10.30.171.29 03:19:58 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:19:58 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:19:58 Redirecting karaf console output to karaf_console.log 03:19:58 Starting controller... 
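The set_akka_debug.sh run above appends akka.cluster and akka.remote DEBUG loggers to each node's org.ops4j.pax.logging.cfg, while the clustering files it also tries to touch are named pekko.conf (and were not found at this point). If the controller is in fact running on Apache Pekko, the corresponding log categories live under org.apache.pekko; a hedged sketch of what the appended loggers might look like in that case, reusing the config path from the dumps above (the logger key names pekkocluster and pekkoremote are illustrative, not taken from the job):

# Hypothetical Pekko counterparts of the akka.* DEBUG loggers appended above.
cat >> /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg <<'EOF'
log4j2.logger.pekkocluster.name = org.apache.pekko.cluster
log4j2.logger.pekkocluster.level = DEBUG
log4j2.logger.pekkoremote.name = org.apache.pekko.remote
log4j2.logger.pekkoremote.level = DEBUG
EOF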
03:19:58 start: Redirecting Karaf output to /tmp/karaf-0.22.1/data/log/karaf_console.log 03:19:58 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins16751704860582708211.sh 03:19:58 common-functions.sh is being sourced 03:19:58 common-functions environment: 03:19:58 MAVENCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.url.mvn.cfg 03:19:58 ACTUALFEATURES: 03:19:58 FEATURESCONF: /tmp/karaf-0.22.1/etc/org.apache.karaf.features.cfg 03:19:58 CUSTOMPROP: /tmp/karaf-0.22.1/etc/custom.properties 03:19:58 LOGCONF: /tmp/karaf-0.22.1/etc/org.ops4j.pax.logging.cfg 03:19:58 MEMCONF: /tmp/karaf-0.22.1/bin/setenv 03:19:58 CONTROLLERMEM: 2048m 03:19:58 AKKACONF: /tmp/karaf-0.22.1/configuration/initial/pekko.conf 03:19:58 MODULESCONF: /tmp/karaf-0.22.1/configuration/initial/modules.conf 03:19:58 MODULESHARDSCONF: /tmp/karaf-0.22.1/configuration/initial/module-shards.conf 03:19:58 SUITES: 03:19:58 03:19:58 + echo '#################################################' 03:19:58 ################################################# 03:19:58 + echo '## Verify Cluster is UP ##' 03:19:58 ## Verify Cluster is UP ## 03:19:58 + echo '#################################################' 03:19:58 ################################################# 03:19:58 + create_post_startup_script 03:19:58 + cat 03:19:58 + copy_and_run_post_startup_script 03:19:58 + seed_index=1 03:19:58 ++ seq 1 3 03:19:58 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 03:19:58 + CONTROLLERIP=ODL_SYSTEM_1_IP 03:19:58 + echo 'Execute the post startup script on controller 10.30.171.230' 03:19:58 Execute the post startup script on controller 10.30.171.230 03:19:58 + scp /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/post-startup-script.sh 10.30.171.230:/tmp/ 03:19:58 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:19:59 + ssh 10.30.171.230 'bash /tmp/post-startup-script.sh 1' 03:19:59 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:19:59 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:20:04 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:20:09 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:20:14 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:20:19 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:20:24 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:20:29 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:20:34 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:20:39 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:20:44 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:20:49 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:20:54 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:20:59 Waiting up to 3 minutes for controller to come up, checking every 5 seconds... 03:21:04 2025-08-23T03:20:25,274 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 03:21:04 Controller is UP 03:21:04 2025-08-23T03:20:25,274 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 03:21:04 Listing all open ports on controller system... 
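Both the configuration script and the post-startup script try to list open ports with netstat, which is not installed on these images, hence the repeated "netstat: command not found" lines above and below. A minimal equivalent using ss from iproute2, assuming that package is present on the minions (it normally is on CentOS 8 and recent Ubuntu images):

# Same view as 'netstat -pnatu': all TCP and UDP sockets, numeric, with owning process.
ss -pnatu
# Listening TCP sockets only, which is usually what the port check cares about.
ss -plnt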
03:21:04 /tmp/post-startup-script.sh: line 51: netstat: command not found 03:21:04 looking for "BindException: Address already in use" in log file 03:21:04 looking for "server is unhealthy" in log file 03:21:04 + '[' 1 == 0 ']' 03:21:04 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 03:21:04 + CONTROLLERIP=ODL_SYSTEM_2_IP 03:21:04 + echo 'Execute the post startup script on controller 10.30.171.111' 03:21:04 Execute the post startup script on controller 10.30.171.111 03:21:04 + scp /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/post-startup-script.sh 10.30.171.111:/tmp/ 03:21:04 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:21:04 + ssh 10.30.171.111 'bash /tmp/post-startup-script.sh 2' 03:21:04 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:21:05 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:21:10 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:21:15 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:21:20 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:21:25 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:21:30 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:21:35 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:21:40 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:21:45 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:21:50 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:21:55 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:22:00 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:22:05 Waiting up to 3 minutes for controller to come up, checking every 5 seconds... 03:22:10 2025-08-23T03:20:15,337 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 03:22:10 Controller is UP 03:22:10 2025-08-23T03:20:15,337 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 03:22:10 Listing all open ports on controller system... 03:22:10 /tmp/post-startup-script.sh: line 51: netstat: command not found 03:22:10 looking for "BindException: Address already in use" in log file 03:22:10 looking for "server is unhealthy" in log file 03:22:10 + '[' 2 == 0 ']' 03:22:10 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 03:22:10 + CONTROLLERIP=ODL_SYSTEM_3_IP 03:22:10 + echo 'Execute the post startup script on controller 10.30.171.29' 03:22:10 Execute the post startup script on controller 10.30.171.29 03:22:10 + scp /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/post-startup-script.sh 10.30.171.29:/tmp/ 03:22:10 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:22:10 + ssh 10.30.171.29 'bash /tmp/post-startup-script.sh 3' 03:22:10 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 
03:22:10 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:22:15 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:22:20 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:22:25 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:22:30 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:22:35 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:22:40 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:22:45 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:22:52 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:22:55 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:23:01 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:23:05 /tmp/post-startup-script.sh: line 4: netstat: command not found 03:23:10 Waiting up to 3 minutes for controller to come up, checking every 5 seconds... 03:23:15 2025-08-23T03:20:18,005 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 03:23:15 Controller is UP 03:23:15 2025-08-23T03:20:18,005 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 201 - org.opendaylight.infrautils.ready-api - 7.1.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 03:23:15 Listing all open ports on controller system... 03:23:15 /tmp/post-startup-script.sh: line 51: netstat: command not found 03:23:15 looking for "BindException: Address already in use" in log file 03:23:15 looking for "server is unhealthy" in log file 03:23:15 + '[' 0 == 0 ']' 03:23:15 + seed_index=1 03:23:15 + dump_controller_threads 03:23:15 ++ seq 1 3 03:23:15 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 03:23:15 + CONTROLLERIP=ODL_SYSTEM_1_IP 03:23:15 + echo 'Let'\''s take the karaf thread dump' 03:23:15 Let's take the karaf thread dump 03:23:15 + ssh 10.30.171.230 'sudo ps aux' 03:23:16 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:23:16 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/ps_before.log 03:23:16 ++ grep -v grep 03:23:16 ++ tr -s ' ' 03:23:16 ++ cut -f2 '-d ' 03:23:16 + pid=2223 03:23:16 + echo 'karaf main: org.apache.karaf.main.Main, pid:2223' 03:23:16 karaf main: org.apache.karaf.main.Main, pid:2223 03:23:16 + ssh 10.30.171.230 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 2223' 03:23:16 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:23:17 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 03:23:17 + CONTROLLERIP=ODL_SYSTEM_2_IP 03:23:17 + echo 'Let'\''s take the karaf thread dump' 03:23:17 Let's take the karaf thread dump 03:23:17 + ssh 10.30.171.111 'sudo ps aux' 03:23:17 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:23:17 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/ps_before.log 03:23:17 ++ grep -v grep 03:23:17 ++ tr -s ' ' 03:23:17 ++ cut -f2 '-d ' 03:23:17 + pid=2114 03:23:17 + echo 'karaf main: org.apache.karaf.main.Main, pid:2114' 03:23:17 karaf main: org.apache.karaf.main.Main, pid:2114 03:23:17 + ssh 10.30.171.111 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 2114' 03:23:17 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 
03:23:18 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 03:23:18 + CONTROLLERIP=ODL_SYSTEM_3_IP 03:23:18 + echo 'Let'\''s take the karaf thread dump' 03:23:18 Let's take the karaf thread dump 03:23:18 + ssh 10.30.171.29 'sudo ps aux' 03:23:18 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:23:18 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/ps_before.log 03:23:18 ++ grep -v grep 03:23:18 ++ tr -s ' ' 03:23:18 ++ cut -f2 '-d ' 03:23:18 + pid=2123 03:23:18 + echo 'karaf main: org.apache.karaf.main.Main, pid:2123' 03:23:18 karaf main: org.apache.karaf.main.Main, pid:2123 03:23:18 + ssh 10.30.171.29 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 2123' 03:23:18 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:23:19 + '[' 0 -gt 0 ']' 03:23:19 + echo 'Generating controller variables...' 03:23:19 Generating controller variables... 03:23:19 ++ seq 1 3 03:23:19 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 03:23:19 + CONTROLLERIP=ODL_SYSTEM_1_IP 03:23:19 + odl_variables=' -v ODL_SYSTEM_1_IP:10.30.171.230' 03:23:19 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 03:23:19 + CONTROLLERIP=ODL_SYSTEM_2_IP 03:23:19 + odl_variables=' -v ODL_SYSTEM_1_IP:10.30.171.230 -v ODL_SYSTEM_2_IP:10.30.171.111' 03:23:19 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 03:23:19 + CONTROLLERIP=ODL_SYSTEM_3_IP 03:23:19 + odl_variables=' -v ODL_SYSTEM_1_IP:10.30.171.230 -v ODL_SYSTEM_2_IP:10.30.171.111 -v ODL_SYSTEM_3_IP:10.30.171.29' 03:23:19 + echo 'Generating mininet variables...' 03:23:19 Generating mininet variables... 03:23:19 ++ seq 1 1 03:23:19 + for i in $(seq 1 "${NUM_TOOLS_SYSTEM}") 03:23:19 + MININETIP=TOOLS_SYSTEM_1_IP 03:23:19 + tools_variables=' -v TOOLS_SYSTEM_1_IP:10.30.171.150' 03:23:19 + get_test_suites SUITES 03:23:19 + local __suite_list=SUITES 03:23:19 + echo 'Locating test plan to use...' 03:23:19 Locating test plan to use... 03:23:19 + testplan_filepath=/w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/testplans/openflowplugin-clustering-titanium.txt 03:23:19 + '[' '!' -f /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/testplans/openflowplugin-clustering-titanium.txt ']' 03:23:19 + testplan_filepath=/w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/testplans/openflowplugin-clustering.txt 03:23:19 + '[' disabled '!=' disabled ']' 03:23:19 + echo 'Changing the testplan path...' 03:23:19 Changing the testplan path... 
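The thread-dump loop above finds each Karaf PID by grepping a saved "sudo ps aux" listing for org.apache.karaf.main.Main and slicing fields with tr and cut. Since the JDK path is already known, jps can return the same PID in one step; a small sketch for member-1, assuming the Karaf JVM runs as the same jenkins user that ssh logs in as (jps only lists the invoking user's JVMs). This is an alternative for reference, not what the job runs:

# Ask the remote node's jps for the PID of the Karaf main class, then dump threads.
pid=$(ssh 10.30.171.230 "/usr/lib/jvm/java-21-openjdk-amd64/bin/jps -l | awk '/org.apache.karaf.main.Main/ {print \$1}'")
ssh 10.30.171.230 "/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l ${pid}"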
03:23:19 + sed s:integration:/w/workspace/openflowplugin-csit-3node-clustering-only-titanium: /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/testplans/openflowplugin-clustering.txt 03:23:19 + cat testplan.txt 03:23:19 # Place the suites in run order: 03:23:19 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot 03:23:19 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot 03:23:19 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot 03:23:19 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot 03:23:19 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot 03:23:19 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot 03:23:19 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot 03:23:19 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot 03:23:19 + '[' -z '' ']' 03:23:19 ++ grep -E -v '(^[[:space:]]*#|^[[:space:]]*$)' testplan.txt 03:23:19 ++ tr '\012' ' ' 03:23:19 + suite_list='/w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot ' 03:23:19 + eval 'SUITES='\''/w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot 
/w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot '\''' 03:23:19 ++ SUITES='/w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot ' 03:23:19 + echo 'Starting Robot test suites /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot ...' 
03:23:19 Starting Robot test suites /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot ... 03:23:19 + robot -N openflowplugin-clustering.txt --removekeywords wuks -e exclude -e skip_if_titanium -v BUNDLEFOLDER:karaf-0.22.1 -v BUNDLE_URL:https://nexus.opendaylight.org/content/repositories//autorelease-9133/org/opendaylight/integration/karaf/0.22.1/karaf-0.22.1.zip -v CONTROLLER:10.30.171.230 -v CONTROLLER1:10.30.171.111 -v CONTROLLER2:10.30.171.29 -v CONTROLLER_USER:jenkins -v JAVA_HOME:/usr/lib/jvm/java-21-openjdk-amd64 -v JDKVERSION:openjdk21 -v JENKINS_WORKSPACE:/w/workspace/openflowplugin-csit-3node-clustering-only-titanium -v MININET:10.30.171.150 -v MININET1: -v MININET2: -v MININET_USER:jenkins -v NEXUSURL_PREFIX:https://nexus.opendaylight.org -v NUM_ODL_SYSTEM:3 -v NUM_TOOLS_SYSTEM:1 -v ODL_STREAM:titanium -v ODL_SYSTEM_IP:10.30.171.230 -v ODL_SYSTEM_1_IP:10.30.171.230 -v ODL_SYSTEM_2_IP:10.30.171.111 -v ODL_SYSTEM_3_IP:10.30.171.29 -v ODL_SYSTEM_USER:jenkins -v TOOLS_SYSTEM_IP:10.30.171.150 -v TOOLS_SYSTEM_1_IP:10.30.171.150 -v TOOLS_SYSTEM_USER:jenkins -v USER_HOME:/home/jenkins -v IS_KARAF_APPL:True -v WORKSPACE:/tmp -v ODL_OF_PLUGIN:lithium /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot 03:23:19 ============================================================================== 03:23:19 openflowplugin-clustering.txt 03:23:19 
============================================================================== 03:23:20 openflowplugin-clustering.txt.Cluster HA Owner Failover :: Test suite for C... 03:23:20 ============================================================================== 03:23:25 Check Shards Status Before Fail :: Check Status for all shards in ... | FAIL | 03:23:27 Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttributes(ReadHandler.java:221)\\n\\tat org.jolokia.handler.ReadHandler.fetchAttributes(ReadHa... 03:23:27 [ Message content over the limit has been removed. 
] 03:23:27 ...rvice.jetty.internal.PrioritizedHandlerCollection.handle(PrioritizedHandlerCollection.java:96)\\n\\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 183 (char 598) 03:23:27 ------------------------------------------------------------------------------ 03:23:27 Start Mininet Multiple Connections :: Start mininet tree,2 with co... | PASS | 03:23:36 ------------------------------------------------------------------------------ 03:23:36 Check Entity Owner Status And Find Owner and Successor Before Fail... | FAIL | 03:24:07 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 03:24:07 Lengths are different: 2 != 0 03:24:07 ------------------------------------------------------------------------------ 03:24:07 Reconnect Extra Switches To Successors And Check OVS Connections :... | FAIL | 03:24:07 Variable '@{original_successor_list}' not found. 03:24:07 ------------------------------------------------------------------------------ 03:24:07 Check Network Operational Information Before Fail :: Check devices... | FAIL | 03:24:14 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 5 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}]}]}}' contains 'openflow:1' 11 times, not 21 times. 03:24:14 ------------------------------------------------------------------------------ 03:24:14 Add Configuration In Owner and Verify Before Fail :: Add Flow in O... | FAIL | 03:24:14 Variable '${original_owner}' not found. 03:24:14 ------------------------------------------------------------------------------ 03:24:14 Modify Configuration In Owner and Verify Before Fail :: Modify Flo... | FAIL | 03:24:14 Variable '${original_owner}' not found. 03:24:14 ------------------------------------------------------------------------------ 03:24:14 Delete Configuration In Owner and Verify Before Fail :: Delete Flo... | FAIL | 03:24:14 Variable '${original_owner}' not found. 03:24:14 ------------------------------------------------------------------------------ 03:24:14 Add Configuration In Successor and Verify Before Fail :: Add Flow ... | FAIL | 03:24:14 Variable '${original_successor}' not found. 
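The very first failure in this suite comes from reading a shard MBean through Jolokia and then failing to parse the reply: the 404 body embeds a stack trace containing raw control characters, which json.loads rejects in strict mode (the JSONDecodeError above). A hedged sketch for reproducing the query by hand against member-1, assuming the Jolokia endpoint on port 8181 with the admin:admin credentials these CSIT jobs normally use; the MBean name is the one from the error above with member-2 swapped for member-1:

# Read the inventory-operational shard MBean via Jolokia and parse the reply even if
# it carries literal control characters (strict=False tolerates them inside strings).
curl -s -u admin:admin \
  "http://10.30.171.230:8181/jolokia/read/org.opendaylight.controller:Category=Shards,name=member-1-shard-inventory-operational,type=DistributedOperationalDatastore" \
  | python3 -c 'import json,sys; d=json.loads(sys.stdin.read(), strict=False); print(d.get("status"), d.get("error", ""))'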
03:24:14 ------------------------------------------------------------------------------ 03:24:14 Modify Configuration In Successor and Verify Before Fail :: Modify... | FAIL | 03:24:14 Variable '${original_successor}' not found. 03:24:14 ------------------------------------------------------------------------------ 03:24:14 Delete Configuration In Successor and Verify Before Fail :: Delete... | FAIL | 03:24:14 Variable '${original_successor}' not found. 03:24:14 ------------------------------------------------------------------------------ 03:24:14 Send RPC Add to Owner and Verify Before Fail :: Add Flow in Owner ... | FAIL | 03:24:14 Variable '${original_owner}' not found. 03:24:14 ------------------------------------------------------------------------------ 03:24:14 Send RPC Delete to Owner and Verify Before Fail :: Delete Flow in ... | FAIL | 03:24:14 Variable '${original_owner}' not found. 03:24:14 ------------------------------------------------------------------------------ 03:24:14 Send RPC Add to Successor and Verify Before Fail :: Add Flow in Su... | FAIL | 03:24:14 Variable '${original_successor}' not found. 03:24:14 ------------------------------------------------------------------------------ 03:24:14 Send RPC Delete to Successor and Verify Before Fail :: Delete Flow... | FAIL | 03:24:14 Variable '${original_successor}' not found. 03:24:14 ------------------------------------------------------------------------------ 03:24:14 Modify Network And Verify Before Fail :: Take a link down and veri... | FAIL | 03:24:35 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 20 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","o
pendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}]}]}}' contains 'openflow:1' 11 times, not 16 times. 03:24:35 ------------------------------------------------------------------------------ 03:24:35 Restore Network And Verify Before Fail :: Take the link up and ver... | FAIL | 03:24:46 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 10 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\'
]/node-connector[id=\'openflow:1:1\']"}]}]}]}}' contains 'openflow:1' 11 times, not 21 times. 03:24:46 ------------------------------------------------------------------------------ 03:24:46 Kill Owner Instance :: Kill Owner Instance and verify it is dead | FAIL | 03:24:46 Variable '${original_owner}' not found. 03:24:46 ------------------------------------------------------------------------------ 03:24:46 Check Shards Status After Fail :: Create original cluster list and... | FAIL | 03:24:46 Variable '${new_cluster_list}' not found. 03:24:46 ------------------------------------------------------------------------------ 03:24:46 Check Entity Owner Status And Find Owner and Successor After Fail ... | FAIL | 03:24:46 Variable '${original_successor}' not found. 03:24:46 ------------------------------------------------------------------------------ 03:24:46 Check Network Operational Information After Fail :: Check devices ... | FAIL | 03:24:46 Variable '${new_cluster_list}' not found. 03:24:46 ------------------------------------------------------------------------------ 03:24:46 Add Configuration In Owner and Verify After Fail :: Add Flow in Ow... | FAIL | 03:24:46 Variable '${new_owner}' not found. 03:24:46 ------------------------------------------------------------------------------ 03:24:46 Modify Configuration In Owner and Verify After Fail :: Modify Flow... | FAIL | 03:24:46 Variable '${new_owner}' not found. 03:24:46 ------------------------------------------------------------------------------ 03:24:46 Delete Configuration In Owner and Verify After Fail :: Delete Flow... | FAIL | 03:24:46 Variable '${new_owner}' not found. 03:24:46 ------------------------------------------------------------------------------ 03:24:46 Add Configuration In Successor and Verify After Fail :: Add Flow i... | FAIL | 03:24:46 Variable '${new_successor}' not found. 03:24:46 ------------------------------------------------------------------------------ 03:24:46 Modify Configuration In Successor and Verify After Fail :: Modify ... | FAIL | 03:24:46 Variable '${new_successor}' not found. 03:24:46 ------------------------------------------------------------------------------ 03:24:46 Delete Configuration In Successor and Verify After Fail :: Delete ... | FAIL | 03:24:46 Variable '${new_successor}' not found. 03:24:46 ------------------------------------------------------------------------------ 03:24:46 Send RPC Add to Owner and Verify After Fail :: Add Flow in Owner a... | FAIL | 03:24:46 Variable '${new_owner}' not found. 03:24:46 ------------------------------------------------------------------------------ 03:24:46 Send RPC Delete to Owner and Verify After Fail :: Delete Flow in O... | FAIL | 03:24:46 Variable '${new_owner}' not found. 03:24:46 ------------------------------------------------------------------------------ 03:24:46 Send RPC Add to Successor and Verify After Fail :: Add Flow in Suc... | FAIL | 03:24:46 Variable '${new_successor}' not found. 03:24:46 ------------------------------------------------------------------------------ 03:24:46 Send RPC Delete to Successor and Verify After Fail :: Delete Flow ... | FAIL | 03:24:46 Variable '${new_successor}' not found. 03:24:46 ------------------------------------------------------------------------------ 03:24:46 Modify Network and Verify After Fail :: Take a link down and verif... | FAIL | 03:24:46 Variable '${new_cluster_list}' not found. 
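The "contains 'openflow:1' 11 times, not 21 times" failures above are occurrence counts: the keyword appears to fetch the operational network topology from a member and count how often the node id appears in the raw response, comparing it with the value expected for a fully learned tree,2 topology. A rough sketch of that style of check (the RESTCONF path, port and credentials are placeholders, not this job's configuration):

```python
import requests

# Placeholder address of one cluster member; assumed RESTCONF path and
# credentials for illustration only.
BASE = "http://10.0.0.1:8181"

resp = requests.get(
    f"{BASE}/rests/data/network-topology:network-topology?content=nonconfig",
    auth=("admin", "admin"),
    timeout=10,
)
resp.raise_for_status()

count = resp.text.count("openflow:1")  # also matches openflow:1:1, openflow:1:LOCAL, ...
expected = 21                          # value the suite expects with all links learned
print(f"'openflow:1' occurs {count} times (expected {expected})")
assert count == expected, f"occurrence mismatch: {count} != {expected}"
```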
03:24:46 ------------------------------------------------------------------------------ 03:24:46 Restore Network and Verify After Fail :: Take the link up and veri... | FAIL | 03:24:46 Variable '${new_cluster_list}' not found. 03:24:46 ------------------------------------------------------------------------------ 03:24:46 Start Old Owner Instance :: Start old Owner Instance and verify it... | FAIL | 03:24:46 This test fails due to https://jira.opendaylight.org/browse/CONTROLLER-1849 03:24:46 03:24:46 Variable '${original_owner}' not found. 03:24:46 ------------------------------------------------------------------------------ 03:24:46 Check Shards Status After Recover :: Create original cluster list ... | FAIL | 03:26:16 Keyword 'ClusterOpenFlow.Check OpenFlow Shards Status' failed after retrying for 1 minute 30 seconds. The last error was: Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.... 03:26:16 [ Message content over the limit has been removed. 
] 03:26:16 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 183 (char 598) 03:26:16 ------------------------------------------------------------------------------ 03:26:16 Check Entity Owner Status After Recover :: Check Entity Owner Stat... | FAIL | 03:26:47 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the same as expected [2, 3] 03:26:47 Lengths are different: 2 != 0 03:26:47 ------------------------------------------------------------------------------ 03:26:47 Check Network Operational Information After Recover :: Check devic... | FAIL | 03:26:53 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 5 seconds.
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}]}]}}' contains 'openflow:1' 11 times, not 21 times. 03:26:53 ------------------------------------------------------------------------------ 03:26:53 Add Configuration In Owner and Verify After Recover :: Add Flow in... | FAIL | 03:26:53 Variable '${new_owner}' not found. 03:26:53 ------------------------------------------------------------------------------ 03:26:53 Modify Configuration In Owner and Verify After Recover :: Modify F... | FAIL | 03:26:53 Variable '${new_owner}' not found. 03:26:53 ------------------------------------------------------------------------------ 03:26:53 Delete Configuration In Owner and Verify After Recover :: Delete F... | FAIL | 03:26:53 Variable '${new_owner}' not found. 03:26:53 ------------------------------------------------------------------------------ 03:26:53 Add Configuration In Old Owner and Verify After Recover :: Add Flo... | FAIL | 03:26:53 Variable '${original_owner}' not found. 
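Most of the "Variable '${original_owner}' not found" / "Variable '${new_owner}' not found" failures in this suite are almost certainly downstream of a single test: the owner/successor discovery test failed before it could set those suite variables, so every later test that references them aborts immediately. The successor check itself compares the reported successor list with the expected one; an illustrative reconstruction of that comparison (all values below are made up for illustration, not taken from the run):

```python
# Illustrative reconstruction of the comparison behind
# "Successor list [] is not the same as expected [2, 3]": the expected
# successors are all cluster members except the owner; this run got an
# empty list back.
member_indices = [1, 2, 3]            # three-node cluster
reported_owner = 1                    # assumed owner index
reported_successors: list[int] = []   # what the failing check actually received

expected_successors = [m for m in member_indices if m != reported_owner]  # [2, 3]

if sorted(reported_successors) != sorted(expected_successors):
    raise AssertionError(
        f"Successor list {reported_successors} is not the same as expected "
        f"{expected_successors}. Lengths are different: "
        f"{len(expected_successors)} != {len(reported_successors)}"
    )
```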
03:26:53 ------------------------------------------------------------------------------ 03:26:53 Modify Configuration In Old Owner and Verify After Recover :: Modi... | FAIL | 03:26:53 Variable '${original_owner}' not found. 03:26:53 ------------------------------------------------------------------------------ 03:26:53 Delete Configuration In Old Owner and Verify After Recover :: Dele... | FAIL | 03:26:53 Variable '${original_owner}' not found. 03:26:53 ------------------------------------------------------------------------------ 03:26:53 Send RPC Add to Owner and Verify After Recover :: Add Flow in Owne... | FAIL | 03:26:53 Variable '${new_owner}' not found. 03:26:53 ------------------------------------------------------------------------------ 03:26:53 Send RPC Delete to Owner and Verify After Recover :: Delete Flow i... | FAIL | 03:26:53 Variable '${new_owner}' not found. 03:26:53 ------------------------------------------------------------------------------ 03:26:53 Send RPC Add to Old Owner and Verify After Recover :: Add Flow in ... | FAIL | 03:26:53 Variable '${original_owner}' not found. 03:26:53 ------------------------------------------------------------------------------ 03:26:53 Send RPC Delete to Old Owner and Verify After Recover :: Delete Fl... | FAIL | 03:26:53 Variable '${original_owner}' not found. 03:26:53 ------------------------------------------------------------------------------ 03:26:53 Modify Network and Verify After Recover :: Take a link down and ve... | FAIL | 03:27:14 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 20 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inven
tory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}]}]}}' contains 'openflow:1' 11 times, not 16 times. 03:27:14 ------------------------------------------------------------------------------ 03:27:14 Restore Network and Verify After Recover :: Take the link up and v... | FAIL | 03:27:24 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 10 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'open
flow:1:1\']"}]}]}]}}' contains 'openflow:1' 11 times, not 21 times. 03:27:24 ------------------------------------------------------------------------------ 03:27:24 Stop Mininet and Exit :: Stop mininet and exit connection. | PASS | 03:27:27 ------------------------------------------------------------------------------ 03:27:27 Check No Network Operational Information :: Check device is not in... | PASS | 03:27:27 ------------------------------------------------------------------------------ 03:27:27 openflowplugin-clustering.txt.Cluster HA Owner Failover :: Test su... | FAIL | 03:27:27 51 tests, 3 passed, 48 failed 03:27:27 ============================================================================== 03:27:27 openflowplugin-clustering.txt.Cluster HA Owner Restart :: Test suite for Cl... 03:27:27 ============================================================================== 03:27:30 Check Shards Status Before Stop :: Check Status for all shards in ... | FAIL | 03:27:31 Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttributes(ReadHandler.java:221)\\n\\tat org.jolokia.handler.ReadHandler.fetchAttributes(ReadHa... 03:27:31 [ Message content over the limit has been removed. 
] 03:27:31 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 183 (char 598) 03:27:31 ------------------------------------------------------------------------------ 03:27:31 Start Mininet Multiple Connections :: Start mininet tree,2 with co... | PASS | 03:27:42 ------------------------------------------------------------------------------ 03:27:42 Check Entity Owner Status And Find Owner and Successor Before Stop... | FAIL | 03:28:10 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the same as expected [2, 3] 03:28:10 Lengths are different: 2 != 0 03:28:10 ------------------------------------------------------------------------------ 03:28:10 Reconnect Extra Switches To Successors And Check OVS Connections :... | FAIL | 03:28:10 Variable '@{original_successor_list}' not found. 03:28:10 ------------------------------------------------------------------------------ 03:28:10 Check Network Operational Information Before Stop :: Check devices... | FAIL | 03:28:15 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 5 seconds.
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}],"link":[{"link-id":"openflow:2:3","source":{"source-node":"openflow:2","source-tp":"openflow:2:3"},"destination":{"dest-tp":"openflow:1:1","dest-node":"openflow:1"}},{"link-id":"openflow:3:3","source":{"source-node":"openflow:3","source-tp":"openflow:3:3"},"destination":{"dest-tp":"openflow:1:2","dest-node":"openflow:1"}}]}]}}' contains 'openflow:1' 15 times, not 21 times. 03:28:15 ------------------------------------------------------------------------------ 03:28:15 Add Configuration In Owner and Verify Before Stop :: Add Flow in O... | FAIL | 03:28:15 Variable '${original_owner}' not found. 03:28:15 ------------------------------------------------------------------------------ 03:28:15 Modify Configuration In Owner and Verify Before Stop :: Modify Flo... | FAIL | 03:28:15 Variable '${original_owner}' not found. 
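The shard-status checks above read each shard's MBean through Jolokia; a 404 with InstanceNotFoundException means the member-2-shard-inventory-operational MBean is simply not registered on that member, i.e. the operational datastore shard never came up there. A hedged sketch of such a read (host, port and credentials are placeholders; the MBean name is the one from the error above; the attribute name is an assumption):

```python
import json
import requests

MEMBER = "http://10.0.0.2:8181"  # placeholder for member-2
MBEAN = ("org.opendaylight.controller:Category=Shards,"
         "name=member-2-shard-inventory-operational,"
         "type=DistributedOperationalDatastore")

resp = requests.get(f"{MEMBER}/jolokia/read/{MBEAN}",
                    auth=("admin", "admin"), timeout=10)

# strict=False so a stacktrace containing raw control characters (the cause of
# the JSONDecodeError failures above) does not abort the parse.
body = json.loads(resp.text, strict=False)

if body.get("status") != 200:
    # Jolokia reports JMX errors in-band: InstanceNotFoundException with
    # status 404 means the shard MBean is not registered on that member.
    print("shard not readable:", body.get("error_type"))
else:
    print("RaftState:", body["value"].get("RaftState"))  # assumed attribute name
```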
03:28:15 ------------------------------------------------------------------------------ 03:28:15 Delete Configuration In Owner and Verify Before Stop :: Delete Flo... | FAIL | 03:28:15 Variable '${original_owner}' not found. 03:28:15 ------------------------------------------------------------------------------ 03:28:15 Add Configuration In Successor and Verify Before Stop :: Add Flow ... | FAIL | 03:28:15 Variable '${original_successor}' not found. 03:28:15 ------------------------------------------------------------------------------ 03:28:15 Modify Configuration In Successor and Verify Before Stop :: Modify... | FAIL | 03:28:15 Variable '${original_successor}' not found. 03:28:15 ------------------------------------------------------------------------------ 03:28:15 Delete Configuration In Successor and Verify Before Stop :: Delete... | FAIL | 03:28:15 Variable '${original_successor}' not found. 03:28:15 ------------------------------------------------------------------------------ 03:28:15 Send RPC Add to Owner and Verify Before Stop :: Add Flow in Owner ... | FAIL | 03:28:15 Variable '${original_owner}' not found. 03:28:15 ------------------------------------------------------------------------------ 03:28:15 Send RPC Delete to Owner and Verify Before Stop :: Delete Flow in ... | FAIL | 03:28:15 Variable '${original_owner}' not found. 03:28:15 ------------------------------------------------------------------------------ 03:28:15 Send RPC Add to Successor and Verify Before Stop :: Add Flow in Su... | FAIL | 03:28:15 Variable '${original_successor}' not found. 03:28:15 ------------------------------------------------------------------------------ 03:28:15 Send RPC Delete to Successor and Verify Before Stop :: Delete Flow... | FAIL | 03:28:15 Variable '${original_successor}' not found. 03:28:15 ------------------------------------------------------------------------------ 03:28:15 Modify Network And Verify Before Stop :: Take a link down and veri... | FAIL | 03:28:37 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 20 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}],"link":[{"link-id":"openflow:3:3","source":{"source-node":"openflow:3","source-tp":"openflow:3:3"},"destination":{"dest-tp":"openflow:1:2","dest-node":"openflow:1"}}]}]}}' contains 'openflow:1' 13 times, not 16 times. 03:28:37 ------------------------------------------------------------------------------ 03:28:37 Restore Network And Verify Before Stop :: Take the link up and ver... | FAIL | 03:28:48 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 10 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}],"link":[{"link-id":"openflow:2:3","source":{"source-node":"openflow:2","source-tp":"openflow:2:3"},"destination":{"dest-tp":"openflow:1:1","dest-node":"openflow:1"}},{"link-id":"openflow:3:3","source":{"source-node":"openflow:3","source-tp":"openflow:3:3"},"destination":{"dest-tp":"openflow:1:2","dest-node":"openflow:1"}}]}]}}' contains 'openflow:1' 15 times, not 21 times. 03:28:48 ------------------------------------------------------------------------------ 03:28:48 Stop Owner Instance :: Stop Owner Instance and verify it is dead | FAIL | 03:28:48 Variable '${original_owner}' not found. 03:28:48 ------------------------------------------------------------------------------ 03:28:48 Check Shards Status After Stop :: Create original cluster list and... | FAIL | 03:28:48 Variable '${new_cluster_list}' not found. 
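The failure messages above follow the pattern of Robot Framework's Wait Until Keyword Succeeds-style retries: the check is polled until its timeout expires and the last error is then reported ("failed after retrying for 20 seconds. The last error was: ..."). A generic Python analogue of that poll-until-deadline pattern, mirroring only the reported behaviour rather than the actual library implementation:

```python
import time
from typing import Callable

def wait_until(check: Callable[[], None], timeout_s: float, period_s: float = 2.0) -> None:
    """Poll `check` until it passes or the deadline expires, then surface the last error."""
    deadline = time.monotonic() + timeout_s
    last_error: Exception | None = None
    while time.monotonic() < deadline:
        try:
            check()
            return
        except Exception as exc:   # remember the most recent failure
            last_error = exc
            time.sleep(period_s)
    raise AssertionError(
        f"check failed after retrying for {timeout_s} seconds. "
        f"The last error was: {last_error}"
    )

# Hypothetical usage, reusing the occurrence check sketched earlier:
# wait_until(lambda: check_topology_occurrences(expected=16), timeout_s=20)
```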
03:28:48 ------------------------------------------------------------------------------ 03:28:48 Check Entity Owner Status And Find Owner and Successor After Stop ... | FAIL | 03:28:48 Variable '${original_successor}' not found. 03:28:48 ------------------------------------------------------------------------------ 03:28:48 Check Network Operational Information After Stop :: Check devices ... | FAIL | 03:28:48 Variable '${new_cluster_list}' not found. 03:28:48 ------------------------------------------------------------------------------ 03:28:48 Add Configuration In Owner and Verify After Stop :: Add Flow in Ow... | FAIL | 03:28:48 Variable '${new_owner}' not found. 03:28:48 ------------------------------------------------------------------------------ 03:28:48 Modify Configuration In Owner and Verify After Stop :: Modify Flow... | FAIL | 03:28:48 Variable '${new_owner}' not found. 03:28:48 ------------------------------------------------------------------------------ 03:28:48 Delete Configuration In Owner and Verify After Stop :: Delete Flow... | FAIL | 03:28:48 Variable '${new_owner}' not found. 03:28:48 ------------------------------------------------------------------------------ 03:28:48 Add Configuration In Successor and Verify After Stop :: Add Flow i... | FAIL | 03:28:48 Variable '${new_successor}' not found. 03:28:48 ------------------------------------------------------------------------------ 03:28:48 Modify Configuration In Successor and Verify After Stop :: Modify ... | FAIL | 03:28:48 Variable '${new_successor}' not found. 03:28:48 ------------------------------------------------------------------------------ 03:28:48 Delete Configuration In Successor and Verify After Stop :: Delete ... | FAIL | 03:28:48 Variable '${new_successor}' not found. 03:28:48 ------------------------------------------------------------------------------ 03:28:48 Send RPC Add to Owner and Verify After Stop :: Add Flow in Owner a... | FAIL | 03:28:48 Variable '${new_owner}' not found. 03:28:48 ------------------------------------------------------------------------------ 03:28:48 Send RPC Delete to Owner and Verify After Stop :: Delete Flow in O... | FAIL | 03:28:48 Variable '${new_owner}' not found. 03:28:48 ------------------------------------------------------------------------------ 03:28:48 Send RPC Add to Successor and Verify After Stop :: Add Flow in Suc... | FAIL | 03:28:48 Variable '${new_successor}' not found. 03:28:48 ------------------------------------------------------------------------------ 03:28:48 Send RPC Delete to Successor and Verify After Stop :: Delete Flow ... | FAIL | 03:28:48 Variable '${new_successor}' not found. 03:28:48 ------------------------------------------------------------------------------ 03:28:48 Modify Network and Verify After Stop :: Take a link down and verif... | FAIL | 03:28:48 Variable '${new_cluster_list}' not found. 03:28:48 ------------------------------------------------------------------------------ 03:28:48 Restore Network and Verify After Stop :: Take the link up and veri... | FAIL | 03:28:48 Variable '${new_cluster_list}' not found. 03:28:48 ------------------------------------------------------------------------------ 03:28:48 Start Old Owner Instance :: Start old Owner Instance and verify it... | FAIL | 03:28:48 Variable '${original_owner}' not found. 03:28:48 ------------------------------------------------------------------------------ 03:28:48 Check Shards Status After Start :: Create original cluster list an... 
| FAIL | 03:30:18 Keyword 'ClusterOpenFlow.Check OpenFlow Shards Status' failed after retrying for 1 minute 30 seconds. The last error was: Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.... 03:30:18 [ Message content over the limit has been removed. ] 03:30:18 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 183 (char 598) 03:30:18 ------------------------------------------------------------------------------ 03:30:18 Check Entity Owner Status After Start :: Check Entity Owner Status... | FAIL | 03:30:49 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. 
The last error was: Successor list [] is not the same as expected [2, 3] 03:30:49 Lengths are different: 2 != 0 03:30:49 ------------------------------------------------------------------------------ 03:30:49 Check Network Operational Information After Start :: Check devices... | FAIL | 03:30:55 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 5 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}],"link":[{"link-id":"openflow:2:3","source":{"source-node":"openflow:2","source-tp":"openflow:2:3"},"destination":{"dest-tp":"openflow:1:1","dest-node":"openflow:1"}},{"link-id":"openflow:3:3","source":{"source-node":"openflow:3","source-tp":"openflow:3:3"},"destination":{"dest-tp":"openflow:1:2","dest-node":"openflow:1"}}]}]}}' contains 'openflow:1' 15 times, not 21 times. 03:30:55 ------------------------------------------------------------------------------ 03:30:55 Add Configuration In Owner and Verify After Start :: Add Flow in O...
| FAIL | 03:30:55 Variable '${new_owner}' not found. 03:30:55 ------------------------------------------------------------------------------ 03:30:55 Modify Configuration In Owner and Verify After Start :: Modify Flo... | FAIL | 03:30:55 Variable '${new_owner}' not found. 03:30:55 ------------------------------------------------------------------------------ 03:30:55 Delete Configuration In Owner and Verify After Start :: Delete Flo... | FAIL | 03:30:55 Variable '${new_owner}' not found. 03:30:55 ------------------------------------------------------------------------------ 03:30:55 Add Configuration In Old Owner and Verify After Start :: Add Flow ... | FAIL | 03:30:55 Variable '${original_owner}' not found. 03:30:55 ------------------------------------------------------------------------------ 03:30:55 Modify Configuration In Old Owner and Verify After Start :: Modify... | FAIL | 03:30:55 Variable '${original_owner}' not found. 03:30:55 ------------------------------------------------------------------------------ 03:30:55 Delete Configuration In Old Owner and Verify After Start :: Delete... | FAIL | 03:30:55 Variable '${original_owner}' not found. 03:30:55 ------------------------------------------------------------------------------ 03:30:55 Send RPC Add to Owner and Verify After Start :: Add Flow in Owner ... | FAIL | 03:30:55 Variable '${new_owner}' not found. 03:30:55 ------------------------------------------------------------------------------ 03:30:55 Send RPC Delete to Owner and Verify After Start :: Delete Flow in ... | FAIL | 03:30:55 Variable '${new_owner}' not found. 03:30:55 ------------------------------------------------------------------------------ 03:30:55 Send RPC Add to Old Owner and Verify After Start :: Add Flow in Ow... | FAIL | 03:30:55 Variable '${original_owner}' not found. 03:30:55 ------------------------------------------------------------------------------ 03:30:55 Send RPC Delete to Old Owner and Verify After Start :: Delete Flow... | FAIL | 03:30:55 Variable '${original_owner}' not found. 03:30:55 ------------------------------------------------------------------------------ 03:30:55 Modify Network and Verify After Start :: Take a link down and veri... | FAIL | 03:31:16 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 20 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}],"link":[{"link-id":"openflow:3:3","source":{"source-node":"openflow:3","source-tp":"openflow:3:3"},"destination":{"dest-tp":"openflow:1:2","dest-node":"openflow:1"}}]}]}}' contains 'openflow:1' 13 times, not 16 times. 03:31:16 ------------------------------------------------------------------------------ 03:31:16 Restore Network and Verify After Start :: Take the link up and ver... | FAIL | 03:31:26 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 10 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:LOCAL\']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:1\']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:2\']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:2\']/node-connector[id=\'openflow:2:3\']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']","termination-point":[{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:LOCAL\']"},{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:1\']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:2\']"},{"tp-id":"openflow:3:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:3\']/node-connector[id=\'openflow:3:3\']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:2\']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:LOCAL\']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id=\'openflow:1\']/node-connector[id=\'openflow:1:1\']"}]}],"link":[{"link-id":"openflow:2:3","source":{"source-node":"openflow:2","source-tp":"openflow:2:3"},"destination":{"dest-tp":"openflow:1:1","dest-node":"openflow:1"}},{"link-id":"openflow:3:3","source":{"source-node":"openflow:3","source-tp":"openflow:3:3"},"destination":{"dest-tp":"openflow:1:2","dest-node":"openflow:1"}}]}]}}' contains 'openflow:1' 15 times, not 21 times. 03:31:26 ------------------------------------------------------------------------------ 03:31:26 Stop Mininet and Exit :: Stop mininet and exit connection. | PASS | 03:31:29 ------------------------------------------------------------------------------ 03:31:29 Check No Network Operational Information :: Check device is not in... | PASS | 03:31:29 ------------------------------------------------------------------------------ 03:31:29 openflowplugin-clustering.txt.Cluster HA Owner Restart :: Test sui... 
| FAIL | 03:31:29 51 tests, 3 passed, 48 failed 03:31:29 ============================================================================== 03:31:29 openflowplugin-clustering.txt.Cluster HA Data Recovery Leader Follower Fail... 03:31:29 ============================================================================== 03:31:32 Check Shards Status Before Leader Restart :: Check Status for all ... | FAIL | 03:31:33 Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttributes(ReadHandler.java:221)\\n\\tat org.jolokia.handler.ReadHandler.fetchAttributes(ReadHa... 03:31:33 [ Message content over the limit has been removed. 
] 03:31:33 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 183 (char 598) 03:31:33 ------------------------------------------------------------------------------ 03:31:33 Get inventory Leader Before Leader Restart :: Find leader in the i... | FAIL | 03:31:43 Keyword 'ClusterManagement.Get_Leader_And_Followers_For_Shard' failed after retrying for 10 seconds. The last error was: Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttr... 03:31:43 [ Message content over the limit has been removed. 
] 03:31:43 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 173 (char 568) 03:31:43 ------------------------------------------------------------------------------ 03:31:43 Start Mininet Connect To Follower Node1 :: Start mininet with conn... | FAIL | 03:31:44 Variable '${follower_node_1}' not found. 03:31:44 ------------------------------------------------------------------------------ 03:31:44 Add Flows In Follower Node2 and Verify Before Leader Restart :: Ad... | FAIL | 03:31:44 Variable '${follower_node_2}' not found. 03:31:44 ------------------------------------------------------------------------------ 03:31:44 Stop Mininet Connected To Follower Node1 and Exit :: Stop mininet ... | FAIL | 03:31:45 Variable '${mininet_conn_id}' not found. 03:31:45 ------------------------------------------------------------------------------ 03:31:45 Restart Leader From Cluster Node :: Stop Leader Node and Start it ... | FAIL | 03:31:45 Variable '${inventory_leader}' not found. 03:31:45 ------------------------------------------------------------------------------ 03:31:45 Get inventory Follower After Leader Restart :: Find new Followers ... | FAIL | 03:31:56 Keyword 'ClusterManagement.Get_Leader_And_Followers_For_Shard' failed after retrying for 10 seconds. 
The last error was: Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttr... 03:31:56 [ Message content over the limit has been removed. ] 03:31:56 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 173 (char 568) 03:31:56 ------------------------------------------------------------------------------ 03:31:56 Start Mininet Connect To Old Leader :: Start mininet with connecti... | FAIL | 03:31:56 Variable '${inventory_leader_old}' not found. 03:31:56 ------------------------------------------------------------------------------ 03:31:56 Verify Flows In Switch After Leader Restart :: Verify flows are in... 
| FAIL | 03:32:12 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 15 seconds. The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.171.230:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0?content=nonconfig 03:32:12 ------------------------------------------------------------------------------ 03:32:12 Stop Mininet Connected To Old Leader and Exit :: Stop mininet and ... | FAIL | 03:32:13 Variable '${mininet_conn_id}' not found. 03:32:13 ------------------------------------------------------------------------------ 03:32:13 Restart Follower Node2 :: Stop Follower Node2 and Start it Up, Ver... | FAIL | 03:32:13 Variable '${follower_node_2}' not found. 03:32:13 ------------------------------------------------------------------------------ 03:32:13 Get inventory Follower After Follower Restart :: Find Followers an... | FAIL | 03:32:24 Keyword 'ClusterManagement.Get_Leader_And_Followers_For_Shard' failed after retrying for 10 seconds. The last error was: Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttr... 03:32:24 [ Message content over the limit has been removed. 
] 03:32:24 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 173 (char 568) 03:32:24 ------------------------------------------------------------------------------ 03:32:24 Start Mininet Connect To Leader :: Start mininet with connection t... | FAIL | 03:32:24 Variable '${inventory_leader}' not found. 03:32:24 ------------------------------------------------------------------------------ 03:32:24 Verify Flows In Switch After Follower Restart :: Verify flows are ... | FAIL | 03:32:40 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 15 seconds. The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.171.230:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0?content=nonconfig 03:32:40 ------------------------------------------------------------------------------ 03:32:40 Stop Mininet Connected To Leader and Exit :: Stop mininet Connecte... | FAIL | 03:32:40 Variable '${mininet_conn_id}' not found. 03:32:40 ------------------------------------------------------------------------------ 03:32:40 Restart Full Cluster :: Stop all Cluster Nodes and Start it Up All. | PASS | 03:33:17 ------------------------------------------------------------------------------ 03:33:17 Get inventory Status After Cluster Restart :: Find New Followers a... | FAIL | 03:34:00 Keyword 'ClusterManagement.Get_Leader_And_Followers_For_Shard' failed after retrying for 10 seconds. 
The last error was: Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-config,type=DistributedConfigDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttr... 03:34:00 [ Message content over the limit has been removed. ] 03:34:00 ...lipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)\\n\\tat org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 173 (char 568) 03:34:00 ------------------------------------------------------------------------------ 03:34:00 Start Mininet Connect To Follower Node2 After Cluster Restart :: S... | FAIL | 03:34:00 Variable '${follower_node_2}' not found. 03:34:00 ------------------------------------------------------------------------------ 03:34:00 Verify Flows In Switch After Cluster Restart :: Verify flows are i... 
| FAIL | 03:34:17 Keyword 'ClusterManagement.Check_Item_Occurrence_Member_List_Or_All' failed after retrying for 15 seconds. The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.171.230:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0?content=nonconfig 03:34:17 ------------------------------------------------------------------------------ 03:34:17 Delete Flows In Follower Node1 and Verify After Leader Restart :: ... | FAIL | 03:34:17 Variable '${follower_node_1}' not found. 03:34:17 ------------------------------------------------------------------------------ 03:34:17 Stop Mininet Connected To Follower Node2 and Exit After Cluster Re... | FAIL | 03:34:18 Variable '${mininet_conn_id}' not found. 03:34:18 ------------------------------------------------------------------------------ 03:34:18 openflowplugin-clustering.txt.Cluster HA Data Recovery Leader Foll... | FAIL | 03:34:18 21 tests, 1 passed, 20 failed 03:34:18 ============================================================================== 03:34:18 /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/test/csit/libraries/VsctlListParser.py:61: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:34:18 if ctl_ref is not "": 03:34:18 openflowplugin-clustering.txt.010 Group Flows :: Switch connections and clu... 03:34:18 ============================================================================== 03:34:21 Add Groups And Flows :: Add 100 groups 1&2 and flows in every switch. | PASS | 03:34:25 ------------------------------------------------------------------------------ 03:34:25 Start Mininet Multiple Connections :: Start mininet linear with co... | PASS | 03:34:34 ------------------------------------------------------------------------------ 03:34:34 Check Linear Topology :: Check Linear Topology. | FAIL | 03:35:04 Keyword 'ClusterOpenFlow.Check Linear Topology On Member' failed after retrying for 30 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']"},{"node-id":"openflow:3","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']"},{"node-id":"openflow:1","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']"}]}]}}' does not contain '"source-tp":"openflow:1:2"' 03:35:04 ------------------------------------------------------------------------------ 03:35:04 Check Stats Are Not Frozen :: Check that duration flow stat is inc... | FAIL | 03:35:35 Keyword 'Check Flow Stats Are Not Frozen' failed after retrying for 30 seconds. The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.171.230:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0/flow=1?content=nonconfig 03:35:35 ------------------------------------------------------------------------------ 03:35:35 Check Flows In Operational DS :: Check Flows in operational DS. | FAIL | 03:35:46 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 10 seconds. The last error was: 3 != 303 03:35:46 ------------------------------------------------------------------------------ 03:35:46 Check Groups In Operational DS :: Check Groups in operational DS. | FAIL | 03:35:57 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. 
The last error was: 0 != 600 03:35:57 ------------------------------------------------------------------------------ 03:35:57 Check Flows In Switch :: Check Flows in switch. | FAIL | 03:35:57 3.0 != 303.0 03:35:57 ------------------------------------------------------------------------------ 03:35:57 Check Entity Owner Status And Find Owner and Successor Before Fail... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:35:57 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:35:57 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:36:28 | FAIL | 03:36:28 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 03:36:28 Lengths are different: 2 != 0 03:36:28 ------------------------------------------------------------------------------ 03:36:28 Disconnect Mininet From Owner :: Disconnect mininet from the owner :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:36:29 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:36:29 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:36:29 | FAIL | 03:36:29 Variable '${original_owner}' not found. 03:36:29 ------------------------------------------------------------------------------ 03:36:29 Check Entity Owner Status And Find Owner and Successor After Fail ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:36:29 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:36:29 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:36:39 | FAIL | 03:36:39 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Variable '${new_cluster_list}' not found. 03:36:39 ------------------------------------------------------------------------------ 03:36:39 Check Switch Moves To New Master :: Check switch s1 is connected t... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:36:39 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:36:40 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:36:40 | FAIL | 03:36:40 Variable '${new_owner}' not found. 03:36:40 ------------------------------------------------------------------------------ 03:36:40 Check Linear Topology After Disconnect :: Check Linear Topology. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:36:40 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:36:40 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:37:10 | FAIL | 03:37:10 Keyword 'ClusterOpenFlow.Check Linear Topology On Member' failed after retrying for 30 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']"},{"node-id":"openflow:3","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']"},{"node-id":"openflow:1","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']"}]}]}}' does not contain '"source-tp":"openflow:1:2"' 03:37:10 ------------------------------------------------------------------------------ 03:37:10 Check Stats Are Not Frozen After Disconnect :: Check that duration... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:37:10 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:37:10 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:37:41 | FAIL | 03:37:41 Keyword 'Check Flow Stats Are Not Frozen' failed after retrying for 30 seconds. The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.171.230:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0/flow=1?content=nonconfig 03:37:41 ------------------------------------------------------------------------------ 03:37:41 Remove Flows And Groups After Mininet Is Disconnected :: Remove 1 ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:37:41 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:37:41 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
03:37:41 | PASS | 03:37:41 ------------------------------------------------------------------------------ 03:37:41 Check Flows In Operational DS After Mininet Is Disconnected :: Che... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:37:42 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:37:42 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:38:12 | FAIL | 03:38:12 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: 3 != 300 03:38:12 ------------------------------------------------------------------------------ 03:38:12 Check Groups In Operational DS After Mininet Is Disconnected :: Ch... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:38:12 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:38:12 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:38:23 | FAIL | 03:38:23 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: 0 != 594 03:38:23 ------------------------------------------------------------------------------ 03:38:23 Check Flows In Switch After Mininet Is Disconnected :: Check Flows... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:38:23 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:38:23 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:38:23 | FAIL | 03:38:23 3.0 != 300.0 03:38:23 ------------------------------------------------------------------------------ 03:38:23 Reconnect Mininet To Owner :: Reconnect mininet to switch 1 owner. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:38:23 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:38:23 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:38:23 | FAIL | 03:38:23 Variable '${original_owner_list}' not found. 03:38:23 ------------------------------------------------------------------------------ 03:38:23 Check Entity Owner Status And Find Owner and Successor After Recon... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:38:24 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:38:24 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:38:55 | FAIL | 03:38:55 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 03:38:55 Lengths are different: 2 != 0 03:38:55 ------------------------------------------------------------------------------ 03:38:55 Add Flows And Groups After Owner Reconnect :: Add 1 group type 1&2... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:38:55 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:38:55 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:38:57 | PASS | 03:38:57 ------------------------------------------------------------------------------ 03:38:57 Check Stats Are Not Frozen After Owner Reconnect :: Check that dur... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:38:57 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:38:57 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:39:28 | FAIL | 03:39:28 Keyword 'Check Flow Stats Are Not Frozen' failed after retrying for 30 seconds. 
The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.171.230:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0/flow=1?content=nonconfig 03:39:28 ------------------------------------------------------------------------------ 03:39:28 Check Flows After Owner Reconnect In Operational DS :: Check Flows... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:39:28 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:39:28 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:39:58 | FAIL | 03:39:58 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: 3 != 303 03:39:58 ------------------------------------------------------------------------------ 03:39:58 Check Groups After Owner Reconnect In Operational DS :: Check Grou... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:39:58 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:39:58 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:09 | FAIL | 03:40:09 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: 0 != 600 03:40:09 ------------------------------------------------------------------------------ 03:40:09 Check Flows After Owner Reconnect In Switch :: Check Flows in switch. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:09 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:09 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:09 | FAIL | 03:40:09 3.0 != 303.0 03:40:09 ------------------------------------------------------------------------------ 03:40:09 Check Switches Generate Slave Connection :: Check switches are con... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:09 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:10 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:10 | FAIL | 03:40:10 Variable '${original_owner}' not found. 03:40:10 ------------------------------------------------------------------------------ 03:40:10 Disconnect Mininet From Successor :: Disconnect mininet from the S... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:10 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:10 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:10 | FAIL | 03:40:10 Variable '${new_successor_list}' not found. 03:40:10 ------------------------------------------------------------------------------ 03:40:10 Check Entity Owner Status And Find New Owner and Successor After D... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:10 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:10 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:20 | FAIL | 03:40:20 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Variable '${owner_list}' not found. 03:40:20 ------------------------------------------------------------------------------ 03:40:20 Disconnect Mininet From Current Owner :: Disconnect mininet from t... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:21 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:21 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
03:40:21 | FAIL | 03:40:21 Variable '${current_owner}' not found. 03:40:21 ------------------------------------------------------------------------------ 03:40:21 Check Entity Owner Status And Find Current Owner and Successor Aft... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:21 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:21 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:31 | FAIL | 03:40:31 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Variable '${original_owner_list}' not found. 03:40:31 ------------------------------------------------------------------------------ 03:40:31 Check Switch Moves To Current Master :: Check switch s1 is connect... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:31 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:32 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:32 | FAIL | 03:40:32 Variable '${current_new_owner}' not found. 03:40:32 ------------------------------------------------------------------------------ 03:40:32 Check Linear Topology After Owner Disconnect :: Check Linear Topol... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:32 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:40:32 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:41:02 | FAIL | 03:41:02 Keyword 'ClusterOpenFlow.Check Linear Topology On Member' failed after retrying for 30 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']"},{"node-id":"openflow:3","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']"},{"node-id":"openflow:1","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref"
:"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']"}]}]}}' does not contain '"source-tp":"openflow:1:2"' 03:41:02 ------------------------------------------------------------------------------ 03:41:02 Check Stats Are Not Frozen After Owner Disconnect :: Check that du... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:41:02 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:41:02 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:41:33 | FAIL | 03:41:33 Keyword 'Check Flow Stats Are Not Frozen' failed after retrying for 30 seconds. The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.171.230:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0/flow=1?content=nonconfig 03:41:33 ------------------------------------------------------------------------------ 03:41:33 Remove Flows And Groups After Owner Disconnected :: Remove 1 group... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:41:33 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:41:33 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:41:33 | PASS | 03:41:33 ------------------------------------------------------------------------------ 03:41:33 Check Flows In Operational DS After Owner Disconnected :: Check Fl... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:41:33 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:41:33 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:04 | FAIL | 03:42:04 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: 3 != 300 03:42:04 ------------------------------------------------------------------------------ 03:42:04 Check Groups In Operational DS After Owner Disconnected :: Check G... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:04 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:04 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:15 | FAIL | 03:42:15 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: 0 != 594 03:42:15 ------------------------------------------------------------------------------ 03:42:15 Check Flows In Switch After Owner Disconnected :: Check Flows in s... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:15 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:15 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:15 | FAIL | 03:42:15 3.0 != 300.0 03:42:15 ------------------------------------------------------------------------------ 03:42:15 Disconnect Mininet From Cluster :: Disconnect Mininet from Cluster. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:15 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:15 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:15 | FAIL | 03:42:15 Variable '${original_owner_list}' not found. 
03:42:15 ------------------------------------------------------------------------------ 03:42:15 Check No Switches After Disconnect :: Check no switches in topology. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:16 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:16 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:46 | FAIL | 03:42:46 Keyword 'ClusterOpenFlow.Check No Switches On Member' failed after retrying for 30 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']"},{"node-id":"openflow:3","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']"},{"node-id":"openflow:1","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']"}]}]}}' contains 'openflow:1' 03:42:46 ------------------------------------------------------------------------------ 03:42:46 Check Switch Is Not Connected :: Check switch s1 is not connected ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:47 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:47 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:57 | FAIL | 03:42:57 Keyword 'OvsManager.Should Be Disconnected' failed after retrying for 10 seconds. The last error was: Dictionary does not contain key 's1'. 
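The ':1: SyntaxWarning: "is not" with a literal' lines interleaved with nearly every test above trace back to the construct quoted earlier in this log at VsctlListParser.py line 61, if ctl_ref is not "":. 'is not' compares object identity rather than value, so the result depends on string interning, and CPython 3.8+ emits this warning every time the module (or an expression built from it) is compiled. A small illustrative rewrite follows; it shows the warning-free value comparison the check intends and is not necessarily the library's eventual fix.

    # Illustrative rewrite of the pattern flagged at VsctlListParser.py:61
    # ("if ctl_ref is not \"\":").  Identity comparison against a literal is
    # what triggers the SyntaxWarning repeated throughout this log.

    def has_controller_ref(ctl_ref: str) -> bool:
        # warning-prone form:   if ctl_ref is not "":
        # value comparison, no warning, same intent:
        return ctl_ref != ""          # or simply: bool(ctl_ref)

    if __name__ == "__main__":
        assert has_controller_ref("tcp:10.30.171.230:6653")
        assert not has_controller_ref("")

The warnings themselves do not fail any keyword; the 'Dictionary does not contain key s1' error above comes from the parsed ovs-vsctl output not listing bridge s1, not from the warning.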
03:42:57 ------------------------------------------------------------------------------ 03:42:57 Reconnect Mininet To Cluster :: Reconnect mininet to cluster by re... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:58 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:58 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:58 10.30.171.230 03:42:58 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:59 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:42:59 10.30.171.111 03:42:59 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:00 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:00 10.30.171.29 03:43:00 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:01 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:01 | PASS | 03:43:01 ------------------------------------------------------------------------------ 03:43:01 Check Linear Topology After Mininet Reconnects :: Check Linear Top... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:01 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:01 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:11 | FAIL | 03:43:11 Keyword 'ClusterOpenFlow.Check Linear Topology On Member' failed after retrying for 10 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']"},{"node-id":"openflow:3","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']"},{"node-id":"openflow:1","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-
inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']"}]}]}}' does not contain '"source-tp":"openflow:1:2"' 03:43:11 ------------------------------------------------------------------------------ 03:43:11 Add Flows And Groups After Mininet Reconnects :: Add 1 group type ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:12 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:12 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:13 | PASS | 03:43:13 ------------------------------------------------------------------------------ 03:43:13 Check Flows In Operational DS After Mininet Reconnects :: Check Fl... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:14 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:14 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:44 | FAIL | 03:43:44 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: 3 != 303 03:43:44 ------------------------------------------------------------------------------ 03:43:44 Check Groups In Operational DS After Mininet Reconnects :: Check G... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:44 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:44 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:55 | FAIL | 03:43:55 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: 0 != 600 03:43:55 ------------------------------------------------------------------------------ 03:43:55 Check Flows In Switch After Mininet Reconnects :: Check Flows in s... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:55 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:55 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:55 | FAIL | 03:43:55 3.0 != 303.0 03:43:55 ------------------------------------------------------------------------------ 03:43:55 Check Entity Owner Status And Find Owner and Successor Before Owne... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:56 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:43:56 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:27 | FAIL | 03:44:27 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 03:44:27 Lengths are different: 2 != 0 03:44:27 ------------------------------------------------------------------------------ 03:44:27 Check Switch Generates Slave Connection Before Owner Stop :: Check... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:27 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:27 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:27 | FAIL | 03:44:27 Variable '${original_successor}' not found. 03:44:27 ------------------------------------------------------------------------------ 03:44:27 Check Shards Status Before Owner Stop :: Check Status for all shar... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:27 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:27 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
03:44:28 | FAIL | 03:44:28 Evaluating expression 'json.loads(\'\'\'{\n "error": "javax.management.InstanceNotFoundException : org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "error_type": "javax.management.InstanceNotFoundException",\n "request": {\n "mbean": "org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore",\n "type": "read"\n },\n "stacktrace": "javax.management.InstanceNotFoundException: org.opendaylight.controller:Category=Shards,name=member-2-shard-inventory-operational,type=DistributedOperationalDatastore\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBean(DefaultMBeanServerInterceptor.java:1073)\\n\\tat java.management/com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.getMBeanInfo(DefaultMBeanServerInterceptor.java:1343)\\n\\tat java.management/com.sun.jmx.mbeanserver.JmxMBeanServer.getMBeanInfo(JmxMBeanServer.java:921)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:46)\\n\\tat org.jolokia.handler.ReadHandler$1.execute(ReadHandler.java:41)\\n\\tat org.jolokia.backend.executor.AbstractMBeanServerExecutor.call(AbstractMBeanServerExecutor.java:90)\\n\\tat org.jolokia.handler.ReadHandler.getMBeanInfo(ReadHandler.java:233)\\n\\tat org.jolokia.handler.ReadHandler.getAllAttributesNames(ReadHandler.java:245)\\n\\tat org.jolokia.handler.ReadHandler.resolveAttributes(ReadHandler.java:221)\\n\\tat org.jolokia.handler.ReadHandler.fetchAttributes(ReadHa... 03:44:28 [ Message content over the limit has been removed. ] 03:44:28 ...rvice.jetty.internal.PrioritizedHandlerCollection.handle(PrioritizedHandlerCollection.java:96)\\n\\tat org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)\\n\\tat org.eclipse.jetty.server.Server.handle(Server.java:516)\\n\\tat org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)\\n\\tat org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)\\n\\tat org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:479)\\n\\tat org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)\\n\\tat org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)\\n\\tat org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)\\n\\tat org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)\\n\\tat org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:137)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)\\n\\tat org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)\\n\\tat java.base/java.lang.Thread.run(Thread.java:1583)\\n",\n "status": 404\n}\n\'\'\')' failed: JSONDecodeError: Invalid control character at: line 8 column 183 (char 598) 03:44:28 ------------------------------------------------------------------------------ 03:44:28 Stop Owner Instance :: Stop Owner Instance and verify it is shutdown :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:28 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
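The "Check Shards Status Before Owner Stop" failure above is really a parsing failure rather than a shard failure: Jolokia answered with a 404 error document whose stacktrace string carries raw control characters, and the keyword's json.loads() call rejects those under its default strict mode. A minimal sketch of the difference, using a made-up stand-in payload rather than the real response:

```bash
# Made-up payload standing in for the Jolokia error body above; the raw tab (\t)
# inside the "stacktrace" string is what trips the default strict JSON parser.
python3 <<'EOF'
import json

raw = '{\n "error": "InstanceNotFoundException",\n "stacktrace": "at Foo.bar(Foo.java:1)\tcaused by ...",\n "status": 404\n}'

try:
    json.loads(raw)  # default strict=True -> "Invalid control character" error
except json.JSONDecodeError as exc:
    print("strict parse failed:", exc)

data = json.loads(raw, strict=False)  # strict=False permits control chars inside strings
print("status:", data["status"])      # -> status: 404
EOF
```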
03:44:28 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:28 | FAIL | 03:44:28 Variable '${original_owner}' not found. 03:44:28 ------------------------------------------------------------------------------ 03:44:28 Check Shards Status After Stop :: Check Status for all shards in O... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:28 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:28 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:28 | FAIL | 03:44:28 Variable '${new_cluster_list}' not found. 03:44:28 ------------------------------------------------------------------------------ 03:44:28 Check Entity Owner Status And Find Owner and Successor After Stop ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:28 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:28 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:29 | FAIL | 03:44:29 Variable '${original_successor}' not found. 03:44:29 ------------------------------------------------------------------------------ 03:44:29 Check Stats Are Not Frozen After Owner Stop :: Check that duration... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:29 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:29 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:59 | FAIL | 03:44:59 Keyword 'Check Flow Stats Are Not Frozen' failed after retrying for 30 seconds. The last error was: Variable '${new_owner}' not found. 03:44:59 ------------------------------------------------------------------------------ 03:44:59 Remove Configuration In Owner and Verify After Owner Stop :: Remov... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:59 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:59 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:44:59 | FAIL | 03:44:59 Variable '${new_owner}' not found. 03:44:59 ------------------------------------------------------------------------------ 03:44:59 Check Flows After Owner Stop In Operational DS :: Check Flows in O... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:45:00 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:45:00 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:45:30 | FAIL | 03:45:30 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: Variable '${new_owner}' not found. 03:45:30 ------------------------------------------------------------------------------ 03:45:30 Check Groups After Owner Stop In Operational DS :: Check Groups in... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:45:30 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:45:30 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:45:40 | FAIL | 03:45:40 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: Variable '${new_owner}' not found. 03:45:40 ------------------------------------------------------------------------------ 03:45:40 Check Flows In Switch After Owner Stop :: Check Flows in switch. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:45:41 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:45:41 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
03:45:41 | FAIL | 03:45:41 3.0 != 300.0 03:45:41 ------------------------------------------------------------------------------ 03:45:41 Start Old Owner Instance :: Start old Owner Instance and verify it... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:45:41 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:45:41 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:45:41 | FAIL | 03:45:41 Variable '${original_owner}' not found. 03:45:41 ------------------------------------------------------------------------------ 03:45:41 Check Entity Owner Status And Find Owner and Successor After Start... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:45:41 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:45:41 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:46:13 | FAIL | 03:46:13 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 03:46:13 Lengths are different: 2 != 0 03:46:13 ------------------------------------------------------------------------------ 03:46:13 Check Linear Topology After Owner Restart :: Check Linear Topology. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:46:13 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:46:13 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:46:23 | FAIL | 03:46:23 Keyword 'ClusterOpenFlow.Check Linear Topology On Member' failed after retrying for 10 seconds. The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']"},{"node-id":"openflow:3","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']"},{"node-id":"openflow:1","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id
='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']"}]}]}}' does not contain '"source-tp":"openflow:1:2"' 03:46:23 ------------------------------------------------------------------------------ 03:46:23 Add Configuration In Owner and Verify After Owner Restart :: Add 1... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:46:23 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:46:24 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:46:24 | FAIL | 03:46:24 Variable '${new_owner}' not found. 03:46:24 ------------------------------------------------------------------------------ 03:46:24 Check Stats Are Not Frozen After Owner Restart :: Check that durat... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:46:24 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:46:24 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:46:24 | FAIL | 03:46:24 Variable '${new_owner}' not found. 03:46:24 ------------------------------------------------------------------------------ 03:46:24 Check Flows In Operational DS After Owner Restart :: Check Flows i... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:46:24 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:46:24 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:46:54 | FAIL | 03:46:54 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: 3 != 303 03:46:54 ------------------------------------------------------------------------------ 03:46:54 Check Groups In Operational DS After Owner Restart :: Check Groups... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:46:54 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:46:55 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:47:05 | FAIL | 03:47:05 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: 0 != 600 03:47:05 ------------------------------------------------------------------------------ 03:47:05 Check Flows In Switch After Owner Restart :: Check Flows in switch. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:47:05 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:47:05 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:47:05 | FAIL | 03:47:05 3.0 != 303.0 03:47:05 ------------------------------------------------------------------------------ 03:47:05 Restart Cluster :: Stop and Start cluster. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:47:05 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:47:05 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:47:06 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:47:06 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:47:08 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:47:08 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
03:47:09 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="?
03:47:09 [ Repeated SyntaxWarning messages removed. ]
03:47:31 | PASS |
03:47:31 ------------------------------------------------------------------------------
03:47:31 Check Linear Topology After Controller Restarts :: Check Linear To...
03:47:42 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="?
03:47:42 [ Repeated SyntaxWarning messages removed. ]
03:53:06 | FAIL |
03:53:06 Keyword 'ClusterOpenFlow.Check Linear Topology On Member' failed after retrying for 5 minutes.
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']"},{"node-id":"openflow:3","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']"},{"node-id":"openflow:1","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}],"opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']"}]}]}}' does not contain '"source-tp":"openflow:1:2"' 03:53:06 ------------------------------------------------------------------------------ 03:53:06 Check Stats Are Not Frozen After Cluster Restart :: Check that dur... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:53:06 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:53:06 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:53:37 | FAIL | 03:53:37 Keyword 'Check Flow Stats Are Not Frozen' failed after retrying for 30 seconds. The last error was: HTTPError: 409 Client Error: Conflict for url: http://10.30.171.230:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0/flow=1?content=nonconfig 03:53:37 ------------------------------------------------------------------------------ 03:53:37 Check Flows In Operational DS After Controller Restarts :: Check F... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:53:37 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:53:37 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
03:54:08 | FAIL | 03:54:08 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: 3 != 303 03:54:08 ------------------------------------------------------------------------------ 03:54:08 Check Groups In Operational DS After Controller Restarts :: Check ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:54:08 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:54:08 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:54:19 | FAIL | 03:54:19 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: 192 != 600 03:54:19 ------------------------------------------------------------------------------ 03:54:19 Check Flows In Switch After Controller Restarts :: Check Flows in ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:54:19 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:54:19 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:54:19 | FAIL | 03:54:19 3.0 != 303.0 03:54:19 ------------------------------------------------------------------------------ 03:54:19 Stop Mininet :: Stop Mininet. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:54:19 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:54:19 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:54:20 | PASS | 03:54:20 ------------------------------------------------------------------------------ 03:54:20 Check No Switches :: Check no switches in topology. | PASS | 03:54:21 ------------------------------------------------------------------------------ 03:54:23 openflowplugin-clustering.txt.010 Group Flows :: Switch connection... | FAIL | 03:54:23 72 tests, 10 passed, 62 failed 03:54:23 ============================================================================== 03:54:23 openflowplugin-clustering.txt.010 Switch Disconnect :: Test suite for entit... 03:54:23 ============================================================================== 03:54:28 Switches To Be Connected To All Nodes :: Initial check for correct... | FAIL | 03:54:28 Parent suite setup failed: 03:54:28 Dictionary does not contain key 's1'. 03:54:28 ------------------------------------------------------------------------------ 03:54:28 Reconnecting Switch s1 | FAIL | 03:54:28 Parent suite setup failed: 03:54:28 Dictionary does not contain key 's1'. 03:54:28 ------------------------------------------------------------------------------ 03:54:28 Switches Still Be Connected To All Nodes | FAIL | 03:54:28 Parent suite setup failed: 03:54:28 Dictionary does not contain key 's1'. 03:54:28 ------------------------------------------------------------------------------ 03:54:28 openflowplugin-clustering.txt.010 Switch Disconnect :: Test suite ... | FAIL | 03:54:28 Suite setup failed: 03:54:28 Dictionary does not contain key 's1'. 03:54:28 03:54:28 3 tests, 0 passed, 3 failed 03:54:28 ============================================================================== 03:54:28 openflowplugin-clustering.txt.020 Cluster Node Failure :: Test suite for en... 03:54:28 ============================================================================== 03:54:32 Switches To Be Connected To All Nodes :: Initial check for correct... | FAIL | 03:54:32 Parent suite setup failed: 03:54:32 Dictionary does not contain key 's1'. 
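Every test in the Switch Disconnect and Cluster Node Failure suites above fails in suite setup with "Dictionary does not contain key 's1'", i.e. the switch-to-owner mapping never picked up switch s1, which usually means the switch did not (re)connect to any controller. A rough manual check from the Mininet/tools VM (the host variable, OpenFlow version, and port are assumptions, not taken from this job's output):

```bash
# Hypothetical TOOLS_SYSTEM_IP; confirm s1 exists, points at the controllers,
# and answers an OpenFlow probe before blaming the ownership service.
ssh "${TOOLS_SYSTEM_IP}" 'sudo ovs-vsctl list-br'            # expect s1 s2 s3 for a 3-switch linear topo
ssh "${TOOLS_SYSTEM_IP}" 'sudo ovs-vsctl get-controller s1'  # expect tcp:<odl-ip>:6633 entries
ssh "${TOOLS_SYSTEM_IP}" 'sudo ovs-ofctl -O OpenFlow13 show s1 | head -n 1'
```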
03:54:32 ------------------------------------------------------------------------------ 03:54:32 Restarting Owner Of Switch s1 | FAIL | 03:54:32 Parent suite setup failed: 03:54:32 Dictionary does not contain key 's1'. 03:54:32 ------------------------------------------------------------------------------ 03:54:32 Switches Still Be Connected To All Nodes | FAIL | 03:54:32 Parent suite setup failed: 03:54:32 Dictionary does not contain key 's1'. 03:54:32 ------------------------------------------------------------------------------ 03:54:32 openflowplugin-clustering.txt.020 Cluster Node Failure :: Test sui... | FAIL | 03:54:32 Suite setup failed: 03:54:32 Dictionary does not contain key 's1'. 03:54:32 03:54:32 3 tests, 0 passed, 3 failed 03:54:32 ============================================================================== 03:54:32 openflowplugin-clustering.txt.030 Cluster Sync Problems :: Test suite for e... 03:54:32 ============================================================================== 03:54:34 Start Mininet To All Nodes | FAIL | 03:54:37 Dictionary does not contain key 's1'. 03:54:37 ------------------------------------------------------------------------------ 03:54:37 Switches To Be Connected To All Nodes :: Initial check for correct... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:54:37 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:54:37 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:54:53 | FAIL | 03:54:53 Keyword 'Check All Switches Connected To All Cluster Nodes' failed after retrying 15 times. The last error was: Dictionary does not contain key 's1'. 03:54:53 ------------------------------------------------------------------------------ 03:54:53 Isolating Owner Of Switch s1 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:54:53 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:54:53 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:55:24 | FAIL | 03:55:24 This test fails due to https://bugs.opendaylight.org/show_bug.cgi?id=6177 03:55:24 03:55:24 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Could not parse owner and candidates for device openflow:1 03:55:24 ------------------------------------------------------------------------------ 03:55:24 Switches Still Be Connected To All Nodes :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:55:24 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:55:24 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:55:40 | FAIL | 03:55:40 This test fails due to https://bugs.opendaylight.org/show_bug.cgi?id=6177 03:55:40 03:55:40 Keyword 'Check All Switches Connected To All Cluster Nodes' failed after retrying 15 times. The last error was: Dictionary does not contain key 's1'. 03:55:40 ------------------------------------------------------------------------------ 03:55:40 Stop Mininet And Verify No Owners :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:55:40 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:55:40 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 03:55:55 | FAIL | 03:55:55 This test fails due to https://bugs.opendaylight.org/show_bug.cgi?id=6177 03:55:55 03:55:55 Keyword 'Check No Device Owners In Controller' failed after retrying 15 times. The last error was: Dictionary does not contain key '1'. 
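The recurring "does not contain '"source-tp":"openflow:1:2"'" failures earlier in the run show an operational flow:1 topology that has all nodes and termination points but no link entries. A quick way to re-probe the same RESTCONF resource by hand (default admin credentials and the exact response wrapping are assumptions; the path style mirrors the requests already seen in this log):

```bash
# Count link entries in the operational flow:1 topology; 0 links with all nodes
# present matches the topology-check failures above.
curl -s -u admin:admin \
  "http://10.30.171.230:8181/rests/data/network-topology:network-topology/topology=flow%3A1?content=nonconfig" \
  | python3 -c 'import json,sys; t=json.load(sys.stdin)["network-topology:topology"][0]; print(len(t.get("link", [])))'
```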
03:55:55 ------------------------------------------------------------------------------ 03:55:57 openflowplugin-clustering.txt.030 Cluster Sync Problems :: Test su... | FAIL | 03:55:57 5 tests, 0 passed, 5 failed 03:55:57 ============================================================================== 03:55:57 openflowplugin-clustering.txt.9145 :: Switch connections and cluster are re... 03:55:57 ============================================================================== 03:55:57 Start Mininet Multiple Connections :: Start mininet linear with co... | PASS | 03:56:05 ------------------------------------------------------------------------------ 03:56:05 Check Entity Owner Status And Find Owner and Successor :: Check En... | FAIL | 03:56:36 This test fails due to https://bugs.opendaylight.org/show_bug.cgi?id=9145 03:56:36 03:56:36 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Successor list [] is not the came as expected [2, 3] 03:56:36 Lengths are different: 2 != 0 03:56:36 ------------------------------------------------------------------------------ 03:56:36 Stop Mininet :: Stop Mininet. | PASS | 03:56:36 ------------------------------------------------------------------------------ 03:56:36 openflowplugin-clustering.txt.9145 :: Switch connections and clust... | FAIL | 03:56:36 3 tests, 2 passed, 1 failed 03:56:36 ============================================================================== 03:56:36 openflowplugin-clustering.txt | FAIL | 03:56:36 209 tests, 19 passed, 190 failed 03:56:36 ============================================================================== 03:56:36 Output: /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/output.xml 03:56:43 Log: /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/log.html 03:56:43 Report: /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/report.html 03:56:44 + true 03:56:44 + echo 'Examining the files in data/log and checking filesize' 03:56:44 Examining the files in data/log and checking filesize 03:56:44 + ssh 10.30.171.230 'ls -altr /tmp/karaf-0.22.1/data/log/' 03:56:44 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:56:44 total 1560 03:56:44 drwxrwxr-x 2 jenkins jenkins 4096 Aug 23 03:20 . 03:56:44 -rw-rw-r-- 1 jenkins jenkins 1720 Aug 23 03:20 karaf_console.log 03:56:44 drwxrwxr-x 9 jenkins jenkins 4096 Aug 23 03:20 .. 03:56:44 -rw-rw-r-- 1 jenkins jenkins 1581710 Aug 23 03:56 karaf.log 03:56:44 + ssh 10.30.171.230 'du -hs /tmp/karaf-0.22.1/data/log/*' 03:56:44 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:56:44 1.6M /tmp/karaf-0.22.1/data/log/karaf.log 03:56:44 4.0K /tmp/karaf-0.22.1/data/log/karaf_console.log 03:56:44 + ssh 10.30.171.111 'ls -altr /tmp/karaf-0.22.1/data/log/' 03:56:44 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:56:44 total 1208 03:56:44 drwxrwxr-x 2 jenkins jenkins 4096 Aug 23 03:19 . 03:56:44 -rw-rw-r-- 1 jenkins jenkins 1720 Aug 23 03:20 karaf_console.log 03:56:44 drwxrwxr-x 9 jenkins jenkins 4096 Aug 23 03:20 .. 03:56:44 -rw-rw-r-- 1 jenkins jenkins 1223125 Aug 23 03:56 karaf.log 03:56:44 + ssh 10.30.171.111 'du -hs /tmp/karaf-0.22.1/data/log/*' 03:56:44 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 
03:56:45 1.2M /tmp/karaf-0.22.1/data/log/karaf.log 03:56:45 4.0K /tmp/karaf-0.22.1/data/log/karaf_console.log 03:56:45 + ssh 10.30.171.29 'ls -altr /tmp/karaf-0.22.1/data/log/' 03:56:45 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:56:45 total 1092 03:56:45 drwxrwxr-x 2 jenkins jenkins 4096 Aug 23 03:19 . 03:56:45 -rw-rw-r-- 1 jenkins jenkins 1720 Aug 23 03:20 karaf_console.log 03:56:45 drwxrwxr-x 9 jenkins jenkins 4096 Aug 23 03:20 .. 03:56:45 -rw-rw-r-- 1 jenkins jenkins 1102427 Aug 23 03:56 karaf.log 03:56:45 + ssh 10.30.171.29 'du -hs /tmp/karaf-0.22.1/data/log/*' 03:56:45 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:56:45 1.1M /tmp/karaf-0.22.1/data/log/karaf.log 03:56:45 4.0K /tmp/karaf-0.22.1/data/log/karaf_console.log 03:56:45 + set +e 03:56:45 ++ seq 1 3 03:56:45 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 03:56:45 + CONTROLLERIP=ODL_SYSTEM_1_IP 03:56:45 + echo 'Let'\''s take the karaf thread dump again' 03:56:45 Let's take the karaf thread dump again 03:56:45 + ssh 10.30.171.230 'sudo ps aux' 03:56:45 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:56:45 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/ps_after.log 03:56:45 ++ grep -v grep 03:56:45 ++ tr -s ' ' 03:56:45 ++ cut -f2 '-d ' 03:56:45 + pid=6523 03:56:45 + echo 'karaf main: org.apache.karaf.main.Main, pid:6523' 03:56:45 karaf main: org.apache.karaf.main.Main, pid:6523 03:56:45 + ssh 10.30.171.230 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 6523' 03:56:45 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:56:46 + echo 'killing karaf process...' 03:56:46 killing karaf process... 03:56:46 + ssh 10.30.171.230 bash -c 'ps axf | grep karaf | grep -v grep | awk '\''{print "kill -9 " $1}'\'' | sh' 03:56:46 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:56:46 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 03:56:46 + CONTROLLERIP=ODL_SYSTEM_2_IP 03:56:46 + echo 'Let'\''s take the karaf thread dump again' 03:56:46 Let's take the karaf thread dump again 03:56:46 + ssh 10.30.171.111 'sudo ps aux' 03:56:46 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:56:46 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/ps_after.log 03:56:46 ++ grep -v grep 03:56:46 ++ tr -s ' ' 03:56:46 ++ cut -f2 '-d ' 03:56:46 + pid=5935 03:56:46 + echo 'karaf main: org.apache.karaf.main.Main, pid:5935' 03:56:46 karaf main: org.apache.karaf.main.Main, pid:5935 03:56:46 + ssh 10.30.171.111 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 5935' 03:56:46 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:56:47 + echo 'killing karaf process...' 03:56:47 killing karaf process... 03:56:47 + ssh 10.30.171.111 bash -c 'ps axf | grep karaf | grep -v grep | awk '\''{print "kill -9 " $1}'\'' | sh' 03:56:47 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:56:47 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 03:56:47 + CONTROLLERIP=ODL_SYSTEM_3_IP 03:56:47 + echo 'Let'\''s take the karaf thread dump again' 03:56:47 Let's take the karaf thread dump again 03:56:47 + ssh 10.30.171.29 'sudo ps aux' 03:56:47 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 
03:56:47 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/ps_after.log 03:56:47 ++ grep -v grep 03:56:47 ++ tr -s ' ' 03:56:47 ++ cut -f2 '-d ' 03:56:47 + pid=5871 03:56:47 + echo 'karaf main: org.apache.karaf.main.Main, pid:5871' 03:56:47 karaf main: org.apache.karaf.main.Main, pid:5871 03:56:47 + ssh 10.30.171.29 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 5871' 03:56:47 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:56:48 + echo 'killing karaf process...' 03:56:48 killing karaf process... 03:56:48 + ssh 10.30.171.29 bash -c 'ps axf | grep karaf | grep -v grep | awk '\''{print "kill -9 " $1}'\'' | sh' 03:56:48 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:56:48 + sleep 5 03:56:53 ++ seq 1 3 03:56:53 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 03:56:53 + CONTROLLERIP=ODL_SYSTEM_1_IP 03:56:53 + echo 'Compressing karaf.log 1' 03:56:53 Compressing karaf.log 1 03:56:53 + ssh 10.30.171.230 gzip --best /tmp/karaf-0.22.1/data/log/karaf.log 03:56:53 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:56:53 + echo 'Fetching compressed karaf.log 1' 03:56:53 Fetching compressed karaf.log 1 03:56:53 + scp 10.30.171.230:/tmp/karaf-0.22.1/data/log/karaf.log.gz odl1_karaf.log.gz 03:56:53 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:56:53 + ssh 10.30.171.230 rm -f /tmp/karaf-0.22.1/data/log/karaf.log.gz 03:56:54 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:56:54 + scp 10.30.171.230:/tmp/karaf-0.22.1/data/log/karaf_console.log odl1_karaf_console.log 03:56:54 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:56:54 + ssh 10.30.171.230 rm -f /tmp/karaf-0.22.1/data/log/karaf_console.log 03:56:54 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:56:54 + echo 'Fetch GC logs' 03:56:54 Fetch GC logs 03:56:54 + mkdir -p gclogs-1 03:56:54 + scp '10.30.171.230:/tmp/karaf-0.22.1/data/log/*.log' gclogs-1/ 03:56:54 Warning: Permanently added '10.30.171.230' (ECDSA) to the list of known hosts. 03:56:54 scp: /tmp/karaf-0.22.1/data/log/*.log: No such file or directory 03:56:54 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 03:56:54 + CONTROLLERIP=ODL_SYSTEM_2_IP 03:56:54 + echo 'Compressing karaf.log 2' 03:56:54 Compressing karaf.log 2 03:56:54 + ssh 10.30.171.111 gzip --best /tmp/karaf-0.22.1/data/log/karaf.log 03:56:54 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:56:55 + echo 'Fetching compressed karaf.log 2' 03:56:55 Fetching compressed karaf.log 2 03:56:55 + scp 10.30.171.111:/tmp/karaf-0.22.1/data/log/karaf.log.gz odl2_karaf.log.gz 03:56:55 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:56:55 + ssh 10.30.171.111 rm -f /tmp/karaf-0.22.1/data/log/karaf.log.gz 03:56:55 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:56:55 + scp 10.30.171.111:/tmp/karaf-0.22.1/data/log/karaf_console.log odl2_karaf_console.log 03:56:55 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:56:55 + ssh 10.30.171.111 rm -f /tmp/karaf-0.22.1/data/log/karaf_console.log 03:56:55 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 
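The teardown traced above repeats the same per-node pattern three times: grab a ps capture, pick out the karaf PID, take a jstack thread dump, kill the process, then gzip karaf.log and copy it off the node. A condensed sketch of that loop (IP variables, the JDK path, and file names are carried over from this job's output; the real job splits the kill and the log collection into two passes):

```bash
# Condensed per-node teardown: thread dump, kill karaf, collect compressed logs.
for i in 1 2 3; do
    ip_var="ODL_SYSTEM_${i}_IP"; ip="${!ip_var}"
    ssh "${ip}" 'ps aux' > ps_after.log
    pid=$(grep org.apache.karaf.main.Main ps_after.log | grep -v grep | tr -s ' ' | cut -d ' ' -f 2)
    ssh "${ip}" "/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l ${pid}" > "karaf_${i}_${pid}_threads_after.log"
    ssh "${ip}" 'ps axf | grep karaf | grep -v grep | awk '\''{print "kill -9 " $1}'\'' | sh'
    ssh "${ip}" gzip --best /tmp/karaf-0.22.1/data/log/karaf.log
    scp "${ip}:/tmp/karaf-0.22.1/data/log/karaf.log.gz" "odl${i}_karaf.log.gz"
done
```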
03:56:55 + echo 'Fetch GC logs' 03:56:55 Fetch GC logs 03:56:55 + mkdir -p gclogs-2 03:56:55 + scp '10.30.171.111:/tmp/karaf-0.22.1/data/log/*.log' gclogs-2/ 03:56:55 Warning: Permanently added '10.30.171.111' (ECDSA) to the list of known hosts. 03:56:56 scp: /tmp/karaf-0.22.1/data/log/*.log: No such file or directory 03:56:56 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 03:56:56 + CONTROLLERIP=ODL_SYSTEM_3_IP 03:56:56 + echo 'Compressing karaf.log 3' 03:56:56 Compressing karaf.log 3 03:56:56 + ssh 10.30.171.29 gzip --best /tmp/karaf-0.22.1/data/log/karaf.log 03:56:56 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:56:56 + echo 'Fetching compressed karaf.log 3' 03:56:56 Fetching compressed karaf.log 3 03:56:56 + scp 10.30.171.29:/tmp/karaf-0.22.1/data/log/karaf.log.gz odl3_karaf.log.gz 03:56:56 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:56:56 + ssh 10.30.171.29 rm -f /tmp/karaf-0.22.1/data/log/karaf.log.gz 03:56:56 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:56:56 + scp 10.30.171.29:/tmp/karaf-0.22.1/data/log/karaf_console.log odl3_karaf_console.log 03:56:56 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:56:57 + ssh 10.30.171.29 rm -f /tmp/karaf-0.22.1/data/log/karaf_console.log 03:56:57 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:56:57 + echo 'Fetch GC logs' 03:56:57 Fetch GC logs 03:56:57 + mkdir -p gclogs-3 03:56:57 + scp '10.30.171.29:/tmp/karaf-0.22.1/data/log/*.log' gclogs-3/ 03:56:57 Warning: Permanently added '10.30.171.29' (ECDSA) to the list of known hosts. 03:56:57 scp: /tmp/karaf-0.22.1/data/log/*.log: No such file or directory 03:56:57 + echo 'Examine copied files' 03:56:57 Examine copied files 03:56:57 + ls -lt 03:56:57 total 118148 03:56:57 drwxrwxr-x. 2 jenkins jenkins 6 Aug 23 03:56 gclogs-3 03:56:57 -rw-rw-r--. 1 jenkins jenkins 1720 Aug 23 03:56 odl3_karaf_console.log 03:56:57 -rw-rw-r--. 1 jenkins jenkins 93599 Aug 23 03:56 odl3_karaf.log.gz 03:56:57 drwxrwxr-x. 2 jenkins jenkins 6 Aug 23 03:56 gclogs-2 03:56:57 -rw-rw-r--. 1 jenkins jenkins 1720 Aug 23 03:56 odl2_karaf_console.log 03:56:57 -rw-rw-r--. 1 jenkins jenkins 95943 Aug 23 03:56 odl2_karaf.log.gz 03:56:57 drwxrwxr-x. 2 jenkins jenkins 6 Aug 23 03:56 gclogs-1 03:56:57 -rw-rw-r--. 1 jenkins jenkins 1720 Aug 23 03:56 odl1_karaf_console.log 03:56:57 -rw-rw-r--. 1 jenkins jenkins 103159 Aug 23 03:56 odl1_karaf.log.gz 03:56:57 -rw-rw-r--. 1 jenkins jenkins 131124 Aug 23 03:56 karaf_3_5871_threads_after.log 03:56:57 -rw-rw-r--. 1 jenkins jenkins 13429 Aug 23 03:56 ps_after.log 03:56:57 -rw-rw-r--. 1 jenkins jenkins 137725 Aug 23 03:56 karaf_2_5935_threads_after.log 03:56:57 -rw-rw-r--. 1 jenkins jenkins 143509 Aug 23 03:56 karaf_1_6523_threads_after.log 03:56:57 -rw-rw-r--. 1 jenkins jenkins 287266 Aug 23 03:56 report.html 03:56:57 -rw-rw-r--. 1 jenkins jenkins 2799691 Aug 23 03:56 log.html 03:56:57 -rw-rw-r--. 1 jenkins jenkins 116773211 Aug 23 03:56 output.xml 03:56:57 -rw-rw-r--. 1 jenkins jenkins 1180 Aug 23 03:23 testplan.txt 03:56:57 -rw-rw-r--. 1 jenkins jenkins 95200 Aug 23 03:23 karaf_3_2123_threads_before.log 03:56:57 -rw-rw-r--. 1 jenkins jenkins 13871 Aug 23 03:23 ps_before.log 03:56:57 -rw-rw-r--. 1 jenkins jenkins 95208 Aug 23 03:23 karaf_2_2114_threads_before.log 03:56:57 -rw-rw-r--. 1 jenkins jenkins 96054 Aug 23 03:23 karaf_1_2223_threads_before.log 03:56:57 -rw-rw-r--. 
1 jenkins jenkins 3043 Aug 23 03:19 post-startup-script.sh 03:56:57 -rw-rw-r--. 1 jenkins jenkins 1183 Aug 23 03:19 set_akka_debug.sh 03:56:57 -rw-rw-r--. 1 jenkins jenkins 133 Aug 23 03:19 configplan.txt 03:56:57 -rw-rw-r--. 1 jenkins jenkins 225 Aug 23 03:19 startup-script.sh 03:56:57 -rw-rw-r--. 1 jenkins jenkins 3290 Aug 23 03:19 configuration-script.sh 03:56:57 -rw-rw-r--. 1 jenkins jenkins 266 Aug 23 03:19 detect_variables.env 03:56:57 -rw-rw-r--. 1 jenkins jenkins 92 Aug 23 03:19 set_variables.env 03:56:57 -rw-rw-r--. 1 jenkins jenkins 358 Aug 23 03:19 slave_addresses.txt 03:56:57 -rw-rw-r--. 1 jenkins jenkins 570 Aug 23 03:19 requirements.txt 03:56:57 -rw-rw-r--. 1 jenkins jenkins 26 Aug 23 03:19 env.properties 03:56:57 -rw-rw-r--. 1 jenkins jenkins 334 Aug 23 03:17 stack-parameters.yaml 03:56:57 drwxrwxr-x. 7 jenkins jenkins 4096 Aug 23 03:16 test 03:56:57 drwxrwxr-x. 2 jenkins jenkins 6 Aug 23 03:16 test@tmp 03:56:57 + true 03:56:57 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/sh /tmp/jenkins2498407010154260916.sh 03:56:57 Cleaning up Robot installation... 03:56:58 $ ssh-agent -k 03:56:58 unset SSH_AUTH_SOCK; 03:56:58 unset SSH_AGENT_PID; 03:56:58 echo Agent pid 5299 killed; 03:56:58 [ssh-agent] Stopped. 03:56:58 Recording plot data 03:56:58 Robot results publisher started... 03:56:58 INFO: Checking test criticality is deprecated and will be dropped in a future release! 03:56:58 -Parsing output xml: 03:57:00 Done! 03:57:00 -Copying log files to build dir: 03:57:04 Done! 03:57:04 -Assigning results to build: 03:57:04 Done! 03:57:04 -Checking thresholds: 03:57:04 Done! 03:57:04 Done publishing Robot results. 03:57:04 Build step 'Publish Robot Framework test results' changed build result to UNSTABLE 03:57:04 [PostBuildScript] - [INFO] Executing post build scripts. 03:57:04 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins8348705369692862087.sh 03:57:04 Archiving csit artifacts 03:57:04 mv: cannot stat '*_1.png': No such file or directory 03:57:04 mv: cannot stat '/tmp/odl1_*': No such file or directory 03:57:04 mv: cannot stat '*_2.png': No such file or directory 03:57:04 mv: cannot stat '/tmp/odl2_*': No such file or directory 03:57:04 mv: cannot stat '*_3.png': No such file or directory 03:57:04 mv: cannot stat '/tmp/odl3_*': No such file or directory 03:57:04 % Total % Received % Xferd Average Speed Time Time Time Current 03:57:04 Dload Upload Total Spent Left Speed 03:57:04 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 452k 0 452k 0 0 2036k 0 --:--:-- --:--:-- --:--:-- 2027k 100 6440k 0 6440k 0 0 5265k 0 --:--:-- 0:00:01 --:--:-- 5261k 100 9.9M 0 9.9M 0 0 5014k 0 --:--:-- 0:00:02 --:--:-- 5017k 03:57:06 Archive: robot-plugin.zip 03:57:06 inflating: ./archives/robot-plugin/log.html 03:57:06 inflating: ./archives/robot-plugin/output.xml 03:57:07 inflating: ./archives/robot-plugin/report.html 03:57:07 mv: cannot stat '*.log.gz': No such file or directory 03:57:07 mv: cannot stat '*.csv': No such file or directory 03:57:07 mv: cannot stat '*.png': No such file or directory 03:57:07 [PostBuildScript] - [INFO] Executing post build scripts. 03:57:07 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins15748808868984889002.sh 03:57:07 [PostBuildScript] - [INFO] Executing post build scripts. 03:57:07 [EnvInject] - Injecting environment variables from a build step. 
03:57:07 [EnvInject] - Injecting as environment variables the properties content 03:57:07 OS_CLOUD=vex 03:57:07 OS_STACK_NAME=releng-openflowplugin-csit-3node-clustering-only-titanium-373 03:57:07 03:57:07 [EnvInject] - Variables injected successfully. 03:57:07 provisioning config files... 03:57:07 copy managed file [clouds-yaml] to file:/home/jenkins/.config/openstack/clouds.yaml 03:57:07 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins1810768227769255644.sh 03:57:07 ---> openstack-stack-delete.sh 03:57:07 Setup pyenv: 03:57:07 system 03:57:07 3.8.13 03:57:07 3.9.13 03:57:07 3.10.13 03:57:07 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 03:57:07 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-0MxD from file:/tmp/.os_lf_venv 03:57:09 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 03:57:09 lftools 0.37.13 requires urllib3<2.1.0, but you have urllib3 2.5.0 which is incompatible. 03:57:10 lf-activate-venv(): INFO: Installing: lftools[openstack] kubernetes python-heatclient python-openstackclient 03:57:26 lf-activate-venv(): INFO: Adding /tmp/venv-0MxD/bin to PATH 03:57:26 INFO: Retrieving stack cost for: releng-openflowplugin-csit-3node-clustering-only-titanium-373 03:57:31 DEBUG: Successfully retrieved stack cost: total: 0.39 03:57:43 INFO: Deleting stack releng-openflowplugin-csit-3node-clustering-only-titanium-373 03:57:43 Successfully deleted stack releng-openflowplugin-csit-3node-clustering-only-titanium-373 03:57:43 [PostBuildScript] - [INFO] Executing post build scripts. 03:57:43 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins6162251401898908582.sh 03:57:44 ---> sysstat.sh 03:57:44 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins7872904562881479517.sh 03:57:44 ---> package-listing.sh 03:57:44 ++ facter osfamily 03:57:44 ++ tr '[:upper:]' '[:lower:]' 03:57:44 + OS_FAMILY=redhat 03:57:44 + workspace=/w/workspace/openflowplugin-csit-3node-clustering-only-titanium 03:57:44 + START_PACKAGES=/tmp/packages_start.txt 03:57:44 + END_PACKAGES=/tmp/packages_end.txt 03:57:44 + DIFF_PACKAGES=/tmp/packages_diff.txt 03:57:44 + PACKAGES=/tmp/packages_start.txt 03:57:44 + '[' /w/workspace/openflowplugin-csit-3node-clustering-only-titanium ']' 03:57:44 + PACKAGES=/tmp/packages_end.txt 03:57:44 + case "${OS_FAMILY}" in 03:57:44 + rpm -qa 03:57:44 + sort 03:57:45 + '[' -f /tmp/packages_start.txt ']' 03:57:45 + '[' -f /tmp/packages_end.txt ']' 03:57:45 + diff /tmp/packages_start.txt /tmp/packages_end.txt 03:57:45 + '[' /w/workspace/openflowplugin-csit-3node-clustering-only-titanium ']' 03:57:45 + mkdir -p /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/archives/ 03:57:45 + cp -f /tmp/packages_diff.txt /tmp/packages_end.txt /tmp/packages_start.txt /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/archives/ 03:57:45 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins16489920636648099105.sh 03:57:45 ---> capture-instance-metadata.sh 03:57:45 Setup pyenv: 03:57:45 system 03:57:45 3.8.13 03:57:45 3.9.13 03:57:45 3.10.13 03:57:45 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 03:57:45 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-0MxD from file:/tmp/.os_lf_venv 03:57:47 lf-activate-venv(): INFO: Installing: lftools 03:57:56 lf-activate-venv(): 
INFO: Adding /tmp/venv-0MxD/bin to PATH 03:57:56 INFO: Running in OpenStack, capturing instance metadata 03:57:57 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins11987010923981132375.sh 03:57:57 provisioning config files... 03:57:57 Could not find credentials [logs] for openflowplugin-csit-3node-clustering-only-titanium #373 03:57:57 copy managed file [jenkins-log-archives-settings] to file:/w/workspace/openflowplugin-csit-3node-clustering-only-titanium@tmp/config2298113358519216597tmp 03:57:57 Regular expression run condition: Expression=[^.*logs-s3.*], Label=[odl-logs-s3-cloudfront-index] 03:57:57 Run condition [Regular expression match] enabling perform for step [Provide Configuration files] 03:57:57 provisioning config files... 03:57:58 copy managed file [jenkins-s3-log-ship] to file:/home/jenkins/.aws/credentials 03:57:58 [EnvInject] - Injecting environment variables from a build step. 03:57:58 [EnvInject] - Injecting as environment variables the properties content 03:57:58 SERVER_ID=logs 03:57:58 03:57:58 [EnvInject] - Variables injected successfully. 03:57:58 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins16070780650498356473.sh 03:57:58 ---> create-netrc.sh 03:57:58 WARN: Log server credential not found. 03:57:58 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins15988245178531193979.sh 03:57:58 ---> python-tools-install.sh 03:57:58 Setup pyenv: 03:57:58 system 03:57:58 3.8.13 03:57:58 3.9.13 03:57:58 3.10.13 03:57:58 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 03:57:58 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-0MxD from file:/tmp/.os_lf_venv 03:58:00 lf-activate-venv(): INFO: Installing: lftools 03:58:09 lf-activate-venv(): INFO: Adding /tmp/venv-0MxD/bin to PATH 03:58:09 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins14117641712814724404.sh 03:58:09 ---> sudo-logs.sh 03:58:09 Archiving 'sudo' log.. 03:58:10 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash /tmp/jenkins242198804289334980.sh 03:58:10 ---> job-cost.sh 03:58:10 Setup pyenv: 03:58:10 system 03:58:10 3.8.13 03:58:10 3.9.13 03:58:10 3.10.13 03:58:10 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 03:58:10 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-0MxD from file:/tmp/.os_lf_venv 03:58:12 lf-activate-venv(): INFO: Installing: zipp==1.1.0 python-openstackclient urllib3~=1.26.15 03:58:19 lf-activate-venv(): INFO: Adding /tmp/venv-0MxD/bin to PATH 03:58:19 DEBUG: total: 0.39 03:58:19 INFO: Retrieving Stack Cost... 
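The dependency-resolver ERROR reported when the venv was reused (lftools 0.37.13 wants urllib3<2.1.0 but urllib3 2.5.0 is present) is later sidestepped by the job-cost step, which installs the compatible pin urllib3~=1.26.15 into the same venv. Reproducing that combination by hand in a scratch venv would look roughly like this (the venv path is a placeholder):

```bash
# Install lftools together with an urllib3 pin that satisfies its <2.1.0
# requirement, instead of whatever a reused venv happens to carry.
python3 -m venv /tmp/venv-example
/tmp/venv-example/bin/pip install --upgrade pip
/tmp/venv-example/bin/pip install 'lftools[openstack]' 'urllib3~=1.26.15'
```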
03:58:19 INFO: Retrieving Pricing Info for: v3-standard-2 03:58:19 INFO: Archiving Costs 03:58:20 [openflowplugin-csit-3node-clustering-only-titanium] $ /bin/bash -l /tmp/jenkins11471914654904921847.sh 03:58:20 ---> logs-deploy.sh 03:58:20 Setup pyenv: 03:58:20 system 03:58:20 3.8.13 03:58:20 3.9.13 03:58:20 3.10.13 03:58:20 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-titanium/.python-version) 03:58:20 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-0MxD from file:/tmp/.os_lf_venv 03:58:22 lf-activate-venv(): INFO: Installing: lftools 03:58:31 lf-activate-venv(): INFO: Adding /tmp/venv-0MxD/bin to PATH 03:58:31 WARNING: Nexus logging server not set 03:58:31 INFO: S3 path logs/releng/vex-yul-odl-jenkins-1/openflowplugin-csit-3node-clustering-only-titanium/373/ 03:58:31 INFO: archiving logs to S3 03:58:32 ---> uname -a: 03:58:32 Linux prd-centos8-robot-2c-8g-3389.novalocal 4.18.0-553.5.1.el8.x86_64 #1 SMP Tue May 21 05:46:01 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux 03:58:32 03:58:32 03:58:32 ---> lscpu: 03:58:32 Architecture: x86_64 03:58:32 CPU op-mode(s): 32-bit, 64-bit 03:58:32 Byte Order: Little Endian 03:58:32 CPU(s): 2 03:58:32 On-line CPU(s) list: 0,1 03:58:32 Thread(s) per core: 1 03:58:32 Core(s) per socket: 1 03:58:32 Socket(s): 2 03:58:32 NUMA node(s): 1 03:58:32 Vendor ID: AuthenticAMD 03:58:32 CPU family: 23 03:58:32 Model: 49 03:58:32 Model name: AMD EPYC-Rome Processor 03:58:32 Stepping: 0 03:58:32 CPU MHz: 2800.000 03:58:32 BogoMIPS: 5600.00 03:58:32 Virtualization: AMD-V 03:58:32 Hypervisor vendor: KVM 03:58:32 Virtualization type: full 03:58:32 L1d cache: 32K 03:58:32 L1i cache: 32K 03:58:32 L2 cache: 512K 03:58:32 L3 cache: 16384K 03:58:32 NUMA node0 CPU(s): 0,1 03:58:32 Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 xsaves clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities 03:58:32 03:58:32 03:58:32 ---> nproc: 03:58:32 2 03:58:32 03:58:32 03:58:32 ---> df -h: 03:58:32 Filesystem Size Used Avail Use% Mounted on 03:58:32 devtmpfs 3.8G 0 3.8G 0% /dev 03:58:32 tmpfs 3.8G 0 3.8G 0% /dev/shm 03:58:32 tmpfs 3.8G 17M 3.8G 1% /run 03:58:32 tmpfs 3.8G 0 3.8G 0% /sys/fs/cgroup 03:58:32 /dev/vda1 40G 8.5G 32G 22% / 03:58:32 tmpfs 770M 0 770M 0% /run/user/1001 03:58:32 03:58:32 03:58:32 ---> free -m: 03:58:32 total used free shared buff/cache available 03:58:32 Mem: 7697 609 4682 19 2405 6789 03:58:32 Swap: 1023 0 1023 03:58:32 03:58:32 03:58:32 ---> ip addr: 03:58:32 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 03:58:32 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 03:58:32 inet 127.0.0.1/8 scope host lo 03:58:32 valid_lft forever preferred_lft forever 03:58:32 inet6 ::1/128 scope host 03:58:32 valid_lft forever preferred_lft forever 03:58:32 2: eth0: mtu 1458 qdisc mq state UP group default qlen 1000 03:58:32 link/ether fa:16:3e:c5:2a:28 brd ff:ff:ff:ff:ff:ff 03:58:32 altname enp0s3 03:58:32 altname ens3 03:58:32 inet 10.30.171.77/23 brd 10.30.171.255 scope global dynamic noprefixroute eth0 03:58:32 valid_lft 
83828sec preferred_lft 83828sec 03:58:32 inet6 fe80::f816:3eff:fec5:2a28/64 scope link 03:58:32 valid_lft forever preferred_lft forever 03:58:32 03:58:32 03:58:32 ---> sar -b -r -n DEV: 03:58:32 Linux 4.18.0-553.5.1.el8.x86_64 (centos-stream-8-robot-7d7a37eb-bc14-4dd6-9530-dc22c5eae738.noval) 08/23/2025 _x86_64_ (2 CPU) 03:58:32 03:58:32 03:15:38 LINUX RESTART (2 CPU) 03:58:32 03:58:32 03:16:01 AM tps rtps wtps bread/s bwrtn/s 03:58:32 03:17:01 AM 135.28 55.44 79.84 7657.10 23445.91 03:58:32 03:18:01 AM 95.32 0.72 94.60 52.92 10995.20 03:58:32 03:19:01 AM 31.53 0.42 31.11 64.79 2811.10 03:58:32 03:20:01 AM 77.31 6.96 70.34 1292.77 7971.14 03:58:32 03:21:01 AM 10.83 0.00 10.83 0.00 1116.84 03:58:32 03:22:01 AM 2.42 0.00 2.42 0.00 70.04 03:58:32 03:23:01 AM 0.12 0.00 0.12 0.00 1.30 03:58:32 03:24:01 AM 0.55 0.22 0.33 11.73 51.67 03:58:32 03:25:01 AM 2.18 0.00 2.18 0.00 223.31 03:58:32 03:26:01 AM 0.48 0.00 0.48 0.00 213.80 03:58:32 03:27:01 AM 0.47 0.00 0.47 0.00 275.89 03:58:32 03:28:01 AM 0.27 0.00 0.27 0.00 94.80 03:58:32 03:29:01 AM 0.23 0.00 0.23 0.00 69.84 03:58:32 03:30:01 AM 0.35 0.00 0.35 0.00 327.42 03:58:32 03:31:01 AM 1.58 0.03 1.55 0.27 285.14 03:58:32 03:32:01 AM 0.42 0.00 0.42 0.00 170.77 03:58:32 03:33:01 AM 0.30 0.03 0.27 1.87 136.67 03:58:32 03:34:01 AM 0.17 0.00 0.17 0.00 18.80 03:58:32 03:35:01 AM 0.27 0.00 0.27 0.00 173.67 03:58:32 03:36:01 AM 0.53 0.00 0.53 0.00 80.14 03:58:32 03:37:01 AM 0.33 0.00 0.33 0.00 96.67 03:58:32 03:38:01 AM 0.17 0.00 0.17 0.00 36.27 03:58:32 03:39:01 AM 0.30 0.00 0.30 0.00 206.15 03:58:32 03:40:01 AM 0.30 0.00 0.30 0.00 173.91 03:58:32 03:41:01 AM 0.42 0.00 0.42 0.00 75.34 03:58:32 03:42:01 AM 0.20 0.00 0.20 0.00 11.63 03:58:32 03:43:01 AM 0.27 0.00 0.27 0.00 199.15 03:58:32 03:44:01 AM 0.27 0.00 0.27 0.00 185.62 03:58:32 03:45:01 AM 0.18 0.00 0.18 0.00 88.52 03:58:32 03:46:01 AM 0.22 0.00 0.22 0.00 6.08 03:58:32 03:47:01 AM 0.42 0.00 0.42 0.00 118.56 03:58:32 03:48:01 AM 0.38 0.00 0.38 0.00 136.81 03:58:32 03:49:02 AM 0.35 0.00 0.35 0.00 24.28 03:58:32 03:50:01 AM 0.34 0.00 0.34 0.00 26.04 03:58:32 03:51:01 AM 0.33 0.00 0.33 0.00 22.63 03:58:32 03:52:01 AM 0.28 0.00 0.28 0.00 25.96 03:58:32 03:53:01 AM 0.22 0.00 0.22 0.00 19.33 03:58:32 03:54:01 AM 0.28 0.00 0.28 0.00 109.05 03:58:32 03:55:01 AM 0.30 0.00 0.30 0.00 252.97 03:58:32 03:56:01 AM 0.55 0.00 0.55 0.00 41.05 03:58:32 03:57:01 AM 1.95 0.18 1.77 4.27 132.59 03:58:32 03:58:01 AM 31.16 0.47 30.69 69.84 5601.80 03:58:32 Average: 9.52 1.54 7.99 218.12 1336.97 03:58:32 03:58:32 03:16:01 AM kbmemfree kbavail kbmemused %memused kbbuffers kbcached kbcommit %commit kbactive kbinact kbdirty 03:58:32 03:17:01 AM 5376660 7062328 2505772 31.79 2688 1868528 678676 7.60 188236 1999484 174520 03:58:32 03:18:01 AM 5174948 7014028 2707484 34.35 2688 2014684 704624 7.89 198628 2158748 42480 03:58:32 03:19:01 AM 5160272 7044152 2722160 34.53 2688 2057352 655004 7.33 224640 2142624 40404 03:58:32 03:20:01 AM 4974216 7064688 2908216 36.89 2688 2257692 619040 6.93 262280 2265332 25936 03:58:32 03:21:01 AM 4974528 7064880 2907904 36.89 2688 2257696 619040 6.93 262288 2265172 4 03:58:32 03:22:01 AM 4974492 7064844 2907940 36.89 2688 2257696 619040 6.93 262292 2265332 8 03:58:32 03:23:01 AM 4974836 7065188 2907596 36.89 2688 2257696 619040 6.93 262292 2264992 4 03:58:32 03:24:01 AM 4925692 7017976 2956740 37.51 2688 2259644 721072 8.07 262424 2313516 264 03:58:32 03:25:01 AM 4914328 7011216 2968104 37.65 2688 2264256 721072 8.07 262424 2324532 280 03:58:32 03:26:01 AM 4884112 6991784 2998320 38.04 2688 
03:58:32 03:27:01 AM 4869444 6981668 3012988 38.22 2688 2279604 752176 8.42 262424 2369740 1216
03:58:32 03:28:01 AM 4864624 6978848 3017808 38.29 2688 2281592 792012 8.87 262424 2374660 408
03:58:32 03:29:01 AM 4859692 6978380 3022740 38.35 2688 2286088 792012 8.87 262424 2379296 2888
03:58:32 03:30:01 AM 4848140 6977480 3034292 38.49 2688 2296700 791180 8.86 262424 2390124 3708
03:58:32 03:31:01 AM 4842440 6976728 3039992 38.57 2688 2301692 789648 8.84 262556 2395848 1100
03:58:32 03:32:01 AM 4837324 6977068 3045108 38.63 2688 2307112 789648 8.84 262640 2401072 1532
03:58:32 03:33:01 AM 4837188 6979628 3045244 38.63 2688 2309792 736932 8.25 262832 2401872 112
03:58:32 03:34:01 AM 4834508 6979384 3047924 38.67 2688 2312228 736932 8.25 263056 2403828 2032
03:58:32 03:35:01 AM 4828540 6977160 3053892 38.74 2688 2316032 792028 8.87 263108 2409812 668
03:58:32 03:36:01 AM 4825164 6977228 3057268 38.79 2688 2319432 792028 8.87 263108 2413140 1912
03:58:32 03:37:01 AM 4823668 6977064 3058764 38.80 2688 2320836 792028 8.87 263108 2414540 552
03:58:32 03:38:01 AM 4820512 6976936 3061920 38.84 2688 2323800 792028 8.87 263108 2417616 2460
03:58:32 03:39:01 AM 4816444 6977376 3065988 38.90 2688 2328300 792028 8.87 263108 2422012 824
03:58:32 03:40:01 AM 4811316 6976720 3071116 38.96 2688 2332756 791244 8.86 263108 2426832 136
03:58:32 03:41:01 AM 4809956 6977328 3072476 38.98 2688 2334744 791244 8.86 263108 2428580 32
03:58:32 03:42:01 AM 4805252 6976396 3077180 39.04 2688 2338548 791244 8.86 263108 2432304 3560
03:58:32 03:43:01 AM 4802872 6977144 3079560 39.07 2688 2341648 791244 8.86 263108 2435544 716
03:58:32 03:44:01 AM 4796804 6977304 3085628 39.15 2688 2347860 791244 8.86 263108 2441772 1396
03:58:32 03:45:01 AM 4795488 6977312 3086944 39.16 2688 2349176 791244 8.86 263108 2442980 88
03:58:32 03:46:01 AM 4794372 6976944 3088060 39.18 2688 2349940 791244 8.86 263108 2444048 768
03:58:32 03:47:01 AM 4788508 6976592 3093924 39.25 2688 2355440 791244 8.86 263108 2449432 2864
03:58:32 03:48:01 AM 4790452 6979716 3091980 39.23 2688 2356608 774296 8.67 263108 2448368 4
03:58:32 03:49:02 AM 4789464 6979372 3092968 39.24 2688 2357276 774296 8.67 263108 2449012 52
03:58:32 03:50:01 AM 4788492 6979068 3093940 39.25 2688 2357924 774296 8.67 263108 2450028 32
03:58:32 03:51:01 AM 4788048 6979232 3094384 39.26 2688 2358568 774296 8.67 263108 2450608 116
03:58:32 03:52:01 AM 4787324 6979176 3095108 39.27 2688 2359216 774296 8.67 263108 2451228 96
03:58:32 03:53:01 AM 4786868 6979388 3095564 39.27 2688 2359880 774296 8.67 263108 2451828 252
03:58:32 03:54:01 AM 4780860 6978664 3101572 39.35 2688 2365112 774296 8.67 263108 2456904 2288
03:58:32 03:55:01 AM 4771212 6974312 3111220 39.47 2688 2370444 823700 9.22 263264 2466592 96
03:58:32 03:56:01 AM 4769900 6974212 3112532 39.49 2688 2371648 823700 9.22 263264 2467884 356
03:58:32 03:57:01 AM 4992364 7022840 2890068 36.66 2688 2199336 642432 7.19 371408 2137876 6580
03:58:32 03:58:01 AM 4850908 7002580 3031524 38.46 2688 2324072 653324 7.32 573072 2100088 11084
03:58:32 Average: 4870053 6994294 3012379 38.22 2688 2292470 751015 8.41 268616 2366180 8064
03:58:32
03:58:32 03:16:01 AM IFACE rxpck/s txpck/s rxkB/s txkB/s rxcmp/s txcmp/s rxmcst/s %ifutil
03:58:32 03:17:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:17:01 AM eth0 320.77 193.75 1757.22 62.40 0.00 0.00 0.00 0.00
03:58:32 03:18:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:18:01 AM eth0 72.08 51.60 734.14 8.39 0.00 0.00 0.00 0.00
03:58:32 03:19:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:19:01 AM eth0 19.03 15.83 7.68 5.41 0.00 0.00 0.00 0.00
03:58:32 03:20:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:20:01 AM eth0 615.66 453.95 443.01 101.23 0.00 0.00 0.00 0.00
03:58:32 03:21:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:21:01 AM eth0 1.55 0.75 0.30 0.12 0.00 0.00 0.00 0.00
03:58:32 03:22:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:22:01 AM eth0 2.65 2.02 0.57 0.49 0.00 0.00 0.00 0.00
03:58:32 03:23:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:23:01 AM eth0 3.53 2.02 0.57 0.46 0.00 0.00 0.00 0.00
03:58:32 03:24:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:24:01 AM eth0 54.06 37.26 14.40 4.50 0.00 0.00 0.00 0.00
03:58:32 03:25:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:25:01 AM eth0 11.33 8.26 21.25 1.59 0.00 0.00 0.00 0.00
03:58:32 03:26:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:26:01 AM eth0 13.01 11.56 50.74 1.64 0.00 0.00 0.00 0.00
03:58:32 03:27:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:27:01 AM eth0 10.21 11.50 20.17 1.77 0.00 0.00 0.00 0.00
03:58:32 03:28:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:28:01 AM eth0 24.20 24.36 6.20 3.14 0.00 0.00 0.00 0.00
03:58:32 03:29:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:29:01 AM eth0 7.68 8.48 19.73 1.72 0.00 0.00 0.00 0.00
03:58:32 03:30:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:30:01 AM eth0 10.56 11.28 49.98 1.61 0.00 0.00 0.00 0.00
03:58:32 03:31:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:31:01 AM eth0 10.30 11.81 22.31 1.76 0.00 0.00 0.00 0.00
03:58:32 03:32:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:32:01 AM eth0 100.63 97.95 35.18 7.78 0.00 0.00 0.00 0.00
03:58:32 03:33:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:33:01 AM eth0 68.84 66.84 20.98 6.42 0.00 0.00 0.00 0.00
03:58:32 03:34:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:34:01 AM eth0 26.33 25.26 15.18 2.24 0.00 0.00 0.00 0.00
03:58:32 03:35:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:35:01 AM eth0 60.29 56.64 10.43 13.13 0.00 0.00 0.00 0.00
03:58:32 03:36:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:36:01 AM eth0 36.64 37.55 47.43 2.96 0.00 0.00 0.00 0.00
03:58:32 03:37:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:37:01 AM eth0 31.11 32.73 5.57 2.79 0.00 0.00 0.00 0.00
03:58:32 03:38:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:38:01 AM eth0 26.17 26.86 40.45 2.28 0.00 0.00 0.00 0.00
03:58:32 03:39:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:39:01 AM eth0 52.72 53.94 49.58 4.41 0.00 0.00 0.00 0.00
03:58:32 03:40:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:40:01 AM eth0 21.62 22.34 65.87 1.85 0.00 0.00 0.00 0.00
03:58:32 03:41:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:41:01 AM eth0 56.91 57.40 24.99 4.20 0.00 0.00 0.00 0.00
03:58:32 03:42:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:42:01 AM eth0 29.04 27.81 55.68 2.40 0.00 0.00 0.00 0.00
03:58:32 03:43:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:43:01 AM eth0 57.85 53.37 38.68 4.37 0.00 0.00 0.00 0.00
03:58:32 03:44:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:44:01 AM eth0 44.96 45.73 83.13 3.66 0.00 0.00 0.00 0.00
03:58:32 03:45:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:45:01 AM eth0 55.61 57.07 9.77 4.41 0.00 0.00 0.00 0.00
03:58:32 03:46:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:46:01 AM eth0 29.57 29.82 4.84 2.56 0.00 0.00 0.00 0.00
03:58:32 03:47:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:47:01 AM eth0 43.89 44.79 77.32 3.58 0.00 0.00 0.00 0.00
03:58:32 03:48:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:48:01 AM eth0 22.70 21.71 13.73 2.67 0.00 0.00 0.00 0.00
03:58:32 03:49:02 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:49:02 AM eth0 2.23 2.18 1.55 0.35 0.00 0.00 0.00 0.00
03:58:32 03:50:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:50:01 AM eth0 0.73 1.05 1.22 0.22 0.00 0.00 0.00 0.00
03:58:32 03:51:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:51:01 AM eth0 1.38 1.02 1.30 0.22 0.00 0.00 0.00 0.00
03:58:32 03:52:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:52:01 AM eth0 1.08 1.15 1.34 0.28 0.00 0.00 0.00 0.00
03:58:32 03:53:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:53:01 AM eth0 1.13 1.15 1.26 0.22 0.00 0.00 0.00 0.00
03:58:32 03:54:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:54:01 AM eth0 22.06 22.85 80.99 1.92 0.00 0.00 0.00 0.00
03:58:32 03:55:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:55:01 AM eth0 114.90 94.52 79.87 8.31 0.00 0.00 0.00 0.00
03:58:32 03:56:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:56:01 AM eth0 36.42 26.79 6.52 2.60 0.00 0.00 0.00 0.00
03:58:32 03:57:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:57:01 AM eth0 90.30 59.75 81.02 163.97 0.00 0.00 0.00 0.00
03:58:32 03:58:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 03:58:01 AM eth0 92.25 65.19 199.20 172.10 0.00 0.00 0.00 0.00
03:58:32 Average: lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
03:58:32 Average: eth0 54.88 44.73 100.08 14.72 0.00 0.00 0.00 0.00
03:58:32
03:58:32
03:58:32 ---> sar -P ALL:
03:58:32 Linux 4.18.0-553.5.1.el8.x86_64 (centos-stream-8-robot-7d7a37eb-bc14-4dd6-9530-dc22c5eae738.noval)  08/23/2025  _x86_64_  (2 CPU)
03:58:32
03:58:32 03:15:38 LINUX RESTART (2 CPU)
03:58:32
03:58:32 03:16:01 AM CPU %user %nice %system %iowait %steal %idle
03:58:32 03:17:01 AM all 46.25 0.02 8.36 2.78 0.12 42.48
03:58:32 03:17:01 AM 0 43.74 0.03 9.19 2.68 0.12 44.24
03:58:32 03:17:01 AM 1 48.76 0.00 7.53 2.87 0.12 40.72
03:58:32 03:18:01 AM all 24.85 0.00 3.44 1.23 0.09 70.39
03:58:32 03:18:01 AM 0 13.36 0.00 2.13 1.00 0.10 83.41
03:58:32 03:18:01 AM 1 36.36 0.00 4.74 1.45 0.08 57.36
03:58:32 03:19:01 AM all 19.18 0.00 3.03 0.62 0.07 77.10
03:58:32 03:19:01 AM 0 15.13 0.00 3.19 0.52 0.07 81.09
03:58:32 03:19:01 AM 1 23.22 0.00 2.87 0.72 0.07 73.12
03:58:32 03:20:01 AM all 26.49 0.00 5.63 0.70 0.09 67.09
03:58:32 03:20:01 AM 0 20.51 0.00 5.18 0.74 0.10 73.47
03:58:32 03:20:01 AM 1 32.45 0.00 6.07 0.67 0.08 60.73
03:58:32 03:21:01 AM all 0.29 0.00 0.15 0.11 0.03 99.42
03:58:32 03:21:01 AM 0 0.48 0.00 0.20 0.18 0.02 99.11
03:58:32 03:21:01 AM 1 0.10 0.00 0.10 0.03 0.05 99.72
03:58:32 03:22:01 AM all 0.24 0.00 0.08 0.02 0.03 99.62
03:58:32 03:22:01 AM 0 0.38 0.00 0.08 0.00 0.03 99.50
03:58:32 03:22:01 AM 1 0.10 0.00 0.08 0.03 0.03 99.75
03:58:32 03:23:01 AM all 0.34 0.00 0.07 0.00 0.04 99.55
03:58:32 03:23:01 AM 0 0.53 0.00 0.05 0.00 0.03 99.38
03:58:32 03:23:01 AM 1 0.15 0.00 0.08 0.00 0.05 99.72
03:58:32 03:24:01 AM all 5.63 0.00 0.64 0.00 0.08 93.65
03:58:32 03:24:01 AM 0 4.19 0.00 0.68 0.00 0.08 95.04
03:58:32 03:24:01 AM 1 7.08 0.00 0.60 0.00 0.07 92.25
03:58:32 03:25:01 AM all 6.01 0.00 0.32 0.02 0.09 93.56
03:58:32 03:25:01 AM 0 7.17 0.00 0.37 0.00 0.10 92.36
03:58:32 03:25:01 AM 1 4.86 0.00 0.27 0.03 0.08 94.76
03:58:32 03:26:01 AM all 13.38 0.00 0.50 0.02 0.09 86.01
03:58:32 03:26:01 AM 0 14.77 0.00 0.55 0.03 0.08 84.57
03:58:32 03:26:01 AM 1 11.99 0.00 0.45 0.00 0.10 87.46
03:58:32 03:27:01 AM all 6.84 0.00 0.54 0.02 0.09 92.51
03:58:32 03:27:01 AM 0 6.58 0.00 0.58 0.02 0.08 92.73
03:58:32 03:27:01 AM 1 7.09 0.00 0.50 0.02 0.10 92.29
03:58:32
03:58:32 03:27:01 AM CPU %user %nice %system %iowait %steal %idle
03:58:32 03:28:01 AM all 5.97 0.00 0.45 0.00 0.09 93.49
03:58:32 03:28:01 AM 0 7.81 0.00 0.60 0.00 0.10 91.49
03:58:32 03:28:01 AM 1 4.13 0.00 0.30 0.00 0.08 95.49
03:58:32 03:29:01 AM all 5.81 0.00 0.29 0.00 0.08 93.82
03:58:32 03:29:01 AM 0 1.27 0.00 0.18 0.00 0.07 98.48
03:58:32 03:29:01 AM 1 10.35 0.00 0.40 0.00 0.08 89.17
03:58:32 03:30:01 AM all 13.17 0.00 0.49 0.01 0.09 86.24
03:58:32 03:30:01 AM 0 5.58 0.00 0.33 0.00 0.08 94.00
03:58:32 03:30:01 AM 1 20.75 0.00 0.65 0.02 0.10 78.48
03:58:32 03:31:01 AM all 7.56 0.00 0.66 0.01 0.08 91.69
03:58:32 03:31:01 AM 0 3.44 0.00 0.52 0.02 0.08 95.94
03:58:32 03:31:01 AM 1 11.67 0.00 0.80 0.00 0.08 87.45
03:58:32 03:32:01 AM all 8.78 0.00 1.20 0.01 0.09 89.93
03:58:32 03:32:01 AM 0 2.43 0.00 0.45 0.02 0.08 97.02
03:58:32 03:32:01 AM 1 15.15 0.00 1.95 0.00 0.10 82.80
03:58:32 03:33:01 AM all 6.62 0.00 0.80 0.00 0.08 92.50
03:58:32 03:33:01 AM 0 2.96 0.00 0.42 0.00 0.07 96.55
03:58:32 03:33:01 AM 1 10.29 0.00 1.17 0.00 0.10 88.43
03:58:32 03:34:01 AM all 7.75 0.00 0.85 0.00 0.10 91.30
03:58:32 03:34:01 AM 0 0.67 0.00 0.18 0.00 0.12 99.03
03:58:32 03:34:01 AM 1 15.12 0.00 1.55 0.00 0.09 83.25
03:58:32 03:35:01 AM all 7.19 0.00 0.41 0.01 0.08 92.32
03:58:32 03:35:01 AM 0 6.05 0.00 0.50 0.00 0.07 93.38
03:58:32 03:35:01 AM 1 8.32 0.00 0.32 0.02 0.08 91.27
03:58:32 03:36:01 AM all 2.14 0.00 0.27 0.00 0.08 97.51
03:58:32 03:36:01 AM 0 3.12 0.00 0.28 0.00 0.08 96.51
03:58:32 03:36:01 AM 1 1.17 0.00 0.25 0.00 0.08 98.50
03:58:32 03:37:01 AM all 3.74 0.00 0.45 0.01 0.08 95.72
03:58:32 03:37:01 AM 0 3.98 0.00 0.54 0.00 0.08 95.40
03:58:32 03:37:01 AM 1 3.50 0.00 0.37 0.02 0.08 96.03
03:58:32 03:38:01 AM all 1.52 0.00 0.20 0.00 0.08 98.21
03:58:32 03:38:01 AM 0 1.89 0.00 0.18 0.00 0.07 97.86
03:58:32 03:38:01 AM 1 1.14 0.00 0.22 0.00 0.08 98.56
03:58:32
03:58:32 03:38:01 AM CPU %user %nice %system %iowait %steal %idle
03:58:32 03:39:01 AM all 5.47 0.00 0.65 0.01 0.09 93.77
03:58:32 03:39:01 AM 0 2.49 0.00 0.54 0.02 0.10 96.85
03:58:32 03:39:01 AM 1 8.46 0.00 0.77 0.00 0.08 90.69
03:58:32 03:40:01 AM all 1.61 0.00 0.20 0.00 0.08 98.11
03:58:32 03:40:01 AM 0 1.92 0.00 0.18 0.00 0.08 97.81
03:58:32 03:40:01 AM 1 1.30 0.00 0.22 0.00 0.07 98.41
03:58:32 03:41:01 AM all 2.65 0.00 0.30 0.02 0.08 96.96
03:58:32 03:41:01 AM 0 2.13 0.00 0.28 0.02 0.07 97.50
03:58:32 03:41:01 AM 1 3.16 0.00 0.32 0.02 0.08 96.42
03:58:32 03:42:01 AM all 1.67 0.00 0.18 0.00 0.08 98.06
03:58:32 03:42:01 AM 0 1.39 0.00 0.22 0.00 0.08 98.31
03:58:32 03:42:01 AM 1 1.96 0.00 0.15 0.00 0.08 97.81
03:58:32 03:43:01 AM all 3.84 0.00 0.33 0.01 0.08 95.74
03:58:32 03:43:01 AM 0 3.40 0.00 0.30 0.02 0.08 96.20
03:58:32 03:43:01 AM 1 4.28 0.00 0.35 0.00 0.08 95.28
03:58:32 03:44:01 AM all 3.85 0.00 0.29 0.01 0.08 95.78
03:58:32 03:44:01 AM 0 5.41 0.00 0.32 0.00 0.08 94.19
03:58:32 03:44:01 AM 1 2.29 0.00 0.27 0.02 0.07 97.36
03:58:32 03:45:01 AM all 4.02 0.00 0.50 0.00 0.08 95.40
03:58:32 03:45:01 AM 0 5.73 0.00 0.62 0.00 0.08 93.56
03:58:32 03:45:01 AM 1 2.31 0.00 0.38 0.00 0.08 97.22
03:58:32 03:46:01 AM all 2.94 0.00 0.45 0.00 0.08 96.54
03:58:32 03:46:01 AM 0 2.88 0.00 0.45 0.00 0.07 96.60
03:58:32 03:46:01 AM 1 2.99 0.00 0.45 0.00 0.08 96.47
03:58:32 03:47:01 AM all 2.96 0.00 0.38 0.01 0.08 96.56
03:58:32 03:47:01 AM 0 3.85 0.00 0.45 0.00 0.10 95.60
03:58:32 03:47:01 AM 1 2.07 0.00 0.32 0.02 0.07 97.53
03:58:32 03:48:01 AM all 7.28 0.00 0.53 0.00 0.07 92.12
03:58:32 03:48:01 AM 0 1.96 0.00 0.22 0.00 0.07 97.76
03:58:32 03:48:01 AM 1 12.80 0.00 0.85 0.00 0.07 86.28
03:58:32 03:49:02 AM all 1.41 0.00 0.12 0.00 0.08 98.39
03:58:32 03:49:02 AM 0 0.72 0.00 0.10 0.00 0.07 99.11
03:58:32 03:49:02 AM 1 2.11 0.00 0.13 0.00 0.10 97.66
03:58:32
03:58:32 03:49:02 AM CPU %user %nice %system %iowait %steal %idle
03:58:32 03:50:01 AM all 1.16 0.00 0.13 0.00 0.08 98.64
03:58:32 03:50:01 AM 0 0.90 0.00 0.14 0.00 0.07 98.90
03:58:32 03:50:01 AM 1 1.41 0.00 0.12 0.00 0.08 98.39
03:58:32 03:51:01 AM all 1.14 0.00 0.09 0.01 0.08 98.69
03:58:32 03:51:01 AM 0 1.29 0.00 0.10 0.00 0.08 98.53
03:58:32 03:51:01 AM 1 0.99 0.00 0.08 0.02 0.07 98.85
03:58:32 03:52:01 AM all 1.13 0.00 0.11 0.00 0.08 98.69
03:58:32 03:52:01 AM 0 1.49 0.00 0.10 0.00 0.08 98.33
03:58:32 03:52:01 AM 1 0.77 0.00 0.12 0.00 0.07 99.05
03:58:32 03:53:01 AM all 1.09 0.00 0.10 0.00 0.06 98.76
03:58:32 03:53:01 AM 0 0.95 0.00 0.10 0.00 0.05 98.90
03:58:32 03:53:01 AM 1 1.22 0.00 0.10 0.00 0.07 98.61
03:58:32 03:54:01 AM all 1.58 0.00 0.16 0.00 0.08 98.19
03:58:32 03:54:01 AM 0 1.84 0.00 0.15 0.00 0.07 97.94
03:58:32 03:54:01 AM 1 1.32 0.00 0.17 0.00 0.08 98.43
03:58:32 03:55:01 AM all 6.66 0.00 0.87 0.01 0.08 92.38
03:58:32 03:55:01 AM 0 4.93 0.00 0.84 0.02 0.08 94.13
03:58:32 03:55:01 AM 1 8.38 0.00 0.91 0.00 0.08 90.62
03:58:32 03:56:01 AM all 4.43 0.00 0.69 0.00 0.08 94.81
03:58:32 03:56:01 AM 0 2.49 0.00 0.52 0.00 0.07 96.92
03:58:32 03:56:01 AM 1 6.37 0.00 0.85 0.00 0.08 92.70
03:58:32 03:57:01 AM all 14.26 0.00 1.45 0.00 0.08 84.20
03:58:32 03:57:01 AM 0 8.33 0.00 1.40 0.00 0.08 90.18
03:58:32 03:57:01 AM 1 20.19 0.00 1.50 0.00 0.08 78.22
03:58:32 03:58:01 AM all 28.05 0.00 3.57 0.29 0.09 67.99
03:58:32 03:58:01 AM 0 25.52 0.00 3.93 0.20 0.08 70.27
03:58:32 03:58:01 AM 1 30.58 0.00 3.22 0.38 0.10 65.72
03:58:32 Average: all 7.55 0.00 0.95 0.14 0.08 91.28
03:58:32 Average: 0 5.85 0.00 0.89 0.13 0.08 93.05
03:58:32 Average: 1 9.25 0.00 1.01 0.15 0.08 89.50
03:58:32
03:58:32
03:58:32
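The sar -P ALL report above closes with whole-run CPU averages (about 7.55% user / 91.28% idle across both vCPUs). As an illustrative aside only, not part of logs-deploy.sh or the CSIT job, below is a minimal Python sketch of how those "Average:" CPU rows could be pulled out of a saved copy of this console output; the file name console.log and the helper name cpu_averages are assumptions for the example.

import re

# Minimal sketch (assumption: the console output above has been saved to
# "console.log"). Extracts the sar -P ALL "Average:" rows, whose columns are:
#   CPU %user %nice %system %iowait %steal %idle
AVERAGE_ROW = re.compile(
    r"Average:\s+(?P<cpu>all|\d+)\s+(?P<user>[\d.]+)\s+(?P<nice>[\d.]+)\s+"
    r"(?P<system>[\d.]+)\s+(?P<iowait>[\d.]+)\s+(?P<steal>[\d.]+)\s+(?P<idle>[\d.]+)\s*$"
)

def cpu_averages(path="console.log"):
    """Return {cpu_label: %idle} for each sar 'Average:' CPU row found in the log."""
    averages = {}
    with open(path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = AVERAGE_ROW.search(line)
            if match:
                averages[match.group("cpu")] = float(match.group("idle"))
    return averages

if __name__ == "__main__":
    # For the run above this would print roughly: all ~8.7% busy, 0 ~7.0%, 1 ~10.5%.
    for cpu, idle in cpu_averages().items():
        print(f"CPU {cpu}: {100.0 - idle:.1f}% busy on average")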