04:45:41 Started by upstream project "integration-distribution-test-scandium" build number 473 04:45:41 originally caused by: 04:45:41 Started by upstream project "autorelease-release-scandium-mvn39-openjdk21" build number 489 04:45:41 originally caused by: 04:45:41 Started by timer 04:45:41 Running as SYSTEM 04:45:41 [EnvInject] - Loading node environment variables. 04:45:41 Building remotely on prd-centos8-robot-2c-8g-3048 (centos8-robot-2c-8g) in workspace /w/workspace/openflowplugin-csit-3node-clustering-only-scandium 04:45:42 [ssh-agent] Looking for ssh-agent implementation... 04:45:42 [ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine) 04:45:42 $ ssh-agent 04:45:42 SSH_AUTH_SOCK=/tmp/ssh-pQ2bbmeOagEI/agent.5282 04:45:42 SSH_AGENT_PID=5283 04:45:42 [ssh-agent] Started. 04:45:42 Running ssh-add (command line suppressed) 04:45:42 Identity added: /w/workspace/openflowplugin-csit-3node-clustering-only-scandium@tmp/private_key_14310692051155109739.key (/w/workspace/openflowplugin-csit-3node-clustering-only-scandium@tmp/private_key_14310692051155109739.key) 04:45:42 [ssh-agent] Using credentials jenkins (Release Engineering Jenkins Key) 04:45:42 The recommended git tool is: NONE 04:45:44 using credential opendaylight-jenkins-ssh 04:45:44 Wiping out workspace first. 04:45:44 Cloning the remote Git repository 04:45:44 Cloning repository git://devvexx.opendaylight.org/mirror/integration/test 04:45:44 > git init /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test # timeout=10 04:45:45 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/integration/test 04:45:45 > git --version # timeout=10 04:45:45 > git --version # 'git version 2.43.0' 04:45:45 using GIT_SSH to set credentials Release Engineering Jenkins Key 04:45:45 [INFO] Currently running in a labeled security context 04:45:45 [INFO] Currently SELinux is 'enforcing' on the host 04:45:45 > /usr/bin/chcon --type=ssh_home_t /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test@tmp/jenkins-gitclient-ssh13518478852314953382.key 04:45:45 Verifying host key using known hosts file 04:45:45 You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification. 
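The host-key warning above appears because the 'Known hosts file' verification strategy has no known_hosts file to check against. A minimal sketch of one way to pre-seed it on the agent for SSH-based git access (purely illustrative; the warning can also be resolved through the Jenkins UI path it names, and the clone here actually uses the git:// protocol):

    # Hypothetical one-off agent step: record the mirror's host keys so the
    # 'Known hosts file' strategy has something to verify against.
    mkdir -p /home/jenkins/.ssh && chmod 700 /home/jenkins/.ssh
    ssh-keyscan -t ecdsa,ed25519 devvexx.opendaylight.org >> /home/jenkins/.ssh/known_hosts
    chmod 600 /home/jenkins/.ssh/known_hosts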
04:45:45 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/integration/test +refs/heads/*:refs/remotes/origin/* # timeout=10 04:45:48 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/integration/test # timeout=10 04:45:48 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10 04:45:48 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/integration/test # timeout=10 04:45:48 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/integration/test 04:45:48 using GIT_SSH to set credentials Release Engineering Jenkins Key 04:45:48 [INFO] Currently running in a labeled security context 04:45:48 [INFO] Currently SELinux is 'enforcing' on the host 04:45:48 > /usr/bin/chcon --type=ssh_home_t /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test@tmp/jenkins-gitclient-ssh5898323112710358833.key 04:45:48 Verifying host key using known hosts file 04:45:48 You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification. 04:45:48 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/integration/test master # timeout=10 04:45:48 > git rev-parse FETCH_HEAD^{commit} # timeout=10 04:45:48 Checking out Revision 6c60ddfc8acc87c45ab0767b2ba1d2c4e7d34388 (origin/master) 04:45:48 > git config core.sparsecheckout # timeout=10 04:45:48 > git checkout -f 6c60ddfc8acc87c45ab0767b2ba1d2c4e7d34388 # timeout=10 04:45:49 Commit message: "Adapt test for new pce-allocation field" 04:45:49 > git rev-parse FETCH_HEAD^{commit} # timeout=10 04:45:49 > git rev-list --no-walk 62cb016f4f4171033927cf2ae7f4ac5095373e88 # timeout=10 04:45:49 No emails were triggered. 04:45:49 provisioning config files... 04:45:49 copy managed file [npmrc] to file:/home/jenkins/.npmrc 04:45:49 copy managed file [pipconf] to file:/home/jenkins/.config/pip/pip.conf 04:45:50 copy managed file [clouds-yaml] to file:/home/jenkins/.config/openstack/clouds.yaml 04:45:50 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash /tmp/jenkins14489246604929337092.sh 04:45:50 ---> python-tools-install.sh 04:45:50 Setup pyenv: 04:45:50 system 04:45:50 * 3.8.13 (set by /opt/pyenv/version) 04:45:50 * 3.9.13 (set by /opt/pyenv/version) 04:45:50 * 3.10.13 (set by /opt/pyenv/version) 04:45:50 * 3.11.7 (set by /opt/pyenv/version) 04:45:55 lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-AAlE 04:45:55 lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv 04:45:55 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) 04:45:55 lf-activate-venv(): INFO: Attempting to install with network-safe options... 04:45:59 lf-activate-venv(): INFO: Base packages installed successfully 04:45:59 lf-activate-venv(): INFO: Installing additional packages: lftools 04:46:23 lf-activate-venv(): INFO: Adding /tmp/venv-AAlE/bin to PATH 04:46:23 Generating Requirements File 04:46:56 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 04:46:56 httplib2 0.31.0 requires pyparsing<4,>=3.0.4, but you have pyparsing 2.4.7 which is incompatible. 
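The resolver warning above is non-fatal, but it can be surfaced and cleared explicitly. A minimal sketch, assuming the venv path from this log (/tmp/venv-AAlE) and that moving pyparsing into the range httplib2 declares does not break any other pin:

    # Report every broken requirement in the venv (non-zero exit if any exist).
    /tmp/venv-AAlE/bin/python -m pip check
    # Upgrade pyparsing into httplib2's declared range, then re-verify.
    /tmp/venv-AAlE/bin/python -m pip install --upgrade 'pyparsing>=3.0.4,<4'
    /tmp/venv-AAlE/bin/python -m pip check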
04:46:56 Python 3.11.7 04:46:56 pip 25.3 from /tmp/venv-AAlE/lib/python3.11/site-packages/pip (python 3.11) 04:46:56 appdirs==1.4.4 04:46:56 argcomplete==3.6.3 04:46:56 aspy.yaml==1.3.0 04:46:56 attrs==25.4.0 04:46:56 autopage==0.5.2 04:46:56 beautifulsoup4==4.14.3 04:46:56 boto3==1.42.0 04:46:56 botocore==1.41.6 04:46:56 bs4==0.0.2 04:46:56 cachetools==6.2.2 04:46:56 certifi==2025.11.12 04:46:56 cffi==2.0.0 04:46:56 cfgv==3.5.0 04:46:56 chardet==5.2.0 04:46:56 charset-normalizer==3.4.4 04:46:56 click==8.3.1 04:46:56 cliff==4.12.0 04:46:56 cmd2==2.7.0 04:46:56 cryptography==3.3.2 04:46:56 debtcollector==3.0.0 04:46:56 decorator==5.2.1 04:46:56 defusedxml==0.7.1 04:46:56 Deprecated==1.3.1 04:46:56 distlib==0.4.0 04:46:56 dnspython==2.8.0 04:46:56 docker==7.1.0 04:46:56 dogpile.cache==1.5.0 04:46:56 durationpy==0.10 04:46:56 email-validator==2.3.0 04:46:56 filelock==3.20.0 04:46:56 future==1.0.0 04:46:56 gitdb==4.0.12 04:46:56 GitPython==3.1.45 04:46:56 google-auth==2.43.0 04:46:56 httplib2==0.31.0 04:46:56 identify==2.6.15 04:46:56 idna==3.11 04:46:56 importlib-resources==1.5.0 04:46:56 iso8601==2.1.0 04:46:56 Jinja2==3.1.6 04:46:56 jmespath==1.0.1 04:46:56 jsonpatch==1.33 04:46:56 jsonpointer==3.0.0 04:46:56 jsonschema==4.25.1 04:46:56 jsonschema-specifications==2025.9.1 04:46:56 keystoneauth1==5.12.0 04:46:56 kubernetes==34.1.0 04:46:56 lftools==0.37.16 04:46:56 lxml==6.0.2 04:46:56 markdown-it-py==4.0.0 04:46:56 MarkupSafe==3.0.3 04:46:56 mdurl==0.1.2 04:46:56 msgpack==1.1.2 04:46:56 multi_key_dict==2.0.3 04:46:56 munch==4.0.0 04:46:56 netaddr==1.3.0 04:46:56 niet==1.4.2 04:46:56 nodeenv==1.9.1 04:46:56 oauth2client==4.1.3 04:46:56 oauthlib==3.3.1 04:46:56 openstacksdk==4.8.0 04:46:56 os-service-types==1.8.2 04:46:56 osc-lib==4.2.0 04:46:56 oslo.config==10.1.0 04:46:56 oslo.context==6.2.0 04:46:56 oslo.i18n==6.7.1 04:46:56 oslo.log==7.2.1 04:46:56 oslo.serialization==5.8.0 04:46:56 oslo.utils==9.2.0 04:46:56 packaging==25.0 04:46:56 pbr==7.0.3 04:46:56 platformdirs==4.5.0 04:46:56 prettytable==3.17.0 04:46:56 psutil==7.1.3 04:46:56 pyasn1==0.6.1 04:46:56 pyasn1_modules==0.4.2 04:46:56 pycparser==2.23 04:46:56 pygerrit2==2.0.15 04:46:56 PyGithub==2.8.1 04:46:56 Pygments==2.19.2 04:46:56 PyJWT==2.10.1 04:46:56 PyNaCl==1.6.1 04:46:56 pyparsing==2.4.7 04:46:56 pyperclip==1.11.0 04:46:56 pyrsistent==0.20.0 04:46:56 python-cinderclient==9.8.0 04:46:56 python-dateutil==2.9.0.post0 04:46:56 python-heatclient==4.3.0 04:46:56 python-jenkins==1.8.3 04:46:56 python-keystoneclient==5.7.0 04:46:56 python-magnumclient==4.9.0 04:46:56 python-openstackclient==8.2.0 04:46:56 python-swiftclient==4.9.0 04:46:56 PyYAML==6.0.3 04:46:56 referencing==0.37.0 04:46:56 requests==2.32.5 04:46:56 requests-oauthlib==2.0.0 04:46:56 requestsexceptions==1.4.0 04:46:56 rfc3986==2.0.0 04:46:56 rich==14.2.0 04:46:56 rich-argparse==1.7.2 04:46:56 rpds-py==0.30.0 04:46:56 rsa==4.9.1 04:46:56 ruamel.yaml==0.18.16 04:46:56 ruamel.yaml.clib==0.2.15 04:46:56 s3transfer==0.16.0 04:46:56 simplejson==3.20.2 04:46:56 six==1.17.0 04:46:56 smmap==5.0.2 04:46:56 soupsieve==2.8 04:46:56 stevedore==5.6.0 04:46:56 tabulate==0.9.0 04:46:56 toml==0.10.2 04:46:56 tomlkit==0.13.3 04:46:56 tqdm==4.67.1 04:46:56 typing_extensions==4.15.0 04:46:56 tzdata==2025.2 04:46:56 urllib3==1.26.20 04:46:56 virtualenv==20.35.4 04:46:56 wcwidth==0.2.14 04:46:56 websocket-client==1.9.0 04:46:56 wrapt==2.0.1 04:46:56 xdg==6.0.0 04:46:56 xmltodict==1.0.2 04:46:56 yq==3.4.3 04:46:56 [EnvInject] - Injecting environment variables from a build step. 
04:46:56 [EnvInject] - Injecting as environment variables the properties content 04:46:56 OS_STACK_TEMPLATE=csit-2-instance-type.yaml 04:46:56 OS_CLOUD=vex 04:46:56 OS_STACK_NAME=releng-openflowplugin-csit-3node-clustering-only-scandium-465 04:46:56 OS_STACK_TEMPLATE_DIR=openstack-hot 04:46:56 04:46:56 [EnvInject] - Variables injected successfully. 04:46:56 provisioning config files... 04:46:56 copy managed file [clouds-yaml] to file:/home/jenkins/.config/openstack/clouds.yaml 04:46:56 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash /tmp/jenkins6938495954933002995.sh 04:46:56 ---> Create parameters file for OpenStack HOT 04:46:56 OpenStack Heat parameters generated 04:46:56 ----------------------------------- 04:46:56 parameters: 04:46:56 vm_0_count: '3' 04:46:56 vm_0_flavor: 'v3-standard-4' 04:46:56 vm_0_image: 'ZZCI - Ubuntu 22.04 - builder - x86_64 - 20250917-133034.447' 04:46:56 vm_1_count: '1' 04:46:56 vm_1_flavor: 'v3-standard-2' 04:46:56 vm_1_image: 'ZZCI - Ubuntu 22.04 - mininet-ovs-217 - x86_64 - 20250917-133034.654' 04:46:56 04:46:56 job_name: '24937-465' 04:46:56 silo: 'releng' 04:46:56 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash -l /tmp/jenkins2896646782252538377.sh 04:46:57 ---> Create HEAT stack 04:46:57 + source /home/jenkins/lf-env.sh 04:46:57 + lf-activate-venv --python python3 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient urllib3~=1.26.15 yq 04:46:57 ++ mktemp -d /tmp/venv-XXXX 04:46:57 + lf_venv=/tmp/venv-cXGT 04:46:57 + local venv_file=/tmp/.os_lf_venv 04:46:57 + local python=python3 04:46:57 + local options 04:46:57 + local set_path=true 04:46:57 + local install_args= 04:46:57 ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --python python3 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient urllib3~=1.26.15 yq 04:46:57 + options=' --python '\''python3'\'' -- '\''lftools[openstack]'\'' '\''kubernetes'\'' '\''niet'\'' '\''python-heatclient'\'' '\''python-openstackclient'\'' '\''python-magnumclient'\'' '\''urllib3~=1.26.15'\'' '\''yq'\''' 04:46:57 + eval set -- ' --python '\''python3'\'' -- '\''lftools[openstack]'\'' '\''kubernetes'\'' '\''niet'\'' '\''python-heatclient'\'' '\''python-openstackclient'\'' '\''python-magnumclient'\'' '\''urllib3~=1.26.15'\'' '\''yq'\''' 04:46:57 ++ set -- --python python3 -- 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient urllib3~=1.26.15 yq 04:46:57 + true 04:46:57 + case $1 in 04:46:57 + python=python3 04:46:57 + shift 2 04:46:57 + true 04:46:57 + case $1 in 04:46:57 + shift 04:46:57 + break 04:46:57 + case $python in 04:46:57 + local pkg_list= 04:46:57 + [[ -d /opt/pyenv ]] 04:46:57 + echo 'Setup pyenv:' 04:46:57 Setup pyenv: 04:46:57 + export PYENV_ROOT=/opt/pyenv 04:46:57 + PYENV_ROOT=/opt/pyenv 04:46:57 + export PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 04:46:57 + PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 04:46:57 + pyenv versions 04:46:57 system 04:46:57 3.8.13 04:46:57 3.9.13 04:46:57 3.10.13 04:46:57 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/.python-version) 04:46:57 + command -v pyenv 04:46:57 ++ pyenv init - --no-rehash 04:46:57 + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH); 
04:46:57 for i in ${!paths[@]}; do 04:46:57 if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\''; 04:46:57 fi; done; 04:46:57 echo "${paths[*]}"'\'')" 04:46:57 export PATH="/opt/pyenv/shims:${PATH}" 04:46:57 export PYENV_SHELL=bash 04:46:57 source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\'' 04:46:57 pyenv() { 04:46:57 local command 04:46:57 command="${1:-}" 04:46:57 if [ "$#" -gt 0 ]; then 04:46:57 shift 04:46:57 fi 04:46:57 04:46:57 case "$command" in 04:46:57 rehash|shell) 04:46:57 eval "$(pyenv "sh-$command" "$@")" 04:46:57 ;; 04:46:57 *) 04:46:57 command pyenv "$command" "$@" 04:46:57 ;; 04:46:57 esac 04:46:57 }' 04:46:57 +++ bash --norc -ec 'IFS=:; paths=($PATH); 04:46:57 for i in ${!paths[@]}; do 04:46:57 if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\''; 04:46:57 fi; done; 04:46:57 echo "${paths[*]}"' 04:46:57 ++ PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 04:46:57 ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 04:46:57 ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 04:46:57 ++ export PYENV_SHELL=bash 04:46:57 ++ PYENV_SHELL=bash 04:46:57 ++ source /opt/pyenv/libexec/../completions/pyenv.bash 04:46:57 +++ complete -F _pyenv pyenv 04:46:57 ++ lf-pyver python3 04:46:57 ++ local py_version_xy=python3 04:46:57 ++ local py_version_xyz= 04:46:57 ++ awk '{ print $1 }' 04:46:57 ++ pyenv versions 04:46:57 ++ local command 04:46:57 ++ command=versions 04:46:57 ++ '[' 1 -gt 0 ']' 04:46:57 ++ shift 04:46:57 ++ case "$command" in 04:46:57 ++ command pyenv versions 04:46:57 ++ pyenv versions 04:46:57 ++ sed 's/^[ *]* //' 04:46:57 ++ grep -E '^[0-9.]*[0-9]$' 04:46:57 ++ [[ ! 
-s /tmp/.pyenv_versions ]] 04:46:57 +++ tail -n 1 04:46:57 +++ grep '^3' /tmp/.pyenv_versions 04:46:57 +++ sort -V 04:46:57 ++ py_version_xyz=3.11.7 04:46:57 ++ [[ -z 3.11.7 ]] 04:46:57 ++ echo 3.11.7 04:46:57 ++ return 0 04:46:57 + pyenv local 3.11.7 04:46:57 + local command 04:46:57 + command=local 04:46:57 + '[' 2 -gt 0 ']' 04:46:57 + shift 04:46:57 + case "$command" in 04:46:57 + command pyenv local 3.11.7 04:46:57 + pyenv local 3.11.7 04:46:57 + for arg in "$@" 04:46:57 + case $arg in 04:46:57 + pkg_list+='lftools[openstack] ' 04:46:57 + for arg in "$@" 04:46:57 + case $arg in 04:46:57 + pkg_list+='kubernetes ' 04:46:57 + for arg in "$@" 04:46:57 + case $arg in 04:46:57 + pkg_list+='niet ' 04:46:57 + for arg in "$@" 04:46:57 + case $arg in 04:46:57 + pkg_list+='python-heatclient ' 04:46:57 + for arg in "$@" 04:46:57 + case $arg in 04:46:57 + pkg_list+='python-openstackclient ' 04:46:57 + for arg in "$@" 04:46:57 + case $arg in 04:46:57 + pkg_list+='python-magnumclient ' 04:46:57 + for arg in "$@" 04:46:57 + case $arg in 04:46:57 + pkg_list+='urllib3~=1.26.15 ' 04:46:57 + for arg in "$@" 04:46:57 + case $arg in 04:46:57 + pkg_list+='yq ' 04:46:57 + [[ -f /tmp/.os_lf_venv ]] 04:46:57 ++ cat /tmp/.os_lf_venv 04:46:57 + lf_venv=/tmp/venv-AAlE 04:46:57 + echo 'lf-activate-venv(): INFO: Reuse venv:/tmp/venv-AAlE from' file:/tmp/.os_lf_venv 04:46:57 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-AAlE from file:/tmp/.os_lf_venv 04:46:57 + echo 'lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)' 04:46:57 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) 04:46:57 + local 'pip_opts=--upgrade --quiet' 04:46:57 + pip_opts='--upgrade --quiet --trusted-host pypi.org' 04:46:57 + pip_opts='--upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org' 04:46:57 + pip_opts='--upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org' 04:46:57 + [[ -n '' ]] 04:46:57 + [[ -n '' ]] 04:46:57 + echo 'lf-activate-venv(): INFO: Attempting to install with network-safe options...' 04:46:57 lf-activate-venv(): INFO: Attempting to install with network-safe options... 
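The trace above shows lf-activate-venv reusing the venv recorded in /tmp/.os_lf_venv rather than creating a new one. A condensed sketch of that reuse logic (the real helper also handles option parsing, pyenv setup, and the trusted-host pip options):

    venv_file=/tmp/.os_lf_venv
    if [[ -f "$venv_file" ]]; then
        lf_venv=$(cat "$venv_file")             # e.g. /tmp/venv-AAlE
        echo "Reuse venv: $lf_venv"
    else
        lf_venv=$(mktemp -d /tmp/venv-XXXX)     # fresh venv directory
        python3 -m venv "$lf_venv"
        echo "$lf_venv" > "$venv_file"          # remember it for later build steps
    fi
    "$lf_venv/bin/python3" -m pip install --upgrade --quiet pip 'setuptools<66' virtualenv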
04:46:57 + /tmp/venv-AAlE/bin/python3 -m pip install --upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org pip 'setuptools<66' virtualenv 04:46:59 + echo 'lf-activate-venv(): INFO: Base packages installed successfully' 04:46:59 lf-activate-venv(): INFO: Base packages installed successfully 04:46:59 + [[ -z lftools[openstack] kubernetes niet python-heatclient python-openstackclient python-magnumclient urllib3~=1.26.15 yq ]] 04:46:59 + echo 'lf-activate-venv(): INFO: Installing additional packages: lftools[openstack] kubernetes niet python-heatclient python-openstackclient python-magnumclient urllib3~=1.26.15 yq ' 04:46:59 lf-activate-venv(): INFO: Installing additional packages: lftools[openstack] kubernetes niet python-heatclient python-openstackclient python-magnumclient urllib3~=1.26.15 yq 04:46:59 + /tmp/venv-AAlE/bin/python3 -m pip install --upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org --upgrade-strategy eager 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient urllib3~=1.26.15 yq 04:47:20 + type python3 04:47:20 + true 04:47:20 + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-AAlE/bin to PATH' 04:47:20 lf-activate-venv(): INFO: Adding /tmp/venv-AAlE/bin to PATH 04:47:20 + PATH=/tmp/venv-AAlE/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin 04:47:20 + return 0 04:47:20 + openstack --os-cloud vex limits show --absolute 04:47:20 +--------------------------+---------+ 04:47:20 | Name | Value | 04:47:20 +--------------------------+---------+ 04:47:20 | maxTotalInstances | -1 | 04:47:20 | maxTotalCores | 450 | 04:47:20 | maxTotalRAMSize | 1000000 | 04:47:20 | maxServerMeta | 128 | 04:47:20 | maxImageMeta | 128 | 04:47:20 | maxPersonality | 5 | 04:47:20 | maxPersonalitySize | 10240 | 04:47:20 | maxTotalKeypairs | 100 | 04:47:20 | maxServerGroups | 10 | 04:47:20 | maxServerGroupMembers | 10 | 04:47:20 | maxTotalFloatingIps | -1 | 04:47:20 | maxSecurityGroups | -1 | 04:47:20 | maxSecurityGroupRules | -1 | 04:47:20 | totalRAMUsed | 753664 | 04:47:20 | totalCoresUsed | 184 | 04:47:20 | totalInstancesUsed | 80 | 04:47:20 | totalFloatingIpsUsed | 0 | 04:47:20 | totalSecurityGroupsUsed | 0 | 04:47:20 | totalServerGroupsUsed | 0 | 04:47:20 | maxTotalVolumes | -1 | 04:47:20 | maxTotalSnapshots | 10 | 04:47:20 | maxTotalVolumeGigabytes | 4096 | 04:47:20 | maxTotalBackups | 10 | 04:47:20 | maxTotalBackupGigabytes | 1000 | 04:47:20 | totalVolumesUsed | 3 | 04:47:20 | totalGigabytesUsed | 60 | 04:47:20 | totalSnapshotsUsed | 0 | 04:47:20 | totalBackupsUsed | 0 | 04:47:20 | totalBackupGigabytesUsed | 0 | 04:47:20 +--------------------------+---------+ 04:47:20 + pushd /opt/ciman/openstack-hot 04:47:20 /opt/ciman/openstack-hot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium 04:47:20 + lftools openstack --os-cloud vex stack create releng-openflowplugin-csit-3node-clustering-only-scandium-465 csit-2-instance-type.yaml /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/stack-parameters.yaml 04:47:56 Creating stack releng-openflowplugin-csit-3node-clustering-only-scandium-465 04:47:56 Waiting to initialize infrastructure... 04:47:56 Waiting to initialize infrastructure... 04:47:56 Stack initialization successful. 
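The repeated 'Waiting to initialize infrastructure...' lines come from lftools polling Heat until the stack reaches a terminal state. A rough sketch of the equivalent loop with the plain openstack client (the real polling lives inside lftools and is not shown in this log):

    stack=releng-openflowplugin-csit-3node-clustering-only-scandium-465
    while true; do
        status=$(openstack --os-cloud vex stack show "$stack" -f value -c stack_status)
        case "$status" in
            CREATE_COMPLETE) echo "Stack initialization successful."; break ;;
            CREATE_FAILED)   echo "Stack create failed" >&2; exit 1 ;;
            *)               sleep 10 ;;        # e.g. CREATE_IN_PROGRESS
        esac
    done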
04:47:56 ------------------------------------ 04:47:56 Stack Details 04:47:56 ------------------------------------ 04:47:56 {'added': None, 04:47:56 'capabilities': [], 04:47:56 'created_at': '2025-12-02T04:47:22Z', 04:47:56 'deleted': None, 04:47:56 'deleted_at': None, 04:47:56 'description': 'No description', 04:47:56 'environment': None, 04:47:56 'environment_files': None, 04:47:56 'files': None, 04:47:56 'files_container': None, 04:47:56 'id': '057c0993-c138-4a8a-acf4-8ee47757639a', 04:47:56 'is_rollback_disabled': True, 04:47:56 'links': [{'href': 'https://orchestration.public.mtl1.vexxhost.net/v1/12c36e260d8e4bb2913965203b1b491f/stacks/releng-openflowplugin-csit-3node-clustering-only-scandium-465/057c0993-c138-4a8a-acf4-8ee47757639a', 04:47:56 'rel': 'self'}], 04:47:56 'location': Munch({'cloud': 'vex', 'region_name': 'ca-ymq-1', 'zone': None, 'project': Munch({'id': '12c36e260d8e4bb2913965203b1b491f', 'name': '61975f2c-7c17-4d69-82fa-c3ae420ad6fd', 'domain_id': None, 'domain_name': 'Default'})}), 04:47:56 'name': 'releng-openflowplugin-csit-3node-clustering-only-scandium-465', 04:47:56 'notification_topics': [], 04:47:56 'outputs': [{'description': 'IP addresses of the 2nd vm types', 04:47:56 'output_key': 'vm_1_ips', 04:47:56 'output_value': ['10.30.170.122']}, 04:47:56 {'description': 'IP addresses of the 1st vm types', 04:47:56 'output_key': 'vm_0_ips', 04:47:56 'output_value': ['10.30.170.174', 04:47:56 '10.30.170.199', 04:47:56 '10.30.171.237']}], 04:47:56 'owner_id': ****, 04:47:56 'parameters': {'OS::project_id': '12c36e260d8e4bb2913965203b1b491f', 04:47:56 'OS::stack_id': '057c0993-c138-4a8a-acf4-8ee47757639a', 04:47:56 'OS::stack_name': 'releng-openflowplugin-csit-3node-clustering-only-scandium-465', 04:47:56 'job_name': '24937-465', 04:47:56 'silo': 'releng', 04:47:56 'vm_0_count': '3', 04:47:56 'vm_0_flavor': 'v3-standard-4', 04:47:56 'vm_0_image': 'ZZCI - Ubuntu 22.04 - builder - x86_64 - ' 04:47:56 '20250917-133034.447', 04:47:56 'vm_1_count': '1', 04:47:56 'vm_1_flavor': 'v3-standard-2', 04:47:56 'vm_1_image': 'ZZCI - Ubuntu 22.04 - mininet-ovs-217 - x86_64 ' 04:47:56 '- 20250917-133034.654'}, 04:47:56 'parent_id': None, 04:47:56 'replaced': None, 04:47:56 'status': 'CREATE_COMPLETE', 04:47:56 'status_reason': 'Stack CREATE completed successfully', 04:47:56 'tags': [], 04:47:56 'template': None, 04:47:56 'template_description': 'No description', 04:47:56 'template_url': None, 04:47:56 'timeout_mins': 15, 04:47:56 'unchanged': None, 04:47:56 'updated': None, 04:47:56 'updated_at': None, 04:47:56 'user_project_id': '2a5dada656684a4684721e5d41a649e7'} 04:47:56 ------------------------------------ 04:47:56 + popd 04:47:56 /w/workspace/openflowplugin-csit-3node-clustering-only-scandium 04:47:56 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash -l /tmp/jenkins3584857388256314252.sh 04:47:56 ---> Copy SSH public keys to CSIT lab 04:47:56 Setup pyenv: 04:47:57 system 04:47:57 3.8.13 04:47:57 3.9.13 04:47:57 3.10.13 04:47:57 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/.python-version) 04:47:57 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-AAlE from file:/tmp/.os_lf_venv 04:47:57 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) 04:47:57 lf-activate-venv(): INFO: Attempting to install with network-safe options... 
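The outputs block in the stack details above is what later steps parse for node addresses; the same values can also be read one output at a time. A small sketch, assuming the heatclient plugin is available as it is in this venv:

    stack=releng-openflowplugin-csit-3node-clustering-only-scandium-465
    openstack --os-cloud vex stack output list "$stack"
    # Read just the builder IPs (vm_0_ips): ["10.30.170.174", "10.30.170.199", "10.30.171.237"]
    openstack --os-cloud vex stack output show "$stack" vm_0_ips -c output_value -f json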
04:47:59 lf-activate-venv(): INFO: Base packages installed successfully 04:47:59 lf-activate-venv(): INFO: Installing additional packages: lftools[openstack] kubernetes python-heatclient python-openstackclient urllib3~=1.26.15 04:48:13 lf-activate-venv(): INFO: Adding /tmp/venv-AAlE/bin to PATH 04:48:15 SSH not responding on 10.30.170.174. Retrying in 10 seconds... 04:48:15 SSH not responding on 10.30.170.199. Retrying in 10 seconds... 04:48:15 SSH not responding on 10.30.171.237. Retrying in 10 seconds... 04:48:15 Warning: Permanently added '10.30.170.122' (ECDSA) to the list of known hosts. 04:48:16 releng-24937-465-1-mininet-ovs-217-0 04:48:16 Successfully copied public keys to slave 10.30.170.122 04:48:16 Process 6500 ready. 04:48:25 Ping to 10.30.170.174 successful. 04:48:25 Ping to 10.30.170.199 successful. 04:48:25 Ping to 10.30.171.237 successful. 04:48:25 SSH not responding on 10.30.170.199. Retrying in 10 seconds... 04:48:26 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 04:48:26 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 04:48:26 releng-24937-465-0-builder-2 04:48:26 Successfully copied public keys to slave 10.30.171.237 04:48:26 releng-24937-465-0-builder-0 04:48:26 Successfully copied public keys to slave 10.30.170.174 04:48:26 Process 6501 ready. 04:48:35 Ping to 10.30.170.199 successful. 04:48:36 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 04:48:37 releng-24937-465-0-builder-1 04:48:37 Successfully copied public keys to slave 10.30.170.199 04:48:37 Process 6503 ready. 04:48:37 Process 6504 ready. 04:48:37 SSH ready on all stack servers. 04:48:37 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash -l /tmp/jenkins1904762375063129524.sh 04:48:37 Setup pyenv: 04:48:37 system 04:48:37 3.8.13 04:48:37 3.9.13 04:48:37 3.10.13 04:48:37 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/.python-version) 04:48:41 lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-4W6r 04:48:41 lf-activate-venv(): INFO: Save venv in file: /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/.robot_venv 04:48:41 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) 04:48:41 lf-activate-venv(): INFO: Attempting to install with network-safe options... 
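The retry messages above come from per-host readiness checks run in parallel, one background process per VM ('Process 6500 ready.' and friends). A simplified sketch of that pattern, assuming key-based SSH access as the jenkins user:

    wait_for_host() {
        local ip=$1
        until ping -c1 -W2 "$ip" >/dev/null 2>&1; do sleep 10; done
        echo "Ping to $ip successful."
        until ssh -o ConnectTimeout=5 -o StrictHostKeyChecking=no "$ip" true 2>/dev/null; do
            echo "SSH not responding on $ip. Retrying in 10 seconds..."
            sleep 10
        done
        echo "SSH ready on $ip."
    }
    for ip in 10.30.170.174 10.30.170.199 10.30.171.237 10.30.170.122; do
        wait_for_host "$ip" &      # one background process per host
    done
    wait
    echo "SSH ready on all stack servers."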
04:48:45 lf-activate-venv(): INFO: Base packages installed successfully 04:48:45 lf-activate-venv(): INFO: Installing additional packages: setuptools wheel 04:48:47 lf-activate-venv(): INFO: Adding /tmp/venv-4W6r/bin to PATH 04:48:47 + echo 'Installing Python Requirements' 04:48:47 Installing Python Requirements 04:48:47 + cat 04:48:47 + python -m pip install -r requirements.txt 04:48:47 Looking in indexes: https://nexus3.opendaylight.org/repository/PyPi/simple 04:48:47 Collecting docker-py (from -r requirements.txt (line 1)) 04:48:47 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/docker-py/1.10.6/docker_py-1.10.6-py2.py3-none-any.whl (50 kB) 04:48:47 Collecting ipaddr (from -r requirements.txt (line 2)) 04:48:47 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/ipaddr/2.2.0/ipaddr-2.2.0.tar.gz (26 kB) 04:48:47 Installing build dependencies: started 04:48:48 Installing build dependencies: finished with status 'done' 04:48:48 Getting requirements to build wheel: started 04:48:48 Getting requirements to build wheel: finished with status 'done' 04:48:48 Preparing metadata (pyproject.toml): started 04:48:49 Preparing metadata (pyproject.toml): finished with status 'done' 04:48:49 Collecting netaddr (from -r requirements.txt (line 3)) 04:48:49 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/netaddr/1.3.0/netaddr-1.3.0-py3-none-any.whl (2.3 MB) 04:48:49 Collecting netifaces (from -r requirements.txt (line 4)) 04:48:49 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/netifaces/0.11.0/netifaces-0.11.0.tar.gz (30 kB) 04:48:49 Installing build dependencies: started 04:48:50 Installing build dependencies: finished with status 'done' 04:48:50 Getting requirements to build wheel: started 04:48:50 Getting requirements to build wheel: finished with status 'done' 04:48:50 Preparing metadata (pyproject.toml): started 04:48:50 Preparing metadata (pyproject.toml): finished with status 'done' 04:48:50 Collecting pyhocon (from -r requirements.txt (line 5)) 04:48:50 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pyhocon/0.3.61/pyhocon-0.3.61-py3-none-any.whl (25 kB) 04:48:50 Collecting requests (from -r requirements.txt (line 6)) 04:48:50 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/requests/2.32.5/requests-2.32.5-py3-none-any.whl (64 kB) 04:48:50 Collecting robotframework (from -r requirements.txt (line 7)) 04:48:50 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework/7.3.2/robotframework-7.3.2-py3-none-any.whl (795 kB) 04:48:50 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 795.1/795.1 kB 29.2 MB/s 0:00:00 04:48:50 Collecting robotframework-httplibrary (from -r requirements.txt (line 8)) 04:48:50 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-httplibrary/0.4.2/robotframework-httplibrary-0.4.2.tar.gz (9.1 kB) 04:48:50 Installing build dependencies: started 04:48:51 Installing build dependencies: finished with status 'done' 04:48:51 Getting requirements to build wheel: started 04:48:52 Getting requirements to build wheel: finished with status 'done' 04:48:52 Preparing metadata (pyproject.toml): started 04:48:52 Preparing metadata (pyproject.toml): finished with status 'done' 04:48:52 Collecting robotframework-requests==0.9.7 (from -r requirements.txt (line 9)) 04:48:52 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-requests/0.9.7/robotframework_requests-0.9.7-py3-none-any.whl (21 kB) 
04:48:52 Collecting robotframework-selenium2library (from -r requirements.txt (line 10)) 04:48:52 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-selenium2library/3.0.0/robotframework_selenium2library-3.0.0-py2.py3-none-any.whl (6.2 kB) 04:48:52 Collecting robotframework-sshlibrary==3.8.0 (from -r requirements.txt (line 11)) 04:48:52 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-sshlibrary/3.8.0/robotframework-sshlibrary-3.8.0.tar.gz (51 kB) 04:48:52 Installing build dependencies: started 04:48:53 Installing build dependencies: finished with status 'done' 04:48:53 Getting requirements to build wheel: started 04:48:53 Getting requirements to build wheel: finished with status 'done' 04:48:53 Preparing metadata (pyproject.toml): started 04:48:54 Preparing metadata (pyproject.toml): finished with status 'done' 04:48:54 Collecting scapy (from -r requirements.txt (line 12)) 04:48:54 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/scapy/2.6.1/scapy-2.6.1-py3-none-any.whl (2.4 MB) 04:48:54 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.4/2.4 MB 47.1 MB/s 0:00:00 04:48:54 Collecting jsonpath-rw (from -r requirements.txt (line 15)) 04:48:54 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/jsonpath-rw/1.4.0/jsonpath-rw-1.4.0.tar.gz (13 kB) 04:48:54 Installing build dependencies: started 04:48:55 Installing build dependencies: finished with status 'done' 04:48:55 Getting requirements to build wheel: started 04:48:55 Getting requirements to build wheel: finished with status 'done' 04:48:55 Preparing metadata (pyproject.toml): started 04:48:55 Preparing metadata (pyproject.toml): finished with status 'done' 04:48:55 Collecting elasticsearch (from -r requirements.txt (line 18)) 04:48:55 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch/9.2.0/elasticsearch-9.2.0-py3-none-any.whl (960 kB) 04:48:55 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 960.5/960.5 kB 36.7 MB/s 0:00:00 04:48:55 Collecting elasticsearch-dsl (from -r requirements.txt (line 19)) 04:48:55 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch-dsl/8.18.0/elasticsearch_dsl-8.18.0-py3-none-any.whl (10 kB) 04:48:55 Collecting pyangbind (from -r requirements.txt (line 22)) 04:48:55 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pyangbind/0.8.6/pyangbind-0.8.6-py3-none-any.whl (52 kB) 04:48:56 Collecting isodate (from -r requirements.txt (line 25)) 04:48:56 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/isodate/0.7.2/isodate-0.7.2-py3-none-any.whl (22 kB) 04:48:56 Collecting jmespath (from -r requirements.txt (line 28)) 04:48:56 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/jmespath/1.0.1/jmespath-1.0.1-py3-none-any.whl (20 kB) 04:48:56 Collecting jsonpatch (from -r requirements.txt (line 31)) 04:48:56 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/jsonpatch/1.33/jsonpatch-1.33-py2.py3-none-any.whl (12 kB) 04:48:56 Collecting paramiko>=1.15.3 (from robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 04:48:56 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/paramiko/4.0.0/paramiko-4.0.0-py3-none-any.whl (223 kB) 04:48:56 Collecting scp>=0.13.0 (from robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 04:48:56 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/scp/0.15.0/scp-0.15.0-py2.py3-none-any.whl (8.8 kB) 04:48:56 
Collecting docker-pycreds>=0.2.1 (from docker-py->-r requirements.txt (line 1)) 04:48:56 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/docker-pycreds/0.4.0/docker_pycreds-0.4.0-py2.py3-none-any.whl (9.0 kB) 04:48:56 Collecting six>=1.4.0 (from docker-py->-r requirements.txt (line 1)) 04:48:56 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/six/1.17.0/six-1.17.0-py2.py3-none-any.whl (11 kB) 04:48:56 Collecting websocket-client>=0.32.0 (from docker-py->-r requirements.txt (line 1)) 04:48:56 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/websocket-client/1.9.0/websocket_client-1.9.0-py3-none-any.whl (82 kB) 04:48:56 Collecting pyparsing<4,>=2 (from pyhocon->-r requirements.txt (line 5)) 04:48:56 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/pyparsing/3.2.5/pyparsing-3.2.5-py3-none-any.whl (113 kB) 04:48:56 Collecting charset_normalizer<4,>=2 (from requests->-r requirements.txt (line 6)) 04:48:56 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/charset-normalizer/3.4.4/charset_normalizer-3.4.4-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl (151 kB) 04:48:56 Collecting idna<4,>=2.5 (from requests->-r requirements.txt (line 6)) 04:48:56 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/idna/3.11/idna-3.11-py3-none-any.whl (71 kB) 04:48:56 Collecting urllib3<3,>=1.21.1 (from requests->-r requirements.txt (line 6)) 04:48:56 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/urllib3/2.5.0/urllib3-2.5.0-py3-none-any.whl (129 kB) 04:48:56 Collecting certifi>=2017.4.17 (from requests->-r requirements.txt (line 6)) 04:48:56 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/certifi/2025.11.12/certifi-2025.11.12-py3-none-any.whl (159 kB) 04:48:56 Collecting webtest>=2.0 (from robotframework-httplibrary->-r requirements.txt (line 8)) 04:48:56 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/webtest/3.0.7/webtest-3.0.7-py3-none-any.whl (32 kB) 04:48:57 Collecting jsonpointer (from robotframework-httplibrary->-r requirements.txt (line 8)) 04:48:57 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/jsonpointer/3.0.0/jsonpointer-3.0.0-py2.py3-none-any.whl (7.6 kB) 04:48:57 Collecting robotframework-seleniumlibrary>=3.0.0 (from robotframework-selenium2library->-r requirements.txt (line 10)) 04:48:57 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-seleniumlibrary/6.8.0/robotframework_seleniumlibrary-6.8.0-py3-none-any.whl (104 kB) 04:48:57 Collecting ply (from jsonpath-rw->-r requirements.txt (line 15)) 04:48:57 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/ply/3.11/ply-3.11-py2.py3-none-any.whl (49 kB) 04:48:57 Collecting decorator (from jsonpath-rw->-r requirements.txt (line 15)) 04:48:57 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/decorator/5.2.1/decorator-5.2.1-py3-none-any.whl (9.2 kB) 04:48:57 Collecting anyio (from elasticsearch->-r requirements.txt (line 18)) 04:48:57 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/anyio/4.12.0/anyio-4.12.0-py3-none-any.whl (113 kB) 04:48:57 Collecting elastic-transport<10,>=9.2.0 (from elasticsearch->-r requirements.txt (line 18)) 04:48:57 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elastic-transport/9.2.0/elastic_transport-9.2.0-py3-none-any.whl (65 kB) 04:48:57 Collecting python-dateutil (from 
elasticsearch->-r requirements.txt (line 18)) 04:48:57 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/python-dateutil/2.9.0.post0/python_dateutil-2.9.0.post0-py2.py3-none-any.whl (229 kB) 04:48:57 Collecting sniffio (from elasticsearch->-r requirements.txt (line 18)) 04:48:57 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/sniffio/1.3.1/sniffio-1.3.1-py3-none-any.whl (10 kB) 04:48:57 Collecting typing-extensions (from elasticsearch->-r requirements.txt (line 18)) 04:48:57 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/typing-extensions/4.15.0/typing_extensions-4.15.0-py3-none-any.whl (44 kB) 04:48:57 Collecting elasticsearch (from -r requirements.txt (line 18)) 04:48:57 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch/8.19.2/elasticsearch-8.19.2-py3-none-any.whl (949 kB) 04:48:57 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 949.7/949.7 kB 30.4 MB/s 0:00:00 04:48:57 INFO: pip is looking at multiple versions of elasticsearch-dsl to determine which version is compatible with other requirements. This could take a while. 04:48:57 Collecting elasticsearch-dsl (from -r requirements.txt (line 19)) 04:48:57 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch-dsl/8.17.1/elasticsearch_dsl-8.17.1-py3-none-any.whl (158 kB) 04:48:57 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch-dsl/8.17.0/elasticsearch_dsl-8.17.0-py3-none-any.whl (158 kB) 04:48:57 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch-dsl/8.16.0/elasticsearch_dsl-8.16.0-py3-none-any.whl (158 kB) 04:48:57 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch-dsl/8.15.4/elasticsearch_dsl-8.15.4-py3-none-any.whl (104 kB) 04:48:57 Collecting elastic-transport<9,>=8.15.1 (from elasticsearch->-r requirements.txt (line 18)) 04:48:57 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elastic-transport/8.17.1/elastic_transport-8.17.1-py3-none-any.whl (64 kB) 04:48:57 Collecting pyang (from pyangbind->-r requirements.txt (line 22)) 04:48:57 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pyang/2.7.1/pyang-2.7.1-py2.py3-none-any.whl (598 kB) 04:48:57 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 598.5/598.5 kB 33.5 MB/s 0:00:00 04:48:58 Collecting lxml (from pyangbind->-r requirements.txt (line 22)) 04:48:58 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/lxml/6.0.2/lxml-6.0.2-cp311-cp311-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl (5.2 MB) 04:48:59 Collecting regex (from pyangbind->-r requirements.txt (line 22)) 04:48:59 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/regex/2025.11.3/regex-2025.11.3-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl (800 kB) 04:48:59 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 800.4/800.4 kB 32.6 MB/s 0:00:00 04:48:59 Collecting enum34 (from pyangbind->-r requirements.txt (line 22)) 04:48:59 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/enum34/1.1.10/enum34-1.1.10-py3-none-any.whl (11 kB) 04:48:59 Collecting bcrypt>=3.2 (from paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 04:48:59 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/bcrypt/5.0.0/bcrypt-5.0.0-cp39-abi3-manylinux_2_28_x86_64.whl (278 kB) 04:49:00 Collecting cryptography>=3.3 (from paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt 
(line 11)) 04:49:00 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/cryptography/46.0.3/cryptography-46.0.3-cp311-abi3-manylinux_2_28_x86_64.whl (4.5 MB) 04:49:00 Collecting invoke>=2.0 (from paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 04:49:00 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/invoke/2.2.1/invoke-2.2.1-py3-none-any.whl (160 kB) 04:49:00 Collecting pynacl>=1.5 (from paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 04:49:00 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/pynacl/1.6.1/pynacl-1.6.1-cp38-abi3-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl (1.4 MB) 04:49:00 Collecting cffi>=2.0.0 (from cryptography>=3.3->paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 04:49:00 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/cffi/2.0.0/cffi-2.0.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (215 kB) 04:49:00 Collecting pycparser (from cffi>=2.0.0->cryptography>=3.3->paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) 04:49:00 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/pycparser/2.23/pycparser-2.23-py3-none-any.whl (118 kB) 04:49:00 Collecting selenium>=4.3.0 (from robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:49:00 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/selenium/4.38.0/selenium-4.38.0-py3-none-any.whl (9.7 MB) 04:49:00 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 9.7/9.7 MB 60.9 MB/s 0:00:00 04:49:00 Collecting robotframework-pythonlibcore>=4.4.1 (from robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:49:00 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-pythonlibcore/4.4.1/robotframework_pythonlibcore-4.4.1-py2.py3-none-any.whl (12 kB) 04:49:01 Collecting click>=8.0 (from robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:49:01 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/click/8.3.1/click-8.3.1-py3-none-any.whl (108 kB) 04:49:01 Collecting trio<1.0,>=0.31.0 (from selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:49:01 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/trio/0.32.0/trio-0.32.0-py3-none-any.whl (512 kB) 04:49:01 Collecting trio-websocket<1.0,>=0.12.2 (from selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:49:01 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/trio-websocket/0.12.2/trio_websocket-0.12.2-py3-none-any.whl (21 kB) 04:49:01 Collecting attrs>=23.2.0 (from trio<1.0,>=0.31.0->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:49:01 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/attrs/25.4.0/attrs-25.4.0-py3-none-any.whl (67 kB) 04:49:01 Collecting sortedcontainers (from trio<1.0,>=0.31.0->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:49:01 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/sortedcontainers/2.4.0/sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB) 04:49:01 
Collecting outcome (from trio<1.0,>=0.31.0->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:49:01 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/outcome/1.3.0.post0/outcome-1.3.0.post0-py2.py3-none-any.whl (10 kB) 04:49:01 Collecting wsproto>=0.14 (from trio-websocket<1.0,>=0.12.2->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:49:01 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/wsproto/1.3.2/wsproto-1.3.2-py3-none-any.whl (24 kB) 04:49:01 Collecting pysocks!=1.5.7,<2.0,>=1.5.6 (from urllib3[socks]<3.0,>=2.5.0->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:49:01 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pysocks/1.7.1/PySocks-1.7.1-py3-none-any.whl (16 kB) 04:49:01 Collecting WebOb>=1.2 (from webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) 04:49:01 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/webob/1.8.9/WebOb-1.8.9-py2.py3-none-any.whl (115 kB) 04:49:01 Collecting waitress>=3.0.2 (from webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) 04:49:01 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/waitress/3.0.2/waitress-3.0.2-py3-none-any.whl (56 kB) 04:49:01 Collecting beautifulsoup4 (from webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) 04:49:01 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/beautifulsoup4/4.14.3/beautifulsoup4-4.14.3-py3-none-any.whl (107 kB) 04:49:01 Collecting h11<1,>=0.16.0 (from wsproto>=0.14->trio-websocket<1.0,>=0.12.2->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) 04:49:01 Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/h11/0.16.0/h11-0.16.0-py3-none-any.whl (37 kB) 04:49:01 Collecting soupsieve>=1.6.1 (from beautifulsoup4->webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) 04:49:01 Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/soupsieve/2.8/soupsieve-2.8-py3-none-any.whl (36 kB) 04:49:02 Building wheels for collected packages: robotframework-sshlibrary, ipaddr, netifaces, robotframework-httplibrary, jsonpath-rw 04:49:02 Building wheel for robotframework-sshlibrary (pyproject.toml): started 04:49:02 Building wheel for robotframework-sshlibrary (pyproject.toml): finished with status 'done' 04:49:02 Created wheel for robotframework-sshlibrary: filename=robotframework_sshlibrary-3.8.0-py3-none-any.whl size=55205 sha256=564f08435c3e243f51aea851c5fc32e23186a98a48a5631b2564447609b5b84e 04:49:02 Stored in directory: /home/jenkins/.cache/pip/wheels/f7/c9/b3/a977b7bcc410d45ae27d240df3d00a12585509180e373ecccc 04:49:02 Building wheel for ipaddr (pyproject.toml): started 04:49:02 Building wheel for ipaddr (pyproject.toml): finished with status 'done' 04:49:02 Created wheel for ipaddr: filename=ipaddr-2.2.0-py3-none-any.whl size=18353 sha256=1fd178032149b35d62011594338444f1217961bbaeee69db3cb27b5cc81b0dad 04:49:02 Stored in directory: /home/jenkins/.cache/pip/wheels/dc/6c/04/da2d847fa8d45c59af3e1d83e2acc29cb8adcbaf04c0898dbf 04:49:02 Building wheel for netifaces (pyproject.toml): started 04:49:05 Building wheel for netifaces (pyproject.toml): finished with status 'done' 04:49:05 Created wheel for netifaces: 
filename=netifaces-0.11.0-cp311-cp311-linux_x86_64.whl size=41066 sha256=e4af08404bafd64452238f3e3064878ca62be961954d7a44ffc3db247c2e8079 04:49:05 Stored in directory: /home/jenkins/.cache/pip/wheels/f8/18/88/e61d54b995bea304bdb1d040a92b72228a1bf72ca2a3eba7c9 04:49:05 Building wheel for robotframework-httplibrary (pyproject.toml): started 04:49:05 Building wheel for robotframework-httplibrary (pyproject.toml): finished with status 'done' 04:49:05 Created wheel for robotframework-httplibrary: filename=robotframework_httplibrary-0.4.2-py3-none-any.whl size=10014 sha256=a1bcc73f1cd8f2132046faa38435dccea8e1f7b4206480cb9739f49537e807b7 04:49:05 Stored in directory: /home/jenkins/.cache/pip/wheels/aa/bc/0d/9a20dd51effef392aae2733cb4c7b66c6fa29fca33d88b57ed 04:49:05 Building wheel for jsonpath-rw (pyproject.toml): started 04:49:05 Building wheel for jsonpath-rw (pyproject.toml): finished with status 'done' 04:49:05 Created wheel for jsonpath-rw: filename=jsonpath_rw-1.4.0-py3-none-any.whl size=15176 sha256=2c91f797797dc5bca87e5317c9dca42106afabdefd2bb07afd7657acb2c962cd 04:49:05 Stored in directory: /home/jenkins/.cache/pip/wheels/f1/54/63/9a8da38cefae13755097b36cc852decc25d8ef69c37d58d4eb 04:49:05 Successfully built robotframework-sshlibrary ipaddr netifaces robotframework-httplibrary jsonpath-rw 04:49:05 Installing collected packages: sortedcontainers, ply, netifaces, ipaddr, enum34, websocket-client, WebOb, waitress, urllib3, typing-extensions, soupsieve, sniffio, six, scapy, robotframework-pythonlibcore, robotframework, regex, pysocks, pyparsing, pycparser, netaddr, lxml, jsonpointer, jmespath, isodate, invoke, idna, h11, decorator, click, charset_normalizer, certifi, bcrypt, attrs, wsproto, requests, python-dateutil, pyhocon, pyang, outcome, jsonpath-rw, jsonpatch, elastic-transport, docker-pycreds, cffi, beautifulsoup4, webtest, trio, robotframework-requests, pynacl, pyangbind, elasticsearch, docker-py, cryptography, trio-websocket, robotframework-httplibrary, paramiko, elasticsearch-dsl, selenium, scp, robotframework-sshlibrary, robotframework-seleniumlibrary, robotframework-selenium2library 04:49:12 04:49:12 Successfully installed WebOb-1.8.9 attrs-25.4.0 bcrypt-5.0.0 beautifulsoup4-4.14.3 certifi-2025.11.12 cffi-2.0.0 charset_normalizer-3.4.4 click-8.3.1 cryptography-46.0.3 decorator-5.2.1 docker-py-1.10.6 docker-pycreds-0.4.0 elastic-transport-8.17.1 elasticsearch-8.19.2 elasticsearch-dsl-8.15.4 enum34-1.1.10 h11-0.16.0 idna-3.11 invoke-2.2.1 ipaddr-2.2.0 isodate-0.7.2 jmespath-1.0.1 jsonpatch-1.33 jsonpath-rw-1.4.0 jsonpointer-3.0.0 lxml-6.0.2 netaddr-1.3.0 netifaces-0.11.0 outcome-1.3.0.post0 paramiko-4.0.0 ply-3.11 pyang-2.7.1 pyangbind-0.8.6 pycparser-2.23 pyhocon-0.3.61 pynacl-1.6.1 pyparsing-3.2.5 pysocks-1.7.1 python-dateutil-2.9.0.post0 regex-2025.11.3 requests-2.32.5 robotframework-7.3.2 robotframework-httplibrary-0.4.2 robotframework-pythonlibcore-4.4.1 robotframework-requests-0.9.7 robotframework-selenium2library-3.0.0 robotframework-seleniumlibrary-6.8.0 robotframework-sshlibrary-3.8.0 scapy-2.6.1 scp-0.15.0 selenium-4.38.0 six-1.17.0 sniffio-1.3.1 sortedcontainers-2.4.0 soupsieve-2.8 trio-0.32.0 trio-websocket-0.12.2 typing-extensions-4.15.0 urllib3-2.5.0 waitress-3.0.2 websocket-client-1.9.0 webtest-3.0.7 wsproto-1.3.2 04:49:12 + pip freeze 04:49:13 attrs==25.4.0 04:49:13 bcrypt==5.0.0 04:49:13 beautifulsoup4==4.14.3 04:49:13 certifi==2025.11.12 04:49:13 cffi==2.0.0 04:49:13 charset-normalizer==3.4.4 04:49:13 click==8.3.1 04:49:13 cryptography==46.0.3 04:49:13 
decorator==5.2.1 04:49:13 distlib==0.4.0 04:49:13 docker-py==1.10.6 04:49:13 docker-pycreds==0.4.0 04:49:13 elastic-transport==8.17.1 04:49:13 elasticsearch==8.19.2 04:49:13 elasticsearch-dsl==8.15.4 04:49:13 enum34==1.1.10 04:49:13 filelock==3.20.0 04:49:13 h11==0.16.0 04:49:13 idna==3.11 04:49:13 invoke==2.2.1 04:49:13 ipaddr==2.2.0 04:49:13 isodate==0.7.2 04:49:13 jmespath==1.0.1 04:49:13 jsonpatch==1.33 04:49:13 jsonpath-rw==1.4.0 04:49:13 jsonpointer==3.0.0 04:49:13 lxml==6.0.2 04:49:13 netaddr==1.3.0 04:49:13 netifaces==0.11.0 04:49:13 outcome==1.3.0.post0 04:49:13 paramiko==4.0.0 04:49:13 platformdirs==4.5.0 04:49:13 ply==3.11 04:49:13 pyang==2.7.1 04:49:13 pyangbind==0.8.6 04:49:13 pycparser==2.23 04:49:13 pyhocon==0.3.61 04:49:13 PyNaCl==1.6.1 04:49:13 pyparsing==3.2.5 04:49:13 PySocks==1.7.1 04:49:13 python-dateutil==2.9.0.post0 04:49:13 regex==2025.11.3 04:49:13 requests==2.32.5 04:49:13 robotframework==7.3.2 04:49:13 robotframework-httplibrary==0.4.2 04:49:13 robotframework-pythonlibcore==4.4.1 04:49:13 robotframework-requests==0.9.7 04:49:13 robotframework-selenium2library==3.0.0 04:49:13 robotframework-seleniumlibrary==6.8.0 04:49:13 robotframework-sshlibrary==3.8.0 04:49:13 scapy==2.6.1 04:49:13 scp==0.15.0 04:49:13 selenium==4.38.0 04:49:13 six==1.17.0 04:49:13 sniffio==1.3.1 04:49:13 sortedcontainers==2.4.0 04:49:13 soupsieve==2.8 04:49:13 trio==0.32.0 04:49:13 trio-websocket==0.12.2 04:49:13 typing_extensions==4.15.0 04:49:13 urllib3==2.5.0 04:49:13 virtualenv==20.35.4 04:49:13 waitress==3.0.2 04:49:13 WebOb==1.8.9 04:49:13 websocket-client==1.9.0 04:49:13 WebTest==3.0.7 04:49:13 wsproto==1.3.2 04:49:13 [EnvInject] - Injecting environment variables from a build step. 04:49:13 [EnvInject] - Injecting as environment variables the properties file path 'env.properties' 04:49:13 [EnvInject] - Variables injected successfully. 04:49:13 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash -l /tmp/jenkins12491988782885946196.sh 04:49:13 Setup pyenv: 04:49:13 system 04:49:13 3.8.13 04:49:13 3.9.13 04:49:13 3.10.13 04:49:13 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/.python-version) 04:49:13 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-AAlE from file:/tmp/.os_lf_venv 04:49:13 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) 04:49:13 lf-activate-venv(): INFO: Attempting to install with network-safe options... 04:49:15 lf-activate-venv(): INFO: Base packages installed successfully 04:49:15 lf-activate-venv(): INFO: Installing additional packages: python-heatclient python-openstackclient yq 04:49:24 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 04:49:24 lftools 0.37.16 requires urllib3<2.1.0, but you have urllib3 2.5.0 which is incompatible. 04:49:24 kubernetes 34.1.0 requires urllib3<2.4.0,>=1.24.2, but you have urllib3 2.5.0 which is incompatible. 
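This urllib3 conflict is the same one other steps in this job avoid by pinning urllib3 alongside the OpenStack clients (the 'urllib3~=1.26.15' argument seen earlier). A sketch of the equivalent fix for this step, assuming the shared venv path from the log:

    # Install this step's extras with a urllib3 pin that satisfies both
    # lftools (<2.1.0) and kubernetes (>=1.24.2,<2.4.0) instead of letting
    # the eager upgrade pull urllib3 2.5.0.
    /tmp/venv-AAlE/bin/python3 -m pip install --upgrade \
        python-heatclient python-openstackclient yq 'urllib3~=1.26.15'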
04:49:24 lf-activate-venv(): INFO: Adding /tmp/venv-AAlE/bin to PATH 04:49:24 + ODL_SYSTEM=() 04:49:24 + TOOLS_SYSTEM=() 04:49:24 + OPENSTACK_SYSTEM=() 04:49:24 + OPENSTACK_CONTROLLERS=() 04:49:24 + mapfile -t ADDR 04:49:24 ++ openstack stack show -f json -c outputs releng-openflowplugin-csit-3node-clustering-only-scandium-465 04:49:24 ++ jq -r '.outputs[] | select(.output_key | match("^vm_[0-9]+_ips$")) | .output_value | .[]' 04:49:26 + for i in "${ADDR[@]}" 04:49:26 ++ ssh 10.30.170.174 hostname -s 04:49:26 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 04:49:26 + REMHOST=releng-24937-465-0-builder-0 04:49:26 + case ${REMHOST} in 04:49:26 + ODL_SYSTEM=("${ODL_SYSTEM[@]}" "${i}") 04:49:26 + for i in "${ADDR[@]}" 04:49:26 ++ ssh 10.30.170.199 hostname -s 04:49:26 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 04:49:26 + REMHOST=releng-24937-465-0-builder-1 04:49:26 + case ${REMHOST} in 04:49:26 + ODL_SYSTEM=("${ODL_SYSTEM[@]}" "${i}") 04:49:26 + for i in "${ADDR[@]}" 04:49:26 ++ ssh 10.30.171.237 hostname -s 04:49:26 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 04:49:27 + REMHOST=releng-24937-465-0-builder-2 04:49:27 + case ${REMHOST} in 04:49:27 + ODL_SYSTEM=("${ODL_SYSTEM[@]}" "${i}") 04:49:27 + for i in "${ADDR[@]}" 04:49:27 ++ ssh 10.30.170.122 hostname -s 04:49:27 Warning: Permanently added '10.30.170.122' (ECDSA) to the list of known hosts. 04:49:28 + REMHOST=releng-24937-465-1-mininet-ovs-217-0 04:49:28 + case ${REMHOST} in 04:49:28 + TOOLS_SYSTEM=("${TOOLS_SYSTEM[@]}" "${i}") 04:49:28 + echo NUM_ODL_SYSTEM=3 04:49:28 + echo NUM_TOOLS_SYSTEM=1 04:49:28 + '[' '' == yes ']' 04:49:28 + NUM_OPENSTACK_SYSTEM=0 04:49:28 + echo NUM_OPENSTACK_SYSTEM=0 04:49:28 + '[' 0 -eq 2 ']' 04:49:28 + echo ODL_SYSTEM_IP=10.30.170.174 04:49:28 ++ seq 0 2 04:49:28 + for i in $(seq 0 $(( ${#ODL_SYSTEM[@]} - 1 ))) 04:49:28 + echo ODL_SYSTEM_1_IP=10.30.170.174 04:49:28 + for i in $(seq 0 $(( ${#ODL_SYSTEM[@]} - 1 ))) 04:49:28 + echo ODL_SYSTEM_2_IP=10.30.170.199 04:49:28 + for i in $(seq 0 $(( ${#ODL_SYSTEM[@]} - 1 ))) 04:49:28 + echo ODL_SYSTEM_3_IP=10.30.171.237 04:49:28 + echo TOOLS_SYSTEM_IP=10.30.170.122 04:49:28 ++ seq 0 0 04:49:28 + for i in $(seq 0 $(( ${#TOOLS_SYSTEM[@]} - 1 ))) 04:49:28 + echo TOOLS_SYSTEM_1_IP=10.30.170.122 04:49:28 + openstack_index=0 04:49:28 + NUM_OPENSTACK_CONTROL_NODES=1 04:49:28 + echo NUM_OPENSTACK_CONTROL_NODES=1 04:49:28 ++ seq 0 0 04:49:28 + for i in $(seq 0 $((NUM_OPENSTACK_CONTROL_NODES - 1))) 04:49:28 + echo OPENSTACK_CONTROL_NODE_1_IP= 04:49:28 + NUM_OPENSTACK_COMPUTE_NODES=-1 04:49:28 + echo NUM_OPENSTACK_COMPUTE_NODES=-1 04:49:28 + '[' -1 -ge 2 ']' 04:49:28 ++ seq 0 -2 04:49:28 + NUM_OPENSTACK_HAPROXY_NODES=0 04:49:28 + echo NUM_OPENSTACK_HAPROXY_NODES=0 04:49:28 ++ seq 0 -1 04:49:28 + echo 'Contents of slave_addresses.txt:' 04:49:28 Contents of slave_addresses.txt: 04:49:28 + cat slave_addresses.txt 04:49:28 NUM_ODL_SYSTEM=3 04:49:28 NUM_TOOLS_SYSTEM=1 04:49:28 NUM_OPENSTACK_SYSTEM=0 04:49:28 ODL_SYSTEM_IP=10.30.170.174 04:49:28 ODL_SYSTEM_1_IP=10.30.170.174 04:49:28 ODL_SYSTEM_2_IP=10.30.170.199 04:49:28 ODL_SYSTEM_3_IP=10.30.171.237 04:49:28 TOOLS_SYSTEM_IP=10.30.170.122 04:49:28 TOOLS_SYSTEM_1_IP=10.30.170.122 04:49:28 NUM_OPENSTACK_CONTROL_NODES=1 04:49:28 OPENSTACK_CONTROL_NODE_1_IP= 04:49:28 NUM_OPENSTACK_COMPUTE_NODES=-1 04:49:28 NUM_OPENSTACK_HAPROXY_NODES=0 04:49:28 [EnvInject] - Injecting environment variables from a build step. 
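The trace above resolves each stack IP to its hostname and sorts it into the controller or tools list by name. A condensed sketch of that classification, assuming the hostname patterns visible in this log (the real script also writes the slave_addresses.txt shown above):

    mapfile -t ADDR < <(openstack stack show -f json -c outputs "$OS_STACK_NAME" |
        jq -r '.outputs[] | select(.output_key | match("^vm_[0-9]+_ips$")) | .output_value | .[]')
    ODL_SYSTEM=(); TOOLS_SYSTEM=()
    for ip in "${ADDR[@]}"; do
        remhost=$(ssh "$ip" hostname -s)
        case "$remhost" in
            *builder*) ODL_SYSTEM+=("$ip") ;;   # releng-24937-465-0-builder-N
            *)         TOOLS_SYSTEM+=("$ip") ;; # releng-24937-465-1-mininet-ovs-217-0
        esac
    done
    echo "NUM_ODL_SYSTEM=${#ODL_SYSTEM[@]}"     # 3
    echo "NUM_TOOLS_SYSTEM=${#TOOLS_SYSTEM[@]}" # 1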
04:49:28 [EnvInject] - Injecting as environment variables the properties file path 'slave_addresses.txt' 04:49:28 [EnvInject] - Variables injected successfully. 04:49:28 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/sh /tmp/jenkins10962574887244303481.sh 04:49:28 Preparing for JRE Version 21 04:49:28 Karaf artifact is karaf 04:49:28 Karaf project is integration 04:49:28 Java home is /usr/lib/jvm/java-21-openjdk-amd64 04:49:28 [EnvInject] - Injecting environment variables from a build step. 04:49:28 [EnvInject] - Injecting as environment variables the properties file path 'set_variables.env' 04:49:28 [EnvInject] - Variables injected successfully. 04:49:28 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash /tmp/jenkins14843716371818951852.sh 04:49:28 Distribution bundle URL is https://nexus.opendaylight.org/content/repositories//autorelease-9409/org/opendaylight/integration/karaf/0.21.4/karaf-0.21.4.zip 04:49:28 Distribution bundle is karaf-0.21.4.zip 04:49:28 Distribution bundle version is 0.21.4 04:49:28 Distribution folder is karaf-0.21.4 04:49:28 Nexus prefix is https://nexus.opendaylight.org 04:49:28 [EnvInject] - Injecting environment variables from a build step. 04:49:28 [EnvInject] - Injecting as environment variables the properties file path 'detect_variables.env' 04:49:28 [EnvInject] - Variables injected successfully. 04:49:28 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash -l /tmp/jenkins10070852851871367245.sh 04:49:28 Setup pyenv: 04:49:28 system 04:49:28 3.8.13 04:49:28 3.9.13 04:49:28 3.10.13 04:49:28 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/.python-version) 04:49:28 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-AAlE from file:/tmp/.os_lf_venv 04:49:28 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) 04:49:28 lf-activate-venv(): INFO: Attempting to install with network-safe options... 04:49:30 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 04:49:30 lftools 0.37.16 requires urllib3<2.1.0, but you have urllib3 2.5.0 which is incompatible. 04:49:30 lf-activate-venv(): INFO: Base packages installed successfully 04:49:30 lf-activate-venv(): INFO: Installing additional packages: python-heatclient python-openstackclient 04:49:35 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 04:49:35 lftools 0.37.16 requires urllib3<2.1.0, but you have urllib3 2.5.0 which is incompatible. 04:49:36 lf-activate-venv(): INFO: Adding /tmp/venv-AAlE/bin to PATH 04:49:36 Copying common-functions.sh to /tmp 04:49:37 Copying common-functions.sh to 10.30.170.122:/tmp 04:49:37 Warning: Permanently added '10.30.170.122' (ECDSA) to the list of known hosts. 04:49:38 Copying common-functions.sh to 10.30.170.174:/tmp 04:49:38 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 04:49:38 Copying common-functions.sh to 10.30.170.199:/tmp 04:49:38 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 04:49:38 Copying common-functions.sh to 10.30.171.237:/tmp 04:49:39 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 
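The bundle facts echoed above (archive name, version, unpack folder) are plain string derivations from the distribution URL. A minimal sketch of that arithmetic, assuming the URL is already known (variable names are illustrative; the job computes these inside its own builder scripts):
# derive bundle name, version and folder from the distribution URL (illustrative)
BUNDLE_URL=https://nexus.opendaylight.org/content/repositories//autorelease-9409/org/opendaylight/integration/karaf/0.21.4/karaf-0.21.4.zip
BUNDLE=$(basename "${BUNDLE_URL}")     # karaf-0.21.4.zip
BUNDLEFOLDER=${BUNDLE%.zip}            # karaf-0.21.4
BUNDLEVERSION=${BUNDLEFOLDER#karaf-}   # 0.21.4
echo "Distribution bundle is ${BUNDLE}"
echo "Distribution bundle version is ${BUNDLEVERSION}"
echo "Distribution folder is ${BUNDLEFOLDER}"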
04:49:39 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash /tmp/jenkins10766585505431780143.sh 04:49:39 common-functions.sh is being sourced 04:49:39 common-functions environment: 04:49:39 MAVENCONF: /tmp/karaf-0.21.4/etc/org.ops4j.pax.url.mvn.cfg 04:49:39 ACTUALFEATURES: 04:49:39 FEATURESCONF: /tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:39 CUSTOMPROP: /tmp/karaf-0.21.4/etc/custom.properties 04:49:39 LOGCONF: /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:39 MEMCONF: /tmp/karaf-0.21.4/bin/setenv 04:49:39 CONTROLLERMEM: 2048m 04:49:39 AKKACONF: /tmp/karaf-0.21.4/configuration/initial/akka.conf 04:49:39 MODULESCONF: /tmp/karaf-0.21.4/configuration/initial/modules.conf 04:49:39 MODULESHARDSCONF: /tmp/karaf-0.21.4/configuration/initial/module-shards.conf 04:49:39 SUITES: 04:49:39 04:49:39 ################################################# 04:49:39 ## Configure Cluster and Start ## 04:49:39 ################################################# 04:49:39 ACTUALFEATURES: odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer 04:49:39 SPACE_SEPARATED_FEATURES: odl-infrautils-ready odl-jolokia odl-openflowplugin-flow-services-rest odl-openflowplugin-app-table-miss-enforcer 04:49:39 Locating script plan to use... 04:49:39 Finished running script plans 04:49:39 Configuring member-1 with IP address 10.30.170.174 04:49:39 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 04:49:39 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 04:49:39 + source /tmp/common-functions.sh karaf-0.21.4 scandium 04:49:39 ++ [[ /tmp/common-functions.sh == \/\t\m\p\/\c\o\n\f\i\g\u\r\a\t\i\o\n\-\s\c\r\i\p\t\.\s\h ]] 04:49:39 ++ echo 'common-functions.sh is being sourced' 04:49:39 common-functions.sh is being sourced 04:49:39 ++ BUNDLEFOLDER=karaf-0.21.4 04:49:39 ++ DISTROSTREAM=scandium 04:49:39 ++ export MAVENCONF=/tmp/karaf-0.21.4/etc/org.ops4j.pax.url.mvn.cfg 04:49:39 ++ MAVENCONF=/tmp/karaf-0.21.4/etc/org.ops4j.pax.url.mvn.cfg 04:49:39 ++ export FEATURESCONF=/tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:39 ++ FEATURESCONF=/tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:39 ++ export CUSTOMPROP=/tmp/karaf-0.21.4/etc/custom.properties 04:49:39 ++ CUSTOMPROP=/tmp/karaf-0.21.4/etc/custom.properties 04:49:39 ++ export LOGCONF=/tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:39 ++ LOGCONF=/tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:39 ++ export MEMCONF=/tmp/karaf-0.21.4/bin/setenv 04:49:39 ++ MEMCONF=/tmp/karaf-0.21.4/bin/setenv 04:49:39 ++ export CONTROLLERMEM= 04:49:39 ++ CONTROLLERMEM= 04:49:39 ++ case "${DISTROSTREAM}" in 04:49:39 ++ CLUSTER_SYSTEM=akka 04:49:39 ++ export AKKACONF=/tmp/karaf-0.21.4/configuration/initial/akka.conf 04:49:39 ++ AKKACONF=/tmp/karaf-0.21.4/configuration/initial/akka.conf 04:49:39 ++ export MODULESCONF=/tmp/karaf-0.21.4/configuration/initial/modules.conf 04:49:39 ++ MODULESCONF=/tmp/karaf-0.21.4/configuration/initial/modules.conf 04:49:39 ++ export MODULESHARDSCONF=/tmp/karaf-0.21.4/configuration/initial/module-shards.conf 04:49:39 ++ MODULESHARDSCONF=/tmp/karaf-0.21.4/configuration/initial/module-shards.conf 04:49:39 ++ print_common_env 04:49:39 ++ cat 04:49:39 common-functions environment: 04:49:39 MAVENCONF: /tmp/karaf-0.21.4/etc/org.ops4j.pax.url.mvn.cfg 04:49:39 ACTUALFEATURES: 04:49:39 FEATURESCONF: /tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:39 CUSTOMPROP: /tmp/karaf-0.21.4/etc/custom.properties 
04:49:39 LOGCONF: /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:39 MEMCONF: /tmp/karaf-0.21.4/bin/setenv 04:49:39 CONTROLLERMEM: 04:49:39 AKKACONF: /tmp/karaf-0.21.4/configuration/initial/akka.conf 04:49:39 MODULESCONF: /tmp/karaf-0.21.4/configuration/initial/modules.conf 04:49:39 MODULESHARDSCONF: /tmp/karaf-0.21.4/configuration/initial/module-shards.conf 04:49:39 SUITES: 04:49:39 04:49:39 ++ SSH='ssh -t -t' 04:49:39 ++ extra_services_cntl=' dnsmasq.service httpd.service libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service rabbitmq-server.service ' 04:49:39 ++ extra_services_cmp=' libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service ' 04:49:39 Changing to /tmp 04:49:39 Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9409/org/opendaylight/integration/karaf/0.21.4/karaf-0.21.4.zip 04:49:39 + echo 'Changing to /tmp' 04:49:39 + cd /tmp 04:49:39 + echo 'Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9409/org/opendaylight/integration/karaf/0.21.4/karaf-0.21.4.zip' 04:49:39 + wget --progress=dot:mega https://nexus.opendaylight.org/content/repositories//autorelease-9409/org/opendaylight/integration/karaf/0.21.4/karaf-0.21.4.zip 04:49:39 --2025-12-02 04:49:39-- https://nexus.opendaylight.org/content/repositories//autorelease-9409/org/opendaylight/integration/karaf/0.21.4/karaf-0.21.4.zip 04:49:39 Resolving nexus.opendaylight.org (nexus.opendaylight.org)... 199.204.45.87, 2604:e100:1:0:f816:3eff:fe45:48d6 04:49:39 Connecting to nexus.opendaylight.org (nexus.opendaylight.org)|199.204.45.87|:443... connected. 04:49:39 HTTP request sent, awaiting response... 200 OK 04:49:39 Length: 239704696 (229M) [application/zip] 04:49:39 Saving to: ‘karaf-0.21.4.zip’ 04:49:39
04:49:40 2025-12-02 04:49:40 (204 MB/s) - ‘karaf-0.21.4.zip’ saved [239704696/239704696] 04:49:40 04:49:40 Extracting the new controller... 04:49:40 + echo 'Extracting the new controller...' 04:49:40 + unzip -q karaf-0.21.4.zip 04:49:42 Adding external repositories... 04:49:42 + echo 'Adding external repositories...' 04:49:42 + sed -ie 's%org.ops4j.pax.url.mvn.repositories=%org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases%g' /tmp/karaf-0.21.4/etc/org.ops4j.pax.url.mvn.cfg 04:49:42 + cat /tmp/karaf-0.21.4/etc/org.ops4j.pax.url.mvn.cfg 04:49:42 ################################################################################ 04:49:42 # 04:49:42 # Licensed to the Apache Software Foundation (ASF) under one or more 04:49:42 # contributor license agreements. See the NOTICE file distributed with 04:49:42 # this work for additional information regarding copyright ownership. 04:49:42 # The ASF licenses this file to You under the Apache License, Version 2.0 04:49:42 # (the "License"); you may not use this file except in compliance with 04:49:42 # the License. You may obtain a copy of the License at 04:49:42 # 04:49:42 # http://www.apache.org/licenses/LICENSE-2.0 04:49:42 # 04:49:42 # Unless required by applicable law or agreed to in writing, software 04:49:42 # distributed under the License is distributed on an "AS IS" BASIS, 04:49:42 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:49:42 # See the License for the specific language governing permissions and 04:49:42 # limitations under the License.
04:49:42 # 04:49:42 ################################################################################ 04:49:42 04:49:42 # 04:49:42 # If set to true, the following property will not allow any certificate to be used 04:49:42 # when accessing Maven repositories through SSL 04:49:42 # 04:49:42 #org.ops4j.pax.url.mvn.certificateCheck= 04:49:42 04:49:42 # 04:49:42 # Path to the local Maven settings file. 04:49:42 # The repositories defined in this file will be automatically added to the list 04:49:42 # of default repositories if the 'org.ops4j.pax.url.mvn.repositories' property 04:49:42 # below is not set. 04:49:42 # The following locations are checked for the existence of the settings.xml file 04:49:42 # * 1. looks for the specified url 04:49:42 # * 2. if not found looks for ${user.home}/.m2/settings.xml 04:49:42 # * 3. if not found looks for ${maven.home}/conf/settings.xml 04:49:42 # * 4. if not found looks for ${M2_HOME}/conf/settings.xml 04:49:42 # 04:49:42 #org.ops4j.pax.url.mvn.settings= 04:49:42 04:49:42 # 04:49:42 # Path to the local Maven repository which is used to avoid downloading 04:49:42 # artifacts when they already exist locally. 04:49:42 # The value of this property will be extracted from the settings.xml file 04:49:42 # above, or defaulted to: 04:49:42 # System.getProperty( "user.home" ) + "/.m2/repository" 04:49:42 # 04:49:42 org.ops4j.pax.url.mvn.localRepository=${karaf.home}/${karaf.default.repository} 04:49:42 04:49:42 # 04:49:42 # Default this to false. It's just weird to use undocumented repos 04:49:42 # 04:49:42 org.ops4j.pax.url.mvn.useFallbackRepositories=false 04:49:42 04:49:42 # 04:49:42 # Uncomment if you don't wanna use the proxy settings 04:49:42 # from the Maven conf/settings.xml file 04:49:42 # 04:49:42 # org.ops4j.pax.url.mvn.proxySupport=false 04:49:42 04:49:42 # 04:49:42 # Comma separated list of repositories scanned when resolving an artifact. 04:49:42 # Those repositories will be checked before iterating through the 04:49:42 # below list of repositories and even before the local repository 04:49:42 # A repository url can be appended with zero or more of the following flags: 04:49:42 # @snapshots : the repository contains snaphots 04:49:42 # @noreleases : the repository does not contain any released artifacts 04:49:42 # 04:49:42 # The following property value will add the system folder as a repo. 04:49:42 # 04:49:42 org.ops4j.pax.url.mvn.defaultRepositories=\ 04:49:42 file:${karaf.home}/${karaf.default.repository}@id=system.repository@snapshots,\ 04:49:42 file:${karaf.data}/kar@id=kar.repository@multi@snapshots,\ 04:49:42 file:${karaf.base}/${karaf.default.repository}@id=child.system.repository@snapshots 04:49:42 04:49:42 # Use the default local repo (e.g.~/.m2/repository) as a "remote" repo 04:49:42 #org.ops4j.pax.url.mvn.defaultLocalRepoAsRemote=false 04:49:42 04:49:42 # 04:49:42 # Comma separated list of repositories scanned when resolving an artifact. 
04:49:42 # The default list includes the following repositories: 04:49:42 # http://repo1.maven.org/maven2@id=central 04:49:42 # http://repository.springsource.com/maven/bundles/release@id=spring.ebr 04:49:42 # http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external 04:49:42 # http://zodiac.springsource.com/maven/bundles/release@id=gemini 04:49:42 # http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases 04:49:42 # https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases 04:49:42 # https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 04:49:42 # To add repositories to the default ones, prepend '+' to the list of repositories 04:49:42 # to add. 04:49:42 # A repository url can be appended with zero or more of the following flags: 04:49:42 # @snapshots : the repository contains snapshots 04:49:42 # @noreleases : the repository does not contain any released artifacts 04:49:42 # @id=repository.id : the id for the repository, just like in the settings.xml this is optional but recommended 04:49:42 # 04:49:42 org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 04:49:42 04:49:42 ### ^^^ No remote repositories. This is the only ODL change compared to Karaf defaults.Configuring the startup features... 04:49:42 + [[ True == \T\r\u\e ]] 04:49:42 + echo 'Configuring the startup features...' 04:49:42 + sed -ie 's/\(featuresBoot=\|featuresBoot =\)/featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer,/g' /tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:42 + FEATURE_TEST_STRING=features-test 04:49:42 + FEATURE_TEST_VERSION=0.21.4 04:49:42 + KARAF_VERSION=karaf4 04:49:42 + [[ integration == \i\n\t\e\g\r\a\t\i\o\n ]] 04:49:42 + sed -ie 's%\(featuresRepositories=\|featuresRepositories =\)%featuresRepositories = mvn:org.opendaylight.integration/features-test/0.21.4/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features,%g' /tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:42 + [[ ! -z '' ]] 04:49:42 + cat /tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:42 ################################################################################ 04:49:42 # 04:49:42 # Licensed to the Apache Software Foundation (ASF) under one or more 04:49:42 # contributor license agreements. See the NOTICE file distributed with 04:49:42 # this work for additional information regarding copyright ownership. 04:49:42 # The ASF licenses this file to You under the Apache License, Version 2.0 04:49:42 # (the "License"); you may not use this file except in compliance with 04:49:42 # the License. 
You may obtain a copy of the License at 04:49:42 # 04:49:42 # http://www.apache.org/licenses/LICENSE-2.0 04:49:42 # 04:49:42 # Unless required by applicable law or agreed to in writing, software 04:49:42 # distributed under the License is distributed on an "AS IS" BASIS, 04:49:42 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:49:42 # See the License for the specific language governing permissions and 04:49:42 # limitations under the License. 04:49:42 # 04:49:42 ################################################################################ 04:49:42 04:49:42 # 04:49:42 # Comma separated list of features repositories to register by default 04:49:42 # 04:49:42 featuresRepositories = mvn:org.opendaylight.integration/features-test/0.21.4/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features, file:${karaf.etc}/f3e5bf4d-c8b9-4e8e-a676-a978cb689c44.xml 04:49:42 04:49:42 # 04:49:42 # Comma separated list of features to install at startup 04:49:42 # 04:49:42 featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer, a90d8f2f-9a3e-46e7-99a0-9ab64083b4d5 04:49:42 04:49:42 # 04:49:42 # Resource repositories (OBR) that the features resolver can use 04:49:42 # to resolve requirements/capabilities 04:49:42 # 04:49:42 # The format of the resourceRepositories is 04:49:42 # resourceRepositories=[xml:url|json:url],... 04:49:42 # for Instance: 04:49:42 # 04:49:42 #resourceRepositories=xml:http://host/path/to/index.xml 04:49:42 # or 04:49:42 #resourceRepositories=json:http://host/path/to/index.json 04:49:42 # 04:49:42 04:49:42 # 04:49:42 # Defines if the boot features are started in asynchronous mode (in a dedicated thread) 04:49:42 # 04:49:42 featuresBootAsynchronous=false 04:49:42 04:49:42 # 04:49:42 # Service requirements enforcement 04:49:42 # 04:49:42 # By default, the feature resolver checks the service requirements/capabilities of 04:49:42 # bundles for new features (xml schema >= 1.3.0) in order to automatically installs 04:49:42 # the required bundles. 
04:49:42 # The following flag can have those values: 04:49:42 # - disable: service requirements are completely ignored 04:49:42 # - default: service requirements are ignored for old features 04:49:42 # - enforce: service requirements are always verified 04:49:42 # 04:49:42 #serviceRequirements=default 04:49:42 04:49:42 # 04:49:42 # Store cfg file for config element in feature 04:49:42 # 04:49:42 #configCfgStore=true 04:49:42 04:49:42 # 04:49:42 # Define if the feature service automatically refresh bundles 04:49:42 # 04:49:42 autoRefresh=true 04:49:42 04:49:42 # 04:49:42 # Configuration of features processing mechanism (overrides, blacklisting, modification of features) 04:49:42 # XML file defines instructions related to features processing 04:49:42 # versions.properties may declare properties to resolve placeholders in XML file 04:49:42 # both files are relative to ${karaf.etc} 04:49:42 # 04:49:42 #featureProcessing=org.apache.karaf.features.xml 04:49:42 #featureProcessingVersions=versions.properties 04:49:42 + configure_karaf_log karaf4 '' 04:49:42 + local -r karaf_version=karaf4 04:49:42 + local -r controllerdebugmap= 04:49:42 + local logapi=log4j 04:49:42 + grep log4j2 /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:42 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 04:49:42 log4j2.rootLogger.level = INFO 04:49:42 #log4j2.rootLogger.type = asyncRoot 04:49:42 #log4j2.rootLogger.includeLocation = false 04:49:42 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 04:49:42 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 04:49:42 log4j2.rootLogger.appenderRef.Console.ref = Console 04:49:42 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 04:49:42 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 04:49:42 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 04:49:42 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 04:49:42 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 04:49:42 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 04:49:42 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 04:49:42 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 04:49:42 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 04:49:42 log4j2.logger.spifly.name = org.apache.aries.spifly 04:49:42 log4j2.logger.spifly.level = WARN 04:49:42 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 04:49:42 log4j2.logger.audit.level = INFO 04:49:42 log4j2.logger.audit.additivity = false 04:49:42 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 04:49:42 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 04:49:42 log4j2.appender.console.type = Console 04:49:42 log4j2.appender.console.name = Console 04:49:42 log4j2.appender.console.layout.type = PatternLayout 04:49:42 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 04:49:42 log4j2.appender.rolling.type = RollingRandomAccessFile 04:49:42 log4j2.appender.rolling.name = RollingFile 04:49:42 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 04:49:42 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 04:49:42 #log4j2.appender.rolling.immediateFlush = false 04:49:42 log4j2.appender.rolling.append = true 
04:49:42 log4j2.appender.rolling.layout.type = PatternLayout 04:49:42 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 04:49:42 log4j2.appender.rolling.policies.type = Policies 04:49:42 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 04:49:42 log4j2.appender.rolling.policies.size.size = 64MB 04:49:42 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 04:49:42 log4j2.appender.rolling.strategy.max = 7 04:49:42 log4j2.appender.audit.type = RollingRandomAccessFile 04:49:42 log4j2.appender.audit.name = AuditRollingFile 04:49:42 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 04:49:42 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 04:49:42 log4j2.appender.audit.append = true 04:49:42 log4j2.appender.audit.layout.type = PatternLayout 04:49:42 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 04:49:42 log4j2.appender.audit.policies.type = Policies 04:49:42 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 04:49:42 log4j2.appender.audit.policies.size.size = 8MB 04:49:42 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 04:49:42 log4j2.appender.audit.strategy.max = 7 04:49:42 log4j2.appender.osgi.type = PaxOsgi 04:49:42 log4j2.appender.osgi.name = PaxOsgi 04:49:42 log4j2.appender.osgi.filter = * 04:49:42 #log4j2.logger.aether.name = shaded.org.eclipse.aether 04:49:42 #log4j2.logger.aether.level = TRACE 04:49:42 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 04:49:42 #log4j2.logger.http-headers.level = DEBUG 04:49:42 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 04:49:42 #log4j2.logger.maven.level = TRACE 04:49:42 + logapi=log4j2 04:49:42 Configuring the karaf log... karaf_version: karaf4, logapi: log4j2 04:49:42 + echo 'Configuring the karaf log... karaf_version: karaf4, logapi: log4j2' 04:49:42 + '[' log4j2 == log4j2 ']' 04:49:42 + sed -ie 's/log4j2.appender.rolling.policies.size.size = 64MB/log4j2.appender.rolling.policies.size.size = 1GB/g' /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:42 + orgmodule=org.opendaylight.yangtools.yang.parser.repo.YangTextSchemaContextResolver 04:49:42 + orgmodule_=org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver 04:49:42 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN' 04:49:42 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN' 04:49:42 controllerdebugmap: 04:49:42 cat /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:42 + unset IFS 04:49:42 + echo 'controllerdebugmap: ' 04:49:42 + '[' -n '' ']' 04:49:42 + echo 'cat /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg' 04:49:42 + cat /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:42 ################################################################################ 04:49:42 # 04:49:42 # Licensed to the Apache Software Foundation (ASF) under one or more 04:49:42 # contributor license agreements. See the NOTICE file distributed with 04:49:42 # this work for additional information regarding copyright ownership. 04:49:42 # The ASF licenses this file to You under the Apache License, Version 2.0 04:49:42 # (the "License"); you may not use this file except in compliance with 04:49:42 # the License. 
You may obtain a copy of the License at 04:49:42 # 04:49:42 # http://www.apache.org/licenses/LICENSE-2.0 04:49:42 # 04:49:42 # Unless required by applicable law or agreed to in writing, software 04:49:42 # distributed under the License is distributed on an "AS IS" BASIS, 04:49:42 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:49:42 # See the License for the specific language governing permissions and 04:49:42 # limitations under the License. 04:49:42 # 04:49:42 ################################################################################ 04:49:42 04:49:42 # Common pattern layout for appenders 04:49:42 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 04:49:42 04:49:42 # Root logger 04:49:42 log4j2.rootLogger.level = INFO 04:49:42 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 04:49:42 #log4j2.rootLogger.type = asyncRoot 04:49:42 #log4j2.rootLogger.includeLocation = false 04:49:42 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 04:49:42 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 04:49:42 log4j2.rootLogger.appenderRef.Console.ref = Console 04:49:42 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 04:49:42 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 04:49:42 04:49:42 # Filters for logs marked by org.opendaylight.odlparent.Markers 04:49:42 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 04:49:42 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 04:49:42 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 04:49:42 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 04:49:42 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 04:49:42 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 04:49:42 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 04:49:42 04:49:42 # Loggers configuration 04:49:42 04:49:42 # Spifly logger 04:49:42 log4j2.logger.spifly.name = org.apache.aries.spifly 04:49:42 log4j2.logger.spifly.level = WARN 04:49:42 04:49:42 # Security audit logger 04:49:42 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 04:49:42 log4j2.logger.audit.level = INFO 04:49:42 log4j2.logger.audit.additivity = false 04:49:42 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 04:49:42 04:49:42 # Appenders configuration 04:49:42 04:49:42 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 04:49:42 log4j2.appender.console.type = Console 04:49:42 log4j2.appender.console.name = Console 04:49:42 log4j2.appender.console.layout.type = PatternLayout 04:49:42 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 04:49:42 04:49:42 # Rolling file appender 04:49:42 log4j2.appender.rolling.type = RollingRandomAccessFile 04:49:42 log4j2.appender.rolling.name = RollingFile 04:49:42 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 04:49:42 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 04:49:42 # uncomment to not force a disk flush 04:49:42 #log4j2.appender.rolling.immediateFlush = false 04:49:42 log4j2.appender.rolling.append = true 04:49:42 log4j2.appender.rolling.layout.type = PatternLayout 04:49:42 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 
04:49:42 log4j2.appender.rolling.policies.type = Policies 04:49:42 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 04:49:42 log4j2.appender.rolling.policies.size.size = 1GB 04:49:42 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 04:49:42 log4j2.appender.rolling.strategy.max = 7 04:49:42 04:49:42 # Audit file appender 04:49:42 log4j2.appender.audit.type = RollingRandomAccessFile 04:49:42 log4j2.appender.audit.name = AuditRollingFile 04:49:42 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 04:49:42 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 04:49:42 log4j2.appender.audit.append = true 04:49:42 log4j2.appender.audit.layout.type = PatternLayout 04:49:42 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 04:49:42 log4j2.appender.audit.policies.type = Policies 04:49:42 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 04:49:42 log4j2.appender.audit.policies.size.size = 8MB 04:49:42 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 04:49:42 log4j2.appender.audit.strategy.max = 7 04:49:42 04:49:42 # OSGi appender 04:49:42 log4j2.appender.osgi.type = PaxOsgi 04:49:42 log4j2.appender.osgi.name = PaxOsgi 04:49:42 log4j2.appender.osgi.filter = * 04:49:42 04:49:42 # help with identification of maven-related problems with pax-url-aether 04:49:42 #log4j2.logger.aether.name = shaded.org.eclipse.aether 04:49:42 #log4j2.logger.aether.level = TRACE 04:49:42 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 04:49:42 #log4j2.logger.http-headers.level = DEBUG 04:49:42 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 04:49:42 #log4j2.logger.maven.level = TRACE 04:49:42 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 04:49:42 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 04:49:42 + set_java_vars /usr/lib/jvm/java-21-openjdk-amd64 2048m /tmp/karaf-0.21.4/bin/setenv 04:49:42 + local -r java_home=/usr/lib/jvm/java-21-openjdk-amd64 04:49:42 + local -r controllermem=2048m 04:49:42 Configure 04:49:42 + local -r memconf=/tmp/karaf-0.21.4/bin/setenv 04:49:42 + echo Configure 04:49:42 java home: /usr/lib/jvm/java-21-openjdk-amd64 04:49:42 max memory: 2048m 04:49:42 memconf: /tmp/karaf-0.21.4/bin/setenv 04:49:42 + echo ' java home: /usr/lib/jvm/java-21-openjdk-amd64' 04:49:42 + echo ' max memory: 2048m' 04:49:42 + echo ' memconf: /tmp/karaf-0.21.4/bin/setenv' 04:49:42 + sed -ie 's%^# export JAVA_HOME%export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64}%g' /tmp/karaf-0.21.4/bin/setenv 04:49:42 + sed -ie 's/JAVA_MAX_MEM="2048m"/JAVA_MAX_MEM=2048m/g' /tmp/karaf-0.21.4/bin/setenv 04:49:42 cat /tmp/karaf-0.21.4/bin/setenv 04:49:42 + echo 'cat /tmp/karaf-0.21.4/bin/setenv' 04:49:42 + cat /tmp/karaf-0.21.4/bin/setenv 04:49:42 #!/bin/sh 04:49:42 # 04:49:42 # Licensed to the Apache Software Foundation (ASF) under one or more 04:49:42 # contributor license agreements. See the NOTICE file distributed with 04:49:42 # this work for additional information regarding copyright ownership. 04:49:42 # The ASF licenses this file to You under the Apache License, Version 2.0 04:49:42 # (the "License"); you may not use this file except in compliance with 04:49:42 # the License. 
You may obtain a copy of the License at 04:49:42 # 04:49:42 # http://www.apache.org/licenses/LICENSE-2.0 04:49:42 # 04:49:42 # Unless required by applicable law or agreed to in writing, software 04:49:42 # distributed under the License is distributed on an "AS IS" BASIS, 04:49:42 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:49:42 # See the License for the specific language governing permissions and 04:49:42 # limitations under the License. 04:49:42 # 04:49:42 04:49:42 # 04:49:42 # handle specific scripts; the SCRIPT_NAME is exactly the name of the Karaf 04:49:42 # script: client, instance, shell, start, status, stop, karaf 04:49:42 # 04:49:42 # if [ "${KARAF_SCRIPT}" == "SCRIPT_NAME" ]; then 04:49:42 # Actions go here... 04:49:42 # fi 04:49:42 04:49:42 # 04:49:42 # general settings which should be applied for all scripts go here; please keep 04:49:42 # in mind that it is possible that scripts might be executed more than once, e.g. 04:49:42 # in example of the start script where the start script is executed first and the 04:49:42 # karaf script afterwards. 04:49:42 # 04:49:42 04:49:42 # 04:49:42 # The following section shows the possible configuration options for the default 04:49:42 # karaf scripts 04:49:42 # 04:49:42 export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64} # Location of Java installation 04:49:42 # export JAVA_OPTS # Generic JVM options, for instance, where you can pass the memory configuration 04:49:42 # export JAVA_NON_DEBUG_OPTS # Additional non-debug JVM options 04:49:42 # export EXTRA_JAVA_OPTS # Additional JVM options 04:49:42 # export KARAF_HOME # Karaf home folder 04:49:42 # export KARAF_DATA # Karaf data folder 04:49:42 # export KARAF_BASE # Karaf base folder 04:49:42 # export KARAF_ETC # Karaf etc folder 04:49:42 # export KARAF_LOG # Karaf log folder 04:49:42 # export KARAF_SYSTEM_OPTS # First citizen Karaf options 04:49:42 # export KARAF_OPTS # Additional available Karaf options 04:49:42 # export KARAF_DEBUG # Enable debug mode 04:49:42 # export KARAF_REDIRECT # Enable/set the std/err redirection when using bin/start 04:49:42 # export KARAF_NOROOT # Prevent execution as root if set to true 04:49:42 Set Java version 04:49:42 + echo 'Set Java version' 04:49:42 + sudo /usr/sbin/alternatives --install /usr/bin/java java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 1 04:49:43 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 04:49:43 sudo: a password is required 04:49:43 + sudo /usr/sbin/alternatives --set java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:49:43 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 04:49:43 sudo: a password is required 04:49:43 JDK default version ... 04:49:43 + echo 'JDK default version ...' 04:49:43 + java -version 04:49:43 openjdk version "21.0.8" 2025-07-15 04:49:43 OpenJDK Runtime Environment (build 21.0.8+9-Ubuntu-0ubuntu122.04.1) 04:49:43 OpenJDK 64-Bit Server VM (build 21.0.8+9-Ubuntu-0ubuntu122.04.1, mixed mode, sharing) 04:49:43 Set JAVA_HOME 04:49:43 + echo 'Set JAVA_HOME' 04:49:43 + export JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 04:49:43 + JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 04:49:43 ++ readlink -e /usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:49:43 Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:49:43 Listing all open ports on controller system... 
04:49:43 + JAVA_RESOLVED=/usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:49:43 + echo 'Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java' 04:49:43 + echo 'Listing all open ports on controller system...' 04:49:43 + netstat -pnatu 04:49:43 /tmp/configuration-script.sh: line 40: netstat: command not found 04:49:43 + '[' -f /tmp/custom_shard_config.txt ']' 04:49:43 Configuring cluster 04:49:43 + echo 'Configuring cluster' 04:49:43 + /tmp/karaf-0.21.4/bin/configure_cluster.sh 1 10.30.170.174 10.30.170.199 10.30.171.237 04:49:43 ################################################ 04:49:43 ## Configure Cluster ## 04:49:43 ################################################ 04:49:43 NOTE: Cluster configuration files not found. Copying from 04:49:43 /tmp/karaf-0.21.4/system/org/opendaylight/controller/sal-clustering-config/10.0.14 04:49:43 Configuring unique name in akka.conf 04:49:43 Configuring hostname in akka.conf 04:49:43 Configuring data and rpc seed nodes in akka.conf 04:49:43 modules = [ 04:49:43 04:49:43 { 04:49:43 name = "inventory" 04:49:43 namespace = "urn:opendaylight:inventory" 04:49:43 shard-strategy = "module" 04:49:43 }, 04:49:43 { 04:49:43 name = "topology" 04:49:43 namespace = "urn:TBD:params:xml:ns:yang:network-topology" 04:49:43 shard-strategy = "module" 04:49:43 }, 04:49:43 { 04:49:43 name = "toaster" 04:49:43 namespace = "http://netconfcentral.org/ns/toaster" 04:49:43 shard-strategy = "module" 04:49:43 } 04:49:43 ] 04:49:43 Configuring replication type in module-shards.conf 04:49:43 ################################################ 04:49:43 ## NOTE: Manually restart controller to ## 04:49:43 ## apply configuration. ## 04:49:43 ################################################ 04:49:43 Dump akka.conf 04:49:43 + echo 'Dump akka.conf' 04:49:43 + cat /tmp/karaf-0.21.4/configuration/initial/akka.conf 04:49:43 04:49:43 odl-cluster-data { 04:49:43 akka { 04:49:43 remote { 04:49:43 artery { 04:49:43 enabled = on 04:49:43 transport = tcp 04:49:43 canonical.hostname = "10.30.170.174" 04:49:43 canonical.port = 2550 04:49:43 } 04:49:43 } 04:49:43 04:49:43 cluster { 04:49:43 # Using artery. 04:49:43 seed-nodes = ["akka://opendaylight-cluster-data@10.30.170.174:2550", 04:49:43 "akka://opendaylight-cluster-data@10.30.170.199:2550", 04:49:43 "akka://opendaylight-cluster-data@10.30.171.237:2550"] 04:49:43 04:49:43 roles = ["member-1"] 04:49:43 04:49:43 # when under load we might trip a false positive on the failure detector 04:49:43 # failure-detector { 04:49:43 # heartbeat-interval = 4 s 04:49:43 # acceptable-heartbeat-pause = 16s 04:49:43 # } 04:49:43 } 04:49:43 04:49:43 persistence { 04:49:43 # By default the snapshots/journal directories live in KARAF_HOME. You can choose to put it somewhere else by 04:49:43 # modifying the following two properties. The directory location specified may be a relative or absolute path. 04:49:43 # The relative path is always relative to KARAF_HOME. 
04:49:43 04:49:43 # snapshot-store.local.dir = "target/snapshots" 04:49:43 04:49:43 # Use lz4 compression for LocalSnapshotStore snapshots 04:49:43 snapshot-store.local.use-lz4-compression = false 04:49:43 # Size of blocks for lz4 compression: 64KB, 256KB, 1MB or 4MB 04:49:43 snapshot-store.local.lz4-blocksize = 256KB 04:49:43 } 04:49:43 disable-default-actor-system-quarantined-event-handling = "false" 04:49:43 } 04:49:43 } 04:49:43 Dump modules.conf 04:49:43 + echo 'Dump modules.conf' 04:49:43 + cat /tmp/karaf-0.21.4/configuration/initial/modules.conf 04:49:43 modules = [ 04:49:43 04:49:43 { 04:49:43 name = "inventory" 04:49:43 namespace = "urn:opendaylight:inventory" 04:49:43 shard-strategy = "module" 04:49:43 }, 04:49:43 { 04:49:43 name = "topology" 04:49:43 namespace = "urn:TBD:params:xml:ns:yang:network-topology" 04:49:43 shard-strategy = "module" 04:49:43 }, 04:49:43 { 04:49:43 name = "toaster" 04:49:43 namespace = "http://netconfcentral.org/ns/toaster" 04:49:43 shard-strategy = "module" 04:49:43 } 04:49:43 ] 04:49:43 Dump module-shards.conf 04:49:43 + echo 'Dump module-shards.conf' 04:49:43 + cat /tmp/karaf-0.21.4/configuration/initial/module-shards.conf 04:49:43 module-shards = [ 04:49:43 { 04:49:43 name = "default" 04:49:43 shards = [ 04:49:43 { 04:49:43 name = "default" 04:49:43 replicas = ["member-1", 04:49:43 "member-2", 04:49:43 "member-3"] 04:49:43 } 04:49:43 ] 04:49:43 }, 04:49:43 { 04:49:43 name = "inventory" 04:49:43 shards = [ 04:49:43 { 04:49:43 name="inventory" 04:49:43 replicas = ["member-1", 04:49:43 "member-2", 04:49:43 "member-3"] 04:49:43 } 04:49:43 ] 04:49:43 }, 04:49:43 { 04:49:43 name = "topology" 04:49:43 shards = [ 04:49:43 { 04:49:43 name="topology" 04:49:43 replicas = ["member-1", 04:49:43 "member-2", 04:49:43 "member-3"] 04:49:43 } 04:49:43 ] 04:49:43 }, 04:49:43 { 04:49:43 name = "toaster" 04:49:43 shards = [ 04:49:43 { 04:49:43 name="toaster" 04:49:43 replicas = ["member-1", 04:49:43 "member-2", 04:49:43 "member-3"] 04:49:43 } 04:49:43 ] 04:49:43 } 04:49:43 ] 04:49:43 Configuring member-2 with IP address 10.30.170.199 04:49:43 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 04:49:43 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 
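Each controller is prepared the same way: the distribution is unpacked on the node and configure_cluster.sh is run with the member's 1-based index followed by all three controller IPs, which generates the akka.conf, modules.conf and module-shards.conf dumped above. A stripped-down sketch of that per-member step, assuming passwordless SSH and the paths from the log (the loop itself is illustrative, not the job's exact driver):
# run the cluster-configuration step on every controller (illustrative)
ODL_IPS=(10.30.170.174 10.30.170.199 10.30.171.237)
for i in "${!ODL_IPS[@]}"; do
  member=$((i + 1))   # configure_cluster.sh takes a 1-based member index first
  ssh "${ODL_IPS[$i]}" "/tmp/karaf-0.21.4/bin/configure_cluster.sh ${member} ${ODL_IPS[*]}"
done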
04:49:43 + source /tmp/common-functions.sh karaf-0.21.4 scandium 04:49:43 ++ [[ /tmp/common-functions.sh == \/\t\m\p\/\c\o\n\f\i\g\u\r\a\t\i\o\n\-\s\c\r\i\p\t\.\s\h ]] 04:49:43 common-functions.sh is being sourced 04:49:43 ++ echo 'common-functions.sh is being sourced' 04:49:43 ++ BUNDLEFOLDER=karaf-0.21.4 04:49:43 ++ DISTROSTREAM=scandium 04:49:43 ++ export MAVENCONF=/tmp/karaf-0.21.4/etc/org.ops4j.pax.url.mvn.cfg 04:49:43 ++ MAVENCONF=/tmp/karaf-0.21.4/etc/org.ops4j.pax.url.mvn.cfg 04:49:43 ++ export FEATURESCONF=/tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:43 ++ FEATURESCONF=/tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:43 ++ export CUSTOMPROP=/tmp/karaf-0.21.4/etc/custom.properties 04:49:43 ++ CUSTOMPROP=/tmp/karaf-0.21.4/etc/custom.properties 04:49:43 ++ export LOGCONF=/tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:43 ++ LOGCONF=/tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:43 ++ export MEMCONF=/tmp/karaf-0.21.4/bin/setenv 04:49:43 ++ MEMCONF=/tmp/karaf-0.21.4/bin/setenv 04:49:43 ++ export CONTROLLERMEM= 04:49:43 ++ CONTROLLERMEM= 04:49:43 ++ case "${DISTROSTREAM}" in 04:49:43 ++ CLUSTER_SYSTEM=akka 04:49:43 ++ export AKKACONF=/tmp/karaf-0.21.4/configuration/initial/akka.conf 04:49:43 ++ AKKACONF=/tmp/karaf-0.21.4/configuration/initial/akka.conf 04:49:43 ++ export MODULESCONF=/tmp/karaf-0.21.4/configuration/initial/modules.conf 04:49:43 ++ MODULESCONF=/tmp/karaf-0.21.4/configuration/initial/modules.conf 04:49:43 ++ export MODULESHARDSCONF=/tmp/karaf-0.21.4/configuration/initial/module-shards.conf 04:49:43 ++ MODULESHARDSCONF=/tmp/karaf-0.21.4/configuration/initial/module-shards.conf 04:49:43 ++ print_common_env 04:49:43 ++ cat 04:49:43 common-functions environment: 04:49:43 MAVENCONF: /tmp/karaf-0.21.4/etc/org.ops4j.pax.url.mvn.cfg 04:49:43 ACTUALFEATURES: 04:49:43 FEATURESCONF: /tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:43 CUSTOMPROP: /tmp/karaf-0.21.4/etc/custom.properties 04:49:43 LOGCONF: /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:43 MEMCONF: /tmp/karaf-0.21.4/bin/setenv 04:49:43 CONTROLLERMEM: 04:49:43 AKKACONF: /tmp/karaf-0.21.4/configuration/initial/akka.conf 04:49:43 MODULESCONF: /tmp/karaf-0.21.4/configuration/initial/modules.conf 04:49:43 MODULESHARDSCONF: /tmp/karaf-0.21.4/configuration/initial/module-shards.conf 04:49:43 SUITES: 04:49:43 04:49:43 ++ SSH='ssh -t -t' 04:49:43 ++ extra_services_cntl=' dnsmasq.service httpd.service libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service rabbitmq-server.service ' 04:49:43 ++ extra_services_cmp=' libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service ' 04:49:43 Changing to /tmp 04:49:43 + echo 'Changing to /tmp' 04:49:43 + cd /tmp 04:49:43 Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9409/org/opendaylight/integration/karaf/0.21.4/karaf-0.21.4.zip 04:49:43 + echo 'Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9409/org/opendaylight/integration/karaf/0.21.4/karaf-0.21.4.zip' 04:49:43 + wget --progress=dot:mega https://nexus.opendaylight.org/content/repositories//autorelease-9409/org/opendaylight/integration/karaf/0.21.4/karaf-0.21.4.zip 04:49:43 --2025-12-02 04:49:43-- https://nexus.opendaylight.org/content/repositories//autorelease-9409/org/opendaylight/integration/karaf/0.21.4/karaf-0.21.4.zip 04:49:43 Resolving nexus.opendaylight.org (nexus.opendaylight.org)... 
199.204.45.87, 2604:e100:1:0:f816:3eff:fe45:48d6 04:49:43 Connecting to nexus.opendaylight.org (nexus.opendaylight.org)|199.204.45.87|:443... connected. 04:49:43 HTTP request sent, awaiting response... 200 OK 04:49:43 Length: 239704696 (229M) [application/zip] 04:49:43 Saving to: ‘karaf-0.21.4.zip’ 04:49:43
04:49:44 2025-12-02 04:49:44 (259 MB/s) - ‘karaf-0.21.4.zip’ saved [239704696/239704696] 04:49:44 04:49:44 Extracting the new controller... 04:49:44 + echo 'Extracting the new controller...' 04:49:44 + unzip -q karaf-0.21.4.zip 04:49:46 Adding external repositories... 04:49:46 + echo 'Adding external repositories...'
04:49:46 + sed -ie 's%org.ops4j.pax.url.mvn.repositories=%org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases%g' /tmp/karaf-0.21.4/etc/org.ops4j.pax.url.mvn.cfg 04:49:46 + cat /tmp/karaf-0.21.4/etc/org.ops4j.pax.url.mvn.cfg 04:49:46 ################################################################################ 04:49:46 # 04:49:46 # Licensed to the Apache Software Foundation (ASF) under one or more 04:49:46 # contributor license agreements. See the NOTICE file distributed with 04:49:46 # this work for additional information regarding copyright ownership. 04:49:46 # The ASF licenses this file to You under the Apache License, Version 2.0 04:49:46 # (the "License"); you may not use this file except in compliance with 04:49:46 # the License. You may obtain a copy of the License at 04:49:46 # 04:49:46 # http://www.apache.org/licenses/LICENSE-2.0 04:49:46 # 04:49:46 # Unless required by applicable law or agreed to in writing, software 04:49:46 # distributed under the License is distributed on an "AS IS" BASIS, 04:49:46 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:49:46 # See the License for the specific language governing permissions and 04:49:46 # limitations under the License. 04:49:46 # 04:49:46 ################################################################################ 04:49:46 04:49:46 # 04:49:46 # If set to true, the following property will not allow any certificate to be used 04:49:46 # when accessing Maven repositories through SSL 04:49:46 # 04:49:46 #org.ops4j.pax.url.mvn.certificateCheck= 04:49:46 04:49:46 # 04:49:46 # Path to the local Maven settings file. 04:49:46 # The repositories defined in this file will be automatically added to the list 04:49:46 # of default repositories if the 'org.ops4j.pax.url.mvn.repositories' property 04:49:46 # below is not set. 04:49:46 # The following locations are checked for the existence of the settings.xml file 04:49:46 # * 1. looks for the specified url 04:49:46 # * 2. if not found looks for ${user.home}/.m2/settings.xml 04:49:46 # * 3. if not found looks for ${maven.home}/conf/settings.xml 04:49:46 # * 4. if not found looks for ${M2_HOME}/conf/settings.xml 04:49:46 # 04:49:46 #org.ops4j.pax.url.mvn.settings= 04:49:46 04:49:46 # 04:49:46 # Path to the local Maven repository which is used to avoid downloading 04:49:46 # artifacts when they already exist locally. 04:49:46 # The value of this property will be extracted from the settings.xml file 04:49:46 # above, or defaulted to: 04:49:46 # System.getProperty( "user.home" ) + "/.m2/repository" 04:49:46 # 04:49:46 org.ops4j.pax.url.mvn.localRepository=${karaf.home}/${karaf.default.repository} 04:49:46 04:49:46 # 04:49:46 # Default this to false. 
It's just weird to use undocumented repos 04:49:46 # 04:49:46 org.ops4j.pax.url.mvn.useFallbackRepositories=false 04:49:46 04:49:46 # 04:49:46 # Uncomment if you don't wanna use the proxy settings 04:49:46 # from the Maven conf/settings.xml file 04:49:46 # 04:49:46 # org.ops4j.pax.url.mvn.proxySupport=false 04:49:46 04:49:46 # 04:49:46 # Comma separated list of repositories scanned when resolving an artifact. 04:49:46 # Those repositories will be checked before iterating through the 04:49:46 # below list of repositories and even before the local repository 04:49:46 # A repository url can be appended with zero or more of the following flags: 04:49:46 # @snapshots : the repository contains snaphots 04:49:46 # @noreleases : the repository does not contain any released artifacts 04:49:46 # 04:49:46 # The following property value will add the system folder as a repo. 04:49:46 # 04:49:46 org.ops4j.pax.url.mvn.defaultRepositories=\ 04:49:46 file:${karaf.home}/${karaf.default.repository}@id=system.repository@snapshots,\ 04:49:46 file:${karaf.data}/kar@id=kar.repository@multi@snapshots,\ 04:49:46 file:${karaf.base}/${karaf.default.repository}@id=child.system.repository@snapshots 04:49:46 04:49:46 # Use the default local repo (e.g.~/.m2/repository) as a "remote" repo 04:49:46 #org.ops4j.pax.url.mvn.defaultLocalRepoAsRemote=false 04:49:46 04:49:46 # 04:49:46 # Comma separated list of repositories scanned when resolving an artifact. 04:49:46 # The default list includes the following repositories: 04:49:46 # http://repo1.maven.org/maven2@id=central 04:49:46 # http://repository.springsource.com/maven/bundles/release@id=spring.ebr 04:49:46 # http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external 04:49:46 # http://zodiac.springsource.com/maven/bundles/release@id=gemini 04:49:46 # http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases 04:49:46 # https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases 04:49:46 # https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 04:49:46 # To add repositories to the default ones, prepend '+' to the list of repositories 04:49:46 # to add. 04:49:46 # A repository url can be appended with zero or more of the following flags: 04:49:46 # @snapshots : the repository contains snapshots 04:49:46 # @noreleases : the repository does not contain any released artifacts 04:49:46 # @id=repository.id : the id for the repository, just like in the settings.xml this is optional but recommended 04:49:46 # 04:49:46 org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 04:49:46 04:49:46 ### ^^^ No remote repositories. 
This is the only ODL change compared to Karaf defaults.+ [[ True == \T\r\u\e ]] 04:49:46 + echo 'Configuring the startup features...' 04:49:46 Configuring the startup features... 04:49:46 + sed -ie 's/\(featuresBoot=\|featuresBoot =\)/featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer,/g' /tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:46 + FEATURE_TEST_STRING=features-test 04:49:46 + FEATURE_TEST_VERSION=0.21.4 04:49:46 + KARAF_VERSION=karaf4 04:49:46 + [[ integration == \i\n\t\e\g\r\a\t\i\o\n ]] 04:49:46 + sed -ie 's%\(featuresRepositories=\|featuresRepositories =\)%featuresRepositories = mvn:org.opendaylight.integration/features-test/0.21.4/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features,%g' /tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:46 + [[ ! -z '' ]] 04:49:46 + cat /tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:46 ################################################################################ 04:49:46 # 04:49:46 # Licensed to the Apache Software Foundation (ASF) under one or more 04:49:46 # contributor license agreements. See the NOTICE file distributed with 04:49:46 # this work for additional information regarding copyright ownership. 04:49:46 # The ASF licenses this file to You under the Apache License, Version 2.0 04:49:46 # (the "License"); you may not use this file except in compliance with 04:49:46 # the License. You may obtain a copy of the License at 04:49:46 # 04:49:46 # http://www.apache.org/licenses/LICENSE-2.0 04:49:46 # 04:49:46 # Unless required by applicable law or agreed to in writing, software 04:49:46 # distributed under the License is distributed on an "AS IS" BASIS, 04:49:46 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:49:46 # See the License for the specific language governing permissions and 04:49:46 # limitations under the License. 04:49:46 # 04:49:46 ################################################################################ 04:49:46 04:49:46 # 04:49:46 # Comma separated list of features repositories to register by default 04:49:46 # 04:49:46 featuresRepositories = mvn:org.opendaylight.integration/features-test/0.21.4/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features, file:${karaf.etc}/f3e5bf4d-c8b9-4e8e-a676-a978cb689c44.xml 04:49:46 04:49:46 # 04:49:46 # Comma separated list of features to install at startup 04:49:46 # 04:49:46 featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer, a90d8f2f-9a3e-46e7-99a0-9ab64083b4d5 04:49:46 04:49:46 # 04:49:46 # Resource repositories (OBR) that the features resolver can use 04:49:46 # to resolve requirements/capabilities 04:49:46 # 04:49:46 # The format of the resourceRepositories is 04:49:46 # resourceRepositories=[xml:url|json:url],... 
04:49:46 # for Instance: 04:49:46 # 04:49:46 #resourceRepositories=xml:http://host/path/to/index.xml 04:49:46 # or 04:49:46 #resourceRepositories=json:http://host/path/to/index.json 04:49:46 # 04:49:46 04:49:46 # 04:49:46 # Defines if the boot features are started in asynchronous mode (in a dedicated thread) 04:49:46 # 04:49:46 featuresBootAsynchronous=false 04:49:46 04:49:46 # 04:49:46 # Service requirements enforcement 04:49:46 # 04:49:46 # By default, the feature resolver checks the service requirements/capabilities of 04:49:46 # bundles for new features (xml schema >= 1.3.0) in order to automatically installs 04:49:46 # the required bundles. 04:49:46 # The following flag can have those values: 04:49:46 # - disable: service requirements are completely ignored 04:49:46 # - default: service requirements are ignored for old features 04:49:46 # - enforce: service requirements are always verified 04:49:46 # 04:49:46 #serviceRequirements=default 04:49:46 04:49:46 # 04:49:46 # Store cfg file for config element in feature 04:49:46 # 04:49:46 #configCfgStore=true 04:49:46 04:49:46 # 04:49:46 # Define if the feature service automatically refresh bundles 04:49:46 # 04:49:46 autoRefresh=true 04:49:46 04:49:46 # 04:49:46 # Configuration of features processing mechanism (overrides, blacklisting, modification of features) 04:49:46 # XML file defines instructions related to features processing 04:49:46 # versions.properties may declare properties to resolve placeholders in XML file 04:49:46 # both files are relative to ${karaf.etc} 04:49:46 # 04:49:46 #featureProcessing=org.apache.karaf.features.xml 04:49:46 #featureProcessingVersions=versions.properties 04:49:46 + configure_karaf_log karaf4 '' 04:49:46 + local -r karaf_version=karaf4 04:49:46 + local -r controllerdebugmap= 04:49:46 + local logapi=log4j 04:49:46 + grep log4j2 /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:46 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 04:49:46 log4j2.rootLogger.level = INFO 04:49:46 #log4j2.rootLogger.type = asyncRoot 04:49:46 #log4j2.rootLogger.includeLocation = false 04:49:46 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 04:49:46 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 04:49:46 log4j2.rootLogger.appenderRef.Console.ref = Console 04:49:46 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 04:49:46 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 04:49:46 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 04:49:46 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 04:49:46 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 04:49:46 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 04:49:46 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 04:49:46 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 04:49:46 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 04:49:46 log4j2.logger.spifly.name = org.apache.aries.spifly 04:49:46 log4j2.logger.spifly.level = WARN 04:49:46 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 04:49:46 log4j2.logger.audit.level = INFO 04:49:46 log4j2.logger.audit.additivity = false 04:49:46 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 04:49:46 # Console 
appender not used by default (see log4j2.rootLogger.appenderRefs) 04:49:46 log4j2.appender.console.type = Console 04:49:46 log4j2.appender.console.name = Console 04:49:46 log4j2.appender.console.layout.type = PatternLayout 04:49:46 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 04:49:46 log4j2.appender.rolling.type = RollingRandomAccessFile 04:49:46 log4j2.appender.rolling.name = RollingFile 04:49:46 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 04:49:46 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 04:49:46 #log4j2.appender.rolling.immediateFlush = false 04:49:46 log4j2.appender.rolling.append = true 04:49:46 log4j2.appender.rolling.layout.type = PatternLayout 04:49:46 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 04:49:46 log4j2.appender.rolling.policies.type = Policies 04:49:46 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 04:49:46 log4j2.appender.rolling.policies.size.size = 64MB 04:49:46 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 04:49:46 log4j2.appender.rolling.strategy.max = 7 04:49:46 log4j2.appender.audit.type = RollingRandomAccessFile 04:49:46 log4j2.appender.audit.name = AuditRollingFile 04:49:46 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 04:49:46 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 04:49:46 log4j2.appender.audit.append = true 04:49:46 log4j2.appender.audit.layout.type = PatternLayout 04:49:46 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 04:49:46 log4j2.appender.audit.policies.type = Policies 04:49:46 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 04:49:46 log4j2.appender.audit.policies.size.size = 8MB 04:49:46 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 04:49:46 log4j2.appender.audit.strategy.max = 7 04:49:46 log4j2.appender.osgi.type = PaxOsgi 04:49:46 log4j2.appender.osgi.name = PaxOsgi 04:49:46 log4j2.appender.osgi.filter = * 04:49:46 #log4j2.logger.aether.name = shaded.org.eclipse.aether 04:49:46 #log4j2.logger.aether.level = TRACE 04:49:46 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 04:49:46 #log4j2.logger.http-headers.level = DEBUG 04:49:46 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 04:49:46 #log4j2.logger.maven.level = TRACE 04:49:46 + logapi=log4j2 04:49:46 + echo 'Configuring the karaf log... karaf_version: karaf4, logapi: log4j2' 04:49:46 Configuring the karaf log... 
karaf_version: karaf4, logapi: log4j2 04:49:46 + '[' log4j2 == log4j2 ']' 04:49:46 + sed -ie 's/log4j2.appender.rolling.policies.size.size = 64MB/log4j2.appender.rolling.policies.size.size = 1GB/g' /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:46 controllerdebugmap: 04:49:46 cat /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:46 + orgmodule=org.opendaylight.yangtools.yang.parser.repo.YangTextSchemaContextResolver 04:49:46 + orgmodule_=org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver 04:49:46 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN' 04:49:46 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN' 04:49:46 + unset IFS 04:49:46 + echo 'controllerdebugmap: ' 04:49:46 + '[' -n '' ']' 04:49:46 + echo 'cat /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg' 04:49:46 + cat /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:46 ################################################################################ 04:49:46 # 04:49:46 # Licensed to the Apache Software Foundation (ASF) under one or more 04:49:46 # contributor license agreements. See the NOTICE file distributed with 04:49:46 # this work for additional information regarding copyright ownership. 04:49:46 # The ASF licenses this file to You under the Apache License, Version 2.0 04:49:46 # (the "License"); you may not use this file except in compliance with 04:49:46 # the License. You may obtain a copy of the License at 04:49:46 # 04:49:46 # http://www.apache.org/licenses/LICENSE-2.0 04:49:46 # 04:49:46 # Unless required by applicable law or agreed to in writing, software 04:49:46 # distributed under the License is distributed on an "AS IS" BASIS, 04:49:46 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:49:46 # See the License for the specific language governing permissions and 04:49:46 # limitations under the License. 
04:49:46 # 04:49:46 ################################################################################ 04:49:46 04:49:46 # Common pattern layout for appenders 04:49:46 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 04:49:46 04:49:46 # Root logger 04:49:46 log4j2.rootLogger.level = INFO 04:49:46 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 04:49:46 #log4j2.rootLogger.type = asyncRoot 04:49:46 #log4j2.rootLogger.includeLocation = false 04:49:46 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 04:49:46 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 04:49:46 log4j2.rootLogger.appenderRef.Console.ref = Console 04:49:46 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 04:49:46 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 04:49:46 04:49:46 # Filters for logs marked by org.opendaylight.odlparent.Markers 04:49:46 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 04:49:46 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 04:49:46 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 04:49:46 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 04:49:46 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 04:49:46 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 04:49:46 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 04:49:46 04:49:46 # Loggers configuration 04:49:46 04:49:46 # Spifly logger 04:49:46 log4j2.logger.spifly.name = org.apache.aries.spifly 04:49:46 log4j2.logger.spifly.level = WARN 04:49:46 04:49:46 # Security audit logger 04:49:46 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 04:49:46 log4j2.logger.audit.level = INFO 04:49:46 log4j2.logger.audit.additivity = false 04:49:46 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 04:49:46 04:49:46 # Appenders configuration 04:49:46 04:49:46 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 04:49:46 log4j2.appender.console.type = Console 04:49:46 log4j2.appender.console.name = Console 04:49:46 log4j2.appender.console.layout.type = PatternLayout 04:49:46 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 04:49:46 04:49:46 # Rolling file appender 04:49:46 log4j2.appender.rolling.type = RollingRandomAccessFile 04:49:46 log4j2.appender.rolling.name = RollingFile 04:49:46 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 04:49:46 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 04:49:46 # uncomment to not force a disk flush 04:49:46 #log4j2.appender.rolling.immediateFlush = false 04:49:46 log4j2.appender.rolling.append = true 04:49:46 log4j2.appender.rolling.layout.type = PatternLayout 04:49:46 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 04:49:46 log4j2.appender.rolling.policies.type = Policies 04:49:46 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 04:49:46 log4j2.appender.rolling.policies.size.size = 1GB 04:49:46 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 04:49:46 log4j2.appender.rolling.strategy.max = 7 04:49:46 04:49:46 # Audit file appender 04:49:46 log4j2.appender.audit.type = RollingRandomAccessFile 04:49:46 log4j2.appender.audit.name = AuditRollingFile 
04:49:46 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 04:49:46 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 04:49:46 log4j2.appender.audit.append = true 04:49:46 log4j2.appender.audit.layout.type = PatternLayout 04:49:46 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 04:49:46 log4j2.appender.audit.policies.type = Policies 04:49:46 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 04:49:46 log4j2.appender.audit.policies.size.size = 8MB 04:49:46 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 04:49:46 log4j2.appender.audit.strategy.max = 7 04:49:46 04:49:46 # OSGi appender 04:49:46 log4j2.appender.osgi.type = PaxOsgi 04:49:46 log4j2.appender.osgi.name = PaxOsgi 04:49:46 log4j2.appender.osgi.filter = * 04:49:46 04:49:46 # help with identification of maven-related problems with pax-url-aether 04:49:46 #log4j2.logger.aether.name = shaded.org.eclipse.aether 04:49:46 #log4j2.logger.aether.level = TRACE 04:49:46 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 04:49:46 #log4j2.logger.http-headers.level = DEBUG 04:49:46 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 04:49:46 #log4j2.logger.maven.level = TRACE 04:49:46 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 04:49:46 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 04:49:46 + set_java_vars /usr/lib/jvm/java-21-openjdk-amd64 2048m /tmp/karaf-0.21.4/bin/setenv 04:49:46 + local -r java_home=/usr/lib/jvm/java-21-openjdk-amd64 04:49:46 + local -r controllermem=2048m 04:49:46 Configure 04:49:46 java home: /usr/lib/jvm/java-21-openjdk-amd64 04:49:46 max memory: 2048m 04:49:46 memconf: /tmp/karaf-0.21.4/bin/setenv 04:49:46 + local -r memconf=/tmp/karaf-0.21.4/bin/setenv 04:49:46 + echo Configure 04:49:46 + echo ' java home: /usr/lib/jvm/java-21-openjdk-amd64' 04:49:46 + echo ' max memory: 2048m' 04:49:46 + echo ' memconf: /tmp/karaf-0.21.4/bin/setenv' 04:49:46 + sed -ie 's%^# export JAVA_HOME%export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64}%g' /tmp/karaf-0.21.4/bin/setenv 04:49:46 + sed -ie 's/JAVA_MAX_MEM="2048m"/JAVA_MAX_MEM=2048m/g' /tmp/karaf-0.21.4/bin/setenv 04:49:46 cat /tmp/karaf-0.21.4/bin/setenv 04:49:46 + echo 'cat /tmp/karaf-0.21.4/bin/setenv' 04:49:46 + cat /tmp/karaf-0.21.4/bin/setenv 04:49:46 #!/bin/sh 04:49:46 # 04:49:46 # Licensed to the Apache Software Foundation (ASF) under one or more 04:49:46 # contributor license agreements. See the NOTICE file distributed with 04:49:46 # this work for additional information regarding copyright ownership. 04:49:46 # The ASF licenses this file to You under the Apache License, Version 2.0 04:49:46 # (the "License"); you may not use this file except in compliance with 04:49:46 # the License. You may obtain a copy of the License at 04:49:46 # 04:49:46 # http://www.apache.org/licenses/LICENSE-2.0 04:49:46 # 04:49:46 # Unless required by applicable law or agreed to in writing, software 04:49:46 # distributed under the License is distributed on an "AS IS" BASIS, 04:49:46 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:49:46 # See the License for the specific language governing permissions and 04:49:46 # limitations under the License. 
04:49:46 # 04:49:46 04:49:46 # 04:49:46 # handle specific scripts; the SCRIPT_NAME is exactly the name of the Karaf 04:49:46 # script: client, instance, shell, start, status, stop, karaf 04:49:46 # 04:49:46 # if [ "${KARAF_SCRIPT}" == "SCRIPT_NAME" ]; then 04:49:46 # Actions go here... 04:49:46 # fi 04:49:46 04:49:46 # 04:49:46 # general settings which should be applied for all scripts go here; please keep 04:49:46 # in mind that it is possible that scripts might be executed more than once, e.g. 04:49:46 # in example of the start script where the start script is executed first and the 04:49:46 # karaf script afterwards. 04:49:46 # 04:49:46 04:49:46 # 04:49:46 # The following section shows the possible configuration options for the default 04:49:46 # karaf scripts 04:49:46 # 04:49:46 export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64} # Location of Java installation 04:49:46 # export JAVA_OPTS # Generic JVM options, for instance, where you can pass the memory configuration 04:49:46 # export JAVA_NON_DEBUG_OPTS # Additional non-debug JVM options 04:49:46 # export EXTRA_JAVA_OPTS # Additional JVM options 04:49:46 # export KARAF_HOME # Karaf home folder 04:49:46 # export KARAF_DATA # Karaf data folder 04:49:46 # export KARAF_BASE # Karaf base folder 04:49:46 # export KARAF_ETC # Karaf etc folder 04:49:46 # export KARAF_LOG # Karaf log folder 04:49:46 # export KARAF_SYSTEM_OPTS # First citizen Karaf options 04:49:46 # export KARAF_OPTS # Additional available Karaf options 04:49:46 # export KARAF_DEBUG # Enable debug mode 04:49:46 # export KARAF_REDIRECT # Enable/set the std/err redirection when using bin/start 04:49:46 # export KARAF_NOROOT # Prevent execution as root if set to true 04:49:46 Set Java version 04:49:46 + echo 'Set Java version' 04:49:46 + sudo /usr/sbin/alternatives --install /usr/bin/java java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 1 04:49:46 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 04:49:46 sudo: a password is required 04:49:46 + sudo /usr/sbin/alternatives --set java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:49:46 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 04:49:46 sudo: a password is required 04:49:46 JDK default version ... 04:49:46 + echo 'JDK default version ...' 04:49:46 + java -version 04:49:47 openjdk version "21.0.8" 2025-07-15 04:49:47 OpenJDK Runtime Environment (build 21.0.8+9-Ubuntu-0ubuntu122.04.1) 04:49:47 OpenJDK 64-Bit Server VM (build 21.0.8+9-Ubuntu-0ubuntu122.04.1, mixed mode, sharing) 04:49:47 Set JAVA_HOME 04:49:47 + echo 'Set JAVA_HOME' 04:49:47 + export JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 04:49:47 + JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 04:49:47 ++ readlink -e /usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:49:47 Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:49:47 + JAVA_RESOLVED=/usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:49:47 + echo 'Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java' 04:49:47 Listing all open ports on controller system... 04:49:47 + echo 'Listing all open ports on controller system...' 
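The Java setup above amounts to pointing bin/setenv at the requested JDK and heap size and then exporting JAVA_HOME for the current shell. A rough sketch of those edits using the paths and values from this log (the local variable names mirror the set_java_vars arguments; the sed commands are the ones shown in the trace):

java_home='/usr/lib/jvm/java-21-openjdk-amd64'
controllermem='2048m'
memconf='/tmp/karaf-0.21.4/bin/setenv'
# Uncomment JAVA_HOME in setenv and point it at the requested JDK, keeping any pre-set value
sed -ie "s%^# export JAVA_HOME%export JAVA_HOME=\${JAVA_HOME:-${java_home}}%g" "${memconf}"
# Replace the default JAVA_MAX_MEM setting with the configured heap size
sed -ie "s/JAVA_MAX_MEM=\"2048m\"/JAVA_MAX_MEM=${controllermem}/g" "${memconf}"
export JAVA_HOME="${java_home}"
# Sanity check: resolve the java binary JAVA_HOME points at
JAVA_RESOLVED=$(readlink -e "${JAVA_HOME}/bin/java")
echo "Java binary pointed at by JAVA_HOME: ${JAVA_RESOLVED}"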
04:49:47 + netstat -pnatu 04:49:47 /tmp/configuration-script.sh: line 40: netstat: command not found 04:49:47 Configuring cluster 04:49:47 + '[' -f /tmp/custom_shard_config.txt ']' 04:49:47 + echo 'Configuring cluster' 04:49:47 + /tmp/karaf-0.21.4/bin/configure_cluster.sh 2 10.30.170.174 10.30.170.199 10.30.171.237 04:49:47 ################################################ 04:49:47 ## Configure Cluster ## 04:49:47 ################################################ 04:49:47 NOTE: Cluster configuration files not found. Copying from 04:49:47 /tmp/karaf-0.21.4/system/org/opendaylight/controller/sal-clustering-config/10.0.14 04:49:47 Configuring unique name in akka.conf 04:49:47 Configuring hostname in akka.conf 04:49:47 Configuring data and rpc seed nodes in akka.conf 04:49:47 modules = [ 04:49:47 04:49:47 { 04:49:47 name = "inventory" 04:49:47 namespace = "urn:opendaylight:inventory" 04:49:47 shard-strategy = "module" 04:49:47 }, 04:49:47 { 04:49:47 name = "topology" 04:49:47 namespace = "urn:TBD:params:xml:ns:yang:network-topology" 04:49:47 shard-strategy = "module" 04:49:47 }, 04:49:47 { 04:49:47 name = "toaster" 04:49:47 namespace = "http://netconfcentral.org/ns/toaster" 04:49:47 shard-strategy = "module" 04:49:47 } 04:49:47 ] 04:49:47 Configuring replication type in module-shards.conf 04:49:47 ################################################ 04:49:47 ## NOTE: Manually restart controller to ## 04:49:47 ## apply configuration. ## 04:49:47 ################################################ 04:49:47 Dump akka.conf 04:49:47 + echo 'Dump akka.conf' 04:49:47 + cat /tmp/karaf-0.21.4/configuration/initial/akka.conf 04:49:47 04:49:47 odl-cluster-data { 04:49:47 akka { 04:49:47 remote { 04:49:47 artery { 04:49:47 enabled = on 04:49:47 transport = tcp 04:49:47 canonical.hostname = "10.30.170.199" 04:49:47 canonical.port = 2550 04:49:47 } 04:49:47 } 04:49:47 04:49:47 cluster { 04:49:47 # Using artery. 04:49:47 seed-nodes = ["akka://opendaylight-cluster-data@10.30.170.174:2550", 04:49:47 "akka://opendaylight-cluster-data@10.30.170.199:2550", 04:49:47 "akka://opendaylight-cluster-data@10.30.171.237:2550"] 04:49:47 04:49:47 roles = ["member-2"] 04:49:47 04:49:47 # when under load we might trip a false positive on the failure detector 04:49:47 # failure-detector { 04:49:47 # heartbeat-interval = 4 s 04:49:47 # acceptable-heartbeat-pause = 16s 04:49:47 # } 04:49:47 } 04:49:47 04:49:47 persistence { 04:49:47 # By default the snapshots/journal directories live in KARAF_HOME. You can choose to put it somewhere else by 04:49:47 # modifying the following two properties. The directory location specified may be a relative or absolute path. 04:49:47 # The relative path is always relative to KARAF_HOME. 
04:49:47 04:49:47 # snapshot-store.local.dir = "target/snapshots" 04:49:47 04:49:47 # Use lz4 compression for LocalSnapshotStore snapshots 04:49:47 snapshot-store.local.use-lz4-compression = false 04:49:47 # Size of blocks for lz4 compression: 64KB, 256KB, 1MB or 4MB 04:49:47 snapshot-store.local.lz4-blocksize = 256KB 04:49:47 } 04:49:47 disable-default-actor-system-quarantined-event-handling = "false" 04:49:47 } 04:49:47 } 04:49:47 Dump modules.conf 04:49:47 + echo 'Dump modules.conf' 04:49:47 + cat /tmp/karaf-0.21.4/configuration/initial/modules.conf 04:49:47 modules = [ 04:49:47 04:49:47 { 04:49:47 name = "inventory" 04:49:47 namespace = "urn:opendaylight:inventory" 04:49:47 shard-strategy = "module" 04:49:47 }, 04:49:47 { 04:49:47 name = "topology" 04:49:47 namespace = "urn:TBD:params:xml:ns:yang:network-topology" 04:49:47 shard-strategy = "module" 04:49:47 }, 04:49:47 { 04:49:47 name = "toaster" 04:49:47 namespace = "http://netconfcentral.org/ns/toaster" 04:49:47 shard-strategy = "module" 04:49:47 } 04:49:47 ] 04:49:47 + echo 'Dump module-shards.conf' 04:49:47 Dump module-shards.conf 04:49:47 + cat /tmp/karaf-0.21.4/configuration/initial/module-shards.conf 04:49:47 module-shards = [ 04:49:47 { 04:49:47 name = "default" 04:49:47 shards = [ 04:49:47 { 04:49:47 name = "default" 04:49:47 replicas = ["member-1", 04:49:47 "member-2", 04:49:47 "member-3"] 04:49:47 } 04:49:47 ] 04:49:47 }, 04:49:47 { 04:49:47 name = "inventory" 04:49:47 shards = [ 04:49:47 { 04:49:47 name="inventory" 04:49:47 replicas = ["member-1", 04:49:47 "member-2", 04:49:47 "member-3"] 04:49:47 } 04:49:47 ] 04:49:47 }, 04:49:47 { 04:49:47 name = "topology" 04:49:47 shards = [ 04:49:47 { 04:49:47 name="topology" 04:49:47 replicas = ["member-1", 04:49:47 "member-2", 04:49:47 "member-3"] 04:49:47 } 04:49:47 ] 04:49:47 }, 04:49:47 { 04:49:47 name = "toaster" 04:49:47 shards = [ 04:49:47 { 04:49:47 name="toaster" 04:49:47 replicas = ["member-1", 04:49:47 "member-2", 04:49:47 "member-3"] 04:49:47 } 04:49:47 ] 04:49:47 } 04:49:47 ] 04:49:47 Configuring member-3 with IP address 10.30.171.237 04:49:47 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 04:49:47 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 
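The cluster wiring shown above is driven by a member index plus the ordered list of controller IPs: configure_cluster.sh writes canonical.hostname and roles=["member-N"] for the local node and lists all three members as akka seed nodes in akka.conf, while module-shards.conf replicates each shard across member-1..member-3. A hedged sketch of driving that per node (the ssh loop and array name are illustrative; the configure_cluster.sh arguments match the invocation in the log):

ODL_SYSTEM_IPS=(10.30.170.174 10.30.170.199 10.30.171.237)
BUNDLEFOLDER='karaf-0.21.4'
for i in "${!ODL_SYSTEM_IPS[@]}"; do
    member_index=$((i + 1))
    # Each member gets its own index; the full IP list becomes the shared seed-node set.
    ssh "${ODL_SYSTEM_IPS[$i]}" \
        "/tmp/${BUNDLEFOLDER}/bin/configure_cluster.sh ${member_index} ${ODL_SYSTEM_IPS[*]}"
done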
04:49:47 + source /tmp/common-functions.sh karaf-0.21.4 scandium 04:49:47 ++ [[ /tmp/common-functions.sh == \/\t\m\p\/\c\o\n\f\i\g\u\r\a\t\i\o\n\-\s\c\r\i\p\t\.\s\h ]] 04:49:47 ++ echo 'common-functions.sh is being sourced' 04:49:47 common-functions.sh is being sourced 04:49:47 ++ BUNDLEFOLDER=karaf-0.21.4 04:49:47 ++ DISTROSTREAM=scandium 04:49:47 ++ export MAVENCONF=/tmp/karaf-0.21.4/etc/org.ops4j.pax.url.mvn.cfg 04:49:47 ++ MAVENCONF=/tmp/karaf-0.21.4/etc/org.ops4j.pax.url.mvn.cfg 04:49:47 ++ export FEATURESCONF=/tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:47 ++ FEATURESCONF=/tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:47 ++ export CUSTOMPROP=/tmp/karaf-0.21.4/etc/custom.properties 04:49:47 ++ CUSTOMPROP=/tmp/karaf-0.21.4/etc/custom.properties 04:49:47 ++ export LOGCONF=/tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:47 ++ LOGCONF=/tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:47 ++ export MEMCONF=/tmp/karaf-0.21.4/bin/setenv 04:49:47 ++ MEMCONF=/tmp/karaf-0.21.4/bin/setenv 04:49:47 ++ export CONTROLLERMEM= 04:49:47 ++ CONTROLLERMEM= 04:49:47 ++ case "${DISTROSTREAM}" in 04:49:47 ++ CLUSTER_SYSTEM=akka 04:49:47 ++ export AKKACONF=/tmp/karaf-0.21.4/configuration/initial/akka.conf 04:49:47 ++ AKKACONF=/tmp/karaf-0.21.4/configuration/initial/akka.conf 04:49:47 ++ export MODULESCONF=/tmp/karaf-0.21.4/configuration/initial/modules.conf 04:49:47 ++ MODULESCONF=/tmp/karaf-0.21.4/configuration/initial/modules.conf 04:49:47 ++ export MODULESHARDSCONF=/tmp/karaf-0.21.4/configuration/initial/module-shards.conf 04:49:47 ++ MODULESHARDSCONF=/tmp/karaf-0.21.4/configuration/initial/module-shards.conf 04:49:47 ++ print_common_env 04:49:47 ++ cat 04:49:47 common-functions environment: 04:49:47 MAVENCONF: /tmp/karaf-0.21.4/etc/org.ops4j.pax.url.mvn.cfg 04:49:47 ACTUALFEATURES: 04:49:47 FEATURESCONF: /tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:47 CUSTOMPROP: /tmp/karaf-0.21.4/etc/custom.properties 04:49:47 LOGCONF: /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:47 MEMCONF: /tmp/karaf-0.21.4/bin/setenv 04:49:47 CONTROLLERMEM: 04:49:47 AKKACONF: /tmp/karaf-0.21.4/configuration/initial/akka.conf 04:49:47 MODULESCONF: /tmp/karaf-0.21.4/configuration/initial/modules.conf 04:49:47 MODULESHARDSCONF: /tmp/karaf-0.21.4/configuration/initial/module-shards.conf 04:49:47 SUITES: 04:49:47 04:49:47 ++ SSH='ssh -t -t' 04:49:47 ++ extra_services_cntl=' dnsmasq.service httpd.service libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service rabbitmq-server.service ' 04:49:47 ++ extra_services_cmp=' libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service ' 04:49:47 Changing to /tmp 04:49:47 Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9409/org/opendaylight/integration/karaf/0.21.4/karaf-0.21.4.zip 04:49:47 + echo 'Changing to /tmp' 04:49:47 + cd /tmp 04:49:47 + echo 'Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-9409/org/opendaylight/integration/karaf/0.21.4/karaf-0.21.4.zip' 04:49:47 + wget --progress=dot:mega https://nexus.opendaylight.org/content/repositories//autorelease-9409/org/opendaylight/integration/karaf/0.21.4/karaf-0.21.4.zip 04:49:47 --2025-12-02 04:49:47-- https://nexus.opendaylight.org/content/repositories//autorelease-9409/org/opendaylight/integration/karaf/0.21.4/karaf-0.21.4.zip 04:49:47 Resolving nexus.opendaylight.org (nexus.opendaylight.org)... 
199.204.45.87, 2604:e100:1:0:f816:3eff:fe45:48d6 04:49:47 Connecting to nexus.opendaylight.org (nexus.opendaylight.org)|199.204.45.87|:443... connected. 04:49:47 HTTP request sent, awaiting response... 200 OK 04:49:47 Length: 239704696 (229M) [application/zip] 04:49:47 Saving to: ‘karaf-0.21.4.zip’ [wget progress dots from 0K through 233472K omitted]
04:49:48 2025-12-02 04:49:48 (276 MB/s) - ‘karaf-0.21.4.zip’ saved [239704696/239704696] 04:49:48 04:49:48 + echo 'Extracting the new controller...' 04:49:48 Extracting the new controller... 04:49:48 + unzip -q karaf-0.21.4.zip 04:49:50 Adding external repositories... 04:49:50 + echo 'Adding external repositories...'
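The 'Adding external repositories...' step that follows is the same one-line sed already applied on the first controller: it fills in the empty org.ops4j.pax.url.mvn.repositories key so pax-url-mvn can resolve OpenDaylight snapshot artifacts. A trimmed sketch showing only the two OpenDaylight repositories (the command in the log also appends the stock Karaf defaults; MAVENCONF matches the path exported by common-functions.sh):

MAVENCONF='/tmp/karaf-0.21.4/etc/org.ops4j.pax.url.mvn.cfg'
ODL_REPOS='https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror'
# Inject the ODL Nexus repositories right after the 'org.ops4j.pax.url.mvn.repositories=' key;
# '%' is used as the sed delimiter because the repository URLs contain '/'.
sed -ie "s%org.ops4j.pax.url.mvn.repositories=%org.ops4j.pax.url.mvn.repositories=${ODL_REPOS}%g" "${MAVENCONF}"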
04:49:50 + sed -ie 's%org.ops4j.pax.url.mvn.repositories=%org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases%g' /tmp/karaf-0.21.4/etc/org.ops4j.pax.url.mvn.cfg 04:49:50 + cat /tmp/karaf-0.21.4/etc/org.ops4j.pax.url.mvn.cfg 04:49:50 ################################################################################ 04:49:50 # 04:49:50 # Licensed to the Apache Software Foundation (ASF) under one or more 04:49:50 # contributor license agreements. See the NOTICE file distributed with 04:49:50 # this work for additional information regarding copyright ownership. 04:49:50 # The ASF licenses this file to You under the Apache License, Version 2.0 04:49:50 # (the "License"); you may not use this file except in compliance with 04:49:50 # the License. You may obtain a copy of the License at 04:49:50 # 04:49:50 # http://www.apache.org/licenses/LICENSE-2.0 04:49:50 # 04:49:50 # Unless required by applicable law or agreed to in writing, software 04:49:50 # distributed under the License is distributed on an "AS IS" BASIS, 04:49:50 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:49:50 # See the License for the specific language governing permissions and 04:49:50 # limitations under the License. 04:49:50 # 04:49:50 ################################################################################ 04:49:50 04:49:50 # 04:49:50 # If set to true, the following property will not allow any certificate to be used 04:49:50 # when accessing Maven repositories through SSL 04:49:50 # 04:49:50 #org.ops4j.pax.url.mvn.certificateCheck= 04:49:50 04:49:50 # 04:49:50 # Path to the local Maven settings file. 04:49:50 # The repositories defined in this file will be automatically added to the list 04:49:50 # of default repositories if the 'org.ops4j.pax.url.mvn.repositories' property 04:49:50 # below is not set. 04:49:50 # The following locations are checked for the existence of the settings.xml file 04:49:50 # * 1. looks for the specified url 04:49:50 # * 2. if not found looks for ${user.home}/.m2/settings.xml 04:49:50 # * 3. if not found looks for ${maven.home}/conf/settings.xml 04:49:50 # * 4. if not found looks for ${M2_HOME}/conf/settings.xml 04:49:50 # 04:49:50 #org.ops4j.pax.url.mvn.settings= 04:49:50 04:49:50 # 04:49:50 # Path to the local Maven repository which is used to avoid downloading 04:49:50 # artifacts when they already exist locally. 04:49:50 # The value of this property will be extracted from the settings.xml file 04:49:50 # above, or defaulted to: 04:49:50 # System.getProperty( "user.home" ) + "/.m2/repository" 04:49:50 # 04:49:50 org.ops4j.pax.url.mvn.localRepository=${karaf.home}/${karaf.default.repository} 04:49:50 04:49:50 # 04:49:50 # Default this to false. 
It's just weird to use undocumented repos 04:49:50 # 04:49:50 org.ops4j.pax.url.mvn.useFallbackRepositories=false 04:49:50 04:49:50 # 04:49:50 # Uncomment if you don't wanna use the proxy settings 04:49:50 # from the Maven conf/settings.xml file 04:49:50 # 04:49:50 # org.ops4j.pax.url.mvn.proxySupport=false 04:49:50 04:49:50 # 04:49:50 # Comma separated list of repositories scanned when resolving an artifact. 04:49:50 # Those repositories will be checked before iterating through the 04:49:50 # below list of repositories and even before the local repository 04:49:50 # A repository url can be appended with zero or more of the following flags: 04:49:50 # @snapshots : the repository contains snaphots 04:49:50 # @noreleases : the repository does not contain any released artifacts 04:49:50 # 04:49:50 # The following property value will add the system folder as a repo. 04:49:50 # 04:49:50 org.ops4j.pax.url.mvn.defaultRepositories=\ 04:49:50 file:${karaf.home}/${karaf.default.repository}@id=system.repository@snapshots,\ 04:49:50 file:${karaf.data}/kar@id=kar.repository@multi@snapshots,\ 04:49:50 file:${karaf.base}/${karaf.default.repository}@id=child.system.repository@snapshots 04:49:50 04:49:50 # Use the default local repo (e.g.~/.m2/repository) as a "remote" repo 04:49:50 #org.ops4j.pax.url.mvn.defaultLocalRepoAsRemote=false 04:49:50 04:49:50 # 04:49:50 # Comma separated list of repositories scanned when resolving an artifact. 04:49:50 # The default list includes the following repositories: 04:49:50 # http://repo1.maven.org/maven2@id=central 04:49:50 # http://repository.springsource.com/maven/bundles/release@id=spring.ebr 04:49:50 # http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external 04:49:50 # http://zodiac.springsource.com/maven/bundles/release@id=gemini 04:49:50 # http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases 04:49:50 # https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases 04:49:50 # https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 04:49:50 # To add repositories to the default ones, prepend '+' to the list of repositories 04:49:50 # to add. 04:49:50 # A repository url can be appended with zero or more of the following flags: 04:49:50 # @snapshots : the repository contains snapshots 04:49:50 # @noreleases : the repository does not contain any released artifacts 04:49:50 # @id=repository.id : the id for the repository, just like in the settings.xml this is optional but recommended 04:49:50 # 04:49:50 org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases 04:49:50 04:49:50 ### ^^^ No remote repositories. 
This is the only ODL change compared to Karaf defaults.Configuring the startup features... 04:49:50 + [[ True == \T\r\u\e ]] 04:49:50 + echo 'Configuring the startup features...' 04:49:50 + sed -ie 's/\(featuresBoot=\|featuresBoot =\)/featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer,/g' /tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:50 + FEATURE_TEST_STRING=features-test 04:49:50 + FEATURE_TEST_VERSION=0.21.4 04:49:50 + KARAF_VERSION=karaf4 04:49:50 + [[ integration == \i\n\t\e\g\r\a\t\i\o\n ]] 04:49:50 + sed -ie 's%\(featuresRepositories=\|featuresRepositories =\)%featuresRepositories = mvn:org.opendaylight.integration/features-test/0.21.4/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features,%g' /tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:50 + [[ ! -z '' ]] 04:49:50 + cat /tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:50 ################################################################################ 04:49:50 # 04:49:50 # Licensed to the Apache Software Foundation (ASF) under one or more 04:49:50 # contributor license agreements. See the NOTICE file distributed with 04:49:50 # this work for additional information regarding copyright ownership. 04:49:50 # The ASF licenses this file to You under the Apache License, Version 2.0 04:49:50 # (the "License"); you may not use this file except in compliance with 04:49:50 # the License. You may obtain a copy of the License at 04:49:50 # 04:49:50 # http://www.apache.org/licenses/LICENSE-2.0 04:49:50 # 04:49:50 # Unless required by applicable law or agreed to in writing, software 04:49:50 # distributed under the License is distributed on an "AS IS" BASIS, 04:49:50 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:49:50 # See the License for the specific language governing permissions and 04:49:50 # limitations under the License. 04:49:50 # 04:49:50 ################################################################################ 04:49:50 04:49:50 # 04:49:50 # Comma separated list of features repositories to register by default 04:49:50 # 04:49:50 featuresRepositories = mvn:org.opendaylight.integration/features-test/0.21.4/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features, file:${karaf.etc}/f3e5bf4d-c8b9-4e8e-a676-a978cb689c44.xml 04:49:50 04:49:50 # 04:49:50 # Comma separated list of features to install at startup 04:49:50 # 04:49:50 featuresBoot = odl-infrautils-ready,odl-jolokia,odl-openflowplugin-flow-services-rest,odl-openflowplugin-app-table-miss-enforcer, a90d8f2f-9a3e-46e7-99a0-9ab64083b4d5 04:49:50 04:49:50 # 04:49:50 # Resource repositories (OBR) that the features resolver can use 04:49:50 # to resolve requirements/capabilities 04:49:50 # 04:49:50 # The format of the resourceRepositories is 04:49:50 # resourceRepositories=[xml:url|json:url],... 
04:49:50 # for Instance: 04:49:50 # 04:49:50 #resourceRepositories=xml:http://host/path/to/index.xml 04:49:50 # or 04:49:50 #resourceRepositories=json:http://host/path/to/index.json 04:49:50 # 04:49:50 04:49:50 # 04:49:50 # Defines if the boot features are started in asynchronous mode (in a dedicated thread) 04:49:50 # 04:49:50 featuresBootAsynchronous=false 04:49:50 04:49:50 # 04:49:50 # Service requirements enforcement 04:49:50 # 04:49:50 # By default, the feature resolver checks the service requirements/capabilities of 04:49:50 # bundles for new features (xml schema >= 1.3.0) in order to automatically installs 04:49:50 # the required bundles. 04:49:50 # The following flag can have those values: 04:49:50 # - disable: service requirements are completely ignored 04:49:50 # - default: service requirements are ignored for old features 04:49:50 # - enforce: service requirements are always verified 04:49:50 # 04:49:50 #serviceRequirements=default 04:49:50 04:49:50 # 04:49:50 # Store cfg file for config element in feature 04:49:50 # 04:49:50 #configCfgStore=true 04:49:50 04:49:50 # 04:49:50 # Define if the feature service automatically refresh bundles 04:49:50 # 04:49:50 autoRefresh=true 04:49:50 04:49:50 # 04:49:50 # Configuration of features processing mechanism (overrides, blacklisting, modification of features) 04:49:50 # XML file defines instructions related to features processing 04:49:50 # versions.properties may declare properties to resolve placeholders in XML file 04:49:50 # both files are relative to ${karaf.etc} 04:49:50 # 04:49:50 #featureProcessing=org.apache.karaf.features.xml 04:49:50 #featureProcessingVersions=versions.properties 04:49:50 + configure_karaf_log karaf4 '' 04:49:50 + local -r karaf_version=karaf4 04:49:50 + local -r controllerdebugmap= 04:49:50 + local logapi=log4j 04:49:50 + grep log4j2 /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:50 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 04:49:50 log4j2.rootLogger.level = INFO 04:49:50 #log4j2.rootLogger.type = asyncRoot 04:49:50 #log4j2.rootLogger.includeLocation = false 04:49:50 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 04:49:50 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 04:49:50 log4j2.rootLogger.appenderRef.Console.ref = Console 04:49:50 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 04:49:50 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 04:49:50 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 04:49:50 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 04:49:50 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 04:49:50 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 04:49:50 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 04:49:50 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 04:49:50 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 04:49:50 log4j2.logger.spifly.name = org.apache.aries.spifly 04:49:50 log4j2.logger.spifly.level = WARN 04:49:50 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 04:49:50 log4j2.logger.audit.level = INFO 04:49:50 log4j2.logger.audit.additivity = false 04:49:50 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 04:49:50 # Console 
appender not used by default (see log4j2.rootLogger.appenderRefs) 04:49:50 log4j2.appender.console.type = Console 04:49:50 log4j2.appender.console.name = Console 04:49:50 log4j2.appender.console.layout.type = PatternLayout 04:49:50 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 04:49:50 log4j2.appender.rolling.type = RollingRandomAccessFile 04:49:50 log4j2.appender.rolling.name = RollingFile 04:49:50 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 04:49:50 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 04:49:50 #log4j2.appender.rolling.immediateFlush = false 04:49:50 log4j2.appender.rolling.append = true 04:49:50 log4j2.appender.rolling.layout.type = PatternLayout 04:49:50 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 04:49:50 log4j2.appender.rolling.policies.type = Policies 04:49:50 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 04:49:50 log4j2.appender.rolling.policies.size.size = 64MB 04:49:50 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 04:49:50 log4j2.appender.rolling.strategy.max = 7 04:49:50 log4j2.appender.audit.type = RollingRandomAccessFile 04:49:50 log4j2.appender.audit.name = AuditRollingFile 04:49:50 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 04:49:50 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 04:49:50 log4j2.appender.audit.append = true 04:49:50 log4j2.appender.audit.layout.type = PatternLayout 04:49:50 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 04:49:50 log4j2.appender.audit.policies.type = Policies 04:49:50 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 04:49:50 log4j2.appender.audit.policies.size.size = 8MB 04:49:50 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 04:49:50 log4j2.appender.audit.strategy.max = 7 04:49:50 log4j2.appender.osgi.type = PaxOsgi 04:49:50 log4j2.appender.osgi.name = PaxOsgi 04:49:50 log4j2.appender.osgi.filter = * 04:49:50 #log4j2.logger.aether.name = shaded.org.eclipse.aether 04:49:50 #log4j2.logger.aether.level = TRACE 04:49:50 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 04:49:50 #log4j2.logger.http-headers.level = DEBUG 04:49:50 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 04:49:50 #log4j2.logger.maven.level = TRACE 04:49:50 Configuring the karaf log... karaf_version: karaf4, logapi: log4j2 04:49:50 + logapi=log4j2 04:49:50 + echo 'Configuring the karaf log... 
karaf_version: karaf4, logapi: log4j2' 04:49:50 + '[' log4j2 == log4j2 ']' 04:49:50 + sed -ie 's/log4j2.appender.rolling.policies.size.size = 64MB/log4j2.appender.rolling.policies.size.size = 1GB/g' /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:50 + orgmodule=org.opendaylight.yangtools.yang.parser.repo.YangTextSchemaContextResolver 04:49:50 + orgmodule_=org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver 04:49:50 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN' 04:49:50 + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN' 04:49:50 controllerdebugmap: 04:49:50 + unset IFS 04:49:50 + echo 'controllerdebugmap: ' 04:49:50 + '[' -n '' ']' 04:49:50 cat /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:50 + echo 'cat /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg' 04:49:50 + cat /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:50 ################################################################################ 04:49:50 # 04:49:50 # Licensed to the Apache Software Foundation (ASF) under one or more 04:49:50 # contributor license agreements. See the NOTICE file distributed with 04:49:50 # this work for additional information regarding copyright ownership. 04:49:50 # The ASF licenses this file to You under the Apache License, Version 2.0 04:49:50 # (the "License"); you may not use this file except in compliance with 04:49:50 # the License. You may obtain a copy of the License at 04:49:50 # 04:49:50 # http://www.apache.org/licenses/LICENSE-2.0 04:49:50 # 04:49:50 # Unless required by applicable law or agreed to in writing, software 04:49:50 # distributed under the License is distributed on an "AS IS" BASIS, 04:49:50 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:49:50 # See the License for the specific language governing permissions and 04:49:50 # limitations under the License. 
04:49:50 # 04:49:50 ################################################################################ 04:49:50 04:49:50 # Common pattern layout for appenders 04:49:50 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 04:49:50 04:49:50 # Root logger 04:49:50 log4j2.rootLogger.level = INFO 04:49:50 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 04:49:50 #log4j2.rootLogger.type = asyncRoot 04:49:50 #log4j2.rootLogger.includeLocation = false 04:49:50 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 04:49:50 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 04:49:50 log4j2.rootLogger.appenderRef.Console.ref = Console 04:49:50 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 04:49:50 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 04:49:50 04:49:50 # Filters for logs marked by org.opendaylight.odlparent.Markers 04:49:50 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 04:49:50 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 04:49:50 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 04:49:50 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 04:49:50 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 04:49:50 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 04:49:50 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 04:49:50 04:49:50 # Loggers configuration 04:49:50 04:49:50 # Spifly logger 04:49:50 log4j2.logger.spifly.name = org.apache.aries.spifly 04:49:50 log4j2.logger.spifly.level = WARN 04:49:50 04:49:50 # Security audit logger 04:49:50 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 04:49:50 log4j2.logger.audit.level = INFO 04:49:50 log4j2.logger.audit.additivity = false 04:49:50 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 04:49:50 04:49:50 # Appenders configuration 04:49:50 04:49:50 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 04:49:50 log4j2.appender.console.type = Console 04:49:50 log4j2.appender.console.name = Console 04:49:50 log4j2.appender.console.layout.type = PatternLayout 04:49:50 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 04:49:50 04:49:50 # Rolling file appender 04:49:50 log4j2.appender.rolling.type = RollingRandomAccessFile 04:49:50 log4j2.appender.rolling.name = RollingFile 04:49:50 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 04:49:50 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 04:49:50 # uncomment to not force a disk flush 04:49:50 #log4j2.appender.rolling.immediateFlush = false 04:49:50 log4j2.appender.rolling.append = true 04:49:50 log4j2.appender.rolling.layout.type = PatternLayout 04:49:50 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 04:49:50 log4j2.appender.rolling.policies.type = Policies 04:49:50 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 04:49:50 log4j2.appender.rolling.policies.size.size = 1GB 04:49:50 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 04:49:50 log4j2.appender.rolling.strategy.max = 7 04:49:50 04:49:50 # Audit file appender 04:49:50 log4j2.appender.audit.type = RollingRandomAccessFile 04:49:50 log4j2.appender.audit.name = AuditRollingFile 
04:49:50 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 04:49:50 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 04:49:50 log4j2.appender.audit.append = true 04:49:50 log4j2.appender.audit.layout.type = PatternLayout 04:49:50 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 04:49:50 log4j2.appender.audit.policies.type = Policies 04:49:50 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 04:49:50 log4j2.appender.audit.policies.size.size = 8MB 04:49:50 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 04:49:50 log4j2.appender.audit.strategy.max = 7 04:49:50 04:49:50 # OSGi appender 04:49:50 log4j2.appender.osgi.type = PaxOsgi 04:49:50 log4j2.appender.osgi.name = PaxOsgi 04:49:50 log4j2.appender.osgi.filter = * 04:49:50 04:49:50 # help with identification of maven-related problems with pax-url-aether 04:49:50 #log4j2.logger.aether.name = shaded.org.eclipse.aether 04:49:50 #log4j2.logger.aether.level = TRACE 04:49:50 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 04:49:50 #log4j2.logger.http-headers.level = DEBUG 04:49:50 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 04:49:50 #log4j2.logger.maven.level = TRACE 04:49:50 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 04:49:50 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 04:49:50 + set_java_vars /usr/lib/jvm/java-21-openjdk-amd64 2048m /tmp/karaf-0.21.4/bin/setenv 04:49:50 Configure 04:49:50 java home: /usr/lib/jvm/java-21-openjdk-amd64 04:49:50 max memory: 2048m 04:49:50 memconf: /tmp/karaf-0.21.4/bin/setenv 04:49:50 + local -r java_home=/usr/lib/jvm/java-21-openjdk-amd64 04:49:50 + local -r controllermem=2048m 04:49:50 + local -r memconf=/tmp/karaf-0.21.4/bin/setenv 04:49:50 + echo Configure 04:49:50 + echo ' java home: /usr/lib/jvm/java-21-openjdk-amd64' 04:49:50 + echo ' max memory: 2048m' 04:49:50 + echo ' memconf: /tmp/karaf-0.21.4/bin/setenv' 04:49:50 + sed -ie 's%^# export JAVA_HOME%export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64}%g' /tmp/karaf-0.21.4/bin/setenv 04:49:50 + sed -ie 's/JAVA_MAX_MEM="2048m"/JAVA_MAX_MEM=2048m/g' /tmp/karaf-0.21.4/bin/setenv 04:49:50 cat /tmp/karaf-0.21.4/bin/setenv 04:49:50 + echo 'cat /tmp/karaf-0.21.4/bin/setenv' 04:49:50 + cat /tmp/karaf-0.21.4/bin/setenv 04:49:50 #!/bin/sh 04:49:50 # 04:49:50 # Licensed to the Apache Software Foundation (ASF) under one or more 04:49:50 # contributor license agreements. See the NOTICE file distributed with 04:49:50 # this work for additional information regarding copyright ownership. 04:49:50 # The ASF licenses this file to You under the Apache License, Version 2.0 04:49:50 # (the "License"); you may not use this file except in compliance with 04:49:50 # the License. You may obtain a copy of the License at 04:49:50 # 04:49:50 # http://www.apache.org/licenses/LICENSE-2.0 04:49:50 # 04:49:50 # Unless required by applicable law or agreed to in writing, software 04:49:50 # distributed under the License is distributed on an "AS IS" BASIS, 04:49:50 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:49:50 # See the License for the specific language governing permissions and 04:49:50 # limitations under the License. 
04:49:50 # 04:49:50 04:49:50 # 04:49:50 # handle specific scripts; the SCRIPT_NAME is exactly the name of the Karaf 04:49:50 # script: client, instance, shell, start, status, stop, karaf 04:49:50 # 04:49:50 # if [ "${KARAF_SCRIPT}" == "SCRIPT_NAME" ]; then 04:49:50 # Actions go here... 04:49:50 # fi 04:49:50 04:49:50 # 04:49:50 # general settings which should be applied for all scripts go here; please keep 04:49:50 # in mind that it is possible that scripts might be executed more than once, e.g. 04:49:50 # in example of the start script where the start script is executed first and the 04:49:50 # karaf script afterwards. 04:49:50 # 04:49:50 04:49:50 # 04:49:50 # The following section shows the possible configuration options for the default 04:49:50 # karaf scripts 04:49:50 # 04:49:50 export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-21-openjdk-amd64} # Location of Java installation 04:49:50 # export JAVA_OPTS # Generic JVM options, for instance, where you can pass the memory configuration 04:49:50 # export JAVA_NON_DEBUG_OPTS # Additional non-debug JVM options 04:49:50 # export EXTRA_JAVA_OPTS # Additional JVM options 04:49:50 # export KARAF_HOME # Karaf home folder 04:49:50 # export KARAF_DATA # Karaf data folder 04:49:50 # export KARAF_BASE # Karaf base folder 04:49:50 # export KARAF_ETC # Karaf etc folder 04:49:50 # export KARAF_LOG # Karaf log folder 04:49:50 # export KARAF_SYSTEM_OPTS # First citizen Karaf options 04:49:50 # export KARAF_OPTS # Additional available Karaf options 04:49:50 # export KARAF_DEBUG # Enable debug mode 04:49:50 # export KARAF_REDIRECT # Enable/set the std/err redirection when using bin/start 04:49:50 # export KARAF_NOROOT # Prevent execution as root if set to true 04:49:50 + echo 'Set Java version' 04:49:50 Set Java version 04:49:50 + sudo /usr/sbin/alternatives --install /usr/bin/java java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 1 04:49:50 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 04:49:50 sudo: a password is required 04:49:50 + sudo /usr/sbin/alternatives --set java /usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:49:50 sudo: a terminal is required to read the password; either use the -S option to read from standard input or configure an askpass helper 04:49:50 sudo: a password is required 04:49:50 JDK default version ... 04:49:50 + echo 'JDK default version ...' 04:49:50 + java -version 04:49:51 openjdk version "21.0.8" 2025-07-15 04:49:51 OpenJDK Runtime Environment (build 21.0.8+9-Ubuntu-0ubuntu122.04.1) 04:49:51 OpenJDK 64-Bit Server VM (build 21.0.8+9-Ubuntu-0ubuntu122.04.1, mixed mode, sharing) 04:49:51 Set JAVA_HOME 04:49:51 + echo 'Set JAVA_HOME' 04:49:51 + export JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 04:49:51 + JAVA_HOME=/usr/lib/jvm/java-21-openjdk-amd64 04:49:51 ++ readlink -e /usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:49:51 Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:49:51 Listing all open ports on controller system... 04:49:51 + JAVA_RESOLVED=/usr/lib/jvm/java-21-openjdk-amd64/bin/java 04:49:51 + echo 'Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-21-openjdk-amd64/bin/java' 04:49:51 + echo 'Listing all open ports on controller system...' 
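For reference, the set_java_vars step above amounts to two sed substitutions on bin/setenv: one uncomments JAVA_HOME with a default of the Java 21 install, the other removes the quotes around the 2048m heap value. A standalone sketch of the same substitutions, using the paths from this job (the MEMCONF and JAVA_HOME_DIR names are illustrative):

# Standalone sketch of the set_java_vars edits shown above (paths as used in this job).
MEMCONF=/tmp/karaf-0.21.4/bin/setenv
JAVA_HOME_DIR=/usr/lib/jvm/java-21-openjdk-amd64
# enable JAVA_HOME, defaulting to the Java 21 install unless the caller already set it
sed -i -e "s%^# export JAVA_HOME%export JAVA_HOME=\${JAVA_HOME:-${JAVA_HOME_DIR}}%g" "$MEMCONF"
# drop the quotes around the 2048m maximum-heap setting
sed -i -e 's/JAVA_MAX_MEM="2048m"/JAVA_MAX_MEM=2048m/g' "$MEMCONF"
# show the resulting JAVA_HOME (and JAVA_MAX_MEM, if present) lines and the JDK that Karaf will use
grep -E 'JAVA_HOME|JAVA_MAX_MEM' "$MEMCONF"
"${JAVA_HOME_DIR}/bin/java" -version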
04:49:51 + netstat -pnatu 04:49:51 /tmp/configuration-script.sh: line 40: netstat: command not found 04:49:51 Configuring cluster 04:49:51 + '[' -f /tmp/custom_shard_config.txt ']' 04:49:51 + echo 'Configuring cluster' 04:49:51 + /tmp/karaf-0.21.4/bin/configure_cluster.sh 3 10.30.170.174 10.30.170.199 10.30.171.237 04:49:51 ################################################ 04:49:51 ## Configure Cluster ## 04:49:51 ################################################ 04:49:51 NOTE: Cluster configuration files not found. Copying from 04:49:51 /tmp/karaf-0.21.4/system/org/opendaylight/controller/sal-clustering-config/10.0.14 04:49:51 Configuring unique name in akka.conf 04:49:51 Configuring hostname in akka.conf 04:49:51 Configuring data and rpc seed nodes in akka.conf 04:49:51 modules = [ 04:49:51 04:49:51 { 04:49:51 name = "inventory" 04:49:51 namespace = "urn:opendaylight:inventory" 04:49:51 shard-strategy = "module" 04:49:51 }, 04:49:51 { 04:49:51 name = "topology" 04:49:51 namespace = "urn:TBD:params:xml:ns:yang:network-topology" 04:49:51 shard-strategy = "module" 04:49:51 }, 04:49:51 { 04:49:51 name = "toaster" 04:49:51 namespace = "http://netconfcentral.org/ns/toaster" 04:49:51 shard-strategy = "module" 04:49:51 } 04:49:51 ] 04:49:51 Configuring replication type in module-shards.conf 04:49:51 ################################################ 04:49:51 ## NOTE: Manually restart controller to ## 04:49:51 ## apply configuration. ## 04:49:51 ################################################ 04:49:51 Dump akka.conf 04:49:51 + echo 'Dump akka.conf' 04:49:51 + cat /tmp/karaf-0.21.4/configuration/initial/akka.conf 04:49:51 04:49:51 odl-cluster-data { 04:49:51 akka { 04:49:51 remote { 04:49:51 artery { 04:49:51 enabled = on 04:49:51 transport = tcp 04:49:51 canonical.hostname = "10.30.171.237" 04:49:51 canonical.port = 2550 04:49:51 } 04:49:51 } 04:49:51 04:49:51 cluster { 04:49:51 # Using artery. 04:49:51 seed-nodes = ["akka://opendaylight-cluster-data@10.30.170.174:2550", 04:49:51 "akka://opendaylight-cluster-data@10.30.170.199:2550", 04:49:51 "akka://opendaylight-cluster-data@10.30.171.237:2550"] 04:49:51 04:49:51 roles = ["member-3"] 04:49:51 04:49:51 # when under load we might trip a false positive on the failure detector 04:49:51 # failure-detector { 04:49:51 # heartbeat-interval = 4 s 04:49:51 # acceptable-heartbeat-pause = 16s 04:49:51 # } 04:49:51 } 04:49:51 04:49:51 persistence { 04:49:51 # By default the snapshots/journal directories live in KARAF_HOME. You can choose to put it somewhere else by 04:49:51 # modifying the following two properties. The directory location specified may be a relative or absolute path. 04:49:51 # The relative path is always relative to KARAF_HOME. 
04:49:51 04:49:51 # snapshot-store.local.dir = "target/snapshots" 04:49:51 04:49:51 # Use lz4 compression for LocalSnapshotStore snapshots 04:49:51 snapshot-store.local.use-lz4-compression = false 04:49:51 # Size of blocks for lz4 compression: 64KB, 256KB, 1MB or 4MB 04:49:51 snapshot-store.local.lz4-blocksize = 256KB 04:49:51 } 04:49:51 disable-default-actor-system-quarantined-event-handling = "false" 04:49:51 } 04:49:51 } 04:49:51 Dump modules.conf 04:49:51 + echo 'Dump modules.conf' 04:49:51 + cat /tmp/karaf-0.21.4/configuration/initial/modules.conf 04:49:51 modules = [ 04:49:51 04:49:51 { 04:49:51 name = "inventory" 04:49:51 namespace = "urn:opendaylight:inventory" 04:49:51 shard-strategy = "module" 04:49:51 }, 04:49:51 { 04:49:51 name = "topology" 04:49:51 namespace = "urn:TBD:params:xml:ns:yang:network-topology" 04:49:51 shard-strategy = "module" 04:49:51 }, 04:49:51 { 04:49:51 name = "toaster" 04:49:51 namespace = "http://netconfcentral.org/ns/toaster" 04:49:51 shard-strategy = "module" 04:49:51 } 04:49:51 ] 04:49:51 Dump module-shards.conf 04:49:51 + echo 'Dump module-shards.conf' 04:49:51 + cat /tmp/karaf-0.21.4/configuration/initial/module-shards.conf 04:49:51 module-shards = [ 04:49:51 { 04:49:51 name = "default" 04:49:51 shards = [ 04:49:51 { 04:49:51 name = "default" 04:49:51 replicas = ["member-1", 04:49:51 "member-2", 04:49:51 "member-3"] 04:49:51 } 04:49:51 ] 04:49:51 }, 04:49:51 { 04:49:51 name = "inventory" 04:49:51 shards = [ 04:49:51 { 04:49:51 name="inventory" 04:49:51 replicas = ["member-1", 04:49:51 "member-2", 04:49:51 "member-3"] 04:49:51 } 04:49:51 ] 04:49:51 }, 04:49:51 { 04:49:51 name = "topology" 04:49:51 shards = [ 04:49:51 { 04:49:51 name="topology" 04:49:51 replicas = ["member-1", 04:49:51 "member-2", 04:49:51 "member-3"] 04:49:51 } 04:49:51 ] 04:49:51 }, 04:49:51 { 04:49:51 name = "toaster" 04:49:51 shards = [ 04:49:51 { 04:49:51 name="toaster" 04:49:51 replicas = ["member-1", 04:49:51 "member-2", 04:49:51 "member-3"] 04:49:51 } 04:49:51 ] 04:49:51 } 04:49:51 ] 04:49:51 Locating config plan to use... 04:49:51 config plan exists!!! 04:49:51 Changing the config plan path... 04:49:51 # Place the suites in run order: 04:49:51 /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/scripts/set_akka_debug.sh 04:49:51 Executing /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/scripts/set_akka_debug.sh... 04:49:51 Copying config files to ODL Controller folder 04:49:51 Set AKKA/PEKKO debug on 10.30.170.174 04:49:51 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 04:49:51 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 04:49:51 Enable AKKA/PEKKO debug 04:49:51 Dump /tmp/karaf-0.21.4/configuration/initial/akka.conf 04:49:51 04:49:51 odl-cluster-data { 04:49:51 akka { 04:49:51 loglevel = "DEBUG" 04:49:51 actor { 04:49:51 debug { 04:49:51 autoreceive = on 04:49:51 lifecycle = on 04:49:51 unhandled = on 04:49:51 fsm = on 04:49:51 event-stream = on 04:49:51 } 04:49:51 } 04:49:51 remote { 04:49:51 artery { 04:49:51 enabled = on 04:49:51 transport = tcp 04:49:51 canonical.hostname = "10.30.170.174" 04:49:51 canonical.port = 2550 04:49:51 } 04:49:51 } 04:49:51 04:49:51 cluster { 04:49:51 # Using artery. 
04:49:51 seed-nodes = ["akka://opendaylight-cluster-data@10.30.170.174:2550", 04:49:51 "akka://opendaylight-cluster-data@10.30.170.199:2550", 04:49:51 "akka://opendaylight-cluster-data@10.30.171.237:2550"] 04:49:51 04:49:51 roles = ["member-1"] 04:49:51 04:49:51 # when under load we might trip a false positive on the failure detector 04:49:51 # failure-detector { 04:49:51 # heartbeat-interval = 4 s 04:49:51 # acceptable-heartbeat-pause = 16s 04:49:51 # } 04:49:51 } 04:49:51 04:49:51 persistence { 04:49:51 # By default the snapshots/journal directories live in KARAF_HOME. You can choose to put it somewhere else by 04:49:51 # modifying the following two properties. The directory location specified may be a relative or absolute path. 04:49:51 # The relative path is always relative to KARAF_HOME. 04:49:51 04:49:51 # snapshot-store.local.dir = "target/snapshots" 04:49:51 04:49:51 # Use lz4 compression for LocalSnapshotStore snapshots 04:49:51 snapshot-store.local.use-lz4-compression = false 04:49:51 # Size of blocks for lz4 compression: 64KB, 256KB, 1MB or 4MB 04:49:51 snapshot-store.local.lz4-blocksize = 256KB 04:49:51 } 04:49:51 disable-default-actor-system-quarantined-event-handling = "false" 04:49:51 } 04:49:51 } 04:49:51 Dump /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:51 ################################################################################ 04:49:51 # 04:49:51 # Licensed to the Apache Software Foundation (ASF) under one or more 04:49:51 # contributor license agreements. See the NOTICE file distributed with 04:49:51 # this work for additional information regarding copyright ownership. 04:49:51 # The ASF licenses this file to You under the Apache License, Version 2.0 04:49:51 # (the "License"); you may not use this file except in compliance with 04:49:51 # the License. You may obtain a copy of the License at 04:49:51 # 04:49:51 # http://www.apache.org/licenses/LICENSE-2.0 04:49:51 # 04:49:51 # Unless required by applicable law or agreed to in writing, software 04:49:51 # distributed under the License is distributed on an "AS IS" BASIS, 04:49:51 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:49:51 # See the License for the specific language governing permissions and 04:49:51 # limitations under the License. 
04:49:51 # 04:49:51 ################################################################################ 04:49:51 04:49:51 # Common pattern layout for appenders 04:49:51 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 04:49:51 04:49:51 # Root logger 04:49:51 log4j2.rootLogger.level = INFO 04:49:51 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 04:49:51 #log4j2.rootLogger.type = asyncRoot 04:49:51 #log4j2.rootLogger.includeLocation = false 04:49:51 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 04:49:51 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 04:49:51 log4j2.rootLogger.appenderRef.Console.ref = Console 04:49:51 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 04:49:51 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 04:49:51 04:49:51 # Filters for logs marked by org.opendaylight.odlparent.Markers 04:49:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 04:49:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 04:49:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 04:49:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 04:49:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 04:49:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 04:49:51 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 04:49:51 04:49:51 # Loggers configuration 04:49:51 04:49:51 # Spifly logger 04:49:51 log4j2.logger.spifly.name = org.apache.aries.spifly 04:49:51 log4j2.logger.spifly.level = WARN 04:49:51 04:49:51 # Security audit logger 04:49:51 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 04:49:51 log4j2.logger.audit.level = INFO 04:49:51 log4j2.logger.audit.additivity = false 04:49:51 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 04:49:51 04:49:51 # Appenders configuration 04:49:51 04:49:51 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 04:49:51 log4j2.appender.console.type = Console 04:49:51 log4j2.appender.console.name = Console 04:49:51 log4j2.appender.console.layout.type = PatternLayout 04:49:51 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 04:49:51 04:49:51 # Rolling file appender 04:49:51 log4j2.appender.rolling.type = RollingRandomAccessFile 04:49:51 log4j2.appender.rolling.name = RollingFile 04:49:51 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 04:49:51 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 04:49:51 # uncomment to not force a disk flush 04:49:51 #log4j2.appender.rolling.immediateFlush = false 04:49:51 log4j2.appender.rolling.append = true 04:49:51 log4j2.appender.rolling.layout.type = PatternLayout 04:49:51 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 04:49:51 log4j2.appender.rolling.policies.type = Policies 04:49:51 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 04:49:51 log4j2.appender.rolling.policies.size.size = 1GB 04:49:51 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 04:49:51 log4j2.appender.rolling.strategy.max = 7 04:49:51 04:49:51 # Audit file appender 04:49:51 log4j2.appender.audit.type = RollingRandomAccessFile 04:49:51 log4j2.appender.audit.name = AuditRollingFile 
04:49:51 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 04:49:51 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 04:49:51 log4j2.appender.audit.append = true 04:49:51 log4j2.appender.audit.layout.type = PatternLayout 04:49:51 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 04:49:51 log4j2.appender.audit.policies.type = Policies 04:49:51 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 04:49:51 log4j2.appender.audit.policies.size.size = 8MB 04:49:51 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 04:49:51 log4j2.appender.audit.strategy.max = 7 04:49:51 04:49:51 # OSGi appender 04:49:51 log4j2.appender.osgi.type = PaxOsgi 04:49:51 log4j2.appender.osgi.name = PaxOsgi 04:49:51 log4j2.appender.osgi.filter = * 04:49:51 04:49:51 # help with identification of maven-related problems with pax-url-aether 04:49:51 #log4j2.logger.aether.name = shaded.org.eclipse.aether 04:49:51 #log4j2.logger.aether.level = TRACE 04:49:51 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 04:49:51 #log4j2.logger.http-headers.level = DEBUG 04:49:51 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 04:49:51 #log4j2.logger.maven.level = TRACE 04:49:51 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 04:49:51 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 04:49:51 log4j2.logger.cluster.name=akka.cluster 04:49:51 log4j2.logger.cluster.level=DEBUG 04:49:51 log4j2.logger.remote.name=akka.remote 04:49:51 log4j2.logger.remote.level=DEBUG 04:49:51 Set AKKA/PEKKO debug on 10.30.170.199 04:49:52 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 04:49:52 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 04:49:52 Enable AKKA/PEKKO debug 04:49:52 Dump /tmp/karaf-0.21.4/configuration/initial/akka.conf 04:49:52 04:49:52 odl-cluster-data { 04:49:52 akka { 04:49:52 loglevel = "DEBUG" 04:49:52 actor { 04:49:52 debug { 04:49:52 autoreceive = on 04:49:52 lifecycle = on 04:49:52 unhandled = on 04:49:52 fsm = on 04:49:52 event-stream = on 04:49:52 } 04:49:52 } 04:49:52 remote { 04:49:52 artery { 04:49:52 enabled = on 04:49:52 transport = tcp 04:49:52 canonical.hostname = "10.30.170.199" 04:49:52 canonical.port = 2550 04:49:52 } 04:49:52 } 04:49:52 04:49:52 cluster { 04:49:52 # Using artery. 04:49:52 seed-nodes = ["akka://opendaylight-cluster-data@10.30.170.174:2550", 04:49:52 "akka://opendaylight-cluster-data@10.30.170.199:2550", 04:49:52 "akka://opendaylight-cluster-data@10.30.171.237:2550"] 04:49:52 04:49:52 roles = ["member-2"] 04:49:52 04:49:52 # when under load we might trip a false positive on the failure detector 04:49:52 # failure-detector { 04:49:52 # heartbeat-interval = 4 s 04:49:52 # acceptable-heartbeat-pause = 16s 04:49:52 # } 04:49:52 } 04:49:52 04:49:52 persistence { 04:49:52 # By default the snapshots/journal directories live in KARAF_HOME. You can choose to put it somewhere else by 04:49:52 # modifying the following two properties. The directory location specified may be a relative or absolute path. 04:49:52 # The relative path is always relative to KARAF_HOME. 
04:49:52 04:49:52 # snapshot-store.local.dir = "target/snapshots" 04:49:52 04:49:52 # Use lz4 compression for LocalSnapshotStore snapshots 04:49:52 snapshot-store.local.use-lz4-compression = false 04:49:52 # Size of blocks for lz4 compression: 64KB, 256KB, 1MB or 4MB 04:49:52 snapshot-store.local.lz4-blocksize = 256KB 04:49:52 } 04:49:52 disable-default-actor-system-quarantined-event-handling = "false" 04:49:52 } 04:49:52 } 04:49:52 Dump /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:52 ################################################################################ 04:49:52 # 04:49:52 # Licensed to the Apache Software Foundation (ASF) under one or more 04:49:52 # contributor license agreements. See the NOTICE file distributed with 04:49:52 # this work for additional information regarding copyright ownership. 04:49:52 # The ASF licenses this file to You under the Apache License, Version 2.0 04:49:52 # (the "License"); you may not use this file except in compliance with 04:49:52 # the License. You may obtain a copy of the License at 04:49:52 # 04:49:52 # http://www.apache.org/licenses/LICENSE-2.0 04:49:52 # 04:49:52 # Unless required by applicable law or agreed to in writing, software 04:49:52 # distributed under the License is distributed on an "AS IS" BASIS, 04:49:52 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:49:52 # See the License for the specific language governing permissions and 04:49:52 # limitations under the License. 04:49:52 # 04:49:52 ################################################################################ 04:49:52 04:49:52 # Common pattern layout for appenders 04:49:52 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 04:49:52 04:49:52 # Root logger 04:49:52 log4j2.rootLogger.level = INFO 04:49:52 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 04:49:52 #log4j2.rootLogger.type = asyncRoot 04:49:52 #log4j2.rootLogger.includeLocation = false 04:49:52 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 04:49:52 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 04:49:52 log4j2.rootLogger.appenderRef.Console.ref = Console 04:49:52 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 04:49:52 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 04:49:52 04:49:52 # Filters for logs marked by org.opendaylight.odlparent.Markers 04:49:52 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 04:49:52 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 04:49:52 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 04:49:52 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 04:49:52 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 04:49:52 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 04:49:52 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 04:49:52 04:49:52 # Loggers configuration 04:49:52 04:49:52 # Spifly logger 04:49:52 log4j2.logger.spifly.name = org.apache.aries.spifly 04:49:52 log4j2.logger.spifly.level = WARN 04:49:52 04:49:52 # Security audit logger 04:49:52 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 04:49:52 log4j2.logger.audit.level = INFO 04:49:52 log4j2.logger.audit.additivity = false 04:49:52 
log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 04:49:52 04:49:52 # Appenders configuration 04:49:52 04:49:52 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 04:49:52 log4j2.appender.console.type = Console 04:49:52 log4j2.appender.console.name = Console 04:49:52 log4j2.appender.console.layout.type = PatternLayout 04:49:52 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 04:49:52 04:49:52 # Rolling file appender 04:49:52 log4j2.appender.rolling.type = RollingRandomAccessFile 04:49:52 log4j2.appender.rolling.name = RollingFile 04:49:52 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 04:49:52 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 04:49:52 # uncomment to not force a disk flush 04:49:52 #log4j2.appender.rolling.immediateFlush = false 04:49:52 log4j2.appender.rolling.append = true 04:49:52 log4j2.appender.rolling.layout.type = PatternLayout 04:49:52 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 04:49:52 log4j2.appender.rolling.policies.type = Policies 04:49:52 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 04:49:52 log4j2.appender.rolling.policies.size.size = 1GB 04:49:52 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 04:49:52 log4j2.appender.rolling.strategy.max = 7 04:49:52 04:49:52 # Audit file appender 04:49:52 log4j2.appender.audit.type = RollingRandomAccessFile 04:49:52 log4j2.appender.audit.name = AuditRollingFile 04:49:52 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 04:49:52 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 04:49:52 log4j2.appender.audit.append = true 04:49:52 log4j2.appender.audit.layout.type = PatternLayout 04:49:52 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 04:49:52 log4j2.appender.audit.policies.type = Policies 04:49:52 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 04:49:52 log4j2.appender.audit.policies.size.size = 8MB 04:49:52 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 04:49:52 log4j2.appender.audit.strategy.max = 7 04:49:52 04:49:52 # OSGi appender 04:49:52 log4j2.appender.osgi.type = PaxOsgi 04:49:52 log4j2.appender.osgi.name = PaxOsgi 04:49:52 log4j2.appender.osgi.filter = * 04:49:52 04:49:52 # help with identification of maven-related problems with pax-url-aether 04:49:52 #log4j2.logger.aether.name = shaded.org.eclipse.aether 04:49:52 #log4j2.logger.aether.level = TRACE 04:49:52 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 04:49:52 #log4j2.logger.http-headers.level = DEBUG 04:49:52 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 04:49:52 #log4j2.logger.maven.level = TRACE 04:49:52 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 04:49:52 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 04:49:52 log4j2.logger.cluster.name=akka.cluster 04:49:52 log4j2.logger.cluster.level=DEBUG 04:49:52 log4j2.logger.remote.name=akka.remote 04:49:52 log4j2.logger.remote.level=DEBUG 04:49:52 Set AKKA/PEKKO debug on 10.30.171.237 04:49:52 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 04:49:52 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 
04:49:53 Enable AKKA/PEKKO debug 04:49:53 Dump /tmp/karaf-0.21.4/configuration/initial/akka.conf 04:49:53 04:49:53 odl-cluster-data { 04:49:53 akka { 04:49:53 loglevel = "DEBUG" 04:49:53 actor { 04:49:53 debug { 04:49:53 autoreceive = on 04:49:53 lifecycle = on 04:49:53 unhandled = on 04:49:53 fsm = on 04:49:53 event-stream = on 04:49:53 } 04:49:53 } 04:49:53 remote { 04:49:53 artery { 04:49:53 enabled = on 04:49:53 transport = tcp 04:49:53 canonical.hostname = "10.30.171.237" 04:49:53 canonical.port = 2550 04:49:53 } 04:49:53 } 04:49:53 04:49:53 cluster { 04:49:53 # Using artery. 04:49:53 seed-nodes = ["akka://opendaylight-cluster-data@10.30.170.174:2550", 04:49:53 "akka://opendaylight-cluster-data@10.30.170.199:2550", 04:49:53 "akka://opendaylight-cluster-data@10.30.171.237:2550"] 04:49:53 04:49:53 roles = ["member-3"] 04:49:53 04:49:53 # when under load we might trip a false positive on the failure detector 04:49:53 # failure-detector { 04:49:53 # heartbeat-interval = 4 s 04:49:53 # acceptable-heartbeat-pause = 16s 04:49:53 # } 04:49:53 } 04:49:53 04:49:53 persistence { 04:49:53 # By default the snapshots/journal directories live in KARAF_HOME. You can choose to put it somewhere else by 04:49:53 # modifying the following two properties. The directory location specified may be a relative or absolute path. 04:49:53 # The relative path is always relative to KARAF_HOME. 04:49:53 04:49:53 # snapshot-store.local.dir = "target/snapshots" 04:49:53 04:49:53 # Use lz4 compression for LocalSnapshotStore snapshots 04:49:53 snapshot-store.local.use-lz4-compression = false 04:49:53 # Size of blocks for lz4 compression: 64KB, 256KB, 1MB or 4MB 04:49:53 snapshot-store.local.lz4-blocksize = 256KB 04:49:53 } 04:49:53 disable-default-actor-system-quarantined-event-handling = "false" 04:49:53 } 04:49:53 } 04:49:53 Dump /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:53 ################################################################################ 04:49:53 # 04:49:53 # Licensed to the Apache Software Foundation (ASF) under one or more 04:49:53 # contributor license agreements. See the NOTICE file distributed with 04:49:53 # this work for additional information regarding copyright ownership. 04:49:53 # The ASF licenses this file to You under the Apache License, Version 2.0 04:49:53 # (the "License"); you may not use this file except in compliance with 04:49:53 # the License. You may obtain a copy of the License at 04:49:53 # 04:49:53 # http://www.apache.org/licenses/LICENSE-2.0 04:49:53 # 04:49:53 # Unless required by applicable law or agreed to in writing, software 04:49:53 # distributed under the License is distributed on an "AS IS" BASIS, 04:49:53 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 04:49:53 # See the License for the specific language governing permissions and 04:49:53 # limitations under the License. 
04:49:53 # 04:49:53 ################################################################################ 04:49:53 04:49:53 # Common pattern layout for appenders 04:49:53 log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n 04:49:53 04:49:53 # Root logger 04:49:53 log4j2.rootLogger.level = INFO 04:49:53 # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library 04:49:53 #log4j2.rootLogger.type = asyncRoot 04:49:53 #log4j2.rootLogger.includeLocation = false 04:49:53 log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile 04:49:53 log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi 04:49:53 log4j2.rootLogger.appenderRef.Console.ref = Console 04:49:53 log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter 04:49:53 log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} 04:49:53 04:49:53 # Filters for logs marked by org.opendaylight.odlparent.Markers 04:49:53 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter 04:49:53 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair 04:49:53 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker 04:49:53 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL 04:49:53 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or 04:49:53 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY 04:49:53 log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL 04:49:53 04:49:53 # Loggers configuration 04:49:53 04:49:53 # Spifly logger 04:49:53 log4j2.logger.spifly.name = org.apache.aries.spifly 04:49:53 log4j2.logger.spifly.level = WARN 04:49:53 04:49:53 # Security audit logger 04:49:53 log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit 04:49:53 log4j2.logger.audit.level = INFO 04:49:53 log4j2.logger.audit.additivity = false 04:49:53 log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile 04:49:53 04:49:53 # Appenders configuration 04:49:53 04:49:53 # Console appender not used by default (see log4j2.rootLogger.appenderRefs) 04:49:53 log4j2.appender.console.type = Console 04:49:53 log4j2.appender.console.name = Console 04:49:53 log4j2.appender.console.layout.type = PatternLayout 04:49:53 log4j2.appender.console.layout.pattern = ${log4j2.pattern} 04:49:53 04:49:53 # Rolling file appender 04:49:53 log4j2.appender.rolling.type = RollingRandomAccessFile 04:49:53 log4j2.appender.rolling.name = RollingFile 04:49:53 log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log 04:49:53 log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i 04:49:53 # uncomment to not force a disk flush 04:49:53 #log4j2.appender.rolling.immediateFlush = false 04:49:53 log4j2.appender.rolling.append = true 04:49:53 log4j2.appender.rolling.layout.type = PatternLayout 04:49:53 log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} 04:49:53 log4j2.appender.rolling.policies.type = Policies 04:49:53 log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy 04:49:53 log4j2.appender.rolling.policies.size.size = 1GB 04:49:53 log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy 04:49:53 log4j2.appender.rolling.strategy.max = 7 04:49:53 04:49:53 # Audit file appender 04:49:53 log4j2.appender.audit.type = RollingRandomAccessFile 04:49:53 log4j2.appender.audit.name = AuditRollingFile 
04:49:53 log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 04:49:53 log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i 04:49:53 log4j2.appender.audit.append = true 04:49:53 log4j2.appender.audit.layout.type = PatternLayout 04:49:53 log4j2.appender.audit.layout.pattern = ${log4j2.pattern} 04:49:53 log4j2.appender.audit.policies.type = Policies 04:49:53 log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy 04:49:53 log4j2.appender.audit.policies.size.size = 8MB 04:49:53 log4j2.appender.audit.strategy.type = DefaultRolloverStrategy 04:49:53 log4j2.appender.audit.strategy.max = 7 04:49:53 04:49:53 # OSGi appender 04:49:53 log4j2.appender.osgi.type = PaxOsgi 04:49:53 log4j2.appender.osgi.name = PaxOsgi 04:49:53 log4j2.appender.osgi.filter = * 04:49:53 04:49:53 # help with identification of maven-related problems with pax-url-aether 04:49:53 #log4j2.logger.aether.name = shaded.org.eclipse.aether 04:49:53 #log4j2.logger.aether.level = TRACE 04:49:53 #log4j2.logger.http-headers.name = shaded.org.apache.http.headers 04:49:53 #log4j2.logger.http-headers.level = DEBUG 04:49:53 #log4j2.logger.maven.name = org.ops4j.pax.url.mvn 04:49:53 #log4j2.logger.maven.level = TRACE 04:49:53 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN 04:49:53 log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN 04:49:53 log4j2.logger.cluster.name=akka.cluster 04:49:53 log4j2.logger.cluster.level=DEBUG 04:49:53 log4j2.logger.remote.name=akka.remote 04:49:53 log4j2.logger.remote.level=DEBUG 04:49:53 Finished running config plans 04:49:53 Starting member-1 with IP address 10.30.170.174 04:49:53 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 04:49:53 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 04:49:53 Redirecting karaf console output to karaf_console.log 04:49:53 Starting controller... 04:49:53 start: Redirecting Karaf output to /tmp/karaf-0.21.4/data/log/karaf_console.log 04:49:53 Starting member-2 with IP address 10.30.170.199 04:49:53 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 04:49:53 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 04:49:53 Redirecting karaf console output to karaf_console.log 04:49:53 Starting controller... 04:49:53 start: Redirecting Karaf output to /tmp/karaf-0.21.4/data/log/karaf_console.log 04:49:53 Starting member-3 with IP address 10.30.171.237 04:49:53 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 04:49:54 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 04:49:54 Redirecting karaf console output to karaf_console.log 04:49:54 Starting controller... 
04:49:54 start: Redirecting Karaf output to /tmp/karaf-0.21.4/data/log/karaf_console.log 04:49:54 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash /tmp/jenkins11348204676929725782.sh 04:49:54 common-functions.sh is being sourced 04:49:54 common-functions environment: 04:49:54 MAVENCONF: /tmp/karaf-0.21.4/etc/org.ops4j.pax.url.mvn.cfg 04:49:54 ACTUALFEATURES: 04:49:54 FEATURESCONF: /tmp/karaf-0.21.4/etc/org.apache.karaf.features.cfg 04:49:54 CUSTOMPROP: /tmp/karaf-0.21.4/etc/custom.properties 04:49:54 LOGCONF: /tmp/karaf-0.21.4/etc/org.ops4j.pax.logging.cfg 04:49:54 MEMCONF: /tmp/karaf-0.21.4/bin/setenv 04:49:54 CONTROLLERMEM: 2048m 04:49:54 AKKACONF: /tmp/karaf-0.21.4/configuration/initial/akka.conf 04:49:54 MODULESCONF: /tmp/karaf-0.21.4/configuration/initial/modules.conf 04:49:54 MODULESHARDSCONF: /tmp/karaf-0.21.4/configuration/initial/module-shards.conf 04:49:54 SUITES: 04:49:54 04:49:54 + echo '#################################################' 04:49:54 ################################################# 04:49:54 + echo '## Verify Cluster is UP ##' 04:49:54 ## Verify Cluster is UP ## 04:49:54 + echo '#################################################' 04:49:54 ################################################# 04:49:54 + create_post_startup_script 04:49:54 + cat 04:49:54 + copy_and_run_post_startup_script 04:49:54 + seed_index=1 04:49:54 ++ seq 1 3 04:49:54 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 04:49:54 + CONTROLLERIP=ODL_SYSTEM_1_IP 04:49:54 + echo 'Execute the post startup script on controller 10.30.170.174' 04:49:54 Execute the post startup script on controller 10.30.170.174 04:49:54 + scp /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/post-startup-script.sh 10.30.170.174:/tmp/ 04:49:54 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 04:49:54 + ssh 10.30.170.174 'bash /tmp/post-startup-script.sh 1' 04:49:54 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 04:49:54 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:49:59 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:50:04 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:50:09 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:50:14 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:50:19 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:50:24 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:50:29 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:50:34 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:50:39 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:50:44 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:50:49 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:50:54 Waiting up to 3 minutes for controller to come up, checking every 5 seconds... 04:50:59 2025-12-02T04:50:32,047 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 199 - org.opendaylight.infrautils.ready-api - 7.1.7 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 04:50:59 Controller is UP 04:50:59 2025-12-02T04:50:32,047 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 199 - org.opendaylight.infrautils.ready-api - 7.1.7 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 04:50:59 Listing all open ports on controller system... 
04:50:59 /tmp/post-startup-script.sh: line 51: netstat: command not found 04:50:59 looking for "BindException: Address already in use" in log file 04:50:59 looking for "server is unhealthy" in log file 04:50:59 + '[' 1 == 0 ']' 04:50:59 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 04:50:59 + CONTROLLERIP=ODL_SYSTEM_2_IP 04:50:59 + echo 'Execute the post startup script on controller 10.30.170.199' 04:50:59 Execute the post startup script on controller 10.30.170.199 04:50:59 + scp /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/post-startup-script.sh 10.30.170.199:/tmp/ 04:50:59 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 04:51:00 + ssh 10.30.170.199 'bash /tmp/post-startup-script.sh 2' 04:51:00 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 04:51:00 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:51:05 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:51:10 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:51:15 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:51:20 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:51:25 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:51:30 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:51:35 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:51:40 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:51:45 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:51:50 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:51:55 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:52:00 Waiting up to 3 minutes for controller to come up, checking every 5 seconds... 04:52:05 2025-12-02T04:50:31,996 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 199 - org.opendaylight.infrautils.ready-api - 7.1.7 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 04:52:05 Controller is UP 04:52:05 2025-12-02T04:50:31,996 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 199 - org.opendaylight.infrautils.ready-api - 7.1.7 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 04:52:05 Listing all open ports on controller system... 04:52:05 /tmp/post-startup-script.sh: line 51: netstat: command not found 04:52:05 looking for "BindException: Address already in use" in log file 04:52:05 looking for "server is unhealthy" in log file 04:52:05 + '[' 2 == 0 ']' 04:52:05 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 04:52:05 + CONTROLLERIP=ODL_SYSTEM_3_IP 04:52:05 + echo 'Execute the post startup script on controller 10.30.171.237' 04:52:05 Execute the post startup script on controller 10.30.171.237 04:52:05 + scp /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/post-startup-script.sh 10.30.171.237:/tmp/ 04:52:05 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 04:52:06 + ssh 10.30.171.237 'bash /tmp/post-startup-script.sh 3' 04:52:06 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 
04:52:06 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:52:11 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:52:16 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:52:21 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:52:26 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:52:31 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:52:36 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:52:41 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:52:46 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:52:51 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:52:56 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:53:01 /tmp/post-startup-script.sh: line 4: netstat: command not found 04:53:06 Waiting up to 3 minutes for controller to come up, checking every 5 seconds... 04:53:11 2025-12-02T04:50:31,700 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 199 - org.opendaylight.infrautils.ready-api - 7.1.7 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 04:53:11 Controller is UP 04:53:11 2025-12-02T04:50:31,700 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 199 - org.opendaylight.infrautils.ready-api - 7.1.7 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 04:53:11 Listing all open ports on controller system... 04:53:11 /tmp/post-startup-script.sh: line 51: netstat: command not found 04:53:11 looking for "BindException: Address already in use" in log file 04:53:11 looking for "server is unhealthy" in log file 04:53:11 + '[' 0 == 0 ']' 04:53:11 + seed_index=1 04:53:11 + dump_controller_threads 04:53:11 ++ seq 1 3 04:53:11 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 04:53:11 + CONTROLLERIP=ODL_SYSTEM_1_IP 04:53:11 + echo 'Let'\''s take the karaf thread dump' 04:53:11 Let's take the karaf thread dump 04:53:11 + ssh 10.30.170.174 'sudo ps aux' 04:53:11 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 04:53:11 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/ps_before.log 04:53:11 ++ grep -v grep 04:53:11 ++ tr -s ' ' 04:53:11 ++ cut -f2 '-d ' 04:53:11 + pid=2134 04:53:11 + echo 'karaf main: org.apache.karaf.main.Main, pid:2134' 04:53:11 karaf main: org.apache.karaf.main.Main, pid:2134 04:53:11 + ssh 10.30.170.174 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 2134' 04:53:12 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 04:53:12 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 04:53:12 + CONTROLLERIP=ODL_SYSTEM_2_IP 04:53:12 + echo 'Let'\''s take the karaf thread dump' 04:53:12 Let's take the karaf thread dump 04:53:12 + ssh 10.30.170.199 'sudo ps aux' 04:53:12 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 04:53:13 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/ps_before.log 04:53:13 ++ grep -v grep 04:53:13 ++ cut -f2 '-d ' 04:53:13 ++ tr -s ' ' 04:53:13 + pid=2144 04:53:13 + echo 'karaf main: org.apache.karaf.main.Main, pid:2144' 04:53:13 karaf main: org.apache.karaf.main.Main, pid:2144 04:53:13 + ssh 10.30.170.199 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 2144' 04:53:13 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 
04:53:13 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 04:53:13 + CONTROLLERIP=ODL_SYSTEM_3_IP 04:53:13 + echo 'Let'\''s take the karaf thread dump' 04:53:13 Let's take the karaf thread dump 04:53:13 + ssh 10.30.171.237 'sudo ps aux' 04:53:13 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 04:53:14 ++ tr -s ' ' 04:53:14 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/ps_before.log 04:53:14 ++ cut -f2 '-d ' 04:53:14 ++ grep -v grep 04:53:14 + pid=2143 04:53:14 + echo 'karaf main: org.apache.karaf.main.Main, pid:2143' 04:53:14 karaf main: org.apache.karaf.main.Main, pid:2143 04:53:14 + ssh 10.30.171.237 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 2143' 04:53:14 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 04:53:14 + '[' 0 -gt 0 ']' 04:53:14 + echo 'Generating controller variables...' 04:53:14 Generating controller variables... 04:53:14 ++ seq 1 3 04:53:14 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 04:53:14 + CONTROLLERIP=ODL_SYSTEM_1_IP 04:53:14 + odl_variables=' -v ODL_SYSTEM_1_IP:10.30.170.174' 04:53:14 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 04:53:14 + CONTROLLERIP=ODL_SYSTEM_2_IP 04:53:14 + odl_variables=' -v ODL_SYSTEM_1_IP:10.30.170.174 -v ODL_SYSTEM_2_IP:10.30.170.199' 04:53:14 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 04:53:14 + CONTROLLERIP=ODL_SYSTEM_3_IP 04:53:14 + odl_variables=' -v ODL_SYSTEM_1_IP:10.30.170.174 -v ODL_SYSTEM_2_IP:10.30.170.199 -v ODL_SYSTEM_3_IP:10.30.171.237' 04:53:14 + echo 'Generating mininet variables...' 04:53:14 Generating mininet variables... 04:53:14 ++ seq 1 1 04:53:14 + for i in $(seq 1 "${NUM_TOOLS_SYSTEM}") 04:53:14 + MININETIP=TOOLS_SYSTEM_1_IP 04:53:14 + tools_variables=' -v TOOLS_SYSTEM_1_IP:10.30.170.122' 04:53:14 + get_test_suites SUITES 04:53:14 + local __suite_list=SUITES 04:53:14 + echo 'Locating test plan to use...' 04:53:14 Locating test plan to use... 04:53:14 + testplan_filepath=/w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/testplans/openflowplugin-clustering-scandium.txt 04:53:14 + '[' '!' -f /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/testplans/openflowplugin-clustering-scandium.txt ']' 04:53:14 + testplan_filepath=/w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/testplans/openflowplugin-clustering.txt 04:53:14 + '[' disabled '!=' disabled ']' 04:53:14 + echo 'Changing the testplan path...' 04:53:14 Changing the testplan path... 
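The thread-dump step above locates each controller's Karaf process by filtering the remote ps output and then attaches jstack to it. A condensed sketch of that pipeline for a single member (the redirect of the ssh output into ps_before.log is implied by the subsequent grep; the ip value is taken from this job and is illustrative):

# Take a Karaf thread dump on one controller (same commands as traced above).
ip=10.30.170.174
ssh "$ip" 'sudo ps aux' > ps_before.log
# keep the org.apache.karaf.main.Main line, drop the grep itself, squeeze blanks, take the PID column
pid=$(grep org.apache.karaf.main.Main ps_before.log | grep -v grep | tr -s ' ' | cut -d' ' -f2)
echo "karaf main: org.apache.karaf.main.Main, pid:${pid}"
# dump all JVM threads (with lock information) using the same JDK the controller runs on
ssh "$ip" "/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l ${pid}"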
04:53:14 + sed s:integration:/w/workspace/openflowplugin-csit-3node-clustering-only-scandium: /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/testplans/openflowplugin-clustering.txt 04:53:14 + cat testplan.txt 04:53:14 # Place the suites in run order: 04:53:14 /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot 04:53:14 /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot 04:53:14 /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot 04:53:14 /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot 04:53:14 /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot 04:53:14 /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot 04:53:14 /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot 04:53:14 /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot 04:53:14 + '[' -z '' ']' 04:53:14 ++ grep -E -v '(^[[:space:]]*#|^[[:space:]]*$)' testplan.txt 04:53:14 ++ tr '\012' ' ' 04:53:14 + suite_list='/w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot ' 04:53:14 + eval 'SUITES='\''/w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot 
/w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot '\''' 04:53:14 ++ SUITES='/w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot ' 04:53:14 + echo 'Starting Robot test suites /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot ...' 
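Before the robot invocation below, the runner rewrites the generic "integration" prefix in the stream testplan to the workspace checkout, drops comment and blank lines, and flattens the remaining suite paths into the single space-separated SUITES string that is appended to the robot command line. A condensed sketch of that flow, using the same sed/grep/tr pipeline visible in the trace, follows; the redirect into testplan.txt is inferred from the subsequent 'cat testplan.txt' step, and only a few of the -v variables from the full command are repeated here.

    #!/bin/bash
    # Condensed sketch of the suite-list handling seen in the trace above.
    WORKSPACE=/w/workspace/openflowplugin-csit-3node-clustering-only-scandium
    testplan_filepath=${WORKSPACE}/test/csit/testplans/openflowplugin-clustering.txt

    # Rewrite the generic 'integration' prefix to the workspace checkout.
    # Assumption: the output redirect is implied by the 'cat testplan.txt' step.
    sed "s:integration:${WORKSPACE}:" "${testplan_filepath}" > testplan.txt
    cat testplan.txt

    # Drop comment and blank lines, then flatten to one space-separated string.
    SUITES=$(grep -E -v '(^[[:space:]]*#|^[[:space:]]*$)' testplan.txt | tr '\012' ' ')

    # Run the suites under one top-level name, passing the generated variables.
    # ${SUITES} is intentionally unquoted so each path becomes its own argument.
    robot -N openflowplugin-clustering.txt \
        --removekeywords wuks -e exclude -e skip_if_scandium \
        -v ODL_SYSTEM_1_IP:10.30.170.174 \
        -v ODL_SYSTEM_2_IP:10.30.170.199 \
        -v ODL_SYSTEM_3_IP:10.30.171.237 \
        -v TOOLS_SYSTEM_1_IP:10.30.170.122 \
        ${SUITES}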
04:53:14 Starting Robot test suites /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot ... 04:53:14 + robot -N openflowplugin-clustering.txt --removekeywords wuks -e exclude -e skip_if_scandium -v BUNDLEFOLDER:karaf-0.21.4 -v BUNDLE_URL:https://nexus.opendaylight.org/content/repositories//autorelease-9409/org/opendaylight/integration/karaf/0.21.4/karaf-0.21.4.zip -v CONTROLLER:10.30.170.174 -v CONTROLLER1:10.30.170.199 -v CONTROLLER2:10.30.171.237 -v CONTROLLER_USER:jenkins -v JAVA_HOME:/usr/lib/jvm/java-21-openjdk-amd64 -v JDKVERSION:openjdk21 -v JENKINS_WORKSPACE:/w/workspace/openflowplugin-csit-3node-clustering-only-scandium -v MININET:10.30.170.122 -v MININET1: -v MININET2: -v MININET_USER:jenkins -v NEXUSURL_PREFIX:https://nexus.opendaylight.org -v NUM_ODL_SYSTEM:3 -v NUM_TOOLS_SYSTEM:1 -v ODL_STREAM:scandium -v ODL_SYSTEM_IP:10.30.170.174 -v ODL_SYSTEM_1_IP:10.30.170.174 -v ODL_SYSTEM_2_IP:10.30.170.199 -v ODL_SYSTEM_3_IP:10.30.171.237 -v ODL_SYSTEM_USER:jenkins -v TOOLS_SYSTEM_IP:10.30.170.122 -v TOOLS_SYSTEM_1_IP:10.30.170.122 -v TOOLS_SYSTEM_USER:jenkins -v USER_HOME:/home/jenkins -v IS_KARAF_APPL:True -v WORKSPACE:/tmp -v ODL_OF_PLUGIN:lithium /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/010__Cluster_HA_Owner_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/020__Cluster_HA_Owner_Restart.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustering/030__Cluster_HA_Data_Recovery_Leader_Follower_Failover.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Clustered_Reconciliation/010_Group_Flows.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/010_Switch_Disconnect.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/020_Cluster_Node_Failure.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/EntityOwnership/030_Cluster_Sync_Problems.robot /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/suites/openflowplugin/Bug_Validation/9145.robot 04:53:15 ============================================================================== 04:53:15 openflowplugin-clustering.txt 04:53:15 
============================================================================== 04:53:15 openflowplugin-clustering.txt.Cluster HA Owner Failover :: Test suite for C... 04:53:15 ============================================================================== 04:53:19 Check Shards Status Before Fail :: Check Status for all shards in ... | PASS | 04:53:26 ------------------------------------------------------------------------------ 04:53:26 Start Mininet Multiple Connections :: Start mininet tree,2 with co... | PASS | 04:53:35 ------------------------------------------------------------------------------ 04:53:35 Check Entity Owner Status And Find Owner and Successor Before Fail... | FAIL | 04:54:05 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Could not parse owner and candidates for device openflow:1 04:54:05 ------------------------------------------------------------------------------ 04:54:05 Reconnect Extra Switches To Successors And Check OVS Connections :... | FAIL | 04:54:05 Variable '@{original_successor_list}' not found. 04:54:05 ------------------------------------------------------------------------------ 04:54:05 Check Network Operational Information Before Fail :: Check devices... | PASS | 04:54:05 ------------------------------------------------------------------------------ 04:54:05 Add Configuration In Owner and Verify Before Fail :: Add Flow in O... | FAIL | 04:54:06 Variable '${original_owner}' not found. 04:54:06 ------------------------------------------------------------------------------ 04:54:06 Modify Configuration In Owner and Verify Before Fail :: Modify Flo... | FAIL | 04:54:06 Variable '${original_owner}' not found. 04:54:06 ------------------------------------------------------------------------------ 04:54:06 Delete Configuration In Owner and Verify Before Fail :: Delete Flo... | FAIL | 04:54:06 Variable '${original_owner}' not found. 04:54:06 ------------------------------------------------------------------------------ 04:54:06 Add Configuration In Successor and Verify Before Fail :: Add Flow ... | FAIL | 04:54:06 Variable '${original_successor}' not found. 04:54:06 ------------------------------------------------------------------------------ 04:54:06 Modify Configuration In Successor and Verify Before Fail :: Modify... | FAIL | 04:54:06 Variable '${original_successor}' not found. 04:54:06 ------------------------------------------------------------------------------ 04:54:06 Delete Configuration In Successor and Verify Before Fail :: Delete... | FAIL | 04:54:06 Variable '${original_successor}' not found. 04:54:06 ------------------------------------------------------------------------------ 04:54:06 Send RPC Add to Owner and Verify Before Fail :: Add Flow in Owner ... | FAIL | 04:54:06 Variable '${original_owner}' not found. 04:54:06 ------------------------------------------------------------------------------ 04:54:06 Send RPC Delete to Owner and Verify Before Fail :: Delete Flow in ... | FAIL | 04:54:06 Variable '${original_owner}' not found. 04:54:06 ------------------------------------------------------------------------------ 04:54:06 Send RPC Add to Successor and Verify Before Fail :: Add Flow in Su... | FAIL | 04:54:06 Variable '${original_successor}' not found. 04:54:06 ------------------------------------------------------------------------------ 04:54:06 Send RPC Delete to Successor and Verify Before Fail :: Delete Flow... 
| FAIL | 04:54:06 Variable '${original_successor}' not found. 04:54:06 ------------------------------------------------------------------------------ 04:54:06 Modify Network And Verify Before Fail :: Take a link down and veri... | PASS | 04:54:07 ------------------------------------------------------------------------------ 04:54:07 Restore Network And Verify Before Fail :: Take the link up and ver... | PASS | 04:54:10 ------------------------------------------------------------------------------ 04:54:10 Kill Owner Instance :: Kill Owner Instance and verify it is dead | FAIL | 04:54:10 Variable '${original_owner}' not found. 04:54:10 ------------------------------------------------------------------------------ 04:54:10 Check Shards Status After Fail :: Create original cluster list and... | FAIL | 04:54:10 Variable '${new_cluster_list}' not found. 04:54:10 ------------------------------------------------------------------------------ 04:54:10 Check Entity Owner Status And Find Owner and Successor After Fail ... | FAIL | 04:54:11 Variable '${original_successor}' not found. 04:54:11 ------------------------------------------------------------------------------ 04:54:11 Check Network Operational Information After Fail :: Check devices ... | FAIL | 04:54:11 Variable '${new_cluster_list}' not found. 04:54:11 ------------------------------------------------------------------------------ 04:54:11 Add Configuration In Owner and Verify After Fail :: Add Flow in Ow... | FAIL | 04:54:11 Variable '${new_owner}' not found. 04:54:11 ------------------------------------------------------------------------------ 04:54:11 Modify Configuration In Owner and Verify After Fail :: Modify Flow... | FAIL | 04:54:11 Variable '${new_owner}' not found. 04:54:11 ------------------------------------------------------------------------------ 04:54:11 Delete Configuration In Owner and Verify After Fail :: Delete Flow... | FAIL | 04:54:11 Variable '${new_owner}' not found. 04:54:11 ------------------------------------------------------------------------------ 04:54:11 Add Configuration In Successor and Verify After Fail :: Add Flow i... | FAIL | 04:54:11 Variable '${new_successor}' not found. 04:54:11 ------------------------------------------------------------------------------ 04:54:11 Modify Configuration In Successor and Verify After Fail :: Modify ... | FAIL | 04:54:11 Variable '${new_successor}' not found. 04:54:11 ------------------------------------------------------------------------------ 04:54:11 Delete Configuration In Successor and Verify After Fail :: Delete ... | FAIL | 04:54:11 Variable '${new_successor}' not found. 04:54:11 ------------------------------------------------------------------------------ 04:54:11 Send RPC Add to Owner and Verify After Fail :: Add Flow in Owner a... | FAIL | 04:54:11 Variable '${new_owner}' not found. 04:54:11 ------------------------------------------------------------------------------ 04:54:11 Send RPC Delete to Owner and Verify After Fail :: Delete Flow in O... | FAIL | 04:54:11 Variable '${new_owner}' not found. 04:54:11 ------------------------------------------------------------------------------ 04:54:11 Send RPC Add to Successor and Verify After Fail :: Add Flow in Suc... | FAIL | 04:54:11 Variable '${new_successor}' not found. 04:54:11 ------------------------------------------------------------------------------ 04:54:11 Send RPC Delete to Successor and Verify After Fail :: Delete Flow ... | FAIL | 04:54:11 Variable '${new_successor}' not found. 
04:54:11 ------------------------------------------------------------------------------ 04:54:11 Modify Network and Verify After Fail :: Take a link down and verif... | FAIL | 04:54:11 Variable '${new_cluster_list}' not found. 04:54:11 ------------------------------------------------------------------------------ 04:54:11 Restore Network and Verify After Fail :: Take the link up and veri... | FAIL | 04:54:11 Variable '${new_cluster_list}' not found. 04:54:11 ------------------------------------------------------------------------------ 04:54:11 Start Old Owner Instance :: Start old Owner Instance and verify it... | FAIL | 04:54:11 This test fails due to https://jira.opendaylight.org/browse/CONTROLLER-1849 04:54:11 04:54:11 Variable '${original_owner}' not found. 04:54:11 ------------------------------------------------------------------------------ 04:54:11 Check Shards Status After Recover :: Create original cluster list ... | PASS | 04:54:14 ------------------------------------------------------------------------------ 04:54:14 Check Entity Owner Status After Recover :: Check Entity Owner Stat... | FAIL | 04:54:45 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Could not parse owner and candidates for device openflow:1 04:54:45 ------------------------------------------------------------------------------ 04:54:45 Check Network Operational Information After Recover :: Check devic... | PASS | 04:54:45 ------------------------------------------------------------------------------ 04:54:45 Add Configuration In Owner and Verify After Recover :: Add Flow in... | FAIL | 04:54:45 Variable '${new_owner}' not found. 04:54:45 ------------------------------------------------------------------------------ 04:54:45 Modify Configuration In Owner and Verify After Recover :: Modify F... | FAIL | 04:54:45 Variable '${new_owner}' not found. 04:54:45 ------------------------------------------------------------------------------ 04:54:45 Delete Configuration In Owner and Verify After Recover :: Delete F... | FAIL | 04:54:45 Variable '${new_owner}' not found. 04:54:45 ------------------------------------------------------------------------------ 04:54:45 Add Configuration In Old Owner and Verify After Recover :: Add Flo... | FAIL | 04:54:45 Variable '${original_owner}' not found. 04:54:45 ------------------------------------------------------------------------------ 04:54:45 Modify Configuration In Old Owner and Verify After Recover :: Modi... | FAIL | 04:54:45 Variable '${original_owner}' not found. 04:54:45 ------------------------------------------------------------------------------ 04:54:45 Delete Configuration In Old Owner and Verify After Recover :: Dele... | FAIL | 04:54:45 Variable '${original_owner}' not found. 04:54:45 ------------------------------------------------------------------------------ 04:54:45 Send RPC Add to Owner and Verify After Recover :: Add Flow in Owne... | FAIL | 04:54:45 Variable '${new_owner}' not found. 04:54:45 ------------------------------------------------------------------------------ 04:54:45 Send RPC Delete to Owner and Verify After Recover :: Delete Flow i... | FAIL | 04:54:45 Variable '${new_owner}' not found. 04:54:45 ------------------------------------------------------------------------------ 04:54:45 Send RPC Add to Old Owner and Verify After Recover :: Add Flow in ... | FAIL | 04:54:45 Variable '${original_owner}' not found. 
04:54:45 ------------------------------------------------------------------------------ 04:54:45 Send RPC Delete to Old Owner and Verify After Recover :: Delete Fl... | FAIL | 04:54:46 Variable '${original_owner}' not found. 04:54:46 ------------------------------------------------------------------------------ 04:54:46 Modify Network and Verify After Recover :: Take a link down and ve... | PASS | 04:54:46 ------------------------------------------------------------------------------ 04:54:46 Restore Network and Verify After Recover :: Take the link up and v... | PASS | 04:54:50 ------------------------------------------------------------------------------ 04:54:50 Stop Mininet and Exit :: Stop mininet and exit connection. | PASS | 04:54:53 ------------------------------------------------------------------------------ 04:54:53 Check No Network Operational Information :: Check device is not in... | PASS | 04:54:53 ------------------------------------------------------------------------------ 04:54:53 openflowplugin-clustering.txt.Cluster HA Owner Failover :: Test su... | FAIL | 04:54:53 51 tests, 11 passed, 40 failed 04:54:53 ============================================================================== 04:54:53 openflowplugin-clustering.txt.Cluster HA Owner Restart :: Test suite for Cl... 04:54:53 ============================================================================== 04:54:56 Check Shards Status Before Stop :: Check Status for all shards in ... | PASS | 04:55:00 ------------------------------------------------------------------------------ 04:55:00 Start Mininet Multiple Connections :: Start mininet tree,2 with co... | PASS | 04:55:08 ------------------------------------------------------------------------------ 04:55:08 Check Entity Owner Status And Find Owner and Successor Before Stop... | FAIL | 04:55:38 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Could not parse owner and candidates for device openflow:1 04:55:38 ------------------------------------------------------------------------------ 04:55:38 Reconnect Extra Switches To Successors And Check OVS Connections :... | FAIL | 04:55:38 Variable '@{original_successor_list}' not found. 04:55:38 ------------------------------------------------------------------------------ 04:55:38 Check Network Operational Information Before Stop :: Check devices... | PASS | 04:55:39 ------------------------------------------------------------------------------ 04:55:39 Add Configuration In Owner and Verify Before Stop :: Add Flow in O... | FAIL | 04:55:39 Variable '${original_owner}' not found. 04:55:39 ------------------------------------------------------------------------------ 04:55:39 Modify Configuration In Owner and Verify Before Stop :: Modify Flo... | FAIL | 04:55:39 Variable '${original_owner}' not found. 04:55:39 ------------------------------------------------------------------------------ 04:55:39 Delete Configuration In Owner and Verify Before Stop :: Delete Flo... | FAIL | 04:55:39 Variable '${original_owner}' not found. 04:55:39 ------------------------------------------------------------------------------ 04:55:39 Add Configuration In Successor and Verify Before Stop :: Add Flow ... | FAIL | 04:55:39 Variable '${original_successor}' not found. 04:55:39 ------------------------------------------------------------------------------ 04:55:39 Modify Configuration In Successor and Verify Before Stop :: Modify... 
| FAIL | 04:55:39 Variable '${original_successor}' not found. 04:55:39 ------------------------------------------------------------------------------ 04:55:39 Delete Configuration In Successor and Verify Before Stop :: Delete... | FAIL | 04:55:39 Variable '${original_successor}' not found. 04:55:39 ------------------------------------------------------------------------------ 04:55:39 Send RPC Add to Owner and Verify Before Stop :: Add Flow in Owner ... | FAIL | 04:55:39 Variable '${original_owner}' not found. 04:55:39 ------------------------------------------------------------------------------ 04:55:39 Send RPC Delete to Owner and Verify Before Stop :: Delete Flow in ... | FAIL | 04:55:39 Variable '${original_owner}' not found. 04:55:39 ------------------------------------------------------------------------------ 04:55:39 Send RPC Add to Successor and Verify Before Stop :: Add Flow in Su... | FAIL | 04:55:39 Variable '${original_successor}' not found. 04:55:39 ------------------------------------------------------------------------------ 04:55:39 Send RPC Delete to Successor and Verify Before Stop :: Delete Flow... | FAIL | 04:55:39 Variable '${original_successor}' not found. 04:55:39 ------------------------------------------------------------------------------ 04:55:39 Modify Network And Verify Before Stop :: Take a link down and veri... | PASS | 04:55:39 ------------------------------------------------------------------------------ 04:55:39 Restore Network And Verify Before Stop :: Take the link up and ver... | PASS | 04:55:40 ------------------------------------------------------------------------------ 04:55:40 Stop Owner Instance :: Stop Owner Instance and verify it is dead | FAIL | 04:55:40 Variable '${original_owner}' not found. 04:55:40 ------------------------------------------------------------------------------ 04:55:40 Check Shards Status After Stop :: Create original cluster list and... | FAIL | 04:55:40 Variable '${new_cluster_list}' not found. 04:55:40 ------------------------------------------------------------------------------ 04:55:40 Check Entity Owner Status And Find Owner and Successor After Stop ... | FAIL | 04:55:40 Variable '${original_successor}' not found. 04:55:40 ------------------------------------------------------------------------------ 04:55:40 Check Network Operational Information After Stop :: Check devices ... | FAIL | 04:55:40 Variable '${new_cluster_list}' not found. 04:55:40 ------------------------------------------------------------------------------ 04:55:40 Add Configuration In Owner and Verify After Stop :: Add Flow in Ow... | FAIL | 04:55:41 Variable '${new_owner}' not found. 04:55:41 ------------------------------------------------------------------------------ 04:55:41 Modify Configuration In Owner and Verify After Stop :: Modify Flow... | FAIL | 04:55:41 Variable '${new_owner}' not found. 04:55:41 ------------------------------------------------------------------------------ 04:55:41 Delete Configuration In Owner and Verify After Stop :: Delete Flow... | FAIL | 04:55:41 Variable '${new_owner}' not found. 04:55:41 ------------------------------------------------------------------------------ 04:55:41 Add Configuration In Successor and Verify After Stop :: Add Flow i... | FAIL | 04:55:41 Variable '${new_successor}' not found. 04:55:41 ------------------------------------------------------------------------------ 04:55:41 Modify Configuration In Successor and Verify After Stop :: Modify ... 
| FAIL | 04:55:41 Variable '${new_successor}' not found. 04:55:41 ------------------------------------------------------------------------------ 04:55:41 Delete Configuration In Successor and Verify After Stop :: Delete ... | FAIL | 04:55:41 Variable '${new_successor}' not found. 04:55:41 ------------------------------------------------------------------------------ 04:55:41 Send RPC Add to Owner and Verify After Stop :: Add Flow in Owner a... | FAIL | 04:55:41 Variable '${new_owner}' not found. 04:55:41 ------------------------------------------------------------------------------ 04:55:41 Send RPC Delete to Owner and Verify After Stop :: Delete Flow in O... | FAIL | 04:55:41 Variable '${new_owner}' not found. 04:55:41 ------------------------------------------------------------------------------ 04:55:41 Send RPC Add to Successor and Verify After Stop :: Add Flow in Suc... | FAIL | 04:55:41 Variable '${new_successor}' not found. 04:55:41 ------------------------------------------------------------------------------ 04:55:41 Send RPC Delete to Successor and Verify After Stop :: Delete Flow ... | FAIL | 04:55:41 Variable '${new_successor}' not found. 04:55:41 ------------------------------------------------------------------------------ 04:55:41 Modify Network and Verify After Stop :: Take a link down and verif... | FAIL | 04:55:41 Variable '${new_cluster_list}' not found. 04:55:41 ------------------------------------------------------------------------------ 04:55:41 Restore Network and Verify After Stop :: Take the link up and veri... | FAIL | 04:55:41 Variable '${new_cluster_list}' not found. 04:55:41 ------------------------------------------------------------------------------ 04:55:41 Start Old Owner Instance :: Start old Owner Instance and verify it... | FAIL | 04:55:41 Variable '${original_owner}' not found. 04:55:41 ------------------------------------------------------------------------------ 04:55:41 Check Shards Status After Start :: Create original cluster list an... | PASS | 04:55:44 ------------------------------------------------------------------------------ 04:55:44 Check Entity Owner Status After Start :: Check Entity Owner Status... | FAIL | 04:56:14 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Could not parse owner and candidates for device openflow:1 04:56:14 ------------------------------------------------------------------------------ 04:56:14 Check Network Operational Information After Start :: Check devices... | PASS | 04:56:14 ------------------------------------------------------------------------------ 04:56:14 Add Configuration In Owner and Verify After Start :: Add Flow in O... | FAIL | 04:56:14 Variable '${new_owner}' not found. 04:56:14 ------------------------------------------------------------------------------ 04:56:14 Modify Configuration In Owner and Verify After Start :: Modify Flo... | FAIL | 04:56:14 Variable '${new_owner}' not found. 04:56:14 ------------------------------------------------------------------------------ 04:56:14 Delete Configuration In Owner and Verify After Start :: Delete Flo... | FAIL | 04:56:14 Variable '${new_owner}' not found. 04:56:14 ------------------------------------------------------------------------------ 04:56:14 Add Configuration In Old Owner and Verify After Start :: Add Flow ... | FAIL | 04:56:14 Variable '${original_owner}' not found. 
04:56:14 ------------------------------------------------------------------------------ 04:56:14 Modify Configuration In Old Owner and Verify After Start :: Modify... | FAIL | 04:56:14 Variable '${original_owner}' not found. 04:56:14 ------------------------------------------------------------------------------ 04:56:14 Delete Configuration In Old Owner and Verify After Start :: Delete... | FAIL | 04:56:14 Variable '${original_owner}' not found. 04:56:14 ------------------------------------------------------------------------------ 04:56:14 Send RPC Add to Owner and Verify After Start :: Add Flow in Owner ... | FAIL | 04:56:14 Variable '${new_owner}' not found. 04:56:14 ------------------------------------------------------------------------------ 04:56:14 Send RPC Delete to Owner and Verify After Start :: Delete Flow in ... | FAIL | 04:56:14 Variable '${new_owner}' not found. 04:56:14 ------------------------------------------------------------------------------ 04:56:14 Send RPC Add to Old Owner and Verify After Start :: Add Flow in Ow... | FAIL | 04:56:14 Variable '${original_owner}' not found. 04:56:14 ------------------------------------------------------------------------------ 04:56:14 Send RPC Delete to Old Owner and Verify After Start :: Delete Flow... | FAIL | 04:56:14 Variable '${original_owner}' not found. 04:56:14 ------------------------------------------------------------------------------ 04:56:14 Modify Network and Verify After Start :: Take a link down and veri... | PASS | 04:56:15 ------------------------------------------------------------------------------ 04:56:15 Restore Network and Verify After Start :: Take the link up and ver... | PASS | 04:56:20 ------------------------------------------------------------------------------ 04:56:20 Stop Mininet and Exit :: Stop mininet and exit connection. | PASS | 04:56:23 ------------------------------------------------------------------------------ 04:56:23 Check No Network Operational Information :: Check device is not in... | PASS | 04:56:23 ------------------------------------------------------------------------------ 04:56:23 openflowplugin-clustering.txt.Cluster HA Owner Restart :: Test sui... | FAIL | 04:56:23 51 tests, 11 passed, 40 failed 04:56:23 ============================================================================== 04:56:23 openflowplugin-clustering.txt.Cluster HA Data Recovery Leader Follower Fail... 04:56:23 ============================================================================== 04:56:26 Check Shards Status Before Leader Restart :: Check Status for all ... | PASS | 04:56:30 ------------------------------------------------------------------------------ 04:56:30 Get inventory Leader Before Leader Restart :: Find leader in the i... | PASS | 04:56:31 ------------------------------------------------------------------------------ 04:56:31 Start Mininet Connect To Follower Node1 :: Start mininet with conn... | PASS | 04:56:35 ------------------------------------------------------------------------------ 04:56:35 Add Flows In Follower Node2 and Verify Before Leader Restart :: Ad... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 04:56:35 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 04:56:35 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 04:56:38 | PASS | 04:56:38 ------------------------------------------------------------------------------ 04:56:38 Stop Mininet Connected To Follower Node1 and Exit :: Stop mininet ... 
:1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 04:56:38 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 04:56:38 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 04:56:40 | PASS | 04:56:40 ------------------------------------------------------------------------------ 04:56:40 Restart Leader From Cluster Node :: Stop Leader Node and Start it ... | PASS | 04:57:23 ------------------------------------------------------------------------------ 04:57:23 Get inventory Follower After Leader Restart :: Find new Followers ... | PASS | 04:57:25 ------------------------------------------------------------------------------ 04:57:25 Start Mininet Connect To Old Leader :: Start mininet with connecti... | PASS | 04:57:28 ------------------------------------------------------------------------------ 04:57:28 Verify Flows In Switch After Leader Restart :: Verify flows are in... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 04:57:29 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 04:57:29 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 04:57:30 | PASS | 04:57:30 ------------------------------------------------------------------------------ 04:57:30 Stop Mininet Connected To Old Leader and Exit :: Stop mininet and ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 04:57:30 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 04:57:31 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 04:57:33 | PASS | 04:57:33 ------------------------------------------------------------------------------ 04:57:33 Restart Follower Node2 :: Stop Follower Node2 and Start it Up, Ver... | PASS | 04:58:14 ------------------------------------------------------------------------------ 04:58:14 Get inventory Follower After Follower Restart :: Find Followers an... | PASS | 04:58:16 ------------------------------------------------------------------------------ 04:58:16 Start Mininet Connect To Leader :: Start mininet with connection t... | PASS | 04:58:19 ------------------------------------------------------------------------------ 04:58:19 Verify Flows In Switch After Follower Restart :: Verify flows are ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 04:58:19 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 04:58:19 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 04:58:19 | PASS | 04:58:19 ------------------------------------------------------------------------------ 04:58:19 Stop Mininet Connected To Leader and Exit :: Stop mininet Connecte... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 04:58:20 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 04:58:20 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 04:58:22 | PASS | 04:58:22 ------------------------------------------------------------------------------ 04:58:22 Restart Full Cluster :: Stop all Cluster Nodes and Start it Up All. | PASS | 05:01:07 ------------------------------------------------------------------------------ 05:01:07 Get inventory Status After Cluster Restart :: Find New Followers a... | PASS | 05:01:41 ------------------------------------------------------------------------------ 05:01:41 Start Mininet Connect To Follower Node2 After Cluster Restart :: S... | PASS | 05:01:44 ------------------------------------------------------------------------------ 05:01:44 Verify Flows In Switch After Cluster Restart :: Verify flows are i... 
:1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:01:44 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:01:45 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:01:46 | PASS | 05:01:46 ------------------------------------------------------------------------------ 05:01:46 Delete Flows In Follower Node1 and Verify After Leader Restart :: ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:01:46 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:01:46 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:01:50 | PASS | 05:01:50 ------------------------------------------------------------------------------ 05:01:50 Stop Mininet Connected To Follower Node2 and Exit After Cluster Re... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:01:50 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:01:50 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:01:52 | PASS | 05:01:52 ------------------------------------------------------------------------------ 05:01:52 openflowplugin-clustering.txt.Cluster HA Data Recovery Leader Foll... | PASS | 05:01:52 21 tests, 21 passed, 0 failed 05:01:52 ============================================================================== 05:01:52 /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/test/csit/libraries/VsctlListParser.py:61: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:01:52 if ctl_ref is not "": 05:01:52 openflowplugin-clustering.txt.010 Group Flows :: Switch connections and clu... 05:01:52 ============================================================================== 05:01:55 Add Groups And Flows :: Add 100 groups 1&2 and flows in every switch. | PASS | 05:01:59 ------------------------------------------------------------------------------ 05:01:59 Start Mininet Multiple Connections :: Start mininet linear with co... | PASS | 05:02:08 ------------------------------------------------------------------------------ 05:02:08 Check Linear Topology :: Check Linear Topology. | PASS | 05:02:12 ------------------------------------------------------------------------------ 05:02:12 Check Stats Are Not Frozen :: Check that duration flow stat is inc... | PASS | 05:02:18 ------------------------------------------------------------------------------ 05:02:18 Check Flows In Operational DS :: Check Flows in operational DS. | PASS | 05:02:18 ------------------------------------------------------------------------------ 05:02:18 Check Groups In Operational DS :: Check Groups in operational DS. | PASS | 05:02:19 ------------------------------------------------------------------------------ 05:02:19 Check Flows In Switch :: Check Flows in switch. | PASS | 05:02:19 ------------------------------------------------------------------------------ 05:02:19 Check Entity Owner Status And Find Owner and Successor Before Fail... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:02:19 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:02:19 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:02:50 | FAIL | 05:02:50 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. 
The last error was: Could not parse owner and candidates for device openflow:1 05:02:50 ------------------------------------------------------------------------------ 05:02:50 Disconnect Mininet From Owner :: Disconnect mininet from the owner :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:02:50 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:02:50 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:02:50 | FAIL | 05:02:50 Variable '${original_owner}' not found. 05:02:50 ------------------------------------------------------------------------------ 05:02:50 Check Entity Owner Status And Find Owner and Successor After Fail ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:02:50 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:02:50 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:01 | FAIL | 05:03:01 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Variable '${new_cluster_list}' not found. 05:03:01 ------------------------------------------------------------------------------ 05:03:01 Check Switch Moves To New Master :: Check switch s1 is connected t... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:01 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:01 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:01 | FAIL | 05:03:01 Variable '${new_owner}' not found. 05:03:01 ------------------------------------------------------------------------------ 05:03:01 Check Linear Topology After Disconnect :: Check Linear Topology. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:01 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:01 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:01 | PASS | 05:03:01 ------------------------------------------------------------------------------ 05:03:01 Check Stats Are Not Frozen After Disconnect :: Check that duration... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:02 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:02 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:07 | PASS | 05:03:07 ------------------------------------------------------------------------------ 05:03:07 Remove Flows And Groups After Mininet Is Disconnected :: Remove 1 ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:07 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:07 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:07 | PASS | 05:03:07 ------------------------------------------------------------------------------ 05:03:07 Check Flows In Operational DS After Mininet Is Disconnected :: Che... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:07 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:08 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:10 | PASS | 05:03:10 ------------------------------------------------------------------------------ 05:03:10 Check Groups In Operational DS After Mininet Is Disconnected :: Ch... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:10 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:10 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
05:03:10 | PASS | 05:03:10 ------------------------------------------------------------------------------ 05:03:10 Check Flows In Switch After Mininet Is Disconnected :: Check Flows... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:11 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:11 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:11 | PASS | 05:03:11 ------------------------------------------------------------------------------ 05:03:11 Reconnect Mininet To Owner :: Reconnect mininet to switch 1 owner. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:11 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:11 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:11 | FAIL | 05:03:11 Variable '${original_owner_list}' not found. 05:03:11 ------------------------------------------------------------------------------ 05:03:11 Check Entity Owner Status And Find Owner and Successor After Recon... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:11 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:11 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:42 | FAIL | 05:03:42 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Could not parse owner and candidates for device openflow:1 05:03:42 ------------------------------------------------------------------------------ 05:03:42 Add Flows And Groups After Owner Reconnect :: Add 1 group type 1&2... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:42 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:42 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:44 | PASS | 05:03:44 ------------------------------------------------------------------------------ 05:03:44 Check Stats Are Not Frozen After Owner Reconnect :: Check that dur... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:44 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:44 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:51 | PASS | 05:03:51 ------------------------------------------------------------------------------ 05:03:51 Check Flows After Owner Reconnect In Operational DS :: Check Flows... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:52 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:52 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:52 | PASS | 05:03:52 ------------------------------------------------------------------------------ 05:03:52 Check Groups After Owner Reconnect In Operational DS :: Check Grou... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:52 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:52 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:52 | PASS | 05:03:52 ------------------------------------------------------------------------------ 05:03:52 Check Flows After Owner Reconnect In Switch :: Check Flows in switch. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:52 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:53 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 
05:03:53 | PASS | 05:03:53 ------------------------------------------------------------------------------ 05:03:53 Check Switches Generate Slave Connection :: Check switches are con... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:53 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:53 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:53 | FAIL | 05:03:53 Variable '${original_owner}' not found. 05:03:53 ------------------------------------------------------------------------------ 05:03:53 Disconnect Mininet From Successor :: Disconnect mininet from the S... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:53 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:53 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:53 | FAIL | 05:03:53 Variable '${new_successor_list}' not found. 05:03:53 ------------------------------------------------------------------------------ 05:03:53 Check Entity Owner Status And Find New Owner and Successor After D... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:54 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:03:54 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:04 | FAIL | 05:04:04 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Variable '${owner_list}' not found. 05:04:04 ------------------------------------------------------------------------------ 05:04:04 Disconnect Mininet From Current Owner :: Disconnect mininet from t... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:04 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:04 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:04 | FAIL | 05:04:04 Variable '${current_owner}' not found. 05:04:04 ------------------------------------------------------------------------------ 05:04:04 Check Entity Owner Status And Find Current Owner and Successor Aft... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:04 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:04 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:15 | FAIL | 05:04:15 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Variable '${original_owner_list}' not found. 05:04:15 ------------------------------------------------------------------------------ 05:04:15 Check Switch Moves To Current Master :: Check switch s1 is connect... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:15 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:15 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:15 | FAIL | 05:04:15 Variable '${current_new_owner}' not found. 05:04:15 ------------------------------------------------------------------------------ 05:04:15 Check Linear Topology After Owner Disconnect :: Check Linear Topol... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:15 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:15 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:15 | PASS | 05:04:15 ------------------------------------------------------------------------------ 05:04:15 Check Stats Are Not Frozen After Owner Disconnect :: Check that du... :1: SyntaxWarning: "is not" with a literal. 
Did you mean "!="? 05:04:16 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:16 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:21 | PASS | 05:04:21 ------------------------------------------------------------------------------ 05:04:21 Remove Flows And Groups After Owner Disconnected :: Remove 1 group... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:21 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:21 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:21 | PASS | 05:04:21 ------------------------------------------------------------------------------ 05:04:21 Check Flows In Operational DS After Owner Disconnected :: Check Fl... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:21 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:22 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:24 | PASS | 05:04:24 ------------------------------------------------------------------------------ 05:04:24 Check Groups In Operational DS After Owner Disconnected :: Check G... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:24 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:24 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:24 | PASS | 05:04:24 ------------------------------------------------------------------------------ 05:04:24 Check Flows In Switch After Owner Disconnected :: Check Flows in s... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:25 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:25 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:25 | PASS | 05:04:25 ------------------------------------------------------------------------------ 05:04:25 Disconnect Mininet From Cluster :: Disconnect Mininet from Cluster. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:25 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:25 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:25 | FAIL | 05:04:25 Variable '${original_owner_list}' not found. 05:04:25 ------------------------------------------------------------------------------ 05:04:25 Check No Switches After Disconnect :: Check no switches in topology. :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:25 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:25 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:56 | FAIL | 05:04:56 Keyword 'ClusterOpenFlow.Check No Switches On Member' failed after retrying for 30 seconds. 
The last error was: '{"network-topology:network-topology":{"topology":[{"topology-id":"flow:1","node":[{"node-id":"openflow:2","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']","termination-point":[{"tp-id":"openflow:2:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:LOCAL']"},{"tp-id":"openflow:2:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:1']"},{"tp-id":"openflow:2:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:2']"},{"tp-id":"openflow:2:3","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:2']/node-connector[id='openflow:2:3']"}]},{"node-id":"openflow:3","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']","termination-point":[{"tp-id":"openflow:3:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:1']"},{"tp-id":"openflow:3:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:2']"},{"tp-id":"openflow:3:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:3']/node-connector[id='openflow:3:LOCAL']"}]},{"node-id":"openflow:1","opendaylight-topology-inventory:inventory-node-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']","termination-point":[{"tp-id":"openflow:1:2","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:2']"},{"tp-id":"openflow:1:LOCAL","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:LOCAL']"},{"tp-id":"openflow:1:1","opendaylight-topology-inventory:inventory-node-connector-ref":"/opendaylight-inventory:nodes/node[id='openflow:1']/node-connector[id='openflow:1:1']"}]}],"link":[{"link-id":"openflow:2:2","source":{"source-node":"openflow:2","source-tp":"openflow:2:2"},"destination":{"dest-tp":"openflow:1:2","dest-node":"openflow:1"}},{"link-id":"openflow:2:3","source":{"source-node":"openflow:2","source-tp":"openflow:2:3"},"destination":{"dest-tp":"openflow:3:2","dest-node":"openflow:3"}},{"link-id":"openflow:3:2","source":{"source-node":"openflow:3","source-tp":"openflow:3:2"},"destination":{"dest-tp":"openflow:2:3","dest-node":"openflow:2"}},{"link-id":"openflow:1:2","source":{"source-node":"openflow:1","source-tp":"openflow:1:2"},"destination":{"dest-tp":"openflow:2:2","dest-node":"openflow:2"}}]}]}}' contains 'openflow:1' 05:04:56 ------------------------------------------------------------------------------ 05:04:56 Check Switch Is Not Connected :: Check switch s1 is not connected ... :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:56 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:04:57 :1: SyntaxWarning: "is not" with a literal. Did you mean "!="? 05:05:07 | FAIL | 05:05:07 Keyword 'OvsManager.Should Be Disconnected' failed after retrying for 10 seconds. 
The last error was: Dictionary does not contain key 's1'.
05:05:07 ------------------------------------------------------------------------------
05:05:07 Reconnect Mininet To Cluster :: Reconnect mininet to cluster by re...
05:05:07 10.30.170.174
05:05:08 10.30.170.199
05:05:09 10.30.171.237
05:05:10 | PASS |
05:05:10 ------------------------------------------------------------------------------
05:05:10 Check Linear Topology After Mininet Reconnects :: Check Linear Top...
05:05:11 | PASS |
05:05:11 ------------------------------------------------------------------------------
05:05:11 Add Flows And Groups After Mininet Reconnects :: Add 1 group type ...
05:05:13 | PASS |
05:05:13 ------------------------------------------------------------------------------
05:05:13 Check Flows In Operational DS After Mininet Reconnects :: Check Fl...
05:05:15 | PASS |
05:05:15 ------------------------------------------------------------------------------
05:05:15 Check Groups In Operational DS After Mininet Reconnects :: Check G...
05:05:16 | PASS |
05:05:16 ------------------------------------------------------------------------------
05:05:16 Check Flows In Switch After Mininet Reconnects :: Check Flows in s...
05:05:16 | PASS |
05:05:16 ------------------------------------------------------------------------------
05:05:16 Check Entity Owner Status And Find Owner and Successor Before Owne...
05:05:47 | FAIL |
05:05:47 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Could not parse owner and candidates for device openflow:1
05:05:47 ------------------------------------------------------------------------------
05:05:47 Check Switch Generates Slave Connection Before Owner Stop :: Check...
05:05:47 | FAIL |
05:05:47 Variable '${original_successor}' not found.
05:05:47 ------------------------------------------------------------------------------
05:05:47 Check Shards Status Before Owner Stop :: Check Status for all shar...
05:05:52 | PASS |
05:05:52 ------------------------------------------------------------------------------
05:05:52 Stop Owner Instance :: Stop Owner Instance and verify it is shutdown
05:05:52 | FAIL |
05:05:52 Variable '${original_owner}' not found.
05:05:52 ------------------------------------------------------------------------------
05:05:52 Check Shards Status After Stop :: Check Status for all shards in O...
05:05:52 | FAIL |
05:05:52 Variable '${new_cluster_list}' not found.
05:05:52 ------------------------------------------------------------------------------
05:05:52 Check Entity Owner Status And Find Owner and Successor After Stop ...
05:05:52 | FAIL |
05:05:52 Variable '${original_successor}' not found.
05:05:52 ------------------------------------------------------------------------------
05:05:52 Check Stats Are Not Frozen After Owner Stop :: Check that duration...
05:06:23 | FAIL |
05:06:23 Keyword 'Check Flow Stats Are Not Frozen' failed after retrying for 30 seconds. The last error was: Variable '${new_owner}' not found.
05:06:23 ------------------------------------------------------------------------------
05:06:23 Remove Configuration In Owner and Verify After Owner Stop :: Remov...
05:06:23 | FAIL |
05:06:23 Variable '${new_owner}' not found.
05:06:23 ------------------------------------------------------------------------------
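The ':1: SyntaxWarning: "is not" with a literal' lines that flood this console come from CPython 3.8+, which warns whenever an is/is not comparison is written against a literal: identity is not guaranteed for literals, so an equality operator is what such a check needs. A minimal sketch of the warning and its fix, not the actual expression inside the CSIT libraries:

    # Emits the same SyntaxWarning seen above (identity test against an int literal):
    python3 -c 'rc = 300; print(rc is not 300)'
    # Equality is what the check intends:
    python3 -c 'rc = 300; print(rc != 300)'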
05:06:23 Check Flows After Owner Stop In Operational DS :: Check Flows in O...
05:06:54 | FAIL |
05:06:54 Keyword 'ClusterOpenFlow.Check Number Of Flows On Member' failed after retrying for 30 seconds. The last error was: Variable '${new_owner}' not found.
05:06:54 ------------------------------------------------------------------------------
05:06:54 Check Groups After Owner Stop In Operational DS :: Check Groups in...
05:07:04 | FAIL |
05:07:04 Keyword 'ClusterOpenFlow.Check Number Of Groups On Member' failed after retrying for 10 seconds. The last error was: Variable '${new_owner}' not found.
05:07:04 ------------------------------------------------------------------------------
05:07:04 Check Flows In Switch After Owner Stop :: Check Flows in switch.
05:07:05 | FAIL |
05:07:05 303.0 != 300.0
05:07:05 ------------------------------------------------------------------------------
05:07:05 Start Old Owner Instance :: Start old Owner Instance and verify it...
05:07:05 | FAIL |
05:07:05 Variable '${original_owner}' not found.
05:07:05 ------------------------------------------------------------------------------
05:07:05 Check Entity Owner Status And Find Owner and Successor After Start...
05:07:36 | FAIL |
05:07:36 Keyword 'ClusterOpenFlow.Get OpenFlow Entity Owner Status For One Device' failed after retrying for 10 seconds. The last error was: Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Could not parse owner and candidates for device openflow:1
05:07:36 ------------------------------------------------------------------------------
05:07:36 Check Linear Topology After Owner Restart :: Check Linear Topology.
05:07:36 | PASS |
05:07:36 ------------------------------------------------------------------------------
05:07:36 Add Configuration In Owner and Verify After Owner Restart :: Add 1...
05:07:36 | FAIL |
05:07:36 Variable '${new_owner}' not found.
05:07:36 ------------------------------------------------------------------------------
05:07:36 Check Stats Are Not Frozen After Owner Restart :: Check that durat...
05:07:37 | FAIL |
05:07:37 Variable '${new_owner}' not found.
05:07:37 ------------------------------------------------------------------------------
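Several of the failed steps above compare flow and group counts read from the controllers' operational datastore over RESTCONF. A rough sketch of such a read against one controller; the RESTCONF URL form, table number and admin:admin credentials are assumptions, not values taken from this job:

    # Count flow entries reported for openflow:1, table 0, in the operational datastore (sketch).
    curl -s -u admin:admin \
      "http://10.30.170.174:8181/rests/data/opendaylight-inventory:nodes/node=openflow%3A1/flow-node-inventory:table=0?content=nonconfig" \
      | grep -o '"flow-id"' | wc -l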
05:07:37 Check Flows In Operational DS After Owner Restart :: Check Flows i...
05:07:37 | PASS |
05:07:37 ------------------------------------------------------------------------------
05:07:37 Check Groups In Operational DS After Owner Restart :: Check Groups...
05:07:37 | PASS |
05:07:37 ------------------------------------------------------------------------------
05:07:37 Check Flows In Switch After Owner Restart :: Check Flows in switch.
05:07:38 | PASS |
05:07:38 ------------------------------------------------------------------------------
05:07:38 Restart Cluster :: Stop and Start cluster.
05:10:25 | PASS |
05:10:25 ------------------------------------------------------------------------------
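The Restart Cluster step stops and restarts karaf on all three controllers and then waits for them to come back. A condensed sketch of that flow, assuming the karaf path shown later in this log (/tmp/karaf-0.21.4); the suite itself drives this through Robot keywords, not these commands:

    # Sketch only: restart karaf on each controller, then crudely wait for the RESTCONF port.
    for ip in 10.30.170.174 10.30.170.199 10.30.171.237; do
        ssh "$ip" /tmp/karaf-0.21.4/bin/stop || true
        ssh "$ip" /tmp/karaf-0.21.4/bin/start
    done
    for ip in 10.30.170.174 10.30.170.199 10.30.171.237; do
        until nc -z "$ip" 8181; do sleep 5; done   # port open is only a rough readiness signal
    done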
05:10:25 Check Linear Topology After Controller Restarts :: Check Linear To...
05:10:58 | PASS |
05:10:58 ------------------------------------------------------------------------------
05:10:58 Check Stats Are Not Frozen After Cluster Restart :: Check that dur...
05:11:03 | PASS |
05:11:03 ------------------------------------------------------------------------------
05:11:03 Check Flows In Operational DS After Controller Restarts :: Check F...
05:11:04 | PASS |
05:11:04 ------------------------------------------------------------------------------
05:11:04 Check Groups In Operational DS After Controller Restarts :: Check ...
05:11:04 | PASS |
05:11:04 ------------------------------------------------------------------------------
05:11:04 Check Flows In Switch After Controller Restarts :: Check Flows in ...
05:11:05 | PASS |
05:11:05 ------------------------------------------------------------------------------
05:11:05 Stop Mininet :: Stop Mininet.
05:11:05 | PASS |
05:11:05 ------------------------------------------------------------------------------
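The 'Check Flows In Switch' steps compare the number of flows programmed on the switch with the number pushed through the controllers (the earlier '303.0 != 300.0' failure is exactly such a count mismatch). A sketch of reading that count directly from OVS on the tools VM; the bridge name s1 and OpenFlow 1.3 are assumptions:

    # Count flow entries on bridge s1 straight from OVS; every flow line carries a cookie= field.
    sudo ovs-ofctl -O OpenFlow13 dump-flows s1 | grep -c 'cookie='
    # Duration fields should keep increasing between two dumps if statistics are not frozen.
    sudo ovs-ofctl -O OpenFlow13 dump-flows s1 | grep -o 'duration=[0-9.]*s' | head -5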
05:11:05 Check No Switches :: Check no switches in topology. | PASS |
05:11:07 ------------------------------------------------------------------------------
05:11:09 openflowplugin-clustering.txt.010 Group Flows :: Switch connection... | FAIL |
05:11:09 72 tests, 43 passed, 29 failed
05:11:09 ==============================================================================
05:11:09 openflowplugin-clustering.txt.010 Switch Disconnect :: Test suite for entit...
05:11:09 ==============================================================================
05:11:13 Switches To Be Connected To All Nodes :: Initial check for correct... | FAIL |
05:11:13 Parent suite setup failed:
05:11:13 Dictionary does not contain key 's1'.
05:11:13 ------------------------------------------------------------------------------
05:11:13 Reconnecting Switch s1 | FAIL |
05:11:13 Parent suite setup failed:
05:11:13 Dictionary does not contain key 's1'.
05:11:13 ------------------------------------------------------------------------------
05:11:13 Switches Still Be Connected To All Nodes | FAIL |
05:11:13 Parent suite setup failed:
05:11:13 Dictionary does not contain key 's1'.
05:11:13 ------------------------------------------------------------------------------
05:11:13 openflowplugin-clustering.txt.010 Switch Disconnect :: Test suite ... | FAIL |
05:11:13 Suite setup failed:
05:11:13 Dictionary does not contain key 's1'.
05:11:13
05:11:13 3 tests, 0 passed, 3 failed
05:11:13 ==============================================================================
05:11:13 openflowplugin-clustering.txt.020 Cluster Node Failure :: Test suite for en...
05:11:13 ==============================================================================
05:11:19 Switches To Be Connected To All Nodes :: Initial check for correct... | FAIL |
05:11:19 Parent suite setup failed:
05:11:19 Dictionary does not contain key 's1'.
05:11:19 ------------------------------------------------------------------------------
05:11:19 Restarting Owner Of Switch s1 | FAIL |
05:11:19 Parent suite setup failed:
05:11:19 Dictionary does not contain key 's1'.
05:11:19 ------------------------------------------------------------------------------
05:11:19 Switches Still Be Connected To All Nodes | FAIL |
05:11:19 Parent suite setup failed:
05:11:19 Dictionary does not contain key 's1'.
05:11:19 ------------------------------------------------------------------------------
05:11:19 openflowplugin-clustering.txt.020 Cluster Node Failure :: Test sui... | FAIL |
05:11:19 Suite setup failed:
05:11:19 Dictionary does not contain key 's1'.
05:11:19
05:11:19 3 tests, 0 passed, 3 failed
05:11:19 ==============================================================================
05:11:19 openflowplugin-clustering.txt.030 Cluster Sync Problems :: Test suite for e...
05:11:19 ==============================================================================
05:11:21 Start Mininet To All Nodes | FAIL |
05:11:23 Dictionary does not contain key 's1'.
05:11:23 ------------------------------------------------------------------------------
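Every failure in the .010 and .020 suites above is the same 'Dictionary does not contain key 's1'' raised during suite setup, i.e. the keyword that maps OVS bridges to their controller connections never saw a bridge named s1. A quick way to check what OVS actually holds on the mininet VM (sketch; assumes the switches live in the root OVS database):

    # List the bridges OVS knows about and their controller connection state.
    sudo ovs-vsctl list-br
    sudo ovs-vsctl show | grep -E 'Bridge|Controller|is_connected'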
05:11:23 Switches To Be Connected To All Nodes :: Initial check for correct...
05:11:39 | FAIL |
05:11:39 Keyword 'Check All Switches Connected To All Cluster Nodes' failed after retrying 15 times. The last error was: Dictionary does not contain key 's1'.
05:11:39 ------------------------------------------------------------------------------
05:11:39 Isolating Owner Of Switch s1
05:12:11 | FAIL |
05:12:11 This test fails due to https://bugs.opendaylight.org/show_bug.cgi?id=6177
05:12:11
05:12:11 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Could not parse owner and candidates for device openflow:1
05:12:11 ------------------------------------------------------------------------------
05:12:11 Switches Still Be Connected To All Nodes
05:12:26 | FAIL |
05:12:26 This test fails due to https://bugs.opendaylight.org/show_bug.cgi?id=6177
05:12:26
05:12:26 Keyword 'Check All Switches Connected To All Cluster Nodes' failed after retrying 15 times. The last error was: Dictionary does not contain key 's1'.
05:12:26 ------------------------------------------------------------------------------
05:12:26 Stop Mininet And Verify No Owners
05:12:41 | FAIL |
05:12:41 This test fails due to https://bugs.opendaylight.org/show_bug.cgi?id=6177
05:12:41
05:12:41 Keyword 'Check No Device Owners In Controller' failed after retrying 15 times. The last error was: Dictionary does not contain key '1'.
05:12:41 ------------------------------------------------------------------------------
05:12:43 openflowplugin-clustering.txt.030 Cluster Sync Problems :: Test su... | FAIL |
05:12:43 5 tests, 0 passed, 5 failed
05:12:43 ==============================================================================
05:12:43 openflowplugin-clustering.txt.9145 :: Switch connections and cluster are re...
05:12:43 ==============================================================================
05:12:43 Start Mininet Multiple Connections :: Start mininet linear with co... | PASS |
05:12:51 ------------------------------------------------------------------------------
05:12:51 Check Entity Owner Status And Find Owner and Successor :: Check En... | FAIL |
05:13:21 This test fails due to https://bugs.opendaylight.org/show_bug.cgi?id=9145
05:13:21
05:13:21 Keyword 'ClusterManagement.Verify_Owner_And_Successors_For_Device' failed after retrying for 30 seconds. The last error was: Could not parse owner and candidates for device openflow:1
05:13:21 ------------------------------------------------------------------------------
05:13:21 Stop Mininet :: Stop Mininet. | PASS |
05:13:21 ------------------------------------------------------------------------------
05:13:21 openflowplugin-clustering.txt.9145 :: Switch connections and clust...
| FAIL | 05:13:21 3 tests, 2 passed, 1 failed 05:13:21 ============================================================================== 05:13:21 openflowplugin-clustering.txt | FAIL | 05:13:21 209 tests, 88 passed, 121 failed 05:13:21 ============================================================================== 05:13:21 Output: /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/output.xml 05:13:28 Log: /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/log.html 05:13:28 Report: /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/report.html 05:13:28 + true 05:13:28 + echo 'Examining the files in data/log and checking filesize' 05:13:28 Examining the files in data/log and checking filesize 05:13:28 + ssh 10.30.170.174 'ls -altr /tmp/karaf-0.21.4/data/log/' 05:13:28 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 05:13:29 total 30232 05:13:29 drwxrwxr-x 2 jenkins jenkins 4096 Dec 2 04:49 . 05:13:29 -rw-rw-r-- 1 jenkins jenkins 1720 Dec 2 04:49 karaf_console.log 05:13:29 drwxrwxr-x 9 jenkins jenkins 4096 Dec 2 04:50 .. 05:13:29 -rw-rw-r-- 1 jenkins jenkins 30938547 Dec 2 05:13 karaf.log 05:13:29 + ssh 10.30.170.174 'du -hs /tmp/karaf-0.21.4/data/log/*' 05:13:29 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 05:13:29 30M /tmp/karaf-0.21.4/data/log/karaf.log 05:13:29 4.0K /tmp/karaf-0.21.4/data/log/karaf_console.log 05:13:29 + ssh 10.30.170.199 'ls -altr /tmp/karaf-0.21.4/data/log/' 05:13:29 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 05:13:29 total 26324 05:13:29 drwxrwxr-x 2 jenkins jenkins 4096 Dec 2 04:49 . 05:13:29 -rw-rw-r-- 1 jenkins jenkins 1720 Dec 2 04:49 karaf_console.log 05:13:29 drwxrwxr-x 9 jenkins jenkins 4096 Dec 2 04:50 .. 05:13:29 -rw-rw-r-- 1 jenkins jenkins 26939281 Dec 2 05:13 karaf.log 05:13:29 + ssh 10.30.170.199 'du -hs /tmp/karaf-0.21.4/data/log/*' 05:13:29 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 05:13:29 26M /tmp/karaf-0.21.4/data/log/karaf.log 05:13:29 4.0K /tmp/karaf-0.21.4/data/log/karaf_console.log 05:13:29 + ssh 10.30.171.237 'ls -altr /tmp/karaf-0.21.4/data/log/' 05:13:29 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 05:13:29 total 29336 05:13:29 drwxrwxr-x 2 jenkins jenkins 4096 Dec 2 04:49 . 05:13:29 -rw-rw-r-- 1 jenkins jenkins 1720 Dec 2 04:49 karaf_console.log 05:13:29 drwxrwxr-x 9 jenkins jenkins 4096 Dec 2 04:50 .. 05:13:29 -rw-rw-r-- 1 jenkins jenkins 30019612 Dec 2 05:13 karaf.log 05:13:29 + ssh 10.30.171.237 'du -hs /tmp/karaf-0.21.4/data/log/*' 05:13:29 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 05:13:30 29M /tmp/karaf-0.21.4/data/log/karaf.log 05:13:30 4.0K /tmp/karaf-0.21.4/data/log/karaf_console.log 05:13:30 + set +e 05:13:30 ++ seq 1 3 05:13:30 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 05:13:30 + CONTROLLERIP=ODL_SYSTEM_1_IP 05:13:30 + echo 'Let'\''s take the karaf thread dump again' 05:13:30 Let's take the karaf thread dump again 05:13:30 + ssh 10.30.170.174 'sudo ps aux' 05:13:30 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 
05:13:30 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/ps_after.log 05:13:30 ++ grep -v grep 05:13:30 ++ tr -s ' ' 05:13:30 ++ cut -f2 '-d ' 05:13:30 + pid=6919 05:13:30 + echo 'karaf main: org.apache.karaf.main.Main, pid:6919' 05:13:30 karaf main: org.apache.karaf.main.Main, pid:6919 05:13:30 + ssh 10.30.170.174 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 6919' 05:13:30 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 05:13:30 + echo 'killing karaf process...' 05:13:30 killing karaf process... 05:13:30 + ssh 10.30.170.174 bash -c 'ps axf | grep karaf | grep -v grep | awk '\''{print "kill -9 " $1}'\'' | sh' 05:13:30 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 05:13:31 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 05:13:31 + CONTROLLERIP=ODL_SYSTEM_2_IP 05:13:31 + echo 'Let'\''s take the karaf thread dump again' 05:13:31 Let's take the karaf thread dump again 05:13:31 + ssh 10.30.170.199 'sudo ps aux' 05:13:31 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 05:13:31 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/ps_after.log 05:13:31 ++ grep -v grep 05:13:31 ++ cut -f2 '-d ' 05:13:31 ++ tr -s ' ' 05:13:31 + pid=7454 05:13:31 + echo 'karaf main: org.apache.karaf.main.Main, pid:7454' 05:13:31 karaf main: org.apache.karaf.main.Main, pid:7454 05:13:31 + ssh 10.30.170.199 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 7454' 05:13:31 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 05:13:31 + echo 'killing karaf process...' 05:13:31 killing karaf process... 05:13:31 + ssh 10.30.170.199 bash -c 'ps axf | grep karaf | grep -v grep | awk '\''{print "kill -9 " $1}'\'' | sh' 05:13:32 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 05:13:32 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 05:13:32 + CONTROLLERIP=ODL_SYSTEM_3_IP 05:13:32 + echo 'Let'\''s take the karaf thread dump again' 05:13:32 Let's take the karaf thread dump again 05:13:32 + ssh 10.30.171.237 'sudo ps aux' 05:13:32 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 05:13:32 ++ grep org.apache.karaf.main.Main /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/ps_after.log 05:13:32 ++ grep -v grep 05:13:32 ++ cut -f2 '-d ' 05:13:32 ++ tr -s ' ' 05:13:32 + pid=14081 05:13:32 + echo 'karaf main: org.apache.karaf.main.Main, pid:14081' 05:13:32 karaf main: org.apache.karaf.main.Main, pid:14081 05:13:32 + ssh 10.30.171.237 '/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l 14081' 05:13:32 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 05:13:32 + echo 'killing karaf process...' 05:13:32 killing karaf process... 05:13:32 + ssh 10.30.171.237 bash -c 'ps axf | grep karaf | grep -v grep | awk '\''{print "kill -9 " $1}'\'' | sh' 05:13:32 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 05:13:33 + sleep 5 05:13:38 ++ seq 1 3 05:13:38 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 05:13:38 + CONTROLLERIP=ODL_SYSTEM_1_IP 05:13:38 + echo 'Compressing karaf.log 1' 05:13:38 Compressing karaf.log 1 05:13:38 + ssh 10.30.170.174 gzip --best /tmp/karaf-0.21.4/data/log/karaf.log 05:13:38 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 
05:13:38 gzip: /tmp/karaf-0.21.4/data/log/karaf.log: file size changed while zipping 05:13:38 + echo 'Fetching compressed karaf.log 1' 05:13:38 Fetching compressed karaf.log 1 05:13:38 + scp 10.30.170.174:/tmp/karaf-0.21.4/data/log/karaf.log.gz odl1_karaf.log.gz 05:13:38 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 05:13:38 + ssh 10.30.170.174 rm -f /tmp/karaf-0.21.4/data/log/karaf.log.gz 05:13:38 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 05:13:39 + scp 10.30.170.174:/tmp/karaf-0.21.4/data/log/karaf_console.log odl1_karaf_console.log 05:13:39 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 05:13:39 + ssh 10.30.170.174 rm -f /tmp/karaf-0.21.4/data/log/karaf_console.log 05:13:39 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 05:13:39 + echo 'Fetch GC logs' 05:13:39 Fetch GC logs 05:13:39 + mkdir -p gclogs-1 05:13:39 + scp '10.30.170.174:/tmp/karaf-0.21.4/data/log/*.log' gclogs-1/ 05:13:39 Warning: Permanently added '10.30.170.174' (ECDSA) to the list of known hosts. 05:13:39 scp: /tmp/karaf-0.21.4/data/log/*.log: No such file or directory 05:13:39 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 05:13:39 + CONTROLLERIP=ODL_SYSTEM_2_IP 05:13:39 + echo 'Compressing karaf.log 2' 05:13:39 Compressing karaf.log 2 05:13:39 + ssh 10.30.170.199 gzip --best /tmp/karaf-0.21.4/data/log/karaf.log 05:13:39 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 05:13:40 gzip: /tmp/karaf-0.21.4/data/log/karaf.log: file size changed while zipping 05:13:40 + echo 'Fetching compressed karaf.log 2' 05:13:40 Fetching compressed karaf.log 2 05:13:40 + scp 10.30.170.199:/tmp/karaf-0.21.4/data/log/karaf.log.gz odl2_karaf.log.gz 05:13:40 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 05:13:40 + ssh 10.30.170.199 rm -f /tmp/karaf-0.21.4/data/log/karaf.log.gz 05:13:40 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 05:13:40 + scp 10.30.170.199:/tmp/karaf-0.21.4/data/log/karaf_console.log odl2_karaf_console.log 05:13:40 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 05:13:40 + ssh 10.30.170.199 rm -f /tmp/karaf-0.21.4/data/log/karaf_console.log 05:13:41 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 05:13:41 + echo 'Fetch GC logs' 05:13:41 Fetch GC logs 05:13:41 + mkdir -p gclogs-2 05:13:41 + scp '10.30.170.199:/tmp/karaf-0.21.4/data/log/*.log' gclogs-2/ 05:13:41 Warning: Permanently added '10.30.170.199' (ECDSA) to the list of known hosts. 05:13:41 scp: /tmp/karaf-0.21.4/data/log/*.log: No such file or directory 05:13:41 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") 05:13:41 + CONTROLLERIP=ODL_SYSTEM_3_IP 05:13:41 + echo 'Compressing karaf.log 3' 05:13:41 Compressing karaf.log 3 05:13:41 + ssh 10.30.171.237 gzip --best /tmp/karaf-0.21.4/data/log/karaf.log 05:13:41 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 05:13:41 gzip: /tmp/karaf-0.21.4/data/log/karaf.log: file size changed while zipping 05:13:41 + echo 'Fetching compressed karaf.log 3' 05:13:41 Fetching compressed karaf.log 3 05:13:41 + scp 10.30.171.237:/tmp/karaf-0.21.4/data/log/karaf.log.gz odl3_karaf.log.gz 05:13:42 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 05:13:42 + ssh 10.30.171.237 rm -f /tmp/karaf-0.21.4/data/log/karaf.log.gz 05:13:42 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 
05:13:42 + scp 10.30.171.237:/tmp/karaf-0.21.4/data/log/karaf_console.log odl3_karaf_console.log 05:13:42 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 05:13:42 + ssh 10.30.171.237 rm -f /tmp/karaf-0.21.4/data/log/karaf_console.log 05:13:42 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 05:13:42 + echo 'Fetch GC logs' 05:13:42 Fetch GC logs 05:13:42 + mkdir -p gclogs-3 05:13:42 + scp '10.30.171.237:/tmp/karaf-0.21.4/data/log/*.log' gclogs-3/ 05:13:42 Warning: Permanently added '10.30.171.237' (ECDSA) to the list of known hosts. 05:13:43 scp: /tmp/karaf-0.21.4/data/log/*.log: No such file or directory 05:13:43 + echo 'Examine copied files' 05:13:43 Examine copied files 05:13:43 + ls -lt 05:13:43 total 56296 05:13:43 drwxrwxr-x. 2 jenkins jenkins 6 Dec 2 05:13 gclogs-3 05:13:43 -rw-rw-r--. 1 jenkins jenkins 1720 Dec 2 05:13 odl3_karaf_console.log 05:13:43 -rw-rw-r--. 1 jenkins jenkins 525456 Dec 2 05:13 odl3_karaf.log.gz 05:13:43 drwxrwxr-x. 2 jenkins jenkins 6 Dec 2 05:13 gclogs-2 05:13:43 -rw-rw-r--. 1 jenkins jenkins 1720 Dec 2 05:13 odl2_karaf_console.log 05:13:43 -rw-rw-r--. 1 jenkins jenkins 481211 Dec 2 05:13 odl2_karaf.log.gz 05:13:43 drwxrwxr-x. 2 jenkins jenkins 6 Dec 2 05:13 gclogs-1 05:13:43 -rw-rw-r--. 1 jenkins jenkins 1720 Dec 2 05:13 odl1_karaf_console.log 05:13:43 -rw-rw-r--. 1 jenkins jenkins 749310 Dec 2 05:13 odl1_karaf.log.gz 05:13:43 -rw-rw-r--. 1 jenkins jenkins 122375 Dec 2 05:13 karaf_3_14081_threads_after.log 05:13:43 -rw-rw-r--. 1 jenkins jenkins 13721 Dec 2 05:13 ps_after.log 05:13:43 -rw-rw-r--. 1 jenkins jenkins 131666 Dec 2 05:13 karaf_2_7454_threads_after.log 05:13:43 -rw-rw-r--. 1 jenkins jenkins 132337 Dec 2 05:13 karaf_1_6919_threads_after.log 05:13:43 -rw-rw-r--. 1 jenkins jenkins 275222 Dec 2 05:13 report.html 05:13:43 -rw-rw-r--. 1 jenkins jenkins 4183829 Dec 2 05:13 log.html 05:13:43 -rw-rw-r--. 1 jenkins jenkins 50636684 Dec 2 05:13 output.xml 05:13:43 -rw-rw-r--. 1 jenkins jenkins 1180 Dec 2 04:53 testplan.txt 05:13:43 -rw-rw-r--. 1 jenkins jenkins 92372 Dec 2 04:53 karaf_3_2143_threads_before.log 05:13:43 -rw-rw-r--. 1 jenkins jenkins 13957 Dec 2 04:53 ps_before.log 05:13:43 -rw-rw-r--. 1 jenkins jenkins 98247 Dec 2 04:53 karaf_2_2144_threads_before.log 05:13:43 -rw-rw-r--. 1 jenkins jenkins 95778 Dec 2 04:53 karaf_1_2134_threads_before.log 05:13:43 -rw-rw-r--. 1 jenkins jenkins 3043 Dec 2 04:49 post-startup-script.sh 05:13:43 -rw-rw-r--. 1 jenkins jenkins 1179 Dec 2 04:49 set_akka_debug.sh 05:13:43 -rw-rw-r--. 1 jenkins jenkins 133 Dec 2 04:49 configplan.txt 05:13:43 -rw-rw-r--. 1 jenkins jenkins 225 Dec 2 04:49 startup-script.sh 05:13:43 -rw-rw-r--. 1 jenkins jenkins 3289 Dec 2 04:49 configuration-script.sh 05:13:43 -rw-rw-r--. 1 jenkins jenkins 266 Dec 2 04:49 detect_variables.env 05:13:43 -rw-rw-r--. 1 jenkins jenkins 92 Dec 2 04:49 set_variables.env 05:13:43 -rw-rw-r--. 1 jenkins jenkins 359 Dec 2 04:49 slave_addresses.txt 05:13:43 -rw-rw-r--. 1 jenkins jenkins 570 Dec 2 04:48 requirements.txt 05:13:43 -rw-rw-r--. 1 jenkins jenkins 26 Dec 2 04:48 env.properties 05:13:43 -rw-rw-r--. 1 jenkins jenkins 334 Dec 2 04:46 stack-parameters.yaml 05:13:43 drwxrwxr-x. 7 jenkins jenkins 4096 Dec 2 04:45 test 05:13:43 drwxrwxr-x. 2 jenkins jenkins 6 Dec 2 04:45 test@tmp 05:13:43 + true 05:13:43 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/sh /tmp/jenkins6224761611688636896.sh 05:13:43 Cleaning up Robot installation... 
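The collection steps above grep the saved ps listing for org.apache.karaf.main.Main to find each karaf PID, take a jstack thread dump, kill the process, then gzip and scp karaf.log (the 'file size changed while zipping' warnings appear because karaf was still writing when gzip ran). A condensed sketch of the same per-controller loop; pgrep and the snapshot copy are substitutions of mine, not what the job actually runs:

    # Sketch of the per-controller collection loop; paths and artifact names follow the log above.
    i=0
    for ip in 10.30.170.174 10.30.170.199 10.30.171.237; do
        i=$((i + 1))
        pid=$(ssh "$ip" pgrep -f org.apache.karaf.main.Main | head -1)
        ssh "$ip" "/usr/lib/jvm/java-21-openjdk-amd64/bin/jstack -l $pid" > "karaf_${i}_${pid}_threads_after.log"
        ssh "$ip" "kill -9 $pid"
        # Copy first so gzip compresses a stable file instead of the live karaf.log:
        ssh "$ip" "cp /tmp/karaf-0.21.4/data/log/karaf.log /tmp/karaf.log.snap && gzip --best -f /tmp/karaf.log.snap"
        scp "$ip:/tmp/karaf.log.snap.gz" "odl${i}_karaf.log.gz"
    done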
05:13:43 $ ssh-agent -k 05:13:43 unset SSH_AUTH_SOCK; 05:13:43 unset SSH_AGENT_PID; 05:13:43 echo Agent pid 5283 killed; 05:13:43 [ssh-agent] Stopped. 05:13:43 Recording plot data 05:13:43 Robot results publisher started... 05:13:43 INFO: Checking test criticality is deprecated and will be dropped in a future release! 05:13:43 -Parsing output xml: 05:13:44 Done! 05:13:44 -Copying log files to build dir: 05:13:46 Done! 05:13:46 -Assigning results to build: 05:13:46 Done! 05:13:46 -Checking thresholds: 05:13:46 Done! 05:13:46 Done publishing Robot results. 05:13:46 Build step 'Publish Robot Framework test results' changed build result to UNSTABLE 05:13:46 [PostBuildScript] - [INFO] Executing post build scripts. 05:13:46 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash /tmp/jenkins6145812598739011883.sh 05:13:46 Archiving csit artifacts 05:13:46 mv: cannot stat '*_1.png': No such file or directory 05:13:46 mv: cannot stat '/tmp/odl1_*': No such file or directory 05:13:46 mv: cannot stat '*_2.png': No such file or directory 05:13:46 mv: cannot stat '/tmp/odl2_*': No such file or directory 05:13:46 mv: cannot stat '*_3.png': No such file or directory 05:13:46 mv: cannot stat '/tmp/odl3_*': No such file or directory 05:13:46 % Total % Received % Xferd Average Speed Time Time Time Current 05:13:46 Dload Upload Total Spent Left Speed 05:13:46 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 4680k 0 4680k 0 0 5086k 0 --:--:-- --:--:-- --:--:-- 5081k 100 5173k 0 5173k 0 0 5127k 0 --:--:-- 0:00:01 --:--:-- 5127k 05:13:47 Archive: robot-plugin.zip 05:13:47 inflating: ./archives/robot-plugin/log.html 05:13:47 inflating: ./archives/robot-plugin/output.xml 05:13:48 inflating: ./archives/robot-plugin/report.html 05:13:48 mv: cannot stat '*.log.gz': No such file or directory 05:13:48 mv: cannot stat '*.csv': No such file or directory 05:13:48 mv: cannot stat '*.png': No such file or directory 05:13:48 [PostBuildScript] - [INFO] Executing post build scripts. 05:13:48 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash /tmp/jenkins15888623006471519486.sh 05:13:48 [PostBuildScript] - [INFO] Executing post build scripts. 05:13:48 [EnvInject] - Injecting environment variables from a build step. 05:13:48 [EnvInject] - Injecting as environment variables the properties content 05:13:48 OS_CLOUD=vex 05:13:48 OS_STACK_NAME=releng-openflowplugin-csit-3node-clustering-only-scandium-465 05:13:48 05:13:48 [EnvInject] - Variables injected successfully. 05:13:48 provisioning config files... 05:13:48 copy managed file [clouds-yaml] to file:/home/jenkins/.config/openstack/clouds.yaml 05:13:48 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash /tmp/jenkins17845424921225017148.sh 05:13:48 ---> openstack-stack-delete.sh 05:13:48 Setup pyenv: 05:13:48 system 05:13:48 3.8.13 05:13:48 3.9.13 05:13:48 3.10.13 05:13:48 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/.python-version) 05:13:49 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-AAlE from file:/tmp/.os_lf_venv 05:13:49 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) 05:13:49 lf-activate-venv(): INFO: Attempting to install with network-safe options... 05:13:50 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 05:13:50 lftools 0.37.16 requires urllib3<2.1.0, but you have urllib3 2.5.0 which is incompatible. 
05:13:51 lf-activate-venv(): INFO: Base packages installed successfully 05:13:51 lf-activate-venv(): INFO: Installing additional packages: lftools[openstack] kubernetes python-heatclient python-openstackclient urllib3~=1.26.15 05:14:09 lf-activate-venv(): INFO: Adding /tmp/venv-AAlE/bin to PATH 05:14:09 INFO: Stack cost retrieval disabled, setting cost to 0 05:14:21 INFO: Deleting stack releng-openflowplugin-csit-3node-clustering-only-scandium-465 05:14:21 Successfully deleted stack releng-openflowplugin-csit-3node-clustering-only-scandium-465 05:14:21 [PostBuildScript] - [INFO] Executing post build scripts. 05:14:21 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash /tmp/jenkins12886800453082081828.sh 05:14:21 ---> sysstat.sh 05:14:21 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash /tmp/jenkins11336835355522030705.sh 05:14:21 ---> package-listing.sh 05:14:21 ++ facter osfamily 05:14:21 ++ tr '[:upper:]' '[:lower:]' 05:14:21 + OS_FAMILY=redhat 05:14:21 + workspace=/w/workspace/openflowplugin-csit-3node-clustering-only-scandium 05:14:21 + START_PACKAGES=/tmp/packages_start.txt 05:14:21 + END_PACKAGES=/tmp/packages_end.txt 05:14:21 + DIFF_PACKAGES=/tmp/packages_diff.txt 05:14:21 + PACKAGES=/tmp/packages_start.txt 05:14:21 + '[' /w/workspace/openflowplugin-csit-3node-clustering-only-scandium ']' 05:14:21 + PACKAGES=/tmp/packages_end.txt 05:14:21 + case "${OS_FAMILY}" in 05:14:21 + rpm -qa 05:14:21 + sort 05:14:22 + '[' -f /tmp/packages_start.txt ']' 05:14:22 + '[' -f /tmp/packages_end.txt ']' 05:14:22 + diff /tmp/packages_start.txt /tmp/packages_end.txt 05:14:22 + '[' /w/workspace/openflowplugin-csit-3node-clustering-only-scandium ']' 05:14:22 + mkdir -p /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/archives/ 05:14:22 + cp -f /tmp/packages_diff.txt /tmp/packages_end.txt /tmp/packages_start.txt /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/archives/ 05:14:22 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash /tmp/jenkins4872148277189422386.sh 05:14:22 ---> capture-instance-metadata.sh 05:14:22 Setup pyenv: 05:14:22 system 05:14:22 3.8.13 05:14:22 3.9.13 05:14:22 3.10.13 05:14:22 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/.python-version) 05:14:22 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-AAlE from file:/tmp/.os_lf_venv 05:14:22 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) 05:14:22 lf-activate-venv(): INFO: Attempting to install with network-safe options... 05:14:24 lf-activate-venv(): INFO: Base packages installed successfully 05:14:24 lf-activate-venv(): INFO: Installing additional packages: lftools 05:14:34 lf-activate-venv(): INFO: Adding /tmp/venv-AAlE/bin to PATH 05:14:34 INFO: Running in OpenStack, capturing instance metadata 05:14:35 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash /tmp/jenkins8047555434910434791.sh 05:14:35 provisioning config files... 05:14:35 Could not find credentials [logs] for openflowplugin-csit-3node-clustering-only-scandium #465 05:14:35 copy managed file [jenkins-log-archives-settings] to file:/w/workspace/openflowplugin-csit-3node-clustering-only-scandium@tmp/config7801131339139766757tmp 05:14:35 Regular expression run condition: Expression=[^.*logs-s3.*], Label=[odl-logs-s3-cloudfront-index] 05:14:35 Run condition [Regular expression match] enabling perform for step [Provide Configuration files] 05:14:35 provisioning config files... 
05:14:35 copy managed file [jenkins-s3-log-ship] to file:/home/jenkins/.aws/credentials 05:14:35 [EnvInject] - Injecting environment variables from a build step. 05:14:35 [EnvInject] - Injecting as environment variables the properties content 05:14:35 SERVER_ID=logs 05:14:35 05:14:35 [EnvInject] - Variables injected successfully. 05:14:35 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash /tmp/jenkins8063516744325674639.sh 05:14:35 ---> create-netrc.sh 05:14:35 WARN: Log server credential not found. 05:14:35 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash /tmp/jenkins13036158284682574959.sh 05:14:35 ---> python-tools-install.sh 05:14:35 Setup pyenv: 05:14:35 system 05:14:35 3.8.13 05:14:35 3.9.13 05:14:35 3.10.13 05:14:35 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/.python-version) 05:14:36 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-AAlE from file:/tmp/.os_lf_venv 05:14:36 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) 05:14:36 lf-activate-venv(): INFO: Attempting to install with network-safe options... 05:14:37 lf-activate-venv(): INFO: Base packages installed successfully 05:14:37 lf-activate-venv(): INFO: Installing additional packages: lftools 05:14:47 lf-activate-venv(): INFO: Adding /tmp/venv-AAlE/bin to PATH 05:14:47 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash /tmp/jenkins10842260149835362334.sh 05:14:47 ---> sudo-logs.sh 05:14:47 Archiving 'sudo' log.. 05:14:48 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash /tmp/jenkins4055252535735832489.sh 05:14:48 ---> job-cost.sh 05:14:48 Setup pyenv: 05:14:48 system 05:14:48 3.8.13 05:14:48 3.9.13 05:14:48 3.10.13 05:14:48 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/.python-version) 05:14:48 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-AAlE from file:/tmp/.os_lf_venv 05:14:48 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) 05:14:48 lf-activate-venv(): INFO: Attempting to install with network-safe options... 05:14:50 lf-activate-venv(): INFO: Base packages installed successfully 05:14:50 lf-activate-venv(): INFO: Installing additional packages: zipp==1.1.0 python-openstackclient urllib3~=1.26.15 05:14:57 lf-activate-venv(): INFO: Adding /tmp/venv-AAlE/bin to PATH 05:14:57 DEBUG: total: 0 05:14:57 INFO: Retrieving Stack Cost... 05:14:58 INFO: Retrieving Pricing Info for: v3-standard-2 05:14:58 INFO: Archiving Costs 05:14:58 [openflowplugin-csit-3node-clustering-only-scandium] $ /bin/bash -l /tmp/jenkins6828376327948628276.sh 05:14:58 ---> logs-deploy.sh 05:14:58 Setup pyenv: 05:14:58 system 05:14:58 3.8.13 05:14:58 3.9.13 05:14:58 3.10.13 05:14:58 * 3.11.7 (set by /w/workspace/openflowplugin-csit-3node-clustering-only-scandium/.python-version) 05:14:58 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-AAlE from file:/tmp/.os_lf_venv 05:14:58 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) 05:14:58 lf-activate-venv(): INFO: Attempting to install with network-safe options... 
05:15:00 lf-activate-venv(): INFO: Base packages installed successfully
05:15:00 lf-activate-venv(): INFO: Installing additional packages: lftools urllib3~=1.26.15
05:15:13 lf-activate-venv(): INFO: Adding /tmp/venv-AAlE/bin to PATH
05:15:13 WARNING: Nexus logging server not set
05:15:13 INFO: S3 path logs/releng/vex-yul-odl-jenkins-1/openflowplugin-csit-3node-clustering-only-scandium/465/
05:15:13 INFO: archiving logs to S3
05:15:14 ---> uname -a:
05:15:14 Linux prd-centos8-robot-2c-8g-3048.novalocal 4.18.0-553.5.1.el8.x86_64 #1 SMP Tue May 21 05:46:01 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
05:15:14
05:15:14
05:15:14 ---> lscpu:
05:15:14 Architecture: x86_64
05:15:14 CPU op-mode(s): 32-bit, 64-bit
05:15:14 Byte Order: Little Endian
05:15:14 CPU(s): 2
05:15:14 On-line CPU(s) list: 0,1
05:15:14 Thread(s) per core: 1
05:15:14 Core(s) per socket: 1
05:15:14 Socket(s): 2
05:15:14 NUMA node(s): 1
05:15:14 Vendor ID: AuthenticAMD
05:15:14 CPU family: 23
05:15:14 Model: 49
05:15:14 Model name: AMD EPYC-Rome Processor
05:15:14 Stepping: 0
05:15:14 CPU MHz: 2800.000
05:15:14 BogoMIPS: 5600.00
05:15:14 Virtualization: AMD-V
05:15:14 Hypervisor vendor: KVM
05:15:14 Virtualization type: full
05:15:14 L1d cache: 32K
05:15:14 L1i cache: 32K
05:15:14 L2 cache: 512K
05:15:14 L3 cache: 16384K
05:15:14 NUMA node0 CPU(s): 0,1
05:15:14 Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 xsaves clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities
05:15:14
05:15:14
05:15:14 ---> nproc:
05:15:14 2
05:15:14
05:15:14
05:15:14 ---> df -h:
05:15:14 Filesystem Size Used Avail Use% Mounted on
05:15:14 devtmpfs 3.8G 0 3.8G 0% /dev
05:15:14 tmpfs 3.8G 0 3.8G 0% /dev/shm
05:15:14 tmpfs 3.8G 17M 3.8G 1% /run
05:15:14 tmpfs 3.8G 0 3.8G 0% /sys/fs/cgroup
05:15:14 /dev/vda1 40G 8.4G 32G 21% /
05:15:14 tmpfs 770M 0 770M 0% /run/user/1001
05:15:14
05:15:14
05:15:14 ---> free -m:
05:15:14 total used free shared buff/cache available
05:15:14 Mem: 7697 657 4755 19 2284 6741
05:15:14 Swap: 1023 0 1023
05:15:14
05:15:14
05:15:14 ---> ip addr:
05:15:14 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
05:15:14 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
05:15:14 inet 127.0.0.1/8 scope host lo
05:15:14 valid_lft forever preferred_lft forever
05:15:14 inet6 ::1/128 scope host
05:15:14 valid_lft forever preferred_lft forever
05:15:14 2: eth0: mtu 1458 qdisc mq state UP group default qlen 1000
05:15:14 link/ether fa:16:3e:32:33:2a brd ff:ff:ff:ff:ff:ff
05:15:14 altname enp0s3
05:15:14 altname ens3
05:15:14 inet 10.30.171.4/23 brd 10.30.171.255 scope global dynamic noprefixroute eth0
05:15:14 valid_lft 84589sec preferred_lft 84589sec
05:15:14 inet6 fe80::f816:3eff:fe32:332a/64 scope link
05:15:14 valid_lft forever preferred_lft forever
05:15:14
05:15:14
05:15:14 ---> sar -b -r -n DEV:
05:15:14 Linux 4.18.0-553.5.1.el8.x86_64 (centos-stream-8-robot-7d7a37eb-bc14-4dd6-9530-dc22c5eae738.noval) 12/02/2025 _x86_64_ (2 CPU)
05:15:14
05:15:14 04:44:59 LINUX RESTART (2 CPU)
05:15:14
05:15:14 04:46:01 AM tps rtps wtps bread/s bwrtn/s
05:15:14 04:47:01 AM 113.16 0.30 112.86 14.00 10195.37
05:15:14 04:48:01 AM 47.50 0.85 46.65 63.98 6640.94
05:15:14 04:49:01 AM 32.79 0.42 32.38 64.79 2132.46
05:15:14 04:50:01 AM 69.84 6.96 62.88 1292.77 7692.62
05:15:14 04:51:01 AM 11.06 0.00 11.06 0.00 694.69
05:15:14 04:52:01 AM 2.38 0.00 2.38 0.00 68.04
05:15:14 04:53:01 AM 0.13 0.00 0.13 0.00 1.68
05:15:14 04:54:01 AM 0.62 0.25 0.37 13.60 95.33
05:15:14 04:55:01 AM 2.18 0.00 2.18 0.00 233.09
05:15:14 04:56:01 AM 0.63 0.00 0.63 0.00 190.37
05:15:14 04:57:01 AM 0.37 0.00 0.37 0.00 150.16
05:15:14 04:58:01 AM 0.20 0.00 0.20 0.00 28.20
05:15:14 04:59:01 AM 0.15 0.00 0.15 0.00 42.15
05:15:14 05:00:00 AM 0.13 0.00 0.13 0.00 16.60
05:15:14 05:00:01 AM 7.46 2.99 4.48 23.88 202.99
05:15:14 05:01:01 AM 1.38 0.02 1.37 0.13 39.15
05:15:14 05:02:01 AM 0.38 0.07 0.32 1.73 22.73
05:15:14 05:03:01 AM 0.53 0.00 0.53 0.00 204.82
05:15:14 05:04:01 AM 0.28 0.00 0.28 0.00 134.37
05:15:14 05:05:01 AM 0.33 0.00 0.33 0.00 158.81
05:15:14 05:06:01 AM 0.85 0.32 0.53 6.93 210.45
05:15:14 05:07:01 AM 0.73 0.00 0.73 0.00 21.11
05:15:14 05:08:01 AM 0.27 0.00 0.27 0.00 87.87
05:15:14 05:09:01 AM 0.25 0.00 0.25 0.00 16.03
05:15:14 05:10:01 AM 0.12 0.00 0.12 0.00 9.07
05:15:14 05:11:01 AM 0.37 0.00 0.37 0.00 26.14
05:15:14 05:12:01 AM 0.43 0.00 0.43 0.00 97.68
05:15:14 05:13:01 AM 0.47 0.00 0.47 0.00 49.13
05:15:14 05:14:01 AM 7.23 0.32 6.92 36.93 367.61
05:15:14 05:15:01 AM 34.62 0.52 34.11 63.85 3448.62
05:15:14 Average: 11.36 0.35 11.02 53.76 1140.60
05:15:14
05:15:14 04:46:01 AM kbmemfree kbavail kbmemused %memused kbbuffers kbcached kbcommit %commit kbactive kbinact kbdirty
05:15:14 04:47:01 AM 5242060 7041444 2640364 33.50 2688 1976208 679724 7.61 187560 2111192 129772
05:15:14 04:48:01 AM 5191936 7035976 2690488 34.13 2688 2020236 683544 7.65 209560 2132876 9936
05:15:14 04:49:01 AM 5011620 6986104 2870804 36.42 2688 2145276 721304 8.08 234196 2272396 102020
05:15:14 04:50:01 AM 4908904 7008436 2973520 37.72 2688 2267152 687160 7.69 261016 2326084 10852
05:15:14 04:51:01 AM 4913932 7013296 2968492 37.66 2688 2267156 687160 7.69 261020 2325968 8
05:15:14 04:52:01 AM 4913912 7013276 2968512 37.66 2688 2267156 687160 7.69 261020 2325928 4
05:15:14 04:53:01 AM 4914000 7013364 2968424 37.66 2688 2267156 687160 7.69 261020 2325928 4
05:15:14 04:54:01 AM 4856056 6958860 3026368 38.39 2688 2270620 793596 8.89 261344 2383032 436
05:15:14 04:55:01 AM 4846336 6955484 3036088 38.52 2688 2276940 793596 8.89 261344 2392432 1904
05:15:14 04:56:01 AM 4840312 6953340 3042112 38.59 2688 2280836 813736 9.11 261344 2398528 340
05:15:14 04:57:01 AM 4838336 6955452 3044088 38.62 2688 2284908 813124 9.10 261568 2400412 8
05:15:14 04:58:01 AM 4837356 6955524 3045068 38.63 2688 2285960 813124 9.10 261712 2401580 284
05:15:14 04:59:01 AM 4838388 6957616 3044036 38.62 2688 2287028 796800 8.92 261712 2400648 116
05:15:14 05:00:00 AM 4836996 6956608 3045428 38.64 2688 2287412 793444 8.88 261712 2402000 36
05:15:14 05:00:01 AM 4837296 6956920 3045128 38.63 2688 2287412 793444 8.88 261840 2401728 52
05:15:14 05:01:01 AM 4835780 6955736 3046644 38.65 2688 2287744 773904 8.67 261840 2402916 132
05:15:14 05:02:01 AM 4828912 6952852 3053512 38.74 2688 2291728 808332 9.05 262068 2409592 3508
05:15:14 05:03:01 AM 4820632 6947064 3061792 38.84 2688 2294220 824780 9.24 262068 2417580 40
05:15:14 05:04:01 AM 4813888 6945560 3068536 38.93 2688 2299468 824780 9.24 262068 2424252 1320
05:15:14 05:05:01 AM 4810888 6946000 3071536 38.97 2688 2302952 824780 9.24 262068 2427256 120
05:15:14 05:06:01 AM 4804584 6945756 3077840 39.05 2688 2309076 824780 9.24 262068 2433380 4
05:15:14 05:07:01 AM 4804768 6946280 3077656 39.04 2688 2309300 824780 9.24 262192 2433552 56
05:15:14 05:08:01 AM 4804076 6948124 3078348 39.05 2688 2311836 808296 9.05 262192 2434180 40
05:15:14 05:09:01 AM 4803604 6948048 3078820 39.06 2688 2312232 808296 9.05 262192 2434576 24
05:15:14 05:10:01 AM 4804492 6949416 3077932 39.05 2688 2312712 808440 9.05 262192 2433904 264
05:15:14 05:11:01 AM 4804160 6949540 3078264 39.05 2688 2313168 788968 8.83 262192 2434184 124
05:15:14 05:12:01 AM 4798188 6946392 3084236 39.13 2688 2316000 853648 9.56 262340 2439360 260
05:15:14 05:13:01 AM 4794736 6944064 3087688 39.17 2688 2317144 871628 9.76 262340 2442688 84
05:15:14 05:14:01 AM 4822828 6888880 3059596 38.82 2688 2237720 787444 8.82 344092 2350044 107980
05:15:14 05:15:01 AM 4928744 6960064 2953680 37.47 2688 2204704 726900 8.14 510660 2084896 25104
05:15:14 Average: 4870257 6964516 3012167 38.21 2688 2263049 780128 8.74 267685 2366770 13161
05:15:14
05:15:14 04:46:01 AM IFACE rxpck/s txpck/s rxkB/s txkB/s rxcmp/s txcmp/s rxmcst/s %ifutil
05:15:14 04:47:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 04:47:01 AM eth0 82.05 68.77 889.55 9.73 0.00 0.00 0.00 0.00
05:15:14 04:48:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 04:48:01 AM eth0 58.33 40.50 582.92 5.70 0.00 0.00 0.00 0.00
05:15:14 04:49:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 04:49:01 AM eth0 45.01 40.68 329.77 8.64 0.00 0.00 0.00 0.00
05:15:14 04:50:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 04:50:01 AM eth0 587.40 413.50 126.81 99.26 0.00 0.00 0.00 0.00
05:15:14 04:51:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 04:51:01 AM eth0 2.78 2.03 0.67 0.49 0.00 0.00 0.00 0.00
05:15:14 04:52:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 04:52:01 AM eth0 1.22 0.72 0.17 0.12 0.00 0.00 0.00 0.00
05:15:14 04:53:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 04:53:01 AM eth0 2.33 1.87 0.43 0.41 0.00 0.00 0.00 0.00
05:15:14 04:54:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 04:54:01 AM eth0 53.86 39.04 21.85 4.62 0.00 0.00 0.00 0.00
05:15:14 04:55:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 04:55:01 AM eth0 30.91 31.07 35.83 4.32 0.00 0.00 0.00 0.00
05:15:14 04:56:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 04:56:01 AM eth0 19.28 17.70 18.16 2.87 0.00 0.00 0.00 0.00
05:15:14 04:57:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 04:57:01 AM eth0 90.02 86.29 125.90 7.57 0.00 0.00 0.00 0.00
05:15:14 04:58:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 04:58:01 AM eth0 52.66 51.31 10.96 4.34 0.00 0.00 0.00 0.00
05:15:14 04:59:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 04:59:01 AM eth0 53.80 52.07 11.52 4.95 0.00 0.00 0.00 0.00
05:15:14 05:00:00 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 05:00:00 AM eth0 11.76 9.23 2.89 1.82 0.00 0.00 0.00 0.00
05:15:14 05:00:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 05:00:01 AM eth0 13.43 14.93 3.52 4.50 0.00 0.00 0.00 0.00
05:15:14 05:01:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 05:01:01 AM eth0 9.00 7.30 2.50 1.63 0.00 0.00 0.00 0.00
05:15:14 05:02:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 05:02:01 AM eth0 72.50 71.77 15.10 14.26 0.00 0.00 0.00 0.00
05:15:14 05:03:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 05:03:01 AM eth0 66.62 66.14 27.44 5.33 0.00 0.00 0.00 0.00
05:15:14 05:04:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 05:04:01 AM eth0 120.16 120.86 69.47 9.07 0.00 0.00 0.00 0.00
05:15:14 05:05:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 05:05:01 AM eth0 84.22 83.07 49.47 6.12 0.00 0.00 0.00 0.00
05:15:14 05:06:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 05:06:01 AM eth0 98.73 96.08 59.97 7.86 0.00 0.00 0.00 0.00
05:15:14 05:07:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 05:07:01 AM eth0 20.41 20.16 3.17 1.41 0.00 0.00 0.00 0.00
05:15:14 05:08:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 05:08:01 AM eth0 74.60 74.15 29.24 6.35 0.00 0.00 0.00 0.00
05:15:14 05:09:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 05:09:01 AM eth0 12.63 10.76 3.19 2.20 0.00 0.00 0.00 0.00
05:15:14 05:10:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 05:10:01 AM eth0 13.26 11.50 3.31 2.32 0.00 0.00 0.00 0.00
05:15:14 05:11:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 05:11:01 AM eth0 12.13 11.46 4.11 1.24 0.00 0.00 0.00 0.00
05:15:14 05:12:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 05:12:01 AM eth0 115.05 101.92 35.94 8.90 0.00 0.00 0.00 0.00
05:15:14 05:13:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 05:13:01 AM eth0 29.81 25.69 5.54 2.65 0.00 0.00 0.00 0.00
05:15:14 05:14:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 05:14:01 AM eth0 125.45 76.07 201.73 146.28 0.00 0.00 0.00 0.00
05:15:14 05:15:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 05:15:01 AM eth0 17.23 16.23 14.74 7.92 0.00 0.00 0.00 0.00
05:15:14 Average: lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
05:15:14 Average: eth0 67.70 56.83 92.49 13.05 0.00 0.00 0.00 0.00
05:15:14
05:15:14
05:15:14 ---> sar -P ALL:
05:15:14 Linux 4.18.0-553.5.1.el8.x86_64 (centos-stream-8-robot-7d7a37eb-bc14-4dd6-9530-dc22c5eae738.noval) 12/02/2025 _x86_64_ (2 CPU)
05:15:14
05:15:14 04:44:59 LINUX RESTART (2 CPU)
05:15:14
05:15:14 04:46:01 AM CPU %user %nice %system %iowait %steal %idle
05:15:14 04:47:01 AM all 30.13 0.00 4.79 1.45 0.09 63.53
05:15:14 04:47:01 AM 0 24.81 0.00 4.04 1.80 0.10 69.25
05:15:14 04:47:01 AM 1 35.47 0.00 5.55 1.10 0.08 57.80
05:15:14 04:48:01 AM all 15.04 0.00 1.98 0.83 0.10 82.05
05:15:14 04:48:01 AM 0 4.27 0.00 1.62 1.13 0.10 92.88
05:15:14 04:48:01 AM 1 25.82 0.00 2.34 0.53 0.10 71.21
05:15:14 04:49:01 AM all 24.96 0.00 3.29 0.26 0.08 71.41
05:15:14 04:49:01 AM 0 20.97 0.00 2.92 0.30 0.08 75.73
05:15:14 04:49:01 AM 1 28.96 0.00 3.67 0.22 0.07 67.08
05:15:14 04:50:01 AM all 22.92 0.00 5.23 0.95 0.08 70.81
05:15:14 04:50:01 AM 0 15.01 0.00 4.75 0.90 0.08 79.26
05:15:14 04:50:01 AM 1 30.85 0.00 5.71 1.00 0.08 62.36
05:15:14 04:51:01 AM all 0.17 0.00 0.13 0.07 0.04 99.60
05:15:14 04:51:01 AM 0 0.07 0.00 0.13 0.12 0.03 99.65
05:15:14 04:51:01 AM 1 0.27 0.00 0.12 0.02 0.05 99.55
05:15:14 04:52:01 AM all 0.33 0.00 0.07 0.02 0.05 99.54
05:15:14 04:52:01 AM 0 0.07 0.00 0.07 0.03 0.05 99.78
05:15:14 04:52:01 AM 1 0.58 0.00 0.07 0.00 0.05 99.30
05:15:14 04:53:01 AM all 0.33 0.00 0.07 0.00 0.05 99.55
05:15:14 04:53:01 AM 0 0.07 0.00 0.08 0.00 0.03 99.82
05:15:14 04:53:01 AM 1 0.60 0.00 0.05 0.00 0.07 99.28
05:15:14 04:54:01 AM all 7.67 0.00 0.80 0.01 0.08 91.44
05:15:14 04:54:01 AM 0 4.64 0.00 0.45 0.02 0.08 94.81
05:15:14 04:54:01 AM 1 10.70 0.00 1.15 0.00 0.08 88.07
05:15:14 04:55:01 AM all 11.12 0.00 0.74 0.00 0.08 88.05
05:15:14 04:55:01 AM 0 5.50 0.00 0.25 0.00 0.08 94.17
05:15:14 04:55:01 AM 1 16.75 0.00 1.24 0.00 0.08 81.93
05:15:14 04:56:01 AM all 8.05 0.00 0.28 0.03 0.08 91.55
05:15:14 04:56:01 AM 0 3.51 0.00 0.22 0.05 0.08 96.13
05:15:14 04:56:01 AM 1 12.59 0.00 0.35 0.00 0.08 86.98
05:15:14 04:57:01 AM all 11.03 0.00 0.91 0.00 0.08 87.98
05:15:14 04:57:01 AM 0 4.23 0.00 0.38 0.00 0.07 95.32
05:15:14 04:57:01 AM 1 17.96 0.00 1.45 0.00 0.09 80.51
05:15:14
05:15:14 04:57:01 AM CPU %user %nice %system %iowait %steal %idle
05:15:14 04:58:01 AM all 5.74 0.00 0.49 0.00 0.08 93.69
05:15:14 04:58:01 AM 0 5.94 0.00 0.63 0.00 0.07 93.36
05:15:14 04:58:01 AM 1 5.54 0.00 0.35 0.00 0.08 94.02
05:15:14 04:59:01 AM all 6.37 0.00 0.54 0.00 0.08 93.01
05:15:14 04:59:01 AM 0 5.65 0.00 0.55 0.00 0.10 93.70
05:15:14 04:59:01 AM 1 7.09 0.00 0.52 0.00 0.07 92.32
05:15:14 05:00:00 AM all 4.37 0.00 0.32 0.00 0.08 95.24
05:15:14 05:00:00 AM 0 2.94 0.00 0.24 0.00 0.07 96.76
05:15:14 05:00:00 AM 1 5.79 0.00 0.41 0.00 0.08 93.72
05:15:14 05:00:01 AM all 16.42 0.00 11.94 0.00 0.00 71.64
05:15:14 05:00:01 AM 0 22.39 0.00 4.48 0.00 0.00 73.13
05:15:14 05:00:01 AM 1 10.45 0.00 19.40 0.00 0.00 70.15
05:15:14 05:01:01 AM all 3.28 0.00 0.28 0.00 0.08 96.35
05:15:14 05:01:01 AM 0 2.39 0.00 0.25 0.00 0.10 97.26
05:15:14 05:01:01 AM 1 4.18 0.00 0.32 0.00 0.07 95.44
05:15:14 05:02:01 AM all 11.38 0.00 0.81 0.01 0.07 87.73
05:15:14 05:02:01 AM 0 8.67 0.00 0.94 0.00 0.07 90.32
05:15:14 05:02:01 AM 1 14.07 0.00 0.68 0.02 0.07 85.16
05:15:14 05:03:01 AM all 5.12 0.00 0.30 0.03 0.09 94.46
05:15:14 05:03:01 AM 0 6.00 0.00 0.28 0.05 0.10 93.56
05:15:14 05:03:01 AM 1 4.24 0.00 0.32 0.00 0.08 95.36
05:15:14 05:04:01 AM all 6.20 0.00 0.64 0.01 0.10 93.05
05:15:14 05:04:01 AM 0 4.71 0.00 0.67 0.00 0.10 94.52
05:15:14 05:04:01 AM 1 7.69 0.00 0.60 0.02 0.10 91.59
05:15:14 05:05:01 AM all 3.14 0.00 0.33 0.01 0.09 96.43
05:15:14 05:05:01 AM 0 1.39 0.00 0.35 0.00 0.08 98.17
05:15:14 05:05:01 AM 1 4.90 0.00 0.30 0.02 0.10 94.68
05:15:14 05:06:01 AM all 9.70 0.00 0.94 0.02 0.09 89.25
05:15:14 05:06:01 AM 0 8.24 0.00 0.52 0.02 0.08 91.14
05:15:14 05:06:01 AM 1 11.16 0.00 1.36 0.02 0.10 87.36
05:15:14 05:07:01 AM all 1.24 0.00 0.13 0.01 0.08 98.54
05:15:14 05:07:01 AM 0 1.26 0.00 0.12 0.00 0.08 98.54
05:15:14 05:07:01 AM 1 1.22 0.00 0.15 0.02 0.08 98.53
05:15:14
05:15:14 05:07:01 AM CPU %user %nice %system %iowait %steal %idle
05:15:14 05:08:01 AM all 5.83 0.00 0.55 0.00 0.10 93.52
05:15:14 05:08:01 AM 0 3.26 0.00 0.47 0.00 0.10 96.17
05:15:14 05:08:01 AM 1 8.40 0.00 0.64 0.00 0.10 90.86
05:15:14 05:09:01 AM all 4.35 0.00 0.16 0.00 0.08 95.41
05:15:14 05:09:01 AM 0 2.66 0.00 0.15 0.00 0.07 97.13
05:15:14 05:09:01 AM 1 6.03 0.00 0.17 0.00 0.10 93.70
05:15:14 05:10:01 AM all 4.83 0.00 0.26 0.00 0.10 94.81
05:15:14 05:10:01 AM 0 2.29 0.00 0.20 0.00 0.08 97.43
05:15:14 05:10:01 AM 1 7.36 0.00 0.32 0.00 0.12 92.20
05:15:14 05:11:01 AM all 5.02 0.00 0.44 0.00 0.08 94.46
05:15:14 05:11:01 AM 0 8.36 0.00 0.56 0.00 0.07 91.02
05:15:14 05:11:01 AM 1 1.82 0.00 0.33 0.00 0.08 97.76
05:15:14 05:12:01 AM all 7.33 0.00 1.14 0.00 0.10 91.43
05:15:14 05:12:01 AM 0 6.73 0.00 1.23 0.00 0.10 91.94
05:15:14 05:12:01 AM 1 7.93 0.00 1.06 0.00 0.10 90.91
05:15:14 05:13:01 AM all 4.70 0.00 0.44 0.00 0.09 94.77
05:15:14 05:13:01 AM 0 3.25 0.00 0.40 0.00 0.08 96.27
05:15:14 05:13:01 AM 1 6.16 0.00 0.47 0.00 0.10 93.27
05:15:14 05:14:01 AM all 18.81 0.00 2.01 0.05 0.09 79.04
05:15:14 05:14:01 AM 0 12.83 0.00 1.67 0.05 0.08 85.36
05:15:14 05:14:01 AM 1 24.79 0.00 2.34 0.05 0.10 72.72
05:15:14 05:15:01 AM all 32.89 0.00 4.29 0.24 0.08 62.49
05:15:14 05:15:01 AM 0 32.47 0.00 4.33 0.18 0.08 62.93
05:15:14 05:15:01 AM 1 33.32 0.00 4.25 0.30 0.08 62.05
05:15:14 Average: all 9.40 0.00 1.12 0.14 0.08 89.26
05:15:14 Average: 0 6.98 0.00 0.99 0.16 0.08 91.79
05:15:14 Average: 1 11.80 0.00 1.26 0.11 0.08 86.74
05:15:14
05:15:14
05:15:14
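Editor's note: the "--->" sections above are the job's post-run host-diagnostics dump (uname, lscpu, nproc, df, free, ip addr, and the sysstat sar reports); the sar sections only contain data because the sysstat collector was running on the minion during the job. Below is a minimal bash sketch of how a comparable dump could be produced on a similar host; the output path and the exact command list are illustrative assumptions, not taken from the job's own scripts.

    #!/bin/bash
    # Hypothetical helper (not part of the actual job scripts): write the same
    # style of host-diagnostics dump seen above, one "--->" section per command.
    OUT="${1:-/tmp/host-diagnostics.txt}"

    {
        for cmd in "uname -a" "lscpu" "nproc" "df -h" "free -m" "ip addr" \
                   "sar -b -r -n DEV" "sar -P ALL"; do
            echo
            echo "---> ${cmd}:"
            # ${cmd} is intentionally unquoted so the command word-splits into
            # its arguments; keep going if one command is unavailable (e.g.
            # sysstat not installed), so a single failure does not abort the dump.
            ${cmd} 2>&1 || echo "WARN: '${cmd}' failed or is not available"
        done
    } > "${OUT}"

    echo "diagnostics written to ${OUT}"

Usage would be, for example, "./collect-diagnostics.sh /tmp/diag.txt"; the resulting file mirrors the section layout above and could be archived alongside the job logs.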