Started by upstream project "integration-distribution-test-calcium" build number 116 originally caused by: Started by upstream project "autorelease-release-calcium-mvn38-openjdk17" build number 133 originally caused by: Started by timer Running as SYSTEM [EnvInject] - Loading node environment variables. Building remotely on prd-centos8-robot-2c-8g-19201 (centos8-robot-2c-8g) in workspace /w/workspace/daexim-csit-3node-clustering-basic-only-calcium [ssh-agent] Looking for ssh-agent implementation... [ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine) $ ssh-agent SSH_AUTH_SOCK=/tmp/ssh-CJIIQU0BkY68/agent.5335 SSH_AGENT_PID=5336 [ssh-agent] Started. Running ssh-add (command line suppressed) Identity added: /w/workspace/daexim-csit-3node-clustering-basic-only-calcium@tmp/private_key_17854408541352533432.key (/w/workspace/daexim-csit-3node-clustering-basic-only-calcium@tmp/private_key_17854408541352533432.key) [ssh-agent] Using credentials jenkins (Release Engineering Jenkins Key) The recommended git tool is: NONE using credential opendaylight-jenkins-ssh Wiping out workspace first. Cloning the remote Git repository Cloning repository git://devvexx.opendaylight.org/mirror/integration/test > git init /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/test # timeout=10 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/integration/test > git --version # timeout=10 > git --version # 'git version 2.43.0' using GIT_SSH to set credentials Release Engineering Jenkins Key [INFO] Currently running in a labeled security context [INFO] Currently SELinux is 'enforcing' on the host > /usr/bin/chcon --type=ssh_home_t /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/test@tmp/jenkins-gitclient-ssh10838130589826818951.key Verifying host key using known hosts file You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Configure Global Security' -> 'Git Host Key Verification Configuration' and configure host key verification. > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/integration/test +refs/heads/*:refs/remotes/origin/* # timeout=10 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/integration/test # timeout=10 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/integration/test # timeout=10 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/integration/test using GIT_SSH to set credentials Release Engineering Jenkins Key [INFO] Currently running in a labeled security context [INFO] Currently SELinux is 'enforcing' on the host > /usr/bin/chcon --type=ssh_home_t /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/test@tmp/jenkins-gitclient-ssh1309576845478873226.key Verifying host key using known hosts file You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Configure Global Security' -> 'Git Host Key Verification Configuration' and configure host key verification. 
> git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/integration/test master # timeout=10 > git rev-parse FETCH_HEAD^{commit} # timeout=10 Checking out Revision 2778f151c668b610353ea1551d4ec3a610c3215c (origin/master) > git config core.sparsecheckout # timeout=10 > git checkout -f 2778f151c668b610353ea1551d4ec3a610c3215c # timeout=10 Commit message: "Remove code for backuprestore" > git rev-parse FETCH_HEAD^{commit} # timeout=10 > git rev-list --no-walk d56af8a400b69f8c1a646c8241a7e4ed6f9c3c13 # timeout=10 No emails were triggered. provisioning config files... copy managed file [npmrc] to file:/home/jenkins/.npmrc copy managed file [pipconf] to file:/home/jenkins/.config/pip/pip.conf copy managed file [clouds-yaml] to file:/home/jenkins/.config/openstack/clouds.yaml [daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash /tmp/jenkins14038970922447574925.sh ---> python-tools-install.sh Setup pyenv: system * 3.8.13 (set by /opt/pyenv/version) * 3.9.13 (set by /opt/pyenv/version) * 3.10.6 (set by /opt/pyenv/version) lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-SlGu lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv lf-activate-venv(): INFO: Installing: lftools lf-activate-venv(): INFO: Adding /tmp/venv-SlGu/bin to PATH Generating Requirements File Python 3.10.6 pip 23.3.2 from /tmp/venv-SlGu/lib/python3.10/site-packages/pip (python 3.10) appdirs==1.4.4 argcomplete==3.2.1 aspy.yaml==1.3.0 attrs==23.2.0 autopage==0.5.2 beautifulsoup4==4.12.3 boto3==1.34.25 botocore==1.34.25 bs4==0.0.2 cachetools==5.3.2 certifi==2023.11.17 cffi==1.16.0 cfgv==3.4.0 chardet==5.2.0 charset-normalizer==3.3.2 click==8.1.7 cliff==4.5.0 cmd2==2.4.3 cryptography==3.3.2 debtcollector==2.5.0 decorator==5.1.1 defusedxml==0.7.1 Deprecated==1.2.14 distlib==0.3.8 dnspython==2.5.0 docker==4.2.2 dogpile.cache==1.3.0 email-validator==2.1.0.post1 filelock==3.13.1 future==0.18.3 gitdb==4.0.11 GitPython==3.1.41 google-auth==2.26.2 httplib2==0.22.0 identify==2.5.33 idna==3.6 importlib-resources==1.5.0 iso8601==2.1.0 Jinja2==3.1.3 jmespath==1.0.1 jsonpatch==1.33 jsonpointer==2.4 jsonschema==4.21.1 jsonschema-specifications==2023.12.1 keystoneauth1==5.5.0 kubernetes==29.0.0 lftools==0.37.8 lxml==5.1.0 MarkupSafe==2.1.4 msgpack==1.0.7 multi_key_dict==2.0.3 munch==4.0.0 netaddr==0.10.1 netifaces==0.11.0 niet==1.4.2 nodeenv==1.8.0 oauth2client==4.1.3 oauthlib==3.2.2 openstacksdk==0.62.0 os-client-config==2.1.0 os-service-types==1.7.0 osc-lib==3.0.0 oslo.config==9.3.0 oslo.context==5.3.0 oslo.i18n==6.2.0 oslo.log==5.4.0 oslo.serialization==5.3.0 oslo.utils==7.0.0 packaging==23.2 pbr==6.0.0 platformdirs==4.1.0 prettytable==3.9.0 pyasn1==0.5.1 pyasn1-modules==0.3.0 pycparser==2.21 pygerrit2==2.0.15 PyGithub==2.1.1 pyinotify==0.9.6 PyJWT==2.8.0 PyNaCl==1.5.0 pyparsing==2.4.7 pyperclip==1.8.2 pyrsistent==0.20.0 python-cinderclient==9.4.0 python-dateutil==2.8.2 python-heatclient==3.4.0 python-jenkins==1.8.2 python-keystoneclient==5.3.0 python-magnumclient==4.3.0 python-novaclient==18.4.0 python-openstackclient==6.0.0 python-swiftclient==4.4.0 pytz==2023.3.post1 PyYAML==6.0.1 referencing==0.32.1 requests==2.31.0 requests-oauthlib==1.3.1 requestsexceptions==1.4.0 rfc3986==2.0.0 rpds-py==0.17.1 rsa==4.9 ruamel.yaml==0.18.5 ruamel.yaml.clib==0.2.8 s3transfer==0.10.0 simplejson==3.19.2 six==1.16.0 smmap==5.0.1 soupsieve==2.5 stevedore==5.1.0 tabulate==0.9.0 toml==0.10.2 tomlkit==0.12.3 tqdm==4.66.1 typing_extensions==4.9.0 tzdata==2023.4 urllib3==1.26.18 virtualenv==20.25.0 wcwidth==0.2.13 
websocket-client==1.7.0 wrapt==1.16.0 xdg==6.0.0 xmltodict==0.13.0 yq==3.2.3 [EnvInject] - Injecting environment variables from a build step. [EnvInject] - Injecting as environment variables the properties content OS_STACK_TEMPLATE=csit-2-instance-type.yaml OS_CLOUD=vex OS_STACK_NAME=releng-daexim-csit-3node-clustering-basic-only-calcium-116 OS_STACK_TEMPLATE_DIR=openstack-hot [EnvInject] - Variables injected successfully. provisioning config files... copy managed file [clouds-yaml] to file:/home/jenkins/.config/openstack/clouds.yaml [daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash /tmp/jenkins8020072306455074751.sh ---> Create parameters file for OpenStack HOT OpenStack Heat parameters generated ----------------------------------- parameters: vm_0_count: '3' vm_0_flavor: 'v3-standard-4' vm_0_image: 'ZZCI - CentOS Stream 8 - builder - x86_64 - 20240117-011746.201' vm_1_count: '0' vm_1_flavor: 'v3-standard-2' vm_1_image: 'ZZCI - Ubuntu 18.04 - mininet-ovs-28 - x86_64 - 20230601-180106.003' job_name: '16269-116' silo: 'releng' [daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash -l /tmp/jenkins9102049018691934413.sh ---> Create HEAT stack + source /home/jenkins/lf-env.sh + lf-activate-venv --python python3 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient yq ++ mktemp -d /tmp/venv-XXXX + lf_venv=/tmp/venv-wgVn + local venv_file=/tmp/.os_lf_venv + local python=python3 + local options + local set_path=true + local install_args= ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --python python3 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient yq + options=' --python '\''python3'\'' -- '\''lftools[openstack]'\'' '\''kubernetes'\'' '\''niet'\'' '\''python-heatclient'\'' '\''python-openstackclient'\'' '\''python-magnumclient'\'' '\''yq'\''' + eval set -- ' --python '\''python3'\'' -- '\''lftools[openstack]'\'' '\''kubernetes'\'' '\''niet'\'' '\''python-heatclient'\'' '\''python-openstackclient'\'' '\''python-magnumclient'\'' '\''yq'\''' ++ set -- --python python3 -- 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient yq + true + case $1 in + python=python3 + shift 2 + true + case $1 in + shift + break + case $python in + local pkg_list= + [[ -d /opt/pyenv ]] + echo 'Setup pyenv:' Setup pyenv: + export PYENV_ROOT=/opt/pyenv + PYENV_ROOT=/opt/pyenv + export PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin + PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin + pyenv versions system 3.8.13 3.9.13 * 3.10.6 (set by /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/.python-version) + command -v pyenv ++ pyenv init - --no-rehash + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH); for i in ${!paths[@]}; do if [[ ${paths[i]} == "'\''/opt/pyenv/shims'\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\''; fi; done; echo "${paths[*]}"'\'')" export PATH="/opt/pyenv/shims:${PATH}" export PYENV_SHELL=bash source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\'' pyenv() { local command command="${1:-}" if [ "$#" -gt 0 ]; then shift fi case "$command" in rehash|shell) eval "$(pyenv "sh-$command" "$@")" ;; *) command pyenv "$command" "$@" ;; esac }' +++ bash --norc -ec 'IFS=:; paths=($PATH); for i in ${!paths[@]}; do if [[ 
${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\''; fi; done; echo "${paths[*]}"' ++ PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin ++ export PYENV_SHELL=bash ++ PYENV_SHELL=bash ++ source /opt/pyenv/libexec/../completions/pyenv.bash +++ complete -F _pyenv pyenv ++ lf-pyver python3 ++ local py_version_xy=python3 ++ local py_version_xyz= ++ pyenv versions ++ local command ++ command=versions ++ '[' 1 -gt 0 ']' ++ shift ++ case "$command" in ++ command pyenv versions ++ pyenv versions ++ grep -E '^[0-9.]*[0-9]$' ++ sed 's/^[ *]* //' ++ awk '{ print $1 }' ++ [[ ! -s /tmp/.pyenv_versions ]] +++ grep '^3' /tmp/.pyenv_versions +++ sort -V +++ tail -n 1 ++ py_version_xyz=3.10.6 ++ [[ -z 3.10.6 ]] ++ echo 3.10.6 ++ return 0 + pyenv local 3.10.6 + local command + command=local + '[' 2 -gt 0 ']' + shift + case "$command" in + command pyenv local 3.10.6 + pyenv local 3.10.6 + for arg in "$@" + case $arg in + pkg_list+='lftools[openstack] ' + for arg in "$@" + case $arg in + pkg_list+='kubernetes ' + for arg in "$@" + case $arg in + pkg_list+='niet ' + for arg in "$@" + case $arg in + pkg_list+='python-heatclient ' + for arg in "$@" + case $arg in + pkg_list+='python-openstackclient ' + for arg in "$@" + case $arg in + pkg_list+='python-magnumclient ' + for arg in "$@" + case $arg in + pkg_list+='yq ' + [[ -f /tmp/.os_lf_venv ]] ++ cat /tmp/.os_lf_venv + lf_venv=/tmp/venv-SlGu + echo 'lf-activate-venv(): INFO: Reuse venv:/tmp/venv-SlGu from' file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Reuse venv:/tmp/venv-SlGu from file:/tmp/.os_lf_venv + /tmp/venv-SlGu/bin/python3 -m pip install --upgrade --quiet pip virtualenv + [[ -z lftools[openstack] kubernetes niet python-heatclient python-openstackclient python-magnumclient yq ]] + echo 'lf-activate-venv(): INFO: Installing: lftools[openstack] kubernetes niet python-heatclient python-openstackclient python-magnumclient yq ' lf-activate-venv(): INFO: Installing: lftools[openstack] kubernetes niet python-heatclient python-openstackclient python-magnumclient yq + /tmp/venv-SlGu/bin/python3 -m pip install --upgrade --quiet --upgrade-strategy eager 'lftools[openstack]' kubernetes niet python-heatclient python-openstackclient python-magnumclient yq + type python3 + true + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-SlGu/bin to PATH' lf-activate-venv(): INFO: Adding /tmp/venv-SlGu/bin to PATH + PATH=/tmp/venv-SlGu/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/home/jenkins/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/opt/puppetlabs/bin + return 0 + openstack --os-cloud vex limits show --absolute +--------------------------+---------+ | Name | Value | +--------------------------+---------+ | maxTotalInstances | -1 | | maxTotalCores | 450 | | maxTotalRAMSize | 1000000 | | maxServerMeta | 128 | | maxImageMeta | 128 | | maxPersonality | 5 | | maxPersonalitySize | 10240 | | maxTotalKeypairs | 100 | | maxServerGroups | 10 | | maxServerGroupMembers | 10 | | maxTotalFloatingIps | -1 | | maxSecurityGroups | -1 | | maxSecurityGroupRules | -1 | | totalRAMUsed | 532480 | | totalCoresUsed | 130 | | totalInstancesUsed | 42 | | 
totalFloatingIpsUsed | 0 | | totalSecurityGroupsUsed | 0 | | totalServerGroupsUsed | 0 | | maxTotalVolumes | -1 | | maxTotalSnapshots | 10 | | maxTotalVolumeGigabytes | 4096 | | maxTotalBackups | 10 | | maxTotalBackupGigabytes | 1000 | | totalVolumesUsed | 0 | | totalGigabytesUsed | 0 | | totalSnapshotsUsed | 0 | | totalBackupsUsed | 0 | | totalBackupGigabytesUsed | 0 | +--------------------------+---------+ + pushd /opt/ciman/openstack-hot /opt/ciman/openstack-hot /w/workspace/daexim-csit-3node-clustering-basic-only-calcium + lftools openstack --os-cloud vex stack create releng-daexim-csit-3node-clustering-basic-only-calcium-116 csit-2-instance-type.yaml /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/stack-parameters.yaml Creating stack releng-daexim-csit-3node-clustering-basic-only-calcium-116 Waiting to initialize infrastructure... Stack initialization successful. ------------------------------------ Stack Details ------------------------------------ {'added': None, 'capabilities': [], 'created_at': '2024-01-23T07:29:39Z', 'deleted': None, 'deleted_at': None, 'description': 'No description', 'environment': None, 'environment_files': None, 'files': None, 'files_container': None, 'id': '8786f7b1-8c51-4222-bec1-2ec3998c0e26', 'is_rollback_disabled': True, 'links': [{'href': 'https://orchestration.public.mtl1.vexxhost.net/v1/12c36e260d8e4bb2913965203b1b491f/stacks/releng-daexim-csit-3node-clustering-basic-only-calcium-116/8786f7b1-8c51-4222-bec1-2ec3998c0e26', 'rel': 'self'}], 'location': Munch({'cloud': 'vex', 'region_name': 'ca-ymq-1', 'zone': None, 'project': Munch({'id': '12c36e260d8e4bb2913965203b1b491f', 'name': '61975f2c-7c17-4d69-82fa-c3ae420ad6fd', 'domain_id': None, 'domain_name': 'Default'})}), 'name': 'releng-daexim-csit-3node-clustering-basic-only-calcium-116', 'notification_topics': [], 'outputs': [{'description': 'IP addresses of the 2nd vm types', 'output_key': 'vm_1_ips', 'output_value': []}, {'description': 'IP addresses of the 1st vm types', 'output_key': 'vm_0_ips', 'output_value': ['10.30.170.15', '10.30.170.12', '10.30.170.65']}], 'owner_id': ****, 'parameters': {'OS::project_id': '12c36e260d8e4bb2913965203b1b491f', 'OS::stack_id': '8786f7b1-8c51-4222-bec1-2ec3998c0e26', 'OS::stack_name': 'releng-daexim-csit-3node-clustering-basic-only-calcium-116', 'job_name': '16269-116', 'silo': 'releng', 'vm_0_count': '3', 'vm_0_flavor': 'v3-standard-4', 'vm_0_image': 'ZZCI - CentOS Stream 8 - builder - x86_64 - ' '20240117-011746.201', 'vm_1_count': '0', 'vm_1_flavor': 'v3-standard-2', 'vm_1_image': 'ZZCI - Ubuntu 18.04 - mininet-ovs-28 - x86_64 - ' '20230601-180106.003'}, 'parent_id': None, 'replaced': None, 'status': 'CREATE_COMPLETE', 'status_reason': 'Stack CREATE completed successfully', 'tags': [], 'template': None, 'template_description': 'No description', 'template_url': None, 'timeout_mins': 15, 'unchanged': None, 'updated': None, 'updated_at': None, 'user_project_id': '3646041ec7fc46cba27b55432e61e0ee'} ------------------------------------ + popd /w/workspace/daexim-csit-3node-clustering-basic-only-calcium [daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash -l /tmp/jenkins15948417246994303127.sh ---> Copy SSH public keys to CSIT lab Setup pyenv: system 3.8.13 3.9.13 * 3.10.6 (set by /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/.python-version) lf-activate-venv(): INFO: Reuse venv:/tmp/venv-SlGu from file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Installing: lftools[openstack] kubernetes python-heatclient python-openstackclient 
lf-activate-venv(): INFO: Adding /tmp/venv-SlGu/bin to PATH SSH not responding on 10.30.170.15. Retrying in 10 seconds... SSH not responding on 10.30.170.12. Retrying in 10 seconds... SSH not responding on 10.30.170.65. Retrying in 10 seconds... Ping to 10.30.170.15 successful. Ping to 10.30.170.12 successful. Ping to 10.30.170.65 successful. SSH not responding on 10.30.170.12. Retrying in 10 seconds... SSH not responding on 10.30.170.65. Retrying in 10 seconds... SSH not responding on 10.30.170.15. Retrying in 10 seconds... Ping to 10.30.170.12 successful. Ping to 10.30.170.65 successful. Ping to 10.30.170.15 successful. SSH not responding on 10.30.170.12. Retrying in 10 seconds... SSH not responding on 10.30.170.65. Retrying in 10 seconds... SSH not responding on 10.30.170.15. Retrying in 10 seconds... Ping to 10.30.170.12 successful. Ping to 10.30.170.65 successful. Ping to 10.30.170.15 successful. SSH not responding on 10.30.170.15. Retrying in 10 seconds... Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. 10.30.170.12 releng-16269-116-0-builder-1.novalocal Successfully copied public keys to slave 10.30.170.12 10.30.170.65 releng-16269-116-0-builder-2.novalocal Successfully copied public keys to slave 10.30.170.65 Ping to 10.30.170.15 successful. Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. 10.30.170.15 releng-16269-116-0-builder-0.novalocal Successfully copied public keys to slave 10.30.170.15 Process 6470 ready. Process 6471 ready. Process 6472 ready. SSH ready on all stack servers. [daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash -l /tmp/jenkins726344902524819412.sh Setup pyenv: system 3.8.13 3.9.13 * 3.10.6 (set by /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/.python-version) lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-lGmA lf-activate-venv(): INFO: Save venv in file: /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/.robot_venv lf-activate-venv(): INFO: Installing: setuptools wheel lf-activate-venv(): INFO: Adding /tmp/venv-lGmA/bin to PATH + echo 'Installing Python Requirements' Installing Python Requirements + cat + python -m pip install -r requirements.txt Looking in indexes: https://nexus3.opendaylight.org/repository/PyPi/simple Collecting docker-py (from -r requirements.txt (line 1)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/docker-py/1.10.6/docker_py-1.10.6-py2.py3-none-any.whl (50 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 50.0/50.0 kB 591.5 kB/s eta 0:00:00 Collecting ipaddr (from -r requirements.txt (line 2)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/ipaddr/2.2.0/ipaddr-2.2.0.tar.gz (26 kB) Preparing metadata (setup.py): started Preparing metadata (setup.py): finished with status 'done' Collecting netaddr (from -r requirements.txt (line 3)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/netaddr/0.10.1/netaddr-0.10.1-py2.py3-none-any.whl (2.2 MB) Collecting netifaces (from -r requirements.txt (line 4)) Using cached netifaces-0.11.0-cp310-cp310-linux_x86_64.whl Collecting pyhocon (from -r requirements.txt (line 5)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pyhocon/0.3.60/pyhocon-0.3.60.tar.gz (158 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 158.8/158.8 kB 1.4 MB/s eta 0:00:00 Preparing metadata (setup.py): started Preparing metadata (setup.py): finished with status 'done' 
Collecting requests (from -r requirements.txt (line 6)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/requests/2.31.0/requests-2.31.0-py3-none-any.whl (62 kB) Collecting robotframework (from -r requirements.txt (line 7)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework/7.0/robotframework-7.0-py3-none-any.whl (726 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 726.3/726.3 kB 10.4 MB/s eta 0:00:00 Collecting robotframework-httplibrary (from -r requirements.txt (line 8)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-httplibrary/0.4.2/robotframework-httplibrary-0.4.2.tar.gz (9.1 kB) Preparing metadata (setup.py): started Preparing metadata (setup.py): finished with status 'done' Collecting robotframework-requests==0.9.3 (from -r requirements.txt (line 9)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-requests/0.9.3/robotframework_requests-0.9.3-py3-none-any.whl (21 kB) Collecting robotframework-selenium2library (from -r requirements.txt (line 10)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-selenium2library/3.0.0/robotframework_selenium2library-3.0.0-py2.py3-none-any.whl (6.2 kB) Collecting robotframework-sshlibrary==3.8.0 (from -r requirements.txt (line 11)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-sshlibrary/3.8.0/robotframework-sshlibrary-3.8.0.tar.gz (51 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 51.4/51.4 kB 380.7 kB/s eta 0:00:00 Preparing metadata (setup.py): started Preparing metadata (setup.py): finished with status 'done' Collecting scapy (from -r requirements.txt (line 12)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/scapy/2.5.0/scapy-2.5.0.tar.gz (1.3 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 4.7 MB/s eta 0:00:00 Preparing metadata (setup.py): started Preparing metadata (setup.py): finished with status 'done' Collecting jsonpath-rw (from -r requirements.txt (line 15)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/jsonpath-rw/1.4.0/jsonpath-rw-1.4.0.tar.gz (13 kB) Preparing metadata (setup.py): started Preparing metadata (setup.py): finished with status 'done' Collecting elasticsearch (from -r requirements.txt (line 18)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch/8.12.0/elasticsearch-8.12.0-py3-none-any.whl (431 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 431.9/431.9 kB 1.3 MB/s eta 0:00:00 Collecting elasticsearch-dsl (from -r requirements.txt (line 19)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elasticsearch-dsl/8.12.0/elasticsearch_dsl-8.12.0-py3-none-any.whl (63 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 64.0/64.0 kB 317.2 kB/s eta 0:00:00 Collecting pyangbind (from -r requirements.txt (line 22)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pyangbind/0.8.4.post1/pyangbind-0.8.4.post1-py3-none-any.whl (52 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 52.8/52.8 kB 234.7 kB/s eta 0:00:00 Collecting isodate (from -r requirements.txt (line 25)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/isodate/0.6.1/isodate-0.6.1-py2.py3-none-any.whl (41 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 41.7/41.7 kB 424.0 kB/s eta 0:00:00 Collecting jmespath (from -r requirements.txt (line 28)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/jmespath/1.0.1/jmespath-1.0.1-py3-none-any.whl (20 kB) 
Collecting jsonpatch (from -r requirements.txt (line 31)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/jsonpatch/1.33/jsonpatch-1.33-py2.py3-none-any.whl (12 kB) Collecting paramiko>=1.15.3 (from robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/paramiko/3.4.0/paramiko-3.4.0-py3-none-any.whl (225 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 225.9/225.9 kB 2.3 MB/s eta 0:00:00 Collecting scp>=0.13.0 (from robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/scp/0.14.5/scp-0.14.5-py2.py3-none-any.whl (8.7 kB) Collecting docker-pycreds>=0.2.1 (from docker-py->-r requirements.txt (line 1)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/docker-pycreds/0.4.0/docker_pycreds-0.4.0-py2.py3-none-any.whl (9.0 kB) Collecting six>=1.4.0 (from docker-py->-r requirements.txt (line 1)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/six/1.16.0/six-1.16.0-py2.py3-none-any.whl (11 kB) Collecting websocket-client>=0.32.0 (from docker-py->-r requirements.txt (line 1)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/websocket-client/1.7.0/websocket_client-1.7.0-py3-none-any.whl (58 kB) Collecting pyparsing<4,>=2 (from pyhocon->-r requirements.txt (line 5)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/pyparsing/3.1.1/pyparsing-3.1.1-py3-none-any.whl (103 kB) Collecting charset-normalizer<4,>=2 (from requests->-r requirements.txt (line 6)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/charset-normalizer/3.3.2/charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (142 kB) Collecting idna<4,>=2.5 (from requests->-r requirements.txt (line 6)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/idna/3.6/idna-3.6-py3-none-any.whl (61 kB) Collecting urllib3<3,>=1.21.1 (from requests->-r requirements.txt (line 6)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/urllib3/2.1.0/urllib3-2.1.0-py3-none-any.whl (104 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 104.6/104.6 kB 1.0 MB/s eta 0:00:00 Collecting certifi>=2017.4.17 (from requests->-r requirements.txt (line 6)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/certifi/2023.11.17/certifi-2023.11.17-py3-none-any.whl (162 kB) Collecting webtest>=2.0 (from robotframework-httplibrary->-r requirements.txt (line 8)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/webtest/3.0.0/WebTest-3.0.0-py3-none-any.whl (31 kB) Collecting jsonpointer (from robotframework-httplibrary->-r requirements.txt (line 8)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/jsonpointer/2.4/jsonpointer-2.4-py2.py3-none-any.whl (7.8 kB) Collecting robotframework-seleniumlibrary>=3.0.0 (from robotframework-selenium2library->-r requirements.txt (line 10)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-seleniumlibrary/6.2.0/robotframework_seleniumlibrary-6.2.0-py2.py3-none-any.whl (95 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 95.5/95.5 kB 4.7 MB/s eta 0:00:00 Collecting ply (from jsonpath-rw->-r requirements.txt (line 15)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/ply/3.11/ply-3.11-py2.py3-none-any.whl (49 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 49.6/49.6 kB 560.6 kB/s eta 0:00:00 Collecting decorator (from 
jsonpath-rw->-r requirements.txt (line 15)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/decorator/5.1.1/decorator-5.1.1-py3-none-any.whl (9.1 kB) Collecting elastic-transport<9,>=8 (from elasticsearch->-r requirements.txt (line 18)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/elastic-transport/8.12.0/elastic_transport-8.12.0-py3-none-any.whl (59 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 59.9/59.9 kB 1.7 MB/s eta 0:00:00 Collecting python-dateutil (from elasticsearch-dsl->-r requirements.txt (line 19)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/python-dateutil/2.8.2/python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB) Collecting pyang (from pyangbind->-r requirements.txt (line 22)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pyang/2.6.0/pyang-2.6.0-py2.py3-none-any.whl (594 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 594.1/594.1 kB 7.6 MB/s eta 0:00:00 Collecting lxml (from pyangbind->-r requirements.txt (line 22)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/lxml/5.1.0/lxml-5.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (8.0 MB) Collecting regex (from pyangbind->-r requirements.txt (line 22)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/regex/2023.12.25/regex-2023.12.25-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (773 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 774.0/774.0 kB 3.3 MB/s eta 0:00:00 Collecting enum34 (from pyangbind->-r requirements.txt (line 22)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/enum34/1.1.10/enum34-1.1.10-py3-none-any.whl (11 kB) Collecting bcrypt>=3.2 (from paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/bcrypt/4.1.2/bcrypt-4.1.2-cp39-abi3-manylinux_2_28_x86_64.whl (698 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 698.9/698.9 kB 6.7 MB/s eta 0:00:00 Collecting cryptography>=3.3 (from paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/cryptography/42.0.0/cryptography-42.0.0-cp39-abi3-manylinux_2_28_x86_64.whl (4.6 MB) Collecting pynacl>=1.5 (from paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/pynacl/1.5.0/PyNaCl-1.5.0-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl (856 kB) Collecting selenium>=4.3.0 (from robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/selenium/4.17.0/selenium-4.17.0-py3-none-any.whl (9.9 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 9.9/9.9 MB 47.0 MB/s eta 0:00:00 Collecting robotframework-pythonlibcore>=3.0.0 (from robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/robotframework-pythonlibcore/4.3.0/robotframework_pythonlibcore-4.3.0-py2.py3-none-any.whl (10 kB) Collecting WebOb>=1.2 (from webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/webob/1.8.7/WebOb-1.8.7-py2.py3-none-any.whl (114 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 115.0/115.0 kB 16.8 MB/s eta 0:00:00 
Collecting waitress>=0.8.5 (from webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/waitress/2.1.2/waitress-2.1.2-py3-none-any.whl (57 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 57.7/57.7 kB 8.0 MB/s eta 0:00:00 Collecting beautifulsoup4 (from webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/beautifulsoup4/4.12.3/beautifulsoup4-4.12.3-py3-none-any.whl (147 kB) Collecting cffi>=1.12 (from cryptography>=3.3->paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/cffi/1.16.0/cffi-1.16.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (443 kB) Collecting trio~=0.17 (from selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/trio/0.24.0/trio-0.24.0-py3-none-any.whl (460 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 460.2/460.2 kB 41.1 MB/s eta 0:00:00 Collecting trio-websocket~=0.9 (from selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/trio-websocket/0.11.1/trio_websocket-0.11.1-py3-none-any.whl (17 kB) Collecting soupsieve>1.2 (from beautifulsoup4->webtest>=2.0->robotframework-httplibrary->-r requirements.txt (line 8)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/soupsieve/2.5/soupsieve-2.5-py3-none-any.whl (36 kB) Collecting pycparser (from cffi>=1.12->cryptography>=3.3->paramiko>=1.15.3->robotframework-sshlibrary==3.8.0->-r requirements.txt (line 11)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/pycparser/2.21/pycparser-2.21-py2.py3-none-any.whl (118 kB) Collecting attrs>=20.1.0 (from trio~=0.17->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) Using cached https://nexus3.opendaylight.org/repository/PyPi/packages/attrs/23.2.0/attrs-23.2.0-py3-none-any.whl (60 kB) Collecting sortedcontainers (from trio~=0.17->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/sortedcontainers/2.4.0/sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB) Collecting outcome (from trio~=0.17->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/outcome/1.3.0.post0/outcome-1.3.0.post0-py2.py3-none-any.whl (10 kB) Collecting sniffio>=1.3.0 (from trio~=0.17->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/sniffio/1.3.0/sniffio-1.3.0-py3-none-any.whl (10 kB) Collecting exceptiongroup (from trio~=0.17->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/exceptiongroup/1.2.0/exceptiongroup-1.2.0-py3-none-any.whl (16 kB) Collecting wsproto>=0.14 (from 
trio-websocket~=0.9->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/wsproto/1.2.0/wsproto-1.2.0-py3-none-any.whl (24 kB) Collecting pysocks!=1.5.7,<2.0,>=1.5.6 (from urllib3[socks]<3,>=1.26->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/pysocks/1.7.1/PySocks-1.7.1-py3-none-any.whl (16 kB) Collecting h11<1,>=0.9.0 (from wsproto>=0.14->trio-websocket~=0.9->selenium>=4.3.0->robotframework-seleniumlibrary>=3.0.0->robotframework-selenium2library->-r requirements.txt (line 10)) Downloading https://nexus3.opendaylight.org/repository/PyPi/packages/h11/0.14.0/h11-0.14.0-py3-none-any.whl (58 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 58.3/58.3 kB 12.7 MB/s eta 0:00:00 Building wheels for collected packages: robotframework-sshlibrary, ipaddr, pyhocon, robotframework-httplibrary, scapy, jsonpath-rw Building wheel for robotframework-sshlibrary (setup.py): started Building wheel for robotframework-sshlibrary (setup.py): finished with status 'done' Created wheel for robotframework-sshlibrary: filename=robotframework_sshlibrary-3.8.0-py3-none-any.whl size=55116 sha256=bfe780e47aecb4a9b2bb0f831e3692b6e46c652593cb429284814d625dff5fc7 Stored in directory: /home/jenkins/.cache/pip/wheels/4f/7c/b3/1dfc58277260e643335754e93ea5854bc7ba3d181575a9cea9 Building wheel for ipaddr (setup.py): started Building wheel for ipaddr (setup.py): finished with status 'done' Created wheel for ipaddr: filename=ipaddr-2.2.0-py3-none-any.whl size=18282 sha256=13a0fe7cf6aa15c88f1bc4999084046413da758527d10b47157b1708b136c07e Stored in directory: /home/jenkins/.cache/pip/wheels/b5/7c/08/e4a91c0b38d5ae5fea76fbb9b60ce1b5f5f8d6e69cbe801870 Building wheel for pyhocon (setup.py): started Building wheel for pyhocon (setup.py): finished with status 'done' Created wheel for pyhocon: filename=pyhocon-0.3.60-py3-none-any.whl size=20864 sha256=a1660c66de2b3982112aabbcbab7ff82050815823951277ea6e6ed3f42723eef Stored in directory: /home/jenkins/.cache/pip/wheels/7e/3f/4f/66e113a4b8eca7d8ff120a7fcfb53be15fc5078eb353482440 Building wheel for robotframework-httplibrary (setup.py): started Building wheel for robotframework-httplibrary (setup.py): finished with status 'done' Created wheel for robotframework-httplibrary: filename=robotframework_httplibrary-0.4.2-py3-none-any.whl size=9954 sha256=7a927da396a39a313f355b6d4aa6284301767d1fca4ef612d60b1ecbe0724d26 Stored in directory: /home/jenkins/.cache/pip/wheels/2e/b8/87/b9b31f375281dd80f51ca94affc6e39f3e45d192e1e1f95553 Building wheel for scapy (setup.py): started Building wheel for scapy (setup.py): finished with status 'done' Created wheel for scapy: filename=scapy-2.5.0-py2.py3-none-any.whl size=1444328 sha256=ed54387fad0166bf3e044934d6d0697c283fd6ef12a310c281b9395d797ae5e9 Stored in directory: /home/jenkins/.cache/pip/wheels/8e/0a/14/42ef36f95fd1fbf75090db30f0cbb4ad4186b84b7a3d10dfcb Building wheel for jsonpath-rw (setup.py): started Building wheel for jsonpath-rw (setup.py): finished with status 'done' Created wheel for jsonpath-rw: filename=jsonpath_rw-1.4.0-py3-none-any.whl size=15127 sha256=68f7cad9286d2382ebb53a34fde629f8f481e458ac9b2a0ffbfa843b1bc1fd11 Stored in directory: /home/jenkins/.cache/pip/wheels/22/71/20/0f098fcfc9905114ba8ed62f4b7f9b66ef9a017f99d918fc12 Successfully built robotframework-sshlibrary 
ipaddr pyhocon robotframework-httplibrary scapy jsonpath-rw Installing collected packages: sortedcontainers, ply, netifaces, netaddr, ipaddr, enum34, websocket-client, WebOb, waitress, urllib3, soupsieve, sniffio, six, scapy, robotframework-pythonlibcore, robotframework, regex, pysocks, pyparsing, pycparser, lxml, jsonpointer, jmespath, idna, h11, exceptiongroup, decorator, charset-normalizer, certifi, bcrypt, attrs, wsproto, requests, python-dateutil, pyhocon, pyang, outcome, jsonpath-rw, jsonpatch, isodate, elastic-transport, docker-pycreds, cffi, beautifulsoup4, webtest, trio, robotframework-requests, pynacl, pyangbind, elasticsearch, docker-py, cryptography, trio-websocket, robotframework-httplibrary, paramiko, elasticsearch-dsl, selenium, scp, robotframework-sshlibrary, robotframework-seleniumlibrary, robotframework-selenium2library Successfully installed WebOb-1.8.7 attrs-23.2.0 bcrypt-4.1.2 beautifulsoup4-4.12.3 certifi-2023.11.17 cffi-1.16.0 charset-normalizer-3.3.2 cryptography-42.0.0 decorator-5.1.1 docker-py-1.10.6 docker-pycreds-0.4.0 elastic-transport-8.12.0 elasticsearch-8.12.0 elasticsearch-dsl-8.12.0 enum34-1.1.10 exceptiongroup-1.2.0 h11-0.14.0 idna-3.6 ipaddr-2.2.0 isodate-0.6.1 jmespath-1.0.1 jsonpatch-1.33 jsonpath-rw-1.4.0 jsonpointer-2.4 lxml-5.1.0 netaddr-0.10.1 netifaces-0.11.0 outcome-1.3.0.post0 paramiko-3.4.0 ply-3.11 pyang-2.6.0 pyangbind-0.8.4.post1 pycparser-2.21 pyhocon-0.3.60 pynacl-1.5.0 pyparsing-3.1.1 pysocks-1.7.1 python-dateutil-2.8.2 regex-2023.12.25 requests-2.31.0 robotframework-7.0 robotframework-httplibrary-0.4.2 robotframework-pythonlibcore-4.3.0 robotframework-requests-0.9.3 robotframework-selenium2library-3.0.0 robotframework-seleniumlibrary-6.2.0 robotframework-sshlibrary-3.8.0 scapy-2.5.0 scp-0.14.5 selenium-4.17.0 six-1.16.0 sniffio-1.3.0 sortedcontainers-2.4.0 soupsieve-2.5 trio-0.24.0 trio-websocket-0.11.1 urllib3-2.1.0 waitress-2.1.2 websocket-client-1.7.0 webtest-3.0.0 wsproto-1.2.0 + pip freeze attrs==23.2.0 bcrypt==4.1.2 beautifulsoup4==4.12.3 certifi==2023.11.17 cffi==1.16.0 charset-normalizer==3.3.2 cryptography==42.0.0 decorator==5.1.1 distlib==0.3.8 docker-py==1.10.6 docker-pycreds==0.4.0 elastic-transport==8.12.0 elasticsearch==8.12.0 elasticsearch-dsl==8.12.0 enum34==1.1.10 exceptiongroup==1.2.0 filelock==3.13.1 h11==0.14.0 idna==3.6 ipaddr==2.2.0 isodate==0.6.1 jmespath==1.0.1 jsonpatch==1.33 jsonpath-rw==1.4.0 jsonpointer==2.4 lxml==5.1.0 netaddr==0.10.1 netifaces==0.11.0 outcome==1.3.0.post0 paramiko==3.4.0 platformdirs==4.1.0 ply==3.11 pyang==2.6.0 pyangbind==0.8.4.post1 pycparser==2.21 pyhocon==0.3.60 PyNaCl==1.5.0 pyparsing==3.1.1 PySocks==1.7.1 python-dateutil==2.8.2 regex==2023.12.25 requests==2.31.0 robotframework==7.0 robotframework-httplibrary==0.4.2 robotframework-pythonlibcore==4.3.0 robotframework-requests==0.9.3 robotframework-selenium2library==3.0.0 robotframework-seleniumlibrary==6.2.0 robotframework-sshlibrary==3.8.0 scapy==2.5.0 scp==0.14.5 selenium==4.17.0 six==1.16.0 sniffio==1.3.0 sortedcontainers==2.4.0 soupsieve==2.5 trio==0.24.0 trio-websocket==0.11.1 urllib3==2.1.0 virtualenv==20.25.0 waitress==2.1.2 WebOb==1.8.7 websocket-client==1.7.0 WebTest==3.0.0 wsproto==1.2.0 [EnvInject] - Injecting environment variables from a build step. [EnvInject] - Injecting as environment variables the properties file path 'env.properties' [EnvInject] - Variables injected successfully. 
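The Robot Framework virtualenv set up above can be reproduced outside Jenkins. A minimal sketch, assuming a checkout of integration/test (for its requirements.txt) and access to the OpenDaylight Nexus PyPI proxy; a plain venv stands in here for the job's lf-activate-venv helper:

# Recreate the CSIT Robot Framework virtualenv roughly as the job does.
python3 -m venv /tmp/robot_venv            # the job uses lf-activate-venv; a plain venv is assumed here
source /tmp/robot_venv/bin/activate
pip install --upgrade pip setuptools wheel
# Resolve packages from the same index the log shows (OpenDaylight Nexus PyPI proxy).
pip install --index-url https://nexus3.opendaylight.org/repository/PyPi/simple -r requirements.txt
pip freeze                                 # compare against the 'pip freeze' output above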
[daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash -l /tmp/jenkins6799005127198175272.sh Setup pyenv: system 3.8.13 3.9.13 * 3.10.6 (set by /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/.python-version) lf-activate-venv(): INFO: Reuse venv:/tmp/venv-SlGu from file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Installing: python-heatclient python-openstackclient yq ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. botocore 1.34.25 requires urllib3<2.1,>=1.25.4; python_version >= "3.10", but you have urllib3 2.1.0 which is incompatible. lftools 0.37.8 requires openstacksdk<1.5.0, but you have openstacksdk 2.1.0 which is incompatible. lftools 0.37.8 requires urllib3<2.0.0, but you have urllib3 2.1.0 which is incompatible. lf-activate-venv(): INFO: Adding /tmp/venv-SlGu/bin to PATH + ODL_SYSTEM=() + TOOLS_SYSTEM=() + OPENSTACK_SYSTEM=() + OPENSTACK_CONTROLLERS=() + mapfile -t ADDR ++ jq -r '.outputs[] | select(.output_key | match("^vm_[0-9]+_ips$")) | .output_value | .[]' ++ openstack stack show -f json -c outputs releng-daexim-csit-3node-clustering-basic-only-calcium-116 + for i in "${ADDR[@]}" ++ ssh 10.30.170.15 hostname -s Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. + REMHOST=releng-16269-116-0-builder-0 + case ${REMHOST} in + ODL_SYSTEM=("${ODL_SYSTEM[@]}" "${i}") + for i in "${ADDR[@]}" ++ ssh 10.30.170.12 hostname -s Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. + REMHOST=releng-16269-116-0-builder-1 + case ${REMHOST} in + ODL_SYSTEM=("${ODL_SYSTEM[@]}" "${i}") + for i in "${ADDR[@]}" ++ ssh 10.30.170.65 hostname -s Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. + REMHOST=releng-16269-116-0-builder-2 + case ${REMHOST} in + ODL_SYSTEM=("${ODL_SYSTEM[@]}" "${i}") + echo NUM_ODL_SYSTEM=3 + echo NUM_TOOLS_SYSTEM=0 + '[' '' == yes ']' + NUM_OPENSTACK_SYSTEM=0 + echo NUM_OPENSTACK_SYSTEM=0 + '[' 0 -eq 2 ']' + echo ODL_SYSTEM_IP=10.30.170.15 ++ seq 0 2 + for i in $(seq 0 $(( ${#ODL_SYSTEM[@]} - 1 ))) + echo ODL_SYSTEM_1_IP=10.30.170.15 + for i in $(seq 0 $(( ${#ODL_SYSTEM[@]} - 1 ))) + echo ODL_SYSTEM_2_IP=10.30.170.12 + for i in $(seq 0 $(( ${#ODL_SYSTEM[@]} - 1 ))) + echo ODL_SYSTEM_3_IP=10.30.170.65 + echo TOOLS_SYSTEM_IP= ++ seq 0 -1 + openstack_index=0 + NUM_OPENSTACK_CONTROL_NODES=1 + echo NUM_OPENSTACK_CONTROL_NODES=1 ++ seq 0 0 + for i in $(seq 0 $((NUM_OPENSTACK_CONTROL_NODES - 1))) + echo OPENSTACK_CONTROL_NODE_1_IP= + NUM_OPENSTACK_COMPUTE_NODES=-1 + echo NUM_OPENSTACK_COMPUTE_NODES=-1 + '[' -1 -ge 2 ']' ++ seq 0 -2 + NUM_OPENSTACK_HAPROXY_NODES=0 + echo NUM_OPENSTACK_HAPROXY_NODES=0 ++ seq 0 -1 + echo 'Contents of slave_addresses.txt:' Contents of slave_addresses.txt: + cat slave_addresses.txt NUM_ODL_SYSTEM=3 NUM_TOOLS_SYSTEM=0 NUM_OPENSTACK_SYSTEM=0 ODL_SYSTEM_IP=10.30.170.15 ODL_SYSTEM_1_IP=10.30.170.15 ODL_SYSTEM_2_IP=10.30.170.12 ODL_SYSTEM_3_IP=10.30.170.65 TOOLS_SYSTEM_IP= NUM_OPENSTACK_CONTROL_NODES=1 OPENSTACK_CONTROL_NODE_1_IP= NUM_OPENSTACK_COMPUTE_NODES=-1 NUM_OPENSTACK_HAPROXY_NODES=0 [EnvInject] - Injecting environment variables from a build step. [EnvInject] - Injecting as environment variables the properties file path 'slave_addresses.txt' [EnvInject] - Variables injected successfully. 
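The slave_addresses.txt values injected above are derived from the Heat stack outputs. A minimal sketch of that extraction, assuming the same stack name and an OS_CLOUD=vex entry in clouds.yaml:

# Pull every vm_N_ips output value from the Heat stack, as the trace above does.
openstack --os-cloud vex stack show -f json -c outputs \
    releng-daexim-csit-3node-clustering-basic-only-calcium-116 \
  | jq -r '.outputs[] | select(.output_key | match("^vm_[0-9]+_ips$")) | .output_value | .[]'
# Per the log this prints 10.30.170.15, 10.30.170.12 and 10.30.170.65,
# which become ODL_SYSTEM_1_IP..ODL_SYSTEM_3_IP in slave_addresses.txt.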
[daexim-csit-3node-clustering-basic-only-calcium] $ /bin/sh /tmp/jenkins2269825235138216789.sh Preparing for JRE Version 17 Karaf artifact is karaf Karaf project is integration Java home is /usr/lib/jvm/java-17-openjdk [EnvInject] - Injecting environment variables from a build step. [EnvInject] - Injecting as environment variables the properties file path 'set_variables.env' [EnvInject] - Variables injected successfully. [daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash /tmp/jenkins11330498764216159719.sh Distribution bundle URL is https://nexus.opendaylight.org/content/repositories//autorelease-7479/org/opendaylight/integration/karaf/0.20.0/karaf-0.20.0.zip Distribution bundle is karaf-0.20.0.zip Distribution bundle version is 0.20.0 Distribution folder is karaf-0.20.0 Nexus prefix is https://nexus.opendaylight.org [EnvInject] - Injecting environment variables from a build step. [EnvInject] - Injecting as environment variables the properties file path 'detect_variables.env' [EnvInject] - Variables injected successfully. [daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash -l /tmp/jenkins5685786943178615781.sh Setup pyenv: system 3.8.13 3.9.13 * 3.10.6 (set by /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/.python-version) lf-activate-venv(): INFO: Reuse venv:/tmp/venv-SlGu from file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Installing: python-heatclient python-openstackclient lf-activate-venv(): INFO: Adding /tmp/venv-SlGu/bin to PATH Copying common-functions.sh to /tmp Copying common-functions.sh to 10.30.170.15:/tmp Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. Copying common-functions.sh to 10.30.170.12:/tmp Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. Copying common-functions.sh to 10.30.170.65:/tmp Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. [daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash /tmp/jenkins14163963035317390391.sh common-functions.sh is being sourced common-functions environment: MAVENCONF: /tmp/karaf-0.20.0/etc/org.ops4j.pax.url.mvn.cfg ACTUALFEATURES: FEATURESCONF: /tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg CUSTOMPROP: /tmp/karaf-0.20.0/etc/custom.properties LOGCONF: /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg MEMCONF: /tmp/karaf-0.20.0/bin/setenv CONTROLLERMEM: 2048m AKKACONF: /tmp/karaf-0.20.0/configuration/initial/akka.conf MODULESCONF: /tmp/karaf-0.20.0/configuration/initial/modules.conf MODULESHARDSCONF: /tmp/karaf-0.20.0/configuration/initial/module-shards.conf SUITES: ################################################# ## Configure Cluster and Start ## ################################################# ACTUALFEATURES: odl-infrautils-ready,odl-jolokia,odl-daexim-all,odl-netconf-topology,odl-jolokia SPACE_SEPARATED_FEATURES: odl-infrautils-ready odl-jolokia odl-daexim-all odl-netconf-topology odl-jolokia Locating script plan to use... Finished running script plans Configuring member-1 with IP address 10.30.170.15 Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. 
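The per-member configuration traced below amounts to unpacking karaf-0.20.0 under /tmp and seeding featuresBoot with ACTUALFEATURES. A minimal sketch of that edit, using the paths from the environment dump above; the sed expression mirrors the one that appears later in the trace:

# Prepend the CSIT feature list to featuresBoot in the extracted distribution.
BUNDLEFOLDER=karaf-0.20.0
FEATURESCONF=/tmp/${BUNDLEFOLDER}/etc/org.apache.karaf.features.cfg
ACTUALFEATURES=odl-infrautils-ready,odl-jolokia,odl-daexim-all,odl-netconf-topology,odl-jolokia
sed -ie "s/\(featuresBoot=\|featuresBoot =\)/featuresBoot = ${ACTUALFEATURES},/g" "${FEATURESCONF}"
grep featuresBoot "${FEATURESCONF}"        # verify the feature list was prepended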
+ source /tmp/common-functions.sh karaf-0.20.0 common-functions.sh is being sourced ++ [[ /tmp/common-functions.sh == \/\t\m\p\/\c\o\n\f\i\g\u\r\a\t\i\o\n\-\s\c\r\i\p\t\.\s\h ]] ++ echo 'common-functions.sh is being sourced' ++ BUNDLEFOLDER=karaf-0.20.0 ++ export MAVENCONF=/tmp/karaf-0.20.0/etc/org.ops4j.pax.url.mvn.cfg ++ MAVENCONF=/tmp/karaf-0.20.0/etc/org.ops4j.pax.url.mvn.cfg ++ export FEATURESCONF=/tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg ++ FEATURESCONF=/tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg ++ export CUSTOMPROP=/tmp/karaf-0.20.0/etc/custom.properties ++ CUSTOMPROP=/tmp/karaf-0.20.0/etc/custom.properties ++ export LOGCONF=/tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg ++ LOGCONF=/tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg ++ export MEMCONF=/tmp/karaf-0.20.0/bin/setenv ++ MEMCONF=/tmp/karaf-0.20.0/bin/setenv ++ export CONTROLLERMEM= ++ CONTROLLERMEM= ++ export AKKACONF=/tmp/karaf-0.20.0/configuration/initial/akka.conf ++ AKKACONF=/tmp/karaf-0.20.0/configuration/initial/akka.conf ++ export MODULESCONF=/tmp/karaf-0.20.0/configuration/initial/modules.conf ++ MODULESCONF=/tmp/karaf-0.20.0/configuration/initial/modules.conf ++ export MODULESHARDSCONF=/tmp/karaf-0.20.0/configuration/initial/module-shards.conf ++ MODULESHARDSCONF=/tmp/karaf-0.20.0/configuration/initial/module-shards.conf ++ print_common_env ++ cat common-functions environment: MAVENCONF: /tmp/karaf-0.20.0/etc/org.ops4j.pax.url.mvn.cfg ACTUALFEATURES: FEATURESCONF: /tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg CUSTOMPROP: /tmp/karaf-0.20.0/etc/custom.properties LOGCONF: /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg MEMCONF: /tmp/karaf-0.20.0/bin/setenv CONTROLLERMEM: AKKACONF: /tmp/karaf-0.20.0/configuration/initial/akka.conf MODULESCONF: /tmp/karaf-0.20.0/configuration/initial/modules.conf MODULESHARDSCONF: /tmp/karaf-0.20.0/configuration/initial/module-shards.conf SUITES: ++ SSH='ssh -t -t' ++ extra_services_cntl=' dnsmasq.service httpd.service libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service rabbitmq-server.service ' ++ extra_services_cmp=' libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service ' Changing to /tmp Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-7479/org/opendaylight/integration/karaf/0.20.0/karaf-0.20.0.zip + echo 'Changing to /tmp' + cd /tmp + echo 'Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-7479/org/opendaylight/integration/karaf/0.20.0/karaf-0.20.0.zip' + wget --progress=dot:mega https://nexus.opendaylight.org/content/repositories//autorelease-7479/org/opendaylight/integration/karaf/0.20.0/karaf-0.20.0.zip --2024-01-23 07:32:16-- https://nexus.opendaylight.org/content/repositories//autorelease-7479/org/opendaylight/integration/karaf/0.20.0/karaf-0.20.0.zip Resolving nexus.opendaylight.org (nexus.opendaylight.org)... 199.204.45.87, 2604:e100:1:0:f816:3eff:fe45:48d6 Connecting to nexus.opendaylight.org (nexus.opendaylight.org)|199.204.45.87|:443... connected. HTTP request sent, awaiting response... 200 OK Length: 264424598 (252M) [application/zip] Saving to: ‘karaf-0.20.0.zip’ 0K ........ ........ ........ ........ ........ ........ 1% 84.5M 3s 3072K ........ ........ ........ ........ ........ ........ 2% 101M 3s 6144K ........ ........ ........ ........ ........ ........ 3% 84.1M 3s 9216K ........ ........ ........ ........ ........ ........ 4% 108M 3s 12288K ........ ........ ........ ........ ........ 
[... wget progress dots trimmed: 5% through 100%, ~1.0 s total ...] 2024-01-23 07:32:17 (243 MB/s) - ‘karaf-0.20.0.zip’ saved [264424598/264424598] Extracting the new controller... + echo 'Extracting the new controller...' + unzip -q karaf-0.20.0.zip Adding external repositories... + echo 'Adding external repositories...'
+ sed -ie 's%org.ops4j.pax.url.mvn.repositories=%org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases%g' /tmp/karaf-0.20.0/etc/org.ops4j.pax.url.mvn.cfg + cat /tmp/karaf-0.20.0/etc/org.ops4j.pax.url.mvn.cfg ################################################################################ # # Licensed to the Apache Software Foundation (ASF) under one or more # contributor license agreements. See the NOTICE file distributed with # this work for additional information regarding copyright ownership. # The ASF licenses this file to You under the Apache License, Version 2.0 # (the "License"); you may not use this file except in compliance with # the License. You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # ################################################################################ # # If set to true, the following property will not allow any certificate to be used # when accessing Maven repositories through SSL # #org.ops4j.pax.url.mvn.certificateCheck= # # Path to the local Maven settings file. # The repositories defined in this file will be automatically added to the list # of default repositories if the 'org.ops4j.pax.url.mvn.repositories' property # below is not set. # The following locations are checked for the existence of the settings.xml file # * 1. looks for the specified url # * 2. if not found looks for ${user.home}/.m2/settings.xml # * 3. if not found looks for ${maven.home}/conf/settings.xml # * 4. if not found looks for ${M2_HOME}/conf/settings.xml # #org.ops4j.pax.url.mvn.settings= # # Path to the local Maven repository which is used to avoid downloading # artifacts when they already exist locally. # The value of this property will be extracted from the settings.xml file # above, or defaulted to: # System.getProperty( "user.home" ) + "/.m2/repository" # org.ops4j.pax.url.mvn.localRepository=${karaf.home}/${karaf.default.repository} # # Default this to false. It's just weird to use undocumented repos # org.ops4j.pax.url.mvn.useFallbackRepositories=false # # Uncomment if you don't wanna use the proxy settings # from the Maven conf/settings.xml file # # org.ops4j.pax.url.mvn.proxySupport=false # # Comma separated list of repositories scanned when resolving an artifact. 
# Those repositories will be checked before iterating through the # below list of repositories and even before the local repository # A repository url can be appended with zero or more of the following flags: # @snapshots : the repository contains snaphots # @noreleases : the repository does not contain any released artifacts # # The following property value will add the system folder as a repo. # org.ops4j.pax.url.mvn.defaultRepositories=\ file:${karaf.home}/${karaf.default.repository}@id=system.repository@snapshots,\ file:${karaf.data}/kar@id=kar.repository@multi@snapshots,\ file:${karaf.base}/${karaf.default.repository}@id=child.system.repository@snapshots # Use the default local repo (e.g.~/.m2/repository) as a "remote" repo #org.ops4j.pax.url.mvn.defaultLocalRepoAsRemote=false # # Comma separated list of repositories scanned when resolving an artifact. # The default list includes the following repositories: # http://repo1.maven.org/maven2@id=central # http://repository.springsource.com/maven/bundles/release@id=spring.ebr # http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external # http://zodiac.springsource.com/maven/bundles/release@id=gemini # http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases # https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases # https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases # To add repositories to the default ones, prepend '+' to the list of repositories # to add. # A repository url can be appended with zero or more of the following flags: # @snapshots : the repository contains snapshots # @noreleases : the repository does not contain any released artifacts # @id=repository.id : the id for the repository, just like in the settings.xml this is optional but recommended # org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases ### ^^^ No remote repositories. This is the only ODL change compared to Karaf defaults.Configuring the startup features... + [[ True == \T\r\u\e ]] + echo 'Configuring the startup features...' + sed -ie 's/\(featuresBoot=\|featuresBoot =\)/featuresBoot = odl-infrautils-ready,odl-jolokia,odl-daexim-all,odl-netconf-topology,odl-jolokia,/g' /tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg + FEATURE_TEST_STRING=features-test + FEATURE_TEST_VERSION=0.20.0 + KARAF_VERSION=karaf4 + [[ integration == \i\n\t\e\g\r\a\t\i\o\n ]] + sed -ie 's%\(featuresRepositories=\|featuresRepositories =\)%featuresRepositories = mvn:org.opendaylight.integration/features-test/0.20.0/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features,%g' /tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg + [[ ! 
-z '' ]] + cat /tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg ################################################################################ # # Licensed to the Apache Software Foundation (ASF) under one or more # contributor license agreements. See the NOTICE file distributed with # this work for additional information regarding copyright ownership. # The ASF licenses this file to You under the Apache License, Version 2.0 # (the "License"); you may not use this file except in compliance with # the License. You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # ################################################################################ # # Comma separated list of features repositories to register by default # featuresRepositories = mvn:org.opendaylight.integration/features-test/0.20.0/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features, file:${karaf.etc}/2e2f7245-ee82-4414-908d-2febb6610f0e.xml # # Comma separated list of features to install at startup # featuresBoot = odl-infrautils-ready,odl-jolokia,odl-daexim-all,odl-netconf-topology,odl-jolokia, cb8f7765-cdca-452f-8af2-d48adc62ccfa # # Resource repositories (OBR) that the features resolver can use # to resolve requirements/capabilities # # The format of the resourceRepositories is # resourceRepositories=[xml:url|json:url],... # for Instance: # #resourceRepositories=xml:http://host/path/to/index.xml # or #resourceRepositories=json:http://host/path/to/index.json # # # Defines if the boot features are started in asynchronous mode (in a dedicated thread) # featuresBootAsynchronous=false # # Service requirements enforcement # # By default, the feature resolver checks the service requirements/capabilities of # bundles for new features (xml schema >= 1.3.0) in order to automatically installs # the required bundles. 
# The following flag can have those values: # - disable: service requirements are completely ignored # - default: service requirements are ignored for old features # - enforce: service requirements are always verified # #serviceRequirements=default # # Store cfg file for config element in feature # #configCfgStore=true # # Define if the feature service automatically refresh bundles # autoRefresh=true # # Configuration of features processing mechanism (overrides, blacklisting, modification of features) # XML file defines instructions related to features processing # versions.properties may declare properties to resolve placeholders in XML file # both files are relative to ${karaf.etc} # #featureProcessing=org.apache.karaf.features.xml #featureProcessingVersions=versions.properties + configure_karaf_log karaf4 '' + local -r karaf_version=karaf4 + local -r controllerdebugmap= + local logapi=log4j + grep log4j2 /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n log4j2.rootLogger.level = INFO #log4j2.rootLogger.type = asyncRoot #log4j2.rootLogger.includeLocation = false log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi log4j2.rootLogger.appenderRef.Console.ref = Console log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL log4j2.logger.spifly.name = org.apache.aries.spifly log4j2.logger.spifly.level = WARN log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit log4j2.logger.audit.level = INFO log4j2.logger.audit.additivity = false log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile # Console appender not used by default (see log4j2.rootLogger.appenderRefs) log4j2.appender.console.type = Console log4j2.appender.console.name = Console log4j2.appender.console.layout.type = PatternLayout log4j2.appender.console.layout.pattern = ${log4j2.pattern} log4j2.appender.rolling.type = RollingRandomAccessFile log4j2.appender.rolling.name = RollingFile log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i #log4j2.appender.rolling.immediateFlush = false log4j2.appender.rolling.append = true log4j2.appender.rolling.layout.type = PatternLayout log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} log4j2.appender.rolling.policies.type = Policies log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy log4j2.appender.rolling.policies.size.size = 64MB log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy log4j2.appender.rolling.strategy.max = 7 log4j2.appender.audit.type = RollingRandomAccessFile log4j2.appender.audit.name = AuditRollingFile log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 
log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i log4j2.appender.audit.append = true log4j2.appender.audit.layout.type = PatternLayout log4j2.appender.audit.layout.pattern = ${log4j2.pattern} log4j2.appender.audit.policies.type = Policies log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy log4j2.appender.audit.policies.size.size = 8MB log4j2.appender.audit.strategy.type = DefaultRolloverStrategy log4j2.appender.audit.strategy.max = 7 log4j2.appender.osgi.type = PaxOsgi log4j2.appender.osgi.name = PaxOsgi log4j2.appender.osgi.filter = * #log4j2.logger.aether.name = shaded.org.eclipse.aether #log4j2.logger.aether.level = TRACE #log4j2.logger.http-headers.name = shaded.org.apache.http.headers #log4j2.logger.http-headers.level = DEBUG #log4j2.logger.maven.name = org.ops4j.pax.url.mvn #log4j2.logger.maven.level = TRACE Configuring the karaf log... karaf_version: karaf4, logapi: log4j2 + logapi=log4j2 + echo 'Configuring the karaf log... karaf_version: karaf4, logapi: log4j2' + '[' log4j2 == log4j2 ']' + sed -ie 's/log4j2.appender.rolling.policies.size.size = 64MB/log4j2.appender.rolling.policies.size.size = 1GB/g' /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg + orgmodule=org.opendaylight.yangtools.yang.parser.repo.YangTextSchemaContextResolver + orgmodule_=org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN' controllerdebugmap: cat /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN' + unset IFS + echo 'controllerdebugmap: ' + '[' -n '' ']' + echo 'cat /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg' + cat /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg ################################################################################ # # Licensed to the Apache Software Foundation (ASF) under one or more # contributor license agreements. See the NOTICE file distributed with # this work for additional information regarding copyright ownership. # The ASF licenses this file to You under the Apache License, Version 2.0 # (the "License"); you may not use this file except in compliance with # the License. You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
# ################################################################################ # Common pattern layout for appenders log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n # Root logger log4j2.rootLogger.level = INFO # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library #log4j2.rootLogger.type = asyncRoot #log4j2.rootLogger.includeLocation = false log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi log4j2.rootLogger.appenderRef.Console.ref = Console log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} # Filters for logs marked by org.opendaylight.odlparent.Markers log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL # Loggers configuration # Spifly logger log4j2.logger.spifly.name = org.apache.aries.spifly log4j2.logger.spifly.level = WARN # Security audit logger log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit log4j2.logger.audit.level = INFO log4j2.logger.audit.additivity = false log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile # Appenders configuration # Console appender not used by default (see log4j2.rootLogger.appenderRefs) log4j2.appender.console.type = Console log4j2.appender.console.name = Console log4j2.appender.console.layout.type = PatternLayout log4j2.appender.console.layout.pattern = ${log4j2.pattern} # Rolling file appender log4j2.appender.rolling.type = RollingRandomAccessFile log4j2.appender.rolling.name = RollingFile log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i # uncomment to not force a disk flush #log4j2.appender.rolling.immediateFlush = false log4j2.appender.rolling.append = true log4j2.appender.rolling.layout.type = PatternLayout log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} log4j2.appender.rolling.policies.type = Policies log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy log4j2.appender.rolling.policies.size.size = 1GB log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy log4j2.appender.rolling.strategy.max = 7 # Audit file appender log4j2.appender.audit.type = RollingRandomAccessFile log4j2.appender.audit.name = AuditRollingFile log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i log4j2.appender.audit.append = true log4j2.appender.audit.layout.type = PatternLayout log4j2.appender.audit.layout.pattern = ${log4j2.pattern} log4j2.appender.audit.policies.type = Policies log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy log4j2.appender.audit.policies.size.size = 8MB log4j2.appender.audit.strategy.type = DefaultRolloverStrategy log4j2.appender.audit.strategy.max = 7 # OSGi appender log4j2.appender.osgi.type 
= PaxOsgi log4j2.appender.osgi.name = PaxOsgi log4j2.appender.osgi.filter = * # help with identification of maven-related problems with pax-url-aether #log4j2.logger.aether.name = shaded.org.eclipse.aether #log4j2.logger.aether.level = TRACE #log4j2.logger.http-headers.name = shaded.org.apache.http.headers #log4j2.logger.http-headers.level = DEBUG #log4j2.logger.maven.name = org.ops4j.pax.url.mvn #log4j2.logger.maven.level = TRACE log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN + set_java_vars /usr/lib/jvm/java-17-openjdk 2048m /tmp/karaf-0.20.0/bin/setenv + local -r java_home=/usr/lib/jvm/java-17-openjdk + local -r controllermem=2048m Configure java home: /usr/lib/jvm/java-17-openjdk max memory: 2048m memconf: /tmp/karaf-0.20.0/bin/setenv + local -r memconf=/tmp/karaf-0.20.0/bin/setenv + echo Configure + echo ' java home: /usr/lib/jvm/java-17-openjdk' + echo ' max memory: 2048m' + echo ' memconf: /tmp/karaf-0.20.0/bin/setenv' + sed -ie 's%^# export JAVA_HOME%export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-17-openjdk}%g' /tmp/karaf-0.20.0/bin/setenv + sed -ie 's/JAVA_MAX_MEM="2048m"/JAVA_MAX_MEM=2048m/g' /tmp/karaf-0.20.0/bin/setenv cat /tmp/karaf-0.20.0/bin/setenv + echo 'cat /tmp/karaf-0.20.0/bin/setenv' + cat /tmp/karaf-0.20.0/bin/setenv #!/bin/sh # # Licensed to the Apache Software Foundation (ASF) under one or more # contributor license agreements. See the NOTICE file distributed with # this work for additional information regarding copyright ownership. # The ASF licenses this file to You under the Apache License, Version 2.0 # (the "License"); you may not use this file except in compliance with # the License. You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # # # handle specific scripts; the SCRIPT_NAME is exactly the name of the Karaf # script: client, instance, shell, start, status, stop, karaf # # if [ "${KARAF_SCRIPT}" == "SCRIPT_NAME" ]; then # Actions go here... # fi # # general settings which should be applied for all scripts go here; please keep # in mind that it is possible that scripts might be executed more than once, e.g. # in example of the start script where the start script is executed first and the # karaf script afterwards. 
# # # The following section shows the possible configuration options for the default # karaf scripts # export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-17-openjdk} # Location of Java installation # export JAVA_OPTS # Generic JVM options, for instance, where you can pass the memory configuration # export JAVA_NON_DEBUG_OPTS # Additional non-debug JVM options # export EXTRA_JAVA_OPTS # Additional JVM options # export KARAF_HOME # Karaf home folder # export KARAF_DATA # Karaf data folder # export KARAF_BASE # Karaf base folder # export KARAF_ETC # Karaf etc folder # export KARAF_LOG # Karaf log folder # export KARAF_SYSTEM_OPTS # First citizen Karaf options # export KARAF_OPTS # Additional available Karaf options # export KARAF_DEBUG # Enable debug mode # export KARAF_REDIRECT # Enable/set the std/err redirection when using bin/start # export KARAF_NOROOT # Prevent execution as root if set to true Set Java version + echo 'Set Java version' + sudo /usr/sbin/alternatives --install /usr/bin/java java /usr/lib/jvm/java-17-openjdk/bin/java 1 + sudo /usr/sbin/alternatives --set java /usr/lib/jvm/java-17-openjdk/bin/java JDK default version ... + echo 'JDK default version ...' + java -version openjdk version "17.0.6-ea" 2023-01-17 LTS OpenJDK Runtime Environment (Red_Hat-17.0.6.0.9-0.3.ea.el8) (build 17.0.6-ea+9-LTS) OpenJDK 64-Bit Server VM (Red_Hat-17.0.6.0.9-0.3.ea.el8) (build 17.0.6-ea+9-LTS, mixed mode, sharing) Set JAVA_HOME + echo 'Set JAVA_HOME' + export JAVA_HOME=/usr/lib/jvm/java-17-openjdk + JAVA_HOME=/usr/lib/jvm/java-17-openjdk ++ readlink -e /usr/lib/jvm/java-17-openjdk/bin/java Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-17-openjdk-17.0.6.0.9-0.3.ea.el8.x86_64/bin/java + JAVA_RESOLVED=/usr/lib/jvm/java-17-openjdk-17.0.6.0.9-0.3.ea.el8.x86_64/bin/java + echo 'Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-17-openjdk-17.0.6.0.9-0.3.ea.el8.x86_64/bin/java' Listing all open ports on controller system... + echo 'Listing all open ports on controller system...' + netstat -pnatu (Not all processes could be identified, non-owned process info will not be shown, you would have to be root to see it all.) Active Internet connections (servers and established) Proto Recv-Q Send-Q Local Address Foreign Address State PID/Program name tcp 0 0 0.0.0.0:111 0.0.0.0:* LISTEN - tcp 0 0 0.0.0.0:22 0.0.0.0:* LISTEN - tcp 0 0 10.30.170.15:59336 199.204.45.87:443 TIME_WAIT - tcp 0 0 10.30.170.15:22 10.30.171.25:46086 ESTABLISHED - tcp6 0 0 :::111 :::* LISTEN - tcp6 0 0 :::5555 :::* LISTEN - tcp6 0 0 :::22 :::* LISTEN - udp 0 0 10.30.170.15:68 10.30.170.3:67 ESTABLISHED - udp 0 0 0.0.0.0:111 0.0.0.0:* - udp 0 0 127.0.0.1:323 0.0.0.0:* - udp6 0 0 :::111 :::* - udp6 0 0 ::1:323 :::* - + '[' -f /tmp/custom_shard_config.txt ']' Configuring cluster + echo 'Configuring cluster' + /tmp/karaf-0.20.0/bin/configure_cluster.sh 1 10.30.170.15 10.30.170.12 10.30.170.65 ################################################ ## Configure Cluster ## ################################################ NOTE: Cluster configuration files not found. 
Copying from /tmp/karaf-0.20.0/system/org/opendaylight/controller/sal-clustering-config/8.0.3 Configuring unique name in akka.conf Configuring hostname in akka.conf Configuring data and rpc seed nodes in akka.conf modules = [ { name = "inventory" namespace = "urn:opendaylight:inventory" shard-strategy = "module" }, { name = "topology" namespace = "urn:TBD:params:xml:ns:yang:network-topology" shard-strategy = "module" }, { name = "toaster" namespace = "http://netconfcentral.org/ns/toaster" shard-strategy = "module" } ] Configuring replication type in module-shards.conf ################################################ ## NOTE: Manually restart controller to ## ## apply configuration. ## ################################################ Dump akka.conf odl-cluster-data { akka { remote { artery { enabled = on transport = tcp canonical.hostname = "10.30.170.15" canonical.port = 2550 } } cluster { # Using artery. seed-nodes = ["akka://opendaylight-cluster-data@10.30.170.15:2550", "akka://opendaylight-cluster-data@10.30.170.12:2550", "akka://opendaylight-cluster-data@10.30.170.65:2550"] roles = ["member-1"] # when under load we might trip a false positive on the failure detector # failure-detector { # heartbeat-interval = 4 s # acceptable-heartbeat-pause = 16s # } } persistence { # By default the snapshots/journal directories live in KARAF_HOME. You can choose to put it somewhere else by # modifying the following two properties. The directory location specified may be a relative or absolute path. # The relative path is always relative to KARAF_HOME. # snapshot-store.local.dir = "target/snapshots" # Use lz4 compression for LocalSnapshotStore snapshots snapshot-store.local.use-lz4-compression = false # Size of blocks for lz4 compression: 64KB, 256KB, 1MB or 4MB snapshot-store.local.lz4-blocksize = 256KB } disable-default-actor-system-quarantined-event-handling = "false" } } Dump modules.conf modules = [ { name = "inventory" namespace = "urn:opendaylight:inventory" shard-strategy = "module" }, { name = "topology" namespace = "urn:TBD:params:xml:ns:yang:network-topology" shard-strategy = "module" }, { name = "toaster" namespace = "http://netconfcentral.org/ns/toaster" shard-strategy = "module" } ] Dump module-shards.conf + echo 'Dump akka.conf' + cat /tmp/karaf-0.20.0/configuration/initial/akka.conf + echo 'Dump modules.conf' + cat /tmp/karaf-0.20.0/configuration/initial/modules.conf + echo 'Dump module-shards.conf' + cat /tmp/karaf-0.20.0/configuration/initial/module-shards.conf module-shards = [ { name = "default" shards = [ { name = "default" replicas = ["member-1", "member-2", "member-3"] } ] }, { name = "inventory" shards = [ { name="inventory" replicas = ["member-1", "member-2", "member-3"] } ] }, { name = "topology" shards = [ { name="topology" replicas = ["member-1", "member-2", "member-3"] } ] }, { name = "toaster" shards = [ { name="toaster" replicas = ["member-1", "member-2", "member-3"] } ] } ] Configuring member-2 with IP address 10.30.170.12 Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. 
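The two Warning lines above come from the ssh session used to reach member-2. Before the member-2 output begins, here is a minimal bash sketch of how the configure_cluster.sh arguments (node index followed by the three member IPs) appear to map onto the generated akka.conf, judging purely from the invocation and the dumps above; this is an illustration, not the script's actual code:

    #!/bin/bash
    # Hypothetical illustration of "configure_cluster.sh <index> <ip1> <ip2> <ip3>".
    INDEX=$1; shift
    IPS=("$@")                                  # e.g. 10.30.170.15 10.30.170.12 10.30.170.65
    HOST=${IPS[$((INDEX - 1))]}                 # becomes canonical.hostname for this member
    ROLE="member-${INDEX}"                      # becomes roles = ["member-1"], ["member-2"], ...
    SEEDS=$(printf '"akka://opendaylight-cluster-data@%s:2550", ' "${IPS[@]}")
    SEEDS="[${SEEDS%, }]"                       # seed-nodes list, identical on every member
    echo "canonical.hostname = \"$HOST\""
    echo "seed-nodes = $SEEDS"
    echo "roles = [\"$ROLE\"]"

Running the sketch with "1 10.30.170.15 10.30.170.12 10.30.170.65" reproduces the member-1 values shown in the akka.conf dump above.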
+ source /tmp/common-functions.sh karaf-0.20.0 ++ [[ /tmp/common-functions.sh == \/\t\m\p\/\c\o\n\f\i\g\u\r\a\t\i\o\n\-\s\c\r\i\p\t\.\s\h ]] common-functions.sh is being sourced ++ echo 'common-functions.sh is being sourced' ++ BUNDLEFOLDER=karaf-0.20.0 ++ export MAVENCONF=/tmp/karaf-0.20.0/etc/org.ops4j.pax.url.mvn.cfg ++ MAVENCONF=/tmp/karaf-0.20.0/etc/org.ops4j.pax.url.mvn.cfg ++ export FEATURESCONF=/tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg ++ FEATURESCONF=/tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg ++ export CUSTOMPROP=/tmp/karaf-0.20.0/etc/custom.properties ++ CUSTOMPROP=/tmp/karaf-0.20.0/etc/custom.properties ++ export LOGCONF=/tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg ++ LOGCONF=/tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg ++ export MEMCONF=/tmp/karaf-0.20.0/bin/setenv ++ MEMCONF=/tmp/karaf-0.20.0/bin/setenv ++ export CONTROLLERMEM= ++ CONTROLLERMEM= ++ export AKKACONF=/tmp/karaf-0.20.0/configuration/initial/akka.conf ++ AKKACONF=/tmp/karaf-0.20.0/configuration/initial/akka.conf ++ export MODULESCONF=/tmp/karaf-0.20.0/configuration/initial/modules.conf ++ MODULESCONF=/tmp/karaf-0.20.0/configuration/initial/modules.conf ++ export MODULESHARDSCONF=/tmp/karaf-0.20.0/configuration/initial/module-shards.conf ++ MODULESHARDSCONF=/tmp/karaf-0.20.0/configuration/initial/module-shards.conf ++ print_common_env ++ cat common-functions environment: MAVENCONF: /tmp/karaf-0.20.0/etc/org.ops4j.pax.url.mvn.cfg ACTUALFEATURES: FEATURESCONF: /tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg CUSTOMPROP: /tmp/karaf-0.20.0/etc/custom.properties LOGCONF: /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg MEMCONF: /tmp/karaf-0.20.0/bin/setenv CONTROLLERMEM: AKKACONF: /tmp/karaf-0.20.0/configuration/initial/akka.conf MODULESCONF: /tmp/karaf-0.20.0/configuration/initial/modules.conf MODULESHARDSCONF: /tmp/karaf-0.20.0/configuration/initial/module-shards.conf SUITES: ++ SSH='ssh -t -t' ++ extra_services_cntl=' dnsmasq.service httpd.service libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service rabbitmq-server.service ' ++ extra_services_cmp=' libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service ' Changing to /tmp Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-7479/org/opendaylight/integration/karaf/0.20.0/karaf-0.20.0.zip + echo 'Changing to /tmp' + cd /tmp + echo 'Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-7479/org/opendaylight/integration/karaf/0.20.0/karaf-0.20.0.zip' + wget --progress=dot:mega https://nexus.opendaylight.org/content/repositories//autorelease-7479/org/opendaylight/integration/karaf/0.20.0/karaf-0.20.0.zip --2024-01-23 07:32:21-- https://nexus.opendaylight.org/content/repositories//autorelease-7479/org/opendaylight/integration/karaf/0.20.0/karaf-0.20.0.zip Resolving nexus.opendaylight.org (nexus.opendaylight.org)... 199.204.45.87, 2604:e100:1:0:f816:3eff:fe45:48d6 Connecting to nexus.opendaylight.org (nexus.opendaylight.org)|199.204.45.87|:443... connected. HTTP request sent, awaiting response... 200 OK Length: 264424598 (252M) [application/zip] Saving to: ‘karaf-0.20.0.zip’ 0K ........ ........ ........ ........ ........ ........ 1% 59.4M 4s 3072K ........ ........ ........ ........ ........ ........ 2% 109M 3s 6144K ........ ........ ........ ........ ........ ........ 3% 137M 3s 9216K ........ ........ ........ ........ ........ ........ 4% 132M 2s 12288K ........ ........ ........ ........ ........ ........ 
[wget download progress dots omitted]
258048K .. 100% 368M=1.1s 2024-01-23 07:32:22 (233 MB/s) - ‘karaf-0.20.0.zip’ saved [264424598/264424598] Extracting the new controller... + echo 'Extracting the new controller...' + unzip -q karaf-0.20.0.zip Adding external repositories... + echo 'Adding external repositories...'
+ sed -ie 's%org.ops4j.pax.url.mvn.repositories=%org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases%g' /tmp/karaf-0.20.0/etc/org.ops4j.pax.url.mvn.cfg + cat /tmp/karaf-0.20.0/etc/org.ops4j.pax.url.mvn.cfg ################################################################################ # # Licensed to the Apache Software Foundation (ASF) under one or more # contributor license agreements. See the NOTICE file distributed with # this work for additional information regarding copyright ownership. # The ASF licenses this file to You under the Apache License, Version 2.0 # (the "License"); you may not use this file except in compliance with # the License. You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # ################################################################################ # # If set to true, the following property will not allow any certificate to be used # when accessing Maven repositories through SSL # #org.ops4j.pax.url.mvn.certificateCheck= # # Path to the local Maven settings file. # The repositories defined in this file will be automatically added to the list # of default repositories if the 'org.ops4j.pax.url.mvn.repositories' property # below is not set. # The following locations are checked for the existence of the settings.xml file # * 1. looks for the specified url # * 2. if not found looks for ${user.home}/.m2/settings.xml # * 3. if not found looks for ${maven.home}/conf/settings.xml # * 4. if not found looks for ${M2_HOME}/conf/settings.xml # #org.ops4j.pax.url.mvn.settings= # # Path to the local Maven repository which is used to avoid downloading # artifacts when they already exist locally. # The value of this property will be extracted from the settings.xml file # above, or defaulted to: # System.getProperty( "user.home" ) + "/.m2/repository" # org.ops4j.pax.url.mvn.localRepository=${karaf.home}/${karaf.default.repository} # # Default this to false. It's just weird to use undocumented repos # org.ops4j.pax.url.mvn.useFallbackRepositories=false # # Uncomment if you don't wanna use the proxy settings # from the Maven conf/settings.xml file # # org.ops4j.pax.url.mvn.proxySupport=false # # Comma separated list of repositories scanned when resolving an artifact. 
# Those repositories will be checked before iterating through the # below list of repositories and even before the local repository # A repository url can be appended with zero or more of the following flags: # @snapshots : the repository contains snaphots # @noreleases : the repository does not contain any released artifacts # # The following property value will add the system folder as a repo. # org.ops4j.pax.url.mvn.defaultRepositories=\ file:${karaf.home}/${karaf.default.repository}@id=system.repository@snapshots,\ file:${karaf.data}/kar@id=kar.repository@multi@snapshots,\ file:${karaf.base}/${karaf.default.repository}@id=child.system.repository@snapshots # Use the default local repo (e.g.~/.m2/repository) as a "remote" repo #org.ops4j.pax.url.mvn.defaultLocalRepoAsRemote=false # # Comma separated list of repositories scanned when resolving an artifact. # The default list includes the following repositories: # http://repo1.maven.org/maven2@id=central # http://repository.springsource.com/maven/bundles/release@id=spring.ebr # http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external # http://zodiac.springsource.com/maven/bundles/release@id=gemini # http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases # https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases # https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases # To add repositories to the default ones, prepend '+' to the list of repositories # to add. # A repository url can be appended with zero or more of the following flags: # @snapshots : the repository contains snapshots # @noreleases : the repository does not contain any released artifacts # @id=repository.id : the id for the repository, just like in the settings.xml this is optional but recommended # org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases ### ^^^ No remote repositories. This is the only ODL change compared to Karaf defaults.Configuring the startup features... + [[ True == \T\r\u\e ]] + echo 'Configuring the startup features...' + sed -ie 's/\(featuresBoot=\|featuresBoot =\)/featuresBoot = odl-infrautils-ready,odl-jolokia,odl-daexim-all,odl-netconf-topology,odl-jolokia,/g' /tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg + FEATURE_TEST_STRING=features-test + FEATURE_TEST_VERSION=0.20.0 + KARAF_VERSION=karaf4 + [[ integration == \i\n\t\e\g\r\a\t\i\o\n ]] + sed -ie 's%\(featuresRepositories=\|featuresRepositories =\)%featuresRepositories = mvn:org.opendaylight.integration/features-test/0.20.0/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features,%g' /tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg + [[ ! 
-z '' ]] + cat /tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg ################################################################################ # # Licensed to the Apache Software Foundation (ASF) under one or more # contributor license agreements. See the NOTICE file distributed with # this work for additional information regarding copyright ownership. # The ASF licenses this file to You under the Apache License, Version 2.0 # (the "License"); you may not use this file except in compliance with # the License. You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # ################################################################################ # # Comma separated list of features repositories to register by default # featuresRepositories = mvn:org.opendaylight.integration/features-test/0.20.0/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features, file:${karaf.etc}/2e2f7245-ee82-4414-908d-2febb6610f0e.xml # # Comma separated list of features to install at startup # featuresBoot = odl-infrautils-ready,odl-jolokia,odl-daexim-all,odl-netconf-topology,odl-jolokia, cb8f7765-cdca-452f-8af2-d48adc62ccfa # # Resource repositories (OBR) that the features resolver can use # to resolve requirements/capabilities # # The format of the resourceRepositories is # resourceRepositories=[xml:url|json:url],... # for Instance: # #resourceRepositories=xml:http://host/path/to/index.xml # or #resourceRepositories=json:http://host/path/to/index.json # # # Defines if the boot features are started in asynchronous mode (in a dedicated thread) # featuresBootAsynchronous=false # # Service requirements enforcement # # By default, the feature resolver checks the service requirements/capabilities of # bundles for new features (xml schema >= 1.3.0) in order to automatically installs # the required bundles. 
# The following flag can have those values: # - disable: service requirements are completely ignored # - default: service requirements are ignored for old features # - enforce: service requirements are always verified # #serviceRequirements=default # # Store cfg file for config element in feature # #configCfgStore=true # # Define if the feature service automatically refresh bundles # autoRefresh=true # # Configuration of features processing mechanism (overrides, blacklisting, modification of features) # XML file defines instructions related to features processing # versions.properties may declare properties to resolve placeholders in XML file # both files are relative to ${karaf.etc} # #featureProcessing=org.apache.karaf.features.xml #featureProcessingVersions=versions.properties + configure_karaf_log karaf4 '' + local -r karaf_version=karaf4 + local -r controllerdebugmap= + local logapi=log4j + grep log4j2 /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n log4j2.rootLogger.level = INFO #log4j2.rootLogger.type = asyncRoot #log4j2.rootLogger.includeLocation = false log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi log4j2.rootLogger.appenderRef.Console.ref = Console log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL log4j2.logger.spifly.name = org.apache.aries.spifly log4j2.logger.spifly.level = WARN log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit log4j2.logger.audit.level = INFO log4j2.logger.audit.additivity = false log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile # Console appender not used by default (see log4j2.rootLogger.appenderRefs) log4j2.appender.console.type = Console log4j2.appender.console.name = Console log4j2.appender.console.layout.type = PatternLayout log4j2.appender.console.layout.pattern = ${log4j2.pattern} log4j2.appender.rolling.type = RollingRandomAccessFile log4j2.appender.rolling.name = RollingFile log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i #log4j2.appender.rolling.immediateFlush = false log4j2.appender.rolling.append = true log4j2.appender.rolling.layout.type = PatternLayout log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} log4j2.appender.rolling.policies.type = Policies log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy log4j2.appender.rolling.policies.size.size = 64MB log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy log4j2.appender.rolling.strategy.max = 7 log4j2.appender.audit.type = RollingRandomAccessFile log4j2.appender.audit.name = AuditRollingFile log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 
log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i log4j2.appender.audit.append = true log4j2.appender.audit.layout.type = PatternLayout log4j2.appender.audit.layout.pattern = ${log4j2.pattern} log4j2.appender.audit.policies.type = Policies log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy log4j2.appender.audit.policies.size.size = 8MB log4j2.appender.audit.strategy.type = DefaultRolloverStrategy log4j2.appender.audit.strategy.max = 7 log4j2.appender.osgi.type = PaxOsgi log4j2.appender.osgi.name = PaxOsgi log4j2.appender.osgi.filter = * #log4j2.logger.aether.name = shaded.org.eclipse.aether #log4j2.logger.aether.level = TRACE #log4j2.logger.http-headers.name = shaded.org.apache.http.headers #log4j2.logger.http-headers.level = DEBUG #log4j2.logger.maven.name = org.ops4j.pax.url.mvn #log4j2.logger.maven.level = TRACE Configuring the karaf log... karaf_version: karaf4, logapi: log4j2 + logapi=log4j2 + echo 'Configuring the karaf log... karaf_version: karaf4, logapi: log4j2' + '[' log4j2 == log4j2 ']' + sed -ie 's/log4j2.appender.rolling.policies.size.size = 64MB/log4j2.appender.rolling.policies.size.size = 1GB/g' /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg + orgmodule=org.opendaylight.yangtools.yang.parser.repo.YangTextSchemaContextResolver + orgmodule_=org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN' + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN' controllerdebugmap: cat /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg + unset IFS + echo 'controllerdebugmap: ' + '[' -n '' ']' + echo 'cat /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg' + cat /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg ################################################################################ # # Licensed to the Apache Software Foundation (ASF) under one or more # contributor license agreements. See the NOTICE file distributed with # this work for additional information regarding copyright ownership. # The ASF licenses this file to You under the Apache License, Version 2.0 # (the "License"); you may not use this file except in compliance with # the License. You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
# ################################################################################ # Common pattern layout for appenders log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n # Root logger log4j2.rootLogger.level = INFO # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library #log4j2.rootLogger.type = asyncRoot #log4j2.rootLogger.includeLocation = false log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi log4j2.rootLogger.appenderRef.Console.ref = Console log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} # Filters for logs marked by org.opendaylight.odlparent.Markers log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL # Loggers configuration # Spifly logger log4j2.logger.spifly.name = org.apache.aries.spifly log4j2.logger.spifly.level = WARN # Security audit logger log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit log4j2.logger.audit.level = INFO log4j2.logger.audit.additivity = false log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile # Appenders configuration # Console appender not used by default (see log4j2.rootLogger.appenderRefs) log4j2.appender.console.type = Console log4j2.appender.console.name = Console log4j2.appender.console.layout.type = PatternLayout log4j2.appender.console.layout.pattern = ${log4j2.pattern} # Rolling file appender log4j2.appender.rolling.type = RollingRandomAccessFile log4j2.appender.rolling.name = RollingFile log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i # uncomment to not force a disk flush #log4j2.appender.rolling.immediateFlush = false log4j2.appender.rolling.append = true log4j2.appender.rolling.layout.type = PatternLayout log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} log4j2.appender.rolling.policies.type = Policies log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy log4j2.appender.rolling.policies.size.size = 1GB log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy log4j2.appender.rolling.strategy.max = 7 # Audit file appender log4j2.appender.audit.type = RollingRandomAccessFile log4j2.appender.audit.name = AuditRollingFile log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i log4j2.appender.audit.append = true log4j2.appender.audit.layout.type = PatternLayout log4j2.appender.audit.layout.pattern = ${log4j2.pattern} log4j2.appender.audit.policies.type = Policies log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy log4j2.appender.audit.policies.size.size = 8MB log4j2.appender.audit.strategy.type = DefaultRolloverStrategy log4j2.appender.audit.strategy.max = 7 # OSGi appender log4j2.appender.osgi.type 
= PaxOsgi log4j2.appender.osgi.name = PaxOsgi log4j2.appender.osgi.filter = * # help with identification of maven-related problems with pax-url-aether #log4j2.logger.aether.name = shaded.org.eclipse.aether #log4j2.logger.aether.level = TRACE #log4j2.logger.http-headers.name = shaded.org.apache.http.headers #log4j2.logger.http-headers.level = DEBUG #log4j2.logger.maven.name = org.ops4j.pax.url.mvn #log4j2.logger.maven.level = TRACE log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN + set_java_vars /usr/lib/jvm/java-17-openjdk 2048m /tmp/karaf-0.20.0/bin/setenv + local -r java_home=/usr/lib/jvm/java-17-openjdk Configure java home: /usr/lib/jvm/java-17-openjdk max memory: 2048m memconf: /tmp/karaf-0.20.0/bin/setenv + local -r controllermem=2048m + local -r memconf=/tmp/karaf-0.20.0/bin/setenv + echo Configure + echo ' java home: /usr/lib/jvm/java-17-openjdk' + echo ' max memory: 2048m' + echo ' memconf: /tmp/karaf-0.20.0/bin/setenv' + sed -ie 's%^# export JAVA_HOME%export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-17-openjdk}%g' /tmp/karaf-0.20.0/bin/setenv + sed -ie 's/JAVA_MAX_MEM="2048m"/JAVA_MAX_MEM=2048m/g' /tmp/karaf-0.20.0/bin/setenv cat /tmp/karaf-0.20.0/bin/setenv + echo 'cat /tmp/karaf-0.20.0/bin/setenv' + cat /tmp/karaf-0.20.0/bin/setenv #!/bin/sh # # Licensed to the Apache Software Foundation (ASF) under one or more # contributor license agreements. See the NOTICE file distributed with # this work for additional information regarding copyright ownership. # The ASF licenses this file to You under the Apache License, Version 2.0 # (the "License"); you may not use this file except in compliance with # the License. You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # # # handle specific scripts; the SCRIPT_NAME is exactly the name of the Karaf # script: client, instance, shell, start, status, stop, karaf # # if [ "${KARAF_SCRIPT}" == "SCRIPT_NAME" ]; then # Actions go here... # fi # # general settings which should be applied for all scripts go here; please keep # in mind that it is possible that scripts might be executed more than once, e.g. # in example of the start script where the start script is executed first and the # karaf script afterwards. 
# # # The following section shows the possible configuration options for the default # karaf scripts # export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-17-openjdk} # Location of Java installation # export JAVA_OPTS # Generic JVM options, for instance, where you can pass the memory configuration # export JAVA_NON_DEBUG_OPTS # Additional non-debug JVM options # export EXTRA_JAVA_OPTS # Additional JVM options # export KARAF_HOME # Karaf home folder # export KARAF_DATA # Karaf data folder # export KARAF_BASE # Karaf base folder # export KARAF_ETC # Karaf etc folder # export KARAF_LOG # Karaf log folder # export KARAF_SYSTEM_OPTS # First citizen Karaf options # export KARAF_OPTS # Additional available Karaf options # export KARAF_DEBUG # Enable debug mode # export KARAF_REDIRECT # Enable/set the std/err redirection when using bin/start # export KARAF_NOROOT # Prevent execution as root if set to true Set Java version + echo 'Set Java version' + sudo /usr/sbin/alternatives --install /usr/bin/java java /usr/lib/jvm/java-17-openjdk/bin/java 1 + sudo /usr/sbin/alternatives --set java /usr/lib/jvm/java-17-openjdk/bin/java JDK default version ... + echo 'JDK default version ...' + java -version openjdk version "17.0.6-ea" 2023-01-17 LTS OpenJDK Runtime Environment (Red_Hat-17.0.6.0.9-0.3.ea.el8) (build 17.0.6-ea+9-LTS) OpenJDK 64-Bit Server VM (Red_Hat-17.0.6.0.9-0.3.ea.el8) (build 17.0.6-ea+9-LTS, mixed mode, sharing) Set JAVA_HOME + echo 'Set JAVA_HOME' + export JAVA_HOME=/usr/lib/jvm/java-17-openjdk + JAVA_HOME=/usr/lib/jvm/java-17-openjdk ++ readlink -e /usr/lib/jvm/java-17-openjdk/bin/java Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-17-openjdk-17.0.6.0.9-0.3.ea.el8.x86_64/bin/java + JAVA_RESOLVED=/usr/lib/jvm/java-17-openjdk-17.0.6.0.9-0.3.ea.el8.x86_64/bin/java + echo 'Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-17-openjdk-17.0.6.0.9-0.3.ea.el8.x86_64/bin/java' Listing all open ports on controller system... + echo 'Listing all open ports on controller system...' + netstat -pnatu (Not all processes could be identified, non-owned process info will not be shown, you would have to be root to see it all.) Active Internet connections (servers and established) Proto Recv-Q Send-Q Local Address Foreign Address State PID/Program name tcp 0 0 0.0.0.0:111 0.0.0.0:* LISTEN - tcp 0 0 0.0.0.0:22 0.0.0.0:* LISTEN - tcp 0 164 10.30.170.12:22 10.30.171.25:51486 ESTABLISHED - tcp 0 0 10.30.170.12:50028 199.204.45.87:443 TIME_WAIT - tcp6 0 0 :::111 :::* LISTEN - tcp6 0 0 :::5555 :::* LISTEN - tcp6 0 0 :::22 :::* LISTEN - udp 0 0 10.30.170.12:68 10.30.170.2:67 ESTABLISHED - udp 0 0 0.0.0.0:111 0.0.0.0:* - udp 0 0 127.0.0.1:323 0.0.0.0:* - udp6 0 0 :::111 :::* - udp6 0 0 ::1:323 :::* - + '[' -f /tmp/custom_shard_config.txt ']' Configuring cluster + echo 'Configuring cluster' + /tmp/karaf-0.20.0/bin/configure_cluster.sh 2 10.30.170.15 10.30.170.12 10.30.170.65 ################################################ ## Configure Cluster ## ################################################ NOTE: Cluster configuration files not found. 
Copying from /tmp/karaf-0.20.0/system/org/opendaylight/controller/sal-clustering-config/8.0.3 Configuring unique name in akka.conf Configuring hostname in akka.conf Configuring data and rpc seed nodes in akka.conf modules = [ { name = "inventory" namespace = "urn:opendaylight:inventory" shard-strategy = "module" }, { name = "topology" namespace = "urn:TBD:params:xml:ns:yang:network-topology" shard-strategy = "module" }, { name = "toaster" namespace = "http://netconfcentral.org/ns/toaster" shard-strategy = "module" } ] Configuring replication type in module-shards.conf ################################################ ## NOTE: Manually restart controller to ## ## apply configuration. ## ################################################ Dump akka.conf + echo 'Dump akka.conf' + cat /tmp/karaf-0.20.0/configuration/initial/akka.conf odl-cluster-data { akka { remote { artery { enabled = on transport = tcp canonical.hostname = "10.30.170.12" canonical.port = 2550 } } cluster { # Using artery. seed-nodes = ["akka://opendaylight-cluster-data@10.30.170.15:2550", "akka://opendaylight-cluster-data@10.30.170.12:2550", "akka://opendaylight-cluster-data@10.30.170.65:2550"] roles = ["member-2"] # when under load we might trip a false positive on the failure detector # failure-detector { # heartbeat-interval = 4 s # acceptable-heartbeat-pause = 16s # } } persistence { # By default the snapshots/journal directories live in KARAF_HOME. You can choose to put it somewhere else by # modifying the following two properties. The directory location specified may be a relative or absolute path. # The relative path is always relative to KARAF_HOME. # snapshot-store.local.dir = "target/snapshots" # Use lz4 compression for LocalSnapshotStore snapshots snapshot-store.local.use-lz4-compression = false # Size of blocks for lz4 compression: 64KB, 256KB, 1MB or 4MB snapshot-store.local.lz4-blocksize = 256KB } disable-default-actor-system-quarantined-event-handling = "false" } } Dump modules.conf + echo 'Dump modules.conf' + cat /tmp/karaf-0.20.0/configuration/initial/modules.conf modules = [ { name = "inventory" namespace = "urn:opendaylight:inventory" shard-strategy = "module" }, { name = "topology" namespace = "urn:TBD:params:xml:ns:yang:network-topology" shard-strategy = "module" }, { name = "toaster" namespace = "http://netconfcentral.org/ns/toaster" shard-strategy = "module" } ] Dump module-shards.conf + echo 'Dump module-shards.conf' + cat /tmp/karaf-0.20.0/configuration/initial/module-shards.conf module-shards = [ { name = "default" shards = [ { name = "default" replicas = ["member-1", "member-2", "member-3"] } ] }, { name = "inventory" shards = [ { name="inventory" replicas = ["member-1", "member-2", "member-3"] } ] }, { name = "topology" shards = [ { name="topology" replicas = ["member-1", "member-2", "member-3"] } ] }, { name = "toaster" shards = [ { name="toaster" replicas = ["member-1", "member-2", "member-3"] } ] } ] Configuring member-3 with IP address 10.30.170.65 Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. 
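For readability, a minimal sketch (not part of the job output), reproducing what configure_cluster.sh establishes above: every member gets the identical three-node seed-nodes list, while canonical.hostname and the member role differ per node. Assuming the same passwordless SSH access the job itself uses, the three generated files could be spot-checked before the controllers are restarted:

for ip in 10.30.170.15 10.30.170.12 10.30.170.65; do
  echo "== ${ip} =="
  # seed-nodes must match on all members; canonical.hostname and roles must be unique per member
  ssh "${ip}" "grep -E 'seed-nodes|canonical.hostname|roles' /tmp/karaf-0.20.0/configuration/initial/akka.conf"
done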
common-functions.sh is being sourced + source /tmp/common-functions.sh karaf-0.20.0 ++ [[ /tmp/common-functions.sh == \/\t\m\p\/\c\o\n\f\i\g\u\r\a\t\i\o\n\-\s\c\r\i\p\t\.\s\h ]] ++ echo 'common-functions.sh is being sourced' ++ BUNDLEFOLDER=karaf-0.20.0 ++ export MAVENCONF=/tmp/karaf-0.20.0/etc/org.ops4j.pax.url.mvn.cfg ++ MAVENCONF=/tmp/karaf-0.20.0/etc/org.ops4j.pax.url.mvn.cfg ++ export FEATURESCONF=/tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg ++ FEATURESCONF=/tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg ++ export CUSTOMPROP=/tmp/karaf-0.20.0/etc/custom.properties ++ CUSTOMPROP=/tmp/karaf-0.20.0/etc/custom.properties ++ export LOGCONF=/tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg ++ LOGCONF=/tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg ++ export MEMCONF=/tmp/karaf-0.20.0/bin/setenv ++ MEMCONF=/tmp/karaf-0.20.0/bin/setenv ++ export CONTROLLERMEM= ++ CONTROLLERMEM= ++ export AKKACONF=/tmp/karaf-0.20.0/configuration/initial/akka.conf ++ AKKACONF=/tmp/karaf-0.20.0/configuration/initial/akka.conf ++ export MODULESCONF=/tmp/karaf-0.20.0/configuration/initial/modules.conf ++ MODULESCONF=/tmp/karaf-0.20.0/configuration/initial/modules.conf ++ export MODULESHARDSCONF=/tmp/karaf-0.20.0/configuration/initial/module-shards.conf ++ MODULESHARDSCONF=/tmp/karaf-0.20.0/configuration/initial/module-shards.conf ++ print_common_env ++ cat common-functions environment: MAVENCONF: /tmp/karaf-0.20.0/etc/org.ops4j.pax.url.mvn.cfg ACTUALFEATURES: FEATURESCONF: /tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg CUSTOMPROP: /tmp/karaf-0.20.0/etc/custom.properties LOGCONF: /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg MEMCONF: /tmp/karaf-0.20.0/bin/setenv CONTROLLERMEM: AKKACONF: /tmp/karaf-0.20.0/configuration/initial/akka.conf MODULESCONF: /tmp/karaf-0.20.0/configuration/initial/modules.conf MODULESHARDSCONF: /tmp/karaf-0.20.0/configuration/initial/module-shards.conf SUITES: ++ SSH='ssh -t -t' ++ extra_services_cntl=' dnsmasq.service httpd.service libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service rabbitmq-server.service ' ++ extra_services_cmp=' libvirtd.service openvswitch.service ovs-vswitchd.service ovsdb-server.service ' Changing to /tmp Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-7479/org/opendaylight/integration/karaf/0.20.0/karaf-0.20.0.zip + echo 'Changing to /tmp' + cd /tmp + echo 'Downloading the distribution from https://nexus.opendaylight.org/content/repositories//autorelease-7479/org/opendaylight/integration/karaf/0.20.0/karaf-0.20.0.zip' + wget --progress=dot:mega https://nexus.opendaylight.org/content/repositories//autorelease-7479/org/opendaylight/integration/karaf/0.20.0/karaf-0.20.0.zip --2024-01-23 07:32:26-- https://nexus.opendaylight.org/content/repositories//autorelease-7479/org/opendaylight/integration/karaf/0.20.0/karaf-0.20.0.zip Resolving nexus.opendaylight.org (nexus.opendaylight.org)... 199.204.45.87, 2604:e100:1:0:f816:3eff:fe45:48d6 Connecting to nexus.opendaylight.org (nexus.opendaylight.org)|199.204.45.87|:443... connected. HTTP request sent, awaiting response... 200 OK Length: 264424598 (252M) [application/zip] Saving to: ‘karaf-0.20.0.zip’ 0K ........ ........ ........ ........ ........ ........ 1% 86.2M 3s 3072K ........ ........ ........ ........ ........ ........ 2% 139M 2s 6144K ........ ........ ........ ........ ........ ........ 3% 90.0M 2s 9216K ........ ........ ........ ........ ........ ........ 4% 205M 2s 12288K ........ ........ ........ ........ ........ 
[ ... wget dot-progress meter, 5% - 100% ... ] 100% 230M=1.0s 2024-01-23 07:32:27 (260 MB/s) - ‘karaf-0.20.0.zip’ saved [264424598/264424598] Extracting the new controller... + echo 'Extracting the new controller...' + unzip -q karaf-0.20.0.zip Adding external repositories... + echo 'Adding external repositories...'
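The staged Calcium distribution is pulled directly from the autorelease staging repository on Nexus and unpacked under /tmp. A hedged sketch of the same fetch with an integrity check added; it assumes the usual .sha1 sidecar is published next to the artifact, which this job does not verify:

DIST_URL='https://nexus.opendaylight.org/content/repositories//autorelease-7479/org/opendaylight/integration/karaf/0.20.0/karaf-0.20.0.zip'
cd /tmp
wget --progress=dot:mega "${DIST_URL}"
# optional: verify against the .sha1 sidecar (assumed to exist alongside the zip)
wget -q -O karaf-0.20.0.zip.sha1 "${DIST_URL}.sha1"
echo "$(cat karaf-0.20.0.zip.sha1)  karaf-0.20.0.zip" | sha1sum -c -
unzip -q karaf-0.20.0.zip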
+ sed -ie 's%org.ops4j.pax.url.mvn.repositories=%org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases%g' /tmp/karaf-0.20.0/etc/org.ops4j.pax.url.mvn.cfg + cat /tmp/karaf-0.20.0/etc/org.ops4j.pax.url.mvn.cfg ################################################################################ # # Licensed to the Apache Software Foundation (ASF) under one or more # contributor license agreements. See the NOTICE file distributed with # this work for additional information regarding copyright ownership. # The ASF licenses this file to You under the Apache License, Version 2.0 # (the "License"); you may not use this file except in compliance with # the License. You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # ################################################################################ # # If set to true, the following property will not allow any certificate to be used # when accessing Maven repositories through SSL # #org.ops4j.pax.url.mvn.certificateCheck= # # Path to the local Maven settings file. # The repositories defined in this file will be automatically added to the list # of default repositories if the 'org.ops4j.pax.url.mvn.repositories' property # below is not set. # The following locations are checked for the existence of the settings.xml file # * 1. looks for the specified url # * 2. if not found looks for ${user.home}/.m2/settings.xml # * 3. if not found looks for ${maven.home}/conf/settings.xml # * 4. if not found looks for ${M2_HOME}/conf/settings.xml # #org.ops4j.pax.url.mvn.settings= # # Path to the local Maven repository which is used to avoid downloading # artifacts when they already exist locally. # The value of this property will be extracted from the settings.xml file # above, or defaulted to: # System.getProperty( "user.home" ) + "/.m2/repository" # org.ops4j.pax.url.mvn.localRepository=${karaf.home}/${karaf.default.repository} # # Default this to false. It's just weird to use undocumented repos # org.ops4j.pax.url.mvn.useFallbackRepositories=false # # Uncomment if you don't wanna use the proxy settings # from the Maven conf/settings.xml file # # org.ops4j.pax.url.mvn.proxySupport=false # # Comma separated list of repositories scanned when resolving an artifact. 
# Those repositories will be checked before iterating through the # below list of repositories and even before the local repository # A repository url can be appended with zero or more of the following flags: # @snapshots : the repository contains snaphots # @noreleases : the repository does not contain any released artifacts # # The following property value will add the system folder as a repo. # org.ops4j.pax.url.mvn.defaultRepositories=\ file:${karaf.home}/${karaf.default.repository}@id=system.repository@snapshots,\ file:${karaf.data}/kar@id=kar.repository@multi@snapshots,\ file:${karaf.base}/${karaf.default.repository}@id=child.system.repository@snapshots # Use the default local repo (e.g.~/.m2/repository) as a "remote" repo #org.ops4j.pax.url.mvn.defaultLocalRepoAsRemote=false # # Comma separated list of repositories scanned when resolving an artifact. # The default list includes the following repositories: # http://repo1.maven.org/maven2@id=central # http://repository.springsource.com/maven/bundles/release@id=spring.ebr # http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external # http://zodiac.springsource.com/maven/bundles/release@id=gemini # http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases # https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases # https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases # To add repositories to the default ones, prepend '+' to the list of repositories # to add. # A repository url can be appended with zero or more of the following flags: # @snapshots : the repository contains snapshots # @noreleases : the repository does not contain any released artifacts # @id=repository.id : the id for the repository, just like in the settings.xml this is optional but recommended # org.ops4j.pax.url.mvn.repositories=https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot@id=opendaylight-snapshot@snapshots, https://nexus.opendaylight.org/content/repositories/public@id=opendaylight-mirror, http://repo1.maven.org/maven2@id=central, http://repository.springsource.com/maven/bundles/release@id=spring.ebr.release, http://repository.springsource.com/maven/bundles/external@id=spring.ebr.external, http://zodiac.springsource.com/maven/bundles/release@id=gemini, http://repository.apache.org/content/groups/snapshots-group@id=apache@snapshots@noreleases, https://oss.sonatype.org/content/repositories/snapshots@id=sonatype.snapshots.deploy@snapshots@noreleases, https://oss.sonatype.org/content/repositories/ops4j-snapshots@id=ops4j.sonatype.snapshots.deploy@snapshots@noreleases ### ^^^ No remote repositories. This is the only ODL change compared to Karaf defaults.Configuring the startup features... + [[ True == \T\r\u\e ]] + echo 'Configuring the startup features...' + sed -ie 's/\(featuresBoot=\|featuresBoot =\)/featuresBoot = odl-infrautils-ready,odl-jolokia,odl-daexim-all,odl-netconf-topology,odl-jolokia,/g' /tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg + FEATURE_TEST_STRING=features-test + FEATURE_TEST_VERSION=0.20.0 + KARAF_VERSION=karaf4 + [[ integration == \i\n\t\e\g\r\a\t\i\o\n ]] + sed -ie 's%\(featuresRepositories=\|featuresRepositories =\)%featuresRepositories = mvn:org.opendaylight.integration/features-test/0.20.0/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features,%g' /tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg + [[ ! 
-z '' ]] + cat /tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg ################################################################################ # # Licensed to the Apache Software Foundation (ASF) under one or more # contributor license agreements. See the NOTICE file distributed with # this work for additional information regarding copyright ownership. # The ASF licenses this file to You under the Apache License, Version 2.0 # (the "License"); you may not use this file except in compliance with # the License. You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # ################################################################################ # # Comma separated list of features repositories to register by default # featuresRepositories = mvn:org.opendaylight.integration/features-test/0.20.0/xml/features,mvn:org.apache.karaf.decanter/apache-karaf-decanter/1.2.0/xml/features, file:${karaf.etc}/2e2f7245-ee82-4414-908d-2febb6610f0e.xml # # Comma separated list of features to install at startup # featuresBoot = odl-infrautils-ready,odl-jolokia,odl-daexim-all,odl-netconf-topology,odl-jolokia, cb8f7765-cdca-452f-8af2-d48adc62ccfa # # Resource repositories (OBR) that the features resolver can use # to resolve requirements/capabilities # # The format of the resourceRepositories is # resourceRepositories=[xml:url|json:url],... # for Instance: # #resourceRepositories=xml:http://host/path/to/index.xml # or #resourceRepositories=json:http://host/path/to/index.json # # # Defines if the boot features are started in asynchronous mode (in a dedicated thread) # featuresBootAsynchronous=false # # Service requirements enforcement # # By default, the feature resolver checks the service requirements/capabilities of # bundles for new features (xml schema >= 1.3.0) in order to automatically installs # the required bundles. 
# The following flag can have those values: # - disable: service requirements are completely ignored # - default: service requirements are ignored for old features # - enforce: service requirements are always verified # #serviceRequirements=default # # Store cfg file for config element in feature # #configCfgStore=true # # Define if the feature service automatically refresh bundles # autoRefresh=true # # Configuration of features processing mechanism (overrides, blacklisting, modification of features) # XML file defines instructions related to features processing # versions.properties may declare properties to resolve placeholders in XML file # both files are relative to ${karaf.etc} # #featureProcessing=org.apache.karaf.features.xml #featureProcessingVersions=versions.properties + configure_karaf_log karaf4 '' + local -r karaf_version=karaf4 + local -r controllerdebugmap= + local logapi=log4j + grep log4j2 /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n log4j2.rootLogger.level = INFO #log4j2.rootLogger.type = asyncRoot #log4j2.rootLogger.includeLocation = false log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi log4j2.rootLogger.appenderRef.Console.ref = Console log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL log4j2.logger.spifly.name = org.apache.aries.spifly log4j2.logger.spifly.level = WARN log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit log4j2.logger.audit.level = INFO log4j2.logger.audit.additivity = false log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile # Console appender not used by default (see log4j2.rootLogger.appenderRefs) log4j2.appender.console.type = Console log4j2.appender.console.name = Console log4j2.appender.console.layout.type = PatternLayout log4j2.appender.console.layout.pattern = ${log4j2.pattern} log4j2.appender.rolling.type = RollingRandomAccessFile log4j2.appender.rolling.name = RollingFile log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i #log4j2.appender.rolling.immediateFlush = false log4j2.appender.rolling.append = true log4j2.appender.rolling.layout.type = PatternLayout log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} log4j2.appender.rolling.policies.type = Policies log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy log4j2.appender.rolling.policies.size.size = 64MB log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy log4j2.appender.rolling.strategy.max = 7 log4j2.appender.audit.type = RollingRandomAccessFile log4j2.appender.audit.name = AuditRollingFile log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log 
log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i log4j2.appender.audit.append = true log4j2.appender.audit.layout.type = PatternLayout log4j2.appender.audit.layout.pattern = ${log4j2.pattern} log4j2.appender.audit.policies.type = Policies log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy log4j2.appender.audit.policies.size.size = 8MB log4j2.appender.audit.strategy.type = DefaultRolloverStrategy log4j2.appender.audit.strategy.max = 7 log4j2.appender.osgi.type = PaxOsgi log4j2.appender.osgi.name = PaxOsgi log4j2.appender.osgi.filter = * #log4j2.logger.aether.name = shaded.org.eclipse.aether #log4j2.logger.aether.level = TRACE #log4j2.logger.http-headers.name = shaded.org.apache.http.headers #log4j2.logger.http-headers.level = DEBUG #log4j2.logger.maven.name = org.ops4j.pax.url.mvn #log4j2.logger.maven.level = TRACE Configuring the karaf log... karaf_version: karaf4, logapi: log4j2 + logapi=log4j2 + echo 'Configuring the karaf log... karaf_version: karaf4, logapi: log4j2' + '[' log4j2 == log4j2 ']' + sed -ie 's/log4j2.appender.rolling.policies.size.size = 64MB/log4j2.appender.rolling.policies.size.size = 1GB/g' /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg + orgmodule=org.opendaylight.yangtools.yang.parser.repo.YangTextSchemaContextResolver + orgmodule_=org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN' controllerdebugmap: + echo 'log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN' + unset IFS + echo 'controllerdebugmap: ' cat /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg + '[' -n '' ']' + echo 'cat /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg' + cat /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg ################################################################################ # # Licensed to the Apache Software Foundation (ASF) under one or more # contributor license agreements. See the NOTICE file distributed with # this work for additional information regarding copyright ownership. # The ASF licenses this file to You under the Apache License, Version 2.0 # (the "License"); you may not use this file except in compliance with # the License. You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. 
# ################################################################################ # Common pattern layout for appenders log4j2.pattern = %d{ISO8601} | %-5p | %-16t | %-32c{1} | %X{bundle.id} - %X{bundle.name} - %X{bundle.version} | %m%n # Root logger log4j2.rootLogger.level = INFO # uncomment to use asynchronous loggers, which require mvn:com.lmax/disruptor/3.3.2 library #log4j2.rootLogger.type = asyncRoot #log4j2.rootLogger.includeLocation = false log4j2.rootLogger.appenderRef.RollingFile.ref = RollingFile log4j2.rootLogger.appenderRef.PaxOsgi.ref = PaxOsgi log4j2.rootLogger.appenderRef.Console.ref = Console log4j2.rootLogger.appenderRef.Console.filter.threshold.type = ThresholdFilter log4j2.rootLogger.appenderRef.Console.filter.threshold.level = ${karaf.log.console:-OFF} # Filters for logs marked by org.opendaylight.odlparent.Markers log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.type = ContextMapFilter log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.type = KeyValuePair log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.key = slf4j.marker log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.pair1.value = CONFIDENTIAL log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.operator = or log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMatch = DENY log4j2.rootLogger.appenderRef.RollingFile.filter.confidential.onMismatch = NEUTRAL # Loggers configuration # Spifly logger log4j2.logger.spifly.name = org.apache.aries.spifly log4j2.logger.spifly.level = WARN # Security audit logger log4j2.logger.audit.name = org.apache.karaf.jaas.modules.audit log4j2.logger.audit.level = INFO log4j2.logger.audit.additivity = false log4j2.logger.audit.appenderRef.AuditRollingFile.ref = AuditRollingFile # Appenders configuration # Console appender not used by default (see log4j2.rootLogger.appenderRefs) log4j2.appender.console.type = Console log4j2.appender.console.name = Console log4j2.appender.console.layout.type = PatternLayout log4j2.appender.console.layout.pattern = ${log4j2.pattern} # Rolling file appender log4j2.appender.rolling.type = RollingRandomAccessFile log4j2.appender.rolling.name = RollingFile log4j2.appender.rolling.fileName = ${karaf.data}/log/karaf.log log4j2.appender.rolling.filePattern = ${karaf.data}/log/karaf.log.%i # uncomment to not force a disk flush #log4j2.appender.rolling.immediateFlush = false log4j2.appender.rolling.append = true log4j2.appender.rolling.layout.type = PatternLayout log4j2.appender.rolling.layout.pattern = ${log4j2.pattern} log4j2.appender.rolling.policies.type = Policies log4j2.appender.rolling.policies.size.type = SizeBasedTriggeringPolicy log4j2.appender.rolling.policies.size.size = 1GB log4j2.appender.rolling.strategy.type = DefaultRolloverStrategy log4j2.appender.rolling.strategy.max = 7 # Audit file appender log4j2.appender.audit.type = RollingRandomAccessFile log4j2.appender.audit.name = AuditRollingFile log4j2.appender.audit.fileName = ${karaf.data}/security/audit.log log4j2.appender.audit.filePattern = ${karaf.data}/security/audit.log.%i log4j2.appender.audit.append = true log4j2.appender.audit.layout.type = PatternLayout log4j2.appender.audit.layout.pattern = ${log4j2.pattern} log4j2.appender.audit.policies.type = Policies log4j2.appender.audit.policies.size.type = SizeBasedTriggeringPolicy log4j2.appender.audit.policies.size.size = 8MB log4j2.appender.audit.strategy.type = DefaultRolloverStrategy log4j2.appender.audit.strategy.max = 7 # OSGi appender log4j2.appender.osgi.type 
= PaxOsgi log4j2.appender.osgi.name = PaxOsgi log4j2.appender.osgi.filter = * # help with identification of maven-related problems with pax-url-aether #log4j2.logger.aether.name = shaded.org.eclipse.aether #log4j2.logger.aether.level = TRACE #log4j2.logger.http-headers.name = shaded.org.apache.http.headers #log4j2.logger.http-headers.level = DEBUG #log4j2.logger.maven.name = org.ops4j.pax.url.mvn #log4j2.logger.maven.level = TRACE log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.name = WARN log4j2.logger.org_opendaylight_yangtools_yang_parser_repo_YangTextSchemaContextResolver.level = WARN + set_java_vars /usr/lib/jvm/java-17-openjdk 2048m /tmp/karaf-0.20.0/bin/setenv + local -r java_home=/usr/lib/jvm/java-17-openjdk + local -r controllermem=2048m Configure java home: /usr/lib/jvm/java-17-openjdk max memory: 2048m memconf: /tmp/karaf-0.20.0/bin/setenv + local -r memconf=/tmp/karaf-0.20.0/bin/setenv + echo Configure + echo ' java home: /usr/lib/jvm/java-17-openjdk' + echo ' max memory: 2048m' + echo ' memconf: /tmp/karaf-0.20.0/bin/setenv' + sed -ie 's%^# export JAVA_HOME%export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-17-openjdk}%g' /tmp/karaf-0.20.0/bin/setenv + sed -ie 's/JAVA_MAX_MEM="2048m"/JAVA_MAX_MEM=2048m/g' /tmp/karaf-0.20.0/bin/setenv cat /tmp/karaf-0.20.0/bin/setenv + echo 'cat /tmp/karaf-0.20.0/bin/setenv' + cat /tmp/karaf-0.20.0/bin/setenv #!/bin/sh # # Licensed to the Apache Software Foundation (ASF) under one or more # contributor license agreements. See the NOTICE file distributed with # this work for additional information regarding copyright ownership. # The ASF licenses this file to You under the Apache License, Version 2.0 # (the "License"); you may not use this file except in compliance with # the License. You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law or agreed to in writing, software # distributed under the License is distributed on an "AS IS" BASIS, # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # See the License for the specific language governing permissions and # limitations under the License. # # # handle specific scripts; the SCRIPT_NAME is exactly the name of the Karaf # script: client, instance, shell, start, status, stop, karaf # # if [ "${KARAF_SCRIPT}" == "SCRIPT_NAME" ]; then # Actions go here... # fi # # general settings which should be applied for all scripts go here; please keep # in mind that it is possible that scripts might be executed more than once, e.g. # in example of the start script where the start script is executed first and the # karaf script afterwards. 
# # # The following section shows the possible configuration options for the default # karaf scripts # export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-17-openjdk} # Location of Java installation # export JAVA_OPTS # Generic JVM options, for instance, where you can pass the memory configuration # export JAVA_NON_DEBUG_OPTS # Additional non-debug JVM options # export EXTRA_JAVA_OPTS # Additional JVM options # export KARAF_HOME # Karaf home folder # export KARAF_DATA # Karaf data folder # export KARAF_BASE # Karaf base folder # export KARAF_ETC # Karaf etc folder # export KARAF_LOG # Karaf log folder # export KARAF_SYSTEM_OPTS # First citizen Karaf options # export KARAF_OPTS # Additional available Karaf options # export KARAF_DEBUG # Enable debug mode # export KARAF_REDIRECT # Enable/set the std/err redirection when using bin/start # export KARAF_NOROOT # Prevent execution as root if set to true Set Java version + echo 'Set Java version' + sudo /usr/sbin/alternatives --install /usr/bin/java java /usr/lib/jvm/java-17-openjdk/bin/java 1 + sudo /usr/sbin/alternatives --set java /usr/lib/jvm/java-17-openjdk/bin/java JDK default version ... + echo 'JDK default version ...' + java -version openjdk version "17.0.6-ea" 2023-01-17 LTS OpenJDK Runtime Environment (Red_Hat-17.0.6.0.9-0.3.ea.el8) (build 17.0.6-ea+9-LTS) OpenJDK 64-Bit Server VM (Red_Hat-17.0.6.0.9-0.3.ea.el8) (build 17.0.6-ea+9-LTS, mixed mode, sharing) Set JAVA_HOME + echo 'Set JAVA_HOME' + export JAVA_HOME=/usr/lib/jvm/java-17-openjdk + JAVA_HOME=/usr/lib/jvm/java-17-openjdk ++ readlink -e /usr/lib/jvm/java-17-openjdk/bin/java Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-17-openjdk-17.0.6.0.9-0.3.ea.el8.x86_64/bin/java Listing all open ports on controller system... + JAVA_RESOLVED=/usr/lib/jvm/java-17-openjdk-17.0.6.0.9-0.3.ea.el8.x86_64/bin/java + echo 'Java binary pointed at by JAVA_HOME: /usr/lib/jvm/java-17-openjdk-17.0.6.0.9-0.3.ea.el8.x86_64/bin/java' + echo 'Listing all open ports on controller system...' + netstat -pnatu (Not all processes could be identified, non-owned process info will not be shown, you would have to be root to see it all.) Active Internet connections (servers and established) Proto Recv-Q Send-Q Local Address Foreign Address State PID/Program name tcp 0 0 0.0.0.0:111 0.0.0.0:* LISTEN - tcp 0 0 0.0.0.0:22 0.0.0.0:* LISTEN - tcp 0 0 10.30.170.65:22 10.30.171.25:36896 ESTABLISHED - tcp 0 0 10.30.170.65:34328 199.204.45.87:443 TIME_WAIT - tcp6 0 0 :::111 :::* LISTEN - tcp6 0 0 :::5555 :::* LISTEN - tcp6 0 0 :::22 :::* LISTEN - udp 0 0 10.30.170.65:68 10.30.170.3:67 ESTABLISHED - udp 0 0 0.0.0.0:111 0.0.0.0:* - udp 0 0 127.0.0.1:323 0.0.0.0:* - udp6 0 0 :::111 :::* - udp6 0 0 ::1:323 :::* - Configuring cluster + '[' -f /tmp/custom_shard_config.txt ']' + echo 'Configuring cluster' + /tmp/karaf-0.20.0/bin/configure_cluster.sh 3 10.30.170.15 10.30.170.12 10.30.170.65 ################################################ ## Configure Cluster ## ################################################ NOTE: Cluster configuration files not found. 
Copying from /tmp/karaf-0.20.0/system/org/opendaylight/controller/sal-clustering-config/8.0.3 Configuring unique name in akka.conf Configuring hostname in akka.conf Configuring data and rpc seed nodes in akka.conf modules = [ { name = "inventory" namespace = "urn:opendaylight:inventory" shard-strategy = "module" }, { name = "topology" namespace = "urn:TBD:params:xml:ns:yang:network-topology" shard-strategy = "module" }, { name = "toaster" namespace = "http://netconfcentral.org/ns/toaster" shard-strategy = "module" } ] Configuring replication type in module-shards.conf ################################################ ## NOTE: Manually restart controller to ## ## apply configuration. ## ################################################ Dump akka.conf + echo 'Dump akka.conf' + cat /tmp/karaf-0.20.0/configuration/initial/akka.conf odl-cluster-data { akka { remote { artery { enabled = on transport = tcp canonical.hostname = "10.30.170.65" canonical.port = 2550 } } cluster { # Using artery. seed-nodes = ["akka://opendaylight-cluster-data@10.30.170.15:2550", "akka://opendaylight-cluster-data@10.30.170.12:2550", "akka://opendaylight-cluster-data@10.30.170.65:2550"] roles = ["member-3"] # when under load we might trip a false positive on the failure detector # failure-detector { # heartbeat-interval = 4 s # acceptable-heartbeat-pause = 16s # } } persistence { # By default the snapshots/journal directories live in KARAF_HOME. You can choose to put it somewhere else by # modifying the following two properties. The directory location specified may be a relative or absolute path. # The relative path is always relative to KARAF_HOME. # snapshot-store.local.dir = "target/snapshots" # Use lz4 compression for LocalSnapshotStore snapshots snapshot-store.local.use-lz4-compression = false # Size of blocks for lz4 compression: 64KB, 256KB, 1MB or 4MB snapshot-store.local.lz4-blocksize = 256KB } disable-default-actor-system-quarantined-event-handling = "false" } } Dump modules.conf + echo 'Dump modules.conf' + cat /tmp/karaf-0.20.0/configuration/initial/modules.conf modules = [ { name = "inventory" namespace = "urn:opendaylight:inventory" shard-strategy = "module" }, { name = "topology" namespace = "urn:TBD:params:xml:ns:yang:network-topology" shard-strategy = "module" }, { name = "toaster" namespace = "http://netconfcentral.org/ns/toaster" shard-strategy = "module" } ] Dump module-shards.conf + echo 'Dump module-shards.conf' + cat /tmp/karaf-0.20.0/configuration/initial/module-shards.conf module-shards = [ { name = "default" shards = [ { name = "default" replicas = ["member-1", "member-2", "member-3"] } ] }, { name = "inventory" shards = [ { name="inventory" replicas = ["member-1", "member-2", "member-3"] } ] }, { name = "topology" shards = [ { name="topology" replicas = ["member-1", "member-2", "member-3"] } ] }, { name = "toaster" shards = [ { name="toaster" replicas = ["member-1", "member-2", "member-3"] } ] } ] Locating config plan to use... Finished running config plans Starting member-1 with IP address 10.30.170.15 Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. Redirecting karaf console output to karaf_console.log Starting controller... start: Redirecting Karaf output to /tmp/karaf-0.20.0/data/log/karaf_console.log Starting member-2 with IP address 10.30.170.12 Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. 
Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. Redirecting karaf console output to karaf_console.log Starting controller... start: Redirecting Karaf output to /tmp/karaf-0.20.0/data/log/karaf_console.log Starting member-3 with IP address 10.30.170.65 Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. Redirecting karaf console output to karaf_console.log Starting controller... start: Redirecting Karaf output to /tmp/karaf-0.20.0/data/log/karaf_console.log [daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash /tmp/jenkins18005747922195878089.sh common-functions.sh is being sourced common-functions environment: MAVENCONF: /tmp/karaf-0.20.0/etc/org.ops4j.pax.url.mvn.cfg ACTUALFEATURES: FEATURESCONF: /tmp/karaf-0.20.0/etc/org.apache.karaf.features.cfg CUSTOMPROP: /tmp/karaf-0.20.0/etc/custom.properties LOGCONF: /tmp/karaf-0.20.0/etc/org.ops4j.pax.logging.cfg MEMCONF: /tmp/karaf-0.20.0/bin/setenv CONTROLLERMEM: 2048m AKKACONF: /tmp/karaf-0.20.0/configuration/initial/akka.conf MODULESCONF: /tmp/karaf-0.20.0/configuration/initial/modules.conf MODULESHARDSCONF: /tmp/karaf-0.20.0/configuration/initial/module-shards.conf SUITES: + echo '#################################################' ################################################# + echo '## Verify Cluster is UP ##' ## Verify Cluster is UP ## + echo '#################################################' ################################################# + create_post_startup_script + cat + copy_and_run_post_startup_script + seed_index=1 ++ seq 1 3 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") + CONTROLLERIP=ODL_SYSTEM_1_IP + echo 'Execute the post startup script on controller 10.30.170.15' Execute the post startup script on controller 10.30.170.15 + scp /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/post-startup-script.sh 10.30.170.15:/tmp/ Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. + ssh 10.30.170.15 'bash /tmp/post-startup-script.sh 1' Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. tcp6 0 0 :::8101 :::* LISTEN Waiting up to 3 minutes for controller to come up, checking every 5 seconds... 2024-01-23T07:33:06,908 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 209 - org.opendaylight.infrautils.ready-api - 6.0.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] Controller is UP 2024-01-23T07:33:06,908 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 209 - org.opendaylight.infrautils.ready-api - 6.0.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] Listing all open ports on controller system... (Not all processes could be identified, non-owned process info will not be shown, you would have to be root to see it all.) 
Active Internet connections (servers and established) Proto Recv-Q Send-Q Local Address Foreign Address State PID/Program name tcp 0 0 0.0.0.0:111 0.0.0.0:* LISTEN - tcp 0 0 0.0.0.0:22 0.0.0.0:* LISTEN - tcp 0 164 10.30.170.15:22 10.30.171.25:48714 ESTABLISHED - tcp 0 0 10.30.170.15:59336 199.204.45.87:443 TIME_WAIT - tcp6 0 0 :::8101 :::* LISTEN 5774/java tcp6 0 0 127.0.0.1:1099 :::* LISTEN 5774/java tcp6 0 0 :::111 :::* LISTEN - tcp6 0 0 127.0.0.1:46387 :::* LISTEN 5774/java tcp6 0 0 :::5555 :::* LISTEN - tcp6 0 0 :::8181 :::* LISTEN 5774/java tcp6 0 0 10.30.170.15:2550 :::* LISTEN 5774/java tcp6 0 0 :::22 :::* LISTEN - tcp6 0 0 127.0.0.1:44444 :::* LISTEN 5774/java tcp6 0 0 10.30.170.15:45218 10.30.170.65:2550 ESTABLISHED 5774/java tcp6 0 0 10.30.170.15:48412 10.30.170.12:2550 ESTABLISHED 5774/java tcp6 0 0 127.0.0.1:41968 127.0.0.1:44444 ESTABLISHED 5774/java tcp6 0 0 127.0.0.1:41952 127.0.0.1:44444 TIME_WAIT - tcp6 32 0 10.30.170.15:50190 199.204.45.87:443 CLOSE_WAIT 5774/java tcp6 0 0 10.30.170.15:2550 10.30.170.12:52874 ESTABLISHED 5774/java tcp6 0 0 127.0.0.1:44444 127.0.0.1:41968 ESTABLISHED 5774/java tcp6 0 0 10.30.170.15:2550 10.30.170.65:52476 ESTABLISHED 5774/java tcp6 0 0 10.30.170.15:45228 10.30.170.65:2550 ESTABLISHED 5774/java tcp6 0 0 127.0.0.1:56956 127.0.0.1:1099 TIME_WAIT - tcp6 0 0 10.30.170.15:2550 10.30.170.65:52460 ESTABLISHED 5774/java tcp6 0 0 10.30.170.15:48410 10.30.170.12:2550 ESTABLISHED 5774/java tcp6 0 0 10.30.170.15:2550 10.30.170.12:52884 ESTABLISHED 5774/java udp 0 0 10.30.170.15:68 10.30.170.3:67 ESTABLISHED - udp 0 0 0.0.0.0:111 0.0.0.0:* - udp 0 0 127.0.0.1:323 0.0.0.0:* - udp6 0 0 :::111 :::* - udp6 0 0 ::1:323 :::* - looking for "BindException: Address already in use" in log file looking for "server is unhealthy" in log file + '[' 1 == 0 ']' + for i in $(seq 1 "${NUM_ODL_SYSTEM}") + CONTROLLERIP=ODL_SYSTEM_2_IP + echo 'Execute the post startup script on controller 10.30.170.12' Execute the post startup script on controller 10.30.170.12 + scp /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/post-startup-script.sh 10.30.170.12:/tmp/ Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. + ssh 10.30.170.12 'bash /tmp/post-startup-script.sh 2' Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. tcp6 0 0 :::8101 :::* LISTEN Waiting up to 3 minutes for controller to come up, checking every 5 seconds... 2024-01-23T07:33:06,906 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 209 - org.opendaylight.infrautils.ready-api - 6.0.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] Controller is UP 2024-01-23T07:33:06,906 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 209 - org.opendaylight.infrautils.ready-api - 6.0.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] Listing all open ports on controller system... (Not all processes could be identified, non-owned process info will not be shown, you would have to be root to see it all.) 
Active Internet connections (servers and established) Proto Recv-Q Send-Q Local Address Foreign Address State PID/Program name tcp 0 0 0.0.0.0:111 0.0.0.0:* LISTEN - tcp 0 0 0.0.0.0:22 0.0.0.0:* LISTEN - tcp 0 164 10.30.170.12:22 10.30.171.25:34700 ESTABLISHED - tcp 0 0 10.30.170.12:50028 199.204.45.87:443 TIME_WAIT - tcp6 0 0 127.0.0.1:1099 :::* LISTEN 5764/java tcp6 0 0 :::111 :::* LISTEN - tcp6 0 0 127.0.0.1:35091 :::* LISTEN 5764/java tcp6 0 0 :::5555 :::* LISTEN - tcp6 0 0 :::8181 :::* LISTEN 5764/java tcp6 0 0 10.30.170.12:2550 :::* LISTEN 5764/java tcp6 0 0 :::22 :::* LISTEN - tcp6 0 0 127.0.0.1:44444 :::* LISTEN 5764/java tcp6 0 0 :::8101 :::* LISTEN 5764/java tcp6 0 0 10.30.170.12:2550 10.30.170.15:48410 ESTABLISHED 5764/java tcp6 0 0 10.30.170.12:53464 10.30.170.65:2550 ESTABLISHED 5764/java tcp6 0 0 10.30.170.12:2550 10.30.170.65:52940 ESTABLISHED 5764/java tcp6 0 0 10.30.170.12:2550 10.30.170.65:52926 ESTABLISHED 5764/java tcp6 0 0 10.30.170.12:52874 10.30.170.15:2550 ESTABLISHED 5764/java tcp6 0 0 10.30.170.12:52884 10.30.170.15:2550 ESTABLISHED 5764/java tcp6 32 0 10.30.170.12:48110 199.204.45.87:443 CLOSE_WAIT 5764/java tcp6 0 0 10.30.170.12:53470 10.30.170.65:2550 ESTABLISHED 5764/java tcp6 0 0 127.0.0.1:49806 127.0.0.1:1099 TIME_WAIT - tcp6 0 0 127.0.0.1:60748 127.0.0.1:44444 TIME_WAIT - tcp6 0 0 127.0.0.1:60756 127.0.0.1:44444 TIME_WAIT - tcp6 0 0 10.30.170.12:2550 10.30.170.15:48412 ESTABLISHED 5764/java udp 0 0 10.30.170.12:68 10.30.170.2:67 ESTABLISHED - udp 0 0 0.0.0.0:111 0.0.0.0:* - udp 0 0 127.0.0.1:323 0.0.0.0:* - udp6 0 0 :::111 :::* - udp6 0 0 ::1:323 :::* - looking for "BindException: Address already in use" in log file looking for "server is unhealthy" in log file + '[' 2 == 0 ']' + for i in $(seq 1 "${NUM_ODL_SYSTEM}") + CONTROLLERIP=ODL_SYSTEM_3_IP + echo 'Execute the post startup script on controller 10.30.170.65' Execute the post startup script on controller 10.30.170.65 + scp /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/post-startup-script.sh 10.30.170.65:/tmp/ Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. + ssh 10.30.170.65 'bash /tmp/post-startup-script.sh 3' Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. tcp6 0 0 :::8101 :::* LISTEN Waiting up to 3 minutes for controller to come up, checking every 5 seconds... 2024-01-23T07:33:06,735 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 209 - org.opendaylight.infrautils.ready-api - 6.0.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] Controller is UP 2024-01-23T07:33:06,735 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 209 - org.opendaylight.infrautils.ready-api - 6.0.4 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] Listing all open ports on controller system... (Not all processes could be identified, non-owned process info will not be shown, you would have to be root to see it all.) 
Active Internet connections (servers and established) Proto Recv-Q Send-Q Local Address Foreign Address State PID/Program name tcp 0 0 0.0.0.0:111 0.0.0.0:* LISTEN - tcp 0 0 0.0.0.0:22 0.0.0.0:* LISTEN - tcp 0 0 10.30.170.65:34328 199.204.45.87:443 TIME_WAIT - tcp 0 164 10.30.170.65:22 10.30.171.25:47468 ESTABLISHED - tcp6 0 0 127.0.0.1:37385 :::* LISTEN 5770/java tcp6 0 0 127.0.0.1:1099 :::* LISTEN 5770/java tcp6 0 0 :::111 :::* LISTEN - tcp6 0 0 :::5555 :::* LISTEN - tcp6 0 0 :::8181 :::* LISTEN 5770/java tcp6 0 0 10.30.170.65:2550 :::* LISTEN 5770/java tcp6 0 0 :::22 :::* LISTEN - tcp6 0 0 127.0.0.1:44444 :::* LISTEN 5770/java tcp6 0 0 :::8101 :::* LISTEN 5770/java tcp6 0 0 10.30.170.65:52460 10.30.170.15:2550 ESTABLISHED 5770/java tcp6 0 0 127.0.0.1:44068 127.0.0.1:44444 TIME_WAIT - tcp6 32 0 10.30.170.65:45720 199.204.45.87:443 CLOSE_WAIT 5770/java tcp6 0 0 10.30.170.65:2550 10.30.170.12:53470 ESTABLISHED 5770/java tcp6 0 0 127.0.0.1:44066 127.0.0.1:44444 TIME_WAIT - tcp6 0 0 127.0.0.1:59272 127.0.0.1:1099 TIME_WAIT - tcp6 0 0 10.30.170.65:52940 10.30.170.12:2550 ESTABLISHED 5770/java tcp6 0 0 10.30.170.65:52476 10.30.170.15:2550 ESTABLISHED 5770/java tcp6 0 0 10.30.170.65:2550 10.30.170.15:45218 ESTABLISHED 5770/java tcp6 0 0 10.30.170.65:52926 10.30.170.12:2550 ESTABLISHED 5770/java tcp6 0 0 10.30.170.65:2550 10.30.170.12:53464 ESTABLISHED 5770/java tcp6 0 0 10.30.170.65:2550 10.30.170.15:45228 ESTABLISHED 5770/java udp 0 0 10.30.170.65:68 10.30.170.3:67 ESTABLISHED - udp 0 0 0.0.0.0:111 0.0.0.0:* - udp 0 0 127.0.0.1:323 0.0.0.0:* - udp6 0 0 :::111 :::* - udp6 0 0 ::1:323 :::* - looking for "BindException: Address already in use" in log file looking for "server is unhealthy" in log file + '[' 0 == 0 ']' + seed_index=1 + dump_controller_threads ++ seq 1 3 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") + CONTROLLERIP=ODL_SYSTEM_1_IP + echo 'Let'\''s take the karaf thread dump' Let's take the karaf thread dump + ssh 10.30.170.15 'sudo ps aux' Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. ++ grep org.apache.karaf.main.Main /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/ps_before.log ++ grep -v grep ++ cut -f2 '-d ' ++ tr -s ' ' + pid=5774 + echo 'karaf main: org.apache.karaf.main.Main, pid:5774' karaf main: org.apache.karaf.main.Main, pid:5774 + ssh 10.30.170.15 '/usr/lib/jvm/java-17-openjdk/bin/jstack -l 5774' Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. + for i in $(seq 1 "${NUM_ODL_SYSTEM}") + CONTROLLERIP=ODL_SYSTEM_2_IP + echo 'Let'\''s take the karaf thread dump' Let's take the karaf thread dump + ssh 10.30.170.12 'sudo ps aux' Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. ++ grep org.apache.karaf.main.Main /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/ps_before.log ++ grep -v grep ++ cut -f2 '-d ' ++ tr -s ' ' + pid=5764 + echo 'karaf main: org.apache.karaf.main.Main, pid:5764' karaf main: org.apache.karaf.main.Main, pid:5764 + ssh 10.30.170.12 '/usr/lib/jvm/java-17-openjdk/bin/jstack -l 5764' Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. + for i in $(seq 1 "${NUM_ODL_SYSTEM}") + CONTROLLERIP=ODL_SYSTEM_3_IP + echo 'Let'\''s take the karaf thread dump' Let's take the karaf thread dump + ssh 10.30.170.65 'sudo ps aux' Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. 
++ tr -s ' ' ++ grep org.apache.karaf.main.Main /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/ps_before.log ++ grep -v grep ++ cut -f2 '-d ' + pid=5770 + echo 'karaf main: org.apache.karaf.main.Main, pid:5770' karaf main: org.apache.karaf.main.Main, pid:5770 + ssh 10.30.170.65 '/usr/lib/jvm/java-17-openjdk/bin/jstack -l 5770' Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. + '[' 0 -gt 0 ']' + echo 'Generating controller variables...' Generating controller variables... ++ seq 1 3 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") + CONTROLLERIP=ODL_SYSTEM_1_IP + odl_variables=' -v ODL_SYSTEM_1_IP:10.30.170.15' + for i in $(seq 1 "${NUM_ODL_SYSTEM}") + CONTROLLERIP=ODL_SYSTEM_2_IP + odl_variables=' -v ODL_SYSTEM_1_IP:10.30.170.15 -v ODL_SYSTEM_2_IP:10.30.170.12' + for i in $(seq 1 "${NUM_ODL_SYSTEM}") + CONTROLLERIP=ODL_SYSTEM_3_IP + odl_variables=' -v ODL_SYSTEM_1_IP:10.30.170.15 -v ODL_SYSTEM_2_IP:10.30.170.12 -v ODL_SYSTEM_3_IP:10.30.170.65' + echo 'Generating mininet variables...' Generating mininet variables... ++ seq 1 0 + get_test_suites SUITES + local __suite_list=SUITES + echo 'Locating test plan to use...' Locating test plan to use... + testplan_filepath=/w/workspace/daexim-csit-3node-clustering-basic-only-calcium/test/csit/testplans/daexim-clustering-basic-calcium.txt + '[' '!' -f /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/test/csit/testplans/daexim-clustering-basic-calcium.txt ']' + testplan_filepath=/w/workspace/daexim-csit-3node-clustering-basic-only-calcium/test/csit/testplans/daexim-clustering-basic.txt + '[' disabled '!=' disabled ']' + echo 'Changing the testplan path...' Changing the testplan path... + sed s:integration:/w/workspace/daexim-csit-3node-clustering-basic-only-calcium: /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/test/csit/testplans/daexim-clustering-basic.txt + cat testplan.txt test/csit/suites/daexim/010-special-export.robot test/csit/suites/daexim/110-cluster-local-export-basic.robot test/csit/suites/daexim/120-cluster-export-basic.robot test/csit/suites/daexim/130-cluster-import-basic.robot + '[' -z '' ']' ++ grep -E -v '(^[[:space:]]*#|^[[:space:]]*$)' testplan.txt ++ tr '\012' ' ' + suite_list='test/csit/suites/daexim/010-special-export.robot test/csit/suites/daexim/110-cluster-local-export-basic.robot test/csit/suites/daexim/120-cluster-export-basic.robot test/csit/suites/daexim/130-cluster-import-basic.robot ' + eval 'SUITES='\''test/csit/suites/daexim/010-special-export.robot test/csit/suites/daexim/110-cluster-local-export-basic.robot test/csit/suites/daexim/120-cluster-export-basic.robot test/csit/suites/daexim/130-cluster-import-basic.robot '\''' ++ SUITES='test/csit/suites/daexim/010-special-export.robot test/csit/suites/daexim/110-cluster-local-export-basic.robot test/csit/suites/daexim/120-cluster-export-basic.robot test/csit/suites/daexim/130-cluster-import-basic.robot ' + echo 'Starting Robot test suites test/csit/suites/daexim/010-special-export.robot test/csit/suites/daexim/110-cluster-local-export-basic.robot test/csit/suites/daexim/120-cluster-export-basic.robot test/csit/suites/daexim/130-cluster-import-basic.robot ...' Starting Robot test suites test/csit/suites/daexim/010-special-export.robot test/csit/suites/daexim/110-cluster-local-export-basic.robot test/csit/suites/daexim/120-cluster-export-basic.robot test/csit/suites/daexim/130-cluster-import-basic.robot ... 
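Condensed from the trace above (not part of the job output): the suite list handed to Robot is derived from the stream-independent testplan by rewriting the "integration" prefix to the workspace path and stripping comments and blank lines. A minimal sketch of that derivation, using the paths from this job:

WORKSPACE=/w/workspace/daexim-csit-3node-clustering-basic-only-calcium
testplan="${WORKSPACE}/test/csit/testplans/daexim-clustering-basic.txt"
sed "s:integration:${WORKSPACE}:" "${testplan}" > testplan.txt
# drop comment and blank lines, flatten to a single space-separated list
SUITES=$(grep -E -v '(^[[:space:]]*#|^[[:space:]]*$)' testplan.txt | tr '\012' ' ')
echo "Starting Robot test suites ${SUITES}..."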
+ robot -N daexim-clustering-basic.txt --removekeywords wuks -e exclude -e skip_if_calcium -v BUNDLEFOLDER:karaf-0.20.0 -v BUNDLE_URL:https://nexus.opendaylight.org/content/repositories//autorelease-7479/org/opendaylight/integration/karaf/0.20.0/karaf-0.20.0.zip -v CONTROLLER:10.30.170.15 -v CONTROLLER1:10.30.170.12 -v CONTROLLER2:10.30.170.65 -v CONTROLLER_USER:jenkins -v JAVA_HOME:/usr/lib/jvm/java-17-openjdk -v JDKVERSION:openjdk17 -v JENKINS_WORKSPACE:/w/workspace/daexim-csit-3node-clustering-basic-only-calcium -v MININET: -v MININET1: -v MININET2: -v MININET_USER:jenkins -v NEXUSURL_PREFIX:https://nexus.opendaylight.org -v NUM_ODL_SYSTEM:3 -v NUM_TOOLS_SYSTEM:0 -v ODL_STREAM:calcium -v ODL_SYSTEM_IP:10.30.170.15 -v ODL_SYSTEM_1_IP:10.30.170.15 -v ODL_SYSTEM_2_IP:10.30.170.12 -v ODL_SYSTEM_3_IP:10.30.170.65 -v ODL_SYSTEM_USER:jenkins -v TOOLS_SYSTEM_IP: -v TOOLS_SYSTEM_USER:jenkins -v USER_HOME:/home/jenkins -v IS_KARAF_APPL:True -v WORKSPACE:/tmp test/csit/suites/daexim/010-special-export.robot test/csit/suites/daexim/110-cluster-local-export-basic.robot test/csit/suites/daexim/120-cluster-export-basic.robot test/csit/suites/daexim/130-cluster-import-basic.robot
==============================================================================
daexim-clustering-basic.txt
==============================================================================
daexim-clustering-basic.txt.010-Special-Export :: Test suite for verifying ...
==============================================================================
Create and Cancel Export :: schedule and cancel export of a cluster | PASS |
------------------------------------------------------------------------------
Schedule Absolute Time Export With UTC :: Schedule export at a par... | PASS |
------------------------------------------------------------------------------
Schedule Absolute Time Export With Localtime :: Schedule export at... | PASS |
------------------------------------------------------------------------------
Schedule Absolute Time Export In Past :: Schedule export at a part... | PASS |
------------------------------------------------------------------------------
Create Module Exclude Export :: schedule export with exclude optio... | PASS |
------------------------------------------------------------------------------
Create Wildcard Exclude Export :: schedule export with wildstar ex... | PASS |
------------------------------------------------------------------------------
daexim-clustering-basic.txt.010-Special-Export :: Test suite for v... | PASS |
6 tests, 6 passed, 0 failed
==============================================================================
daexim-clustering-basic.txt.110-Cluster-Local-Export-Basic :: Test suite fo...
==============================================================================
Create Basic Local Export :: schedule a basic export/backup on a c...
[ WARN ] Keyword 'RequestsLibrary.To Json' is deprecated. Please use ${resp.json()} instead. Have a look at the improved HTML output as pretty printing replacement.
| PASS |
------------------------------------------------------------------------------
daexim-clustering-basic.txt.110-Cluster-Local-Export-Basic :: Test... | PASS |
1 test, 1 passed, 0 failed
==============================================================================
daexim-clustering-basic.txt.120-Cluster-Export-Basic :: Test suite for veri...
==============================================================================
Create Basic Export :: schedule a basic export/backup on a cluster...
[ WARN ] Keyword 'RequestsLibrary.To Json' is deprecated. Please use ${resp.json()} instead. Have a look at the improved HTML output as pretty printing replacement.
| PASS |
------------------------------------------------------------------------------
daexim-clustering-basic.txt.120-Cluster-Export-Basic :: Test suite... | PASS |
1 test, 1 passed, 0 failed
==============================================================================
daexim-clustering-basic.txt.130-Cluster-Import-Basic :: Test suite for veri...
==============================================================================
Create Basic Import :: schedule a basic import/restore with data c...
[ WARN ] Keyword 'RequestsLibrary.To Json' is deprecated. Please use ${resp.json()} instead. Have a look at the improved HTML output as pretty printing replacement.
| PASS |
------------------------------------------------------------------------------
daexim-clustering-basic.txt.130-Cluster-Import-Basic :: Test suite... | PASS |
1 test, 1 passed, 0 failed
==============================================================================
daexim-clustering-basic.txt | PASS |
9 tests, 9 passed, 0 failed
==============================================================================
Output: /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/output.xml
Log: /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/log.html
Report: /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/report.html
+ echo 'Examining the files in data/log and checking filesize'
Examining the files in data/log and checking filesize
+ ssh 10.30.170.15 'ls -altr /tmp/karaf-0.20.0/data/log/'
Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts.
total 964
drwxrwxr-x. 2 jenkins jenkins 48 Jan 23 07:32 .
-rw-rw-r--. 1 jenkins jenkins 1720 Jan 23 07:32 karaf_console.log
drwxrwxr-x. 9 jenkins jenkins 149 Jan 23 07:33 ..
-rw-rw-r--. 1 jenkins jenkins 683218 Jan 23 07:37 karaf.log
+ ssh 10.30.170.15 'du -hs /tmp/karaf-0.20.0/data/log/*'
Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts.
4.0K /tmp/karaf-0.20.0/data/log/karaf_console.log
960K /tmp/karaf-0.20.0/data/log/karaf.log
+ ssh 10.30.170.12 'ls -altr /tmp/karaf-0.20.0/data/log/'
Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts.
total 964
drwxrwxr-x. 2 jenkins jenkins 48 Jan 23 07:32 .
-rw-rw-r--. 1 jenkins jenkins 1720 Jan 23 07:32 karaf_console.log
drwxrwxr-x. 9 jenkins jenkins 149 Jan 23 07:33 ..
-rw-rw-r--. 1 jenkins jenkins 668418 Jan 23 07:37 karaf.log
+ ssh 10.30.170.12 'du -hs /tmp/karaf-0.20.0/data/log/*'
Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts.
4.0K /tmp/karaf-0.20.0/data/log/karaf_console.log 960K /tmp/karaf-0.20.0/data/log/karaf.log + ssh 10.30.170.65 'ls -altr /tmp/karaf-0.20.0/data/log/' Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. total 964 drwxrwxr-x. 2 jenkins jenkins 48 Jan 23 07:32 . -rw-rw-r--. 1 jenkins jenkins 1720 Jan 23 07:32 karaf_console.log drwxrwxr-x. 9 jenkins jenkins 149 Jan 23 07:33 .. -rw-rw-r--. 1 jenkins jenkins 819297 Jan 23 07:37 karaf.log + ssh 10.30.170.65 'du -hs /tmp/karaf-0.20.0/data/log/*' Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. 4.0K /tmp/karaf-0.20.0/data/log/karaf_console.log 960K /tmp/karaf-0.20.0/data/log/karaf.log + set +e ++ seq 1 3 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") + CONTROLLERIP=ODL_SYSTEM_1_IP + echo 'Let'\''s take the karaf thread dump again' Let's take the karaf thread dump again + ssh 10.30.170.15 'sudo ps aux' Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. ++ grep org.apache.karaf.main.Main /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/ps_after.log ++ grep -v grep ++ tr -s ' ' ++ cut -f2 '-d ' + pid=11285 + echo 'karaf main: org.apache.karaf.main.Main, pid:11285' karaf main: org.apache.karaf.main.Main, pid:11285 + ssh 10.30.170.15 '/usr/lib/jvm/java-17-openjdk/bin/jstack -l 11285' Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. + echo 'killing karaf process...' killing karaf process... + ssh 10.30.170.15 bash -c 'ps axf | grep karaf | grep -v grep | awk '\''{print "kill -9 " $1}'\'' | sh' Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. + for i in $(seq 1 "${NUM_ODL_SYSTEM}") + CONTROLLERIP=ODL_SYSTEM_2_IP + echo 'Let'\''s take the karaf thread dump again' Let's take the karaf thread dump again + ssh 10.30.170.12 'sudo ps aux' Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. ++ tr -s ' ' ++ grep org.apache.karaf.main.Main /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/ps_after.log ++ grep -v grep ++ cut -f2 '-d ' + pid=8164 + echo 'karaf main: org.apache.karaf.main.Main, pid:8164' karaf main: org.apache.karaf.main.Main, pid:8164 + ssh 10.30.170.12 '/usr/lib/jvm/java-17-openjdk/bin/jstack -l 8164' Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. + echo 'killing karaf process...' killing karaf process... + ssh 10.30.170.12 bash -c 'ps axf | grep karaf | grep -v grep | awk '\''{print "kill -9 " $1}'\'' | sh' Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. + for i in $(seq 1 "${NUM_ODL_SYSTEM}") + CONTROLLERIP=ODL_SYSTEM_3_IP + echo 'Let'\''s take the karaf thread dump again' Let's take the karaf thread dump again + ssh 10.30.170.65 'sudo ps aux' Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. ++ grep org.apache.karaf.main.Main /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/ps_after.log ++ grep -v grep ++ cut -f2 '-d ' ++ tr -s ' ' + pid=11145 + echo 'karaf main: org.apache.karaf.main.Main, pid:11145' karaf main: org.apache.karaf.main.Main, pid:11145 + ssh 10.30.170.65 '/usr/lib/jvm/java-17-openjdk/bin/jstack -l 11145' Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. + echo 'killing karaf process...' killing karaf process... + ssh 10.30.170.65 bash -c 'ps axf | grep karaf | grep -v grep | awk '\''{print "kill -9 " $1}'\'' | sh' Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. 
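The teardown above repeats the same per-node pattern: locate the karaf main PID from the saved ps listing, take a jstack dump, then kill anything matching karaf. A condensed sketch of that pattern, assuming key-based SSH and the Java 17 path shown in the log; the host array and output file names are illustrative:

    ODL_HOSTS=(10.30.170.15 10.30.170.12 10.30.170.65)   # example addresses from this run
    for i in "${!ODL_HOSTS[@]}"; do
        host="${ODL_HOSTS[$i]}"
        node=$((i + 1))
        ssh "$host" 'sudo ps aux' > "ps_after_${node}.log"
        # karaf main PID = second whitespace-separated field of the matching ps line
        pid=$(grep org.apache.karaf.main.Main "ps_after_${node}.log" | grep -v grep | tr -s ' ' | cut -d ' ' -f 2)
        ssh "$host" "/usr/lib/jvm/java-17-openjdk/bin/jstack -l ${pid}" > "karaf_${node}_${pid}_threads_after.log"
        # kill every process whose command line mentions karaf, as the job does
        ssh "$host" "ps axf | grep karaf | grep -v grep | awk '{print \"kill -9 \" \$1}' | sh"
    done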
+ sleep 5 ++ seq 1 3 + for i in $(seq 1 "${NUM_ODL_SYSTEM}") + CONTROLLERIP=ODL_SYSTEM_1_IP + echo 'Compressing karaf.log 1' Compressing karaf.log 1 + ssh 10.30.170.15 gzip --best /tmp/karaf-0.20.0/data/log/karaf.log Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. + echo 'Fetching compressed karaf.log 1' Fetching compressed karaf.log 1 + scp 10.30.170.15:/tmp/karaf-0.20.0/data/log/karaf.log.gz odl1_karaf.log.gz Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. + ssh 10.30.170.15 rm -f /tmp/karaf-0.20.0/data/log/karaf.log.gz Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. + scp 10.30.170.15:/tmp/karaf-0.20.0/data/log/karaf_console.log odl1_karaf_console.log Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. + ssh 10.30.170.15 rm -f /tmp/karaf-0.20.0/data/log/karaf_console.log Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. + echo 'Fetch GC logs' Fetch GC logs + mkdir -p gclogs-1 + scp '10.30.170.15:/tmp/karaf-0.20.0/data/log/*.log' gclogs-1/ Warning: Permanently added '10.30.170.15' (ECDSA) to the list of known hosts. scp: /tmp/karaf-0.20.0/data/log/*.log: No such file or directory + for i in $(seq 1 "${NUM_ODL_SYSTEM}") + CONTROLLERIP=ODL_SYSTEM_2_IP + echo 'Compressing karaf.log 2' Compressing karaf.log 2 + ssh 10.30.170.12 gzip --best /tmp/karaf-0.20.0/data/log/karaf.log Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. + echo 'Fetching compressed karaf.log 2' Fetching compressed karaf.log 2 + scp 10.30.170.12:/tmp/karaf-0.20.0/data/log/karaf.log.gz odl2_karaf.log.gz Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. + ssh 10.30.170.12 rm -f /tmp/karaf-0.20.0/data/log/karaf.log.gz Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. + scp 10.30.170.12:/tmp/karaf-0.20.0/data/log/karaf_console.log odl2_karaf_console.log Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. + ssh 10.30.170.12 rm -f /tmp/karaf-0.20.0/data/log/karaf_console.log Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. + echo 'Fetch GC logs' Fetch GC logs + mkdir -p gclogs-2 + scp '10.30.170.12:/tmp/karaf-0.20.0/data/log/*.log' gclogs-2/ Warning: Permanently added '10.30.170.12' (ECDSA) to the list of known hosts. scp: /tmp/karaf-0.20.0/data/log/*.log: No such file or directory + for i in $(seq 1 "${NUM_ODL_SYSTEM}") + CONTROLLERIP=ODL_SYSTEM_3_IP + echo 'Compressing karaf.log 3' Compressing karaf.log 3 + ssh 10.30.170.65 gzip --best /tmp/karaf-0.20.0/data/log/karaf.log Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. + echo 'Fetching compressed karaf.log 3' Fetching compressed karaf.log 3 + scp 10.30.170.65:/tmp/karaf-0.20.0/data/log/karaf.log.gz odl3_karaf.log.gz Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. + ssh 10.30.170.65 rm -f /tmp/karaf-0.20.0/data/log/karaf.log.gz Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. + scp 10.30.170.65:/tmp/karaf-0.20.0/data/log/karaf_console.log odl3_karaf_console.log Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. + ssh 10.30.170.65 rm -f /tmp/karaf-0.20.0/data/log/karaf_console.log Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. 
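Log collection for node 3 continues just below with the GC-log fetch; the per-node pattern that produced the output above and below is roughly the following (a sketch under the same path assumptions, not the job's exact script):

    host=10.30.170.65   # example: third controller
    node=3
    ssh "$host" gzip --best /tmp/karaf-0.20.0/data/log/karaf.log
    scp "$host":/tmp/karaf-0.20.0/data/log/karaf.log.gz "odl${node}_karaf.log.gz"
    ssh "$host" rm -f /tmp/karaf-0.20.0/data/log/karaf.log.gz
    scp "$host":/tmp/karaf-0.20.0/data/log/karaf_console.log "odl${node}_karaf_console.log"
    ssh "$host" rm -f /tmp/karaf-0.20.0/data/log/karaf_console.log
    mkdir -p "gclogs-${node}"
    # After karaf.log has been gzipped and karaf_console.log removed, nothing matches *.log,
    # so this scp reports "No such file or directory"; the job tolerates that and moves on.
    scp "$host:/tmp/karaf-0.20.0/data/log/*.log" "gclogs-${node}/" || true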
+ echo 'Fetch GC logs' Fetch GC logs + mkdir -p gclogs-3 + scp '10.30.170.65:/tmp/karaf-0.20.0/data/log/*.log' gclogs-3/ Warning: Permanently added '10.30.170.65' (ECDSA) to the list of known hosts. scp: /tmp/karaf-0.20.0/data/log/*.log: No such file or directory + echo 'Examine copied files' Examine copied files + ls -lt total 8144 drwxrwxr-x. 2 jenkins jenkins 6 Jan 23 07:38 gclogs-3 -rw-rw-r--. 1 jenkins jenkins 1720 Jan 23 07:38 odl3_karaf_console.log -rw-rw-r--. 1 jenkins jenkins 62854 Jan 23 07:38 odl3_karaf.log.gz drwxrwxr-x. 2 jenkins jenkins 6 Jan 23 07:38 gclogs-2 -rw-rw-r--. 1 jenkins jenkins 1720 Jan 23 07:38 odl2_karaf_console.log -rw-rw-r--. 1 jenkins jenkins 53570 Jan 23 07:38 odl2_karaf.log.gz drwxrwxr-x. 2 jenkins jenkins 6 Jan 23 07:38 gclogs-1 -rw-rw-r--. 1 jenkins jenkins 1720 Jan 23 07:38 odl1_karaf_console.log -rw-rw-r--. 1 jenkins jenkins 54252 Jan 23 07:38 odl1_karaf.log.gz -rw-rw-r--. 1 jenkins jenkins 163567 Jan 23 07:38 karaf_3_11145_threads_after.log -rw-rw-r--. 1 jenkins jenkins 16554 Jan 23 07:38 ps_after.log -rw-rw-r--. 1 jenkins jenkins 138016 Jan 23 07:38 karaf_2_8164_threads_after.log -rw-rw-r--. 1 jenkins jenkins 137462 Jan 23 07:37 karaf_1_11285_threads_after.log -rw-rw-r--. 1 jenkins jenkins 248234 Jan 23 07:37 report.html -rw-rw-r--. 1 jenkins jenkins 1052305 Jan 23 07:37 log.html -rw-rw-r--. 1 jenkins jenkins 5929923 Jan 23 07:37 output.xml -rw-rw-r--. 1 jenkins jenkins 220 Jan 23 07:33 testplan.txt -rw-rw-r--. 1 jenkins jenkins 125017 Jan 23 07:33 karaf_3_5770_threads_before.log -rw-rw-r--. 1 jenkins jenkins 16587 Jan 23 07:33 ps_before.log -rw-rw-r--. 1 jenkins jenkins 123899 Jan 23 07:33 karaf_2_5764_threads_before.log -rw-rw-r--. 1 jenkins jenkins 127058 Jan 23 07:33 karaf_1_5774_threads_before.log -rw-rw-r--. 1 jenkins jenkins 3043 Jan 23 07:32 post-startup-script.sh -rw-rw-r--. 1 jenkins jenkins 225 Jan 23 07:32 startup-script.sh -rw-rw-r--. 1 jenkins jenkins 3236 Jan 23 07:32 configuration-script.sh -rw-rw-r--. 1 jenkins jenkins 266 Jan 23 07:32 detect_variables.env -rw-rw-r--. 1 jenkins jenkins 86 Jan 23 07:32 set_variables.env -rw-rw-r--. 1 jenkins jenkins 310 Jan 23 07:32 slave_addresses.txt -rw-rw-r--. 1 jenkins jenkins 570 Jan 23 07:31 requirements.txt -rw-rw-r--. 1 jenkins jenkins 26 Jan 23 07:31 env.properties -rw-rw-r--. 1 jenkins jenkins 336 Jan 23 07:28 stack-parameters.yaml drwxrwxr-x. 7 jenkins jenkins 4096 Jan 23 07:27 test drwxrwxr-x. 2 jenkins jenkins 6 Jan 23 07:27 test@tmp + true [daexim-csit-3node-clustering-basic-only-calcium] $ /bin/sh /tmp/jenkins14137343011511467832.sh Cleaning up Robot installation... $ ssh-agent -k unset SSH_AUTH_SOCK; unset SSH_AGENT_PID; echo Agent pid 5336 killed; [ssh-agent] Stopped. Recording plot data Robot results publisher started... -Parsing output xml: Done! -Copying log files to build dir: Done! -Assigning results to build: ERROR: Build step failed with exception java.lang.NullPointerException Build step 'Publish Robot Framework test results' marked build as failure [PostBuildScript] - [INFO] Executing post build scripts. 
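The Robot results publisher above parsed output.xml successfully but then failed with a NullPointerException while assigning results to the build, which is what marks this otherwise green run as a failure. One quick way to confirm the results file itself is intact; xmllint (from libxml2) is an assumed availability here, not something this job runs:

    cd /w/workspace/daexim-csit-3node-clustering-basic-only-calcium
    xmllint --noout output.xml && echo "output.xml is well-formed XML"
    # Robot writes one <test> element per executed test case, so a rough count of tests:
    grep -c '<test ' output.xml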
[daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash /tmp/jenkins12387921924805924643.sh Archiving csit artifacts mv: cannot stat '*_1.png': No such file or directory mv: cannot stat '/tmp/odl1_*': No such file or directory mv: cannot stat '*_2.png': No such file or directory mv: cannot stat '/tmp/odl2_*': No such file or directory mv: cannot stat '*_3.png': No such file or directory mv: cannot stat '/tmp/odl3_*': No such file or directory % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 188 0 188 0 0 2540 0 --:--:-- --:--:-- --:--:-- 2540 Archive: robot-plugin.zip End-of-central-directory signature not found. Either this file is not a zipfile, or it constitutes one disk of a multi-part archive. In the latter case the central directory and zipfile comment will be found on the last disk(s) of this archive. unzip: cannot find zipfile directory in one of robot-plugin.zip or robot-plugin.zip.zip, and cannot find robot-plugin.zip.ZIP, period. mv: cannot stat '*.log.gz': No such file or directory mv: cannot stat '*.csv': No such file or directory mv: cannot stat '*.png': No such file or directory [PostBuildScript] - [INFO] Executing post build scripts. [daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash /tmp/jenkins12032562541253027879.sh [PostBuildScript] - [INFO] Executing post build scripts. [EnvInject] - Injecting environment variables from a build step. [EnvInject] - Injecting as environment variables the properties content OS_CLOUD=vex OS_STACK_NAME=releng-daexim-csit-3node-clustering-basic-only-calcium-116 [EnvInject] - Variables injected successfully. provisioning config files... copy managed file [clouds-yaml] to file:/home/jenkins/.config/openstack/clouds.yaml [daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash /tmp/jenkins12458137015614557966.sh ---> openstack-stack-delete.sh Setup pyenv: system 3.8.13 3.9.13 * 3.10.6 (set by /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/.python-version) lf-activate-venv(): INFO: Reuse venv:/tmp/venv-SlGu from file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Installing: lftools[openstack] kubernetes python-heatclient python-openstackclient lf-activate-venv(): INFO: Adding /tmp/venv-SlGu/bin to PATH INFO: Retrieving stack cost for: releng-daexim-csit-3node-clustering-basic-only-calcium-116 DEBUG: Successfully retrieved stack cost: total: 0.33 INFO: Deleting stack releng-daexim-csit-3node-clustering-basic-only-calcium-116 Successfully deleted stack releng-daexim-csit-3node-clustering-basic-only-calcium-116 [PostBuildScript] - [INFO] Executing post build scripts. 
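The robot-plugin.zip fetched above is only 188 bytes and is not a valid zip archive, which is why unzip rejects it; a small guard that would surface that before extraction is attempted (a suggestion, not part of the archiving script):

    # Validate a downloaded archive before trying to unpack it.
    if unzip -tq robot-plugin.zip > /dev/null 2>&1; then
        unzip -o robot-plugin.zip -d robot-plugin/
    else
        echo "robot-plugin.zip is not a valid zip archive:" >&2
        file robot-plugin.zip >&2
    fi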
[daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash /tmp/jenkins1835388918367910397.sh ---> sysstat.sh [daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash /tmp/jenkins2892496830481723414.sh ---> package-listing.sh ++ facter osfamily ++ tr '[:upper:]' '[:lower:]' + OS_FAMILY=redhat + workspace=/w/workspace/daexim-csit-3node-clustering-basic-only-calcium + START_PACKAGES=/tmp/packages_start.txt + END_PACKAGES=/tmp/packages_end.txt + DIFF_PACKAGES=/tmp/packages_diff.txt + PACKAGES=/tmp/packages_start.txt + '[' /w/workspace/daexim-csit-3node-clustering-basic-only-calcium ']' + PACKAGES=/tmp/packages_end.txt + case "${OS_FAMILY}" in + rpm -qa + sort + '[' -f /tmp/packages_start.txt ']' + '[' -f /tmp/packages_end.txt ']' + diff /tmp/packages_start.txt /tmp/packages_end.txt + '[' /w/workspace/daexim-csit-3node-clustering-basic-only-calcium ']' + mkdir -p /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/archives/ + cp -f /tmp/packages_diff.txt /tmp/packages_end.txt /tmp/packages_start.txt /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/archives/ [daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash /tmp/jenkins4862088679626191040.sh ---> capture-instance-metadata.sh Setup pyenv: system 3.8.13 3.9.13 * 3.10.6 (set by /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/.python-version) lf-activate-venv(): INFO: Reuse venv:/tmp/venv-SlGu from file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Installing: lftools lf-activate-venv(): INFO: Adding /tmp/venv-SlGu/bin to PATH INFO: Running in OpenStack, capturing instance metadata [daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash /tmp/jenkins3519324945038285874.sh provisioning config files... Could not find credentials [logs] for daexim-csit-3node-clustering-basic-only-calcium #116 copy managed file [jenkins-log-archives-settings] to file:/w/workspace/daexim-csit-3node-clustering-basic-only-calcium@tmp/config236345943501787032tmp Regular expression run condition: Expression=[^.*logs-s3.*], Label=[odl-logs-s3-cloudfront-index] Run condition [Regular expression match] enabling perform for step [Provide Configuration files] provisioning config files... copy managed file [jenkins-s3-log-ship] to file:/home/jenkins/.aws/credentials [EnvInject] - Injecting environment variables from a build step. [EnvInject] - Injecting as environment variables the properties content SERVER_ID=logs [EnvInject] - Variables injected successfully. [daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash /tmp/jenkins13566403704243178341.sh ---> create-netrc.sh WARN: Log server credential not found. [daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash /tmp/jenkins13811783294473058840.sh ---> python-tools-install.sh Setup pyenv: system 3.8.13 3.9.13 * 3.10.6 (set by /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/.python-version) lf-activate-venv(): INFO: Reuse venv:/tmp/venv-SlGu from file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Installing: lftools lf-activate-venv(): INFO: Adding /tmp/venv-SlGu/bin to PATH [daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash /tmp/jenkins7677131581402798492.sh ---> sudo-logs.sh Archiving 'sudo' log.. 
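package-listing.sh above just diffs two sorted rpm listings taken at the start and end of the job and archives all three files; condensed, the logic is roughly:

    START=/tmp/packages_start.txt
    END=/tmp/packages_end.txt
    DIFF=/tmp/packages_diff.txt
    rpm -qa | sort > "$END"
    # Only produce a diff when both snapshots exist; diff exits non-zero when they differ.
    if [ -f "$START" ] && [ -f "$END" ]; then
        diff "$START" "$END" > "$DIFF" || true
    fi
    mkdir -p archives/
    cp -f "$DIFF" "$END" "$START" archives/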
[daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash /tmp/jenkins10047072069481274290.sh ---> job-cost.sh Setup pyenv: system 3.8.13 3.9.13 * 3.10.6 (set by /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/.python-version) lf-activate-venv(): INFO: Reuse venv:/tmp/venv-SlGu from file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Installing: zipp==1.1.0 python-openstackclient urllib3~=1.26.15 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. lftools 0.37.8 requires openstacksdk<1.5.0, but you have openstacksdk 2.1.0 which is incompatible. lf-activate-venv(): INFO: Adding /tmp/venv-SlGu/bin to PATH DEBUG: total: 0.33 INFO: Retrieving Stack Cost... INFO: Retrieving Pricing Info for: v3-standard-2 INFO: Archiving Costs [daexim-csit-3node-clustering-basic-only-calcium] $ /bin/bash -l /tmp/jenkins6309719915516154890.sh ---> logs-deploy.sh Setup pyenv: system 3.8.13 3.9.13 * 3.10.6 (set by /w/workspace/daexim-csit-3node-clustering-basic-only-calcium/.python-version) lf-activate-venv(): INFO: Reuse venv:/tmp/venv-SlGu from file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Installing: lftools ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. python-openstackclient 6.4.0 requires openstacksdk>=2.0.0, but you have openstacksdk 1.4.0 which is incompatible. lf-activate-venv(): INFO: Adding /tmp/venv-SlGu/bin to PATH WARNING: Nexus logging server not set INFO: S3 path logs/releng/vex-yul-odl-jenkins-1/daexim-csit-3node-clustering-basic-only-calcium/116/ INFO: archiving logs to S3 ---> uname -a: Linux prd-centos8-robot-2c-8g-19201.novalocal 4.18.0-532.el8.x86_64 #1 SMP Thu Dec 21 12:11:59 UTC 2023 x86_64 x86_64 x86_64 GNU/Linux ---> lscpu: Architecture: x86_64 CPU op-mode(s): 32-bit, 64-bit Byte Order: Little Endian CPU(s): 2 On-line CPU(s) list: 0,1 Thread(s) per core: 1 Core(s) per socket: 1 Socket(s): 2 NUMA node(s): 1 Vendor ID: AuthenticAMD CPU family: 23 Model: 49 Model name: AMD EPYC-Rome Processor Stepping: 0 CPU MHz: 2800.000 BogoMIPS: 5600.00 Virtualization: AMD-V Hypervisor vendor: KVM Virtualization type: full L1d cache: 32K L1i cache: 32K L2 cache: 512K L3 cache: 16384K NUMA node0 CPU(s): 0,1 Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 xsaves clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities ---> nproc: 2 ---> df -h: Filesystem Size Used Avail Use% Mounted on devtmpfs 3.8G 0 3.8G 0% /dev tmpfs 3.8G 0 3.8G 0% /dev/shm tmpfs 3.8G 17M 3.8G 1% /run tmpfs 3.8G 0 3.8G 0% /sys/fs/cgroup /dev/vda1 40G 9.4G 31G 24% / tmpfs 770M 0 770M 0% /run/user/1001 ---> free -m: total used free shared buff/cache available Mem: 7697 582 5004 19 2110 6818 Swap: 1023 0 1023 ---> ip addr: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo 
valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 1458 qdisc mq state UP group default qlen 1000 link/ether fa:16:3e:34:43:17 brd ff:ff:ff:ff:ff:ff altname enp0s3 altname ens3 inet 10.30.171.25/23 brd 10.30.171.255 scope global dynamic noprefixroute eth0 valid_lft 85580sec preferred_lft 85580sec inet6 fe80::f816:3eff:fe34:4317/64 scope link valid_lft forever preferred_lft forever ---> sar -b -r -n DEV: Linux 4.18.0-532.el8.x86_64 (centos-stream-8-robot-f84d986f-3f43-4447-86d7-70e5891bac13.noval) 01/23/2024 _x86_64_ (2 CPU) 07:26:01 LINUX RESTART (2 CPU) 07:27:01 AM tps rtps wtps bread/s bwrtn/s 07:28:01 AM 70.04 13.95 56.10 2500.90 5625.31 07:29:01 AM 93.50 0.08 93.42 2.27 8250.04 07:30:01 AM 33.71 0.58 33.13 26.26 3941.08 07:31:01 AM 33.66 0.82 32.84 124.63 2187.05 07:32:01 AM 36.87 0.17 36.70 5.86 2544.92 07:33:01 AM 39.59 0.03 39.56 3.47 5891.62 07:34:01 AM 6.75 0.02 6.73 0.13 318.23 07:35:01 AM 1.90 0.00 1.90 0.00 135.60 07:36:01 AM 0.55 0.00 0.55 0.00 58.29 07:37:01 AM 0.17 0.00 0.17 0.00 15.06 07:38:01 AM 0.25 0.03 0.22 1.87 10.56 07:39:01 AM 20.90 0.32 20.58 44.13 859.57 Average: 28.16 1.33 26.83 225.80 2486.55 07:27:01 AM kbmemfree kbavail kbmemused %memused kbbuffers kbcached kbcommit %commit kbactive kbinact kbdirty 07:28:01 AM 5348484 7018376 2533972 32.15 2688 1859220 751748 8.42 203772 2034684 105928 07:29:01 AM 5237840 7039292 2644616 33.55 2688 1978216 693668 7.77 186904 2115304 88388 07:30:01 AM 5223964 7038952 2658492 33.73 2688 1991300 687508 7.70 194732 2118800 47344 07:31:01 AM 5255356 7074484 2627100 33.33 2688 1996304 657976 7.37 214292 2069120 4 07:32:01 AM 4956796 6973272 2925660 37.12 2688 2185580 737260 8.26 230428 2323664 159592 07:33:01 AM 5033544 7041668 2848912 36.14 2688 2177856 660544 7.40 232580 2240592 16 07:34:01 AM 4995472 7004896 2886984 36.63 2688 2179212 712420 7.98 232756 2280508 652 07:35:01 AM 4988852 7001164 2893604 36.71 2688 2182060 712420 7.98 232760 2287336 1276 07:36:01 AM 4988412 7001068 2894044 36.72 2688 2182436 712420 7.98 232760 2287808 148 07:37:01 AM 4988212 7001224 2894244 36.72 2688 2182776 712420 7.98 232760 2288028 104 07:38:01 AM 5028992 7045148 2853464 36.20 2688 2185904 658084 7.37 233148 2247228 2920 07:39:01 AM 5111116 6965224 2771340 35.16 2688 2028928 749240 8.39 429976 1991976 26260 Average: 5096420 7017064 2786036 35.34 2688 2094149 703809 7.88 238072 2190421 36053 07:27:01 AM IFACE rxpck/s txpck/s rxkB/s txkB/s rxcmp/s txcmp/s rxmcst/s %ifutil 07:28:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 07:28:01 AM eth0 162.08 117.56 1086.79 33.89 0.00 0.00 0.00 0.00 07:29:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 07:29:01 AM eth0 36.12 29.45 344.85 6.37 0.00 0.00 0.00 0.00 07:30:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 07:30:01 AM eth0 14.05 10.20 144.63 2.05 0.00 0.00 0.00 0.00 07:31:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 07:31:01 AM eth0 22.66 14.68 6.65 4.47 0.00 0.00 0.00 0.00 07:32:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 07:32:01 AM eth0 34.27 30.89 311.35 6.91 0.00 0.00 0.00 0.00 07:33:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 07:33:01 AM eth0 595.87 510.78 122.72 106.38 0.00 0.00 0.00 0.00 07:34:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 07:34:01 AM eth0 31.16 24.96 14.45 5.30 0.00 0.00 0.00 0.00 07:35:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 07:35:01 AM eth0 32.61 33.52 10.00 7.15 0.00 0.00 0.00 0.00 07:36:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 07:36:01 AM eth0 12.53 10.48 2.67 2.16 0.00 
0.00 0.00 0.00 07:37:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 07:37:01 AM eth0 9.41 9.40 2.04 1.80 0.00 0.00 0.00 0.00 07:38:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 07:38:01 AM eth0 28.86 22.63 14.15 4.87 0.00 0.00 0.00 0.00 07:39:01 AM lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 07:39:01 AM eth0 46.14 37.58 69.08 33.03 0.00 0.00 0.00 0.00 Average: lo 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 Average: eth0 85.47 71.01 177.46 17.86 0.00 0.00 0.00 0.00 ---> sar -P ALL: Linux 4.18.0-532.el8.x86_64 (centos-stream-8-robot-f84d986f-3f43-4447-86d7-70e5891bac13.noval) 01/23/2024 _x86_64_ (2 CPU) 07:26:01 LINUX RESTART (2 CPU) 07:27:01 AM CPU %user %nice %system %iowait %steal %idle 07:28:01 AM all 27.88 0.00 4.56 6.14 0.10 61.32 07:28:01 AM 0 33.83 0.00 4.33 6.08 0.10 55.66 07:28:01 AM 1 21.93 0.00 4.80 6.20 0.10 66.97 07:29:01 AM all 30.24 0.00 4.47 7.26 0.13 57.90 07:29:01 AM 0 28.02 0.00 4.30 7.02 0.13 60.53 07:29:01 AM 1 32.46 0.00 4.63 7.50 0.13 55.27 07:30:01 AM all 11.42 0.00 1.47 13.41 0.07 73.63 07:30:01 AM 0 9.64 0.00 1.85 20.79 0.05 67.67 07:30:01 AM 1 13.21 0.00 1.09 6.03 0.08 79.59 07:31:01 AM all 12.24 0.00 1.57 1.12 0.06 85.01 07:31:01 AM 0 23.01 0.00 1.82 1.05 0.05 74.06 07:31:01 AM 1 1.44 0.00 1.32 1.19 0.07 95.98 07:32:01 AM all 24.65 0.00 3.78 3.23 0.09 68.25 07:32:01 AM 0 38.35 0.00 4.77 3.44 0.12 53.33 07:32:01 AM 1 10.95 0.00 2.80 3.02 0.07 83.16 07:33:01 AM all 8.76 0.00 2.08 1.23 0.06 87.86 07:33:01 AM 0 10.09 0.00 2.01 1.01 0.07 86.83 07:33:01 AM 1 7.44 0.00 2.15 1.46 0.05 88.91 07:34:01 AM all 7.47 0.00 0.48 0.13 0.06 91.87 07:34:01 AM 0 8.31 0.00 0.52 0.23 0.05 90.89 07:34:01 AM 1 6.64 0.00 0.44 0.02 0.07 92.84 07:35:01 AM all 15.21 0.00 0.55 0.03 0.07 84.15 07:35:01 AM 0 17.08 0.00 0.35 0.00 0.07 82.50 07:35:01 AM 1 13.35 0.00 0.74 0.05 0.07 85.79 07:36:01 AM all 4.74 0.00 0.20 0.00 0.05 95.01 07:36:01 AM 0 3.04 0.00 0.20 0.00 0.03 96.72 07:36:01 AM 1 6.44 0.00 0.20 0.00 0.07 93.29 07:37:01 AM all 4.42 0.00 0.19 0.00 0.07 95.32 07:37:01 AM 0 5.31 0.00 0.20 0.00 0.07 94.43 07:37:01 AM 1 3.53 0.00 0.18 0.00 0.07 96.22 07:38:01 AM all 7.14 0.00 0.33 0.01 0.06 92.46 07:38:01 AM 0 10.90 0.00 0.35 0.02 0.07 88.66 07:38:01 AM 1 3.38 0.00 0.30 0.00 0.05 96.27 07:38:01 AM CPU %user %nice %system %iowait %steal %idle 07:39:01 AM all 24.26 0.00 2.79 0.32 0.08 72.55 07:39:01 AM 0 30.93 0.00 2.69 0.28 0.08 66.02 07:39:01 AM 1 17.59 0.00 2.89 0.35 0.08 79.09 Average: all 14.88 0.00 1.88 2.74 0.07 80.42 Average: 0 18.22 0.00 1.95 3.33 0.07 76.42 Average: 1 11.54 0.00 1.80 2.16 0.08 84.43
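The host summary appended above (uname, lscpu, df, free, ip addr, and the sar tables) is a standard end-of-job snapshot; it can be reproduced on a sysstat-enabled host with a handful of commands, assuming the default sadc data collection is running so sar has samples to report:

    uname -a
    lscpu
    nproc
    df -h
    free -m
    ip addr
    # I/O, memory and network-device history for the day, then per-CPU utilisation.
    sar -b -r -n DEV
    sar -P ALL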