Triggered by Gerrit: https://git.opendaylight.org/gerrit/c/transportpce/+/117008 Running as SYSTEM [EnvInject] - Loading node environment variables. Building remotely on prd-ubuntu2204-docker-4c-16g-11877 (ubuntu2204-docker-4c-16g) in workspace /w/workspace/transportpce-tox-verify-transportpce-master [ssh-agent] Looking for ssh-agent implementation... [ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine) $ ssh-agent SSH_AUTH_SOCK=/tmp/ssh-XXXXXXMwx9tG/agent.1567 SSH_AGENT_PID=1569 [ssh-agent] Started. Running ssh-add (command line suppressed) Identity added: /w/workspace/transportpce-tox-verify-transportpce-master@tmp/private_key_7331332191867088970.key (/w/workspace/transportpce-tox-verify-transportpce-master@tmp/private_key_7331332191867088970.key) [ssh-agent] Using credentials jenkins (jenkins-ssh) The recommended git tool is: NONE using credential jenkins-ssh Wiping out workspace first. Cloning the remote Git repository Cloning repository git://devvexx.opendaylight.org/mirror/transportpce > git init /w/workspace/transportpce-tox-verify-transportpce-master # timeout=10 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/transportpce > git --version # timeout=10 > git --version # 'git version 2.34.1' using GIT_SSH to set credentials jenkins-ssh Verifying host key using known hosts file You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification. > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/transportpce +refs/heads/*:refs/remotes/origin/* # timeout=10 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/transportpce # timeout=10 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/transportpce # timeout=10 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/transportpce using GIT_SSH to set credentials jenkins-ssh Verifying host key using known hosts file You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification. > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/transportpce refs/changes/08/117008/23 # timeout=10 > git rev-parse 2428cba03f3dfe9ac7d4235aa37165325aaf8b49^{commit} # timeout=10 JENKINS-19022: warning: possible memory leak due to Git plugin usage; see: https://plugins.jenkins.io/git/#remove-git-plugin-buildsbybranch-builddata-script Checking out Revision 2428cba03f3dfe9ac7d4235aa37165325aaf8b49 (refs/changes/08/117008/23) > git config core.sparsecheckout # timeout=10 > git checkout -f 2428cba03f3dfe9ac7d4235aa37165325aaf8b49 # timeout=10 Commit message: "Consolidation of TAPI PCE & func tests" > git rev-parse FETCH_HEAD^{commit} # timeout=10 > git rev-list --no-walk 57e50e05598fe5e4d8a076b20dcf6eb0e1925c45 # timeout=10 > git remote # timeout=10 > git submodule init # timeout=10 > git submodule sync # timeout=10 > git config --get remote.origin.url # timeout=10 > git submodule init # timeout=10 > git config -f .gitmodules --get-regexp ^submodule\.(.+)\.url # timeout=10 ERROR: No submodules found. provisioning config files... 
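(Editor's note, not part of the console output.) The block above shows the job fetching the Gerrit patchset ref refs/changes/08/117008/23 from the transportpce mirror and checking out the corresponding commit. Below is a minimal Python sketch of that fetch-and-checkout flow; the workspace path is hypothetical, the mirror URL and ref are taken from the log, and the real job drives git through the Jenkins Git plugin rather than a script like this.

import subprocess

REPO_URL = "git://devvexx.opendaylight.org/mirror/transportpce"  # mirror URL from the log
CHANGE_REF = "refs/changes/08/117008/23"                         # Gerrit patchset ref from the log
WORKSPACE = "/tmp/transportpce-checkout"                         # hypothetical local path

def git(*args):
    # Echo and run a git command, failing fast on error, as the job effectively does.
    print("+ git " + " ".join(args))
    subprocess.run(["git", *args], check=True)

git("init", WORKSPACE)
git("-C", WORKSPACE, "fetch", REPO_URL, CHANGE_REF)
# FETCH_HEAD now points at the patchset; check it out detached, mirroring `git checkout -f <sha>`.
git("-C", WORKSPACE, "checkout", "-f", "FETCH_HEAD")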
copy managed file [npmrc] to file:/home/jenkins/.npmrc copy managed file [pipconf] to file:/home/jenkins/.config/pip/pip.conf [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins9061938855269453474.sh ---> python-tools-install.sh Setup pyenv: * system (set by /opt/pyenv/version) * 3.8.20 (set by /opt/pyenv/version) * 3.9.20 (set by /opt/pyenv/version) 3.10.15 3.11.10 lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-l63k lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) lf-activate-venv(): INFO: Attempting to install with network-safe options... lf-activate-venv(): INFO: Base packages installed successfully lf-activate-venv(): INFO: Installing additional packages: lftools lf-activate-venv(): INFO: Adding /tmp/venv-l63k/bin to PATH Generating Requirements File ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. httplib2 0.31.0 requires pyparsing<4,>=3.0.4, but you have pyparsing 2.4.7 which is incompatible. Python 3.11.10 pip 25.3 from /tmp/venv-l63k/lib/python3.11/site-packages/pip (python 3.11) appdirs==1.4.4 argcomplete==3.6.3 aspy.yaml==1.3.0 attrs==25.4.0 autopage==0.5.2 beautifulsoup4==4.14.2 boto3==1.40.76 botocore==1.40.76 bs4==0.0.2 cachetools==6.2.2 certifi==2025.11.12 cffi==2.0.0 cfgv==3.4.0 chardet==5.2.0 charset-normalizer==3.4.4 click==8.3.1 cliff==4.11.0 cmd2==2.7.0 cryptography==3.3.2 debtcollector==3.0.0 decorator==5.2.1 defusedxml==0.7.1 Deprecated==1.3.1 distlib==0.4.0 dnspython==2.8.0 docker==7.1.0 dogpile.cache==1.5.0 durationpy==0.10 email-validator==2.3.0 filelock==3.20.0 future==1.0.0 gitdb==4.0.12 GitPython==3.1.45 google-auth==2.43.0 httplib2==0.31.0 identify==2.6.15 idna==3.11 importlib-resources==1.5.0 iso8601==2.1.0 Jinja2==3.1.6 jmespath==1.0.1 jsonpatch==1.33 jsonpointer==3.0.0 jsonschema==4.25.1 jsonschema-specifications==2025.9.1 keystoneauth1==5.12.0 kubernetes==34.1.0 lftools==0.37.15 lxml==6.0.2 markdown-it-py==4.0.0 MarkupSafe==3.0.3 mdurl==0.1.2 msgpack==1.1.2 multi_key_dict==2.0.3 munch==4.0.0 netaddr==1.3.0 niet==1.4.2 nodeenv==1.9.1 oauth2client==4.1.3 oauthlib==3.3.1 openstacksdk==4.8.0 os-service-types==1.8.1 osc-lib==4.2.0 oslo.config==10.1.0 oslo.context==6.1.0 oslo.i18n==6.7.0 oslo.log==7.2.1 oslo.serialization==5.8.0 oslo.utils==9.1.0 packaging==25.0 pbr==7.0.3 platformdirs==4.5.0 prettytable==3.17.0 psutil==7.1.3 pyasn1==0.6.1 pyasn1_modules==0.4.2 pycparser==2.23 pygerrit2==2.0.15 PyGithub==2.8.1 Pygments==2.19.2 PyJWT==2.10.1 PyNaCl==1.6.1 pyparsing==2.4.7 pyperclip==1.11.0 pyrsistent==0.20.0 python-cinderclient==9.8.0 python-dateutil==2.9.0.post0 python-heatclient==4.3.0 python-jenkins==1.8.3 python-keystoneclient==5.7.0 python-magnumclient==4.9.0 python-openstackclient==8.2.0 python-swiftclient==4.9.0 PyYAML==6.0.3 referencing==0.37.0 requests==2.32.5 requests-oauthlib==2.0.0 requestsexceptions==1.4.0 rfc3986==2.0.0 rich==14.2.0 rich-argparse==1.7.2 rpds-py==0.29.0 rsa==4.9.1 ruamel.yaml==0.18.16 ruamel.yaml.clib==0.2.15 s3transfer==0.14.0 simplejson==3.20.2 six==1.17.0 smmap==5.0.2 soupsieve==2.8 stevedore==5.5.0 tabulate==0.9.0 toml==0.10.2 tomlkit==0.13.3 tqdm==4.67.1 typing_extensions==4.15.0 tzdata==2025.2 urllib3==1.26.20 virtualenv==20.35.4 wcwidth==0.2.14 websocket-client==1.9.0 wrapt==2.0.1 xdg==6.0.0 xmltodict==1.0.2 yq==3.4.3 [EnvInject] - Injecting environment variables from a build 
step. [EnvInject] - Injecting as environment variables the properties content PYTHON=python3 [EnvInject] - Variables injected successfully. [transportpce-tox-verify-transportpce-master] $ /bin/bash -l /tmp/jenkins11739390628447202717.sh ---> tox-install.sh + source /home/jenkins/lf-env.sh + lf-activate-venv --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 ++ mktemp -d /tmp/venv-XXXX + lf_venv=/tmp/venv-eEbk + local venv_file=/tmp/.os_lf_venv + local python=python3 + local options + local set_path=true + local install_args= ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 + options=' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' + eval set -- ' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' ++ set -- --venv-file /tmp/.toxenv -- tox virtualenv urllib3~=1.26.15 + true + case $1 in + venv_file=/tmp/.toxenv + shift 2 + true + case $1 in + shift + break + case $python in + local pkg_list= + [[ -d /opt/pyenv ]] + echo 'Setup pyenv:' Setup pyenv: + export PYENV_ROOT=/opt/pyenv + PYENV_ROOT=/opt/pyenv + export PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + pyenv versions system 3.8.20 3.9.20 3.10.15 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) + command -v pyenv ++ pyenv init - --no-rehash + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH); for i in ${!paths[@]}; do if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\''; fi; done; echo "${paths[*]}"'\'')" export PATH="/opt/pyenv/shims:${PATH}" export PYENV_SHELL=bash source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\'' pyenv() { local command command="${1:-}" if [ "$#" -gt 0 ]; then shift fi case "$command" in rehash|shell) eval "$(pyenv "sh-$command" "$@")" ;; *) command pyenv "$command" "$@" ;; esac }' +++ bash --norc -ec 'IFS=:; paths=($PATH); for i in ${!paths[@]}; do if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\''; fi; done; echo "${paths[*]}"' ++ PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ export PYENV_SHELL=bash ++ PYENV_SHELL=bash ++ source /opt/pyenv/libexec/../completions/pyenv.bash +++ complete -F _pyenv pyenv ++ lf-pyver python3 ++ local py_version_xy=python3 ++ local py_version_xyz= ++ sed 's/^[ *]* //' ++ awk '{ print $1 }' ++ pyenv versions ++ local command ++ command=versions ++ '[' 1 -gt 0 ']' ++ shift ++ case "$command" in ++ command pyenv versions ++ grep -E '^[0-9.]*[0-9]$' ++ [[ ! 
-s /tmp/.pyenv_versions ]] +++ grep '^3' /tmp/.pyenv_versions +++ sort -V +++ tail -n 1 ++ py_version_xyz=3.11.10 ++ [[ -z 3.11.10 ]] ++ echo 3.11.10 ++ return 0 + pyenv local 3.11.10 + local command + command=local + '[' 2 -gt 0 ']' + shift + case "$command" in + command pyenv local 3.11.10 + for arg in "$@" + case $arg in + pkg_list+='tox ' + for arg in "$@" + case $arg in + pkg_list+='virtualenv ' + for arg in "$@" + case $arg in + pkg_list+='urllib3~=1.26.15 ' + [[ -f /tmp/.toxenv ]] + [[ ! -f /tmp/.toxenv ]] + [[ -n '' ]] + python3 -m venv /tmp/venv-eEbk + echo 'lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-eEbk' lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-eEbk + echo /tmp/venv-eEbk + echo 'lf-activate-venv(): INFO: Save venv in file: /tmp/.toxenv' lf-activate-venv(): INFO: Save venv in file: /tmp/.toxenv + echo 'lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)' lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) + local 'pip_opts=--upgrade --quiet' + pip_opts='--upgrade --quiet --trusted-host pypi.org' + pip_opts='--upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org' + pip_opts='--upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org' + [[ -n '' ]] + [[ -n '' ]] + echo 'lf-activate-venv(): INFO: Attempting to install with network-safe options...' lf-activate-venv(): INFO: Attempting to install with network-safe options... + /tmp/venv-eEbk/bin/python3 -m pip install --upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org pip 'setuptools<66' virtualenv + echo 'lf-activate-venv(): INFO: Base packages installed successfully' lf-activate-venv(): INFO: Base packages installed successfully + [[ -z tox virtualenv urllib3~=1.26.15 ]] + echo 'lf-activate-venv(): INFO: Installing additional packages: tox virtualenv urllib3~=1.26.15 ' lf-activate-venv(): INFO: Installing additional packages: tox virtualenv urllib3~=1.26.15 + /tmp/venv-eEbk/bin/python3 -m pip install --upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org --upgrade-strategy eager tox virtualenv urllib3~=1.26.15 + type python3 + true + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-eEbk/bin to PATH' lf-activate-venv(): INFO: Adding /tmp/venv-eEbk/bin to PATH + PATH=/tmp/venv-eEbk/bin:/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + return 0 + python3 --version Python 3.11.10 + python3 -m pip --version pip 25.3 from /tmp/venv-eEbk/lib/python3.11/site-packages/pip (python 3.11) + python3 -m pip freeze cachetools==6.2.2 chardet==5.2.0 colorama==0.4.6 distlib==0.4.0 filelock==3.20.0 packaging==25.0 platformdirs==4.5.0 pluggy==1.6.0 pyproject-api==1.10.0 tox==4.32.0 urllib3==1.26.20 virtualenv==20.35.4 [transportpce-tox-verify-transportpce-master] $ /bin/sh -xe /tmp/jenkins8143574629356992380.sh [EnvInject] - Injecting environment variables from a build step. [EnvInject] - Injecting as environment variables the properties content PARALLEL=True [EnvInject] - Variables injected successfully. 
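(Editor's note, not part of the console output.) Both tox-install.sh above and the tox-run.sh step that follows call lf-activate-venv, which creates a throwaway virtualenv once, records its path in /tmp/.toxenv, and reuses it on the next call. The helper itself is bash (lf-env.sh); the following is only a minimal Python sketch of that cache-and-reuse pattern, with the marker file and package list taken from the log and everything else assumed.

import os
import subprocess
import tempfile
import venv

VENV_FILE = "/tmp/.toxenv"                            # marker file used by the job
PACKAGES = ["tox", "virtualenv", "urllib3~=1.26.15"]  # extra packages from the log

def activate_venv(venv_file=VENV_FILE, packages=PACKAGES):
    # Reuse the venv recorded in the marker file, or create a fresh one and record it.
    if os.path.isfile(venv_file):
        with open(venv_file) as marker:
            venv_dir = marker.read().strip()
        print(f"INFO: Reuse venv:{venv_dir} from file:{venv_file}")
    else:
        venv_dir = tempfile.mkdtemp(prefix="venv-")
        venv.EnvBuilder(with_pip=True).create(venv_dir)
        with open(venv_file, "w") as marker:
            marker.write(venv_dir)
        print(f"INFO: Creating python3 venv at {venv_dir}")
    py = os.path.join(venv_dir, "bin", "python3")      # POSIX layout, as on the builder
    subprocess.run([py, "-m", "pip", "install", "--upgrade", "--quiet",
                    "pip", "setuptools<66", "virtualenv"], check=True)
    subprocess.run([py, "-m", "pip", "install", "--upgrade", "--quiet",
                    "--upgrade-strategy", "eager", *packages], check=True)
    # Put the venv's bin directory first on PATH, as lf-activate-venv does.
    os.environ["PATH"] = os.path.join(venv_dir, "bin") + os.pathsep + os.environ["PATH"]

activate_venv()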
[transportpce-tox-verify-transportpce-master] $ /bin/bash -l /tmp/jenkins15397258004421221861.sh ---> tox-run.sh + PATH=/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + ARCHIVE_TOX_DIR=/w/workspace/transportpce-tox-verify-transportpce-master/archives/tox + ARCHIVE_DOC_DIR=/w/workspace/transportpce-tox-verify-transportpce-master/archives/docs + mkdir -p /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox + cd /w/workspace/transportpce-tox-verify-transportpce-master/. + source /home/jenkins/lf-env.sh + lf-activate-venv --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 ++ mktemp -d /tmp/venv-XXXX + lf_venv=/tmp/venv-Rzgp + local venv_file=/tmp/.os_lf_venv + local python=python3 + local options + local set_path=true + local install_args= ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 + options=' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' + eval set -- ' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' ++ set -- --venv-file /tmp/.toxenv -- tox virtualenv urllib3~=1.26.15 + true + case $1 in + venv_file=/tmp/.toxenv + shift 2 + true + case $1 in + shift + break + case $python in + local pkg_list= + [[ -d /opt/pyenv ]] + echo 'Setup pyenv:' Setup pyenv: + export PYENV_ROOT=/opt/pyenv + PYENV_ROOT=/opt/pyenv + export PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + pyenv versions system 3.8.20 3.9.20 3.10.15 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) + command -v pyenv ++ pyenv init - --no-rehash + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH); for i in ${!paths[@]}; do if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\''; fi; done; echo "${paths[*]}"'\'')" export PATH="/opt/pyenv/shims:${PATH}" export PYENV_SHELL=bash source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\'' pyenv() { local command command="${1:-}" if [ "$#" -gt 0 ]; then shift fi case "$command" in rehash|shell) eval "$(pyenv "sh-$command" "$@")" ;; *) command pyenv "$command" "$@" ;; esac }' +++ bash --norc -ec 'IFS=:; paths=($PATH); for i in ${!paths[@]}; do if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\''; fi; done; echo "${paths[*]}"' ++ PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ export PYENV_SHELL=bash ++ PYENV_SHELL=bash ++ source /opt/pyenv/libexec/../completions/pyenv.bash +++ complete -F _pyenv pyenv ++ lf-pyver python3 ++ local py_version_xy=python3 ++ local py_version_xyz= ++ pyenv versions ++ local 
command ++ command=versions ++ '[' 1 -gt 0 ']' ++ shift ++ case "$command" in ++ command pyenv versions ++ sed 's/^[ *]* //' ++ grep -E '^[0-9.]*[0-9]$' ++ awk '{ print $1 }' ++ [[ ! -s /tmp/.pyenv_versions ]] +++ grep '^3' /tmp/.pyenv_versions +++ tail -n 1 +++ sort -V ++ py_version_xyz=3.11.10 ++ [[ -z 3.11.10 ]] ++ echo 3.11.10 ++ return 0 + pyenv local 3.11.10 + local command + command=local + '[' 2 -gt 0 ']' + shift + case "$command" in + command pyenv local 3.11.10 + for arg in "$@" + case $arg in + pkg_list+='tox ' + for arg in "$@" + case $arg in + pkg_list+='virtualenv ' + for arg in "$@" + case $arg in + pkg_list+='urllib3~=1.26.15 ' + [[ -f /tmp/.toxenv ]] ++ cat /tmp/.toxenv + lf_venv=/tmp/venv-eEbk + echo 'lf-activate-venv(): INFO: Reuse venv:/tmp/venv-eEbk from' file:/tmp/.toxenv lf-activate-venv(): INFO: Reuse venv:/tmp/venv-eEbk from file:/tmp/.toxenv + echo 'lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)' lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) + local 'pip_opts=--upgrade --quiet' + pip_opts='--upgrade --quiet --trusted-host pypi.org' + pip_opts='--upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org' + pip_opts='--upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org' + [[ -n '' ]] + [[ -n '' ]] + echo 'lf-activate-venv(): INFO: Attempting to install with network-safe options...' lf-activate-venv(): INFO: Attempting to install with network-safe options... + /tmp/venv-eEbk/bin/python3 -m pip install --upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org pip 'setuptools<66' virtualenv + echo 'lf-activate-venv(): INFO: Base packages installed successfully' lf-activate-venv(): INFO: Base packages installed successfully + [[ -z tox virtualenv urllib3~=1.26.15 ]] + echo 'lf-activate-venv(): INFO: Installing additional packages: tox virtualenv urllib3~=1.26.15 ' lf-activate-venv(): INFO: Installing additional packages: tox virtualenv urllib3~=1.26.15 + /tmp/venv-eEbk/bin/python3 -m pip install --upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org --upgrade-strategy eager tox virtualenv urllib3~=1.26.15 + type python3 + true + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-eEbk/bin to PATH' lf-activate-venv(): INFO: Adding /tmp/venv-eEbk/bin to PATH + PATH=/tmp/venv-eEbk/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + return 0 + [[ -d /opt/pyenv ]] + echo '---> Setting up pyenv' ---> Setting up pyenv + export PYENV_ROOT=/opt/pyenv + PYENV_ROOT=/opt/pyenv + export PATH=/opt/pyenv/bin:/tmp/venv-eEbk/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin + PATH=/opt/pyenv/bin:/tmp/venv-eEbk/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin ++ pwd + PYTHONPATH=/w/workspace/transportpce-tox-verify-transportpce-master + export PYTHONPATH + export TOX_TESTENV_PASSENV=PYTHONPATH + TOX_TESTENV_PASSENV=PYTHONPATH + tox --version 4.32.0 from /tmp/venv-eEbk/lib/python3.11/site-packages/tox/__init__.py + PARALLEL=True + TOX_OPTIONS_LIST= + [[ -n 
'' ]] + case ${PARALLEL,,} in + TOX_OPTIONS_LIST=' --parallel auto --parallel-live' + tox --parallel auto --parallel-live + tee -a /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tox.log docs: install_deps> python -I -m pip install -r docs/requirements.txt checkbashisms: freeze> python -m pip freeze --all buildcontroller: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt docs-linkcheck: install_deps> python -I -m pip install -r docs/requirements.txt checkbashisms: pip==25.3,setuptools==80.9.0 checkbashisms: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./fixCIcentOS8reposMirrors.sh checkbashisms: commands[1] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sh -c 'command checkbashisms>/dev/null || sudo yum install -y devscripts-checkbashisms || sudo yum install -y devscripts-minimal || sudo yum install -y devscripts || sudo yum install -y https://archives.fedoraproject.org/pub/archive/fedora/linux/releases/31/Everything/x86_64/os/Packages/d/devscripts-checkbashisms-2.19.6-2.fc31.x86_64.rpm || (echo "checkbashisms command not found - please install it (e.g. sudo apt-get install devscripts | yum install devscripts-minimal )" >&2 && exit 1)' checkbashisms: commands[2] /w/workspace/transportpce-tox-verify-transportpce-master/tests> find . -not -path '*/\.*' -name '*.sh' -exec checkbashisms -f '{}' + checkbashisms: OK ✔ in 3.11 seconds pre-commit: install_deps> python -I -m pip install pre-commit pre-commit: freeze> python -m pip freeze --all pre-commit: cfgv==3.4.0,distlib==0.4.0,filelock==3.20.0,identify==2.6.15,nodeenv==1.9.1,pip==25.3,platformdirs==4.5.0,pre_commit==4.4.0,PyYAML==6.0.3,setuptools==80.9.0,virtualenv==20.35.4 pre-commit: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./fixCIcentOS8reposMirrors.sh pre-commit: commands[1] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sh -c 'which cpan || sudo yum install -y perl-CPAN || (echo "cpan command not found - please install it (e.g. sudo apt-get install perl-modules | yum install perl-CPAN )" >&2 && exit 1)' /usr/bin/cpan pre-commit: commands[2] /w/workspace/transportpce-tox-verify-transportpce-master/tests> pre-commit run --all-files --show-diff-on-failure [WARNING] hook id `remove-tabs` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. [WARNING] hook id `perltidy` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. [INFO] Initializing environment for https://github.com/pre-commit/pre-commit-hooks. [WARNING] repo `https://github.com/pre-commit/pre-commit-hooks` uses deprecated stage names (commit, push) which will be removed in a future version. Hint: often `pre-commit autoupdate --repo https://github.com/pre-commit/pre-commit-hooks` will fix this. if it does not -- consider reporting an issue to that repo. [INFO] Initializing environment for https://github.com/jorisroovers/gitlint. [INFO] Initializing environment for https://github.com/jorisroovers/gitlint:./gitlint-core[trusted-deps]. [INFO] Initializing environment for https://github.com/Lucas-C/pre-commit-hooks. [INFO] Initializing environment for https://github.com/pre-commit/mirrors-autopep8. 
[INFO] Initializing environment for https://github.com/perltidy/perltidy. buildcontroller: freeze> python -m pip freeze --all [INFO] Installing environment for https://github.com/pre-commit/pre-commit-hooks. [INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... buildcontroller: bcrypt==5.0.0,certifi==2025.11.12,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.1.3,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.1,pytest==9.0.1,requests==2.32.5,setuptools==80.9.0,urllib3==2.5.0 buildcontroller: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_controller.sh + update-java-alternatives -l java-1.11.0-openjdk-amd64 1111 /usr/lib/jvm/java-1.11.0-openjdk-amd64 java-1.17.0-openjdk-amd64 1711 /usr/lib/jvm/java-1.17.0-openjdk-amd64 + sudo update-java-alternatives -s java-1.21.0-openjdk-amd64 java-1.21.0-openjdk-amd64 2111 /usr/lib/jvm/java-1.21.0-openjdk-amd64 update-alternatives: error: no alternatives for jaotc update-alternatives: error: no alternatives for rmic + java -version + sed -n ;s/.* version "\(.*\)\.\(.*\)\..*".*$/\1/p; + JAVA_VER=21 + echo 21 21 + sed -n ;s/javac \(.*\)\.\(.*\)\..*.*$/\1/p; + javac -version + JAVAC_VER=21 + echo 21 21 ok, java is 21 or newer + [ 21 -ge 21 ] + [ 21 -ge 21 ] + echo ok, java is 21 or newer + wget -nv https://dlcdn.apache.org/maven/maven-3/3.9.11/binaries/apache-maven-3.9.11-bin.tar.gz -P /tmp 2025-11-19 08:31:14 URL:https://dlcdn.apache.org/maven/maven-3/3.9.11/binaries/apache-maven-3.9.11-bin.tar.gz [9160848/9160848] -> "/tmp/apache-maven-3.9.11-bin.tar.gz" [1] + sudo mkdir -p /opt + sudo tar xf /tmp/apache-maven-3.9.11-bin.tar.gz -C /opt + sudo ln -s /opt/apache-maven-3.9.11 /opt/maven + sudo ln -s /opt/maven/bin/mvn /usr/bin/mvn + mvn --version Apache Maven 3.9.11 (3e54c93a704957b63ee3494413a2b544fd3d825b) Maven home: /opt/maven Java version: 21.0.8, vendor: Ubuntu, runtime: /usr/lib/jvm/java-21-openjdk-amd64 Default locale: en, platform encoding: UTF-8 OS name: "linux", version: "5.15.0-153-generic", arch: "amd64", family: "unix" NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.jar=ALL-UNNAMED --add-opens=java.base/java.util.stream=ALL-UNNAMED --add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable [INFO] Installing environment for https://github.com/Lucas-C/pre-commit-hooks. [INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... [INFO] Installing environment for https://github.com/pre-commit/mirrors-autopep8. [INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... [INFO] Installing environment for https://github.com/perltidy/perltidy. [INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... 
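(Editor's note, not part of the console output.) Before downloading Maven 3.9.11, build_controller.sh switches the builder to Java 21 with update-java-alternatives and parses the `java -version` / `javac -version` banners to confirm the JDK is 21 or newer. A small Python sketch of an equivalent check follows, assuming only that `java` is on PATH; the sed-based parsing shown in the log is what actually runs.

import re
import subprocess
import sys

def java_major_version() -> int:
    # `java -version` prints to stderr, e.g.: openjdk version "21.0.8" 2025-07-15
    banner = subprocess.run(["java", "-version"], capture_output=True, text=True).stderr
    match = re.search(r'version "(\d+)(?:\.(\d+))?', banner)
    if not match:
        raise RuntimeError("could not parse `java -version` output")
    major = int(match.group(1))
    # Pre-Java 9 JDKs report themselves as 1.x, so fall back to the second digit there.
    return int(match.group(2)) if major == 1 and match.group(2) else major

if java_major_version() >= 21:
    print("ok, java is 21 or newer")
else:
    sys.exit("please install a JDK >= 21 before building the controller")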
docs-linkcheck: freeze> python -m pip freeze --all docs-linkcheck: alabaster==1.0.0,attrs==25.4.0,babel==2.17.0,blockdiag==3.0.0,certifi==2025.11.12,charset-normalizer==3.4.4,contourpy==1.3.3,cycler==0.12.1,docutils==0.21.2,fonttools==4.60.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.11,imagesize==1.4.1,Jinja2==3.1.6,jsonschema==3.2.0,kiwisolver==1.4.9,lfdocs-conf==0.9.0,MarkupSafe==3.0.3,matplotlib==3.10.7,numpy==2.3.5,nwdiag==3.0.0,packaging==25.0,pillow==12.0.0,pip==25.3,Pygments==2.19.2,pyparsing==3.2.5,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.3,requests==2.32.5,requests-file==1.5.1,roman-numerals-py==3.1.0,seqdiag==3.0.0,setuptools==80.9.0,six==1.17.0,snowballstemmer==3.0.1,Sphinx==8.2.3,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-rtd-theme==3.0.2,sphinx-tabs==3.4.7,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml==0.31,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3==2.5.0,webcolors==25.10.0 docs-linkcheck: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sphinx-build -q -b linkcheck -d /w/workspace/transportpce-tox-verify-transportpce-master/.tox/docs-linkcheck/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-transportpce-master/docs/_build/linkcheck docs: freeze> python -m pip freeze --all docs: alabaster==1.0.0,attrs==25.4.0,babel==2.17.0,blockdiag==3.0.0,certifi==2025.11.12,charset-normalizer==3.4.4,contourpy==1.3.3,cycler==0.12.1,docutils==0.21.2,fonttools==4.60.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.11,imagesize==1.4.1,Jinja2==3.1.6,jsonschema==3.2.0,kiwisolver==1.4.9,lfdocs-conf==0.9.0,MarkupSafe==3.0.3,matplotlib==3.10.7,numpy==2.3.5,nwdiag==3.0.0,packaging==25.0,pillow==12.0.0,pip==25.3,Pygments==2.19.2,pyparsing==3.2.5,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.3,requests==2.32.5,requests-file==1.5.1,roman-numerals-py==3.1.0,seqdiag==3.0.0,setuptools==80.9.0,six==1.17.0,snowballstemmer==3.0.1,Sphinx==8.2.3,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-rtd-theme==3.0.2,sphinx-tabs==3.4.7,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml==0.31,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3==2.5.0,webcolors==25.10.0 docs: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sphinx-build -q -W --keep-going -b html -n -d /w/workspace/transportpce-tox-verify-transportpce-master/.tox/docs/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-transportpce-master/docs/_build/html docs: OK ✔ in 31.3 seconds pylint: install_deps> python -I -m pip install 'pylint>=2.6.0' docs-linkcheck: OK ✔ in 32.53 seconds pylint: freeze> python -m pip freeze --all pylint: astroid==4.0.2,dill==0.4.0,isort==7.0.0,mccabe==0.7.0,pip==25.3,platformdirs==4.5.0,pylint==4.0.3,setuptools==80.9.0,tomlkit==0.13.3 pylint: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> find transportpce_tests/ -name '*.py' -exec pylint --fail-under=10 --max-line-length=120 
--disable=missing-docstring,import-error --disable=fixme --disable=duplicate-code '--module-rgx=([a-z0-9_]+$)|([0-9.]{1,30}$)' '--method-rgx=(([a-z_][a-zA-Z0-9_]{2,})|(_[a-z0-9_]*)|(__[a-zA-Z][a-zA-Z0-9_]+__))$' '--variable-rgx=[a-zA-Z_][a-zA-Z0-9_]{1,30}$' '{}' + trim trailing whitespace.................................................Passed Tabs remover.............................................................Passed autopep8.................................................................Passed perltidy.................................................................Passed pre-commit: commands[3] /w/workspace/transportpce-tox-verify-transportpce-master/tests> pre-commit run gitlint-ci --hook-stage manual [WARNING] hook id `remove-tabs` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. [WARNING] hook id `perltidy` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. [INFO] Installing environment for https://github.com/jorisroovers/gitlint. [INFO] Once installed this environment will be reused. [INFO] This may take a few minutes... gitlint..................................................................Passed ------------------------------------ Your code has been rated at 10.00/10 pre-commit: OK ✔ in 49.49 seconds pylint: OK ✔ in 31.04 seconds buildcontroller: OK ✔ in 1 minute 56.33 seconds build_karaf_tests190: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt build_karaf_tests71: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt build_karaf_tests221: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt build_karaf_tests121: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt build_karaf_tests190: freeze> python -m pip freeze --all build_karaf_tests71: freeze> python -m pip freeze --all build_karaf_tests221: freeze> python -m pip freeze --all build_karaf_tests121: freeze> python -m pip freeze --all build_karaf_tests190: bcrypt==5.0.0,certifi==2025.11.12,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.1.3,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.1,pytest==9.0.1,requests==2.32.5,setuptools==80.9.0,urllib3==2.5.0 build_karaf_tests190: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh build_karaf_tests71: 
bcrypt==5.0.0,certifi==2025.11.12,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.1.3,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.1,pytest==9.0.1,requests==2.32.5,setuptools==80.9.0,urllib3==2.5.0 build_karaf_tests71: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh build karaf in karafoc with ./karafoc.env build karaf in karaf71 with ./karaf71.env NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.jar=ALL-UNNAMED --add-opens=java.base/java.util.stream=ALL-UNNAMED --add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable build_karaf_tests221: bcrypt==5.0.0,certifi==2025.11.12,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.1.3,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.1,pytest==9.0.1,requests==2.32.5,setuptools==80.9.0,urllib3==2.5.0 build_karaf_tests221: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh build karaf in karaf221 with ./karaf221.env build_karaf_tests121: bcrypt==5.0.0,certifi==2025.11.12,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.1.3,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.1,pytest==9.0.1,requests==2.32.5,setuptools==80.9.0,urllib3==2.5.0 build_karaf_tests121: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh build karaf in karaf121 with ./karaf121.env NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.jar=ALL-UNNAMED --add-opens=java.base/java.util.stream=ALL-UNNAMED --add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED 
--add-opens=java.base/java.util.jar=ALL-UNNAMED --add-opens=java.base/java.util.stream=ALL-UNNAMED --add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.io=ALL-UNNAMED --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.lang.invoke=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.net=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED --add-opens=java.base/java.nio.charset=ALL-UNNAMED --add-opens=java.base/java.nio.file=ALL-UNNAMED --add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.util.jar=ALL-UNNAMED --add-opens=java.base/java.util.stream=ALL-UNNAMED --add-opens=java.base/java.util.zip=ALL-UNNAMED --add-opens java.base/sun.nio.ch=ALL-UNNAMED --add-opens java.base/sun.nio.fs=ALL-UNNAMED -Xlog:disable [ERROR] Failed to execute goal org.apache.maven.plugins:maven-install-plugin:3.1.4:install (default-install) on project transportpce-karaf: Failed to install metadata org.opendaylight.transportpce:transportpce-karaf:12.0.0-SNAPSHOT/maven-metadata.xml: Could not read metadata /home/jenkins/.m2/repository/org/opendaylight/transportpce/transportpce-karaf/12.0.0-SNAPSHOT/maven-metadata-local.xml: input contained no data -> [Help 1] [ERROR] [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. [ERROR] Re-run Maven using the -X switch to enable full debug logging. [ERROR] [ERROR] For more information about the errors and possible solutions, please read the following articles: [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException build_karaf_tests190: OK ✔ in 1 minute 2.87 seconds build_karaf_tests71: OK ✔ in 1 minute 2.89 seconds buildlighty: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt sims: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt [ERROR] Failed to execute goal org.apache.maven.plugins:maven-install-plugin:3.1.4:install (default-install) on project transportpce-karaf: Failed to install metadata org.opendaylight.transportpce:transportpce-karaf:12.0.0-SNAPSHOT/maven-metadata.xml: Could not read metadata /home/jenkins/.m2/repository/org/opendaylight/transportpce/transportpce-karaf/12.0.0-SNAPSHOT/maven-metadata-local.xml: input contained no data -> [Help 1] [ERROR] [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch. [ERROR] Re-run Maven using the -X switch to enable full debug logging. 
[ERROR] [ERROR] For more information about the errors and possible solutions, please read the following articles: [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException build_karaf_tests121: OK ✔ in 1 minute 5.74 seconds build_karaf_tests221: OK ✔ in 1 minute 5.74 seconds testsPCE: install_deps> python -I -m pip install gnpy4tpce==2.4.7 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt buildlighty: freeze> python -m pip freeze --all sims: freeze> python -m pip freeze --all buildlighty: bcrypt==5.0.0,certifi==2025.11.12,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.1.3,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.1,pytest==9.0.1,requests==2.32.5,setuptools==80.9.0,urllib3==2.5.0 buildlighty: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/lighty> ./build.sh sims: bcrypt==5.0.0,certifi==2025.11.12,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.1.3,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.1,pytest==9.0.1,requests==2.32.5,setuptools==80.9.0,urllib3==2.5.0 sims: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./install_lightynode.sh Using lighynode version 20.1.0.5 Installing lightynode device to ./lightynode/lightynode-openroadm-device directory NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED sims: OK ✔ in 12.95 seconds buildlighty: OK ✔ in 37.38 seconds testsPCE: freeze> python -m pip freeze --all testsPCE: bcrypt==5.0.0,certifi==2025.11.12,cffi==2.0.0,charset-normalizer==3.4.4,click==8.3.1,contourpy==1.3.3,cryptography==3.3.2,cycler==0.12.1,dict2xml==1.7.7,Flask==2.1.3,Flask-Injector==0.14.0,fonttools==4.60.1,gnpy4tpce==2.4.7,idna==3.11,iniconfig==2.3.0,injector==0.22.0,invoke==2.2.1,itsdangerous==2.2.0,Jinja2==3.1.6,kiwisolver==1.4.9,lxml==6.0.2,MarkupSafe==3.0.3,matplotlib==3.10.7,netconf-client==3.5.0,networkx==2.8.8,numpy==1.26.4,packaging==25.0,pandas==1.5.3,paramiko==4.0.0,pbr==5.11.1,pillow==12.0.0,pip==25.3,pluggy==1.6.0,psutil==7.1.3,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.1,pyparsing==3.2.5,pytest==9.0.1,python-dateutil==2.9.0.post0,pytz==2025.2,requests==2.32.5,scipy==1.16.3,setuptools==50.3.2,six==1.17.0,urllib3==2.5.0,Werkzeug==2.0.3,xlrd==1.2.0 testsPCE: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh pce pytest -q transportpce_tests/pce/test01_pce.py .................... [100%] 20 passed in 111.57s (0:01:51) pytest -q transportpce_tests/pce/test02_pce_400G.py ............ [100%] 12 passed in 47.25s pytest -q transportpce_tests/pce/test03_gnpy.py ........ [100%] 8 passed in 38.04s pytest -q transportpce_tests/pce/test04_pce_bug_fix.py ... 
[100%] 3 passed in 38.18s testsPCE: OK ✔ in 4 minutes 48.11 seconds tests_tapi: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt tests190: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt tests121: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt tests_tapi: freeze> python -m pip freeze --all tests121: freeze> python -m pip freeze --all tests190: freeze> python -m pip freeze --all tests_tapi: bcrypt==5.0.0,certifi==2025.11.12,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.1.3,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.1,pytest==9.0.1,requests==2.32.5,setuptools==80.9.0,urllib3==2.5.0 tests_tapi: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh tapi using environment variables from ./karaf221.env pytest -q transportpce_tests/tapi/test01_abstracted_topology.py tests121: bcrypt==5.0.0,certifi==2025.11.12,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.1.3,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.1,pytest==9.0.1,requests==2.32.5,setuptools==80.9.0,urllib3==2.5.0 tests121: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 1.2.1 using environment variables from ./karaf121.env pytest -q transportpce_tests/1.2.1/test01_portmapping.py tests190: bcrypt==5.0.0,certifi==2025.11.12,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.1.3,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.1,pytest==9.0.1,requests==2.32.5,setuptools==80.9.0,urllib3==2.5.0 tests190: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh oc using environment variables from ./karafoc.env pytest -q transportpce_tests/oc/test01_portmapping.py ........... [100%] 10 passed in 86.00s (0:01:25) pytest -q transportpce_tests/oc/test02_topology.py .................................. [100%] 21 passed in 133.35s (0:02:13) pytest -q transportpce_tests/1.2.1/test02_topo_portmapping.py .......................... [100%] 14 passed in 67.03s (0:01:07) pytest -q transportpce_tests/oc/test03_renderer.py ................ [100%] 6 passed in 61.25s (0:01:01) pytest -q transportpce_tests/1.2.1/test03_topology.py .................... 
[100%] 19 passed in 59.64s tests190: OK ✔ in 3 minutes 41.86 seconds tests71: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt .tests71: freeze> python -m pip freeze --all tests71: bcrypt==5.0.0,certifi==2025.11.12,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.1.3,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.1,pytest==9.0.1,requests==2.32.5,setuptools==80.9.0,urllib3==2.5.0 tests71: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 7.1 using environment variables from ./karaf71.env pytest -q transportpce_tests/7.1/test01_portmapping.py .............................. [100%] 12 passed in 52.52s pytest -q transportpce_tests/7.1/test02_otn_renderer.py .................................................... [100%] 44 passed in 150.90s (0:02:30) pytest -q transportpce_tests/1.2.1/test04_renderer_service_path_nominal.py ....................................................................... [100%] 24 passed in 84.43s (0:01:24) pytest -q transportpce_tests/1.2.1/test05_olm.py . [100%] 62 passed in 158.13s (0:02:38) pytest -q transportpce_tests/7.1/test03_renderer_or_modes.py ........... [100%] 51 passed in 501.17s (0:08:21) .pytest -q transportpce_tests/tapi/test02_full_topology.py ......................................................................... [100%] 48 passed in 155.06s (0:02:35) pytest -q transportpce_tests/7.1/test04_renderer_regen_mode.py ............................ [100%] 40 passed in 205.20s (0:03:25) pytest -q transportpce_tests/1.2.1/test06_end2end.py ................. [100%] 22 passed in 77.50s (0:01:17) .................................................... [100%] 36 passed in 335.67s (0:05:35) pytest -q transportpce_tests/tapi/test03_tapi_device_change_notifications.py ......................................................................................... [100%] 71 passed in 510.64s (0:08:30) pytest -q transportpce_tests/tapi/test04_topo_extension.py . [100%] 54 passed in 717.04s (0:11:57) ................... [100%] 19 passed in 142.32s (0:02:22) pytest -q transportpce_tests/tapi/test05_pce_tapi.py ......................E [100%] ==================================== ERRORS ==================================== ______ ERROR at teardown of TransportPCEtest.test_22_get_no_tapi_services ______ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. 
""" try: > sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) ../.tox/tests_tapi/lib/python3.11/site-packages/urllib3/connection.py:198: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests_tapi/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection raise err _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('localhost', 8183), timeout = 30, source_address = None socket_options = [(6, 1, 1)] def create_connection( address: tuple[str, int], timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, source_address: tuple[str, int] | None = None, socket_options: _TYPE_SOCKET_OPTIONS | None = None, ) -> socket.socket: """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`socket.getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. An host of '' or port 0 tells the OS to use the default. """ host, port = address if host.startswith("["): host = host.strip("[]") err = None # Using the value from allowed_gai_family() in the context of getaddrinfo lets # us select whether to work with IPv4 DNS records, IPv6 records, or both. # The original create_connection function always returns all records. family = allowed_gai_family() try: host.encode("idna") except UnicodeError: raise LocationParseError(f"'{host}', label empty or too long") from None for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket.socket(af, socktype, proto) # If provided, set socket level options before connecting. 
_set_socket_options(sock, socket_options) if timeout is not _DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused ../.tox/tests_tapi/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError The above exception was the direct cause of the following exception: self = method = 'GET' url = '/rests/data/ietf-network:networks/network=otn-topology?content=config' body = None headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=30, read=30, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/ietf-network:networks/network=otn-topology', query='content=config', fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
:param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. 
Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) ../.tox/tests_tapi/lib/python3.11/site-packages/urllib3/connectionpool.py:787: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests_tapi/lib/python3.11/site-packages/urllib3/connectionpool.py:493: in _make_request conn.request( ../.tox/tests_tapi/lib/python3.11/site-packages/urllib3/connection.py:494: in request self.endheaders() /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1298: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1058: in _send_output self.send(msg) /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:996: in send self.connect() ../.tox/tests_tapi/lib/python3.11/site-packages/urllib3/connection.py:325: in connect self.sock = self._new_conn() ^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def _new_conn(self) -> socket.socket: """Establish a socket connection and set nodelay settings on it. :return: New socket connection. """ try: sock = connection.create_connection( (self._dns_host, self.port), self.timeout, source_address=self.source_address, socket_options=self.socket_options, ) except socket.gaierror as e: raise NameResolutionError(self.host, self, e) from e except SocketTimeout as e: raise ConnectTimeoutError( self, f"Connection to {self.host} timed out. (connect timeout={self.timeout})", ) from e except OSError as e: > raise NewConnectionError( self, f"Failed to establish a new connection: {e}" ) from e E urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 111] Connection refused ../.tox/tests_tapi/lib/python3.11/site-packages/urllib3/connection.py:213: NewConnectionError The above exception was the direct cause of the following exception: self = request = , stream = False timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. 
:param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) ../.tox/tests_tapi/lib/python3.11/site-packages/requests/adapters.py:644: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ../.tox/tests_tapi/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen retries = retries.increment( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'GET' url = '/rests/data/ietf-network:networks/network=otn-topology?content=config' response = None error = NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused') _pool = _stacktrace = def increment( self, method: str | None = None, url: str | None = None, response: BaseHTTPResponse | None = None, error: Exception | None = None, _pool: ConnectionPool | None = None, _stacktrace: TracebackType | None = None, ) -> Self: """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.BaseHTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? 
if read is False or method is None or not self._is_method_retryable(method): raise reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? if redirect is not None: redirect -= 1 cause = "too many redirects" response_redirect_location = response.get_redirect_location() if response_redirect_location: redirect_location = response_redirect_location status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): reason = error or ResponseError(cause) > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8183): Max retries exceeded with url: /rests/data/ietf-network:networks/network=otn-topology?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests_tapi/lib/python3.11/site-packages/urllib3/util/retry.py:519: MaxRetryError During handling of the above exception, another exception occurred: cls = @classmethod def tearDownClass(cls): test_utils_generate_tapi_topo.cleanup_and_disconnect_devices() # pylint: disable=not-an-iterable for process in cls.processes: test_utils.shutdown_process(process) print("all processes killed") test_utils.copy_karaf_log(cls.__name__) > test_utils_generate_tapi_topo.uninstall_Tapi_Feature() transportpce_tests/tapi/test05_pce_tapi.py:345: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/common/test_utils_generate_tapi_topo.py:383: in uninstall_Tapi_Feature response = test_utils.get_ietf_network_request('otn-topology', 'config') ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ transportpce_tests/common/test_utils.py:562: in get_ietf_network_request response = get_request(url[RESTCONF_VERSION].format(*format_args)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ transportpce_tests/common/test_utils.py:117: in get_request return requests.request( ../.tox/tests_tapi/lib/python3.11/site-packages/requests/api.py:59: in request return session.request(method=method, url=url, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ../.tox/tests_tapi/lib/python3.11/site-packages/requests/sessions.py:589: in request resp = self.send(prep, **send_kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ ../.tox/tests_tapi/lib/python3.11/site-packages/requests/sessions.py:703: in send r = adapter.send(request, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = , stream = False timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. 
Returns Response object. :param request: The :class:`PreparedRequest ` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) ` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection_with_tls_context( request, verify, proxies=proxies, cert=cert ) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, chunked=chunked, ) except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. raise SSLError(e, request=request) > raise ConnectionError(e, request=request) E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8183): Max retries exceeded with url: /rests/data/ietf-network:networks/network=otn-topology?content=config (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused')) ../.tox/tests_tapi/lib/python3.11/site-packages/requests/adapters.py:677: ConnectionError --------------------------- Captured stdout teardown --------------------------- Searching for patterns in karaf.log... Pattern found! Node XPDR-A1 correctly deleted from tpce topology... Searching for patterns in karaf.log... Pattern found! Node SPDR-SA1 correctly deleted from tpce topology... all processes killed ODL log file stored uninstalling feature odl-transportpce-tapi client: JAVA_HOME not set; results may vary --------------------------- Captured stderr teardown --------------------------- Failed to get the session. 
=========================== short test summary info ============================ ERROR transportpce_tests/tapi/test05_pce_tapi.py::TransportPCEtest::test_22_get_no_tapi_services 22 passed, 1 error in 453.44s (0:07:33) tests71: OK ✔ in 7 minutes 34.06 seconds tests121: OK ✔ in 22 minutes 42.01 seconds tests_tapi: exit 1 (1944.92 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh tapi pid=7180 tests_tapi: FAIL ✖ in 32 minutes 32.66 seconds tests221: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt tests221: freeze> python -m pip freeze --all tests221: bcrypt==5.0.0,certifi==2025.11.12,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.1.3,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.1,pytest==9.0.1,requests==2.32.5,setuptools==80.9.0,urllib3==2.5.0 tests221: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 2.2.1 using environment variables from ./karaf221.env pytest -q transportpce_tests/2.2.1/test01_portmapping.py ................................... [100%] 35 passed in 76.36s (0:01:16) pytest -q transportpce_tests/2.2.1/test02_topo_portmapping.py .F..F. [100%] =================================== FAILURES =================================== _ TestTransportPCETopoPortmapping.test_02_compareOpenroadmTopologyPortMapping_rdm _ self = def test_02_compareOpenroadmTopologyPortMapping_rdm(self): resTopo = test_utils.get_ietf_network_request('openroadm-topology', 'config') self.assertEqual(resTopo['status_code'], requests.codes.ok) nbMapCumul = 0 nbMappings = 0 for node in resTopo['network'][0]['node']: nodeId = node['node-id'] # pylint: disable=consider-using-f-string print("nodeId={}".format(nodeId)) nodeMapId = nodeId.split("-")[0] + "-" + nodeId.split("-")[1] print("nodeMapId={}".format(nodeMapId)) response = test_utils.get_portmapping_node_attr(nodeMapId, "node-info", None) > self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 transportpce_tests/2.2.1/test02_topo_portmapping.py:65: AssertionError ----------------------------- Captured stdout call ----------------------------- nodeId=ROADM-A1-SRG3 nodeMapId=ROADM-A1 nodeId=ROADM-A1-DEG1 nodeMapId=ROADM-A1 nodeId=TAPI-SBI-ABS-NODE nodeMapId=TAPI-SBI _ TestTransportPCETopoPortmapping.test_05_compareOpenroadmTopologyPortMapping_xpdr _ self = def test_05_compareOpenroadmTopologyPortMapping_xpdr(self): > self.test_02_compareOpenroadmTopologyPortMapping_rdm() transportpce_tests/2.2.1/test02_topo_portmapping.py:92: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ transportpce_tests/2.2.1/test02_topo_portmapping.py:65: in test_02_compareOpenroadmTopologyPortMapping_rdm self.assertEqual(response['status_code'], requests.codes.ok) E AssertionError: 409 != 200 ----------------------------- Captured stdout call ----------------------------- nodeId=TAPI-SBI-ABS-NODE nodeMapId=TAPI-SBI =========================== short test summary info ============================ FAILED transportpce_tests/2.2.1/test02_topo_portmapping.py::TestTransportPCETopoPortmapping::test_02_compareOpenroadmTopologyPortMapping_rdm FAILED 
transportpce_tests/2.2.1/test02_topo_portmapping.py::TestTransportPCETopoPortmapping::test_05_compareOpenroadmTopologyPortMapping_xpdr 2 failed, 4 passed in 45.72s tests221: exit 1 (122.66 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 2.2.1 pid=27529 tests221: FAIL ✖ in 2 minutes 10.45 seconds tests_hybrid: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt tests_hybrid: freeze> python -m pip freeze --all tests_hybrid: bcrypt==5.0.0,certifi==2025.11.12,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.3,dict2xml==1.7.7,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==25.0,paramiko==4.0.0,pip==25.3,pluggy==1.6.0,psutil==7.1.3,pycparser==2.23,Pygments==2.19.2,PyNaCl==1.6.1,pytest==9.0.1,requests==2.32.5,setuptools==80.9.0,urllib3==2.5.0 tests_hybrid: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh hybrid using environment variables from ./karaf221.env pytest -q transportpce_tests/hybrid/test01_device_change_notifications.py ..............F...F...F..F...F..F...F..F...F....... [100%] =================================== FAILURES =================================== _ TestTransportPCEDeviceChangeNotifications.test_15_check_update_openroadm_topo _ self = def test_15_check_update_openroadm_topo(self): response = test_utils.get_ietf_network_request('openroadm-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) node_list = response['network'][0]['node'] nb_updated_tp = 0 for node in node_list: self.assertEqual(node['org-openroadm-common-network:operational-state'], 'inService') self.assertEqual(node['org-openroadm-common-network:administrative-state'], 'inService') > tp_list = node['ietf-network-topology:termination-point'] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E KeyError: 'ietf-network-topology:termination-point' transportpce_tests/hybrid/test01_device_change_notifications.py:237: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_15_check_update_openroadm_topo _ TestTransportPCEDeviceChangeNotifications.test_19_check_update_openroadm_topo_ok _ self = def test_19_check_update_openroadm_topo_ok(self): response = test_utils.get_ietf_network_request('openroadm-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) node_list = response['network'][0]['node'] for node in node_list: self.assertEqual(node['org-openroadm-common-network:operational-state'], 'inService') self.assertEqual(node['org-openroadm-common-network:administrative-state'], 'inService') > tp_list = node['ietf-network-topology:termination-point'] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E KeyError: 'ietf-network-topology:termination-point' transportpce_tests/hybrid/test01_device_change_notifications.py:300: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_19_check_update_openroadm_topo_ok _ TestTransportPCEDeviceChangeNotifications.test_23_check_update_openroadm_topo _ self = def test_23_check_update_openroadm_topo(self): response = test_utils.get_ietf_network_request('openroadm-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) node_list = response['network'][0]['node'] nb_updated_tp = 0 for node in node_list: 
self.assertEqual(node['org-openroadm-common-network:operational-state'], 'inService') self.assertEqual(node['org-openroadm-common-network:administrative-state'], 'inService') > tp_list = node['ietf-network-topology:termination-point'] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E KeyError: 'ietf-network-topology:termination-point' transportpce_tests/hybrid/test01_device_change_notifications.py:351: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_23_check_update_openroadm_topo _ TestTransportPCEDeviceChangeNotifications.test_26_check_update_openroadm_topo_ok _ self = def test_26_check_update_openroadm_topo_ok(self): > self.test_19_check_update_openroadm_topo_ok() transportpce_tests/hybrid/test01_device_change_notifications.py:393: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def test_19_check_update_openroadm_topo_ok(self): response = test_utils.get_ietf_network_request('openroadm-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) node_list = response['network'][0]['node'] for node in node_list: self.assertEqual(node['org-openroadm-common-network:operational-state'], 'inService') self.assertEqual(node['org-openroadm-common-network:administrative-state'], 'inService') > tp_list = node['ietf-network-topology:termination-point'] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E KeyError: 'ietf-network-topology:termination-point' transportpce_tests/hybrid/test01_device_change_notifications.py:300: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_26_check_update_openroadm_topo_ok _ TestTransportPCEDeviceChangeNotifications.test_30_check_update_openroadm_topo _ self = def test_30_check_update_openroadm_topo(self): response = test_utils.get_ietf_network_request('openroadm-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) node_list = response['network'][0]['node'] nb_updated_tp = 0 for node in node_list: self.assertEqual(node['org-openroadm-common-network:operational-state'], 'inService') self.assertEqual(node['org-openroadm-common-network:administrative-state'], 'inService') > tp_list = node['ietf-network-topology:termination-point'] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E KeyError: 'ietf-network-topology:termination-point' transportpce_tests/hybrid/test01_device_change_notifications.py:435: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_30_check_update_openroadm_topo _ TestTransportPCEDeviceChangeNotifications.test_33_check_update_openroadm_topo_ok _ self = def test_33_check_update_openroadm_topo_ok(self): > self.test_19_check_update_openroadm_topo_ok() transportpce_tests/hybrid/test01_device_change_notifications.py:477: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def test_19_check_update_openroadm_topo_ok(self): response = test_utils.get_ietf_network_request('openroadm-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) node_list = response['network'][0]['node'] for node in node_list: self.assertEqual(node['org-openroadm-common-network:operational-state'], 'inService') self.assertEqual(node['org-openroadm-common-network:administrative-state'], 'inService') > tp_list = node['ietf-network-topology:termination-point'] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E KeyError: 'ietf-network-topology:termination-point' 
transportpce_tests/hybrid/test01_device_change_notifications.py:300: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_33_check_update_openroadm_topo_ok _ TestTransportPCEDeviceChangeNotifications.test_37_check_update_openroadm_topo _ self = def test_37_check_update_openroadm_topo(self): response = test_utils.get_ietf_network_request('openroadm-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) node_list = response['network'][0]['node'] nb_updated_tp = 0 for node in node_list: self.assertEqual(node['org-openroadm-common-network:operational-state'], 'inService') self.assertEqual(node['org-openroadm-common-network:administrative-state'], 'inService') > tp_list = node['ietf-network-topology:termination-point'] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E KeyError: 'ietf-network-topology:termination-point' transportpce_tests/hybrid/test01_device_change_notifications.py:517: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_37_check_update_openroadm_topo _ TestTransportPCEDeviceChangeNotifications.test_40_check_update_openroadm_topo_ok _ self = def test_40_check_update_openroadm_topo_ok(self): > self.test_19_check_update_openroadm_topo_ok() transportpce_tests/hybrid/test01_device_change_notifications.py:557: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = def test_19_check_update_openroadm_topo_ok(self): response = test_utils.get_ietf_network_request('openroadm-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) node_list = response['network'][0]['node'] for node in node_list: self.assertEqual(node['org-openroadm-common-network:operational-state'], 'inService') self.assertEqual(node['org-openroadm-common-network:administrative-state'], 'inService') > tp_list = node['ietf-network-topology:termination-point'] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E KeyError: 'ietf-network-topology:termination-point' transportpce_tests/hybrid/test01_device_change_notifications.py:300: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_40_check_update_openroadm_topo_ok _ TestTransportPCEDeviceChangeNotifications.test_44_check_update_openroadm_topo _ self = def test_44_check_update_openroadm_topo(self): response = test_utils.get_ietf_network_request('openroadm-topology', 'config') self.assertEqual(response['status_code'], requests.codes.ok) node_list = response['network'][0]['node'] nb_updated_tp = 0 for node in node_list: self.assertEqual(node['org-openroadm-common-network:operational-state'], 'inService') self.assertEqual(node['org-openroadm-common-network:administrative-state'], 'inService') > tp_list = node['ietf-network-topology:termination-point'] ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ E KeyError: 'ietf-network-topology:termination-point' transportpce_tests/hybrid/test01_device_change_notifications.py:599: KeyError ----------------------------- Captured stdout call ----------------------------- execution of test_44_check_update_openroadm_topo =========================== short test summary info ============================ FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TestTransportPCEDeviceChangeNotifications::test_15_check_update_openroadm_topo FAILED 
transportpce_tests/hybrid/test01_device_change_notifications.py::TestTransportPCEDeviceChangeNotifications::test_19_check_update_openroadm_topo_ok FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TestTransportPCEDeviceChangeNotifications::test_23_check_update_openroadm_topo FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TestTransportPCEDeviceChangeNotifications::test_26_check_update_openroadm_topo_ok FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TestTransportPCEDeviceChangeNotifications::test_30_check_update_openroadm_topo FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TestTransportPCEDeviceChangeNotifications::test_33_check_update_openroadm_topo_ok FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TestTransportPCEDeviceChangeNotifications::test_37_check_update_openroadm_topo FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TestTransportPCEDeviceChangeNotifications::test_40_check_update_openroadm_topo_ok FAILED transportpce_tests/hybrid/test01_device_change_notifications.py::TestTransportPCEDeviceChangeNotifications::test_44_check_update_openroadm_topo 9 failed, 42 passed in 318.96s (0:05:18) tests_hybrid: exit 1 (319.28 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh hybrid pid=29234 buildcontroller: OK (116.33=setup[8.99]+cmd[107.34] seconds) sims: OK (12.95=setup[10.14]+cmd[2.81] seconds) build_karaf_tests121: OK (65.74=setup[8.41]+cmd[57.33] seconds) testsPCE: OK (288.11=setup[51.75]+cmd[236.36] seconds) tests121: OK (1362.01=setup[7.78]+cmd[1354.23] seconds) build_karaf_tests221: OK (65.74=setup[8.39]+cmd[57.35] seconds) tests_tapi: FAIL code 1 (1952.66=setup[7.74]+cmd[1944.92] seconds) tests221: FAIL code 1 (130.45=setup[7.79]+cmd[122.66] seconds) build_karaf_tests71: OK (62.89=setup[8.30]+cmd[54.59] seconds) tests71: OK (454.06=setup[9.26]+cmd[444.81] seconds) build_karaf_tests190: OK (62.86=setup[8.30]+cmd[54.56] seconds) tests190: OK (221.86=setup[7.85]+cmd[214.01] seconds) tests_hybrid: FAIL code 1 (326.58=setup[7.30]+cmd[319.28] seconds) buildlighty: OK (37.38=setup[10.17]+cmd[27.22] seconds) docs: OK (31.30=setup[28.20]+cmd[3.10] seconds) docs-linkcheck: OK (32.53=setup[27.64]+cmd[4.89] seconds) checkbashisms: OK (3.10=setup[1.84]+cmd[0.01,0.06,1.19] seconds) pre-commit: OK (49.49=setup[2.83]+cmd[0.01,0.01,38.51,8.13] seconds) pylint: OK (31.04=setup[4.05]+cmd[26.99] seconds) evaluation failed :( (2879.93 seconds) + tox_status=255 + echo '---> Completed tox runs' ---> Completed tox runs + for i in .tox/*/log ++ echo .tox/build_karaf_tests121/log ++ awk -F/ '{print $2}' + tox_env=build_karaf_tests121 + cp -r .tox/build_karaf_tests121/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests121 + for i in .tox/*/log ++ echo .tox/build_karaf_tests190/log ++ awk -F/ '{print $2}' + tox_env=build_karaf_tests190 + cp -r .tox/build_karaf_tests190/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests190 + for i in .tox/*/log ++ echo .tox/build_karaf_tests221/log ++ awk -F/ '{print $2}' + tox_env=build_karaf_tests221 + cp -r .tox/build_karaf_tests221/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests221 + for i in .tox/*/log ++ echo .tox/build_karaf_tests71/log ++ awk -F/ '{print $2}' + tox_env=build_karaf_tests71 + cp -r .tox/build_karaf_tests71/log 
/w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests71 + for i in .tox/*/log ++ echo .tox/buildcontroller/log ++ awk -F/ '{print $2}' + tox_env=buildcontroller + cp -r .tox/buildcontroller/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/buildcontroller + for i in .tox/*/log ++ echo .tox/buildlighty/log ++ awk -F/ '{print $2}' + tox_env=buildlighty + cp -r .tox/buildlighty/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/buildlighty + for i in .tox/*/log ++ echo .tox/checkbashisms/log ++ awk -F/ '{print $2}' + tox_env=checkbashisms + cp -r .tox/checkbashisms/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/checkbashisms + for i in .tox/*/log ++ echo .tox/docs-linkcheck/log ++ awk -F/ '{print $2}' + tox_env=docs-linkcheck + cp -r .tox/docs-linkcheck/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/docs-linkcheck + for i in .tox/*/log ++ echo .tox/docs/log ++ awk -F/ '{print $2}' + tox_env=docs + cp -r .tox/docs/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/docs + for i in .tox/*/log ++ echo .tox/pre-commit/log ++ awk -F/ '{print $2}' + tox_env=pre-commit + cp -r .tox/pre-commit/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/pre-commit + for i in .tox/*/log ++ echo .tox/pylint/log ++ awk -F/ '{print $2}' + tox_env=pylint + cp -r .tox/pylint/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/pylint + for i in .tox/*/log ++ echo .tox/sims/log ++ awk -F/ '{print $2}' + tox_env=sims + cp -r .tox/sims/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/sims + for i in .tox/*/log ++ echo .tox/tests121/log ++ awk -F/ '{print $2}' + tox_env=tests121 + cp -r .tox/tests121/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests121 + for i in .tox/*/log ++ echo .tox/tests190/log ++ awk -F/ '{print $2}' + tox_env=tests190 + cp -r .tox/tests190/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests190 + for i in .tox/*/log ++ echo .tox/tests221/log ++ awk -F/ '{print $2}' + tox_env=tests221 + cp -r .tox/tests221/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests221 + for i in .tox/*/log ++ echo .tox/tests71/log ++ awk -F/ '{print $2}' + tox_env=tests71 + cp -r .tox/tests71/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests71 + for i in .tox/*/log ++ echo .tox/testsPCE/log ++ awk -F/ '{print $2}' + tox_env=testsPCE + cp -r .tox/testsPCE/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/testsPCE + for i in .tox/*/log ++ echo .tox/tests_hybrid/log ++ awk -F/ '{print $2}' + tox_env=tests_hybrid + cp -r .tox/tests_hybrid/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests_hybrid + for i in .tox/*/log ++ echo .tox/tests_tapi/log ++ awk -F/ '{print $2}' + tox_env=tests_tapi + cp -r .tox/tests_tapi/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests_tapi + DOC_DIR=docs/_build/html + [[ -d docs/_build/html ]] + echo '---> Archiving generated docs' ---> Archiving generated docs + mv docs/_build/html /w/workspace/transportpce-tox-verify-transportpce-master/archives/docs + echo '---> tox-run.sh ends' ---> tox-run.sh ends + test 255 -eq 0 + exit 255 ++ '[' 1 = 1 ']' ++ '[' -x /usr/bin/clear_console ']' ++ /usr/bin/clear_console -q Build step 'Execute shell' marked build as failure $ ssh-agent -k unset 
SSH_AUTH_SOCK; unset SSH_AGENT_PID; echo Agent pid 1569 killed; [ssh-agent] Stopped. [PostBuildScript] - [INFO] Executing post build scripts. [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins15881339848459733368.sh ---> sysstat.sh [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins8134851671663175807.sh ---> package-listing.sh ++ tr '[:upper:]' '[:lower:]' ++ facter osfamily + OS_FAMILY=debian + workspace=/w/workspace/transportpce-tox-verify-transportpce-master + START_PACKAGES=/tmp/packages_start.txt + END_PACKAGES=/tmp/packages_end.txt + DIFF_PACKAGES=/tmp/packages_diff.txt + PACKAGES=/tmp/packages_start.txt + '[' /w/workspace/transportpce-tox-verify-transportpce-master ']' + PACKAGES=/tmp/packages_end.txt + case "${OS_FAMILY}" in + dpkg -l + grep '^ii' + '[' -f /tmp/packages_start.txt ']' + '[' -f /tmp/packages_end.txt ']' + diff /tmp/packages_start.txt /tmp/packages_end.txt + '[' /w/workspace/transportpce-tox-verify-transportpce-master ']' + mkdir -p /w/workspace/transportpce-tox-verify-transportpce-master/archives/ + cp -f /tmp/packages_diff.txt /tmp/packages_end.txt /tmp/packages_start.txt /w/workspace/transportpce-tox-verify-transportpce-master/archives/ [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins5945560408877141387.sh ---> capture-instance-metadata.sh Setup pyenv: system 3.8.20 3.9.20 3.10.15 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) lf-activate-venv(): INFO: Reuse venv:/tmp/venv-l63k from file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) lf-activate-venv(): INFO: Attempting to install with network-safe options... lf-activate-venv(): INFO: Base packages installed successfully lf-activate-venv(): INFO: Installing additional packages: lftools lf-activate-venv(): INFO: Adding /tmp/venv-l63k/bin to PATH INFO: Running in OpenStack, capturing instance metadata [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins14395711978244484393.sh provisioning config files... Could not find credentials [logs] for transportpce-tox-verify-transportpce-master #3924 copy managed file [jenkins-log-archives-settings] to file:/w/workspace/transportpce-tox-verify-transportpce-master@tmp/config10247606172113214275tmp Regular expression run condition: Expression=[^.*logs-s3.*], Label=[odl-logs-s3-cloudfront-index] Run condition [Regular expression match] enabling perform for step [Provide Configuration files] provisioning config files... copy managed file [jenkins-s3-log-ship] to file:/home/jenkins/.aws/credentials [EnvInject] - Injecting environment variables from a build step. [EnvInject] - Injecting as environment variables the properties content SERVER_ID=logs [EnvInject] - Variables injected successfully. [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins3769389461863269349.sh ---> create-netrc.sh WARN: Log server credential not found. [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins11302779692649785145.sh ---> python-tools-install.sh Setup pyenv: system 3.8.20 3.9.20 3.10.15 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) lf-activate-venv(): INFO: Reuse venv:/tmp/venv-l63k from file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) lf-activate-venv(): INFO: Attempting to install with network-safe options... 
lf-activate-venv(): INFO: Base packages installed successfully lf-activate-venv(): INFO: Installing additional packages: lftools lf-activate-venv(): INFO: Adding /tmp/venv-l63k/bin to PATH [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins14193664736600603314.sh ---> sudo-logs.sh Archiving 'sudo' log.. [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins2682053981243802361.sh ---> job-cost.sh Setup pyenv: system 3.8.20 3.9.20 3.10.15 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) lf-activate-venv(): INFO: Reuse venv:/tmp/venv-l63k from file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) lf-activate-venv(): INFO: Attempting to install with network-safe options... lf-activate-venv(): INFO: Base packages installed successfully lf-activate-venv(): INFO: Installing additional packages: zipp==1.1.0 python-openstackclient urllib3~=1.26.15 lf-activate-venv(): INFO: Adding /tmp/venv-l63k/bin to PATH INFO: No Stack... INFO: Retrieving Pricing Info for: v3-standard-4 INFO: Archiving Costs [transportpce-tox-verify-transportpce-master] $ /bin/bash -l /tmp/jenkins12439237527923640369.sh ---> logs-deploy.sh Setup pyenv: system 3.8.20 3.9.20 3.10.15 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) lf-activate-venv(): INFO: Reuse venv:/tmp/venv-l63k from file:/tmp/.os_lf_venv lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) lf-activate-venv(): INFO: Attempting to install with network-safe options... lf-activate-venv(): INFO: Base packages installed successfully lf-activate-venv(): INFO: Installing additional packages: lftools urllib3~=1.26.15 lf-activate-venv(): INFO: Adding /tmp/venv-l63k/bin to PATH WARNING: Nexus logging server not set INFO: S3 path logs/releng/vex-yul-odl-jenkins-1/transportpce-tox-verify-transportpce-master/3924/ INFO: archiving logs to S3 ---> uname -a: Linux prd-ubuntu2204-docker-4c-16g-11877 5.15.0-153-generic #163-Ubuntu SMP Thu Aug 7 16:37:18 UTC 2025 x86_64 x86_64 x86_64 GNU/Linux ---> lscpu: Architecture: x86_64 CPU op-mode(s): 32-bit, 64-bit Address sizes: 40 bits physical, 48 bits virtual Byte Order: Little Endian CPU(s): 4 On-line CPU(s) list: 0-3 Vendor ID: AuthenticAMD Model name: AMD EPYC-Rome Processor CPU family: 23 Model: 49 Thread(s) per core: 1 Core(s) per socket: 1 Socket(s): 4 Stepping: 0 BogoMIPS: 5600.00 Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities Virtualization: AMD-V Hypervisor vendor: KVM Virtualization type: full L1d cache: 128 KiB (4 instances) L1i cache: 128 KiB (4 instances) L2 cache: 2 MiB (4 instances) L3 cache: 64 MiB (4 instances) NUMA node(s): 1 NUMA node0 CPU(s): 0-3 Vulnerability Gather data sampling: Not affected Vulnerability Indirect target selection: Not affected Vulnerability Itlb multihit: Not affected Vulnerability L1tf: Not affected Vulnerability Mds: Not affected 
Vulnerability Meltdown: Not affected Vulnerability Mmio stale data: Not affected Vulnerability Reg file data sampling: Not affected Vulnerability Retbleed: Mitigation; untrained return thunk; SMT disabled Vulnerability Spec rstack overflow: Mitigation; SMT disabled Vulnerability Spec store bypass: Mitigation; Speculative Store Bypass disabled via prctl and seccomp Vulnerability Spectre v1: Mitigation; usercopy/swapgs barriers and __user pointer sanitization Vulnerability Spectre v2: Mitigation; Retpolines; IBPB disabled; STIBP disabled; RSB filling; PBRSB-eIBRS Not affected; BHI Not affected Vulnerability Srbds: Not affected Vulnerability Tsx async abort: Not affected ---> nproc: 4 ---> df -h: Filesystem Size Used Avail Use% Mounted on tmpfs 1.6G 1.1M 1.6G 1% /run /dev/vda1 78G 17G 61G 22% / tmpfs 7.9G 0 7.9G 0% /dev/shm tmpfs 5.0M 0 5.0M 0% /run/lock /dev/vda15 105M 6.1M 99M 6% /boot/efi tmpfs 1.6G 4.0K 1.6G 1% /run/user/1001 ---> free -m: total used free shared buff/cache available Mem: 15989 704 11680 3 3604 14942 Swap: 1023 1 1022 ---> ip addr: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: ens3: mtu 1458 qdisc mq state UP group default qlen 1000 link/ether fa:16:3e:a5:fa:c8 brd ff:ff:ff:ff:ff:ff altname enp0s3 inet 10.30.170.39/23 metric 100 brd 10.30.171.255 scope global dynamic ens3 valid_lft 83347sec preferred_lft 83347sec inet6 fe80::f816:3eff:fea5:fac8/64 scope link valid_lft forever preferred_lft forever 3: docker0: mtu 1458 qdisc noqueue state DOWN group default link/ether 16:d6:b1:f6:af:78 brd ff:ff:ff:ff:ff:ff inet 10.250.0.254/24 brd 10.250.0.255 scope global docker0 valid_lft forever preferred_lft forever ---> sar -b -r -n DEV: Linux 5.15.0-153-generic (prd-ubuntu2204-docker-4c-16g-11877) 11/19/25 _x86_64_ (4 CPU) 08:29:08 LINUX RESTART (4 CPU) 08:30:26 tps rtps wtps dtps bread/s bwrtn/s bdscd/s 08:40:01 98.09 4.55 87.54 6.01 548.09 39568.16 26024.46 08:50:26 61.93 25.25 34.67 2.01 381.07 1891.23 1995.97 09:00:26 6.10 0.31 5.52 0.27 10.04 166.63 154.99 09:10:26 7.65 0.02 7.23 0.40 0.63 255.64 207.41 Average: 43.06 7.75 33.18 2.13 233.18 10075.43 6843.75 08:30:26 kbmemfree kbavail kbmemused %memused kbbuffers kbcached kbcommit %commit kbactive kbinact kbdirty 08:40:01 386644 11671092 4247564 25.94 258120 10769212 6005896 34.47 2189464 12965728 5184 08:50:26 3637684 6510492 9425992 57.57 263912 2639340 10793256 61.95 2030332 10168856 3020 09:00:26 4191532 7078756 8858484 54.10 264788 2652876 9592788 55.06 2041284 9598740 108 09:10:26 6834672 9745288 6194648 37.83 265860 2675200 6903532 39.63 2046732 6972812 392 Average: 3762633 8751407 7181672 43.86 263170 4684157 8323868 47.78 2076953 9926534 2176 08:30:26 IFACE rxpck/s txpck/s rxkB/s txkB/s rxcmp/s txcmp/s rxmcst/s %ifutil 08:40:01 lo 3.27 3.27 2.51 2.51 0.00 0.00 0.00 0.00 08:40:01 ens3 124.89 94.82 1974.80 10.58 0.00 0.00 0.00 0.00 08:40:01 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 08:50:26 lo 43.11 43.11 25.88 25.88 0.00 0.00 0.00 0.00 08:50:26 ens3 2.12 1.90 0.52 0.44 0.00 0.00 0.00 0.00 08:50:26 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 09:00:26 lo 29.34 29.34 10.84 10.84 0.00 0.00 0.00 0.00 09:00:26 ens3 1.07 0.89 0.26 1.11 0.00 0.00 0.00 0.00 09:00:26 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 09:10:26 lo 9.77 9.77 6.20 6.20 0.00 0.00 0.00 0.00 09:10:26 ens3 0.83 0.46 0.21 0.14 0.00 0.00 
0.00 0.00 09:10:26 docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 Average: lo 21.79 21.79 11.60 11.60 0.00 0.00 0.00 0.00 Average: ens3 30.94 23.54 473.26 2.96 0.00 0.00 0.00 0.00 Average: docker0 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 ---> sar -P ALL: Linux 5.15.0-153-generic (prd-ubuntu2204-docker-4c-16g-11877) 11/19/25 _x86_64_ (4 CPU) 08:29:08 LINUX RESTART (4 CPU) 08:30:26 CPU %user %nice %system %iowait %steal %idle 08:40:01 all 49.71 0.00 2.66 1.44 0.10 46.09 08:40:01 0 51.93 0.00 2.80 1.65 0.10 43.52 08:40:01 1 53.16 0.00 2.76 1.41 0.10 42.55 08:40:01 2 47.42 0.00 2.58 1.40 0.10 48.50 08:40:01 3 46.32 0.00 2.50 1.30 0.10 49.79 08:50:26 all 60.32 0.00 2.50 0.12 0.11 36.94 08:50:26 0 60.23 0.00 2.50 0.14 0.12 37.01 08:50:26 1 60.72 0.00 2.51 0.08 0.12 36.58 08:50:26 2 59.33 0.00 2.61 0.18 0.11 37.77 08:50:26 3 61.01 0.00 2.39 0.09 0.11 36.40 09:00:26 all 10.76 0.00 0.69 0.32 0.25 87.98 09:00:26 0 10.77 0.00 0.61 0.70 0.26 87.66 09:00:26 1 10.92 0.00 0.72 0.05 0.18 88.13 09:00:26 2 11.08 0.00 0.67 0.08 0.24 87.93 09:00:26 3 10.27 0.00 0.75 0.47 0.33 88.18 09:10:26 all 15.92 0.00 0.64 0.04 0.07 83.34 09:10:26 0 15.55 0.00 0.60 0.03 0.07 83.76 09:10:26 1 15.70 0.00 0.72 0.07 0.07 83.44 09:10:26 2 15.95 0.00 0.58 0.03 0.07 83.38 09:10:26 3 16.48 0.00 0.65 0.03 0.07 82.77 Average: all 34.30 0.00 1.62 0.47 0.13 63.48 Average: 0 34.73 0.00 1.62 0.62 0.13 62.89 Average: 1 35.21 0.00 1.68 0.39 0.12 62.61 Average: 2 33.57 0.00 1.61 0.41 0.13 64.28 Average: 3 33.68 0.00 1.57 0.46 0.15 64.13
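Looking back at the functional failures, the tests221 and tests_hybrid runs break on one shared assumption: that every node in openroadm-topology maps to a portmapping entry and carries termination points. The captured output shows an abstract TAPI-SBI-ABS-NODE node in that topology, so splitting its node-id yields the nonexistent portmapping node "TAPI-SBI" (the 409 in test02_topo_portmapping.py), and indexing 'ietf-network-topology:termination-point' directly raises KeyError (the nine failures in test01_device_change_notifications.py). A minimal defensive iteration is sketched below; it assumes the response layout shown in those tests, and the ABSTRACT_NODES filter is illustrative only, not an existing project constant.

ABSTRACT_NODES = {'TAPI-SBI-ABS-NODE'}  # illustrative filter, not project code

def iter_real_nodes(response):
    """Yield (node_id, tp_list) for concrete nodes, tolerating nodes without termination points."""
    for node in response['network'][0]['node']:
        node_id = node['node-id']
        if node_id in ABSTRACT_NODES:
            continue  # the TAPI abstract node has no portmapping counterpart
        # .get() avoids the KeyError raised when a node carries no
        # ietf-network-topology:termination-point list at all
        yield node_id, node.get('ietf-network-topology:termination-point', [])

Whether the tests should filter this node or the topology should not expose it to these checks is for the review to settle; the sketch only spells out the assumption the failing assertions make.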