07:50:52 Triggered by Gerrit: https://git.opendaylight.org/gerrit/c/transportpce/+/120307
07:50:52 Running as SYSTEM
07:50:52 [EnvInject] - Loading node environment variables.
07:50:52 Building remotely on prd-ubuntu2204-docker-4c-16g-20850 (ubuntu2204-docker-4c-16g) in workspace /w/workspace/transportpce-tox-verify-transportpce-master
07:50:52 [ssh-agent] Looking for ssh-agent implementation...
07:50:52 [ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine)
07:50:52 $ ssh-agent
07:50:52 SSH_AUTH_SOCK=/tmp/ssh-XXXXXXJwAO7t/agent.1574
07:50:52 SSH_AGENT_PID=1576
07:50:52 [ssh-agent] Started.
07:50:52 Running ssh-add (command line suppressed)
07:50:52 Identity added: /w/workspace/transportpce-tox-verify-transportpce-master@tmp/private_key_16672803107039662428.key (/w/workspace/transportpce-tox-verify-transportpce-master@tmp/private_key_16672803107039662428.key)
07:50:52 [ssh-agent] Using credentials jenkins (jenkins-ssh)
07:50:52 The recommended git tool is: NONE
07:50:54 using credential jenkins-ssh
07:50:54 Wiping out workspace first.
07:50:54 Cloning the remote Git repository
07:50:55 Cloning repository git://devvexx.opendaylight.org/mirror/transportpce
07:50:55 > git init /w/workspace/transportpce-tox-verify-transportpce-master # timeout=10
07:50:55 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/transportpce
07:50:55 > git --version # timeout=10
07:50:55 > git --version # 'git version 2.34.1'
07:50:55 using GIT_SSH to set credentials jenkins-ssh
07:50:55 Verifying host key using known hosts file
07:50:55 You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification.
07:50:55 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/transportpce +refs/heads/*:refs/remotes/origin/* # timeout=10
07:50:58 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/transportpce # timeout=10
07:50:58 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
07:50:59 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/transportpce # timeout=10
07:50:59 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/transportpce
07:50:59 using GIT_SSH to set credentials jenkins-ssh
07:50:59 Verifying host key using known hosts file
07:50:59 You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification.
07:50:59 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/transportpce refs/changes/07/120307/9 # timeout=10
07:50:59 > git rev-parse ae8ea2c56154e51c3e9d77bca123c6b5a7f883e7^{commit} # timeout=10
07:50:59 JENKINS-19022: warning: possible memory leak due to Git plugin usage; see: https://plugins.jenkins.io/git/#remove-git-plugin-buildsbybranch-builddata-script
07:50:59 Checking out Revision ae8ea2c56154e51c3e9d77bca123c6b5a7f883e7 (refs/changes/07/120307/9)
07:50:59 > git config core.sparsecheckout # timeout=10
07:50:59 > git checkout -f ae8ea2c56154e51c3e9d77bca123c6b5a7f883e7 # timeout=10
07:50:59 Commit message: "TAPI - Normalize SortedRange by merging intervals"
07:50:59 > git rev-parse FETCH_HEAD^{commit} # timeout=10
07:51:00 > git rev-list --no-walk a8344161044afb611a389216b8361ea78f4e4ad5 # timeout=10
07:51:00 > git remote # timeout=10
07:51:00 > git submodule init # timeout=10
07:51:00 > git submodule sync # timeout=10
07:51:00 > git config --get remote.origin.url # timeout=10
07:51:00 > git submodule init # timeout=10
07:51:00 > git config -f .gitmodules --get-regexp ^submodule\.(.+)\.url # timeout=10
07:51:00 ERROR: No submodules found.
07:51:03 provisioning config files...
07:51:03 copy managed file [npmrc] to file:/home/jenkins/.npmrc
07:51:03 copy managed file [pipconf] to file:/home/jenkins/.config/pip/pip.conf
07:51:03 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins5981142478672720690.sh
07:51:03 ---> python-tools-install.sh
07:51:03 Setup pyenv:
07:51:03 * system (set by /opt/pyenv/version)
07:51:03 * 3.8.20 (set by /opt/pyenv/version)
07:51:03 * 3.9.20 (set by /opt/pyenv/version)
07:51:03 3.10.15
07:51:03 3.11.10
07:51:08 lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-g0aN
07:51:08 lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv
07:51:08 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
07:51:08 lf-activate-venv(): INFO: Attempting to install with network-safe options...
07:51:12 lf-activate-venv(): INFO: Base packages installed successfully
07:51:12 lf-activate-venv(): INFO: Installing additional packages: lftools
07:51:45 lf-activate-venv(): INFO: Adding /tmp/venv-g0aN/bin to PATH
07:51:45 Generating Requirements File
07:52:06 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
07:52:06 httplib2 0.30.2 requires pyparsing<4,>=3.0.4, but you have pyparsing 2.4.7 which is incompatible.
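[Editor's note] The fetch above pulls the change under review through a Gerrit change ref (refs/changes/07/120307/9) rather than a branch. For readers unfamiliar with the scheme, the ref is derived from the change number and patchset; a minimal sketch (the `change`/`patchset` variables are illustrative, taken from this build):

```shell
# Gerrit publishes each patchset under refs/changes/<NN>/<change>/<patchset>,
# where <NN> is the last two digits of the change number.
change=120307
patchset=9
nn=$(printf '%s' "$change" | tail -c 2)   # last two digits -> "07"
ref="refs/changes/${nn}/${change}/${patchset}"
echo "$ref"                               # -> refs/changes/07/120307/9
```

The job then fetches that single ref and checks out FETCH_HEAD in detached state, which is why the log shows `git checkout -f <sha>` rather than a branch checkout.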
07:52:06 Python 3.11.10
07:52:07 pip 26.0.1 from /tmp/venv-g0aN/lib/python3.11/site-packages/pip (python 3.11)
07:52:07 appdirs==1.4.4
07:52:07 argcomplete==3.6.3
07:52:07 aspy.yaml==1.3.0
07:52:07 attrs==25.4.0
07:52:07 autopage==0.6.0
07:52:07 beautifulsoup4==4.14.3
07:52:07 boto3==1.42.57
07:52:07 botocore==1.42.57
07:52:07 bs4==0.0.2
07:52:07 certifi==2026.2.25
07:52:07 cffi==2.0.0
07:52:07 cfgv==3.5.0
07:52:07 chardet==6.0.0.post1
07:52:07 charset-normalizer==3.4.4
07:52:07 click==8.3.1
07:52:07 cliff==4.13.2
07:52:07 cmd2==3.2.2
07:52:07 cryptography==3.3.2
07:52:07 debtcollector==3.0.0
07:52:07 decorator==5.2.1
07:52:07 defusedxml==0.7.1
07:52:07 Deprecated==1.3.1
07:52:07 distlib==0.4.0
07:52:07 dnspython==2.8.0
07:52:07 docker==7.1.0
07:52:07 dogpile.cache==1.5.0
07:52:07 durationpy==0.10
07:52:07 email-validator==2.3.0
07:52:07 filelock==3.24.3
07:52:07 future==1.0.0
07:52:07 gitdb==4.0.12
07:52:07 GitPython==3.1.46
07:52:07 httplib2==0.30.2
07:52:07 identify==2.6.16
07:52:07 idna==3.11
07:52:07 importlib-resources==1.5.0
07:52:07 iso8601==2.1.0
07:52:07 Jinja2==3.1.6
07:52:07 jmespath==1.1.0
07:52:07 jsonpatch==1.33
07:52:07 jsonpointer==3.0.0
07:52:07 jsonschema==4.26.0
07:52:07 jsonschema-specifications==2025.9.1
07:52:07 keystoneauth1==5.13.1
07:52:07 kubernetes==35.0.0
07:52:07 lftools==0.37.21
07:52:07 lxml==6.0.2
07:52:07 markdown-it-py==4.0.0
07:52:07 MarkupSafe==3.0.3
07:52:07 mdurl==0.1.2
07:52:07 msgpack==1.1.2
07:52:07 multi_key_dict==2.0.3
07:52:07 munch==4.0.0
07:52:07 netaddr==1.3.0
07:52:07 niet==1.4.2
07:52:07 nodeenv==1.10.0
07:52:07 oauth2client==4.1.3
07:52:07 oauthlib==3.3.1
07:52:07 openstacksdk==4.10.0
07:52:07 os-service-types==1.8.2
07:52:07 osc-lib==4.4.0
07:52:07 oslo.config==10.3.0
07:52:07 oslo.context==6.3.0
07:52:07 oslo.i18n==6.7.2
07:52:07 oslo.log==8.1.0
07:52:07 oslo.serialization==5.9.1
07:52:07 oslo.utils==10.0.0
07:52:07 packaging==26.0
07:52:07 pbr==7.0.3
07:52:07 platformdirs==4.9.2
07:52:07 prettytable==3.17.0
07:52:07 psutil==7.2.2
07:52:07 pyasn1==0.6.2
07:52:07 pyasn1_modules==0.4.2
07:52:07 pycparser==3.0
07:52:07 pygerrit2==2.0.15
07:52:07 PyGithub==2.8.1
07:52:07 Pygments==2.19.2
07:52:07 PyJWT==2.11.0
07:52:07 PyNaCl==1.6.2
07:52:07 pyparsing==2.4.7
07:52:07 pyperclip==1.11.0
07:52:07 pyrsistent==0.20.0
07:52:07 python-cinderclient==9.8.0
07:52:07 python-dateutil==2.9.0.post0
07:52:07 python-discovery==1.0.0
07:52:07 python-heatclient==5.0.0
07:52:07 python-jenkins==1.8.3
07:52:07 python-keystoneclient==5.7.0
07:52:07 python-magnumclient==4.9.0
07:52:07 python-openstackclient==9.0.0
07:52:07 python-swiftclient==4.10.0
07:52:07 PyYAML==6.0.3
07:52:07 referencing==0.37.0
07:52:07 requests==2.32.5
07:52:07 requests-oauthlib==2.0.0
07:52:07 requestsexceptions==1.4.0
07:52:07 rfc3986==2.0.0
07:52:07 rich==14.3.3
07:52:07 rich-argparse==1.7.2
07:52:07 rpds-py==0.30.0
07:52:07 rsa==4.9.1
07:52:07 ruamel.yaml==0.19.1
07:52:07 ruamel.yaml.clib==0.2.15
07:52:07 s3transfer==0.16.0
07:52:07 simplejson==3.20.2
07:52:07 six==1.17.0
07:52:07 smmap==5.0.2
07:52:07 soupsieve==2.8.3
07:52:07 stevedore==5.7.0
07:52:07 tabulate==0.9.0
07:52:07 toml==0.10.2
07:52:07 tomlkit==0.14.0
07:52:07 tqdm==4.67.3
07:52:07 typing_extensions==4.15.0
07:52:07 urllib3==1.26.20
07:52:07 virtualenv==21.0.0
07:52:07 wcwidth==0.6.0
07:52:07 websocket-client==1.9.0
07:52:07 wrapt==2.1.1
07:52:07 xdg==6.0.0
07:52:07 xmltodict==1.0.4
07:52:07 yq==3.4.3
07:52:07 [EnvInject] - Injecting environment variables from a build step.
07:52:07 [EnvInject] - Injecting as environment variables the properties content
07:52:07 PYTHON=python3
07:52:07
07:52:07 [EnvInject] - Variables injected successfully.
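[Editor's note] The [EnvInject] step above turns a properties blob (here `PYTHON=python3`) into environment variables for the next build step. A rough shell equivalent of that behaviour, sketched with a hypothetical temp file, not the plugin's actual code:

```shell
# Sketch: export KEY=VALUE pairs from a properties file, as EnvInject does.
props=$(mktemp)
printf 'PYTHON=python3\nPARALLEL=True\n' > "$props"
set -a          # auto-export every variable assigned while sourcing
. "$props"
set +a
echo "$PYTHON $PARALLEL"   # -> python3 True
rm -f "$props"
```

`set -a` is what makes plain `KEY=VALUE` assignments in the sourced file visible to child processes, matching the "Variables injected successfully" effect seen in the log.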
07:52:07 [transportpce-tox-verify-transportpce-master] $ /bin/bash -l /tmp/jenkins1856673535101516982.sh
07:52:07 ---> tox-install.sh
07:52:07 + source /home/jenkins/lf-env.sh
07:52:07 + lf-activate-venv --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15
07:52:07 ++ mktemp -d /tmp/venv-XXXX
07:52:07 + lf_venv=/tmp/venv-J0D4
07:52:07 + local venv_file=/tmp/.os_lf_venv
07:52:07 + local python=python3
07:52:07 + local options
07:52:07 + local set_path=true
07:52:07 + local install_args=
07:52:07 ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15
07:52:07 + options=' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\'''
07:52:07 + eval set -- ' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\'''
07:52:07 ++ set -- --venv-file /tmp/.toxenv -- tox virtualenv urllib3~=1.26.15
07:52:07 + true
07:52:07 + case $1 in
07:52:07 + venv_file=/tmp/.toxenv
07:52:07 + shift 2
07:52:07 + true
07:52:07 + case $1 in
07:52:07 + shift
07:52:07 + break
07:52:07 + case $python in
07:52:07 + local pkg_list=
07:52:07 + [[ -d /opt/pyenv ]]
07:52:07 + echo 'Setup pyenv:'
07:52:07 Setup pyenv:
07:52:07 + export PYENV_ROOT=/opt/pyenv
07:52:07 + PYENV_ROOT=/opt/pyenv
07:52:07 + export PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
07:52:07 + PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
07:52:07 + pyenv versions
07:52:07 system
07:52:07 3.8.20
07:52:07 3.9.20
07:52:07 3.10.15
07:52:07 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version)
07:52:07 + command -v pyenv
07:52:07 ++ pyenv init - --no-rehash
07:52:07 + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH);
07:52:07 for i in ${!paths[@]}; do
07:52:07 if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\'';
07:52:07 fi; done;
07:52:07 echo "${paths[*]}"'\'')"
07:52:07 export PATH="/opt/pyenv/shims:${PATH}"
07:52:07 export PYENV_SHELL=bash
07:52:07 source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\''
07:52:07 pyenv() {
07:52:07 local command
07:52:07 command="${1:-}"
07:52:07 if [ "$#" -gt 0 ]; then
07:52:07 shift
07:52:07 fi
07:52:07
07:52:07 case "$command" in
07:52:07 rehash|shell)
07:52:07 eval "$(pyenv "sh-$command" "$@")"
07:52:07 ;;
07:52:07 *)
07:52:07 command pyenv "$command" "$@"
07:52:07 ;;
07:52:07 esac
07:52:07 }'
07:52:07 +++ bash --norc -ec 'IFS=:; paths=($PATH);
07:52:07 for i in ${!paths[@]}; do
07:52:07 if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\'';
07:52:07 fi; done;
07:52:07 echo "${paths[*]}"'
07:52:07 ++ PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
07:52:07 ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
07:52:07 ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
07:52:07 ++ export PYENV_SHELL=bash
07:52:07 ++ PYENV_SHELL=bash
07:52:07 ++ source /opt/pyenv/libexec/../completions/pyenv.bash
07:52:07 +++ complete -F _pyenv pyenv
07:52:07 ++ lf-pyver python3
07:52:07 ++ local py_version_xy=python3
07:52:07 ++ local py_version_xyz=
07:52:07 ++ pyenv versions
07:52:07 ++ local command
07:52:07 ++ command=versions
07:52:07 ++ '[' 1 -gt 0 ']'
07:52:07 ++ shift
07:52:07 ++ case "$command" in
07:52:07 ++ command pyenv versions
07:52:07 ++ grep -E '^[0-9.]*[0-9]$'
07:52:07 ++ sed 's/^[ *]* //'
07:52:07 ++ awk '{ print $1 }'
07:52:07 ++ [[ ! -s /tmp/.pyenv_versions ]]
07:52:07 +++ sort -V
07:52:07 +++ tail -n 1
07:52:07 +++ grep '^3' /tmp/.pyenv_versions
07:52:07 ++ py_version_xyz=3.11.10
07:52:07 ++ [[ -z 3.11.10 ]]
07:52:07 ++ echo 3.11.10
07:52:07 ++ return 0
07:52:07 + pyenv local 3.11.10
07:52:07 + local command
07:52:07 + command=local
07:52:07 + '[' 2 -gt 0 ']'
07:52:07 + shift
07:52:07 + case "$command" in
07:52:07 + command pyenv local 3.11.10
07:52:07 + for arg in "$@"
07:52:07 + case $arg in
07:52:07 + pkg_list+='tox '
07:52:07 + for arg in "$@"
07:52:07 + case $arg in
07:52:07 + pkg_list+='virtualenv '
07:52:07 + for arg in "$@"
07:52:07 + case $arg in
07:52:07 + pkg_list+='urllib3~=1.26.15 '
07:52:07 + [[ -f /tmp/.toxenv ]]
07:52:07 + [[ ! -f /tmp/.toxenv ]]
07:52:07 + [[ -n '' ]]
07:52:07 + python3 -m venv /tmp/venv-J0D4
07:52:11 + echo 'lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-J0D4'
07:52:11 lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-J0D4
07:52:11 + echo /tmp/venv-J0D4
07:52:11 + echo 'lf-activate-venv(): INFO: Save venv in file: /tmp/.toxenv'
07:52:11 lf-activate-venv(): INFO: Save venv in file: /tmp/.toxenv
07:52:11 + echo 'lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)'
07:52:11 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
07:52:11 + local 'pip_opts=--upgrade --quiet'
07:52:11 + pip_opts='--upgrade --quiet --trusted-host pypi.org'
07:52:11 + pip_opts='--upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org'
07:52:11 + pip_opts='--upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org'
07:52:11 + [[ -n '' ]]
07:52:11 + [[ -n '' ]]
07:52:11 + echo 'lf-activate-venv(): INFO: Attempting to install with network-safe options...'
07:52:11 lf-activate-venv(): INFO: Attempting to install with network-safe options...
07:52:11 + /tmp/venv-J0D4/bin/python3 -m pip install --upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org pip 'setuptools<66' virtualenv
07:52:15 + echo 'lf-activate-venv(): INFO: Base packages installed successfully'
07:52:15 lf-activate-venv(): INFO: Base packages installed successfully
07:52:15 + [[ -z tox virtualenv urllib3~=1.26.15 ]]
07:52:15 + echo 'lf-activate-venv(): INFO: Installing additional packages: tox virtualenv urllib3~=1.26.15 '
07:52:15 lf-activate-venv(): INFO: Installing additional packages: tox virtualenv urllib3~=1.26.15
07:52:15 + /tmp/venv-J0D4/bin/python3 -m pip install --upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org --upgrade-strategy eager tox virtualenv urllib3~=1.26.15
07:52:17 + type python3
07:52:17 + true
07:52:17 + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-J0D4/bin to PATH'
07:52:17 lf-activate-venv(): INFO: Adding /tmp/venv-J0D4/bin to PATH
07:52:17 + PATH=/tmp/venv-J0D4/bin:/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
07:52:17 + return 0
07:52:17 + python3 --version
07:52:17 Python 3.11.10
07:52:17 + python3 -m pip --version
07:52:18 pip 26.0.1 from /tmp/venv-J0D4/lib/python3.11/site-packages/pip (python 3.11)
07:52:18 + python3 -m pip freeze
07:52:18 cachetools==7.0.1
07:52:18 colorama==0.4.6
07:52:18 distlib==0.4.0
07:52:18 filelock==3.24.3
07:52:18 packaging==26.0
07:52:18 platformdirs==4.9.2
07:52:18 pluggy==1.6.0
07:52:18 pyproject-api==1.10.0
07:52:18 python-discovery==1.0.0
07:52:18 tox==4.46.3
07:52:18 urllib3==1.26.20
07:52:18 virtualenv==21.0.0
07:52:18 [transportpce-tox-verify-transportpce-master] $ /bin/sh -xe /tmp/jenkins12132026692129047883.sh
07:52:18 [EnvInject] - Injecting environment variables from a build step.
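[Editor's note] tox-install.sh above records the freshly created venv's path in /tmp/.toxenv; the later tox-run step finds that file and reuses /tmp/venv-J0D4 instead of building a second venv. Stripped of the pip details, the caching pattern is roughly (file and directory names here are stand-ins, not the job's real paths):

```shell
# Sketch of lf-activate-venv's venv-file cache: create once, record the
# path, and reuse it on subsequent calls.
venv_file=$(mktemp)          # stand-in for /tmp/.toxenv
rm -f "$venv_file"           # start with no cache

activate() {
  if [ -s "$venv_file" ]; then
    lf_venv=$(cat "$venv_file")        # reuse the recorded venv path
    echo "reuse $lf_venv"
  else
    lf_venv=$(mktemp -d /tmp/venv-XXXX)  # first call: make a new one
    echo "$lf_venv" > "$venv_file"       # remember it for next time
    echo "create $lf_venv"
  fi
}

activate   # first call: create
activate   # second call: reuse the same path
```

This matches the log: the first invocation prints "Save venv in file: /tmp/.toxenv", and the tox-run invocation later prints "Reuse venv:/tmp/venv-J0D4 from file:/tmp/.toxenv".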
07:52:18 [EnvInject] - Injecting as environment variables the properties content
07:52:18 PARALLEL=True
07:52:18
07:52:18 [EnvInject] - Variables injected successfully.
07:52:18 [transportpce-tox-verify-transportpce-master] $ /bin/bash -l /tmp/jenkins12307488610984914687.sh
07:52:18 ---> tox-run.sh
07:52:18 + PATH=/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
07:52:18 + ARCHIVE_TOX_DIR=/w/workspace/transportpce-tox-verify-transportpce-master/archives/tox
07:52:18 + ARCHIVE_DOC_DIR=/w/workspace/transportpce-tox-verify-transportpce-master/archives/docs
07:52:18 + mkdir -p /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox
07:52:18 + cd /w/workspace/transportpce-tox-verify-transportpce-master/.
07:52:18 + source /home/jenkins/lf-env.sh
07:52:18 + lf-activate-venv --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15
07:52:18 ++ mktemp -d /tmp/venv-XXXX
07:52:18 + lf_venv=/tmp/venv-mZfH
07:52:18 + local venv_file=/tmp/.os_lf_venv
07:52:18 + local python=python3
07:52:18 + local options
07:52:18 + local set_path=true
07:52:18 + local install_args=
07:52:18 ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15
07:52:18 + options=' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\'''
07:52:18 + eval set -- ' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\'''
07:52:18 ++ set -- --venv-file /tmp/.toxenv -- tox virtualenv urllib3~=1.26.15
07:52:18 + true
07:52:18 + case $1 in
07:52:18 + venv_file=/tmp/.toxenv
07:52:18 + shift 2
07:52:18 + true
07:52:18 + case $1 in
07:52:18 + shift
07:52:18 + break
07:52:18 + case $python in
07:52:18 + local pkg_list=
07:52:18 + [[ -d /opt/pyenv ]]
07:52:18 + echo 'Setup pyenv:'
07:52:18 Setup pyenv:
07:52:18 + export PYENV_ROOT=/opt/pyenv
07:52:18 + PYENV_ROOT=/opt/pyenv
07:52:18 + export PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
07:52:18 + PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
07:52:18 + pyenv versions
07:52:18 system
07:52:18 3.8.20
07:52:18 3.9.20
07:52:18 3.10.15
07:52:18 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version)
07:52:18 + command -v pyenv
07:52:18 ++ pyenv init - --no-rehash
07:52:18 + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH);
07:52:18 for i in ${!paths[@]}; do
07:52:18 if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\'';
07:52:18 fi; done;
07:52:18 echo "${paths[*]}"'\'')"
07:52:18 export PATH="/opt/pyenv/shims:${PATH}"
07:52:18 export PYENV_SHELL=bash
07:52:18 source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\''
07:52:18 pyenv() {
07:52:18 local command
07:52:18 command="${1:-}"
07:52:18 if [ "$#" -gt 0 ]; then
07:52:18 shift
07:52:18 fi
07:52:18
07:52:18 case "$command" in
07:52:18 rehash|shell)
07:52:18 eval "$(pyenv "sh-$command" "$@")"
07:52:18 ;;
07:52:18 *)
07:52:18 command pyenv "$command" "$@"
07:52:18 ;;
07:52:18 esac
07:52:18 }'
07:52:18 +++ bash --norc -ec 'IFS=:; paths=($PATH);
07:52:18 for i in ${!paths[@]}; do
07:52:18 if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\'';
07:52:18 fi; done;
07:52:18 echo "${paths[*]}"'
07:52:18 ++ PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
07:52:18 ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
07:52:18 ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
07:52:18 ++ export PYENV_SHELL=bash
07:52:18 ++ PYENV_SHELL=bash
07:52:18 ++ source /opt/pyenv/libexec/../completions/pyenv.bash
07:52:18 +++ complete -F _pyenv pyenv
07:52:18 ++ lf-pyver python3
07:52:18 ++ local py_version_xy=python3
07:52:18 ++ local py_version_xyz=
07:52:18 ++ sed 's/^[ *]* //'
07:52:18 ++ pyenv versions
07:52:18 ++ local command
07:52:18 ++ command=versions
07:52:18 ++ '[' 1 -gt 0 ']'
07:52:18 ++ shift
07:52:18 ++ case "$command" in
07:52:18 ++ command pyenv versions
07:52:18 ++ grep -E '^[0-9.]*[0-9]$'
07:52:18 ++ awk '{ print $1 }'
07:52:18 ++ [[ ! -s /tmp/.pyenv_versions ]]
07:52:18 +++ grep '^3' /tmp/.pyenv_versions
07:52:18 +++ tail -n 1
07:52:18 +++ sort -V
07:52:18 ++ py_version_xyz=3.11.10
07:52:18 ++ [[ -z 3.11.10 ]]
07:52:18 ++ echo 3.11.10
07:52:18 ++ return 0
07:52:18 + pyenv local 3.11.10
07:52:18 + local command
07:52:18 + command=local
07:52:18 + '[' 2 -gt 0 ']'
07:52:18 + shift
07:52:18 + case "$command" in
07:52:18 + command pyenv local 3.11.10
07:52:18 + for arg in "$@"
07:52:18 + case $arg in
07:52:18 + pkg_list+='tox '
07:52:18 + for arg in "$@"
07:52:18 + case $arg in
07:52:18 + pkg_list+='virtualenv '
07:52:18 + for arg in "$@"
07:52:18 + case $arg in
07:52:18 + pkg_list+='urllib3~=1.26.15 '
07:52:18 + [[ -f /tmp/.toxenv ]]
07:52:18 ++ cat /tmp/.toxenv
07:52:18 + lf_venv=/tmp/venv-J0D4
07:52:18 + echo 'lf-activate-venv(): INFO: Reuse venv:/tmp/venv-J0D4 from' file:/tmp/.toxenv
07:52:18 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-J0D4 from file:/tmp/.toxenv
07:52:18 + echo 'lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)'
07:52:18 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
07:52:18 + local 'pip_opts=--upgrade --quiet'
07:52:18 + pip_opts='--upgrade --quiet --trusted-host pypi.org'
07:52:18 + pip_opts='--upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org'
07:52:18 + pip_opts='--upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org'
07:52:18 + [[ -n '' ]]
07:52:18 + [[ -n '' ]]
07:52:18 + echo 'lf-activate-venv(): INFO: Attempting to install with network-safe options...'
07:52:18 lf-activate-venv(): INFO: Attempting to install with network-safe options...
07:52:18 + /tmp/venv-J0D4/bin/python3 -m pip install --upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org pip 'setuptools<66' virtualenv
07:52:19 + echo 'lf-activate-venv(): INFO: Base packages installed successfully'
07:52:19 lf-activate-venv(): INFO: Base packages installed successfully
07:52:19 + [[ -z tox virtualenv urllib3~=1.26.15 ]]
07:52:19 + echo 'lf-activate-venv(): INFO: Installing additional packages: tox virtualenv urllib3~=1.26.15 '
07:52:19 lf-activate-venv(): INFO: Installing additional packages: tox virtualenv urllib3~=1.26.15
07:52:19 + /tmp/venv-J0D4/bin/python3 -m pip install --upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org --upgrade-strategy eager tox virtualenv urllib3~=1.26.15
07:52:20 + type python3
07:52:20 + true
07:52:20 + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-J0D4/bin to PATH'
07:52:20 lf-activate-venv(): INFO: Adding /tmp/venv-J0D4/bin to PATH
07:52:20 + PATH=/tmp/venv-J0D4/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
07:52:20 + return 0
07:52:20 + [[ -d /opt/pyenv ]]
07:52:20 + echo '---> Setting up pyenv'
07:52:20 ---> Setting up pyenv
07:52:20 + export PYENV_ROOT=/opt/pyenv
07:52:20 + PYENV_ROOT=/opt/pyenv
07:52:20 + export PATH=/opt/pyenv/bin:/tmp/venv-J0D4/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
07:52:20 + PATH=/opt/pyenv/bin:/tmp/venv-J0D4/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin
07:52:20 ++ pwd
07:52:20 + PYTHONPATH=/w/workspace/transportpce-tox-verify-transportpce-master
07:52:20 + export PYTHONPATH
07:52:20 + export TOX_TESTENV_PASSENV=PYTHONPATH
07:52:20 + TOX_TESTENV_PASSENV=PYTHONPATH
07:52:20 + tox --version
07:52:20 4.46.3 from /tmp/venv-J0D4/lib/python3.11/site-packages/tox/__init__.py
07:52:21 + PARALLEL=True
07:52:21 + TOX_OPTIONS_LIST=
07:52:21 + [[ -n '' ]]
07:52:21 + case ${PARALLEL,,} in
07:52:21 + TOX_OPTIONS_LIST=' --parallel auto --parallel-live'
07:52:21 + tox --parallel auto --parallel-live
07:52:21 + tee -a /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tox.log
07:52:22 docs-linkcheck: install_deps> python -I -m pip install -r docs/requirements.txt
07:52:22 docs: install_deps> python -I -m pip install -r docs/requirements.txt
07:52:22 buildcontroller: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
07:52:22 checkbashisms: freeze> python -m pip freeze --all
07:52:23 checkbashisms: pip==26.0.1,setuptools==82.0.0
07:52:23 checkbashisms: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./fixCIcentOS8reposMirrors.sh
07:52:23 checkbashisms: commands[1] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sh -c 'command checkbashisms>/dev/null || sudo yum install -y devscripts-checkbashisms || sudo yum install -y devscripts-minimal || sudo yum install -y devscripts || sudo yum install -y https://archives.fedoraproject.org/pub/archive/fedora/linux/releases/31/Everything/x86_64/os/Packages/d/devscripts-checkbashisms-2.19.6-2.fc31.x86_64.rpm || (echo "checkbashisms command not found - please install it (e.g. sudo apt-get install devscripts | yum install devscripts-minimal )" >&2 && exit 1)'
07:52:23 checkbashisms: commands[2] /w/workspace/transportpce-tox-verify-transportpce-master/tests> find . -not -path '*/\.*' -name '*.sh' -exec checkbashisms -f '{}' +
07:52:24 checkbashisms: OK ✔ in 3.08 seconds
07:52:24 pre-commit: install_deps> python -I -m pip install pre-commit
07:52:26 pre-commit: freeze> python -m pip freeze --all
07:52:27 pre-commit: cfgv==3.5.0,distlib==0.4.0,filelock==3.24.3,identify==2.6.16,nodeenv==1.10.0,pip==26.0.1,platformdirs==4.9.2,pre_commit==4.5.1,python-discovery==1.0.0,PyYAML==6.0.3,setuptools==82.0.0,virtualenv==21.0.0
07:52:27 pre-commit: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./fixCIcentOS8reposMirrors.sh
07:52:27 pre-commit: commands[1] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sh -c 'which cpan || sudo yum install -y perl-CPAN || (echo "cpan command not found - please install it (e.g. sudo apt-get install perl-modules | yum install perl-CPAN )" >&2 && exit 1)'
07:52:27 /usr/bin/cpan
07:52:27 pre-commit: commands[2] /w/workspace/transportpce-tox-verify-transportpce-master/tests> pre-commit run --all-files --show-diff-on-failure
07:52:27 [WARNING] hook id `remove-tabs` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this.
07:52:27 [WARNING] hook id `perltidy` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this.
07:52:27 [INFO] Initializing environment for https://github.com/pre-commit/pre-commit-hooks.
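[Editor's note] checkbashisms' commands[1] above is a fallback chain: use the tool if it is already on PATH, otherwise try progressively broader package installs, and only then fail with an install hint. Stripped of the yum specifics, the pattern looks like this (`ensure_tool` is a hypothetical helper, not part of the job):

```shell
# Sketch: probe for a tool, fall back, and fail with a hint on stderr.
ensure_tool() {
  command -v "$1" >/dev/null 2>&1 && return 0
  # the real job tries here: sudo yum install -y ... || sudo yum install -y ...
  echo "$1 command not found - please install it" >&2
  return 1
}

ensure_tool sh && echo "sh present"
ensure_tool definitely-not-a-real-tool || echo "fallback message emitted"
```

Chaining with `||` means each installer only runs if everything before it failed, and the final parenthesized `echo ... && exit 1` is what turns a missing tool into a nonzero tox-command exit.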
07:52:28 [WARNING] repo `https://github.com/pre-commit/pre-commit-hooks` uses deprecated stage names (commit, push) which will be removed in a future version. Hint: often `pre-commit autoupdate --repo https://github.com/pre-commit/pre-commit-hooks` will fix this. if it does not -- consider reporting an issue to that repo.
07:52:28 [INFO] Initializing environment for https://github.com/jorisroovers/gitlint.
07:52:28 [INFO] Initializing environment for https://github.com/jorisroovers/gitlint:./gitlint-core[trusted-deps].
07:52:29 [INFO] Initializing environment for https://github.com/Lucas-C/pre-commit-hooks.
07:52:29 buildcontroller: freeze> python -m pip freeze --all
07:52:29 [INFO] Initializing environment for https://github.com/pre-commit/mirrors-autopep8.
07:52:29 buildcontroller: bcrypt==5.0.0,certifi==2026.2.25,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3
07:52:29 buildcontroller: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_controller.sh
07:52:29 + update-java-alternatives -l
07:52:29 java-1.11.0-openjdk-amd64 1111 /usr/lib/jvm/java-1.11.0-openjdk-amd64
07:52:29 java-1.17.0-openjdk-amd64 1711 /usr/lib/jvm/java-1.17.0-openjdk-amd64
07:52:29 java-1.21.0-openjdk-amd64 2111 /usr/lib/jvm/java-1.21.0-openjdk-amd64
07:52:29 + sudo update-java-alternatives -s java-1.21.0-openjdk-amd64
07:52:29 update-alternatives: error: no alternatives for jaotc
07:52:29 [INFO] Initializing environment for https://github.com/perltidy/perltidy.
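[Editor's note] build_controller.sh gates the build on the JDK major version (`[ 21 -ge 21 ]` below), parsed out of the `java -version` banner with sed; the log's `+ + sed -n ;s/...` lines are that xtrace output interleaved between parallel tox environments. The extraction can be reproduced standalone (the version string below is a hard-coded example, not read from a live JVM):

```shell
# Sketch: pull the major version out of a `java -version`-style banner.
banner='openjdk version "21.0.9" 2025-10-21'
JAVA_VER=$(printf '%s\n' "$banner" | sed -n 's/.* version "\(.*\)\.\(.*\)\..*".*$/\1/p')
echo "$JAVA_VER"                                   # -> 21
[ "$JAVA_VER" -ge 21 ] && echo 'ok, java is 21 or newer'
```

The sed expression captures everything before the first dot inside the quoted version, so a banner quoting "21.0.9" yields 21, matching the `JAVA_VER=21` and `ok, java is 21 or newer` lines in the log.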
07:52:29 update-alternatives: error: no alternatives for rmic
07:52:29 + java -version
07:52:29 + sed -n 's/.* version "\(.*\)\.\(.*\)\..*".*$/\1/p'
07:52:29 + JAVA_VER=21
07:52:29 + echo 21
07:52:29 21
07:52:29 + javac -version
07:52:29 + sed -n 's/javac \(.*\)\.\(.*\)\..*.*$/\1/p'
07:52:30 + JAVAC_VER=21
07:52:30 + echo 21
07:52:30 + [ 21 -ge 21 ]
07:52:30 + [ 21 -ge 21 ]
07:52:30 + echo ok, java is 21 or newer
07:52:30 + wget -nv https://dlcdn.apache.org/maven/maven-3/3.9.12/binaries/apache-maven-3.9.12-bin.tar.gz -P /tmp
07:52:30 21
07:52:30 ok, java is 21 or newer
07:52:30 [INFO] Installing environment for https://github.com/pre-commit/pre-commit-hooks.
07:52:30 [INFO] Once installed this environment will be reused.
07:52:30 [INFO] This may take a few minutes...
07:52:30 2026-02-26 07:52:30 URL:https://dlcdn.apache.org/maven/maven-3/3.9.12/binaries/apache-maven-3.9.12-bin.tar.gz [9233336/9233336] -> "/tmp/apache-maven-3.9.12-bin.tar.gz" [1]
07:52:30 + sudo mkdir -p /opt
07:52:30 + sudo tar xf /tmp/apache-maven-3.9.12-bin.tar.gz -C /opt
07:52:30 + sudo ln -s /opt/apache-maven-3.9.12 /opt/maven
07:52:30 + sudo ln -s /opt/maven/bin/mvn /usr/bin/mvn
07:52:30 + mvn --version
07:52:31 Apache Maven 3.9.12 (848fbb4bf2d427b72bdb2471c22fced7ebd9a7a1)
07:52:31 Maven home: /opt/maven
07:52:31 Java version: 21.0.9, vendor: Ubuntu, runtime: /usr/lib/jvm/java-21-openjdk-amd64
07:52:31 Default locale: en, platform encoding: UTF-8
07:52:31 OS name: "linux", version: "5.15.0-168-generic", arch: "amd64", family: "unix"
07:52:31 NOTE: Picked up JDK_JAVA_OPTIONS:
07:52:31 --add-opens=java.base/java.io=ALL-UNNAMED
07:52:31 --add-opens=java.base/java.lang=ALL-UNNAMED
07:52:31 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED
07:52:31 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED
07:52:31 --add-opens=java.base/java.net=ALL-UNNAMED
07:52:31 --add-opens=java.base/java.nio=ALL-UNNAMED
07:52:31 --add-opens=java.base/java.nio.charset=ALL-UNNAMED
07:52:31 --add-opens=java.base/java.nio.file=ALL-UNNAMED
07:52:31 --add-opens=java.base/java.util=ALL-UNNAMED
07:52:31 --add-opens=java.base/java.util.jar=ALL-UNNAMED
07:52:31 --add-opens=java.base/java.util.stream=ALL-UNNAMED
07:52:31 --add-opens=java.base/java.util.zip=ALL-UNNAMED
07:52:31 --add-opens java.base/sun.nio.ch=ALL-UNNAMED
07:52:31 --add-opens java.base/sun.nio.fs=ALL-UNNAMED
07:52:31 -Xlog:disable
07:52:34 [INFO] Installing environment for https://github.com/Lucas-C/pre-commit-hooks.
07:52:34 [INFO] Once installed this environment will be reused.
07:52:34 [INFO] This may take a few minutes...
07:52:47 [INFO] Installing environment for https://github.com/pre-commit/mirrors-autopep8.
07:52:47 [INFO] Once installed this environment will be reused.
07:52:47 [INFO] This may take a few minutes...
07:52:54 [INFO] Installing environment for https://github.com/perltidy/perltidy.
07:52:54 [INFO] Once installed this environment will be reused.
07:52:54 [INFO] This may take a few minutes...
07:52:57 docs-linkcheck: freeze> python -m pip freeze --all
07:52:57 docs: freeze> python -m pip freeze --all
07:52:58 docs-linkcheck:
alabaster==1.0.0,attrs==25.4.0,babel==2.18.0,blockdiag==3.0.0,certifi==2026.2.25,charset-normalizer==3.4.4,contourpy==1.3.3,cycler==0.12.1,docutils==0.21.2,fonttools==4.61.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.11,imagesize==1.4.1,Jinja2==3.1.6,jsonschema==3.2.0,kiwisolver==1.4.9,lfdocs_conf==0.10.0,MarkupSafe==3.0.3,matplotlib==3.10.8,numpy==2.4.2,nwdiag==3.0.0,packaging==26.0,pillow==12.1.1,pip==26.0.1,Pygments==2.19.2,pyparsing==3.3.2,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.3,requests==2.32.5,requests-file==1.5.1,roman-numerals==4.1.0,roman-numerals-py==4.1.0,seqdiag==3.0.0,setuptools==82.0.0,six==1.17.0,snowballstemmer==3.0.1,Sphinx==8.2.3,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-tabs==3.4.7,sphinx_rtd_theme==3.1.0,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml==0.31,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3==2.6.3,webcolors==25.10.0 07:52:58 docs-linkcheck: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sphinx-build -q -b linkcheck -d /w/workspace/transportpce-tox-verify-transportpce-master/.tox/docs-linkcheck/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-transportpce-master/docs/_build/linkcheck 07:52:58 docs: 
alabaster==1.0.0,attrs==25.4.0,babel==2.18.0,blockdiag==3.0.0,certifi==2026.2.25,charset-normalizer==3.4.4,contourpy==1.3.3,cycler==0.12.1,docutils==0.21.2,fonttools==4.61.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.11,imagesize==1.4.1,Jinja2==3.1.6,jsonschema==3.2.0,kiwisolver==1.4.9,lfdocs_conf==0.10.0,MarkupSafe==3.0.3,matplotlib==3.10.8,numpy==2.4.2,nwdiag==3.0.0,packaging==26.0,pillow==12.1.1,pip==26.0.1,Pygments==2.19.2,pyparsing==3.3.2,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.3,requests==2.32.5,requests-file==1.5.1,roman-numerals==4.1.0,roman-numerals-py==4.1.0,seqdiag==3.0.0,setuptools==82.0.0,six==1.17.0,snowballstemmer==3.0.1,Sphinx==8.2.3,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-tabs==3.4.7,sphinx_rtd_theme==3.1.0,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml==0.31,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3==2.6.3,webcolors==25.10.0
07:52:58 docs: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sphinx-build -q -W --keep-going -b html -n -d /w/workspace/transportpce-tox-verify-transportpce-master/.tox/docs/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-transportpce-master/docs/_build/html
07:53:01 docs: OK ✔ in 39.93 seconds
07:53:01 pylint: install_deps> python -I -m pip install 'pylint>=2.6.0'
07:53:05 docs-linkcheck: OK ✔ in 42.08 seconds
07:53:05 pylint: freeze> python -m pip freeze --all
07:53:05 pylint: astroid==4.0.4,dill==0.4.1,isort==8.0.0,mccabe==0.7.0,pip==26.0.1,platformdirs==4.9.2,pylint==4.0.5,setuptools==82.0.0,tomlkit==0.14.0
07:53:05 pylint: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> find transportpce_tests/ -name '*.py' -exec
pylint --fail-under=10 --max-line-length=120 --disable=missing-docstring,import-error --disable=fixme --disable=duplicate-code '--module-rgx=([a-z0-9_]+$)|([0-9.]{1,30}$)' '--method-rgx=(([a-z_][a-zA-Z0-9_]{2,})|(_[a-z0-9_]*)|(__[a-zA-Z][a-zA-Z0-9_]+__))$' '--variable-rgx=[a-zA-Z_][a-zA-Z0-9_]{1,30}$' '{}' +
07:53:06 trim trailing whitespace.................................................Passed
07:53:07 Tabs remover.............................................................Passed
07:53:07 autopep8.................................................................Passed
07:53:13 perltidy.................................................................Passed
07:53:14 pre-commit: commands[3] /w/workspace/transportpce-tox-verify-transportpce-master/tests> pre-commit run gitlint-ci --hook-stage manual
07:53:14 [WARNING] hook id `remove-tabs` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this.
07:53:14 [WARNING] hook id `perltidy` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this.
07:53:14 [INFO] Installing environment for https://github.com/jorisroovers/gitlint.
07:53:14 [INFO] Once installed this environment will be reused.
07:53:14 [INFO] This may take a few minutes...
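The custom `--module-rgx` passed to pylint in the command above accepts either ordinary snake_case module names or short dotted-numeric names. Its effect can be reproduced with Python's re module (the sample names below are illustrative, not taken from the test tree):

```python
import re

# Same alternation as the --module-rgx option in the tox pylint command.
MODULE_RGX = re.compile(r"([a-z0-9_]+$)|([0-9.]{1,30}$)")

assert MODULE_RGX.match("test01_pce")      # snake_case test module: accepted
assert MODULE_RGX.match("7.1")             # dotted numeric name: accepted
assert not MODULE_RGX.match("BadModule")   # CamelCase: rejected
```

The second alternative is what lets version-named directories or modules (digits and dots only) pass without a pylint invalid-name warning.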
07:53:24 gitlint..................................................................Passed
07:53:31
07:53:31 ------------------------------------
07:53:31 Your code has been rated at 10.00/10
07:53:31
07:54:37 pre-commit: OK ✔ in 1 minute 0.39 seconds
07:54:37 pylint: OK ✔ in 31.93 seconds
07:54:37 buildcontroller: OK ✔ in 2 minutes 15.32 seconds
07:54:37 build_karaf_tests190: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
07:54:37 build_karaf_tests71: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
07:54:37 build_karaf_tests221: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
07:54:37 build_karaf_tests121: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
07:54:50 build_karaf_tests121: freeze> python -m pip freeze --all
07:54:50 build_karaf_tests71: freeze> python -m pip freeze --all
07:54:51 build_karaf_tests190: freeze> python -m pip freeze --all
07:54:51 build_karaf_tests221: freeze> python -m pip freeze --all
07:54:51 build_karaf_tests121:
bcrypt==5.0.0,certifi==2026.2.25,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3
07:54:51 build_karaf_tests121: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh
07:54:51 build karaf in karaf121 with ./karaf121.env
07:54:51 build_karaf_tests71: bcrypt==5.0.0,certifi==2026.2.25,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3
07:54:51 build_karaf_tests71: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh
07:54:51 build_karaf_tests190: bcrypt==5.0.0,certifi==2026.2.25,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3
07:54:51 build_karaf_tests190: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh
07:54:51 build karaf in karaf71 with ./karaf71.env
07:54:51 build karaf in karafoc with ./karafoc.env
07:54:51 NOTE: Picked up JDK_JAVA_OPTIONS:
07:54:51 --add-opens=java.base/java.io=ALL-UNNAMED
07:54:51 --add-opens=java.base/java.lang=ALL-UNNAMED
07:54:51 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED
07:54:51 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED
07:54:51 --add-opens=java.base/java.net=ALL-UNNAMED
07:54:51 --add-opens=java.base/java.nio=ALL-UNNAMED
07:54:51 --add-opens=java.base/java.nio.charset=ALL-UNNAMED
07:54:51 --add-opens=java.base/java.nio.file=ALL-UNNAMED
07:54:51 --add-opens=java.base/java.util=ALL-UNNAMED
07:54:51 --add-opens=java.base/java.util.jar=ALL-UNNAMED
07:54:51 --add-opens=java.base/java.util.stream=ALL-UNNAMED
07:54:51 --add-opens=java.base/java.util.zip=ALL-UNNAMED
07:54:51 --add-opens java.base/sun.nio.ch=ALL-UNNAMED
07:54:51 --add-opens java.base/sun.nio.fs=ALL-UNNAMED
07:54:51 -Xlog:disable
07:54:51 build_karaf_tests221: bcrypt==5.0.0,certifi==2026.2.25,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3
07:54:51 build_karaf_tests221: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh
07:54:51 build karaf in karaf221 with ./karaf221.env
07:56:15 build_karaf_tests71: OK ✔ in 1 minute 38.44 seconds
07:56:15 buildlighty: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r
/w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
07:56:20 build_karaf_tests221: OK ✔ in 1 minute 42.76 seconds
07:56:20 sims: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
07:56:25 build_karaf_tests190: OK ✔ in 1 minute 48.13 seconds
07:56:25 build_karaf_tests121: OK ✔ in 1 minute 48.46 seconds
07:56:25 testsPCE: install_deps> python -I -m pip install gnpy4tpce==2.4.7 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
07:56:38 sims: freeze> python -m pip freeze --all
07:56:38 buildlighty: freeze> python -m pip freeze --all
07:56:38 sims: bcrypt==5.0.0,certifi==2026.2.25,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3
07:56:38 sims: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./install_lightynode.sh
07:56:38 Using lightynode version 22.1.0.6
07:56:38 Installing lightynode device to ./lightynode/lightynode-openroadm-device directory
07:56:38 buildlighty: bcrypt==5.0.0,certifi==2026.2.25,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3
07:56:38 buildlighty: commands[0]
/w/workspace/transportpce-tox-verify-transportpce-master/lighty> ./build.sh
07:56:38 NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED
07:57:51 sims: OK ✔ in 21.8 seconds
07:57:51 buildlighty: OK ✔ in 49.63 seconds
07:57:51 testsPCE: freeze> python -m pip freeze --all
07:57:51 testsPCE: bcrypt==5.0.0,certifi==2026.2.25,cffi==2.0.0,charset-normalizer==3.4.4,click==8.3.1,contourpy==1.3.3,cryptography==3.3.2,cycler==0.12.1,dict2xml==1.7.8,Flask==2.1.3,Flask-Injector==0.14.0,fonttools==4.61.1,gnpy4tpce==2.4.7,idna==3.11,iniconfig==2.3.0,injector==0.24.0,invoke==2.2.1,itsdangerous==2.2.0,Jinja2==3.1.6,kiwisolver==1.4.9,lxml==6.0.2,MarkupSafe==3.0.3,matplotlib==3.10.8,netconf-client==3.5.0,networkx==2.8.8,numpy==1.26.4,packaging==26.0,pandas==1.5.3,paramiko==4.0.0,pbr==5.11.1,pillow==12.1.1,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pyparsing==3.3.2,pytest==9.0.2,python-dateutil==2.9.0.post0,pytz==2025.2,requests==2.32.5,scipy==1.17.1,setuptools==50.3.2,six==1.17.0,urllib3==2.6.3,Werkzeug==2.0.3,xlrd==1.2.0
07:57:51 testsPCE: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh pce
07:57:51 pytest -q transportpce_tests/pce/test01_pce.py
07:58:41 .................... [100%]
07:59:45 20 passed in 113.28s (0:01:53)
07:59:45 pytest -q transportpce_tests/pce/test02_pce_400G.py
08:00:01 .........
$ ssh-agent -k
08:00:22 unset SSH_AUTH_SOCK;
08:00:22 unset SSH_AGENT_PID;
08:00:22 echo Agent pid 1576 killed;
08:00:23 [ssh-agent] Stopped.
08:00:23 Build was aborted
08:00:23 Aborted by new patch set.
08:00:23 [PostBuildScript] - [INFO] Executing post build scripts.
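The PCE suites talk to a controller on localhost:8181; once the build was aborted by the new patch set and the controller went away, the remaining test teardowns could only fail with connection errors. A harness can make that failure mode explicit by polling the RESTCONF port before (or between) suites. A small stdlib-only sketch of such a check (hypothetical helper, not part of launch_tests.sh):

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 60.0) -> bool:
    """Poll until a TCP connect to (host, port) succeeds, or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # A successful connect means something is listening; close immediately.
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.5)
    return False

# e.g. wait_for_port("localhost", 8181) before launching the PCE suite
```

Returning False early lets the runner skip the remaining tests with a clear "controller not reachable" message instead of a wall of urllib3 tracebacks.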
08:00:23 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins1212349553158403501.sh
08:00:23 ---> sysstat.sh
08:00:23 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins10565282000249242428.sh
08:00:23 ---> package-listing.sh
08:00:23 ++ facter osfamily
08:00:23 ++ tr '[:upper:]' '[:lower:]'
08:00:24 + OS_FAMILY=debian
08:00:24 + workspace=/w/workspace/transportpce-tox-verify-transportpce-master
08:00:24 + START_PACKAGES=/tmp/packages_start.txt
08:00:24 + END_PACKAGES=/tmp/packages_end.txt
08:00:24 + DIFF_PACKAGES=/tmp/packages_diff.txt
08:00:24 + PACKAGES=/tmp/packages_start.txt
08:00:24 + '[' /w/workspace/transportpce-tox-verify-transportpce-master ']'
08:00:24 + PACKAGES=/tmp/packages_end.txt
08:00:24 + case "${OS_FAMILY}" in
08:00:24 + dpkg -l
08:00:24 + grep '^ii'
08:00:24 + '[' -f /tmp/packages_start.txt ']'
08:00:24 + '[' -f /tmp/packages_end.txt ']'
08:00:24 + diff /tmp/packages_start.txt /tmp/packages_end.txt
08:00:24 + '[' /w/workspace/transportpce-tox-verify-transportpce-master ']'
08:00:24 + mkdir -p /w/workspace/transportpce-tox-verify-transportpce-master/archives/
08:00:24 + cp -f /tmp/packages_diff.txt /tmp/packages_end.txt /tmp/packages_start.txt /w/workspace/transportpce-tox-verify-transportpce-master/archives/
08:00:24 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins11131614847669849575.sh
08:00:24 ---> capture-instance-metadata.sh
08:00:24 Setup pyenv:
08:00:24   system
08:00:24   3.8.20
08:00:24   3.9.20
08:00:24   3.10.15
08:00:24 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version)
08:00:24 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-g0aN from file:/tmp/.os_lf_venv
08:00:24 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
08:00:24 lf-activate-venv(): INFO: Attempting to install with network-safe options...
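package-listing.sh above snapshots `dpkg -l` before and after the build and diffs the two lists into the archives directory. The bookkeeping it performs is simple set arithmetic, sketched here in Python (an illustration of the idea, not the actual LF script):

```python
def package_diff(start: list[str], end: list[str]) -> tuple[list[str], list[str]]:
    """Return (added, removed) package lines between two `dpkg -l` snapshots."""
    start_set, end_set = set(start), set(end)
    return sorted(end_set - start_set), sorted(start_set - end_set)

added, removed = package_diff(
    ["maven 3.9.11", "openjdk-21 21.0.9"],   # packages_start.txt (sample data)
    ["maven 3.9.12", "openjdk-21 21.0.9"],   # packages_end.txt (sample data)
)
# added == ["maven 3.9.12"], removed == ["maven 3.9.11"]
```

Archiving both snapshots plus the diff, as the script does, lets a reviewer see exactly which system packages a build job installed.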
08:00:24 FF
08:00:24 lf-activate-venv(): INFO: Base packages installed successfully
08:00:26 lf-activate-venv(): INFO: Installing additional packages: lftools
08:00:26 FE [100%]
08:00:26 ==================================== ERRORS ====================================
08:00:26 _ ERROR at teardown of TestTransportPCEPce400g.test_12_path_computation_400G_xpdr_bi_cfg _
08:00:26
08:00:26 self =
08:00:26
08:00:26     def _new_conn(self) -> socket.socket:
08:00:26         """Establish a socket connection and set nodelay settings on it.
08:00:26
08:00:26         :return: New socket connection.
08:00:26         """
08:00:26         try:
08:00:26 >           sock = connection.create_connection(
08:00:26                 (self._dns_host, self.port),
08:00:26                 self.timeout,
08:00:26                 source_address=self.source_address,
08:00:26                 socket_options=self.socket_options,
08:00:26             )
08:00:26
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:204:
08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection
08:00:26     raise err
08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
08:00:26
08:00:26 address = ('localhost', 8181), timeout = 30, source_address = None
08:00:26 socket_options = [(6, 1, 1)]
08:00:26
08:00:26     def create_connection(
08:00:26         address: tuple[str, int],
08:00:26         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
08:00:26         source_address: tuple[str, int] | None = None,
08:00:26         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
08:00:26     ) -> socket.socket:
08:00:26         """Connect to *address* and return the socket object.
08:00:26
08:00:26         Convenience function. Connect to *address* (a 2-tuple ``(host,
08:00:26         port)``) and return the socket object. Passing the optional
08:00:26         *timeout* parameter will set the timeout on the socket instance
08:00:26         before attempting to connect.
If no *timeout* is supplied, the
08:00:26         global default timeout setting returned by :func:`socket.getdefaulttimeout`
08:00:26         is used. If *source_address* is set it must be a tuple of (host, port)
08:00:26         for the socket to bind as a source address before making the connection.
08:00:26         An host of '' or port 0 tells the OS to use the default.
08:00:26         """
08:00:26
08:00:26         host, port = address
08:00:26         if host.startswith("["):
08:00:26             host = host.strip("[]")
08:00:26         err = None
08:00:26
08:00:26         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
08:00:26         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
08:00:26         # The original create_connection function always returns all records.
08:00:26         family = allowed_gai_family()
08:00:26
08:00:26         try:
08:00:26             host.encode("idna")
08:00:26         except UnicodeError:
08:00:26             raise LocationParseError(f"'{host}', label empty or too long") from None
08:00:26
08:00:26         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
08:00:26             af, socktype, proto, canonname, sa = res
08:00:26             sock = None
08:00:26             try:
08:00:26                 sock = socket.socket(af, socktype, proto)
08:00:26
08:00:26                 # If provided, set socket level options before connecting.
08:00:26                 _set_socket_options(sock, socket_options)
08:00:26
08:00:26                 if timeout is not _DEFAULT_TIMEOUT:
08:00:26                     sock.settimeout(timeout)
08:00:26                 if source_address:
08:00:26                     sock.bind(source_address)
08:00:26 >               sock.connect(sa)
08:00:26 E               ConnectionRefusedError: [Errno 111] Connection refused
08:00:26
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
08:00:26
08:00:26 The above exception was the direct cause of the following exception:
08:00:26
08:00:26 self =
08:00:26 method = 'DELETE', url = '/rests/data/transportpce-portmapping:network'
08:00:26 body = None
08:00:26 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '0', 'Authorization': 'Basic YWRtaW46YWRtaW4='}
08:00:26 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
08:00:26 redirect = False, assert_same_host = False
08:00:26 timeout = Timeout(connect=30, read=30, total=None), pool_timeout = None
08:00:26 release_conn = False, chunked = False, body_pos = None, preload_content = False
08:00:26 decode_content = False, response_kw = {}
08:00:26 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network', query=None, fragment=None)
08:00:26 destination_scheme = None, conn = None, release_this_conn = True
08:00:26 http_tunnel_required = False, err = None, clean_exit = False
08:00:26
08:00:26     def urlopen(  # type: ignore[override]
08:00:26         self,
08:00:26         method: str,
08:00:26         url: str,
08:00:26         body: _TYPE_BODY | None = None,
08:00:26         headers: typing.Mapping[str, str] | None = None,
08:00:26         retries: Retry | bool | int | None = None,
08:00:26         redirect: bool = True,
08:00:26         assert_same_host: bool = True,
08:00:26         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
08:00:26         pool_timeout: int | None = None,
08:00:26         release_conn: bool |
None = None,
08:00:26         chunked: bool = False,
08:00:26         body_pos: _TYPE_BODY_POSITION | None = None,
08:00:26         preload_content: bool = True,
08:00:26         decode_content: bool = True,
08:00:26         **response_kw: typing.Any,
08:00:26     ) -> BaseHTTPResponse:
08:00:26         """
08:00:26         Get a connection from the pool and perform an HTTP request. This is the
08:00:26         lowest level call for making a request, so you'll need to specify all
08:00:26         the raw details.
08:00:26
08:00:26         .. note::
08:00:26
08:00:26            More commonly, it's appropriate to use a convenience method
08:00:26            such as :meth:`request`.
08:00:26
08:00:26         .. note::
08:00:26
08:00:26            `release_conn` will only behave as expected if
08:00:26            `preload_content=False` because we want to make
08:00:26            `preload_content=False` the default behaviour someday soon without
08:00:26            breaking backwards compatibility.
08:00:26
08:00:26         :param method:
08:00:26             HTTP request method (such as GET, POST, PUT, etc.)
08:00:26
08:00:26         :param url:
08:00:26             The URL to perform the request on.
08:00:26
08:00:26         :param body:
08:00:26             Data to send in the request body, either :class:`str`, :class:`bytes`,
08:00:26             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
08:00:26
08:00:26         :param headers:
08:00:26             Dictionary of custom headers to send, such as User-Agent,
08:00:26             If-None-Match, etc. If None, pool headers are used. If provided,
08:00:26             these headers completely replace any pool-specific headers.
08:00:26
08:00:26         :param retries:
08:00:26             Configure the number of retries to allow before raising a
08:00:26             :class:`~urllib3.exceptions.MaxRetryError` exception.
08:00:26
08:00:26             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
08:00:26             :class:`~urllib3.util.retry.Retry` object for fine-grained control
08:00:26             over different types of retries.
08:00:26             Pass an integer number to retry connection errors that many times,
08:00:26             but no other types of errors. Pass zero to never retry.
08:00:26
08:00:26             If ``False``, then retries are disabled and any exception is raised
08:00:26             immediately. Also, instead of raising a MaxRetryError on redirects,
08:00:26             the redirect response will be returned.
08:00:26
08:00:26         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
08:00:26
08:00:26         :param redirect:
08:00:26             If True, automatically handle redirects (status codes 301, 302,
08:00:26             303, 307, 308). Each redirect counts as a retry. Disabling retries
08:00:26             will disable redirect, too.
08:00:26
08:00:26         :param assert_same_host:
08:00:26             If ``True``, will make sure that the host of the pool requests is
08:00:26             consistent else will raise HostChangedError. When ``False``, you can
08:00:26             use the pool on an HTTP proxy and request foreign hosts.
08:00:26
08:00:26         :param timeout:
08:00:26             If specified, overrides the default timeout for this one
08:00:26             request. It may be a float (in seconds) or an instance of
08:00:26             :class:`urllib3.util.Timeout`.
08:00:26
08:00:26         :param pool_timeout:
08:00:26             If set and the pool is set to block=True, then this method will
08:00:26             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
08:00:26             connection is available within the time period.
08:00:26
08:00:26         :param bool preload_content:
08:00:26             If True, the response's body will be preloaded into memory.
08:00:26
08:00:26         :param bool decode_content:
08:00:26             If True, will attempt to decode the body based on the
08:00:26             'content-encoding' header.
08:00:26
08:00:26         :param release_conn:
08:00:26             If False, then the urlopen call will not release the connection
08:00:26             back into the pool once a response is received (but will release if
08:00:26             you read the entire contents of the response such as when
08:00:26             `preload_content=True`). This is useful if you're not preloading
08:00:26             the response's content immediately. You will need to call
08:00:26             ``r.release_conn()`` on the response ``r`` to return the connection
08:00:26             back into the pool.
If None, it takes the value of ``preload_content``
08:00:26             which defaults to ``True``.
08:00:26
08:00:26         :param bool chunked:
08:00:26             If True, urllib3 will send the body using chunked transfer
08:00:26             encoding. Otherwise, urllib3 will send the body using the standard
08:00:26             content-length form. Defaults to False.
08:00:26
08:00:26         :param int body_pos:
08:00:26             Position to seek to in file-like body in the event of a retry or
08:00:26             redirect. Typically this won't need to be set because urllib3 will
08:00:26             auto-populate the value when needed.
08:00:26         """
08:00:26         parsed_url = parse_url(url)
08:00:26         destination_scheme = parsed_url.scheme
08:00:26
08:00:26         if headers is None:
08:00:26             headers = self.headers
08:00:26
08:00:26         if not isinstance(retries, Retry):
08:00:26             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
08:00:26
08:00:26         if release_conn is None:
08:00:26             release_conn = preload_content
08:00:26
08:00:26         # Check host
08:00:26         if assert_same_host and not self.is_same_host(url):
08:00:26             raise HostChangedError(self, url, retries)
08:00:26
08:00:26         # Ensure that the URL we're connecting to is properly encoded
08:00:26         if url.startswith("/"):
08:00:26             url = to_str(_encode_target(url))
08:00:26         else:
08:00:26             url = to_str(parsed_url.url)
08:00:26
08:00:26         conn = None
08:00:26
08:00:26         # Track whether `conn` needs to be released before
08:00:26         # returning/raising/recursing. Update this variable if necessary, and
08:00:26         # leave `release_conn` constant throughout the function. That way, if
08:00:26         # the function recurses, the original value of `release_conn` will be
08:00:26         # passed down into the recursive call, and its value will be respected.
08:00:26         #
08:00:26         # See issue #651 [1] for details.
08:00:26         #
08:00:26         # [1]
08:00:26         release_this_conn = release_conn
08:00:26
08:00:26         http_tunnel_required = connection_requires_http_tunnel(
08:00:26             self.proxy, self.proxy_config, destination_scheme
08:00:26         )
08:00:26
08:00:26         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
08:00:26         # have to copy the headers dict so we can safely change it without those
08:00:26         # changes being reflected in anyone else's copy.
08:00:26         if not http_tunnel_required:
08:00:26             headers = headers.copy()  # type: ignore[attr-defined]
08:00:26             headers.update(self.proxy_headers)  # type: ignore[union-attr]
08:00:26
08:00:26         # Must keep the exception bound to a separate variable or else Python 3
08:00:26         # complains about UnboundLocalError.
08:00:26         err = None
08:00:26
08:00:26         # Keep track of whether we cleanly exited the except block. This
08:00:26         # ensures we do proper cleanup in finally.
08:00:26         clean_exit = False
08:00:26
08:00:26         # Rewind body position, if needed. Record current position
08:00:26         # for future rewinds in the event of a redirect/retry.
08:00:26         body_pos = set_file_position(body, body_pos)
08:00:26
08:00:26         try:
08:00:26             # Request a connection from the queue.
08:00:26             timeout_obj = self._get_timeout(timeout)
08:00:26             conn = self._get_conn(timeout=pool_timeout)
08:00:26
08:00:26             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
08:00:26
08:00:26             # Is this a closed/new connection that requires CONNECT tunnelling?
08:00:26             if self.proxy is not None and http_tunnel_required and conn.is_closed:
08:00:26                 try:
08:00:26                     self._prepare_proxy(conn)
08:00:26                 except (BaseSSLError, OSError, SocketTimeout) as e:
08:00:26                     self._raise_timeout(
08:00:26                         err=e, url=self.proxy.url, timeout_value=conn.timeout
08:00:26                     )
08:00:26                     raise
08:00:26
08:00:26             # If we're going to release the connection in ``finally:``, then
08:00:26             # the response doesn't need to know about the connection.
Otherwise 08:00:26 # it will also try to release it and we'll have a double-release 08:00:26 # mess. 08:00:26 response_conn = conn if not release_conn else None 08:00:26 08:00:26 # Make the request on the HTTPConnection object 08:00:26 > response = self._make_request( 08:00:26 conn, 08:00:26 method, 08:00:26 url, 08:00:26 timeout=timeout_obj, 08:00:26 body=body, 08:00:26 headers=headers, 08:00:26 chunked=chunked, 08:00:26 retries=retries, 08:00:26 response_conn=response_conn, 08:00:26 preload_content=preload_content, 08:00:26 decode_content=decode_content, 08:00:26 **response_kw, 08:00:26 ) 08:00:26 08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:787: 08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:493: in _make_request 08:00:26 conn.request( 08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:500: in request 08:00:26 self.endheaders() 08:00:26 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1298: in endheaders 08:00:26 self._send_output(message_body, encode_chunked=encode_chunked) 08:00:26 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1058: in _send_output 08:00:26 self.send(msg) 08:00:26 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:996: in send 08:00:26 self.connect() 08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:331: in connect 08:00:26 self.sock = self._new_conn() 08:00:26 ^^^^^^^^^^^^^^^^ 08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 08:00:26 08:00:26 self = 08:00:26 08:00:26 def _new_conn(self) -> socket.socket: 08:00:26 """Establish a socket connection and set nodelay settings on it. 08:00:26 08:00:26 :return: New socket connection. 
08:00:26 """ 08:00:26 try: 08:00:26 sock = connection.create_connection( 08:00:26 (self._dns_host, self.port), 08:00:26 self.timeout, 08:00:26 source_address=self.source_address, 08:00:26 socket_options=self.socket_options, 08:00:26 ) 08:00:26 except socket.gaierror as e: 08:00:26 raise NameResolutionError(self.host, self, e) from e 08:00:26 except SocketTimeout as e: 08:00:26 raise ConnectTimeoutError( 08:00:26 self, 08:00:26 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 08:00:26 ) from e 08:00:26 08:00:26 except OSError as e: 08:00:26 > raise NewConnectionError( 08:00:26 self, f"Failed to establish a new connection: {e}" 08:00:26 ) from e 08:00:26 E urllib3.exceptions.NewConnectionError: HTTPConnection(host='localhost', port=8181): Failed to establish a new connection: [Errno 111] Connection refused 08:00:26 08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:219: NewConnectionError 08:00:26 08:00:26 The above exception was the direct cause of the following exception: 08:00:26 08:00:26 self = 08:00:26 request = , stream = False 08:00:26 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None 08:00:26 proxies = OrderedDict() 08:00:26 08:00:26 def send( 08:00:26 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 08:00:26 ): 08:00:26 """Sends PreparedRequest object. Returns Response object. 08:00:26 08:00:26 :param request: The :class:`PreparedRequest ` being sent. 08:00:26 :param stream: (optional) Whether to stream the request content. 08:00:26 :param timeout: (optional) How long to wait for the server to send 08:00:26 data before giving up, as a float, or a :ref:`(connect timeout, 08:00:26 read timeout) ` tuple. 
08:00:26 :type timeout: float or tuple or urllib3 Timeout object 08:00:26 :param verify: (optional) Either a boolean, in which case it controls whether 08:00:26 we verify the server's TLS certificate, or a string, in which case it 08:00:26 must be a path to a CA bundle to use 08:00:26 :param cert: (optional) Any user-provided SSL certificate to be trusted. 08:00:26 :param proxies: (optional) The proxies dictionary to apply to the request. 08:00:26 :rtype: requests.Response 08:00:26 """ 08:00:26 08:00:26 try: 08:00:26 conn = self.get_connection_with_tls_context( 08:00:26 request, verify, proxies=proxies, cert=cert 08:00:26 ) 08:00:26 except LocationValueError as e: 08:00:26 raise InvalidURL(e, request=request) 08:00:26 08:00:26 self.cert_verify(conn, request.url, verify, cert) 08:00:26 url = self.request_url(request, proxies) 08:00:26 self.add_headers( 08:00:26 request, 08:00:26 stream=stream, 08:00:26 timeout=timeout, 08:00:26 verify=verify, 08:00:26 cert=cert, 08:00:26 proxies=proxies, 08:00:26 ) 08:00:26 08:00:26 chunked = not (request.body is None or "Content-Length" in request.headers) 08:00:26 08:00:26 if isinstance(timeout, tuple): 08:00:26 try: 08:00:26 connect, read = timeout 08:00:26 timeout = TimeoutSauce(connect=connect, read=read) 08:00:26 except ValueError: 08:00:26 raise ValueError( 08:00:26 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 08:00:26 f"or a single float to set both timeouts to the same value." 
08:00:26 ) 08:00:26 elif isinstance(timeout, TimeoutSauce): 08:00:26 pass 08:00:26 else: 08:00:26 timeout = TimeoutSauce(connect=timeout, read=timeout) 08:00:26 08:00:26 try: 08:00:26 > resp = conn.urlopen( 08:00:26 method=request.method, 08:00:26 url=url, 08:00:26 body=request.body, 08:00:26 headers=request.headers, 08:00:26 redirect=False, 08:00:26 assert_same_host=False, 08:00:26 preload_content=False, 08:00:26 decode_content=False, 08:00:26 retries=self.max_retries, 08:00:26 timeout=timeout, 08:00:26 chunked=chunked, 08:00:26 ) 08:00:26 08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:644: 08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen 08:00:26 retries = retries.increment( 08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 08:00:26 08:00:26 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 08:00:26 method = 'DELETE', url = '/rests/data/transportpce-portmapping:network' 08:00:26 response = None 08:00:26 error = NewConnectionError("HTTPConnection(host='localhost', port=8181): Failed to establish a new connection: [Errno 111] Connection refused") 08:00:26 _pool = 08:00:26 _stacktrace = 08:00:26 08:00:26 def increment( 08:00:26 self, 08:00:26 method: str | None = None, 08:00:26 url: str | None = None, 08:00:26 response: BaseHTTPResponse | None = None, 08:00:26 error: Exception | None = None, 08:00:26 _pool: ConnectionPool | None = None, 08:00:26 _stacktrace: TracebackType | None = None, 08:00:26 ) -> Self: 08:00:26 """Return a new Retry object with incremented retry counters. 08:00:26 08:00:26 :param response: A response object, or None, if the server did not 08:00:26 return a response. 
08:00:26 :type response: :class:`~urllib3.response.BaseHTTPResponse` 08:00:26 :param Exception error: An error encountered during the request, or 08:00:26 None if the response was received successfully. 08:00:26 08:00:26 :return: A new ``Retry`` object. 08:00:26 """ 08:00:26 if self.total is False and error: 08:00:26 # Disabled, indicate to re-raise the error. 08:00:26 raise reraise(type(error), error, _stacktrace) 08:00:26 08:00:26 total = self.total 08:00:26 if total is not None: 08:00:26 total -= 1 08:00:26 08:00:26 connect = self.connect 08:00:26 read = self.read 08:00:26 redirect = self.redirect 08:00:26 status_count = self.status 08:00:26 other = self.other 08:00:26 cause = "unknown" 08:00:26 status = None 08:00:26 redirect_location = None 08:00:26 08:00:26 if error and self._is_connection_error(error): 08:00:26 # Connect retry? 08:00:26 if connect is False: 08:00:26 raise reraise(type(error), error, _stacktrace) 08:00:26 elif connect is not None: 08:00:26 connect -= 1 08:00:26 08:00:26 elif error and self._is_read_error(error): 08:00:26 # Read retry? 08:00:26 if read is False or method is None or not self._is_method_retryable(method): 08:00:26 raise reraise(type(error), error, _stacktrace) 08:00:26 elif read is not None: 08:00:26 read -= 1 08:00:26 08:00:26 elif error: 08:00:26 # Other retry? 08:00:26 if other is not None: 08:00:26 other -= 1 08:00:26 08:00:26 elif response and response.get_redirect_location(): 08:00:26 # Redirect retry? 
08:00:26             if redirect is not None:
08:00:26                 redirect -= 1
08:00:26             cause = "too many redirects"
08:00:26             response_redirect_location = response.get_redirect_location()
08:00:26             if response_redirect_location:
08:00:26                 redirect_location = response_redirect_location
08:00:26             status = response.status
08:00:26 
08:00:26         else:
08:00:26             # Incrementing because of a server error like a 500 in
08:00:26             # status_forcelist and the given method is in the allowed_methods
08:00:26             cause = ResponseError.GENERIC_ERROR
08:00:26             if response and response.status:
08:00:26                 if status_count is not None:
08:00:26                     status_count -= 1
08:00:26                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
08:00:26                 status = response.status
08:00:26 
08:00:26         history = self.history + (
08:00:26             RequestHistory(method, url, error, status, redirect_location),
08:00:26         )
08:00:26 
08:00:26         new_retry = self.new(
08:00:26             total=total,
08:00:26             connect=connect,
08:00:26             read=read,
08:00:26             redirect=redirect,
08:00:26             status=status_count,
08:00:26             other=other,
08:00:26             history=history,
08:00:26         )
08:00:26 
08:00:26         if new_retry.is_exhausted():
08:00:26             reason = error or ResponseError(cause)
08:00:26 >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
08:00:26             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
08:00:26 E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/transportpce-portmapping:network (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8181): Failed to establish a new connection: [Errno 111] Connection refused"))
08:00:26 
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:535: MaxRetryError
08:00:26 
08:00:26 During handling of the above exception, another exception occurred:
08:00:26 
08:00:26 cls = 
08:00:26 
08:00:26     @classmethod
08:00:26     def tearDownClass(cls):
08:00:26         # clean datastores
08:00:26 >       test_utils.del_portmapping()
08:00:26 
08:00:26 transportpce_tests/pce/test02_pce_400G.py:111: 
08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
08:00:26 transportpce_tests/common/test_utils.py:490: in del_portmapping
08:00:26     response = delete_request(url[RESTCONF_VERSION].format('{}'))
08:00:26                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
08:00:26 transportpce_tests/common/test_utils.py:134: in delete_request
08:00:26     return requests.request(
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request
08:00:26     return session.request(method=method, url=url, **kwargs)
08:00:26            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request
08:00:26     resp = self.send(prep, **send_kwargs)
08:00:26            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send
08:00:26     r = adapter.send(request, **kwargs)
08:00:26         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
08:00:26 
08:00:26 self = 
08:00:26 request = , stream = False
08:00:26 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
08:00:26 proxies = OrderedDict()
08:00:26 
08:00:26     def send(
08:00:26         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
08:00:26     ):
08:00:26         """Sends PreparedRequest object. Returns Response object.
08:00:26 
08:00:26         :param request: The :class:`PreparedRequest ` being sent.
08:00:26         :param stream: (optional) Whether to stream the request content.
08:00:26         :param timeout: (optional) How long to wait for the server to send
08:00:26             data before giving up, as a float, or a :ref:`(connect timeout,
08:00:26             read timeout) ` tuple.
08:00:26 :type timeout: float or tuple or urllib3 Timeout object 08:00:26 :param verify: (optional) Either a boolean, in which case it controls whether 08:00:26 we verify the server's TLS certificate, or a string, in which case it 08:00:26 must be a path to a CA bundle to use 08:00:26 :param cert: (optional) Any user-provided SSL certificate to be trusted. 08:00:26 :param proxies: (optional) The proxies dictionary to apply to the request. 08:00:26 :rtype: requests.Response 08:00:26 """ 08:00:26 08:00:26 try: 08:00:26 conn = self.get_connection_with_tls_context( 08:00:26 request, verify, proxies=proxies, cert=cert 08:00:26 ) 08:00:26 except LocationValueError as e: 08:00:26 raise InvalidURL(e, request=request) 08:00:26 08:00:26 self.cert_verify(conn, request.url, verify, cert) 08:00:26 url = self.request_url(request, proxies) 08:00:26 self.add_headers( 08:00:26 request, 08:00:26 stream=stream, 08:00:26 timeout=timeout, 08:00:26 verify=verify, 08:00:26 cert=cert, 08:00:26 proxies=proxies, 08:00:26 ) 08:00:26 08:00:26 chunked = not (request.body is None or "Content-Length" in request.headers) 08:00:26 08:00:26 if isinstance(timeout, tuple): 08:00:26 try: 08:00:26 connect, read = timeout 08:00:26 timeout = TimeoutSauce(connect=connect, read=read) 08:00:26 except ValueError: 08:00:26 raise ValueError( 08:00:26 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 08:00:26 f"or a single float to set both timeouts to the same value." 
08:00:26 ) 08:00:26 elif isinstance(timeout, TimeoutSauce): 08:00:26 pass 08:00:26 else: 08:00:26 timeout = TimeoutSauce(connect=timeout, read=timeout) 08:00:26 08:00:26 try: 08:00:26 resp = conn.urlopen( 08:00:26 method=request.method, 08:00:26 url=url, 08:00:26 body=request.body, 08:00:26 headers=request.headers, 08:00:26 redirect=False, 08:00:26 assert_same_host=False, 08:00:26 preload_content=False, 08:00:26 decode_content=False, 08:00:26 retries=self.max_retries, 08:00:26 timeout=timeout, 08:00:26 chunked=chunked, 08:00:26 ) 08:00:26 08:00:26 except (ProtocolError, OSError) as err: 08:00:26 raise ConnectionError(err, request=request) 08:00:26 08:00:26 except MaxRetryError as e: 08:00:26 if isinstance(e.reason, ConnectTimeoutError): 08:00:26 # TODO: Remove this in 3.0.0: see #2811 08:00:26 if not isinstance(e.reason, NewConnectionError): 08:00:26 raise ConnectTimeout(e, request=request) 08:00:26 08:00:26 if isinstance(e.reason, ResponseError): 08:00:26 raise RetryError(e, request=request) 08:00:26 08:00:26 if isinstance(e.reason, _ProxyError): 08:00:26 raise ProxyError(e, request=request) 08:00:26 08:00:26 if isinstance(e.reason, _SSLError): 08:00:26 # This branch is for urllib3 v1.22 and later. 
08:00:26                     raise SSLError(e, request=request)
08:00:26 
08:00:26 >           raise ConnectionError(e, request=request)
08:00:26 E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/transportpce-portmapping:network (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8181): Failed to establish a new connection: [Errno 111] Connection refused"))
08:00:26 
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:677: ConnectionError
08:00:26 ----------------------------- Captured stdout call -----------------------------
08:00:26 execution of test_12_path_computation_400G_xpdr_bi_cfg
08:00:26 =================================== FAILURES ===================================
08:00:26 ____________ TestTransportPCEPce400g.test_10_load_port_mapping_cfg _____________
08:00:26 
08:00:26 self = 
08:00:26 
08:00:26     def test_10_load_port_mapping_cfg(self):
08:00:26         test_utils.del_portmapping()
08:00:26         time.sleep(1)
08:00:26         response = test_utils.post_portmapping(self.port_mapping_data_cfg)
08:00:26 >       self.assertIn(response['status_code'], (requests.codes.created, requests.codes.no_content))
08:00:26 E       AssertionError: 401 not found in (201, 204)
08:00:26 
08:00:26 transportpce_tests/pce/test02_pce_400G.py:322: AssertionError
08:00:26 ----------------------------- Captured stdout call -----------------------------
08:00:26 execution of test_10_load_port_mapping_cfg
08:00:26 ________ TestTransportPCEPce400g.test_11_load_openroadm_topology_bi_cfg ________
08:00:26 
08:00:26 self = 
08:00:26 
08:00:26     def _new_conn(self) -> socket.socket:
08:00:26         """Establish a socket connection and set nodelay settings on it.
08:00:26 
08:00:26         :return: New socket connection.
08:00:26         """
08:00:26         try:
08:00:26 >           sock = connection.create_connection(
08:00:26                 (self._dns_host, self.port),
08:00:26                 self.timeout,
08:00:26                 source_address=self.source_address,
08:00:26                 socket_options=self.socket_options,
08:00:26             )
08:00:26 
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:204: 
08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection
08:00:26     raise err
08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
08:00:26 
08:00:26 address = ('localhost', 8181), timeout = 30, source_address = None
08:00:26 socket_options = [(6, 1, 1)]
08:00:26 
08:00:26     def create_connection(
08:00:26         address: tuple[str, int],
08:00:26         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
08:00:26         source_address: tuple[str, int] | None = None,
08:00:26         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
08:00:26     ) -> socket.socket:
08:00:26         """Connect to *address* and return the socket object.
08:00:26 
08:00:26         Convenience function. Connect to *address* (a 2-tuple ``(host,
08:00:26         port)``) and return the socket object. Passing the optional
08:00:26         *timeout* parameter will set the timeout on the socket instance
08:00:26         before attempting to connect. If no *timeout* is supplied, the
08:00:26         global default timeout setting returned by :func:`socket.getdefaulttimeout`
08:00:26         is used. If *source_address* is set it must be a tuple of (host, port)
08:00:26         for the socket to bind as a source address before making the connection.
08:00:26         An host of '' or port 0 tells the OS to use the default.
08:00:26         """
08:00:26 
08:00:26         host, port = address
08:00:26         if host.startswith("["):
08:00:26             host = host.strip("[]")
08:00:26         err = None
08:00:26 
08:00:26         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
08:00:26         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
08:00:26         # The original create_connection function always returns all records.
08:00:26         family = allowed_gai_family()
08:00:26 
08:00:26         try:
08:00:26             host.encode("idna")
08:00:26         except UnicodeError:
08:00:26             raise LocationParseError(f"'{host}', label empty or too long") from None
08:00:26 
08:00:26         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
08:00:26             af, socktype, proto, canonname, sa = res
08:00:26             sock = None
08:00:26             try:
08:00:26                 sock = socket.socket(af, socktype, proto)
08:00:26 
08:00:26                 # If provided, set socket level options before connecting.
08:00:26                 _set_socket_options(sock, socket_options)
08:00:26 
08:00:26                 if timeout is not _DEFAULT_TIMEOUT:
08:00:26                     sock.settimeout(timeout)
08:00:26                 if source_address:
08:00:26                     sock.bind(source_address)
08:00:26 >               sock.connect(sa)
08:00:26 E               ConnectionRefusedError: [Errno 111] Connection refused
08:00:26 
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
08:00:26 
08:00:26 The above exception was the direct cause of the following exception:
08:00:26 
08:00:26 self = 
08:00:26 method = 'PUT'
08:00:26 url = '/rests/data/ietf-network:networks/network=openroadm-topology'
08:00:26 body = '{"network": [{"network-id": "openroadm-topology", "network-types": {"org-openroadm-common-network:openroadm-common-ne...ork:administrative-state": "inService", "destination": {"dest-tp": "DEG2-CTP-TXRX", "dest-node": "ROADM-A1-DEG2"}}]}]}'
08:00:26 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length':
'27660', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 08:00:26 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 08:00:26 redirect = False, assert_same_host = False 08:00:26 timeout = Timeout(connect=30, read=30, total=None), pool_timeout = None 08:00:26 release_conn = False, chunked = False, body_pos = None, preload_content = False 08:00:26 decode_content = False, response_kw = {} 08:00:26 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/ietf-network:networks/network=openroadm-topology', query=None, fragment=None) 08:00:26 destination_scheme = None, conn = None, release_this_conn = True 08:00:26 http_tunnel_required = False, err = None, clean_exit = False 08:00:26 08:00:26 def urlopen( # type: ignore[override] 08:00:26 self, 08:00:26 method: str, 08:00:26 url: str, 08:00:26 body: _TYPE_BODY | None = None, 08:00:26 headers: typing.Mapping[str, str] | None = None, 08:00:26 retries: Retry | bool | int | None = None, 08:00:26 redirect: bool = True, 08:00:26 assert_same_host: bool = True, 08:00:26 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 08:00:26 pool_timeout: int | None = None, 08:00:26 release_conn: bool | None = None, 08:00:26 chunked: bool = False, 08:00:26 body_pos: _TYPE_BODY_POSITION | None = None, 08:00:26 preload_content: bool = True, 08:00:26 decode_content: bool = True, 08:00:26 **response_kw: typing.Any, 08:00:26 ) -> BaseHTTPResponse: 08:00:26 """ 08:00:26 Get a connection from the pool and perform an HTTP request. This is the 08:00:26 lowest level call for making a request, so you'll need to specify all 08:00:26 the raw details. 08:00:26 08:00:26 .. note:: 08:00:26 08:00:26 More commonly, it's appropriate to use a convenience method 08:00:26 such as :meth:`request`. 08:00:26 08:00:26 .. 
note:: 08:00:26 08:00:26 `release_conn` will only behave as expected if 08:00:26 `preload_content=False` because we want to make 08:00:26 `preload_content=False` the default behaviour someday soon without 08:00:26 breaking backwards compatibility. 08:00:26 08:00:26 :param method: 08:00:26 HTTP request method (such as GET, POST, PUT, etc.) 08:00:26 08:00:26 :param url: 08:00:26 The URL to perform the request on. 08:00:26 08:00:26 :param body: 08:00:26 Data to send in the request body, either :class:`str`, :class:`bytes`, 08:00:26 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 08:00:26 08:00:26 :param headers: 08:00:26 Dictionary of custom headers to send, such as User-Agent, 08:00:26 If-None-Match, etc. If None, pool headers are used. If provided, 08:00:26 these headers completely replace any pool-specific headers. 08:00:26 08:00:26 :param retries: 08:00:26 Configure the number of retries to allow before raising a 08:00:26 :class:`~urllib3.exceptions.MaxRetryError` exception. 08:00:26 08:00:26 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 08:00:26 :class:`~urllib3.util.retry.Retry` object for fine-grained control 08:00:26 over different types of retries. 08:00:26 Pass an integer number to retry connection errors that many times, 08:00:26 but no other types of errors. Pass zero to never retry. 08:00:26 08:00:26 If ``False``, then retries are disabled and any exception is raised 08:00:26 immediately. Also, instead of raising a MaxRetryError on redirects, 08:00:26 the redirect response will be returned. 08:00:26 08:00:26 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 08:00:26 08:00:26 :param redirect: 08:00:26 If True, automatically handle redirects (status codes 301, 302, 08:00:26 303, 307, 308). Each redirect counts as a retry. Disabling retries 08:00:26 will disable redirect, too. 
08:00:26 08:00:26 :param assert_same_host: 08:00:26 If ``True``, will make sure that the host of the pool requests is 08:00:26 consistent else will raise HostChangedError. When ``False``, you can 08:00:26 use the pool on an HTTP proxy and request foreign hosts. 08:00:26 08:00:26 :param timeout: 08:00:26 If specified, overrides the default timeout for this one 08:00:26 request. It may be a float (in seconds) or an instance of 08:00:26 :class:`urllib3.util.Timeout`. 08:00:26 08:00:26 :param pool_timeout: 08:00:26 If set and the pool is set to block=True, then this method will 08:00:26 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 08:00:26 connection is available within the time period. 08:00:26 08:00:26 :param bool preload_content: 08:00:26 If True, the response's body will be preloaded into memory. 08:00:26 08:00:26 :param bool decode_content: 08:00:26 If True, will attempt to decode the body based on the 08:00:26 'content-encoding' header. 08:00:26 08:00:26 :param release_conn: 08:00:26 If False, then the urlopen call will not release the connection 08:00:26 back into the pool once a response is received (but will release if 08:00:26 you read the entire contents of the response such as when 08:00:26 `preload_content=True`). This is useful if you're not preloading 08:00:26 the response's content immediately. You will need to call 08:00:26 ``r.release_conn()`` on the response ``r`` to return the connection 08:00:26 back into the pool. If None, it takes the value of ``preload_content`` 08:00:26 which defaults to ``True``. 08:00:26 08:00:26 :param bool chunked: 08:00:26 If True, urllib3 will send the body using chunked transfer 08:00:26 encoding. Otherwise, urllib3 will send the body using the standard 08:00:26 content-length form. Defaults to False. 08:00:26 08:00:26 :param int body_pos: 08:00:26 Position to seek to in file-like body in the event of a retry or 08:00:26 redirect. 
Typically this won't need to be set because urllib3 will 08:00:26 auto-populate the value when needed. 08:00:26 """ 08:00:26 parsed_url = parse_url(url) 08:00:26 destination_scheme = parsed_url.scheme 08:00:26 08:00:26 if headers is None: 08:00:26 headers = self.headers 08:00:26 08:00:26 if not isinstance(retries, Retry): 08:00:26 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 08:00:26 08:00:26 if release_conn is None: 08:00:26 release_conn = preload_content 08:00:26 08:00:26 # Check host 08:00:26 if assert_same_host and not self.is_same_host(url): 08:00:26 raise HostChangedError(self, url, retries) 08:00:26 08:00:26 # Ensure that the URL we're connecting to is properly encoded 08:00:26 if url.startswith("/"): 08:00:26 url = to_str(_encode_target(url)) 08:00:26 else: 08:00:26 url = to_str(parsed_url.url) 08:00:26 08:00:26 conn = None 08:00:26 08:00:26 # Track whether `conn` needs to be released before 08:00:26 # returning/raising/recursing. Update this variable if necessary, and 08:00:26 # leave `release_conn` constant throughout the function. That way, if 08:00:26 # the function recurses, the original value of `release_conn` will be 08:00:26 # passed down into the recursive call, and its value will be respected. 08:00:26 # 08:00:26 # See issue #651 [1] for details. 08:00:26 # 08:00:26 # [1] 08:00:26 release_this_conn = release_conn 08:00:26 08:00:26 http_tunnel_required = connection_requires_http_tunnel( 08:00:26 self.proxy, self.proxy_config, destination_scheme 08:00:26 ) 08:00:26 08:00:26 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 08:00:26 # have to copy the headers dict so we can safely change it without those 08:00:26 # changes being reflected in anyone else's copy. 
08:00:26 if not http_tunnel_required: 08:00:26 headers = headers.copy() # type: ignore[attr-defined] 08:00:26 headers.update(self.proxy_headers) # type: ignore[union-attr] 08:00:26 08:00:26 # Must keep the exception bound to a separate variable or else Python 3 08:00:26 # complains about UnboundLocalError. 08:00:26 err = None 08:00:26 08:00:26 # Keep track of whether we cleanly exited the except block. This 08:00:26 # ensures we do proper cleanup in finally. 08:00:26 clean_exit = False 08:00:26 08:00:26 # Rewind body position, if needed. Record current position 08:00:26 # for future rewinds in the event of a redirect/retry. 08:00:26 body_pos = set_file_position(body, body_pos) 08:00:26 08:00:26 try: 08:00:26 # Request a connection from the queue. 08:00:26 timeout_obj = self._get_timeout(timeout) 08:00:26 conn = self._get_conn(timeout=pool_timeout) 08:00:26 08:00:26 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 08:00:26 08:00:26 # Is this a closed/new connection that requires CONNECT tunnelling? 08:00:26 if self.proxy is not None and http_tunnel_required and conn.is_closed: 08:00:26 try: 08:00:26 self._prepare_proxy(conn) 08:00:26 except (BaseSSLError, OSError, SocketTimeout) as e: 08:00:26 self._raise_timeout( 08:00:26 err=e, url=self.proxy.url, timeout_value=conn.timeout 08:00:26 ) 08:00:26 raise 08:00:26 08:00:26 # If we're going to release the connection in ``finally:``, then 08:00:26 # the response doesn't need to know about the connection. Otherwise 08:00:26 # it will also try to release it and we'll have a double-release 08:00:26 # mess. 
08:00:26 response_conn = conn if not release_conn else None 08:00:26 08:00:26 # Make the request on the HTTPConnection object 08:00:26 > response = self._make_request( 08:00:26 conn, 08:00:26 method, 08:00:26 url, 08:00:26 timeout=timeout_obj, 08:00:26 body=body, 08:00:26 headers=headers, 08:00:26 chunked=chunked, 08:00:26 retries=retries, 08:00:26 response_conn=response_conn, 08:00:26 preload_content=preload_content, 08:00:26 decode_content=decode_content, 08:00:26 **response_kw, 08:00:26 ) 08:00:26 08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:787: 08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:493: in _make_request 08:00:26 conn.request( 08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:500: in request 08:00:26 self.endheaders() 08:00:26 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1298: in endheaders 08:00:26 self._send_output(message_body, encode_chunked=encode_chunked) 08:00:26 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1058: in _send_output 08:00:26 self.send(msg) 08:00:26 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:996: in send 08:00:26 self.connect() 08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:331: in connect 08:00:26 self.sock = self._new_conn() 08:00:26 ^^^^^^^^^^^^^^^^ 08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 08:00:26 08:00:26 self = 08:00:26 08:00:26 def _new_conn(self) -> socket.socket: 08:00:26 """Establish a socket connection and set nodelay settings on it. 08:00:26 08:00:26 :return: New socket connection. 
08:00:26 """ 08:00:26 try: 08:00:26 sock = connection.create_connection( 08:00:26 (self._dns_host, self.port), 08:00:26 self.timeout, 08:00:26 source_address=self.source_address, 08:00:26 socket_options=self.socket_options, 08:00:26 ) 08:00:26 except socket.gaierror as e: 08:00:26 raise NameResolutionError(self.host, self, e) from e 08:00:26 except SocketTimeout as e: 08:00:26 raise ConnectTimeoutError( 08:00:26 self, 08:00:26 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 08:00:26 ) from e 08:00:26 08:00:26 except OSError as e: 08:00:26 > raise NewConnectionError( 08:00:26 self, f"Failed to establish a new connection: {e}" 08:00:26 ) from e 08:00:26 E urllib3.exceptions.NewConnectionError: HTTPConnection(host='localhost', port=8181): Failed to establish a new connection: [Errno 111] Connection refused 08:00:26 08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:219: NewConnectionError 08:00:26 08:00:26 The above exception was the direct cause of the following exception: 08:00:26 08:00:26 self = 08:00:26 request = , stream = False 08:00:26 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None 08:00:26 proxies = OrderedDict() 08:00:26 08:00:26 def send( 08:00:26 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 08:00:26 ): 08:00:26 """Sends PreparedRequest object. Returns Response object. 08:00:26 08:00:26 :param request: The :class:`PreparedRequest ` being sent. 08:00:26 :param stream: (optional) Whether to stream the request content. 08:00:26 :param timeout: (optional) How long to wait for the server to send 08:00:26 data before giving up, as a float, or a :ref:`(connect timeout, 08:00:26 read timeout) ` tuple. 
08:00:26         :type timeout: float or tuple or urllib3 Timeout object
08:00:26         :param verify: (optional) Either a boolean, in which case it controls whether
08:00:26             we verify the server's TLS certificate, or a string, in which case it
08:00:26             must be a path to a CA bundle to use
08:00:26         :param cert: (optional) Any user-provided SSL certificate to be trusted.
08:00:26         :param proxies: (optional) The proxies dictionary to apply to the request.
08:00:26         :rtype: requests.Response
08:00:26         """
08:00:26 
08:00:26         try:
08:00:26             conn = self.get_connection_with_tls_context(
08:00:26                 request, verify, proxies=proxies, cert=cert
08:00:26             )
08:00:26         except LocationValueError as e:
08:00:26             raise InvalidURL(e, request=request)
08:00:26 
08:00:26         self.cert_verify(conn, request.url, verify, cert)
08:00:26         url = self.request_url(request, proxies)
08:00:26         self.add_headers(
08:00:26             request,
08:00:26             stream=stream,
08:00:26             timeout=timeout,
08:00:26             verify=verify,
08:00:26             cert=cert,
08:00:26             proxies=proxies,
08:00:26         )
08:00:26 
08:00:26         chunked = not (request.body is None or "Content-Length" in request.headers)
08:00:26 
08:00:26         if isinstance(timeout, tuple):
08:00:26             try:
08:00:26                 connect, read = timeout
08:00:26                 timeout = TimeoutSauce(connect=connect, read=read)
08:00:26             except ValueError:
08:00:26                 raise ValueError(
08:00:26                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
08:00:26                     f"or a single float to set both timeouts to the same value."
08:00:26                 )
08:00:26         elif isinstance(timeout, TimeoutSauce):
08:00:26             pass
08:00:26         else:
08:00:26             timeout = TimeoutSauce(connect=timeout, read=timeout)
08:00:26 
08:00:26         try:
08:00:26 >           resp = conn.urlopen(
08:00:26                 method=request.method,
08:00:26                 url=url,
08:00:26                 body=request.body,
08:00:26                 headers=request.headers,
08:00:26                 redirect=False,
08:00:26                 assert_same_host=False,
08:00:26                 preload_content=False,
08:00:26                 decode_content=False,
08:00:26                 retries=self.max_retries,
08:00:26                 timeout=timeout,
08:00:26                 chunked=chunked,
08:00:26             )
08:00:26 
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:644: 
08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen
08:00:26     retries = retries.increment(
08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
08:00:26 
08:00:26 self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
08:00:26 method = 'PUT'
08:00:26 url = '/rests/data/ietf-network:networks/network=openroadm-topology'
08:00:26 response = None
08:00:26 error = NewConnectionError("HTTPConnection(host='localhost', port=8181): Failed to establish a new connection: [Errno 111] Connection refused")
08:00:26 _pool = 
08:00:26 _stacktrace = 
08:00:26 
08:00:26     def increment(
08:00:26         self,
08:00:26         method: str | None = None,
08:00:26         url: str | None = None,
08:00:26         response: BaseHTTPResponse | None = None,
08:00:26         error: Exception | None = None,
08:00:26         _pool: ConnectionPool | None = None,
08:00:26         _stacktrace: TracebackType | None = None,
08:00:26     ) -> Self:
08:00:26         """Return a new Retry object with incremented retry counters.
08:00:26 
08:00:26         :param response: A response object, or None, if the server did not
08:00:26             return a response.
08:00:26         :type response: :class:`~urllib3.response.BaseHTTPResponse`
08:00:26         :param Exception error: An error encountered during the request, or
08:00:26             None if the response was received successfully.
08:00:26 
08:00:26         :return: A new ``Retry`` object.
08:00:26         """
08:00:26         if self.total is False and error:
08:00:26             # Disabled, indicate to re-raise the error.
08:00:26             raise reraise(type(error), error, _stacktrace)
08:00:26 
08:00:26         total = self.total
08:00:26         if total is not None:
08:00:26             total -= 1
08:00:26 
08:00:26         connect = self.connect
08:00:26         read = self.read
08:00:26         redirect = self.redirect
08:00:26         status_count = self.status
08:00:26         other = self.other
08:00:26         cause = "unknown"
08:00:26         status = None
08:00:26         redirect_location = None
08:00:26 
08:00:26         if error and self._is_connection_error(error):
08:00:26             # Connect retry?
08:00:26             if connect is False:
08:00:26                 raise reraise(type(error), error, _stacktrace)
08:00:26             elif connect is not None:
08:00:26                 connect -= 1
08:00:26 
08:00:26         elif error and self._is_read_error(error):
08:00:26             # Read retry?
08:00:26             if read is False or method is None or not self._is_method_retryable(method):
08:00:26                 raise reraise(type(error), error, _stacktrace)
08:00:26             elif read is not None:
08:00:26                 read -= 1
08:00:26 
08:00:26         elif error:
08:00:26             # Other retry?
08:00:26             if other is not None:
08:00:26                 other -= 1
08:00:26 
08:00:26         elif response and response.get_redirect_location():
08:00:26             # Redirect retry?
08:00:26             if redirect is not None:
08:00:26                 redirect -= 1
08:00:26             cause = "too many redirects"
08:00:26             response_redirect_location = response.get_redirect_location()
08:00:26             if response_redirect_location:
08:00:26                 redirect_location = response_redirect_location
08:00:26             status = response.status
08:00:26 
08:00:26         else:
08:00:26             # Incrementing because of a server error like a 500 in
08:00:26             # status_forcelist and the given method is in the allowed_methods
08:00:26             cause = ResponseError.GENERIC_ERROR
08:00:26             if response and response.status:
08:00:26                 if status_count is not None:
08:00:26                     status_count -= 1
08:00:26                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
08:00:26                 status = response.status
08:00:26 
08:00:26         history = self.history + (
08:00:26             RequestHistory(method, url, error, status, redirect_location),
08:00:26         )
08:00:26 
08:00:26         new_retry = self.new(
08:00:26             total=total,
08:00:26             connect=connect,
08:00:26             read=read,
08:00:26             redirect=redirect,
08:00:26             status=status_count,
08:00:26             other=other,
08:00:26             history=history,
08:00:26         )
08:00:26 
08:00:26         if new_retry.is_exhausted():
08:00:26             reason = error or ResponseError(cause)
08:00:26 >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
08:00:26             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
08:00:26 E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8181): Failed to establish a new connection: [Errno 111] Connection refused"))
08:00:26 
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:535: MaxRetryError
08:00:26 
08:00:26 During handling of the above exception, another exception occurred:
08:00:26 
08:00:26 self = 
08:00:26 
08:00:26     def test_11_load_openroadm_topology_bi_cfg(self):
08:00:26 >       response = test_utils.put_ietf_network('openroadm-topology', self.topo_bi_dir_data)
08:00:26         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
08:00:26 
08:00:26 transportpce_tests/pce/test02_pce_400G.py:327: 
08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
08:00:26 transportpce_tests/common/test_utils.py:578: in put_ietf_network
08:00:26     response = put_request(url[RESTCONF_VERSION].format('{}', network), json_payload)
08:00:26                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
08:00:26 transportpce_tests/common/test_utils.py:125: in put_request
08:00:26     return requests.request(
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request
08:00:26     return session.request(method=method, url=url, **kwargs)
08:00:26            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request
08:00:26     resp = self.send(prep, **send_kwargs)
08:00:26            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send
08:00:26     r = adapter.send(request, **kwargs)
08:00:26         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
08:00:26 
08:00:26 self = 
08:00:26 request = , stream = False
08:00:26 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
08:00:26 proxies = OrderedDict()
08:00:26 
08:00:26     def send(
08:00:26         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
08:00:26     ):
08:00:26         """Sends PreparedRequest object. Returns Response object.
08:00:26 
08:00:26         :param request: The :class:`PreparedRequest ` being sent.
08:00:26         :param stream: (optional) Whether to stream the request content.
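Before calling `conn.urlopen()`, the `send()` listings in this log normalize the user-facing `timeout` argument: a `(connect, read)` tuple or a single float both become an explicit pair. That logic reduces to very little code; in this sketch `Timeout` is a hypothetical named tuple standing in for requests' internal `TimeoutSauce`:

```python
from collections import namedtuple

# Hypothetical stand-in for requests' internal timeout object.
Timeout = namedtuple("Timeout", ["connect", "read"])

def normalize_timeout(timeout):
    """Normalize a timeout value the way requests' adapter.send() does."""
    if timeout is None:
        return None
    if isinstance(timeout, Timeout):   # checked before tuple: Timeout IS a tuple
        return timeout
    if isinstance(timeout, tuple):
        try:
            connect, read = timeout
        except ValueError:
            raise ValueError(
                f"Invalid timeout {timeout}. Pass a (connect, read) tuple "
                f"or a single float to set both timeouts to the same value."
            )
        return Timeout(connect=connect, read=read)
    # a bare number sets both timeouts
    return Timeout(connect=timeout, read=timeout)
```

This is why the locals above show `timeout = Timeout(connect=30, read=30, total=None)`: the test utilities pass a single 30-second value, which is expanded to both budgets.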
08:00:26         :param timeout: (optional) How long to wait for the server to send
08:00:26             data before giving up, as a float, or a :ref:`(connect timeout,
08:00:26             read timeout) ` tuple.
08:00:26         :type timeout: float or tuple or urllib3 Timeout object
08:00:26         :param verify: (optional) Either a boolean, in which case it controls whether
08:00:26             we verify the server's TLS certificate, or a string, in which case it
08:00:26             must be a path to a CA bundle to use
08:00:26         :param cert: (optional) Any user-provided SSL certificate to be trusted.
08:00:26         :param proxies: (optional) The proxies dictionary to apply to the request.
08:00:26         :rtype: requests.Response
08:00:26         """
08:00:26 
08:00:26         try:
08:00:26             conn = self.get_connection_with_tls_context(
08:00:26                 request, verify, proxies=proxies, cert=cert
08:00:26             )
08:00:26         except LocationValueError as e:
08:00:26             raise InvalidURL(e, request=request)
08:00:26 
08:00:26         self.cert_verify(conn, request.url, verify, cert)
08:00:26         url = self.request_url(request, proxies)
08:00:26         self.add_headers(
08:00:26             request,
08:00:26             stream=stream,
08:00:26             timeout=timeout,
08:00:26             verify=verify,
08:00:26             cert=cert,
08:00:26             proxies=proxies,
08:00:26         )
08:00:26 
08:00:26         chunked = not (request.body is None or "Content-Length" in request.headers)
08:00:26 
08:00:26         if isinstance(timeout, tuple):
08:00:26             try:
08:00:26                 connect, read = timeout
08:00:26                 timeout = TimeoutSauce(connect=connect, read=read)
08:00:26             except ValueError:
08:00:26                 raise ValueError(
08:00:26                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
08:00:26                     f"or a single float to set both timeouts to the same value."
08:00:26                 )
08:00:26         elif isinstance(timeout, TimeoutSauce):
08:00:26             pass
08:00:26         else:
08:00:26             timeout = TimeoutSauce(connect=timeout, read=timeout)
08:00:26 
08:00:26         try:
08:00:26             resp = conn.urlopen(
08:00:26                 method=request.method,
08:00:26                 url=url,
08:00:26                 body=request.body,
08:00:26                 headers=request.headers,
08:00:26                 redirect=False,
08:00:26                 assert_same_host=False,
08:00:26                 preload_content=False,
08:00:26                 decode_content=False,
08:00:26                 retries=self.max_retries,
08:00:26                 timeout=timeout,
08:00:26                 chunked=chunked,
08:00:26             )
08:00:26 
08:00:26         except (ProtocolError, OSError) as err:
08:00:26             raise ConnectionError(err, request=request)
08:00:26 
08:00:26         except MaxRetryError as e:
08:00:26             if isinstance(e.reason, ConnectTimeoutError):
08:00:26                 # TODO: Remove this in 3.0.0: see #2811
08:00:26                 if not isinstance(e.reason, NewConnectionError):
08:00:26                     raise ConnectTimeout(e, request=request)
08:00:26 
08:00:26             if isinstance(e.reason, ResponseError):
08:00:26                 raise RetryError(e, request=request)
08:00:26 
08:00:26             if isinstance(e.reason, _ProxyError):
08:00:26                 raise ProxyError(e, request=request)
08:00:26 
08:00:26             if isinstance(e.reason, _SSLError):
08:00:26                 # This branch is for urllib3 v1.22 and later.
08:00:26                 raise SSLError(e, request=request)
08:00:26 
08:00:26 >           raise ConnectionError(e, request=request)
08:00:26 E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/data/ietf-network:networks/network=openroadm-topology (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8181): Failed to establish a new connection: [Errno 111] Connection refused"))
08:00:26 
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:677: ConnectionError
08:00:26 ----------------------------- Captured stdout call -----------------------------
08:00:26 execution of test_11_load_openroadm_topology_bi_cfg
08:00:26 ______ TestTransportPCEPce400g.test_12_path_computation_400G_xpdr_bi_cfg _______
08:00:26 
08:00:26 self = 
08:00:26 
08:00:26     def _new_conn(self) -> socket.socket:
08:00:26         """Establish a socket connection and set nodelay settings on it.
08:00:26 
08:00:26         :return: New socket connection.
08:00:26         """
08:00:26         try:
08:00:26 >           sock = connection.create_connection(
08:00:26                 (self._dns_host, self.port),
08:00:26                 self.timeout,
08:00:26                 source_address=self.source_address,
08:00:26                 socket_options=self.socket_options,
08:00:26             )
08:00:26 
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:204: 
08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection
08:00:26     raise err
08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
08:00:26 
08:00:26 address = ('localhost', 8181), timeout = 30, source_address = None
08:00:26 socket_options = [(6, 1, 1)]
08:00:26 
08:00:26     def create_connection(
08:00:26         address: tuple[str, int],
08:00:26         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
08:00:26         source_address: tuple[str, int] | None = None,
08:00:26         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
08:00:26     ) -> socket.socket:
08:00:26         """Connect to *address* and return the socket object.
08:00:26 
08:00:26         Convenience function.  Connect to *address* (a 2-tuple ``(host,
08:00:26         port)``) and return the socket object.  Passing the optional
08:00:26         *timeout* parameter will set the timeout on the socket instance
08:00:26         before attempting to connect.  If no *timeout* is supplied, the
08:00:26         global default timeout setting returned by :func:`socket.getdefaulttimeout`
08:00:26         is used.  If *source_address* is set it must be a tuple of (host, port)
08:00:26         for the socket to bind as a source address before making the connection.
08:00:26         An host of '' or port 0 tells the OS to use the default.
08:00:26         """
08:00:26 
08:00:26         host, port = address
08:00:26         if host.startswith("["):
08:00:26             host = host.strip("[]")
08:00:26         err = None
08:00:26 
08:00:26         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
08:00:26         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
08:00:26         # The original create_connection function always returns all records.
08:00:26         family = allowed_gai_family()
08:00:26 
08:00:26         try:
08:00:26             host.encode("idna")
08:00:26         except UnicodeError:
08:00:26             raise LocationParseError(f"'{host}', label empty or too long") from None
08:00:26 
08:00:26         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
08:00:26             af, socktype, proto, canonname, sa = res
08:00:26             sock = None
08:00:26             try:
08:00:26                 sock = socket.socket(af, socktype, proto)
08:00:26 
08:00:26                 # If provided, set socket level options before connecting.
08:00:26                 _set_socket_options(sock, socket_options)
08:00:26 
08:00:26                 if timeout is not _DEFAULT_TIMEOUT:
08:00:26                     sock.settimeout(timeout)
08:00:26                 if source_address:
08:00:26                     sock.bind(source_address)
08:00:26 >               sock.connect(sa)
08:00:26 E               ConnectionRefusedError: [Errno 111] Connection refused
08:00:26 
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
08:00:26 
08:00:26 The above exception was the direct cause of the following exception:
08:00:26 
08:00:26 self = 
08:00:26 method = 'POST'
08:00:26 url = '/rests/operations/transportpce-pce:path-computation-request'
08:00:26 body = '{"input": {"service-name": "service-1", "resource-reserve": "true", "service-handler-header": {"request-id": "request...ate": "400", "clli": "nodeC", "service-format": "Ethernet", "node-id": "XPDR-C2"}, "pce-routing-metric": "hop-count"}}'
08:00:26 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '379', 'Authorization': 'Basic YWRtaW46YWRtaW4='}
08:00:26 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
08:00:26 redirect = False, assert_same_host = False
08:00:26 timeout = Timeout(connect=30, read=30, total=None), pool_timeout = None
08:00:26 release_conn = False, chunked = False, body_pos = None, preload_content = False
08:00:26 decode_content = False, response_kw = {}
08:00:26 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/operations/transportpce-pce:path-computation-request', query=None, fragment=None)
08:00:26 destination_scheme = None, conn = None, release_this_conn = True
08:00:26 http_tunnel_required = False, err = None, clean_exit = False
08:00:26 
08:00:26     def urlopen(  # type: ignore[override]
08:00:26         self,
08:00:26         method: str,
08:00:26         url: str,
08:00:26         body: _TYPE_BODY | None = None,
08:00:26         headers: typing.Mapping[str, str] | None = None,
08:00:26         retries: Retry | bool | int | None = None,
08:00:26         redirect: bool = True,
08:00:26         assert_same_host: bool = True,
08:00:26         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
08:00:26         pool_timeout: int | None = None,
08:00:26         release_conn: bool | None = None,
08:00:26         chunked: bool = False,
08:00:26         body_pos: _TYPE_BODY_POSITION | None = None,
08:00:26         preload_content: bool = True,
08:00:26         decode_content: bool = True,
08:00:26         **response_kw: typing.Any,
08:00:26     ) -> BaseHTTPResponse:
08:00:26         """
08:00:26         Get a connection from the pool and perform an HTTP request. This is the
08:00:26         lowest level call for making a request, so you'll need to specify all
08:00:26         the raw details.
08:00:26 
08:00:26         .. note::
08:00:26 
08:00:26            More commonly, it's appropriate to use a convenience method
08:00:26            such as :meth:`request`.
08:00:26 
08:00:26         .. note::
08:00:26 
08:00:26            `release_conn` will only behave as expected if
08:00:26            `preload_content=False` because we want to make
08:00:26            `preload_content=False` the default behaviour someday soon without
08:00:26            breaking backwards compatibility.
08:00:26 
08:00:26         :param method:
08:00:26             HTTP request method (such as GET, POST, PUT, etc.)
08:00:26 
08:00:26         :param url:
08:00:26             The URL to perform the request on.
08:00:26 
08:00:26         :param body:
08:00:26             Data to send in the request body, either :class:`str`, :class:`bytes`,
08:00:26             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
08:00:26 
08:00:26         :param headers:
08:00:26             Dictionary of custom headers to send, such as User-Agent,
08:00:26             If-None-Match, etc. If None, pool headers are used. If provided,
08:00:26             these headers completely replace any pool-specific headers.
08:00:26 
08:00:26         :param retries:
08:00:26             Configure the number of retries to allow before raising a
08:00:26             :class:`~urllib3.exceptions.MaxRetryError` exception.
08:00:26 
08:00:26             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
08:00:26             :class:`~urllib3.util.retry.Retry` object for fine-grained control
08:00:26             over different types of retries.
08:00:26             Pass an integer number to retry connection errors that many times,
08:00:26             but no other types of errors. Pass zero to never retry.
08:00:26 
08:00:26             If ``False``, then retries are disabled and any exception is raised
08:00:26             immediately. Also, instead of raising a MaxRetryError on redirects,
08:00:26             the redirect response will be returned.
08:00:26 
08:00:26         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
08:00:26 
08:00:26         :param redirect:
08:00:26             If True, automatically handle redirects (status codes 301, 302,
08:00:26             303, 307, 308). Each redirect counts as a retry. Disabling retries
08:00:26             will disable redirect, too.
08:00:26 
08:00:26         :param assert_same_host:
08:00:26             If ``True``, will make sure that the host of the pool requests is
08:00:26             consistent else will raise HostChangedError. When ``False``, you can
08:00:26             use the pool on an HTTP proxy and request foreign hosts.
08:00:26 
08:00:26         :param timeout:
08:00:26             If specified, overrides the default timeout for this one
08:00:26             request. It may be a float (in seconds) or an instance of
08:00:26             :class:`urllib3.util.Timeout`.
08:00:26 
08:00:26         :param pool_timeout:
08:00:26             If set and the pool is set to block=True, then this method will
08:00:26             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
08:00:26             connection is available within the time period.
08:00:26 
08:00:26         :param bool preload_content:
08:00:26             If True, the response's body will be preloaded into memory.
08:00:26 
08:00:26         :param bool decode_content:
08:00:26             If True, will attempt to decode the body based on the
08:00:26             'content-encoding' header.
08:00:26 
08:00:26         :param release_conn:
08:00:26             If False, then the urlopen call will not release the connection
08:00:26             back into the pool once a response is received (but will release if
08:00:26             you read the entire contents of the response such as when
08:00:26             `preload_content=True`). This is useful if you're not preloading
08:00:26             the response's content immediately. You will need to call
08:00:26             ``r.release_conn()`` on the response ``r`` to return the connection
08:00:26             back into the pool. If None, it takes the value of ``preload_content``
08:00:26             which defaults to ``True``.
08:00:26 
08:00:26         :param bool chunked:
08:00:26             If True, urllib3 will send the body using chunked transfer
08:00:26             encoding. Otherwise, urllib3 will send the body using the standard
08:00:26             content-length form. Defaults to False.
08:00:26 
08:00:26         :param int body_pos:
08:00:26             Position to seek to in file-like body in the event of a retry or
08:00:26             redirect. Typically this won't need to be set because urllib3 will
08:00:26             auto-populate the value when needed.
08:00:26         """
08:00:26         parsed_url = parse_url(url)
08:00:26         destination_scheme = parsed_url.scheme
08:00:26 
08:00:26         if headers is None:
08:00:26             headers = self.headers
08:00:26 
08:00:26         if not isinstance(retries, Retry):
08:00:26             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
08:00:26 
08:00:26         if release_conn is None:
08:00:26             release_conn = preload_content
08:00:26 
08:00:26         # Check host
08:00:26         if assert_same_host and not self.is_same_host(url):
08:00:26             raise HostChangedError(self, url, retries)
08:00:26 
08:00:26         # Ensure that the URL we're connecting to is properly encoded
08:00:26         if url.startswith("/"):
08:00:26             url = to_str(_encode_target(url))
08:00:26         else:
08:00:26             url = to_str(parsed_url.url)
08:00:26 
08:00:26         conn = None
08:00:26 
08:00:26         # Track whether `conn` needs to be released before
08:00:26         # returning/raising/recursing. Update this variable if necessary, and
08:00:26         # leave `release_conn` constant throughout the function. That way, if
08:00:26         # the function recurses, the original value of `release_conn` will be
08:00:26         # passed down into the recursive call, and its value will be respected.
08:00:26         #
08:00:26         # See issue #651 [1] for details.
08:00:26         #
08:00:26         # [1] 
08:00:26         release_this_conn = release_conn
08:00:26 
08:00:26         http_tunnel_required = connection_requires_http_tunnel(
08:00:26             self.proxy, self.proxy_config, destination_scheme
08:00:26         )
08:00:26 
08:00:26         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
08:00:26         # have to copy the headers dict so we can safely change it without those
08:00:26         # changes being reflected in anyone else's copy.
08:00:26         if not http_tunnel_required:
08:00:26             headers = headers.copy()  # type: ignore[attr-defined]
08:00:26             headers.update(self.proxy_headers)  # type: ignore[union-attr]
08:00:26 
08:00:26         # Must keep the exception bound to a separate variable or else Python 3
08:00:26         # complains about UnboundLocalError.
08:00:26         err = None
08:00:26 
08:00:26         # Keep track of whether we cleanly exited the except block. This
08:00:26         # ensures we do proper cleanup in finally.
08:00:26         clean_exit = False
08:00:26 
08:00:26         # Rewind body position, if needed. Record current position
08:00:26         # for future rewinds in the event of a redirect/retry.
08:00:26         body_pos = set_file_position(body, body_pos)
08:00:26 
08:00:26         try:
08:00:26             # Request a connection from the queue.
08:00:26             timeout_obj = self._get_timeout(timeout)
08:00:26             conn = self._get_conn(timeout=pool_timeout)
08:00:26 
08:00:26             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
08:00:26 
08:00:26             # Is this a closed/new connection that requires CONNECT tunnelling?
08:00:26             if self.proxy is not None and http_tunnel_required and conn.is_closed:
08:00:26                 try:
08:00:26                     self._prepare_proxy(conn)
08:00:26                 except (BaseSSLError, OSError, SocketTimeout) as e:
08:00:26                     self._raise_timeout(
08:00:26                         err=e, url=self.proxy.url, timeout_value=conn.timeout
08:00:26                     )
08:00:26                     raise
08:00:26 
08:00:26             # If we're going to release the connection in ``finally:``, then
08:00:26             # the response doesn't need to know about the connection. Otherwise
08:00:26             # it will also try to release it and we'll have a double-release
08:00:26             # mess.
08:00:26             response_conn = conn if not release_conn else None
08:00:26 
08:00:26             # Make the request on the HTTPConnection object
08:00:26 >           response = self._make_request(
08:00:26                 conn,
08:00:26                 method,
08:00:26                 url,
08:00:26                 timeout=timeout_obj,
08:00:26                 body=body,
08:00:26                 headers=headers,
08:00:26                 chunked=chunked,
08:00:26                 retries=retries,
08:00:26                 response_conn=response_conn,
08:00:26                 preload_content=preload_content,
08:00:26                 decode_content=decode_content,
08:00:26                 **response_kw,
08:00:26             )
08:00:26 
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:787: 
08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:493: in _make_request
08:00:26     conn.request(
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:500: in request
08:00:26     self.endheaders()
08:00:26 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1298: in endheaders
08:00:26     self._send_output(message_body, encode_chunked=encode_chunked)
08:00:26 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1058: in _send_output
08:00:26     self.send(msg)
08:00:26 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:996: in send
08:00:26     self.connect()
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:331: in connect
08:00:26     self.sock = self._new_conn()
08:00:26                 ^^^^^^^^^^^^^^^^
08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
08:00:26 
08:00:26 self = 
08:00:26 
08:00:26     def _new_conn(self) -> socket.socket:
08:00:26         """Establish a socket connection and set nodelay settings on it.
08:00:26 
08:00:26         :return: New socket connection.
08:00:26         """
08:00:26         try:
08:00:26             sock = connection.create_connection(
08:00:26                 (self._dns_host, self.port),
08:00:26                 self.timeout,
08:00:26                 source_address=self.source_address,
08:00:26                 socket_options=self.socket_options,
08:00:26             )
08:00:26         except socket.gaierror as e:
08:00:26             raise NameResolutionError(self.host, self, e) from e
08:00:26         except SocketTimeout as e:
08:00:26             raise ConnectTimeoutError(
08:00:26                 self,
08:00:26                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
08:00:26             ) from e
08:00:26 
08:00:26         except OSError as e:
08:00:26 >           raise NewConnectionError(
08:00:26                 self, f"Failed to establish a new connection: {e}"
08:00:26             ) from e
08:00:26 E           urllib3.exceptions.NewConnectionError: HTTPConnection(host='localhost', port=8181): Failed to establish a new connection: [Errno 111] Connection refused
08:00:26 
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connection.py:219: NewConnectionError
08:00:26 
08:00:26 The above exception was the direct cause of the following exception:
08:00:26 
08:00:26 self = 
08:00:26 request = , stream = False
08:00:26 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
08:00:26 proxies = OrderedDict()
08:00:26 
08:00:26     def send(
08:00:26         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
08:00:26     ):
08:00:26         """Sends PreparedRequest object. Returns Response object.
08:00:26 
08:00:26         :param request: The :class:`PreparedRequest ` being sent.
08:00:26         :param stream: (optional) Whether to stream the request content.
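Each `send()` listing in this log also shows how requests picks its body framing in a single expression: `chunked = not (request.body is None or "Content-Length" in request.headers)`. As a tiny standalone sketch of that rule (the `use_chunked` helper is invented for illustration):

```python
# Sketch of the framing decision in requests' adapter.send():
# use chunked transfer-encoding only when there is a body whose size
# was not announced via a Content-Length header.
def use_chunked(body, headers):
    return not (body is None or "Content-Length" in headers)
```

In the failing tests the JSON payload carries `Content-Length: 379`, so `chunked = False`, as the captured locals confirm.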
08:00:26         :param timeout: (optional) How long to wait for the server to send
08:00:26             data before giving up, as a float, or a :ref:`(connect timeout,
08:00:26             read timeout) ` tuple.
08:00:26         :type timeout: float or tuple or urllib3 Timeout object
08:00:26         :param verify: (optional) Either a boolean, in which case it controls whether
08:00:26             we verify the server's TLS certificate, or a string, in which case it
08:00:26             must be a path to a CA bundle to use
08:00:26         :param cert: (optional) Any user-provided SSL certificate to be trusted.
08:00:26         :param proxies: (optional) The proxies dictionary to apply to the request.
08:00:26         :rtype: requests.Response
08:00:26         """
08:00:26 
08:00:26         try:
08:00:26             conn = self.get_connection_with_tls_context(
08:00:26                 request, verify, proxies=proxies, cert=cert
08:00:26             )
08:00:26         except LocationValueError as e:
08:00:26             raise InvalidURL(e, request=request)
08:00:26 
08:00:26         self.cert_verify(conn, request.url, verify, cert)
08:00:26         url = self.request_url(request, proxies)
08:00:26         self.add_headers(
08:00:26             request,
08:00:26             stream=stream,
08:00:26             timeout=timeout,
08:00:26             verify=verify,
08:00:26             cert=cert,
08:00:26             proxies=proxies,
08:00:26         )
08:00:26 
08:00:26         chunked = not (request.body is None or "Content-Length" in request.headers)
08:00:26 
08:00:26         if isinstance(timeout, tuple):
08:00:26             try:
08:00:26                 connect, read = timeout
08:00:26                 timeout = TimeoutSauce(connect=connect, read=read)
08:00:26             except ValueError:
08:00:26                 raise ValueError(
08:00:26                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
08:00:26                     f"or a single float to set both timeouts to the same value."
08:00:26                 )
08:00:26         elif isinstance(timeout, TimeoutSauce):
08:00:26             pass
08:00:26         else:
08:00:26             timeout = TimeoutSauce(connect=timeout, read=timeout)
08:00:26 
08:00:26         try:
08:00:26 >           resp = conn.urlopen(
08:00:26                 method=request.method,
08:00:26                 url=url,
08:00:26                 body=request.body,
08:00:26                 headers=request.headers,
08:00:26                 redirect=False,
08:00:26                 assert_same_host=False,
08:00:26                 preload_content=False,
08:00:26                 decode_content=False,
08:00:26                 retries=self.max_retries,
08:00:26                 timeout=timeout,
08:00:26                 chunked=chunked,
08:00:26             )
08:00:26 
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:644: 
08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen
08:00:26     retries = retries.increment(
08:00:26 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
08:00:26 
08:00:26 self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
08:00:26 method = 'POST'
08:00:26 url = '/rests/operations/transportpce-pce:path-computation-request'
08:00:26 response = None
08:00:26 error = NewConnectionError("HTTPConnection(host='localhost', port=8181): Failed to establish a new connection: [Errno 111] Connection refused")
08:00:26 _pool = 
08:00:26 _stacktrace = 
08:00:26 
08:00:26     def increment(
08:00:26         self,
08:00:26         method: str | None = None,
08:00:26         url: str | None = None,
08:00:26         response: BaseHTTPResponse | None = None,
08:00:26         error: Exception | None = None,
08:00:26         _pool: ConnectionPool | None = None,
08:00:26         _stacktrace: TracebackType | None = None,
08:00:26     ) -> Self:
08:00:26         """Return a new Retry object with incremented retry counters.
08:00:26 
08:00:26         :param response: A response object, or None, if the server did not
08:00:26             return a response.
08:00:26         :type response: :class:`~urllib3.response.BaseHTTPResponse`
08:00:26         :param Exception error: An error encountered during the request, or
08:00:26             None if the response was received successfully.
08:00:26 
08:00:26         :return: A new ``Retry`` object.
08:00:26         """
08:00:26         if self.total is False and error:
08:00:26             # Disabled, indicate to re-raise the error.
08:00:26             raise reraise(type(error), error, _stacktrace)
08:00:26 
08:00:26         total = self.total
08:00:26         if total is not None:
08:00:26             total -= 1
08:00:26 
08:00:26         connect = self.connect
08:00:26         read = self.read
08:00:26         redirect = self.redirect
08:00:26         status_count = self.status
08:00:26         other = self.other
08:00:26         cause = "unknown"
08:00:26         status = None
08:00:26         redirect_location = None
08:00:26 
08:00:26         if error and self._is_connection_error(error):
08:00:26             # Connect retry?
08:00:26             if connect is False:
08:00:26                 raise reraise(type(error), error, _stacktrace)
08:00:26             elif connect is not None:
08:00:26                 connect -= 1
08:00:26 
08:00:26         elif error and self._is_read_error(error):
08:00:26             # Read retry?
08:00:26             if read is False or method is None or not self._is_method_retryable(method):
08:00:26                 raise reraise(type(error), error, _stacktrace)
08:00:26             elif read is not None:
08:00:26                 read -= 1
08:00:26 
08:00:26         elif error:
08:00:26             # Other retry?
08:00:26             if other is not None:
08:00:26                 other -= 1
08:00:26 
08:00:26         elif response and response.get_redirect_location():
08:00:26             # Redirect retry?
08:00:26             if redirect is not None:
08:00:26                 redirect -= 1
08:00:26             cause = "too many redirects"
08:00:26             response_redirect_location = response.get_redirect_location()
08:00:26             if response_redirect_location:
08:00:26                 redirect_location = response_redirect_location
08:00:26             status = response.status
08:00:26 
08:00:26         else:
08:00:26             # Incrementing because of a server error like a 500 in
08:00:26             # status_forcelist and the given method is in the allowed_methods
08:00:26             cause = ResponseError.GENERIC_ERROR
08:00:26             if response and response.status:
08:00:26                 if status_count is not None:
08:00:26                     status_count -= 1
08:00:26                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
08:00:26                 status = response.status
08:00:26 
08:00:26         history = self.history + (
08:00:26             RequestHistory(method, url, error, status, redirect_location),
08:00:26         )
08:00:26 
08:00:26         new_retry = self.new(
08:00:26             total=total,
08:00:26             connect=connect,
08:00:26             read=read,
08:00:26             redirect=redirect,
08:00:26             status=status_count,
08:00:26             other=other,
08:00:26             history=history,
08:00:26         )
08:00:26 
08:00:26         if new_retry.is_exhausted():
08:00:26             reason = error or ResponseError(cause)
08:00:26 >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
08:00:26             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
08:00:26 E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8181): Failed to establish a new connection: [Errno 111] Connection refused"))
08:00:26 
08:00:26 ../.tox/testsPCE/lib/python3.11/site-packages/urllib3/util/retry.py:535: MaxRetryError
08:00:26 
08:00:26 During handling of the above exception, another exception occurred:
08:00:26 
08:00:26 self = 
08:00:26 
08:00:26     def test_12_path_computation_400G_xpdr_bi_cfg(self):
08:00:26         path_computation_input_data = {
08:00:26             "service-name": "service-1",
08:00:26             "resource-reserve": "true",
08:00:26             "service-handler-header": {
08:00:26                 "request-id": "request1"
08:00:26             },
08:00:26             "service-a-end": {
08:00:27                 "service-rate": "400",
08:00:27                 "clli": "nodeA",
08:00:27                 "service-format": "Ethernet",
08:00:27                 "node-id": "XPDR-A2"
08:00:27             },
08:00:27             "service-z-end": {
08:00:27                 "service-rate": "400",
08:00:27                 "clli": "nodeC",
08:00:27                 "service-format": "Ethernet",
08:00:27                 "node-id": "XPDR-C2"
08:00:27             },
08:00:27             "pce-routing-metric": "hop-count"
08:00:27         }
08:00:27 >       response = test_utils.transportpce_api_rpc_request('transportpce-pce',
08:00:27                                                             'path-computation-request',
08:00:27                                                             path_computation_input_data)
08:00:27 
08:00:27 transportpce_tests/pce/test02_pce_400G.py:368: 
08:00:27 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
08:00:27 transportpce_tests/common/test_utils.py:751: in transportpce_api_rpc_request
08:00:27     response = post_request(url, data)
08:00:27                ^^^^^^^^^^^^^^^^^^^^^^^
08:00:27 transportpce_tests/common/test_utils.py:143: in post_request
08:00:27     return requests.request(
08:00:27 ../.tox/testsPCE/lib/python3.11/site-packages/requests/api.py:59: in request
08:00:27     return session.request(method=method, url=url, **kwargs)
08:00:27            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
08:00:27 ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:589: in request
08:00:27     resp = self.send(prep, **send_kwargs)
08:00:27            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
08:00:27 ../.tox/testsPCE/lib/python3.11/site-packages/requests/sessions.py:703: in send
08:00:27     r = adapter.send(request, **kwargs)
08:00:27         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
08:00:27 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
08:00:27 
08:00:27 self = 
08:00:27 request = , stream = False
08:00:27 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
08:00:27 proxies = OrderedDict()
08:00:27 
08:00:27     def send(
08:00:27         self,
request, stream=False, timeout=None, verify=True, cert=None, proxies=None 08:00:27 ): 08:00:27 """Sends PreparedRequest object. Returns Response object. 08:00:27 08:00:27 :param request: The :class:`PreparedRequest ` being sent. 08:00:27 :param stream: (optional) Whether to stream the request content. 08:00:27 :param timeout: (optional) How long to wait for the server to send 08:00:27 data before giving up, as a float, or a :ref:`(connect timeout, 08:00:27 read timeout) ` tuple. 08:00:27 :type timeout: float or tuple or urllib3 Timeout object 08:00:27 :param verify: (optional) Either a boolean, in which case it controls whether 08:00:27 we verify the server's TLS certificate, or a string, in which case it 08:00:27 must be a path to a CA bundle to use 08:00:27 :param cert: (optional) Any user-provided SSL certificate to be trusted. 08:00:27 :param proxies: (optional) The proxies dictionary to apply to the request. 08:00:27 :rtype: requests.Response 08:00:27 """ 08:00:27 08:00:27 try: 08:00:27 conn = self.get_connection_with_tls_context( 08:00:27 request, verify, proxies=proxies, cert=cert 08:00:27 ) 08:00:27 except LocationValueError as e: 08:00:27 raise InvalidURL(e, request=request) 08:00:27 08:00:27 self.cert_verify(conn, request.url, verify, cert) 08:00:27 url = self.request_url(request, proxies) 08:00:27 self.add_headers( 08:00:27 request, 08:00:27 stream=stream, 08:00:27 timeout=timeout, 08:00:27 verify=verify, 08:00:27 cert=cert, 08:00:27 proxies=proxies, 08:00:27 ) 08:00:27 08:00:27 chunked = not (request.body is None or "Content-Length" in request.headers) 08:00:27 08:00:27 if isinstance(timeout, tuple): 08:00:27 try: 08:00:27 connect, read = timeout 08:00:27 timeout = TimeoutSauce(connect=connect, read=read) 08:00:27 except ValueError: 08:00:27 raise ValueError( 08:00:27 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 08:00:27 f"or a single float to set both timeouts to the same value." 
08:00:27 ) 08:00:27 elif isinstance(timeout, TimeoutSauce): 08:00:27 pass 08:00:27 else: 08:00:27 timeout = TimeoutSauce(connect=timeout, read=timeout) 08:00:27 08:00:27 try: 08:00:27 resp = conn.urlopen( 08:00:27 method=request.method, 08:00:27 url=url, 08:00:27 body=request.body, 08:00:27 headers=request.headers, 08:00:27 redirect=False, 08:00:27 assert_same_host=False, 08:00:27 preload_content=False, 08:00:27 decode_content=False, 08:00:27 retries=self.max_retries, 08:00:27 timeout=timeout, 08:00:27 chunked=chunked, 08:00:27 ) 08:00:27 08:00:27 except (ProtocolError, OSError) as err: 08:00:27 raise ConnectionError(err, request=request) 08:00:27 08:00:27 except MaxRetryError as e: 08:00:27 if isinstance(e.reason, ConnectTimeoutError): 08:00:27 # TODO: Remove this in 3.0.0: see #2811 08:00:27 if not isinstance(e.reason, NewConnectionError): 08:00:27 raise ConnectTimeout(e, request=request) 08:00:27 08:00:27 if isinstance(e.reason, ResponseError): 08:00:27 raise RetryError(e, request=request) 08:00:27 08:00:27 if isinstance(e.reason, _ProxyError): 08:00:27 raise ProxyError(e, request=request) 08:00:27 08:00:27 if isinstance(e.reason, _SSLError): 08:00:27 # This branch is for urllib3 v1.22 and later. 
08:00:27 raise SSLError(e, request=request) 08:00:27 08:00:27 > raise ConnectionError(e, request=request) 08:00:27 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8181): Max retries exceeded with url: /rests/operations/transportpce-pce:path-computation-request (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8181): Failed to establish a new connection: [Errno 111] Connection refused")) 08:00:27 08:00:27 ../.tox/testsPCE/lib/python3.11/site-packages/requests/adapters.py:677: ConnectionError 08:00:27 ----------------------------- Captured stdout call ----------------------------- 08:00:27 execution of test_12_path_computation_400G_xpdr_bi_cfg 08:00:27 =========================== short test summary info ============================ 08:00:27 FAILED transportpce_tests/pce/test02_pce_400G.py::TestTransportPCEPce400g::test_10_load_port_mapping_cfg 08:00:27 FAILED transportpce_tests/pce/test02_pce_400G.py::TestTransportPCEPce400g::test_11_load_openroadm_topology_bi_cfg 08:00:27 FAILED transportpce_tests/pce/test02_pce_400G.py::TestTransportPCEPce400g::test_12_path_computation_400G_xpdr_bi_cfg 08:00:27 ERROR transportpce_tests/pce/test02_pce_400G.py::TestTransportPCEPce400g::test_12_path_computation_400G_xpdr_bi_cfg 08:00:27 3 failed, 9 passed, 1 error in 41.26s 08:00:27 testsPCE: exit 1 (155.16 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh pce pid=4269 08:00:27 + tox_status=143 08:00:27 + echo '---> Completed tox runs' 08:00:27 ---> Completed tox runs 08:00:27 + for i in .tox/*/log 08:00:27 ++ echo .tox/build_karaf_tests121/log 08:00:27 ++ awk -F/ '{print $2}' 08:00:27 + tox_env=build_karaf_tests121 08:00:27 + cp -r .tox/build_karaf_tests121/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests121 08:00:27 + for i in .tox/*/log 08:00:27 ++ echo .tox/build_karaf_tests190/log 08:00:27 ++ awk -F/ '{print $2}' 08:00:27 + 
tox_env=build_karaf_tests190 08:00:27 + cp -r .tox/build_karaf_tests190/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests190 08:00:36 lf-activate-venv(): INFO: Adding /tmp/venv-g0aN/bin to PATH 08:00:36 INFO: Running in OpenStack, capturing instance metadata 08:00:36 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins16093911339467386702.sh 08:00:36 provisioning config files... 08:00:37 Could not find credentials [logs] for transportpce-tox-verify-transportpce-master #4418 08:00:37 copy managed file [jenkins-log-archives-settings] to file:/w/workspace/transportpce-tox-verify-transportpce-master@tmp/config8525346185862329174tmp 08:00:37 Regular expression run condition: Expression=[^.*logs-s3.*], Label=[odl-logs-s3-cloudfront-index] 08:00:37 Run condition [Regular expression match] enabling perform for step [Provide Configuration files] 08:00:37 provisioning config files... 08:00:37 copy managed file [jenkins-s3-log-ship] to file:/home/jenkins/.aws/credentials 08:00:37 [EnvInject] - Injecting environment variables from a build step. 08:00:37 [EnvInject] - Injecting as environment variables the properties content 08:00:37 SERVER_ID=logs 08:00:37 08:00:37 [EnvInject] - Variables injected successfully. 08:00:37 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins7598273750341987546.sh 08:00:37 ---> create-netrc.sh 08:00:37 WARN: Log server credential not found. 
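The MaxRetryError in the traceback above comes from urllib3's Retry bookkeeping: the pool was created with Retry(total=0), so the very first connection refusal exhausts the budget inside Retry.increment() and the error is wrapped rather than retried. A stdlib-only sketch of that decrement-and-raise pattern; the class and exception names here are illustrative stand-ins, not urllib3's actual API:

```python
# Illustrative re-implementation of the exhaustion logic seen in
# urllib3's Retry.increment() in the traceback above (simplified names).

class RetryExhaustedError(Exception):
    """Stands in for urllib3.exceptions.MaxRetryError."""


class SimpleRetry:
    def __init__(self, total):
        self.total = total

    def is_exhausted(self):
        # urllib3 treats a counter below zero as exhausted, so total=0
        # still permits the one increment that then raises.
        return self.total is not None and self.total < 0

    def increment(self, error=None):
        # Decrement the overall budget and return a fresh object,
        # mirroring Retry.increment(); raise once the budget goes negative.
        total = self.total
        if total is not None:
            total -= 1
        new_retry = SimpleRetry(total)
        if new_retry.is_exhausted():
            raise RetryExhaustedError(error or "retry budget exhausted")
        return new_retry
```

With total=0, as in the failing test, a single connection refusal goes straight from increment() to the raised error, which requests then surfaces as ConnectionError.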
08:00:37 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins11476315145998368458.sh
08:00:37 ---> python-tools-install.sh
08:00:37 Setup pyenv:
08:00:37   system
08:00:37   3.8.20
08:00:37   3.9.20
08:00:37   3.10.15
08:00:37 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version)
08:00:37 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-g0aN from file:/tmp/.os_lf_venv
08:00:37 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
08:00:37 lf-activate-venv(): INFO: Attempting to install with network-safe options...
08:00:39 lf-activate-venv(): INFO: Base packages installed successfully
08:00:39 lf-activate-venv(): INFO: Installing additional packages: lftools
08:00:49 lf-activate-venv(): INFO: Adding /tmp/venv-g0aN/bin to PATH
08:00:49 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins14129729939741166182.sh
08:00:49 ---> sudo-logs.sh
08:00:49 Archiving 'sudo' log..
08:00:49 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins4648906231172220826.sh
08:00:49 ---> job-cost.sh
08:00:49 INFO: Activating Python virtual environment...
08:00:49 Setup pyenv:
08:00:49   system
08:00:49   3.8.20
08:00:49   3.9.20
08:00:49   3.10.15
08:00:49 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version)
08:00:49 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-g0aN from file:/tmp/.os_lf_venv
08:00:49 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
08:00:49 lf-activate-venv(): INFO: Attempting to install with network-safe options...
08:00:51 lf-activate-venv(): INFO: Base packages installed successfully
08:00:51 lf-activate-venv(): INFO: Installing additional packages: zipp==1.1.0 python-openstackclient urllib3~=1.26.15
08:00:57 lf-activate-venv(): INFO: Adding /tmp/venv-g0aN/bin to PATH
08:00:57 INFO: No stack-cost file found
08:00:57 INFO: Instance uptime: 649s
08:00:57 INFO: Fetching instance metadata (attempt 1 of 3)...
08:00:57 DEBUG: URL: http://169.254.169.254/latest/meta-data/instance-type
08:00:57 INFO: Successfully fetched instance metadata
08:00:57 INFO: Instance type: v3-standard-4
08:00:57 INFO: Retrieving pricing info for: v3-standard-4
08:00:57 INFO: Fetching Vexxhost pricing API (attempt 1 of 3)...
08:00:57 DEBUG: URL: https://pricing.vexxhost.net/v1/pricing/v3-standard-4/cost?seconds=649
08:00:58 INFO: Successfully fetched Vexxhost pricing API
08:00:58 INFO: Retrieved cost: 0.11
08:00:58 INFO: Retrieved resource: v3-standard-4
08:00:58 INFO: Creating archive directory: /w/workspace/transportpce-tox-verify-transportpce-master/archives/cost
08:00:58 INFO: Archiving costs to: /w/workspace/transportpce-tox-verify-transportpce-master/archives/cost.csv
08:00:58 INFO: Successfully archived job cost data
08:00:58 DEBUG: Cost data: transportpce-tox-verify-transportpce-master,4418,2026-02-26 08:00:58,v3-standard-4,649,0.11,0.00,ABORTED
08:00:58 [transportpce-tox-verify-transportpce-master] $ /bin/bash -l /tmp/jenkins11882852538543539489.sh
08:00:58 ---> logs-deploy.sh
08:00:58 Setup pyenv:
08:00:58   system
08:00:58   3.8.20
08:00:58   3.9.20
08:00:58   3.10.15
08:00:58 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version)
08:00:58 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-g0aN from file:/tmp/.os_lf_venv
08:00:58 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)
08:00:58 lf-activate-venv(): INFO: Attempting to install with network-safe options...
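The job-cost step above derives its pricing query from the instance type and the uptime in seconds. A stdlib-only sketch of composing that URL; the path pattern is taken from the DEBUG: URL line in the output, while the function name is purely illustrative:

```python
from urllib.parse import urlencode

# Endpoint pattern observed in the job-cost output above.
PRICING_BASE = "https://pricing.vexxhost.net/v1/pricing"


def pricing_url(instance_type: str, uptime_seconds: int) -> str:
    """Compose <base>/<instance-type>/cost?seconds=<uptime> (illustrative helper)."""
    return f"{PRICING_BASE}/{instance_type}/cost?" + urlencode(
        {"seconds": uptime_seconds}
    )


# Reproduces the URL the script logged for a 649-second-old v3-standard-4 instance.
url = pricing_url("v3-standard-4", 649)
```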
08:01:00 lf-activate-venv(): INFO: Base packages installed successfully
08:01:00 lf-activate-venv(): INFO: Installing additional packages: lftools urllib3~=1.26.15
08:01:09 lf-activate-venv(): INFO: Adding /tmp/venv-g0aN/bin to PATH
08:01:09 WARNING: Nexus logging server not set
08:01:09 INFO: S3 path logs/releng/vex-yul-odl-jenkins-1/transportpce-tox-verify-transportpce-master/4418/
08:01:09 INFO: archiving logs to S3
08:01:09 /tmp/venv-g0aN/lib/python3.11/site-packages/requests/__init__.py:113: RequestsDependencyWarning: urllib3 (1.26.20) or chardet (6.0.0.post1)/charset_normalizer (3.4.4) doesn't match a supported version!
08:01:09   warnings.warn(
08:01:10 ---> uname -a:
08:01:10 Linux prd-ubuntu2204-docker-4c-16g-20850 5.15.0-168-generic #178-Ubuntu SMP Fri Jan 9 19:05:03 UTC 2026 x86_64 x86_64 x86_64 GNU/Linux
08:01:10 
08:01:10 
08:01:10 ---> lscpu:
08:01:10 Architecture:           x86_64
08:01:10 CPU op-mode(s):         32-bit, 64-bit
08:01:10 Address sizes:          40 bits physical, 48 bits virtual
08:01:10 Byte Order:             Little Endian
08:01:10 CPU(s):                 4
08:01:10 On-line CPU(s) list:    0-3
08:01:10 Vendor ID:              AuthenticAMD
08:01:10 Model name:             AMD EPYC-Rome Processor
08:01:10 CPU family:             23
08:01:10 Model:                  49
08:01:10 Thread(s) per core:     1
08:01:10 Core(s) per socket:     1
08:01:10 Socket(s):              4
08:01:10 Stepping:               0
08:01:10 BogoMIPS:               5599.99
08:01:10 Flags:                  fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities
08:01:10 Virtualization:         AMD-V
08:01:10 Hypervisor vendor:      KVM
08:01:10 Virtualization type:    full
08:01:10 L1d cache:              128 KiB (4 instances)
08:01:10 L1i cache:              128 KiB (4 instances)
08:01:10 L2 cache:               2 MiB (4 instances)
08:01:10 L3 cache:               64 MiB (4 instances)
08:01:10 NUMA node(s):           1
08:01:10 NUMA node0 CPU(s):      0-3
08:01:10 Vulnerability Gather data sampling:       Not affected
08:01:10 Vulnerability Indirect target selection:  Not affected
08:01:10 Vulnerability Itlb multihit:              Not affected
08:01:10 Vulnerability L1tf:                       Not affected
08:01:10 Vulnerability Mds:                        Not affected
08:01:10 Vulnerability Meltdown:                   Not affected
08:01:10 Vulnerability Mmio stale data:            Not affected
08:01:10 Vulnerability Reg file data sampling:     Not affected
08:01:10 Vulnerability Retbleed:                   Mitigation; untrained return thunk; SMT disabled
08:01:10 Vulnerability Spec rstack overflow:       Mitigation; SMT disabled
08:01:10 Vulnerability Spec store bypass:          Mitigation; Speculative Store Bypass disabled via prctl and seccomp
08:01:10 Vulnerability Spectre v1:                 Mitigation; usercopy/swapgs barriers and __user pointer sanitization
08:01:10 Vulnerability Spectre v2:                 Mitigation; Retpolines; IBPB conditional; STIBP disabled; RSB filling; PBRSB-eIBRS Not affected; BHI Not affected
08:01:10 Vulnerability Srbds:                      Not affected
08:01:10 Vulnerability Tsa:                        Not affected
08:01:10 Vulnerability Tsx async abort:            Not affected
08:01:10 Vulnerability Vmscape:                    Not affected
08:01:10 
08:01:10 
08:01:10 ---> nproc:
08:01:10 4
08:01:10 
08:01:10 
08:01:10 ---> df -h:
08:01:10 Filesystem      Size  Used Avail Use% Mounted on
08:01:10 tmpfs           1.6G  1.1M  1.6G   1% /run
08:01:10 /dev/vda1        78G   16G   62G  21% /
08:01:10 tmpfs           7.9G     0  7.9G   0% /dev/shm
08:01:10 tmpfs           5.0M     0  5.0M   0% /run/lock
08:01:10 /dev/vda15      105M  6.1M   99M   6% /boot/efi
08:01:10 tmpfs           1.6G  4.0K  1.6G   1% /run/user/1001
08:01:10 
08:01:10 
08:01:10 ---> free -m:
08:01:10                total        used        free      shared  buff/cache   available
08:01:10 Mem:           15989         694        4940           4       10353       14951
08:01:10 Swap:           1023           0        1023
08:01:10 
08:01:10 
08:01:10 ---> ip addr:
08:01:10 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000
08:01:10     link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
08:01:10     inet 127.0.0.1/8 scope host lo
08:01:10        valid_lft forever preferred_lft forever
08:01:10     inet6 ::1/128 scope host
08:01:10        valid_lft forever preferred_lft forever
08:01:10 2: ens3: mtu 1458 qdisc mq state UP group default qlen 1000
08:01:10     link/ether fa:16:3e:9a:00:e8 brd ff:ff:ff:ff:ff:ff
08:01:10     altname enp0s3
08:01:10     inet 10.30.171.104/23 metric 100 brd 10.30.171.255 scope global dynamic ens3
08:01:10        valid_lft 85746sec preferred_lft 85746sec
08:01:10     inet6 fe80::f816:3eff:fe9a:e8/64 scope link
08:01:10        valid_lft forever preferred_lft forever
08:01:10 3: docker0: mtu 1458 qdisc noqueue state DOWN group default
08:01:10     link/ether 22:82:e4:b6:04:e9 brd ff:ff:ff:ff:ff:ff
08:01:10     inet 10.250.0.254/24 brd 10.250.0.255 scope global docker0
08:01:10        valid_lft forever preferred_lft forever
08:01:10 
08:01:10 
08:01:10 ---> sar -b -r -n DEV:
08:01:10 Linux 5.15.0-168-generic (prd-ubuntu2204-docker-4c-16g-20850)  02/26/26  _x86_64_  (4 CPU)
08:01:10 
08:01:10 07:50:18     LINUX RESTART  (4 CPU)
08:01:10 
08:01:10 
08:01:10 ---> sar -P ALL:
08:01:10 Linux 5.15.0-168-generic (prd-ubuntu2204-docker-4c-16g-20850)  02/26/26  _x86_64_  (4 CPU)
08:01:10 
08:01:10 07:50:18     LINUX RESTART  (4 CPU)
08:01:10 
08:01:10 
08:01:10 
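For reference, the failing test in this log posts a RESTCONF RPC to the OpenDaylight controller on localhost:8181; the Connection refused errors mean the controller was not listening when the request was issued. A minimal sketch of how such a request URL and body are assembled, based only on the path visible in the traceback (the helper name, base URL, and "input" wrapping are assumptions, not the project's actual test_utils code):

```python
import json

# Controller endpoint as it appears in the traceback (assumed base URL).
RESTCONF_BASE = "http://localhost:8181/rests"


def build_rpc_request(module: str, rpc: str, input_data: dict) -> tuple[str, str]:
    """Build the URL and JSON body for a RESTCONF operations POST (illustrative)."""
    # RESTCONF operations live under /rests/operations/<module>:<rpc>,
    # matching the URL in the MaxRetryError message above.
    url = f"{RESTCONF_BASE}/operations/{module}:{rpc}"
    # RPC arguments are wrapped in an "input" container in the JSON body.
    body = json.dumps({"input": input_data})
    return url, body


url, body = build_rpc_request(
    "transportpce-pce",
    "path-computation-request",
    {"service-name": "service-1", "pce-routing-metric": "hop-count"},
)
```

With the controller down, POSTing to this URL fails exactly as in the log: the kernel refuses the TCP connection ([Errno 111]) before any HTTP exchange takes place, and requests surfaces it as ConnectionError.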