11:50:01 Triggered by Gerrit: https://git.opendaylight.org/gerrit/c/transportpce/+/120829 11:50:01 Running as SYSTEM 11:50:01 [EnvInject] - Loading node environment variables. 11:50:01 Building remotely on prd-ubuntu2204-docker-4c-16g-19296 (ubuntu2204-docker-4c-16g) in workspace /w/workspace/transportpce-tox-verify-transportpce-master 11:50:01 [ssh-agent] Looking for ssh-agent implementation... 11:50:01 [ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine) 11:50:01 $ ssh-agent 11:50:01 SSH_AUTH_SOCK=/tmp/ssh-XXXXXX1AhUJZ/agent.1572 11:50:01 SSH_AGENT_PID=1573 11:50:01 [ssh-agent] Started. 11:50:01 Running ssh-add (command line suppressed) 11:50:01 Identity added: /w/workspace/transportpce-tox-verify-transportpce-master@tmp/private_key_806663084050772068.key (/w/workspace/transportpce-tox-verify-transportpce-master@tmp/private_key_806663084050772068.key) 11:50:01 [ssh-agent] Using credentials jenkins (jenkins-ssh) 11:50:01 The recommended git tool is: NONE 11:50:03 using credential jenkins-ssh 11:50:03 Wiping out workspace first. 11:50:03 Cloning the remote Git repository 11:50:03 Cloning repository git://devvexx.opendaylight.org/mirror/transportpce 11:50:03 > git init /w/workspace/transportpce-tox-verify-transportpce-master # timeout=10 11:50:03 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/transportpce 11:50:03 > git --version # timeout=10 11:50:03 > git --version # 'git version 2.34.1' 11:50:03 using GIT_SSH to set credentials jenkins-ssh 11:50:03 Verifying host key using known hosts file 11:50:03 You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification. 
11:50:03 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/transportpce +refs/heads/*:refs/remotes/origin/* # timeout=10 11:50:07 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/transportpce # timeout=10 11:50:07 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10 11:50:07 > git config remote.origin.url git://devvexx.opendaylight.org/mirror/transportpce # timeout=10 11:50:07 Fetching upstream changes from git://devvexx.opendaylight.org/mirror/transportpce 11:50:07 using GIT_SSH to set credentials jenkins-ssh 11:50:07 Verifying host key using known hosts file 11:50:07 You're using 'Known hosts file' strategy to verify ssh host keys, but your known_hosts file does not exist, please go to 'Manage Jenkins' -> 'Security' -> 'Git Host Key Verification Configuration' and configure host key verification. 11:50:07 > git fetch --tags --force --progress -- git://devvexx.opendaylight.org/mirror/transportpce refs/changes/29/120829/2 # timeout=10 11:50:08 > git rev-parse 4cdd7dbd002e56f4f1b7d31a326ce402e83482f8^{commit} # timeout=10 11:50:08 JENKINS-19022: warning: possible memory leak due to Git plugin usage; see: https://plugins.jenkins.io/git/#remove-git-plugin-buildsbybranch-builddata-script 11:50:08 Checking out Revision 4cdd7dbd002e56f4f1b7d31a326ce402e83482f8 (refs/changes/29/120829/2) 11:50:08 > git config core.sparsecheckout # timeout=10 11:50:08 > git checkout -f 4cdd7dbd002e56f4f1b7d31a326ce402e83482f8 # timeout=10 11:50:08 Commit message: "Support for openconfig 2.0" 11:50:08 > git rev-parse FETCH_HEAD^{commit} # timeout=10 11:50:08 > git rev-list --no-walk 509d781065379100eb9da8d0414bc0043a05ebc0 # timeout=10 11:50:08 > git remote # timeout=10 11:50:08 > git submodule init # timeout=10 11:50:08 > git submodule sync # timeout=10 11:50:08 > git config --get remote.origin.url # timeout=10 11:50:08 > git submodule init # timeout=10 11:50:08 > git config -f .gitmodules --get-regexp 
^submodule\.(.+)\.url # timeout=10 11:50:08 ERROR: No submodules found. 11:50:12 provisioning config files... 11:50:12 copy managed file [npmrc] to file:/home/jenkins/.npmrc 11:50:12 copy managed file [pipconf] to file:/home/jenkins/.config/pip/pip.conf 11:50:12 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins16504089426708404681.sh 11:50:12 ---> python-tools-install.sh 11:50:12 Setup pyenv: 11:50:12 * system (set by /opt/pyenv/version) 11:50:12 * 3.8.20 (set by /opt/pyenv/version) 11:50:12 * 3.9.20 (set by /opt/pyenv/version) 11:50:12 3.10.15 11:50:12 3.11.10 11:50:17 lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-NQwY 11:50:17 lf-activate-venv(): INFO: Save venv in file: /tmp/.os_lf_venv 11:50:17 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) 11:50:17 lf-activate-venv(): INFO: Attempting to install with network-safe options... 11:50:21 lf-activate-venv(): INFO: Base packages installed successfully 11:50:21 lf-activate-venv(): INFO: Installing additional packages: lftools 11:50:46 lf-activate-venv(): INFO: Adding /tmp/venv-NQwY/bin to PATH 11:50:46 Generating Requirements File 11:51:04 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. 11:51:04 httplib2 0.30.2 requires pyparsing<4,>=3.0.4, but you have pyparsing 2.4.7 which is incompatible. 
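The pip resolver warning above (httplib2 0.30.2 requires pyparsing<4,>=3.0.4, but pyparsing 2.4.7 is installed) can be reproduced with a small standalone check; this is an illustrative sketch using `sort -V` for dotted-version comparison, not part of the build scripts:

```shell
# Sketch of the constraint pip reports above: httplib2 0.30.2 declares
# pyparsing>=3.0.4,<4 while the venv has pyparsing 2.4.7.
# sort -V orders dotted version strings, so head -n1 yields the smaller one.
installed="2.4.7"
lower="3.0.4"
upper="4"
conflict=yes
if [ "$(printf '%s\n' "$lower" "$installed" | sort -V | head -n1)" = "$lower" ] &&
   [ "$(printf '%s\n' "$installed" "$upper" | sort -V | head -n1)" = "$installed" ] &&
   [ "$installed" != "$upper" ]; then
    conflict=no
fi
echo "conflict=$conflict"
```

For 2.4.7 the lower-bound test fails, matching the resolver's complaint.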
11:51:04 Python 3.11.10 11:51:04 pip 26.0.1 from /tmp/venv-NQwY/lib/python3.11/site-packages/pip (python 3.11) 11:51:05 appdirs==1.4.4 11:51:05 argcomplete==3.6.3 11:51:05 aspy.yaml==1.3.0 11:51:05 attrs==25.4.0 11:51:05 autopage==0.6.0 11:51:05 beautifulsoup4==4.14.3 11:51:05 boto3==1.42.53 11:51:05 botocore==1.42.53 11:51:05 bs4==0.0.2 11:51:05 certifi==2026.1.4 11:51:05 cffi==2.0.0 11:51:05 cfgv==3.5.0 11:51:05 chardet==5.2.0 11:51:05 charset-normalizer==3.4.4 11:51:05 click==8.3.1 11:51:05 cliff==4.13.2 11:51:05 cmd2==3.2.0 11:51:05 cryptography==3.3.2 11:51:05 debtcollector==3.0.0 11:51:05 decorator==5.2.1 11:51:05 defusedxml==0.7.1 11:51:05 Deprecated==1.3.1 11:51:05 distlib==0.4.0 11:51:05 dnspython==2.8.0 11:51:05 docker==7.1.0 11:51:05 dogpile.cache==1.5.0 11:51:05 durationpy==0.10 11:51:05 email-validator==2.3.0 11:51:05 filelock==3.24.3 11:51:05 future==1.0.0 11:51:05 gitdb==4.0.12 11:51:05 GitPython==3.1.46 11:51:05 httplib2==0.30.2 11:51:05 identify==2.6.16 11:51:05 idna==3.11 11:51:05 importlib-resources==1.5.0 11:51:05 iso8601==2.1.0 11:51:05 Jinja2==3.1.6 11:51:05 jmespath==1.1.0 11:51:05 jsonpatch==1.33 11:51:05 jsonpointer==3.0.0 11:51:05 jsonschema==4.26.0 11:51:05 jsonschema-specifications==2025.9.1 11:51:05 keystoneauth1==5.13.0 11:51:05 kubernetes==35.0.0 11:51:05 lftools==0.37.21 11:51:05 lxml==6.0.2 11:51:05 markdown-it-py==4.0.0 11:51:05 MarkupSafe==3.0.3 11:51:05 mdurl==0.1.2 11:51:05 msgpack==1.1.2 11:51:05 multi_key_dict==2.0.3 11:51:05 munch==4.0.0 11:51:05 netaddr==1.3.0 11:51:05 niet==1.4.2 11:51:05 nodeenv==1.10.0 11:51:05 oauth2client==4.1.3 11:51:05 oauthlib==3.3.1 11:51:05 openstacksdk==4.10.0 11:51:05 os-service-types==1.8.2 11:51:05 osc-lib==4.4.0 11:51:05 oslo.config==10.3.0 11:51:05 oslo.context==6.3.0 11:51:05 oslo.i18n==6.7.2 11:51:05 oslo.log==8.1.0 11:51:05 oslo.serialization==5.9.1 11:51:05 oslo.utils==9.2.0 11:51:05 packaging==26.0 11:51:05 pbr==7.0.3 11:51:05 platformdirs==4.9.2 11:51:05 prettytable==3.17.0 11:51:05 
psutil==7.2.2 11:51:05 pyasn1==0.6.2 11:51:05 pyasn1_modules==0.4.2 11:51:05 pycparser==3.0 11:51:05 pygerrit2==2.0.15 11:51:05 PyGithub==2.8.1 11:51:05 Pygments==2.19.2 11:51:05 PyJWT==2.11.0 11:51:05 PyNaCl==1.6.2 11:51:05 pyparsing==2.4.7 11:51:05 pyperclip==1.11.0 11:51:05 pyrsistent==0.20.0 11:51:05 python-cinderclient==9.8.0 11:51:05 python-dateutil==2.9.0.post0 11:51:05 python-heatclient==5.0.0 11:51:05 python-jenkins==1.8.3 11:51:05 python-keystoneclient==5.7.0 11:51:05 python-magnumclient==4.9.0 11:51:05 python-openstackclient==9.0.0 11:51:05 python-swiftclient==4.9.0 11:51:05 PyYAML==6.0.3 11:51:05 referencing==0.37.0 11:51:05 requests==2.32.5 11:51:05 requests-oauthlib==2.0.0 11:51:05 requestsexceptions==1.4.0 11:51:05 rfc3986==2.0.0 11:51:05 rich==14.3.3 11:51:05 rich-argparse==1.7.2 11:51:05 rpds-py==0.30.0 11:51:05 rsa==4.9.1 11:51:05 ruamel.yaml==0.19.1 11:51:05 ruamel.yaml.clib==0.2.15 11:51:05 s3transfer==0.16.0 11:51:05 simplejson==3.20.2 11:51:05 six==1.17.0 11:51:05 smmap==5.0.2 11:51:05 soupsieve==2.8.3 11:51:05 stevedore==5.6.0 11:51:05 tabulate==0.9.0 11:51:05 toml==0.10.2 11:51:05 tomlkit==0.14.0 11:51:05 tqdm==4.67.3 11:51:05 typing_extensions==4.15.0 11:51:05 tzdata==2025.3 11:51:05 urllib3==1.26.20 11:51:05 virtualenv==20.38.0 11:51:05 wcwidth==0.6.0 11:51:05 websocket-client==1.9.0 11:51:05 wrapt==2.1.1 11:51:05 xdg==6.0.0 11:51:05 xmltodict==1.0.3 11:51:05 yq==3.4.3 11:51:05 [EnvInject] - Injecting environment variables from a build step. 11:51:05 [EnvInject] - Injecting as environment variables the properties content 11:51:05 PYTHON=python3 11:51:05 11:51:05 [EnvInject] - Variables injected successfully. 
11:51:05 [transportpce-tox-verify-transportpce-master] $ /bin/bash -l /tmp/jenkins12502544403337136513.sh 11:51:05 ---> tox-install.sh 11:51:05 + source /home/jenkins/lf-env.sh 11:51:05 + lf-activate-venv --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 11:51:05 ++ mktemp -d /tmp/venv-XXXX 11:51:05 + lf_venv=/tmp/venv-hmUl 11:51:05 + local venv_file=/tmp/.os_lf_venv 11:51:05 + local python=python3 11:51:05 + local options 11:51:05 + local set_path=true 11:51:05 + local install_args= 11:51:05 ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 11:51:05 + options=' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' 11:51:05 + eval set -- ' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' 11:51:05 ++ set -- --venv-file /tmp/.toxenv -- tox virtualenv urllib3~=1.26.15 11:51:05 + true 11:51:05 + case $1 in 11:51:05 + venv_file=/tmp/.toxenv 11:51:05 + shift 2 11:51:05 + true 11:51:05 + case $1 in 11:51:05 + shift 11:51:05 + break 11:51:05 + case $python in 11:51:05 + local pkg_list= 11:51:05 + [[ -d /opt/pyenv ]] 11:51:05 + echo 'Setup pyenv:' 11:51:05 Setup pyenv: 11:51:05 + export PYENV_ROOT=/opt/pyenv 11:51:05 + PYENV_ROOT=/opt/pyenv 11:51:05 + export PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 11:51:05 + PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 11:51:05 + pyenv versions 11:51:05 system 11:51:05 3.8.20 11:51:05 3.9.20 11:51:05 3.10.15 11:51:05 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) 11:51:05 + command -v pyenv 11:51:05 ++ pyenv init - --no-rehash 11:51:05 + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH); 11:51:05 for i in 
${!paths[@]}; do 11:51:05 if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\''; 11:51:05 fi; done; 11:51:05 echo "${paths[*]}"'\'')" 11:51:05 export PATH="/opt/pyenv/shims:${PATH}" 11:51:05 export PYENV_SHELL=bash 11:51:05 source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\'' 11:51:05 pyenv() { 11:51:05 local command 11:51:05 command="${1:-}" 11:51:05 if [ "$#" -gt 0 ]; then 11:51:05 shift 11:51:05 fi 11:51:05 11:51:05 case "$command" in 11:51:05 rehash|shell) 11:51:05 eval "$(pyenv "sh-$command" "$@")" 11:51:05 ;; 11:51:05 *) 11:51:05 command pyenv "$command" "$@" 11:51:05 ;; 11:51:05 esac 11:51:05 }' 11:51:05 +++ bash --norc -ec 'IFS=:; paths=($PATH); 11:51:05 for i in ${!paths[@]}; do 11:51:05 if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\''; 11:51:05 fi; done; 11:51:05 echo "${paths[*]}"' 11:51:05 ++ PATH=/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 11:51:05 ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 11:51:05 ++ PATH=/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 11:51:05 ++ export PYENV_SHELL=bash 11:51:05 ++ PYENV_SHELL=bash 11:51:05 ++ source /opt/pyenv/libexec/../completions/pyenv.bash 11:51:05 +++ complete -F _pyenv pyenv 11:51:05 ++ lf-pyver python3 11:51:05 ++ local py_version_xy=python3 11:51:05 ++ local py_version_xyz= 11:51:05 ++ pyenv versions 11:51:05 ++ local command 11:51:05 ++ command=versions 11:51:05 ++ '[' 1 -gt 0 ']' 11:51:05 ++ shift 11:51:05 ++ case "$command" in 11:51:05 ++ command pyenv versions 11:51:05 ++ sed 's/^[ *]* //' 11:51:05 ++ grep -E '^[0-9.]*[0-9]$' 11:51:05 ++ awk '{ print $1 }' 11:51:05 ++ [[ ! 
-s /tmp/.pyenv_versions ]] 11:51:05 +++ grep '^3' /tmp/.pyenv_versions 11:51:05 +++ sort -V 11:51:05 +++ tail -n 1 11:51:05 ++ py_version_xyz=3.11.10 11:51:05 ++ [[ -z 3.11.10 ]] 11:51:05 ++ echo 3.11.10 11:51:05 ++ return 0 11:51:05 + pyenv local 3.11.10 11:51:05 + local command 11:51:05 + command=local 11:51:05 + '[' 2 -gt 0 ']' 11:51:05 + shift 11:51:05 + case "$command" in 11:51:05 + command pyenv local 3.11.10 11:51:05 + for arg in "$@" 11:51:05 + case $arg in 11:51:05 + pkg_list+='tox ' 11:51:05 + for arg in "$@" 11:51:05 + case $arg in 11:51:05 + pkg_list+='virtualenv ' 11:51:05 + for arg in "$@" 11:51:05 + case $arg in 11:51:05 + pkg_list+='urllib3~=1.26.15 ' 11:51:05 + [[ -f /tmp/.toxenv ]] 11:51:05 + [[ ! -f /tmp/.toxenv ]] 11:51:05 + [[ -n '' ]] 11:51:05 + python3 -m venv /tmp/venv-hmUl 11:51:09 + echo 'lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-hmUl' 11:51:09 lf-activate-venv(): INFO: Creating python3 venv at /tmp/venv-hmUl 11:51:09 + echo /tmp/venv-hmUl 11:51:09 + echo 'lf-activate-venv(): INFO: Save venv in file: /tmp/.toxenv' 11:51:09 lf-activate-venv(): INFO: Save venv in file: /tmp/.toxenv 11:51:09 + echo 'lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)' 11:51:09 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) 11:51:09 + local 'pip_opts=--upgrade --quiet' 11:51:09 + pip_opts='--upgrade --quiet --trusted-host pypi.org' 11:51:09 + pip_opts='--upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org' 11:51:09 + pip_opts='--upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org' 11:51:09 + [[ -n '' ]] 11:51:09 + [[ -n '' ]] 11:51:09 + echo 'lf-activate-venv(): INFO: Attempting to install with network-safe options...' 11:51:09 lf-activate-venv(): INFO: Attempting to install with network-safe options... 
11:51:09 + /tmp/venv-hmUl/bin/python3 -m pip install --upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org pip 'setuptools<66' virtualenv 11:51:13 + echo 'lf-activate-venv(): INFO: Base packages installed successfully' 11:51:13 lf-activate-venv(): INFO: Base packages installed successfully 11:51:13 + [[ -z tox virtualenv urllib3~=1.26.15 ]] 11:51:13 + echo 'lf-activate-venv(): INFO: Installing additional packages: tox virtualenv urllib3~=1.26.15 ' 11:51:13 lf-activate-venv(): INFO: Installing additional packages: tox virtualenv urllib3~=1.26.15 11:51:13 + /tmp/venv-hmUl/bin/python3 -m pip install --upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org --upgrade-strategy eager tox virtualenv urllib3~=1.26.15 11:51:15 + type python3 11:51:15 + true 11:51:15 + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-hmUl/bin to PATH' 11:51:15 lf-activate-venv(): INFO: Adding /tmp/venv-hmUl/bin to PATH 11:51:15 + PATH=/tmp/venv-hmUl/bin:/opt/pyenv/shims:/opt/pyenv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 11:51:15 + return 0 11:51:15 + python3 --version 11:51:15 Python 3.11.10 11:51:15 + python3 -m pip --version 11:51:15 pip 26.0.1 from /tmp/venv-hmUl/lib/python3.11/site-packages/pip (python 3.11) 11:51:15 + python3 -m pip freeze 11:51:16 cachetools==7.0.1 11:51:16 chardet==5.2.0 11:51:16 colorama==0.4.6 11:51:16 distlib==0.4.0 11:51:16 filelock==3.24.3 11:51:16 packaging==26.0 11:51:16 platformdirs==4.9.2 11:51:16 pluggy==1.6.0 11:51:16 pyproject-api==1.10.0 11:51:16 tox==4.42.0 11:51:16 urllib3==1.26.20 11:51:16 virtualenv==20.38.0 11:51:16 [transportpce-tox-verify-transportpce-master] $ /bin/sh -xe /tmp/jenkins10783970820898772947.sh 11:51:16 [EnvInject] - Injecting environment variables from a build step. 
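The `lf-activate-venv` xtrace above shows a standard GNU getopt parsing loop. A reduced, standalone sketch with the same option spec and arguments from the log (assuming GNU getopt from util-linux, as on the Ubuntu builder):

```shell
# Reduced sketch of the option parsing traced in lf-activate-venv above.
# getopt normalizes and quotes the arguments; eval set -- reinstates them.
options=$(getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: \
    -n lf-activate-venv -- --venv-file /tmp/.toxenv tox virtualenv 'urllib3~=1.26.15')
eval set -- "$options"
while true; do
    case $1 in
        --venv-file) venv_file=$2; shift 2 ;;
        --) shift; break ;;
        *) shift ;;
    esac
done
echo "venv_file=$venv_file"
echo "packages=$*"
```

After the loop, `$venv_file` holds `/tmp/.toxenv` and the positional parameters hold the package list, matching the `set -- --venv-file /tmp/.toxenv -- tox virtualenv urllib3~=1.26.15` line in the trace.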
11:51:16 [EnvInject] - Injecting as environment variables the properties content 11:51:16 PARALLEL=True 11:51:16 11:51:16 [EnvInject] - Variables injected successfully. 11:51:16 [transportpce-tox-verify-transportpce-master] $ /bin/bash -l /tmp/jenkins10722505322088531669.sh 11:51:16 ---> tox-run.sh 11:51:16 + PATH=/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 11:51:16 + ARCHIVE_TOX_DIR=/w/workspace/transportpce-tox-verify-transportpce-master/archives/tox 11:51:16 + ARCHIVE_DOC_DIR=/w/workspace/transportpce-tox-verify-transportpce-master/archives/docs 11:51:16 + mkdir -p /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox 11:51:16 + cd /w/workspace/transportpce-tox-verify-transportpce-master/. 11:51:16 + source /home/jenkins/lf-env.sh 11:51:16 + lf-activate-venv --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 11:51:16 ++ mktemp -d /tmp/venv-XXXX 11:51:16 + lf_venv=/tmp/venv-TqEh 11:51:16 + local venv_file=/tmp/.os_lf_venv 11:51:16 + local python=python3 11:51:16 + local options 11:51:16 + local set_path=true 11:51:16 + local install_args= 11:51:16 ++ getopt -o np:v: -l no-path,system-site-packages,python:,venv-file: -n lf-activate-venv -- --venv-file /tmp/.toxenv tox virtualenv urllib3~=1.26.15 11:51:16 + options=' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' 11:51:16 + eval set -- ' --venv-file '\''/tmp/.toxenv'\'' -- '\''tox'\'' '\''virtualenv'\'' '\''urllib3~=1.26.15'\''' 11:51:16 ++ set -- --venv-file /tmp/.toxenv -- tox virtualenv urllib3~=1.26.15 11:51:16 + true 11:51:16 + case $1 in 11:51:16 + venv_file=/tmp/.toxenv 11:51:16 + shift 2 11:51:16 + true 11:51:16 + case $1 in 11:51:16 + shift 11:51:16 + break 11:51:16 + case $python in 11:51:16 + local pkg_list= 11:51:16 + [[ -d /opt/pyenv ]] 11:51:16 + echo 'Setup pyenv:' 11:51:16 Setup pyenv: 11:51:16 + export PYENV_ROOT=/opt/pyenv 11:51:16 
+ PYENV_ROOT=/opt/pyenv 11:51:16 + export PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 11:51:16 + PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 11:51:16 + pyenv versions 11:51:16 system 11:51:16 3.8.20 11:51:16 3.9.20 11:51:16 3.10.15 11:51:16 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) 11:51:16 + command -v pyenv 11:51:16 ++ pyenv init - --no-rehash 11:51:16 + eval 'PATH="$(bash --norc -ec '\''IFS=:; paths=($PATH); 11:51:16 for i in ${!paths[@]}; do 11:51:16 if [[ ${paths[i]} == "'\'''\''/opt/pyenv/shims'\'''\''" ]]; then unset '\''\'\'''\''paths[i]'\''\'\'''\''; 11:51:16 fi; done; 11:51:16 echo "${paths[*]}"'\'')" 11:51:16 export PATH="/opt/pyenv/shims:${PATH}" 11:51:16 export PYENV_SHELL=bash 11:51:16 source '\''/opt/pyenv/libexec/../completions/pyenv.bash'\'' 11:51:16 pyenv() { 11:51:16 local command 11:51:16 command="${1:-}" 11:51:16 if [ "$#" -gt 0 ]; then 11:51:16 shift 11:51:16 fi 11:51:16 11:51:16 case "$command" in 11:51:16 rehash|shell) 11:51:16 eval "$(pyenv "sh-$command" "$@")" 11:51:16 ;; 11:51:16 *) 11:51:16 command pyenv "$command" "$@" 11:51:16 ;; 11:51:16 esac 11:51:16 }' 11:51:16 +++ bash --norc -ec 'IFS=:; paths=($PATH); 11:51:16 for i in ${!paths[@]}; do 11:51:16 if [[ ${paths[i]} == "/opt/pyenv/shims" ]]; then unset '\''paths[i]'\''; 11:51:16 fi; done; 11:51:16 echo "${paths[*]}"' 11:51:16 ++ PATH=/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 11:51:16 ++ export PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 11:51:16 ++ 
PATH=/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 11:51:16 ++ export PYENV_SHELL=bash 11:51:16 ++ PYENV_SHELL=bash 11:51:16 ++ source /opt/pyenv/libexec/../completions/pyenv.bash 11:51:16 +++ complete -F _pyenv pyenv 11:51:16 ++ lf-pyver python3 11:51:16 ++ local py_version_xy=python3 11:51:16 ++ local py_version_xyz= 11:51:16 ++ pyenv versions 11:51:16 ++ local command 11:51:16 ++ command=versions 11:51:16 ++ '[' 1 -gt 0 ']' 11:51:16 ++ shift 11:51:16 ++ case "$command" in 11:51:16 ++ command pyenv versions 11:51:16 ++ sed 's/^[ *]* //' 11:51:16 ++ grep -E '^[0-9.]*[0-9]$' 11:51:16 ++ awk '{ print $1 }' 11:51:16 ++ [[ ! -s /tmp/.pyenv_versions ]] 11:51:16 +++ grep '^3' /tmp/.pyenv_versions 11:51:16 +++ sort -V 11:51:16 +++ tail -n 1 11:51:16 ++ py_version_xyz=3.11.10 11:51:16 ++ [[ -z 3.11.10 ]] 11:51:16 ++ echo 3.11.10 11:51:16 ++ return 0 11:51:16 + pyenv local 3.11.10 11:51:16 + local command 11:51:16 + command=local 11:51:16 + '[' 2 -gt 0 ']' 11:51:16 + shift 11:51:16 + case "$command" in 11:51:16 + command pyenv local 3.11.10 11:51:16 + for arg in "$@" 11:51:16 + case $arg in 11:51:16 + pkg_list+='tox ' 11:51:16 + for arg in "$@" 11:51:16 + case $arg in 11:51:16 + pkg_list+='virtualenv ' 11:51:16 + for arg in "$@" 11:51:16 + case $arg in 11:51:16 + pkg_list+='urllib3~=1.26.15 ' 11:51:16 + [[ -f /tmp/.toxenv ]] 11:51:16 ++ cat /tmp/.toxenv 11:51:16 + lf_venv=/tmp/venv-hmUl 11:51:16 + echo 'lf-activate-venv(): INFO: Reuse venv:/tmp/venv-hmUl from' file:/tmp/.toxenv 11:51:16 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-hmUl from file:/tmp/.toxenv 11:51:16 + echo 'lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv)' 11:51:16 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) 11:51:16 + local 'pip_opts=--upgrade --quiet' 11:51:16 + pip_opts='--upgrade --quiet --trusted-host 
pypi.org' 11:51:16 + pip_opts='--upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org' 11:51:16 + pip_opts='--upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org' 11:51:16 + [[ -n '' ]] 11:51:16 + [[ -n '' ]] 11:51:16 + echo 'lf-activate-venv(): INFO: Attempting to install with network-safe options...' 11:51:16 lf-activate-venv(): INFO: Attempting to install with network-safe options... 11:51:16 + /tmp/venv-hmUl/bin/python3 -m pip install --upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org pip 'setuptools<66' virtualenv 11:51:17 + echo 'lf-activate-venv(): INFO: Base packages installed successfully' 11:51:17 lf-activate-venv(): INFO: Base packages installed successfully 11:51:17 + [[ -z tox virtualenv urllib3~=1.26.15 ]] 11:51:17 + echo 'lf-activate-venv(): INFO: Installing additional packages: tox virtualenv urllib3~=1.26.15 ' 11:51:17 lf-activate-venv(): INFO: Installing additional packages: tox virtualenv urllib3~=1.26.15 11:51:17 + /tmp/venv-hmUl/bin/python3 -m pip install --upgrade --quiet --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org --upgrade-strategy eager tox virtualenv urllib3~=1.26.15 11:51:18 + type python3 11:51:18 + true 11:51:18 + echo 'lf-activate-venv(): INFO: Adding /tmp/venv-hmUl/bin to PATH' 11:51:18 lf-activate-venv(): INFO: Adding /tmp/venv-hmUl/bin to PATH 11:51:18 + PATH=/tmp/venv-hmUl/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 11:51:18 + return 0 11:51:18 + [[ -d /opt/pyenv ]] 11:51:18 + echo '---> Setting up pyenv' 11:51:18 ---> Setting up pyenv 11:51:18 + export PYENV_ROOT=/opt/pyenv 11:51:18 + PYENV_ROOT=/opt/pyenv 11:51:18 + export 
PATH=/opt/pyenv/bin:/tmp/venv-hmUl/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 11:51:18 + PATH=/opt/pyenv/bin:/tmp/venv-hmUl/bin:/opt/pyenv/shims:/opt/pyenv/bin:/home/jenkins/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/opt/puppetlabs/bin 11:51:18 ++ pwd 11:51:18 + PYTHONPATH=/w/workspace/transportpce-tox-verify-transportpce-master 11:51:18 + export PYTHONPATH 11:51:18 + export TOX_TESTENV_PASSENV=PYTHONPATH 11:51:18 + TOX_TESTENV_PASSENV=PYTHONPATH 11:51:18 + tox --version 11:51:18 4.42.0 from /tmp/venv-hmUl/lib/python3.11/site-packages/tox/__init__.py 11:51:18 + PARALLEL=True 11:51:18 + TOX_OPTIONS_LIST= 11:51:18 + [[ -n '' ]] 11:51:18 + case ${PARALLEL,,} in 11:51:18 + TOX_OPTIONS_LIST=' --parallel auto --parallel-live' 11:51:18 + tox --parallel auto --parallel-live 11:51:18 + tee -a /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tox.log 11:51:20 docs: install_deps> python -I -m pip install -r docs/requirements.txt 11:51:20 buildcontroller: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 11:51:20 checkbashisms: freeze> python -m pip freeze --all 11:51:20 docs-linkcheck: install_deps> python -I -m pip install -r docs/requirements.txt 11:51:21 checkbashisms: pip==26.0.1,setuptools==82.0.0 11:51:21 checkbashisms: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./fixCIcentOS8reposMirrors.sh 11:51:21 checkbashisms: commands[1] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sh -c 'command checkbashisms>/dev/null || sudo yum install -y devscripts-checkbashisms || sudo yum install -y devscripts-minimal || sudo yum install -y 
devscripts || sudo yum install -y https://archives.fedoraproject.org/pub/archive/fedora/linux/releases/31/Everything/x86_64/os/Packages/d/devscripts-checkbashisms-2.19.6-2.fc31.x86_64.rpm || (echo "checkbashisms command not found - please install it (e.g. sudo apt-get install devscripts | yum install devscripts-minimal )" >&2 && exit 1)' 11:51:21 checkbashisms: commands[2] /w/workspace/transportpce-tox-verify-transportpce-master/tests> find . -not -path '*/\.*' -name '*.sh' -exec checkbashisms -f '{}' + 11:51:22 checkbashisms: OK ✔ in 3.17 seconds 11:51:22 pre-commit: install_deps> python -I -m pip install pre-commit 11:51:24 pre-commit: freeze> python -m pip freeze --all 11:51:24 pre-commit: cfgv==3.5.0,distlib==0.4.0,filelock==3.24.3,identify==2.6.16,nodeenv==1.10.0,pip==26.0.1,platformdirs==4.9.2,pre_commit==4.5.1,PyYAML==6.0.3,setuptools==82.0.0,virtualenv==20.38.0 11:51:24 pre-commit: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./fixCIcentOS8reposMirrors.sh 11:51:24 pre-commit: commands[1] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sh -c 'which cpan || sudo yum install -y perl-CPAN || (echo "cpan command not found - please install it (e.g. sudo apt-get install perl-modules | yum install perl-CPAN )" >&2 && exit 1)' 11:51:25 /usr/bin/cpan 11:51:25 pre-commit: commands[2] /w/workspace/transportpce-tox-verify-transportpce-master/tests> pre-commit run --all-files --show-diff-on-failure 11:51:25 [WARNING] hook id `remove-tabs` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. 11:51:25 [WARNING] hook id `perltidy` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. 11:51:25 [INFO] Initializing environment for https://github.com/pre-commit/pre-commit-hooks. 
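The `PARALLEL=True` property injected earlier is what became ` --parallel auto --parallel-live` on the tox command line. The mapping can be sketched portably (the trace itself uses bash's `${PARALLEL,,}` lowercasing; `tr` is substituted here so the sketch runs under any POSIX shell):

```shell
# Sketch of tox-run.sh's PARALLEL handling seen in the trace above.
PARALLEL=True
TOX_OPTIONS_LIST=""
case $(echo "$PARALLEL" | tr '[:upper:]' '[:lower:]') in
    true) TOX_OPTIONS_LIST=" --parallel auto --parallel-live" ;;
esac
echo "tox$TOX_OPTIONS_LIST"
```

Any value other than a case-insensitive `true` would leave the option list empty and tox would run environments serially.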
11:51:25 [WARNING] repo `https://github.com/pre-commit/pre-commit-hooks` uses deprecated stage names (commit, push) which will be removed in a future version. Hint: often `pre-commit autoupdate --repo https://github.com/pre-commit/pre-commit-hooks` will fix this. if it does not -- consider reporting an issue to that repo. 11:51:25 [INFO] Initializing environment for https://github.com/jorisroovers/gitlint. 11:51:26 [INFO] Initializing environment for https://github.com/jorisroovers/gitlint:./gitlint-core[trusted-deps]. 11:51:26 [INFO] Initializing environment for https://github.com/Lucas-C/pre-commit-hooks. 11:51:27 [INFO] Initializing environment for https://github.com/pre-commit/mirrors-autopep8. 11:51:27 [INFO] Initializing environment for https://github.com/perltidy/perltidy. 11:51:27 buildcontroller: freeze> python -m pip freeze --all 11:51:27 buildcontroller: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3 11:51:27 buildcontroller: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_controller.sh 11:51:27 + update-java-alternatives -l 11:51:27 java-1.11.0-openjdk-amd64 1111 /usr/lib/jvm/java-1.11.0-openjdk-amd64 11:51:27 java-1.17.0-openjdk-amd64 1711 /usr/lib/jvm/java-1.17.0-openjdk-amd64 11:51:27 java-1.21.0-openjdk-amd64 2111 /usr/lib/jvm/java-1.21.0-openjdk-amd64 11:51:27 + sudo update-java-alternatives -s java-1.21.0-openjdk-amd64 11:51:27 update-alternatives: error: no alternatives for jaotc 11:51:27 update-alternatives: error: no alternatives for rmic 11:51:28 + sed -n ;s/.* version "\(.*\)\.\(.*\)\..*".*$/\1/p; 11:51:28 + java -version 11:51:28 + JAVA_VER=21 11:51:28 + echo 21 11:51:28 21 11:51:28 + javac 
-version 11:51:28 + sed -n ;s/javac \(.*\)\.\(.*\)\..*.*$/\1/p; 11:51:28 [INFO] Installing environment for https://github.com/pre-commit/pre-commit-hooks. 11:51:28 [INFO] Once installed this environment will be reused. 11:51:28 [INFO] This may take a few minutes... 11:51:28 21 11:51:28 ok, java is 21 or newer 11:51:28 + JAVAC_VER=21 11:51:28 + echo 21 11:51:28 + [ 21 -ge 21 ] 11:51:28 + [ 21 -ge 21 ] 11:51:28 + echo ok, java is 21 or newer 11:51:28 + wget -nv https://dlcdn.apache.org/maven/maven-3/3.9.12/binaries/apache-maven-3.9.12-bin.tar.gz -P /tmp 11:51:28 2026-02-20 11:51:28 URL:https://dlcdn.apache.org/maven/maven-3/3.9.12/binaries/apache-maven-3.9.12-bin.tar.gz [9233336/9233336] -> "/tmp/apache-maven-3.9.12-bin.tar.gz" [1] 11:51:28 + sudo mkdir -p /opt 11:51:28 + sudo tar xf /tmp/apache-maven-3.9.12-bin.tar.gz -C /opt 11:51:29 + sudo ln -s /opt/apache-maven-3.9.12 /opt/maven 11:51:29 + sudo ln -s /opt/maven/bin/mvn /usr/bin/mvn 11:51:29 + mvn --version 11:51:29 Apache Maven 3.9.12 (848fbb4bf2d427b72bdb2471c22fced7ebd9a7a1) 11:51:29 Maven home: /opt/maven 11:51:29 Java version: 21.0.9, vendor: Ubuntu, runtime: /usr/lib/jvm/java-21-openjdk-amd64 11:51:29 Default locale: en, platform encoding: UTF-8 11:51:29 OS name: "linux", version: "5.15.0-168-generic", arch: "amd64", family: "unix" 11:51:29 NOTE: Picked up JDK_JAVA_OPTIONS: 11:51:29 --add-opens=java.base/java.io=ALL-UNNAMED 11:51:29 --add-opens=java.base/java.lang=ALL-UNNAMED 11:51:29 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED 11:51:29 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED 11:51:29 --add-opens=java.base/java.net=ALL-UNNAMED 11:51:29 --add-opens=java.base/java.nio=ALL-UNNAMED 11:51:29 --add-opens=java.base/java.nio.charset=ALL-UNNAMED 11:51:29 --add-opens=java.base/java.nio.file=ALL-UNNAMED 11:51:29 --add-opens=java.base/java.util=ALL-UNNAMED 11:51:29 --add-opens=java.base/java.util.jar=ALL-UNNAMED 11:51:29 --add-opens=java.base/java.util.stream=ALL-UNNAMED 11:51:29 
--add-opens=java.base/java.util.zip=ALL-UNNAMED
11:51:29 --add-opens java.base/sun.nio.ch=ALL-UNNAMED
11:51:29 --add-opens java.base/sun.nio.fs=ALL-UNNAMED
11:51:29 -Xlog:disable
11:51:33 [INFO] Installing environment for https://github.com/Lucas-C/pre-commit-hooks.
11:51:33 [INFO] Once installed this environment will be reused.
11:51:33 [INFO] This may take a few minutes...
11:51:40 [INFO] Installing environment for https://github.com/pre-commit/mirrors-autopep8.
11:51:40 [INFO] Once installed this environment will be reused.
11:51:40 [INFO] This may take a few minutes...
11:51:45 [ERROR] Failed to execute goal on project transportpce-api: Could not resolve dependencies for project org.opendaylight.transportpce:transportpce-api:bundle:13.0.0-SNAPSHOT
11:51:45 [ERROR] dependency: org.opendaylight.transportpce.models:openconfig-251203:jar:24.0.0-SNAPSHOT (compile)
11:51:45 [ERROR] Could not find artifact org.opendaylight.transportpce.models:openconfig-251203:jar:24.0.0-SNAPSHOT in opendaylight-snapshot (https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot/)
11:51:45 [ERROR] -> [Help 1]
11:51:45 [ERROR]
11:51:45 [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
11:51:45 [ERROR] Re-run Maven using the -X switch to enable full debug logging.
11:51:45 [ERROR]
11:51:45 [ERROR] For more information about the errors and possible solutions, please read the following articles:
11:51:45 [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
11:51:45 [ERROR]
11:51:45 [ERROR] After correcting the problems, you can resume the build with the command
11:51:45 [ERROR] mvn -rf :transportpce-api
11:51:45 buildcontroller: exit 1 (17.24 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_controller.sh pid=2717
11:51:45 buildcontroller: FAIL ✖ in 25.99 seconds
11:51:45 pylint: install_deps> python -I -m pip install 'pylint>=2.6.0'
11:51:45 [INFO] Installing environment for https://github.com/perltidy/perltidy.
11:51:45 [INFO] Once installed this environment will be reused.
11:51:45 [INFO] This may take a few minutes...
11:51:46 docs: freeze> python -m pip freeze --all
11:51:46 docs-linkcheck: freeze> python -m pip freeze --all
11:51:47 docs: alabaster==1.0.0,attrs==25.4.0,babel==2.18.0,blockdiag==3.0.0,certifi==2026.1.4,charset-normalizer==3.4.4,contourpy==1.3.3,cycler==0.12.1,docutils==0.21.2,fonttools==4.61.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.11,imagesize==1.4.1,Jinja2==3.1.6,jsonschema==3.2.0,kiwisolver==1.4.9,lfdocs_conf==0.10.0,MarkupSafe==3.0.3,matplotlib==3.10.8,numpy==2.4.2,nwdiag==3.0.0,packaging==26.0,pillow==12.1.1,pip==26.0.1,Pygments==2.19.2,pyparsing==3.3.2,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.3,requests==2.32.5,requests-file==1.5.1,roman-numerals==4.1.0,roman-numerals-py==4.1.0,seqdiag==3.0.0,setuptools==82.0.0,six==1.17.0,snowballstemmer==3.0.1,Sphinx==8.2.3,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-tabs==3.4.7,sphinx_rtd_theme==3.1.0,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml
==0.31,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3==2.6.3,webcolors==25.10.0 11:51:47 docs: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sphinx-build -q -W --keep-going -b html -n -d /w/workspace/transportpce-tox-verify-transportpce-master/.tox/docs/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-transportpce-master/docs/_build/html 11:51:47 docs-linkcheck: alabaster==1.0.0,attrs==25.4.0,babel==2.18.0,blockdiag==3.0.0,certifi==2026.1.4,charset-normalizer==3.4.4,contourpy==1.3.3,cycler==0.12.1,docutils==0.21.2,fonttools==4.61.1,funcparserlib==2.0.0a0,future==1.0.0,idna==3.11,imagesize==1.4.1,Jinja2==3.1.6,jsonschema==3.2.0,kiwisolver==1.4.9,lfdocs_conf==0.10.0,MarkupSafe==3.0.3,matplotlib==3.10.8,numpy==2.4.2,nwdiag==3.0.0,packaging==26.0,pillow==12.1.1,pip==26.0.1,Pygments==2.19.2,pyparsing==3.3.2,pyrsistent==0.20.0,python-dateutil==2.9.0.post0,PyYAML==6.0.3,requests==2.32.5,requests-file==1.5.1,roman-numerals==4.1.0,roman-numerals-py==4.1.0,seqdiag==3.0.0,setuptools==82.0.0,six==1.17.0,snowballstemmer==3.0.1,Sphinx==8.2.3,sphinx-bootstrap-theme==0.8.1,sphinx-data-viewer==0.1.5,sphinx-tabs==3.4.7,sphinx_rtd_theme==3.1.0,sphinxcontrib-applehelp==2.0.0,sphinxcontrib-blockdiag==3.0.0,sphinxcontrib-devhelp==2.0.0,sphinxcontrib-htmlhelp==2.1.0,sphinxcontrib-jquery==4.1,sphinxcontrib-jsmath==1.0.1,sphinxcontrib-needs==0.7.9,sphinxcontrib-nwdiag==2.0.0,sphinxcontrib-plantuml==0.31,sphinxcontrib-qthelp==2.0.0,sphinxcontrib-seqdiag==3.0.0,sphinxcontrib-serializinghtml==2.0.0,sphinxcontrib-swaggerdoc==0.1.7,urllib3==2.6.3,webcolors==25.10.0 11:51:47 docs-linkcheck: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> sphinx-build -q -b linkcheck -d /w/workspace/transportpce-tox-verify-transportpce-master/.tox/docs-linkcheck/tmp/doctrees ../docs/ /w/workspace/transportpce-tox-verify-transportpce-master/docs/_build/linkcheck 
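The buildcontroller failure above is a missing snapshot dependency: `org.opendaylight.transportpce.models:openconfig-251203:jar:24.0.0-SNAPSHOT` is not present in the opendaylight-snapshot repository, so the change likely depends on a transportpce-models publication that has not happened yet. A quick sanity check is to fetch the artifact's `maven-metadata.xml` from the repository; the helper below is an illustrative sketch (not part of the job's scripts, the function name is made up) that only builds that URL from the coordinates quoted in the error, using the standard Maven repository layout:

```python
# Illustrative helper (hypothetical, not from the build scripts): compose the
# maven-metadata.xml URL for a SNAPSHOT artifact so it can be probed with
# e.g. curl to see whether the snapshot was ever published.

def snapshot_metadata_url(repo, group_id, artifact_id, version):
    """Standard Maven repository layout: dots in the groupId become slashes."""
    group_path = group_id.replace('.', '/')
    return f"{repo.rstrip('/')}/{group_path}/{artifact_id}/{version}/maven-metadata.xml"

url = snapshot_metadata_url(
    "https://nexus.opendaylight.org/content/repositories/opendaylight.snapshot/",
    "org.opendaylight.transportpce.models",   # coordinates from the [ERROR] lines above
    "openconfig-251203",
    "24.0.0-SNAPSHOT",
)
print(url)
```

A 404 on that URL would confirm the artifact was never deployed, while a 200 with metadata listing older timestamps would point at a version mismatch instead.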
11:51:48 pylint: freeze> python -m pip freeze --all 11:51:48 pylint: astroid==4.0.4,dill==0.4.1,isort==8.0.0,mccabe==0.7.0,pip==26.0.1,platformdirs==4.9.2,pylint==4.0.5,setuptools==82.0.0,tomlkit==0.14.0 11:51:48 pylint: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> find transportpce_tests/ -name '*.py' -exec pylint --fail-under=10 --max-line-length=120 --disable=missing-docstring,import-error --disable=fixme --disable=duplicate-code '--module-rgx=([a-z0-9_]+$)|([0-9.]{1,30}$)' '--method-rgx=(([a-z_][a-zA-Z0-9_]{2,})|(_[a-z0-9_]*)|(__[a-zA-Z][a-zA-Z0-9_]+__))$' '--variable-rgx=[a-zA-Z_][a-zA-Z0-9_]{1,30}$' '{}' + 11:51:49 docs: OK ✔ in 29.94 seconds 11:51:49 build_karaf_tests121: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 11:51:52 docs-linkcheck: OK ✔ in 32.85 seconds 11:51:52 build_karaf_tests221: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 11:51:54 trim trailing whitespace.................................................Passed 11:51:54 Tabs remover.............................................................Passed 11:51:55 autopep8.................................................................build_karaf_tests121: freeze> python -m pip freeze --all 11:51:57 build_karaf_tests121: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3 11:51:57 build_karaf_tests121: 
commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh 11:51:57 build karaf in karaf121 with ./karaf121.env 11:51:57 NOTE: Picked up JDK_JAVA_OPTIONS: 11:51:57 --add-opens=java.base/java.io=ALL-UNNAMED 11:51:57 --add-opens=java.base/java.lang=ALL-UNNAMED 11:51:57 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED 11:51:57 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED 11:51:57 --add-opens=java.base/java.net=ALL-UNNAMED 11:51:57 --add-opens=java.base/java.nio=ALL-UNNAMED 11:51:57 --add-opens=java.base/java.nio.charset=ALL-UNNAMED 11:51:57 --add-opens=java.base/java.nio.file=ALL-UNNAMED 11:51:57 --add-opens=java.base/java.util=ALL-UNNAMED 11:51:57 --add-opens=java.base/java.util.jar=ALL-UNNAMED 11:51:57 --add-opens=java.base/java.util.stream=ALL-UNNAMED 11:51:57 --add-opens=java.base/java.util.zip=ALL-UNNAMED 11:51:57 --add-opens java.base/sun.nio.ch=ALL-UNNAMED 11:51:57 --add-opens java.base/sun.nio.fs=ALL-UNNAMED 11:51:57 -Xlog:disable 11:52:00 Passed 11:52:00 perltidy.................................................................Passed 11:52:01 pre-commit: commands[3] /w/workspace/transportpce-tox-verify-transportpce-master/tests> pre-commit run gitlint-ci --hook-stage manual 11:52:01 build_karaf_tests221: freeze> python -m pip freeze --all 11:52:01 [WARNING] hook id `remove-tabs` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. 11:52:01 [WARNING] hook id `perltidy` uses deprecated stage names (commit) which will be removed in a future version. run: `pre-commit migrate-config` to automatically fix this. 11:52:01 [INFO] Installing environment for https://github.com/jorisroovers/gitlint. 11:52:01 [INFO] Once installed this environment will be reused. 11:52:01 [INFO] This may take a few minutes... 
11:52:01 build_karaf_tests221: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3 11:52:01 build_karaf_tests221: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh 11:52:01 build karaf in karaf221 with ./karaf221.env 11:52:02 NOTE: Picked up JDK_JAVA_OPTIONS: 11:52:02 --add-opens=java.base/java.io=ALL-UNNAMED 11:52:02 --add-opens=java.base/java.lang=ALL-UNNAMED 11:52:02 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED 11:52:02 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED 11:52:02 --add-opens=java.base/java.net=ALL-UNNAMED 11:52:02 --add-opens=java.base/java.nio=ALL-UNNAMED 11:52:02 --add-opens=java.base/java.nio.charset=ALL-UNNAMED 11:52:02 --add-opens=java.base/java.nio.file=ALL-UNNAMED 11:52:02 --add-opens=java.base/java.util=ALL-UNNAMED 11:52:02 --add-opens=java.base/java.util.jar=ALL-UNNAMED 11:52:02 --add-opens=java.base/java.util.stream=ALL-UNNAMED 11:52:02 --add-opens=java.base/java.util.zip=ALL-UNNAMED 11:52:02 --add-opens java.base/sun.nio.ch=ALL-UNNAMED 11:52:02 --add-opens java.base/sun.nio.fs=ALL-UNNAMED 11:52:02 -Xlog:disable 11:52:15 gitlint..................................................................Passed 11:52:15 pre-commit: OK ✔ in 53.04 seconds 11:52:15 build_karaf_tests71: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 11:52:21 11:52:21 ------------------------------------ 11:52:21 Your code has been rated at 10.00/10 11:52:21 11:52:25 pylint: OK ✔ in 40.13 seconds 11:52:25 
build_karaf_tests200: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 11:52:28 build_karaf_tests71: freeze> python -m pip freeze --all 11:52:28 build_karaf_tests71: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3 11:52:28 build_karaf_tests71: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh 11:52:28 build karaf in karaf71 with ./karaf71.env 11:52:28 NOTE: Picked up JDK_JAVA_OPTIONS: 11:52:28 --add-opens=java.base/java.io=ALL-UNNAMED 11:52:28 --add-opens=java.base/java.lang=ALL-UNNAMED 11:52:28 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED 11:52:28 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED 11:52:28 --add-opens=java.base/java.net=ALL-UNNAMED 11:52:28 --add-opens=java.base/java.nio=ALL-UNNAMED 11:52:28 --add-opens=java.base/java.nio.charset=ALL-UNNAMED 11:52:28 --add-opens=java.base/java.nio.file=ALL-UNNAMED 11:52:28 --add-opens=java.base/java.util=ALL-UNNAMED 11:52:28 --add-opens=java.base/java.util.jar=ALL-UNNAMED 11:52:28 --add-opens=java.base/java.util.stream=ALL-UNNAMED 11:52:28 --add-opens=java.base/java.util.zip=ALL-UNNAMED 11:52:28 --add-opens java.base/sun.nio.ch=ALL-UNNAMED 11:52:28 --add-opens java.base/sun.nio.fs=ALL-UNNAMED 11:52:28 -Xlog:disable 11:52:34 build_karaf_tests200: freeze> python -m pip freeze --all 11:52:34 build_karaf_tests200: 
bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3 11:52:34 build_karaf_tests200: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./build_karaf_for_tests.sh 11:52:34 build karaf in karafoc200 with ./karafoc200.env 11:52:34 NOTE: Picked up JDK_JAVA_OPTIONS: 11:52:34 --add-opens=java.base/java.io=ALL-UNNAMED 11:52:34 --add-opens=java.base/java.lang=ALL-UNNAMED 11:52:34 --add-opens=java.base/java.lang.invoke=ALL-UNNAMED 11:52:34 --add-opens=java.base/java.lang.reflect=ALL-UNNAMED 11:52:34 --add-opens=java.base/java.net=ALL-UNNAMED 11:52:34 --add-opens=java.base/java.nio=ALL-UNNAMED 11:52:34 --add-opens=java.base/java.nio.charset=ALL-UNNAMED 11:52:34 --add-opens=java.base/java.nio.file=ALL-UNNAMED 11:52:34 --add-opens=java.base/java.util=ALL-UNNAMED 11:52:34 --add-opens=java.base/java.util.jar=ALL-UNNAMED 11:52:34 --add-opens=java.base/java.util.stream=ALL-UNNAMED 11:52:34 --add-opens=java.base/java.util.zip=ALL-UNNAMED 11:52:34 --add-opens java.base/sun.nio.ch=ALL-UNNAMED 11:52:34 --add-opens java.base/sun.nio.fs=ALL-UNNAMED 11:52:34 -Xlog:disable 11:52:56 build_karaf_tests121: OK ✔ in 1 minute 7.54 seconds 11:52:56 buildlighty: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 11:53:05 build_karaf_tests221: OK ✔ in 1 minute 13.51 seconds 11:53:05 sims: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r 
/w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 11:53:11 build_karaf_tests71: OK ✔ in 55.41 seconds 11:53:11 testsPCE: install_deps> python -I -m pip install gnpy4tpce==2.4.7 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 11:53:14 buildlighty: freeze> python -m pip freeze --all 11:53:14 buildlighty: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3 11:53:14 buildlighty: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/lighty> ./build.sh 11:53:14 sims: freeze> python -m pip freeze --all 11:53:15 NOTE: Picked up JDK_JAVA_OPTIONS: --add-opens=java.base/java.lang=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED 11:53:15 sims: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3 11:53:15 sims: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./install_lightynode.sh 11:53:15 Using lighynode version 22.1.0.7 11:53:15 Installing lightynode device to ./lightynode/lightynode-openroadm-device directory 11:53:18 sims: OK ✔ in 13.16 seconds 11:53:18 tests71: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r 
/w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
11:53:29 tests71: freeze> python -m pip freeze --all
11:53:29 tests71: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3
11:53:29 tests71: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 7.1
11:53:29 using environment variables from ./karaf71.env
11:53:29 pytest -q transportpce_tests/7.1/test01_portmapping.py
11:53:33 [ERROR] COMPILATION ERROR :
11:53:33 [ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/module/TransportPCEImpl.java:[27,52] cannot find symbol
11:53:33 symbol: class OCPortMappingVersion200
11:53:33 location: package org.opendaylight.transportpce.common.mapping
11:53:33 [ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/module/TransportPCEImpl.java:[39,65] cannot find symbol
11:53:33 symbol: class OpenConfigInterfacesImpl200
11:53:33 location: package org.opendaylight.transportpce.common.openconfiginterfaces
11:53:33 [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.13.0:compile (default-compile) on project tpce: Compilation failure: Compilation failure:
11:53:33 [ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/module/TransportPCEImpl.java:[27,52] cannot find symbol
11:53:33 [ERROR] symbol: class OCPortMappingVersion200
11:53:33 [ERROR] location: package org.opendaylight.transportpce.common.mapping
11:53:33 [ERROR] /w/workspace/transportpce-tox-verify-transportpce-master/lighty/src/main/java/io/lighty/controllers/tpce/module/TransportPCEImpl.java:[39,65] cannot find symbol
11:53:33 [ERROR] symbol: class OpenConfigInterfacesImpl200
11:53:33 [ERROR] location: package org.opendaylight.transportpce.common.openconfiginterfaces
11:53:33 [ERROR] -> [Help 1]
11:53:33 [ERROR]
11:53:33 [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
11:53:33 [ERROR] Re-run Maven using the -X switch to enable full debug logging.
11:53:33 [ERROR]
11:53:33 [ERROR] For more information about the errors and possible solutions, please read the following articles:
11:53:33 [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
11:53:33 unzip: cannot find or open target/tpce-bin.zip, target/tpce-bin.zip.zip or target/tpce-bin.zip.ZIP.
11:53:33 build_karaf_tests200: OK ✔ in 1 minute 7.91 seconds
11:53:33 buildlighty: exit 9 (18.46 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/lighty> ./build.sh pid=3974
11:54:11 .buildlighty: FAIL ✖ in 36.81 seconds
11:54:12 testsPCE: freeze> python -m pip freeze --all
11:54:12 .testsPCE: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,click==8.3.1,contourpy==1.3.3,cryptography==3.3.2,cycler==0.12.1,dict2xml==1.7.8,Flask==2.1.3,Flask-Injector==0.14.0,fonttools==4.61.1,gnpy4tpce==2.4.7,idna==3.11,iniconfig==2.3.0,injector==0.24.0,invoke==2.2.1,itsdangerous==2.2.0,Jinja2==3.1.6,kiwisolver==1.4.9,lxml==6.0.2,MarkupSafe==3.0.3,matplotlib==3.10.8,netconf-client==3.5.0,networkx==2.8.8,numpy==1.26.4,packaging==26.0,pandas==1.5.3,paramiko==4.0.0,pbr==5.11.1,pillow==12.1.1,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pyparsing==3.3.2,pytest==9.0.2,python-dateutil==2.9.0.post0,pytz==2025.2,requests==2.32.5,scipy==1.17.0,setuptools==50.3.2,six==1.17.0,urllib3==2.6.3,Werkzeug==2.0.3,xlrd==1.2.0
11:54:13 testsPCE: commands[0]
/w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh pce
11:54:13 pytest -q transportpce_tests/pce/test01_pce.py
11:54:13 .......... [100%]
11:54:25 12 passed in 54.08s
11:54:25 pytest -q transportpce_tests/7.1/test02_otn_renderer.py
11:54:52 .......................................E....................... [100%]
11:57:02 62 passed in 157.51s (0:02:37)
11:57:02 pytest -q transportpce_tests/7.1/test03_renderer_or_modes.py
11:57:36 .................E............................... [100%]
11:59:20 48 passed in 137.16s (0:02:17)
11:59:20 pytest -q transportpce_tests/7.1/test04_renderer_regen_mode.py
11:59:48 ...........E........... [100%]
12:00:36 22 passed in 75.89s (0:01:15)
12:02:13 EEEEEEEEEEEEEEEEE [100%]
12:34:14 ==================================== ERRORS ====================================
12:34:14 _______ ERROR at setup of TestTransportPCEPce.test_01_load_port_mapping ________
12:34:14
12:34:14 cls =
12:34:14
12:34:14     @classmethod
12:34:14     def setUpClass(cls):
12:34:14         # pylint: disable=bare-except
12:34:14         sample_files_parsed = False
12:34:14         time.sleep(20)
12:34:14         try:
12:34:14             TOPO_BI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)),
12:34:14                                             "..", "..", "sample_configs", "honeynode-topo.json")
12:34:14             with open(TOPO_BI_DIR_FILE, 'r', encoding='utf-8') as topo_bi_dir:
12:34:14                 cls.simple_topo_bi_dir_data = topo_bi_dir.read()
12:34:14
12:34:14             TOPO_UNI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)),
12:34:14                                              "..", "..", "sample_configs", "NW-simple-topology.json")
12:34:14
12:34:14             with open(TOPO_UNI_DIR_FILE, 'r', encoding='utf-8') as topo_uni_dir:
12:34:14                 cls.simple_topo_uni_dir_data = topo_uni_dir.read()
12:34:14
12:34:14             TOPO_UNI_DIR_COMPLEX_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)),
12:34:14                                                      "..", "..", "sample_configs", "NW-for-test-5-4.json")
12:34:14             with open(TOPO_UNI_DIR_COMPLEX_FILE, 'r', encoding='utf-8') as topo_uni_dir_complex:
12:34:14                 cls.complex_topo_uni_dir_data = topo_uni_dir_complex.read()
12:34:14             PORT_MAPPING_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)),
12:34:14                                              "..", "..", "sample_configs", "pce_portmapping_121.json")
12:34:14             with open(PORT_MAPPING_FILE, 'r', encoding='utf-8') as port_mapping:
12:34:14                 cls.port_mapping_data = port_mapping.read()
12:34:14             sample_files_parsed = True
12:34:14         except PermissionError as err:
12:34:14             print("Permission Error when trying to read sample files\n", err)
12:34:14             sys.exit(2)
12:34:14         except FileNotFoundError as err:
12:34:14             print("File Not found Error when trying to read sample files\n", err)
12:34:14             sys.exit(2)
12:34:14         except:
12:34:14             print("Unexpected error when trying to read sample files\n", sys.exc_info()[0])
12:34:14             sys.exit(2)
12:34:14         finally:
12:34:14             if sample_files_parsed:
12:34:14                 print("sample files content loaded")
12:34:14
12:34:14 >       cls.processes = test_utils.start_tpce()
12:34:14                         ^^^^^^^^^^^^^^^^^^^^^^^
12:34:14
12:34:14 transportpce_tests/pce/test01_pce.py:93:
12:34:14 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
12:34:14
12:34:14     def start_tpce():
12:34:14         if 'NO_ODL_STARTUP' in os.environ:
12:34:14             print('No OpenDaylight instance to start!')
12:34:14             return []
12:34:14         print('starting OpenDaylight...')
12:34:14         if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True':
12:34:14             process = start_lighty()
12:34:14         else:
12:34:14             process = start_karaf()
12:34:14         if wait_until_log_contains(TPCE_LOG, [LIGHTY_OK_START_MSG, KARAF_OK_START_MSG], time_to_wait=100):
12:34:14             print('OpenDaylight started !')
12:34:14         else:
12:34:14             print('OpenDaylight failed to start !')
12:34:14             shutdown_process(process)
12:34:14             for pid in process_list:
12:34:14                 shutdown_process(pid)
12:34:14 >           sys.exit(1)
12:34:14 E           SystemExit: 1
12:34:14
12:34:14 transportpce_tests/common/test_utils.py:237: SystemExit
12:34:14 ---------------------------- Captured stdout setup
----------------------------- 12:34:14 sample files content loaded 12:34:14 starting OpenDaylight... 12:34:14 starting KARAF (karaf) TransportPCE build... 12:34:14 Pattern not found after 100 seconds! OpenDaylight failed to start ! 12:34:14 ____ ERROR at setup of TestTransportPCEPce.test_02_load_simple_topology_bi _____ 12:34:14 12:34:14 cls = 12:34:14 12:34:14 @classmethod 12:34:14 def setUpClass(cls): 12:34:14 # pylint: disable=bare-except 12:34:14 sample_files_parsed = False 12:34:14 time.sleep(20) 12:34:14 try: 12:34:14 TOPO_BI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "honeynode-topo.json") 12:34:14 with open(TOPO_BI_DIR_FILE, 'r', encoding='utf-8') as topo_bi_dir: 12:34:14 cls.simple_topo_bi_dir_data = topo_bi_dir.read() 12:34:14 12:34:14 TOPO_UNI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "NW-simple-topology.json") 12:34:14 12:34:14 with open(TOPO_UNI_DIR_FILE, 'r', encoding='utf-8') as topo_uni_dir: 12:34:14 cls.simple_topo_uni_dir_data = topo_uni_dir.read() 12:34:14 12:34:14 TOPO_UNI_DIR_COMPLEX_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "NW-for-test-5-4.json") 12:34:14 with open(TOPO_UNI_DIR_COMPLEX_FILE, 'r', encoding='utf-8') as topo_uni_dir_complex: 12:34:14 cls.complex_topo_uni_dir_data = topo_uni_dir_complex.read() 12:34:14 PORT_MAPPING_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "pce_portmapping_121.json") 12:34:14 with open(PORT_MAPPING_FILE, 'r', encoding='utf-8') as port_mapping: 12:34:14 cls.port_mapping_data = port_mapping.read() 12:34:14 sample_files_parsed = True 12:34:14 except PermissionError as err: 12:34:14 print("Permission Error when trying to read sample files\n", err) 12:34:14 sys.exit(2) 12:34:14 except FileNotFoundError as err: 12:34:14 print("File Not found Error when trying to read 
sample files\n", err) 12:34:14 sys.exit(2) 12:34:14 except: 12:34:14 print("Unexpected error when trying to read sample files\n", sys.exc_info()[0]) 12:34:14 sys.exit(2) 12:34:14 finally: 12:34:14 if sample_files_parsed: 12:34:14 print("sample files content loaded") 12:34:14 12:34:14 > cls.processes = test_utils.start_tpce() 12:34:14 ^^^^^^^^^^^^^^^^^^^^^^^ 12:34:14 12:34:14 transportpce_tests/pce/test01_pce.py:93: 12:34:14 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:34:14 12:34:14 def start_tpce(): 12:34:14 if 'NO_ODL_STARTUP' in os.environ: 12:34:14 print('No OpenDaylight instance to start!') 12:34:14 return [] 12:34:14 print('starting OpenDaylight...') 12:34:14 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:34:14 process = start_lighty() 12:34:14 else: 12:34:14 process = start_karaf() 12:34:14 if wait_until_log_contains(TPCE_LOG, [LIGHTY_OK_START_MSG, KARAF_OK_START_MSG], time_to_wait=100): 12:34:14 print('OpenDaylight started !') 12:34:14 else: 12:34:14 print('OpenDaylight failed to start !') 12:34:14 shutdown_process(process) 12:34:14 for pid in process_list: 12:34:14 shutdown_process(pid) 12:34:14 > sys.exit(1) 12:34:14 E SystemExit: 1 12:34:14 12:34:14 transportpce_tests/common/test_utils.py:237: SystemExit 12:34:14 ---------------------------- Captured stdout setup ----------------------------- 12:34:14 sample files content loaded 12:34:14 starting OpenDaylight... 12:34:14 starting KARAF (karaf) TransportPCE build... 12:34:14 Pattern not found after 100 seconds! OpenDaylight failed to start ! 
12:34:14 ___________ ERROR at setup of TestTransportPCEPce.test_03_get_nodeId ___________ 12:34:14 12:34:14 cls = 12:34:14 12:34:14 @classmethod 12:34:14 def setUpClass(cls): 12:34:14 # pylint: disable=bare-except 12:34:14 sample_files_parsed = False 12:34:14 time.sleep(20) 12:34:14 try: 12:34:14 TOPO_BI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "honeynode-topo.json") 12:34:14 with open(TOPO_BI_DIR_FILE, 'r', encoding='utf-8') as topo_bi_dir: 12:34:14 cls.simple_topo_bi_dir_data = topo_bi_dir.read() 12:34:14 12:34:14 TOPO_UNI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "NW-simple-topology.json") 12:34:14 12:34:14 with open(TOPO_UNI_DIR_FILE, 'r', encoding='utf-8') as topo_uni_dir: 12:34:14 cls.simple_topo_uni_dir_data = topo_uni_dir.read() 12:34:14 12:34:14 TOPO_UNI_DIR_COMPLEX_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "NW-for-test-5-4.json") 12:34:14 with open(TOPO_UNI_DIR_COMPLEX_FILE, 'r', encoding='utf-8') as topo_uni_dir_complex: 12:34:14 cls.complex_topo_uni_dir_data = topo_uni_dir_complex.read() 12:34:14 PORT_MAPPING_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "pce_portmapping_121.json") 12:34:14 with open(PORT_MAPPING_FILE, 'r', encoding='utf-8') as port_mapping: 12:34:14 cls.port_mapping_data = port_mapping.read() 12:34:14 sample_files_parsed = True 12:34:14 except PermissionError as err: 12:34:14 print("Permission Error when trying to read sample files\n", err) 12:34:14 sys.exit(2) 12:34:14 except FileNotFoundError as err: 12:34:14 print("File Not found Error when trying to read sample files\n", err) 12:34:14 sys.exit(2) 12:34:14 except: 12:34:14 print("Unexpected error when trying to read sample files\n", sys.exc_info()[0]) 12:34:14 sys.exit(2) 12:34:14 finally: 12:34:14 if sample_files_parsed: 12:34:14 
print("sample files content loaded") 12:34:14 12:34:14 > cls.processes = test_utils.start_tpce() 12:34:14 ^^^^^^^^^^^^^^^^^^^^^^^ 12:34:14 12:34:14 transportpce_tests/pce/test01_pce.py:93: 12:34:14 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:34:14 12:34:14 def start_tpce(): 12:34:14 if 'NO_ODL_STARTUP' in os.environ: 12:34:14 print('No OpenDaylight instance to start!') 12:34:14 return [] 12:34:14 print('starting OpenDaylight...') 12:34:14 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:34:14 process = start_lighty() 12:34:14 else: 12:34:14 process = start_karaf() 12:34:14 if wait_until_log_contains(TPCE_LOG, [LIGHTY_OK_START_MSG, KARAF_OK_START_MSG], time_to_wait=100): 12:34:14 print('OpenDaylight started !') 12:34:14 else: 12:34:14 print('OpenDaylight failed to start !') 12:34:14 shutdown_process(process) 12:34:14 for pid in process_list: 12:34:14 shutdown_process(pid) 12:34:14 > sys.exit(1) 12:34:14 E SystemExit: 1 12:34:14 12:34:14 transportpce_tests/common/test_utils.py:237: SystemExit 12:34:14 ---------------------------- Captured stdout setup ----------------------------- 12:34:14 sample files content loaded 12:34:14 starting OpenDaylight... 12:34:14 starting KARAF (karaf) TransportPCE build... 12:34:14 Pattern not found after 100 seconds! OpenDaylight failed to start ! 
12:34:14 ___________ ERROR at setup of TestTransportPCEPce.test_04_get_linkId ___________ 12:34:14 12:34:14 cls = 12:34:14 12:34:14 @classmethod 12:34:14 def setUpClass(cls): 12:34:14 # pylint: disable=bare-except 12:34:14 sample_files_parsed = False 12:34:14 time.sleep(20) 12:34:14 try: 12:34:14 TOPO_BI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "honeynode-topo.json") 12:34:14 with open(TOPO_BI_DIR_FILE, 'r', encoding='utf-8') as topo_bi_dir: 12:34:14 cls.simple_topo_bi_dir_data = topo_bi_dir.read() 12:34:14 12:34:14 TOPO_UNI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "NW-simple-topology.json") 12:34:14 12:34:14 with open(TOPO_UNI_DIR_FILE, 'r', encoding='utf-8') as topo_uni_dir: 12:34:14 cls.simple_topo_uni_dir_data = topo_uni_dir.read() 12:34:14 12:34:14 TOPO_UNI_DIR_COMPLEX_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "NW-for-test-5-4.json") 12:34:14 with open(TOPO_UNI_DIR_COMPLEX_FILE, 'r', encoding='utf-8') as topo_uni_dir_complex: 12:34:14 cls.complex_topo_uni_dir_data = topo_uni_dir_complex.read() 12:34:14 PORT_MAPPING_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "pce_portmapping_121.json") 12:34:14 with open(PORT_MAPPING_FILE, 'r', encoding='utf-8') as port_mapping: 12:34:14 cls.port_mapping_data = port_mapping.read() 12:34:14 sample_files_parsed = True 12:34:14 except PermissionError as err: 12:34:14 print("Permission Error when trying to read sample files\n", err) 12:34:14 sys.exit(2) 12:34:14 except FileNotFoundError as err: 12:34:14 print("File Not found Error when trying to read sample files\n", err) 12:34:14 sys.exit(2) 12:34:14 except: 12:34:14 print("Unexpected error when trying to read sample files\n", sys.exc_info()[0]) 12:34:14 sys.exit(2) 12:34:14 finally: 12:34:14 if sample_files_parsed: 12:34:14 
12:34:14 ____ ERROR at setup of TestTransportPCEPce.test_05_path_computation_xpdr_bi ____
12:34:14 ____ ERROR at setup of TestTransportPCEPce.test_06_path_computation_rdm_bi _____
12:34:14 ____ ERROR at setup of TestTransportPCEPce.test_07_load_simple_topology_uni ____
12:34:14 ___________ ERROR at setup of TestTransportPCEPce.test_08_get_nodeId ___________
12:34:14 ___________ ERROR at setup of TestTransportPCEPce.test_09_get_linkId ___________
12:34:14 ___ ERROR at setup of TestTransportPCEPce.test_10_path_computation_xpdr_uni ____
12:34:14 ____ ERROR at setup of TestTransportPCEPce.test_11_path_computation_rdm_uni ____
12:34:14 _____ ERROR at setup of TestTransportPCEPce.test_12_load_complex_topology ______
12:34:14 ___________ ERROR at setup of TestTransportPCEPce.test_13_get_nodeId ___________
12:34:14 _____ ERROR at setup of TestTransportPCEPce.test_14_fail_path_computation ______
12:34:14 ___ ERROR at setup of TestTransportPCEPce.test_15_success1_path_computation ____ 12:34:14 12:34:14 cls = 12:34:14 12:34:14 @classmethod 12:34:14 def setUpClass(cls): 12:34:14 # pylint: disable=bare-except 12:34:14 sample_files_parsed = False 12:34:14 time.sleep(20) 12:34:14 try: 12:34:14 TOPO_BI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "honeynode-topo.json") 12:34:14 with open(TOPO_BI_DIR_FILE, 'r', encoding='utf-8') as topo_bi_dir: 12:34:14 cls.simple_topo_bi_dir_data = topo_bi_dir.read() 12:34:14 12:34:14 TOPO_UNI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "NW-simple-topology.json") 12:34:14 12:34:14 with open(TOPO_UNI_DIR_FILE, 'r', encoding='utf-8') as topo_uni_dir: 12:34:14 cls.simple_topo_uni_dir_data = topo_uni_dir.read() 12:34:14 12:34:14 TOPO_UNI_DIR_COMPLEX_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "NW-for-test-5-4.json") 12:34:14 with open(TOPO_UNI_DIR_COMPLEX_FILE, 'r', encoding='utf-8') as topo_uni_dir_complex: 12:34:14 cls.complex_topo_uni_dir_data = topo_uni_dir_complex.read() 12:34:14 PORT_MAPPING_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "pce_portmapping_121.json") 12:34:14 with open(PORT_MAPPING_FILE, 'r', encoding='utf-8') as port_mapping: 12:34:14 cls.port_mapping_data = port_mapping.read() 12:34:14 sample_files_parsed = True 12:34:14 except PermissionError as err: 12:34:14 print("Permission Error when trying to read sample files\n", err) 12:34:14 sys.exit(2) 12:34:14 except FileNotFoundError as err: 12:34:14 print("File Not found Error when trying to read sample files\n", err) 12:34:14 sys.exit(2) 12:34:14 except: 12:34:14 print("Unexpected error when trying to read sample files\n", sys.exc_info()[0]) 12:34:14 sys.exit(2) 12:34:14 finally: 12:34:14 if sample_files_parsed: 12:34:14 
print("sample files content loaded") 12:34:14 12:34:14 > cls.processes = test_utils.start_tpce() 12:34:14 ^^^^^^^^^^^^^^^^^^^^^^^ 12:34:14 12:34:14 transportpce_tests/pce/test01_pce.py:93: 12:34:14 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:34:14 12:34:14 def start_tpce(): 12:34:14 if 'NO_ODL_STARTUP' in os.environ: 12:34:14 print('No OpenDaylight instance to start!') 12:34:14 return [] 12:34:14 print('starting OpenDaylight...') 12:34:14 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:34:14 process = start_lighty() 12:34:14 else: 12:34:14 process = start_karaf() 12:34:14 if wait_until_log_contains(TPCE_LOG, [LIGHTY_OK_START_MSG, KARAF_OK_START_MSG], time_to_wait=100): 12:34:14 print('OpenDaylight started !') 12:34:14 else: 12:34:14 print('OpenDaylight failed to start !') 12:34:14 shutdown_process(process) 12:34:14 for pid in process_list: 12:34:14 shutdown_process(pid) 12:34:14 > sys.exit(1) 12:34:14 E SystemExit: 1 12:34:14 12:34:14 transportpce_tests/common/test_utils.py:237: SystemExit 12:34:14 ---------------------------- Captured stdout setup ----------------------------- 12:34:14 sample files content loaded 12:34:14 starting OpenDaylight... 12:34:14 starting KARAF (karaf) TransportPCE build... 12:34:14 Pattern not found after 100 seconds! OpenDaylight failed to start ! 
12:34:14 ___ ERROR at setup of TestTransportPCEPce.test_16_success2_path_computation ____ 12:34:14 12:34:14 cls = 12:34:14 12:34:14 @classmethod 12:34:14 def setUpClass(cls): 12:34:14 # pylint: disable=bare-except 12:34:14 sample_files_parsed = False 12:34:14 time.sleep(20) 12:34:14 try: 12:34:14 TOPO_BI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "honeynode-topo.json") 12:34:14 with open(TOPO_BI_DIR_FILE, 'r', encoding='utf-8') as topo_bi_dir: 12:34:14 cls.simple_topo_bi_dir_data = topo_bi_dir.read() 12:34:14 12:34:14 TOPO_UNI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "NW-simple-topology.json") 12:34:14 12:34:14 with open(TOPO_UNI_DIR_FILE, 'r', encoding='utf-8') as topo_uni_dir: 12:34:14 cls.simple_topo_uni_dir_data = topo_uni_dir.read() 12:34:14 12:34:14 TOPO_UNI_DIR_COMPLEX_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "NW-for-test-5-4.json") 12:34:14 with open(TOPO_UNI_DIR_COMPLEX_FILE, 'r', encoding='utf-8') as topo_uni_dir_complex: 12:34:14 cls.complex_topo_uni_dir_data = topo_uni_dir_complex.read() 12:34:14 PORT_MAPPING_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "pce_portmapping_121.json") 12:34:14 with open(PORT_MAPPING_FILE, 'r', encoding='utf-8') as port_mapping: 12:34:14 cls.port_mapping_data = port_mapping.read() 12:34:14 sample_files_parsed = True 12:34:14 except PermissionError as err: 12:34:14 print("Permission Error when trying to read sample files\n", err) 12:34:14 sys.exit(2) 12:34:14 except FileNotFoundError as err: 12:34:14 print("File Not found Error when trying to read sample files\n", err) 12:34:14 sys.exit(2) 12:34:14 except: 12:34:14 print("Unexpected error when trying to read sample files\n", sys.exc_info()[0]) 12:34:14 sys.exit(2) 12:34:14 finally: 12:34:14 if sample_files_parsed: 12:34:14 
print("sample files content loaded") 12:34:14 12:34:14 > cls.processes = test_utils.start_tpce() 12:34:14 ^^^^^^^^^^^^^^^^^^^^^^^ 12:34:14 12:34:14 transportpce_tests/pce/test01_pce.py:93: 12:34:14 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:34:14 12:34:14 def start_tpce(): 12:34:14 if 'NO_ODL_STARTUP' in os.environ: 12:34:14 print('No OpenDaylight instance to start!') 12:34:14 return [] 12:34:14 print('starting OpenDaylight...') 12:34:14 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:34:14 process = start_lighty() 12:34:14 else: 12:34:14 process = start_karaf() 12:34:14 if wait_until_log_contains(TPCE_LOG, [LIGHTY_OK_START_MSG, KARAF_OK_START_MSG], time_to_wait=100): 12:34:14 print('OpenDaylight started !') 12:34:14 else: 12:34:14 print('OpenDaylight failed to start !') 12:34:14 shutdown_process(process) 12:34:14 for pid in process_list: 12:34:14 shutdown_process(pid) 12:34:14 > sys.exit(1) 12:34:14 E SystemExit: 1 12:34:14 12:34:14 transportpce_tests/common/test_utils.py:237: SystemExit 12:34:14 ---------------------------- Captured stdout setup ----------------------------- 12:34:14 sample files content loaded 12:34:14 starting OpenDaylight... 12:34:14 starting KARAF (karaf) TransportPCE build... 12:34:14 Pattern not found after 100 seconds! OpenDaylight failed to start ! 
12:34:14 ___ ERROR at setup of TestTransportPCEPce.test_17_success3_path_computation ____ 12:34:14 12:34:14 cls = 12:34:14 12:34:14 @classmethod 12:34:14 def setUpClass(cls): 12:34:14 # pylint: disable=bare-except 12:34:14 sample_files_parsed = False 12:34:14 time.sleep(20) 12:34:14 try: 12:34:14 TOPO_BI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "honeynode-topo.json") 12:34:14 with open(TOPO_BI_DIR_FILE, 'r', encoding='utf-8') as topo_bi_dir: 12:34:14 cls.simple_topo_bi_dir_data = topo_bi_dir.read() 12:34:14 12:34:14 TOPO_UNI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "NW-simple-topology.json") 12:34:14 12:34:14 with open(TOPO_UNI_DIR_FILE, 'r', encoding='utf-8') as topo_uni_dir: 12:34:14 cls.simple_topo_uni_dir_data = topo_uni_dir.read() 12:34:14 12:34:14 TOPO_UNI_DIR_COMPLEX_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "NW-for-test-5-4.json") 12:34:14 with open(TOPO_UNI_DIR_COMPLEX_FILE, 'r', encoding='utf-8') as topo_uni_dir_complex: 12:34:14 cls.complex_topo_uni_dir_data = topo_uni_dir_complex.read() 12:34:14 PORT_MAPPING_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "pce_portmapping_121.json") 12:34:14 with open(PORT_MAPPING_FILE, 'r', encoding='utf-8') as port_mapping: 12:34:14 cls.port_mapping_data = port_mapping.read() 12:34:14 sample_files_parsed = True 12:34:14 except PermissionError as err: 12:34:14 print("Permission Error when trying to read sample files\n", err) 12:34:14 sys.exit(2) 12:34:14 except FileNotFoundError as err: 12:34:14 print("File Not found Error when trying to read sample files\n", err) 12:34:14 sys.exit(2) 12:34:14 except: 12:34:14 print("Unexpected error when trying to read sample files\n", sys.exc_info()[0]) 12:34:14 sys.exit(2) 12:34:14 finally: 12:34:14 if sample_files_parsed: 12:34:14 
print("sample files content loaded") 12:34:14 12:34:14 > cls.processes = test_utils.start_tpce() 12:34:14 ^^^^^^^^^^^^^^^^^^^^^^^ 12:34:14 12:34:14 transportpce_tests/pce/test01_pce.py:93: 12:34:14 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:34:14 12:34:14 def start_tpce(): 12:34:14 if 'NO_ODL_STARTUP' in os.environ: 12:34:14 print('No OpenDaylight instance to start!') 12:34:14 return [] 12:34:14 print('starting OpenDaylight...') 12:34:14 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:34:14 process = start_lighty() 12:34:14 else: 12:34:14 process = start_karaf() 12:34:14 if wait_until_log_contains(TPCE_LOG, [LIGHTY_OK_START_MSG, KARAF_OK_START_MSG], time_to_wait=100): 12:34:14 print('OpenDaylight started !') 12:34:14 else: 12:34:14 print('OpenDaylight failed to start !') 12:34:14 shutdown_process(process) 12:34:14 for pid in process_list: 12:34:14 shutdown_process(pid) 12:34:14 > sys.exit(1) 12:34:14 E SystemExit: 1 12:34:14 12:34:14 transportpce_tests/common/test_utils.py:237: SystemExit 12:34:14 ---------------------------- Captured stdout setup ----------------------------- 12:34:14 sample files content loaded 12:34:14 starting OpenDaylight... 12:34:14 starting KARAF (karaf) TransportPCE build... 12:34:14 Pattern not found after 100 seconds! OpenDaylight failed to start ! 
12:34:14 _ ERROR at setup of TestTransportPCEPce.test_18_path_computation_before_oms_attribute_deletion _ 12:34:14 12:34:14 cls = 12:34:14 12:34:14 @classmethod 12:34:14 def setUpClass(cls): 12:34:14 # pylint: disable=bare-except 12:34:14 sample_files_parsed = False 12:34:14 time.sleep(20) 12:34:14 try: 12:34:14 TOPO_BI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "honeynode-topo.json") 12:34:14 with open(TOPO_BI_DIR_FILE, 'r', encoding='utf-8') as topo_bi_dir: 12:34:14 cls.simple_topo_bi_dir_data = topo_bi_dir.read() 12:34:14 12:34:14 TOPO_UNI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "NW-simple-topology.json") 12:34:14 12:34:14 with open(TOPO_UNI_DIR_FILE, 'r', encoding='utf-8') as topo_uni_dir: 12:34:14 cls.simple_topo_uni_dir_data = topo_uni_dir.read() 12:34:14 12:34:14 TOPO_UNI_DIR_COMPLEX_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "NW-for-test-5-4.json") 12:34:14 with open(TOPO_UNI_DIR_COMPLEX_FILE, 'r', encoding='utf-8') as topo_uni_dir_complex: 12:34:14 cls.complex_topo_uni_dir_data = topo_uni_dir_complex.read() 12:34:14 PORT_MAPPING_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "pce_portmapping_121.json") 12:34:14 with open(PORT_MAPPING_FILE, 'r', encoding='utf-8') as port_mapping: 12:34:14 cls.port_mapping_data = port_mapping.read() 12:34:14 sample_files_parsed = True 12:34:14 except PermissionError as err: 12:34:14 print("Permission Error when trying to read sample files\n", err) 12:34:14 sys.exit(2) 12:34:14 except FileNotFoundError as err: 12:34:14 print("File Not found Error when trying to read sample files\n", err) 12:34:14 sys.exit(2) 12:34:14 except: 12:34:14 print("Unexpected error when trying to read sample files\n", sys.exc_info()[0]) 12:34:14 sys.exit(2) 12:34:14 finally: 12:34:14 if 
sample_files_parsed: 12:34:14 print("sample files content loaded") 12:34:14 12:34:14 > cls.processes = test_utils.start_tpce() 12:34:14 ^^^^^^^^^^^^^^^^^^^^^^^ 12:34:14 12:34:14 transportpce_tests/pce/test01_pce.py:93: 12:34:14 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:34:14 12:34:14 def start_tpce(): 12:34:14 if 'NO_ODL_STARTUP' in os.environ: 12:34:14 print('No OpenDaylight instance to start!') 12:34:14 return [] 12:34:14 print('starting OpenDaylight...') 12:34:14 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:34:14 process = start_lighty() 12:34:14 else: 12:34:14 process = start_karaf() 12:34:14 if wait_until_log_contains(TPCE_LOG, [LIGHTY_OK_START_MSG, KARAF_OK_START_MSG], time_to_wait=100): 12:34:14 print('OpenDaylight started !') 12:34:14 else: 12:34:14 print('OpenDaylight failed to start !') 12:34:14 shutdown_process(process) 12:34:14 for pid in process_list: 12:34:14 shutdown_process(pid) 12:34:14 > sys.exit(1) 12:34:14 E SystemExit: 1 12:34:14 12:34:14 transportpce_tests/common/test_utils.py:237: SystemExit 12:34:14 ---------------------------- Captured stdout setup ----------------------------- 12:34:14 sample files content loaded 12:34:14 starting OpenDaylight... 12:34:14 starting KARAF (karaf) TransportPCE build... 12:34:14 Pattern not found after 100 seconds! OpenDaylight failed to start ! 
12:34:14 _ ERROR at setup of TestTransportPCEPce.test_19_delete_oms_attribute_in_openroadm13toopenroadm12_link _ 12:34:14 12:34:14 cls = 12:34:14 12:34:14 @classmethod 12:34:14 def setUpClass(cls): 12:34:14 # pylint: disable=bare-except 12:34:14 sample_files_parsed = False 12:34:14 time.sleep(20) 12:34:14 try: 12:34:14 TOPO_BI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "honeynode-topo.json") 12:34:14 with open(TOPO_BI_DIR_FILE, 'r', encoding='utf-8') as topo_bi_dir: 12:34:14 cls.simple_topo_bi_dir_data = topo_bi_dir.read() 12:34:14 12:34:14 TOPO_UNI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "NW-simple-topology.json") 12:34:14 12:34:14 with open(TOPO_UNI_DIR_FILE, 'r', encoding='utf-8') as topo_uni_dir: 12:34:14 cls.simple_topo_uni_dir_data = topo_uni_dir.read() 12:34:14 12:34:14 TOPO_UNI_DIR_COMPLEX_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "NW-for-test-5-4.json") 12:34:14 with open(TOPO_UNI_DIR_COMPLEX_FILE, 'r', encoding='utf-8') as topo_uni_dir_complex: 12:34:14 cls.complex_topo_uni_dir_data = topo_uni_dir_complex.read() 12:34:14 PORT_MAPPING_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "pce_portmapping_121.json") 12:34:14 with open(PORT_MAPPING_FILE, 'r', encoding='utf-8') as port_mapping: 12:34:14 cls.port_mapping_data = port_mapping.read() 12:34:14 sample_files_parsed = True 12:34:14 except PermissionError as err: 12:34:14 print("Permission Error when trying to read sample files\n", err) 12:34:14 sys.exit(2) 12:34:14 except FileNotFoundError as err: 12:34:14 print("File Not found Error when trying to read sample files\n", err) 12:34:14 sys.exit(2) 12:34:14 except: 12:34:14 print("Unexpected error when trying to read sample files\n", sys.exc_info()[0]) 12:34:14 sys.exit(2) 12:34:14 finally: 12:34:14 if 
sample_files_parsed: 12:34:14 print("sample files content loaded") 12:34:14 12:34:14 > cls.processes = test_utils.start_tpce() 12:34:14 ^^^^^^^^^^^^^^^^^^^^^^^ 12:34:14 12:34:14 transportpce_tests/pce/test01_pce.py:93: 12:34:14 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:34:14 12:34:14 def start_tpce(): 12:34:14 if 'NO_ODL_STARTUP' in os.environ: 12:34:14 print('No OpenDaylight instance to start!') 12:34:14 return [] 12:34:14 print('starting OpenDaylight...') 12:34:14 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:34:14 process = start_lighty() 12:34:14 else: 12:34:14 process = start_karaf() 12:34:14 if wait_until_log_contains(TPCE_LOG, [LIGHTY_OK_START_MSG, KARAF_OK_START_MSG], time_to_wait=100): 12:34:14 print('OpenDaylight started !') 12:34:14 else: 12:34:14 print('OpenDaylight failed to start !') 12:34:14 shutdown_process(process) 12:34:14 for pid in process_list: 12:34:14 shutdown_process(pid) 12:34:14 > sys.exit(1) 12:34:14 E SystemExit: 1 12:34:14 12:34:14 transportpce_tests/common/test_utils.py:237: SystemExit 12:34:14 ---------------------------- Captured stdout setup ----------------------------- 12:34:14 sample files content loaded 12:34:14 starting OpenDaylight... 12:34:14 starting KARAF (karaf) TransportPCE build... 12:34:14 Pattern not found after 100 seconds! OpenDaylight failed to start ! 
12:34:14 _ ERROR at setup of TestTransportPCEPce.test_20_path_computation_after_oms_attribute_deletion _ 12:34:14 12:34:14 cls = 12:34:14 12:34:14 @classmethod 12:34:14 def setUpClass(cls): 12:34:14 # pylint: disable=bare-except 12:34:14 sample_files_parsed = False 12:34:14 time.sleep(20) 12:34:14 try: 12:34:14 TOPO_BI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "honeynode-topo.json") 12:34:14 with open(TOPO_BI_DIR_FILE, 'r', encoding='utf-8') as topo_bi_dir: 12:34:14 cls.simple_topo_bi_dir_data = topo_bi_dir.read() 12:34:14 12:34:14 TOPO_UNI_DIR_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "NW-simple-topology.json") 12:34:14 12:34:14 with open(TOPO_UNI_DIR_FILE, 'r', encoding='utf-8') as topo_uni_dir: 12:34:14 cls.simple_topo_uni_dir_data = topo_uni_dir.read() 12:34:14 12:34:14 TOPO_UNI_DIR_COMPLEX_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "NW-for-test-5-4.json") 12:34:14 with open(TOPO_UNI_DIR_COMPLEX_FILE, 'r', encoding='utf-8') as topo_uni_dir_complex: 12:34:14 cls.complex_topo_uni_dir_data = topo_uni_dir_complex.read() 12:34:14 PORT_MAPPING_FILE = os.path.join(os.path.dirname(os.path.realpath(__file__)), 12:34:14 "..", "..", "sample_configs", "pce_portmapping_121.json") 12:34:14 with open(PORT_MAPPING_FILE, 'r', encoding='utf-8') as port_mapping: 12:34:14 cls.port_mapping_data = port_mapping.read() 12:34:14 sample_files_parsed = True 12:34:14 except PermissionError as err: 12:34:14 print("Permission Error when trying to read sample files\n", err) 12:34:14 sys.exit(2) 12:34:14 except FileNotFoundError as err: 12:34:14 print("File Not found Error when trying to read sample files\n", err) 12:34:14 sys.exit(2) 12:34:14 except: 12:34:14 print("Unexpected error when trying to read sample files\n", sys.exc_info()[0]) 12:34:14 sys.exit(2) 12:34:14 finally: 12:34:14 if 
sample_files_parsed: 12:34:14 print("sample files content loaded") 12:34:14 12:34:14 > cls.processes = test_utils.start_tpce() 12:34:14 ^^^^^^^^^^^^^^^^^^^^^^^ 12:34:14 12:34:14 transportpce_tests/pce/test01_pce.py:93: 12:34:14 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:34:14 12:34:14 def start_tpce(): 12:34:14 if 'NO_ODL_STARTUP' in os.environ: 12:34:14 print('No OpenDaylight instance to start!') 12:34:14 return [] 12:34:14 print('starting OpenDaylight...') 12:34:14 if 'USE_LIGHTY' in os.environ and os.environ['USE_LIGHTY'] == 'True': 12:34:14 process = start_lighty() 12:34:14 else: 12:34:14 process = start_karaf() 12:34:14 if wait_until_log_contains(TPCE_LOG, [LIGHTY_OK_START_MSG, KARAF_OK_START_MSG], time_to_wait=100): 12:34:14 print('OpenDaylight started !') 12:34:14 else: 12:34:14 print('OpenDaylight failed to start !') 12:34:14 shutdown_process(process) 12:34:14 for pid in process_list: 12:34:14 shutdown_process(pid) 12:34:14 > sys.exit(1) 12:34:14 E SystemExit: 1 12:34:14 12:34:14 transportpce_tests/common/test_utils.py:237: SystemExit 12:34:14 ---------------------------- Captured stdout setup ----------------------------- 12:34:14 sample files content loaded 12:34:14 starting OpenDaylight... 12:34:14 starting KARAF (karaf) TransportPCE build... 12:34:14 Pattern not found after 100 seconds! OpenDaylight failed to start ! 
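The `setUpClass` shown in each traceback repeats the same open/read pattern four times (bidirectional topology, simple unidirectional topology, complex topology, port mapping) and exits with status 2 on any read error. The repetition could be collapsed into one small helper along these lines; this is only an illustrative sketch mirroring the behaviour in the traceback, not the project's code, and `load_sample_configs` is a made-up name:

```python
import os
import sys


def load_sample_configs(base_dir, filenames):
    """Read each sample file and return {filename: content}.

    Exits with status 2 on permission or missing-file errors, mirroring
    the setUpClass in the traceback above.
    """
    sample_dir = os.path.join(base_dir, "..", "..", "sample_configs")
    contents = {}
    for name in filenames:
        path = os.path.join(sample_dir, name)
        try:
            with open(path, 'r', encoding='utf-8') as handle:
                contents[name] = handle.read()
        except (PermissionError, FileNotFoundError) as err:
            print("Error when trying to read sample files\n", err)
            sys.exit(2)
    print("sample files content loaded")
    return contents
```

Called with `os.path.dirname(os.path.realpath(__file__))` as `base_dir` and the four JSON filenames from the traceback, it would produce the same "sample files content loaded" line seen in the captured stdout.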
12:34:14 =========================== short test summary info ============================ 12:34:14 ERROR transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_01_load_port_mapping 12:34:14 ERROR transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_02_load_simple_topology_bi 12:34:14 ERROR transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_03_get_nodeId 12:34:14 ERROR transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_04_get_linkId 12:34:14 ERROR transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_05_path_computation_xpdr_bi 12:34:14 ERROR transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_06_path_computation_rdm_bi 12:34:14 ERROR transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_07_load_simple_topology_uni 12:34:14 ERROR transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_08_get_nodeId 12:34:14 ERROR transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_09_get_linkId 12:34:14 ERROR transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_10_path_computation_xpdr_uni 12:34:14 ERROR transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_11_path_computation_rdm_uni 12:34:14 ERROR transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_12_load_complex_topology 12:34:14 ERROR transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_13_get_nodeId 12:34:14 ERROR transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_14_fail_path_computation 12:34:14 ERROR transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_15_success1_path_computation 12:34:14 ERROR transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_16_success2_path_computation 12:34:14 ERROR transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_17_success3_path_computation 12:34:14 ERROR transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_18_path_computation_before_oms_attribute_deletion 12:34:14 ERROR 
transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_19_delete_oms_attribute_in_openroadm13toopenroadm12_link 12:34:14 ERROR transportpce_tests/pce/test01_pce.py::TestTransportPCEPce::test_20_path_computation_after_oms_attribute_deletion 12:34:14 20 errors in 2401.58s (0:40:01) 12:34:15 tests71: OK ✔ in 7 minutes 17.98 seconds 12:34:15 testsPCE: exit 1 (2401.89 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh pce pid=4869 12:34:15 testsPCE: FAIL ✖ in 41 minutes 4.29 seconds 12:34:15 tests121: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 12:34:15 tests200: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 12:34:15 tests_tapi: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt 12:34:22 tests_tapi: freeze> python -m pip freeze --all 12:34:22 tests200: freeze> python -m pip freeze --all 12:34:22 tests121: freeze> python -m pip freeze --all 12:34:22 tests_tapi: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3 12:34:22 tests_tapi: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh tapi 12:34:22 using environment variables from 
./karaf221.env 12:34:22 pytest -q transportpce_tests/tapi/test01_abstracted_topology.py 12:34:22 tests121: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3 12:34:22 tests121: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 1.2.1 12:34:22 using environment variables from ./karaf121.env 12:34:22 pytest -q transportpce_tests/1.2.1/test01_portmapping.py 12:34:22 tests200: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3 12:34:22 tests200: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh oc200 12:34:22 using environment variables from ./karafoc200.env 12:34:22 pytest -q transportpce_tests/oc200/test01_portmapping.py 12:35:35 .......................................................FFFFF...... [100%] 12:39:31 51 passed in 308.45s (0:05:08) 12:39:31 pytest -q transportpce_tests/tapi/test02_full_topology.py 12:40:28 .................. 
[100%] 12:41:46 =================================== FAILURES =================================== 12:41:46 __________ TestTransportPCEPortmapping.test_05_xpdr_portmapping_info ___________ 12:41:46 12:41:46 self = 12:41:46 12:41:46 def test_05_xpdr_portmapping_info(self): 12:41:46 response = test_utils.get_portmapping_node_attr("XPDR-OC", "node-info", None) 12:41:46 > self.assertEqual(response['status_code'], requests.codes.ok) 12:41:46 E AssertionError: 409 != 200 12:41:46 12:41:46 transportpce_tests/oc200/test01_portmapping.py:69: AssertionError 12:41:46 ----------------------------- Captured stdout call ----------------------------- 12:41:46 execution of test_05_xpdr_portmapping_info 12:41:46 ________ TestTransportPCEPortmapping.test_06_mpdr_portmapping_NETWORK5 _________ 12:41:46 12:41:46 self = 12:41:46 12:41:46 def test_06_mpdr_portmapping_NETWORK5(self): 12:41:46 response = test_utils.get_portmapping_node_attr("XPDR-OC", "mapping", "XPDR1-NETWORK9") 12:41:46 > self.assertEqual(response['status_code'], requests.codes.ok) 12:41:46 E AssertionError: 409 != 200 12:41:46 12:41:46 transportpce_tests/oc200/test01_portmapping.py:81: AssertionError 12:41:46 ----------------------------- Captured stdout call ----------------------------- 12:41:46 execution of test_06_mpdr_portmapping_NETWORK5 12:41:46 _________ TestTransportPCEPortmapping.test_07_mpdr_portmapping_CLIENT1 _________ 12:41:46 12:41:46 self = 12:41:46 12:41:46 def test_07_mpdr_portmapping_CLIENT1(self): 12:41:46 response = test_utils.get_portmapping_node_attr("XPDR-OC", "mapping", "XPDR1-CLIENT1") 12:41:46 > self.assertEqual(response['status_code'], requests.codes.ok) 12:41:46 E AssertionError: 409 != 200 12:41:46 12:41:46 transportpce_tests/oc200/test01_portmapping.py:104: AssertionError 12:41:46 ----------------------------- Captured stdout call ----------------------------- 12:41:46 execution of test_07_mpdr_portmapping_CLIENT1 12:41:46 ___________ TestTransportPCEPortmapping.test_08_mpdr_switching_pool 
____________ 12:41:46 12:41:46 self = 12:41:46 12:41:46 def test_08_mpdr_switching_pool(self): 12:41:46 response = test_utils.get_portmapping_node_attr("XPDR-OC", "switching-pool-lcp", "2") 12:41:46 > self.assertEqual(response['status_code'], requests.codes.ok) 12:41:46 E AssertionError: 409 != 200 12:41:46 12:41:46 transportpce_tests/oc200/test01_portmapping.py:127: AssertionError 12:41:46 ----------------------------- Captured stdout call ----------------------------- 12:41:46 execution of test_08_mpdr_switching_pool 12:41:46 ____________ TestTransportPCEPortmapping.test_09_check_mccapprofile ____________ 12:41:46 12:41:46 self = 12:41:46 12:41:46 def test_09_check_mccapprofile(self): 12:41:46 res = test_utils.get_portmapping_node_attr("XPDR-OC", "mc-capabilities", "XPDR-mcprofile") 12:41:46 > self.assertEqual(res['status_code'], requests.codes.ok) 12:41:46 E AssertionError: 409 != 200 12:41:46 12:41:46 transportpce_tests/oc200/test01_portmapping.py:144: AssertionError 12:41:46 ----------------------------- Captured stdout call ----------------------------- 12:41:46 execution of test_09_check_mccapprofile 12:41:46 =========================== short test summary info ============================ 12:41:46 FAILED transportpce_tests/oc200/test01_portmapping.py::TestTransportPCEPortmapping::test_05_xpdr_portmapping_info 12:41:46 FAILED transportpce_tests/oc200/test01_portmapping.py::TestTransportPCEPortmapping::test_06_mpdr_portmapping_NETWORK5 12:41:46 FAILED transportpce_tests/oc200/test01_portmapping.py::TestTransportPCEPortmapping::test_07_mpdr_portmapping_CLIENT1 12:41:46 FAILED transportpce_tests/oc200/test01_portmapping.py::TestTransportPCEPortmapping::test_08_mpdr_switching_pool 12:41:46 FAILED transportpce_tests/oc200/test01_portmapping.py::TestTransportPCEPortmapping::test_09_check_mccapprofile 12:41:46 5 failed, 5 passed in 443.51s (0:07:23) 12:41:46 tests200: exit 1 (443.84 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> 
./launch_tests.sh oc200 pid=8331 12:41:53 ..FF.FFFFFFFFFFFF [100%] 12:42:15 =================================== FAILURES =================================== 12:42:15 __________ TestTransportPCEPortmapping.test_08_xpdr_device_connected ___________ 12:42:15 12:42:15 self = 12:42:15 12:42:15 def _new_conn(self) -> socket.socket: 12:42:15 """Establish a socket connection and set nodelay settings on it. 12:42:15 12:42:15 :return: New socket connection. 12:42:15 """ 12:42:15 try: 12:42:15 > sock = connection.create_connection( 12:42:15 (self._dns_host, self.port), 12:42:15 self.timeout, 12:42:15 source_address=self.source_address, 12:42:15 socket_options=self.socket_options, 12:42:15 ) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:204: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 12:42:15 raise err 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 address = ('localhost', 8191), timeout = 30, source_address = None 12:42:15 socket_options = [(6, 1, 1)] 12:42:15 12:42:15 def create_connection( 12:42:15 address: tuple[str, int], 12:42:15 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:42:15 source_address: tuple[str, int] | None = None, 12:42:15 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 12:42:15 ) -> socket.socket: 12:42:15 """Connect to *address* and return the socket object. 12:42:15 12:42:15 Convenience function. Connect to *address* (a 2-tuple ``(host, 12:42:15 port)``) and return the socket object. Passing the optional 12:42:15 *timeout* parameter will set the timeout on the socket instance 12:42:15 before attempting to connect. If no *timeout* is supplied, the 12:42:15 global default timeout setting returned by :func:`socket.getdefaulttimeout` 12:42:15 is used. 
If *source_address* is set it must be a tuple of (host, port) 12:42:15 for the socket to bind as a source address before making the connection. 12:42:15 An host of '' or port 0 tells the OS to use the default. 12:42:15 """ 12:42:15 12:42:15 host, port = address 12:42:15 if host.startswith("["): 12:42:15 host = host.strip("[]") 12:42:15 err = None 12:42:15 12:42:15 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 12:42:15 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 12:42:15 # The original create_connection function always returns all records. 12:42:15 family = allowed_gai_family() 12:42:15 12:42:15 try: 12:42:15 host.encode("idna") 12:42:15 except UnicodeError: 12:42:15 raise LocationParseError(f"'{host}', label empty or too long") from None 12:42:15 12:42:15 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 12:42:15 af, socktype, proto, canonname, sa = res 12:42:15 sock = None 12:42:15 try: 12:42:15 sock = socket.socket(af, socktype, proto) 12:42:15 12:42:15 # If provided, set socket level options before connecting. 
12:42:15 _set_socket_options(sock, socket_options) 12:42:15 12:42:15 if timeout is not _DEFAULT_TIMEOUT: 12:42:15 sock.settimeout(timeout) 12:42:15 if source_address: 12:42:15 sock.bind(source_address) 12:42:15 > sock.connect(sa) 12:42:15 E ConnectionRefusedError: [Errno 111] Connection refused 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 12:42:15 12:42:15 The above exception was the direct cause of the following exception: 12:42:15 12:42:15 self = 12:42:15 method = 'GET' 12:42:15 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig' 12:42:15 body = None 12:42:15 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 12:42:15 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:42:15 redirect = False, assert_same_host = False 12:42:15 timeout = Timeout(connect=30, read=30, total=None), pool_timeout = None 12:42:15 release_conn = False, chunked = False, body_pos = None, preload_content = False 12:42:15 decode_content = False, response_kw = {} 12:42:15 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01', query='content=nonconfig', fragment=None) 12:42:15 destination_scheme = None, conn = None, release_this_conn = True 12:42:15 http_tunnel_required = False, err = None, clean_exit = False 12:42:15 12:42:15 def urlopen( # type: ignore[override] 12:42:15 self, 12:42:15 method: str, 12:42:15 url: str, 12:42:15 body: _TYPE_BODY | None = None, 12:42:15 headers: typing.Mapping[str, str] | None = None, 12:42:15 retries: Retry | bool | int | None = None, 12:42:15 redirect: bool = True, 12:42:15 assert_same_host: bool = True, 12:42:15 timeout: 
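(Editor's note: the `create_connection` loop quoted in this traceback can be condensed to a short stdlib sketch. Simplifications: no IDNA check, no socket options, no `source_address` binding; the function name is a stand-in, not the urllib3 API.)

```python
import socket

def create_connection_sketch(address, timeout=None):
    """Try each getaddrinfo result in turn; raise the last OSError on failure."""
    host, port = address
    err = None
    for af, socktype, proto, _canon, sa in socket.getaddrinfo(
            host, port, socket.AF_UNSPEC, socket.SOCK_STREAM):
        sock = None
        try:
            sock = socket.socket(af, socktype, proto)
            if timeout is not None:
                sock.settimeout(timeout)
            sock.connect(sa)  # ConnectionRefusedError [Errno 111] surfaces here
            return sock
        except OSError as e:
            err = e
            if sock is not None:
                sock.close()
    raise err  # nothing listening on any resolved address

# A listening socket on an ephemeral port exercises the happy path:
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
conn = create_connection_sketch(srv.getsockname(), timeout=5)
conn.close()
srv.close()
```

When no server listens on the target port (as with the controller on localhost:8191 in this run), every `sock.connect(sa)` fails and the last `OSError` is re-raised, which urllib3 then wraps in `NewConnectionError`.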
_TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:42:15 pool_timeout: int | None = None, 12:42:15 release_conn: bool | None = None, 12:42:15 chunked: bool = False, 12:42:15 body_pos: _TYPE_BODY_POSITION | None = None, 12:42:15 preload_content: bool = True, 12:42:15 decode_content: bool = True, 12:42:15 **response_kw: typing.Any, 12:42:15 ) -> BaseHTTPResponse: 12:42:15 """ 12:42:15 Get a connection from the pool and perform an HTTP request. This is the 12:42:15 lowest level call for making a request, so you'll need to specify all 12:42:15 the raw details. 12:42:15 12:42:15 .. note:: 12:42:15 12:42:15 More commonly, it's appropriate to use a convenience method 12:42:15 such as :meth:`request`. 12:42:15 12:42:15 .. note:: 12:42:15 12:42:15 `release_conn` will only behave as expected if 12:42:15 `preload_content=False` because we want to make 12:42:15 `preload_content=False` the default behaviour someday soon without 12:42:15 breaking backwards compatibility. 12:42:15 12:42:15 :param method: 12:42:15 HTTP request method (such as GET, POST, PUT, etc.) 12:42:15 12:42:15 :param url: 12:42:15 The URL to perform the request on. 12:42:15 12:42:15 :param body: 12:42:15 Data to send in the request body, either :class:`str`, :class:`bytes`, 12:42:15 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 12:42:15 12:42:15 :param headers: 12:42:15 Dictionary of custom headers to send, such as User-Agent, 12:42:15 If-None-Match, etc. If None, pool headers are used. If provided, 12:42:15 these headers completely replace any pool-specific headers. 12:42:15 12:42:15 :param retries: 12:42:15 Configure the number of retries to allow before raising a 12:42:15 :class:`~urllib3.exceptions.MaxRetryError` exception. 12:42:15 12:42:15 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 12:42:15 :class:`~urllib3.util.retry.Retry` object for fine-grained control 12:42:15 over different types of retries. 
12:42:15 Pass an integer number to retry connection errors that many times, 12:42:15 but no other types of errors. Pass zero to never retry. 12:42:15 12:42:15 If ``False``, then retries are disabled and any exception is raised 12:42:15 immediately. Also, instead of raising a MaxRetryError on redirects, 12:42:15 the redirect response will be returned. 12:42:15 12:42:15 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 12:42:15 12:42:15 :param redirect: 12:42:15 If True, automatically handle redirects (status codes 301, 302, 12:42:15 303, 307, 308). Each redirect counts as a retry. Disabling retries 12:42:15 will disable redirect, too. 12:42:15 12:42:15 :param assert_same_host: 12:42:15 If ``True``, will make sure that the host of the pool requests is 12:42:15 consistent else will raise HostChangedError. When ``False``, you can 12:42:15 use the pool on an HTTP proxy and request foreign hosts. 12:42:15 12:42:15 :param timeout: 12:42:15 If specified, overrides the default timeout for this one 12:42:15 request. It may be a float (in seconds) or an instance of 12:42:15 :class:`urllib3.util.Timeout`. 12:42:15 12:42:15 :param pool_timeout: 12:42:15 If set and the pool is set to block=True, then this method will 12:42:15 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 12:42:15 connection is available within the time period. 12:42:15 12:42:15 :param bool preload_content: 12:42:15 If True, the response's body will be preloaded into memory. 12:42:15 12:42:15 :param bool decode_content: 12:42:15 If True, will attempt to decode the body based on the 12:42:15 'content-encoding' header. 12:42:15 12:42:15 :param release_conn: 12:42:15 If False, then the urlopen call will not release the connection 12:42:15 back into the pool once a response is received (but will release if 12:42:15 you read the entire contents of the response such as when 12:42:15 `preload_content=True`). 
This is useful if you're not preloading 12:42:15 the response's content immediately. You will need to call 12:42:15 ``r.release_conn()`` on the response ``r`` to return the connection 12:42:15 back into the pool. If None, it takes the value of ``preload_content`` 12:42:15 which defaults to ``True``. 12:42:15 12:42:15 :param bool chunked: 12:42:15 If True, urllib3 will send the body using chunked transfer 12:42:15 encoding. Otherwise, urllib3 will send the body using the standard 12:42:15 content-length form. Defaults to False. 12:42:15 12:42:15 :param int body_pos: 12:42:15 Position to seek to in file-like body in the event of a retry or 12:42:15 redirect. Typically this won't need to be set because urllib3 will 12:42:15 auto-populate the value when needed. 12:42:15 """ 12:42:15 parsed_url = parse_url(url) 12:42:15 destination_scheme = parsed_url.scheme 12:42:15 12:42:15 if headers is None: 12:42:15 headers = self.headers 12:42:15 12:42:15 if not isinstance(retries, Retry): 12:42:15 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 12:42:15 12:42:15 if release_conn is None: 12:42:15 release_conn = preload_content 12:42:15 12:42:15 # Check host 12:42:15 if assert_same_host and not self.is_same_host(url): 12:42:15 raise HostChangedError(self, url, retries) 12:42:15 12:42:15 # Ensure that the URL we're connecting to is properly encoded 12:42:15 if url.startswith("/"): 12:42:15 url = to_str(_encode_target(url)) 12:42:15 else: 12:42:15 url = to_str(parsed_url.url) 12:42:15 12:42:15 conn = None 12:42:15 12:42:15 # Track whether `conn` needs to be released before 12:42:15 # returning/raising/recursing. Update this variable if necessary, and 12:42:15 # leave `release_conn` constant throughout the function. That way, if 12:42:15 # the function recurses, the original value of `release_conn` will be 12:42:15 # passed down into the recursive call, and its value will be respected. 12:42:15 # 12:42:15 # See issue #651 [1] for details. 
12:42:15 # 12:42:15 # [1] 12:42:15 release_this_conn = release_conn 12:42:15 12:42:15 http_tunnel_required = connection_requires_http_tunnel( 12:42:15 self.proxy, self.proxy_config, destination_scheme 12:42:15 ) 12:42:15 12:42:15 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 12:42:15 # have to copy the headers dict so we can safely change it without those 12:42:15 # changes being reflected in anyone else's copy. 12:42:15 if not http_tunnel_required: 12:42:15 headers = headers.copy() # type: ignore[attr-defined] 12:42:15 headers.update(self.proxy_headers) # type: ignore[union-attr] 12:42:15 12:42:15 # Must keep the exception bound to a separate variable or else Python 3 12:42:15 # complains about UnboundLocalError. 12:42:15 err = None 12:42:15 12:42:15 # Keep track of whether we cleanly exited the except block. This 12:42:15 # ensures we do proper cleanup in finally. 12:42:15 clean_exit = False 12:42:15 12:42:15 # Rewind body position, if needed. Record current position 12:42:15 # for future rewinds in the event of a redirect/retry. 12:42:15 body_pos = set_file_position(body, body_pos) 12:42:15 12:42:15 try: 12:42:15 # Request a connection from the queue. 12:42:15 timeout_obj = self._get_timeout(timeout) 12:42:15 conn = self._get_conn(timeout=pool_timeout) 12:42:15 12:42:15 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 12:42:15 12:42:15 # Is this a closed/new connection that requires CONNECT tunnelling? 12:42:15 if self.proxy is not None and http_tunnel_required and conn.is_closed: 12:42:15 try: 12:42:15 self._prepare_proxy(conn) 12:42:15 except (BaseSSLError, OSError, SocketTimeout) as e: 12:42:15 self._raise_timeout( 12:42:15 err=e, url=self.proxy.url, timeout_value=conn.timeout 12:42:15 ) 12:42:15 raise 12:42:15 12:42:15 # If we're going to release the connection in ``finally:``, then 12:42:15 # the response doesn't need to know about the connection. 
Otherwise 12:42:15 # it will also try to release it and we'll have a double-release 12:42:15 # mess. 12:42:15 response_conn = conn if not release_conn else None 12:42:15 12:42:15 # Make the request on the HTTPConnection object 12:42:15 > response = self._make_request( 12:42:15 conn, 12:42:15 method, 12:42:15 url, 12:42:15 timeout=timeout_obj, 12:42:15 body=body, 12:42:15 headers=headers, 12:42:15 chunked=chunked, 12:42:15 retries=retries, 12:42:15 response_conn=response_conn, 12:42:15 preload_content=preload_content, 12:42:15 decode_content=decode_content, 12:42:15 **response_kw, 12:42:15 ) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:787: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:493: in _make_request 12:42:15 conn.request( 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:500: in request 12:42:15 self.endheaders() 12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1298: in endheaders 12:42:15 self._send_output(message_body, encode_chunked=encode_chunked) 12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1058: in _send_output 12:42:15 self.send(msg) 12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:996: in send 12:42:15 self.connect() 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:331: in connect 12:42:15 self.sock = self._new_conn() 12:42:15 ^^^^^^^^^^^^^^^^ 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 self = 12:42:15 12:42:15 def _new_conn(self) -> socket.socket: 12:42:15 """Establish a socket connection and set nodelay settings on it. 12:42:15 12:42:15 :return: New socket connection. 
12:42:15 """ 12:42:15 try: 12:42:15 sock = connection.create_connection( 12:42:15 (self._dns_host, self.port), 12:42:15 self.timeout, 12:42:15 source_address=self.source_address, 12:42:15 socket_options=self.socket_options, 12:42:15 ) 12:42:15 except socket.gaierror as e: 12:42:15 raise NameResolutionError(self.host, self, e) from e 12:42:15 except SocketTimeout as e: 12:42:15 raise ConnectTimeoutError( 12:42:15 self, 12:42:15 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 12:42:15 ) from e 12:42:15 12:42:15 except OSError as e: 12:42:15 > raise NewConnectionError( 12:42:15 self, f"Failed to establish a new connection: {e}" 12:42:15 ) from e 12:42:15 E urllib3.exceptions.NewConnectionError: HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:219: NewConnectionError 12:42:15 12:42:15 The above exception was the direct cause of the following exception: 12:42:15 12:42:15 self = 12:42:15 request = , stream = False 12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None 12:42:15 proxies = OrderedDict() 12:42:15 12:42:15 def send( 12:42:15 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:42:15 ): 12:42:15 """Sends PreparedRequest object. Returns Response object. 12:42:15 12:42:15 :param request: The :class:`PreparedRequest ` being sent. 12:42:15 :param stream: (optional) Whether to stream the request content. 12:42:15 :param timeout: (optional) How long to wait for the server to send 12:42:15 data before giving up, as a float, or a :ref:`(connect timeout, 12:42:15 read timeout) ` tuple. 
12:42:15 :type timeout: float or tuple or urllib3 Timeout object 12:42:15 :param verify: (optional) Either a boolean, in which case it controls whether 12:42:15 we verify the server's TLS certificate, or a string, in which case it 12:42:15 must be a path to a CA bundle to use 12:42:15 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:42:15 :param proxies: (optional) The proxies dictionary to apply to the request. 12:42:15 :rtype: requests.Response 12:42:15 """ 12:42:15 12:42:15 try: 12:42:15 conn = self.get_connection_with_tls_context( 12:42:15 request, verify, proxies=proxies, cert=cert 12:42:15 ) 12:42:15 except LocationValueError as e: 12:42:15 raise InvalidURL(e, request=request) 12:42:15 12:42:15 self.cert_verify(conn, request.url, verify, cert) 12:42:15 url = self.request_url(request, proxies) 12:42:15 self.add_headers( 12:42:15 request, 12:42:15 stream=stream, 12:42:15 timeout=timeout, 12:42:15 verify=verify, 12:42:15 cert=cert, 12:42:15 proxies=proxies, 12:42:15 ) 12:42:15 12:42:15 chunked = not (request.body is None or "Content-Length" in request.headers) 12:42:15 12:42:15 if isinstance(timeout, tuple): 12:42:15 try: 12:42:15 connect, read = timeout 12:42:15 timeout = TimeoutSauce(connect=connect, read=read) 12:42:15 except ValueError: 12:42:15 raise ValueError( 12:42:15 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:42:15 f"or a single float to set both timeouts to the same value." 
12:42:15 ) 12:42:15 elif isinstance(timeout, TimeoutSauce): 12:42:15 pass 12:42:15 else: 12:42:15 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:42:15 12:42:15 try: 12:42:15 > resp = conn.urlopen( 12:42:15 method=request.method, 12:42:15 url=url, 12:42:15 body=request.body, 12:42:15 headers=request.headers, 12:42:15 redirect=False, 12:42:15 assert_same_host=False, 12:42:15 preload_content=False, 12:42:15 decode_content=False, 12:42:15 retries=self.max_retries, 12:42:15 timeout=timeout, 12:42:15 chunked=chunked, 12:42:15 ) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:644: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen 12:42:15 retries = retries.increment( 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:42:15 method = 'GET' 12:42:15 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig' 12:42:15 response = None 12:42:15 error = NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused") 12:42:15 _pool = 12:42:15 _stacktrace = 12:42:15 12:42:15 def increment( 12:42:15 self, 12:42:15 method: str | None = None, 12:42:15 url: str | None = None, 12:42:15 response: BaseHTTPResponse | None = None, 12:42:15 error: Exception | None = None, 12:42:15 _pool: ConnectionPool | None = None, 12:42:15 _stacktrace: TracebackType | None = None, 12:42:15 ) -> Self: 12:42:15 """Return a new Retry object with incremented retry counters. 12:42:15 12:42:15 :param response: A response object, or None, if the server did not 12:42:15 return a response. 
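(Editor's note: a toy model of the `Retry.increment` exhaustion bookkeeping shown in this traceback. Assumption: only the `total` counter is modelled; the real class also tracks connect, read, redirect and status counters, and the class/function names here are stand-ins.)

```python
class MaxRetryErrorSketch(Exception):
    pass

def increment_sketch(total):
    """Decrement the total counter; raise once it drops below zero."""
    if total is not None:
        total -= 1
    if total is not None and total < 0:  # mirrors Retry.is_exhausted()
        raise MaxRetryErrorSketch("Max retries exceeded")
    return total

# Retry(total=0, ...) as in this log: the very first failure exhausts it.
exhausted = False
try:
    increment_sketch(0)
except MaxRetryErrorSketch:
    exhausted = True
assert exhausted
```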
12:42:15 :type response: :class:`~urllib3.response.BaseHTTPResponse` 12:42:15 :param Exception error: An error encountered during the request, or 12:42:15 None if the response was received successfully. 12:42:15 12:42:15 :return: A new ``Retry`` object. 12:42:15 """ 12:42:15 if self.total is False and error: 12:42:15 # Disabled, indicate to re-raise the error. 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 12:42:15 total = self.total 12:42:15 if total is not None: 12:42:15 total -= 1 12:42:15 12:42:15 connect = self.connect 12:42:15 read = self.read 12:42:15 redirect = self.redirect 12:42:15 status_count = self.status 12:42:15 other = self.other 12:42:15 cause = "unknown" 12:42:15 status = None 12:42:15 redirect_location = None 12:42:15 12:42:15 if error and self._is_connection_error(error): 12:42:15 # Connect retry? 12:42:15 if connect is False: 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 elif connect is not None: 12:42:15 connect -= 1 12:42:15 12:42:15 elif error and self._is_read_error(error): 12:42:15 # Read retry? 12:42:15 if read is False or method is None or not self._is_method_retryable(method): 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 elif read is not None: 12:42:15 read -= 1 12:42:15 12:42:15 elif error: 12:42:15 # Other retry? 12:42:15 if other is not None: 12:42:15 other -= 1 12:42:15 12:42:15 elif response and response.get_redirect_location(): 12:42:15 # Redirect retry? 
12:42:15 if redirect is not None: 12:42:15 redirect -= 1 12:42:15 cause = "too many redirects" 12:42:15 response_redirect_location = response.get_redirect_location() 12:42:15 if response_redirect_location: 12:42:15 redirect_location = response_redirect_location 12:42:15 status = response.status 12:42:15 12:42:15 else: 12:42:15 # Incrementing because of a server error like a 500 in 12:42:15 # status_forcelist and the given method is in the allowed_methods 12:42:15 cause = ResponseError.GENERIC_ERROR 12:42:15 if response and response.status: 12:42:15 if status_count is not None: 12:42:15 status_count -= 1 12:42:15 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 12:42:15 status = response.status 12:42:15 12:42:15 history = self.history + ( 12:42:15 RequestHistory(method, url, error, status, redirect_location), 12:42:15 ) 12:42:15 12:42:15 new_retry = self.new( 12:42:15 total=total, 12:42:15 connect=connect, 12:42:15 read=read, 12:42:15 redirect=redirect, 12:42:15 status=status_count, 12:42:15 other=other, 12:42:15 history=history, 12:42:15 ) 12:42:15 12:42:15 if new_retry.is_exhausted(): 12:42:15 reason = error or ResponseError(cause) 12:42:15 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 12:42:15 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 12:42:15 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused")) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:535: MaxRetryError 12:42:15 12:42:15 During handling of the above exception, another exception occurred: 12:42:15 12:42:15 self = 12:42:15 12:42:15 def test_08_xpdr_device_connected(self): 12:42:15 > response = 
test_utils.check_device_connection("XPDRA01") 12:42:15 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 12:42:15 12:42:15 transportpce_tests/1.2.1/test01_portmapping.py:104: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 transportpce_tests/common/test_utils.py:409: in check_device_connection 12:42:15 response = get_request(url[RESTCONF_VERSION].format('{}', node)) 12:42:15 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 12:42:15 transportpce_tests/common/test_utils.py:117: in get_request 12:42:15 return requests.request( 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 12:42:15 return session.request(method=method, url=url, **kwargs) 12:42:15 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 12:42:15 resp = self.send(prep, **send_kwargs) 12:42:15 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 12:42:15 r = adapter.send(request, **kwargs) 12:42:15 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 self = 12:42:15 request = , stream = False 12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None 12:42:15 proxies = OrderedDict() 12:42:15 12:42:15 def send( 12:42:15 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:42:15 ): 12:42:15 """Sends PreparedRequest object. Returns Response object. 12:42:15 12:42:15 :param request: The :class:`PreparedRequest ` being sent. 12:42:15 :param stream: (optional) Whether to stream the request content. 12:42:15 :param timeout: (optional) How long to wait for the server to send 12:42:15 data before giving up, as a float, or a :ref:`(connect timeout, 12:42:15 read timeout) ` tuple. 
12:42:15 :type timeout: float or tuple or urllib3 Timeout object 12:42:15 :param verify: (optional) Either a boolean, in which case it controls whether 12:42:15 we verify the server's TLS certificate, or a string, in which case it 12:42:15 must be a path to a CA bundle to use 12:42:15 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:42:15 :param proxies: (optional) The proxies dictionary to apply to the request. 12:42:15 :rtype: requests.Response 12:42:15 """ 12:42:15 12:42:15 try: 12:42:15 conn = self.get_connection_with_tls_context( 12:42:15 request, verify, proxies=proxies, cert=cert 12:42:15 ) 12:42:15 except LocationValueError as e: 12:42:15 raise InvalidURL(e, request=request) 12:42:15 12:42:15 self.cert_verify(conn, request.url, verify, cert) 12:42:15 url = self.request_url(request, proxies) 12:42:15 self.add_headers( 12:42:15 request, 12:42:15 stream=stream, 12:42:15 timeout=timeout, 12:42:15 verify=verify, 12:42:15 cert=cert, 12:42:15 proxies=proxies, 12:42:15 ) 12:42:15 12:42:15 chunked = not (request.body is None or "Content-Length" in request.headers) 12:42:15 12:42:15 if isinstance(timeout, tuple): 12:42:15 try: 12:42:15 connect, read = timeout 12:42:15 timeout = TimeoutSauce(connect=connect, read=read) 12:42:15 except ValueError: 12:42:15 raise ValueError( 12:42:15 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:42:15 f"or a single float to set both timeouts to the same value." 
12:42:15 ) 12:42:15 elif isinstance(timeout, TimeoutSauce): 12:42:15 pass 12:42:15 else: 12:42:15 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:42:15 12:42:15 try: 12:42:15 resp = conn.urlopen( 12:42:15 method=request.method, 12:42:15 url=url, 12:42:15 body=request.body, 12:42:15 headers=request.headers, 12:42:15 redirect=False, 12:42:15 assert_same_host=False, 12:42:15 preload_content=False, 12:42:15 decode_content=False, 12:42:15 retries=self.max_retries, 12:42:15 timeout=timeout, 12:42:15 chunked=chunked, 12:42:15 ) 12:42:15 12:42:15 except (ProtocolError, OSError) as err: 12:42:15 raise ConnectionError(err, request=request) 12:42:15 12:42:15 except MaxRetryError as e: 12:42:15 if isinstance(e.reason, ConnectTimeoutError): 12:42:15 # TODO: Remove this in 3.0.0: see #2811 12:42:15 if not isinstance(e.reason, NewConnectionError): 12:42:15 raise ConnectTimeout(e, request=request) 12:42:15 12:42:15 if isinstance(e.reason, ResponseError): 12:42:15 raise RetryError(e, request=request) 12:42:15 12:42:15 if isinstance(e.reason, _ProxyError): 12:42:15 raise ProxyError(e, request=request) 12:42:15 12:42:15 if isinstance(e.reason, _SSLError): 12:42:15 # This branch is for urllib3 v1.22 and later. 
12:42:15     raise SSLError(e, request=request)
12:42:15 
12:42:15 >   raise ConnectionError(e, request=request)
12:42:15 E   requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:677: ConnectionError
12:42:15 ----------------------------- Captured stdout call -----------------------------
12:42:15 execution of test_08_xpdr_device_connected
12:42:15 __________ TestTransportPCEPortmapping.test_09_xpdr_portmapping_info ___________
12:42:15 
12:42:15 self =
12:42:15 
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15 
12:42:15         :return: New socket connection.
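(Editor's note: the "The above exception was the direct cause of ..." lines in this traceback come from `raise ... from e` chaining; a sketch of the three-level chain seen here, OSError → NewConnectionError → requests ConnectionError. Class names below are stand-ins, not the real urllib3/requests types.)

```python
class NewConnectionErrorSketch(Exception):
    pass

class ConnectionErrorSketch(Exception):
    pass

chained = None
try:
    try:
        try:
            raise OSError(111, "Connection refused")           # socket layer
        except OSError as e:
            # urllib3 wraps the socket error; __cause__ links the chain
            raise NewConnectionErrorSketch(
                f"Failed to establish a new connection: {e}") from e
    except NewConnectionErrorSketch as e:
        # requests' HTTPAdapter.send wraps it once more for the caller
        raise ConnectionErrorSketch(e) from e
except ConnectionErrorSketch as err:
    chained = err

assert isinstance(chained.__cause__, NewConnectionErrorSketch)
assert isinstance(chained.__cause__.__cause__, OSError)
```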
12:42:15 """ 12:42:15 try: 12:42:15 > sock = connection.create_connection( 12:42:15 (self._dns_host, self.port), 12:42:15 self.timeout, 12:42:15 source_address=self.source_address, 12:42:15 socket_options=self.socket_options, 12:42:15 ) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:204: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 12:42:15 raise err 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 address = ('localhost', 8191), timeout = 30, source_address = None 12:42:15 socket_options = [(6, 1, 1)] 12:42:15 12:42:15 def create_connection( 12:42:15 address: tuple[str, int], 12:42:15 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:42:15 source_address: tuple[str, int] | None = None, 12:42:15 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 12:42:15 ) -> socket.socket: 12:42:15 """Connect to *address* and return the socket object. 12:42:15 12:42:15 Convenience function. Connect to *address* (a 2-tuple ``(host, 12:42:15 port)``) and return the socket object. Passing the optional 12:42:15 *timeout* parameter will set the timeout on the socket instance 12:42:15 before attempting to connect. If no *timeout* is supplied, the 12:42:15 global default timeout setting returned by :func:`socket.getdefaulttimeout` 12:42:15 is used. If *source_address* is set it must be a tuple of (host, port) 12:42:15 for the socket to bind as a source address before making the connection. 12:42:15 An host of '' or port 0 tells the OS to use the default. 
12:42:15 """ 12:42:15 12:42:15 host, port = address 12:42:15 if host.startswith("["): 12:42:15 host = host.strip("[]") 12:42:15 err = None 12:42:15 12:42:15 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 12:42:15 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 12:42:15 # The original create_connection function always returns all records. 12:42:15 family = allowed_gai_family() 12:42:15 12:42:15 try: 12:42:15 host.encode("idna") 12:42:15 except UnicodeError: 12:42:15 raise LocationParseError(f"'{host}', label empty or too long") from None 12:42:15 12:42:15 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 12:42:15 af, socktype, proto, canonname, sa = res 12:42:15 sock = None 12:42:15 try: 12:42:15 sock = socket.socket(af, socktype, proto) 12:42:15 12:42:15 # If provided, set socket level options before connecting. 12:42:15 _set_socket_options(sock, socket_options) 12:42:15 12:42:15 if timeout is not _DEFAULT_TIMEOUT: 12:42:15 sock.settimeout(timeout) 12:42:15 if source_address: 12:42:15 sock.bind(source_address) 12:42:15 > sock.connect(sa) 12:42:15 E ConnectionRefusedError: [Errno 111] Connection refused 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 12:42:15 12:42:15 The above exception was the direct cause of the following exception: 12:42:15 12:42:15 self = 12:42:15 method = 'GET' 12:42:15 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info' 12:42:15 body = None 12:42:15 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 12:42:15 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:42:15 redirect = False, assert_same_host = False 12:42:15 timeout = Timeout(connect=30, read=30, total=None), 
pool_timeout = None 12:42:15 release_conn = False, chunked = False, body_pos = None, preload_content = False 12:42:15 decode_content = False, response_kw = {} 12:42:15 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info', query=None, fragment=None) 12:42:15 destination_scheme = None, conn = None, release_this_conn = True 12:42:15 http_tunnel_required = False, err = None, clean_exit = False 12:42:15 12:42:15 def urlopen( # type: ignore[override] 12:42:15 self, 12:42:15 method: str, 12:42:15 url: str, 12:42:15 body: _TYPE_BODY | None = None, 12:42:15 headers: typing.Mapping[str, str] | None = None, 12:42:15 retries: Retry | bool | int | None = None, 12:42:15 redirect: bool = True, 12:42:15 assert_same_host: bool = True, 12:42:15 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:42:15 pool_timeout: int | None = None, 12:42:15 release_conn: bool | None = None, 12:42:15 chunked: bool = False, 12:42:15 body_pos: _TYPE_BODY_POSITION | None = None, 12:42:15 preload_content: bool = True, 12:42:15 decode_content: bool = True, 12:42:15 **response_kw: typing.Any, 12:42:15 ) -> BaseHTTPResponse: 12:42:15 """ 12:42:15 Get a connection from the pool and perform an HTTP request. This is the 12:42:15 lowest level call for making a request, so you'll need to specify all 12:42:15 the raw details. 12:42:15 12:42:15 .. note:: 12:42:15 12:42:15 More commonly, it's appropriate to use a convenience method 12:42:15 such as :meth:`request`. 12:42:15 12:42:15 .. note:: 12:42:15 12:42:15 `release_conn` will only behave as expected if 12:42:15 `preload_content=False` because we want to make 12:42:15 `preload_content=False` the default behaviour someday soon without 12:42:15 breaking backwards compatibility. 12:42:15 12:42:15 :param method: 12:42:15 HTTP request method (such as GET, POST, PUT, etc.) 12:42:15 12:42:15 :param url: 12:42:15 The URL to perform the request on. 
12:42:15 
12:42:15         :param body:
12:42:15             Data to send in the request body, either :class:`str`, :class:`bytes`,
12:42:15             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
12:42:15 
12:42:15         :param headers:
12:42:15             Dictionary of custom headers to send, such as User-Agent,
12:42:15             If-None-Match, etc. If None, pool headers are used. If provided,
12:42:15             these headers completely replace any pool-specific headers.
12:42:15 
12:42:15         :param retries:
12:42:15             Configure the number of retries to allow before raising a
12:42:15             :class:`~urllib3.exceptions.MaxRetryError` exception.
12:42:15 
12:42:15             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
12:42:15             :class:`~urllib3.util.retry.Retry` object for fine-grained control
12:42:15             over different types of retries.
12:42:15             Pass an integer number to retry connection errors that many times,
12:42:15             but no other types of errors. Pass zero to never retry.
12:42:15 
12:42:15             If ``False``, then retries are disabled and any exception is raised
12:42:15             immediately. Also, instead of raising a MaxRetryError on redirects,
12:42:15             the redirect response will be returned.
12:42:15 
12:42:15         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
12:42:15 
12:42:15         :param redirect:
12:42:15             If True, automatically handle redirects (status codes 301, 302,
12:42:15             303, 307, 308). Each redirect counts as a retry. Disabling retries
12:42:15             will disable redirect, too.
12:42:15 
12:42:15         :param assert_same_host:
12:42:15             If ``True``, will make sure that the host of the pool requests is
12:42:15             consistent else will raise HostChangedError. When ``False``, you can
12:42:15             use the pool on an HTTP proxy and request foreign hosts.
12:42:15 
12:42:15         :param timeout:
12:42:15             If specified, overrides the default timeout for this one
12:42:15             request. It may be a float (in seconds) or an instance of
12:42:15             :class:`urllib3.util.Timeout`.
12:42:15 
12:42:15         :param pool_timeout:
12:42:15             If set and the pool is set to block=True, then this method will
12:42:15             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
12:42:15             connection is available within the time period.
12:42:15 
12:42:15         :param bool preload_content:
12:42:15             If True, the response's body will be preloaded into memory.
12:42:15 
12:42:15         :param bool decode_content:
12:42:15             If True, will attempt to decode the body based on the
12:42:15             'content-encoding' header.
12:42:15 
12:42:15         :param release_conn:
12:42:15             If False, then the urlopen call will not release the connection
12:42:15             back into the pool once a response is received (but will release if
12:42:15             you read the entire contents of the response such as when
12:42:15             `preload_content=True`). This is useful if you're not preloading
12:42:15             the response's content immediately. You will need to call
12:42:15             ``r.release_conn()`` on the response ``r`` to return the connection
12:42:15             back into the pool. If None, it takes the value of ``preload_content``
12:42:15             which defaults to ``True``.
12:42:15 
12:42:15         :param bool chunked:
12:42:15             If True, urllib3 will send the body using chunked transfer
12:42:15             encoding. Otherwise, urllib3 will send the body using the standard
12:42:15             content-length form. Defaults to False.
12:42:15 
12:42:15         :param int body_pos:
12:42:15             Position to seek to in file-like body in the event of a retry or
12:42:15             redirect. Typically this won't need to be set because urllib3 will
12:42:15             auto-populate the value when needed.
12:42:15         """
12:42:15         parsed_url = parse_url(url)
12:42:15         destination_scheme = parsed_url.scheme
12:42:15 
12:42:15         if headers is None:
12:42:15             headers = self.headers
12:42:15 
12:42:15         if not isinstance(retries, Retry):
12:42:15             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
12:42:15 
12:42:15         if release_conn is None:
12:42:15             release_conn = preload_content
12:42:15 
12:42:15         # Check host
12:42:15         if assert_same_host and not self.is_same_host(url):
12:42:15             raise HostChangedError(self, url, retries)
12:42:15 
12:42:15         # Ensure that the URL we're connecting to is properly encoded
12:42:15         if url.startswith("/"):
12:42:15             url = to_str(_encode_target(url))
12:42:15         else:
12:42:15             url = to_str(parsed_url.url)
12:42:15 
12:42:15         conn = None
12:42:15 
12:42:15         # Track whether `conn` needs to be released before
12:42:15         # returning/raising/recursing. Update this variable if necessary, and
12:42:15         # leave `release_conn` constant throughout the function. That way, if
12:42:15         # the function recurses, the original value of `release_conn` will be
12:42:15         # passed down into the recursive call, and its value will be respected.
12:42:15         #
12:42:15         # See issue #651 [1] for details.
12:42:15         #
12:42:15         # [1] 
12:42:15         release_this_conn = release_conn
12:42:15 
12:42:15         http_tunnel_required = connection_requires_http_tunnel(
12:42:15             self.proxy, self.proxy_config, destination_scheme
12:42:15         )
12:42:15 
12:42:15         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
12:42:15         # have to copy the headers dict so we can safely change it without those
12:42:15         # changes being reflected in anyone else's copy.
12:42:15         if not http_tunnel_required:
12:42:15             headers = headers.copy()  # type: ignore[attr-defined]
12:42:15             headers.update(self.proxy_headers)  # type: ignore[union-attr]
12:42:15 
12:42:15         # Must keep the exception bound to a separate variable or else Python 3
12:42:15         # complains about UnboundLocalError.
12:42:15         err = None
12:42:15 
12:42:15         # Keep track of whether we cleanly exited the except block. This
12:42:15         # ensures we do proper cleanup in finally.
12:42:15         clean_exit = False
12:42:15 
12:42:15         # Rewind body position, if needed. Record current position
12:42:15         # for future rewinds in the event of a redirect/retry.
12:42:15         body_pos = set_file_position(body, body_pos)
12:42:15 
12:42:15         try:
12:42:15             # Request a connection from the queue.
12:42:15             timeout_obj = self._get_timeout(timeout)
12:42:15             conn = self._get_conn(timeout=pool_timeout)
12:42:15 
12:42:15             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
12:42:15 
12:42:15             # Is this a closed/new connection that requires CONNECT tunnelling?
12:42:15             if self.proxy is not None and http_tunnel_required and conn.is_closed:
12:42:15                 try:
12:42:15                     self._prepare_proxy(conn)
12:42:15                 except (BaseSSLError, OSError, SocketTimeout) as e:
12:42:15                     self._raise_timeout(
12:42:15                         err=e, url=self.proxy.url, timeout_value=conn.timeout
12:42:15                     )
12:42:15                     raise
12:42:15 
12:42:15             # If we're going to release the connection in ``finally:``, then
12:42:15             # the response doesn't need to know about the connection. Otherwise
12:42:15             # it will also try to release it and we'll have a double-release
12:42:15             # mess.
12:42:15             response_conn = conn if not release_conn else None
12:42:15 
12:42:15             # Make the request on the HTTPConnection object
12:42:15 >           response = self._make_request(
12:42:15                 conn,
12:42:15                 method,
12:42:15                 url,
12:42:15                 timeout=timeout_obj,
12:42:15                 body=body,
12:42:15                 headers=headers,
12:42:15                 chunked=chunked,
12:42:15                 retries=retries,
12:42:15                 response_conn=response_conn,
12:42:15                 preload_content=preload_content,
12:42:15                 decode_content=decode_content,
12:42:15                 **response_kw,
12:42:15             )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:787: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:493: in _make_request
12:42:15     conn.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:500: in request
12:42:15     self.endheaders()
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1298: in endheaders
12:42:15     self._send_output(message_body, encode_chunked=encode_chunked)
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1058: in _send_output
12:42:15     self.send(msg)
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:996: in send
12:42:15     self.connect()
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:331: in connect
12:42:15     self.sock = self._new_conn()
12:42:15                 ^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15 
12:42:15         :return: New socket connection.
12:42:15         """
12:42:15         try:
12:42:15             sock = connection.create_connection(
12:42:15                 (self._dns_host, self.port),
12:42:15                 self.timeout,
12:42:15                 source_address=self.source_address,
12:42:15                 socket_options=self.socket_options,
12:42:15             )
12:42:15         except socket.gaierror as e:
12:42:15             raise NameResolutionError(self.host, self, e) from e
12:42:15         except SocketTimeout as e:
12:42:15             raise ConnectTimeoutError(
12:42:15                 self,
12:42:15                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
12:42:15             ) from e
12:42:15 
12:42:15         except OSError as e:
12:42:15 >           raise NewConnectionError(
12:42:15                 self, f"Failed to establish a new connection: {e}"
12:42:15             ) from e
12:42:15 E           urllib3.exceptions.NewConnectionError: HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:219: NewConnectionError
12:42:15 
12:42:15 The above exception was the direct cause of the following exception:
12:42:15 
12:42:15 self = 
12:42:15 request = , stream = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
12:42:15 proxies = OrderedDict()
12:42:15 
12:42:15     def send(
12:42:15         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
12:42:15     ):
12:42:15         """Sends PreparedRequest object. Returns Response object.
12:42:15 
12:42:15         :param request: The :class:`PreparedRequest ` being sent.
12:42:15         :param stream: (optional) Whether to stream the request content.
12:42:15         :param timeout: (optional) How long to wait for the server to send
12:42:15             data before giving up, as a float, or a :ref:`(connect timeout,
12:42:15             read timeout) ` tuple.
12:42:15         :type timeout: float or tuple or urllib3 Timeout object
12:42:15         :param verify: (optional) Either a boolean, in which case it controls whether
12:42:15             we verify the server's TLS certificate, or a string, in which case it
12:42:15             must be a path to a CA bundle to use
12:42:15         :param cert: (optional) Any user-provided SSL certificate to be trusted.
12:42:15         :param proxies: (optional) The proxies dictionary to apply to the request.
12:42:15         :rtype: requests.Response
12:42:15         """
12:42:15 
12:42:15         try:
12:42:15             conn = self.get_connection_with_tls_context(
12:42:15                 request, verify, proxies=proxies, cert=cert
12:42:15             )
12:42:15         except LocationValueError as e:
12:42:15             raise InvalidURL(e, request=request)
12:42:15 
12:42:15         self.cert_verify(conn, request.url, verify, cert)
12:42:15         url = self.request_url(request, proxies)
12:42:15         self.add_headers(
12:42:15             request,
12:42:15             stream=stream,
12:42:15             timeout=timeout,
12:42:15             verify=verify,
12:42:15             cert=cert,
12:42:15             proxies=proxies,
12:42:15         )
12:42:15 
12:42:15         chunked = not (request.body is None or "Content-Length" in request.headers)
12:42:15 
12:42:15         if isinstance(timeout, tuple):
12:42:15             try:
12:42:15                 connect, read = timeout
12:42:15                 timeout = TimeoutSauce(connect=connect, read=read)
12:42:15             except ValueError:
12:42:15                 raise ValueError(
12:42:15                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
12:42:15                     f"or a single float to set both timeouts to the same value."
12:42:15                 )
12:42:15         elif isinstance(timeout, TimeoutSauce):
12:42:15             pass
12:42:15         else:
12:42:15             timeout = TimeoutSauce(connect=timeout, read=timeout)
12:42:15 
12:42:15         try:
12:42:15 >           resp = conn.urlopen(
12:42:15                 method=request.method,
12:42:15                 url=url,
12:42:15                 body=request.body,
12:42:15                 headers=request.headers,
12:42:15                 redirect=False,
12:42:15                 assert_same_host=False,
12:42:15                 preload_content=False,
12:42:15                 decode_content=False,
12:42:15                 retries=self.max_retries,
12:42:15                 timeout=timeout,
12:42:15                 chunked=chunked,
12:42:15             )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:644: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen
12:42:15     retries = retries.increment(
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
12:42:15 method = 'GET'
12:42:15 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info'
12:42:15 response = None
12:42:15 error = NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused")
12:42:15 _pool = 
12:42:15 _stacktrace = 
12:42:15 
12:42:15     def increment(
12:42:15         self,
12:42:15         method: str | None = None,
12:42:15         url: str | None = None,
12:42:15         response: BaseHTTPResponse | None = None,
12:42:15         error: Exception | None = None,
12:42:15         _pool: ConnectionPool | None = None,
12:42:15         _stacktrace: TracebackType | None = None,
12:42:15     ) -> Self:
12:42:15         """Return a new Retry object with incremented retry counters.
12:42:15 
12:42:15         :param response: A response object, or None, if the server did not
12:42:15             return a response.
12:42:15         :type response: :class:`~urllib3.response.BaseHTTPResponse`
12:42:15         :param Exception error: An error encountered during the request, or
12:42:15             None if the response was received successfully.
12:42:15 
12:42:15         :return: A new ``Retry`` object.
12:42:15         """
12:42:15         if self.total is False and error:
12:42:15             # Disabled, indicate to re-raise the error.
12:42:15             raise reraise(type(error), error, _stacktrace)
12:42:15 
12:42:15         total = self.total
12:42:15         if total is not None:
12:42:15             total -= 1
12:42:15 
12:42:15         connect = self.connect
12:42:15         read = self.read
12:42:15         redirect = self.redirect
12:42:15         status_count = self.status
12:42:15         other = self.other
12:42:15         cause = "unknown"
12:42:15         status = None
12:42:15         redirect_location = None
12:42:15 
12:42:15         if error and self._is_connection_error(error):
12:42:15             # Connect retry?
12:42:15             if connect is False:
12:42:15                 raise reraise(type(error), error, _stacktrace)
12:42:15             elif connect is not None:
12:42:15                 connect -= 1
12:42:15 
12:42:15         elif error and self._is_read_error(error):
12:42:15             # Read retry?
12:42:15             if read is False or method is None or not self._is_method_retryable(method):
12:42:15                 raise reraise(type(error), error, _stacktrace)
12:42:15             elif read is not None:
12:42:15                 read -= 1
12:42:15 
12:42:15         elif error:
12:42:15             # Other retry?
12:42:15             if other is not None:
12:42:15                 other -= 1
12:42:15 
12:42:15         elif response and response.get_redirect_location():
12:42:15             # Redirect retry?
12:42:15             if redirect is not None:
12:42:15                 redirect -= 1
12:42:15             cause = "too many redirects"
12:42:15             response_redirect_location = response.get_redirect_location()
12:42:15             if response_redirect_location:
12:42:15                 redirect_location = response_redirect_location
12:42:15             status = response.status
12:42:15 
12:42:15         else:
12:42:15             # Incrementing because of a server error like a 500 in
12:42:15             # status_forcelist and the given method is in the allowed_methods
12:42:15             cause = ResponseError.GENERIC_ERROR
12:42:15             if response and response.status:
12:42:15                 if status_count is not None:
12:42:15                     status_count -= 1
12:42:15                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
12:42:15                 status = response.status
12:42:15 
12:42:15         history = self.history + (
12:42:15             RequestHistory(method, url, error, status, redirect_location),
12:42:15         )
12:42:15 
12:42:15         new_retry = self.new(
12:42:15             total=total,
12:42:15             connect=connect,
12:42:15             read=read,
12:42:15             redirect=redirect,
12:42:15             status=status_count,
12:42:15             other=other,
12:42:15             history=history,
12:42:15         )
12:42:15 
12:42:15         if new_retry.is_exhausted():
12:42:15             reason = error or ResponseError(cause)
12:42:15 >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
12:42:15             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:535: MaxRetryError
12:42:15 
12:42:15 During handling of the above exception, another exception occurred:
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def test_09_xpdr_portmapping_info(self):
12:42:15 >       response = test_utils.get_portmapping_node_attr("XPDRA01", "node-info", None)
12:42:15                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 
12:42:15 transportpce_tests/1.2.1/test01_portmapping.py:110: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 transportpce_tests/common/test_utils.py:519: in get_portmapping_node_attr
12:42:15     response = get_request(target_url)
12:42:15                ^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 transportpce_tests/common/test_utils.py:117: in get_request
12:42:15     return requests.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request
12:42:15     return session.request(method=method, url=url, **kwargs)
12:42:15            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request
12:42:15     resp = self.send(prep, **send_kwargs)
12:42:15            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send
12:42:15     r = adapter.send(request, **kwargs)
12:42:15         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 self = 
12:42:15 request = , stream = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
12:42:15 proxies = OrderedDict()
12:42:15 
12:42:15     def send(
12:42:15         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
12:42:15     ):
12:42:15         """Sends PreparedRequest object. Returns Response object.
12:42:15 
12:42:15         :param request: The :class:`PreparedRequest ` being sent.
12:42:15         :param stream: (optional) Whether to stream the request content.
12:42:15         :param timeout: (optional) How long to wait for the server to send
12:42:15             data before giving up, as a float, or a :ref:`(connect timeout,
12:42:15             read timeout) ` tuple.
12:42:15         :type timeout: float or tuple or urllib3 Timeout object
12:42:15         :param verify: (optional) Either a boolean, in which case it controls whether
12:42:15             we verify the server's TLS certificate, or a string, in which case it
12:42:15             must be a path to a CA bundle to use
12:42:15         :param cert: (optional) Any user-provided SSL certificate to be trusted.
12:42:15         :param proxies: (optional) The proxies dictionary to apply to the request.
12:42:15         :rtype: requests.Response
12:42:15         """
12:42:15 
12:42:15         try:
12:42:15             conn = self.get_connection_with_tls_context(
12:42:15                 request, verify, proxies=proxies, cert=cert
12:42:15             )
12:42:15         except LocationValueError as e:
12:42:15             raise InvalidURL(e, request=request)
12:42:15 
12:42:15         self.cert_verify(conn, request.url, verify, cert)
12:42:15         url = self.request_url(request, proxies)
12:42:15         self.add_headers(
12:42:15             request,
12:42:15             stream=stream,
12:42:15             timeout=timeout,
12:42:15             verify=verify,
12:42:15             cert=cert,
12:42:15             proxies=proxies,
12:42:15         )
12:42:15 
12:42:15         chunked = not (request.body is None or "Content-Length" in request.headers)
12:42:15 
12:42:15         if isinstance(timeout, tuple):
12:42:15             try:
12:42:15                 connect, read = timeout
12:42:15                 timeout = TimeoutSauce(connect=connect, read=read)
12:42:15             except ValueError:
12:42:15                 raise ValueError(
12:42:15                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
12:42:15                     f"or a single float to set both timeouts to the same value."
12:42:15                 )
12:42:15         elif isinstance(timeout, TimeoutSauce):
12:42:15             pass
12:42:15         else:
12:42:15             timeout = TimeoutSauce(connect=timeout, read=timeout)
12:42:15 
12:42:15         try:
12:42:15             resp = conn.urlopen(
12:42:15                 method=request.method,
12:42:15                 url=url,
12:42:15                 body=request.body,
12:42:15                 headers=request.headers,
12:42:15                 redirect=False,
12:42:15                 assert_same_host=False,
12:42:15                 preload_content=False,
12:42:15                 decode_content=False,
12:42:15                 retries=self.max_retries,
12:42:15                 timeout=timeout,
12:42:15                 chunked=chunked,
12:42:15             )
12:42:15 
12:42:15         except (ProtocolError, OSError) as err:
12:42:15             raise ConnectionError(err, request=request)
12:42:15 
12:42:15         except MaxRetryError as e:
12:42:15             if isinstance(e.reason, ConnectTimeoutError):
12:42:15                 # TODO: Remove this in 3.0.0: see #2811
12:42:15                 if not isinstance(e.reason, NewConnectionError):
12:42:15                     raise ConnectTimeout(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, ResponseError):
12:42:15                 raise RetryError(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, _ProxyError):
12:42:15                 raise ProxyError(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, _SSLError):
12:42:15                 # This branch is for urllib3 v1.22 and later.
12:42:15                 raise SSLError(e, request=request)
12:42:15 
12:42:15 >           raise ConnectionError(e, request=request)
12:42:15 E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:677: ConnectionError
12:42:15 ----------------------------- Captured stdout call -----------------------------
12:42:15 execution of test_09_xpdr_portmapping_info
12:42:15 ________ TestTransportPCEPortmapping.test_10_xpdr_portmapping_NETWORK1 _________
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15 
12:42:15         :return: New socket connection.
12:42:15         """
12:42:15         try:
12:42:15 >           sock = connection.create_connection(
12:42:15                 (self._dns_host, self.port),
12:42:15                 self.timeout,
12:42:15                 source_address=self.source_address,
12:42:15                 socket_options=self.socket_options,
12:42:15             )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:204: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection
12:42:15     raise err
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 address = ('localhost', 8191), timeout = 30, source_address = None
12:42:15 socket_options = [(6, 1, 1)]
12:42:15 
12:42:15 def create_connection(
12:42:15     address: tuple[str, int],
12:42:15     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
12:42:15     source_address: tuple[str, int] | None = None,
12:42:15     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
12:42:15 ) -> socket.socket:
12:42:15     """Connect to *address* and return the socket object.
12:42:15 
12:42:15     Convenience function. Connect to *address* (a 2-tuple ``(host,
12:42:15     port)``) and return the socket object. Passing the optional
12:42:15     *timeout* parameter will set the timeout on the socket instance
12:42:15     before attempting to connect. If no *timeout* is supplied, the
12:42:15     global default timeout setting returned by :func:`socket.getdefaulttimeout`
12:42:15     is used. If *source_address* is set it must be a tuple of (host, port)
12:42:15     for the socket to bind as a source address before making the connection.
12:42:15     An host of '' or port 0 tells the OS to use the default.
12:42:15     """
12:42:15 
12:42:15     host, port = address
12:42:15     if host.startswith("["):
12:42:15         host = host.strip("[]")
12:42:15     err = None
12:42:15 
12:42:15     # Using the value from allowed_gai_family() in the context of getaddrinfo lets
12:42:15     # us select whether to work with IPv4 DNS records, IPv6 records, or both.
12:42:15     # The original create_connection function always returns all records.
12:42:15     family = allowed_gai_family()
12:42:15 
12:42:15     try:
12:42:15         host.encode("idna")
12:42:15     except UnicodeError:
12:42:15         raise LocationParseError(f"'{host}', label empty or too long") from None
12:42:15 
12:42:15     for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
12:42:15         af, socktype, proto, canonname, sa = res
12:42:15         sock = None
12:42:15         try:
12:42:15             sock = socket.socket(af, socktype, proto)
12:42:15 
12:42:15             # If provided, set socket level options before connecting.
12:42:15             _set_socket_options(sock, socket_options)
12:42:15 
12:42:15             if timeout is not _DEFAULT_TIMEOUT:
12:42:15                 sock.settimeout(timeout)
12:42:15             if source_address:
12:42:15                 sock.bind(source_address)
12:42:15 >           sock.connect(sa)
12:42:15 E           ConnectionRefusedError: [Errno 111] Connection refused
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
12:42:15 
12:42:15 The above exception was the direct cause of the following exception:
12:42:15 
12:42:15 self = 
12:42:15 method = 'GET'
12:42:15 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK1'
12:42:15 body = None
12:42:15 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='}
12:42:15 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
12:42:15 redirect = False, assert_same_host = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), pool_timeout = None
12:42:15 release_conn = False, chunked = False, body_pos = None, preload_content = False
12:42:15 decode_content = False, response_kw = {}
12:42:15 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK1', query=None, fragment=None)
12:42:15 destination_scheme = None, conn = None, release_this_conn = True
12:42:15 http_tunnel_required = False, err = None, clean_exit = False
12:42:15 
12:42:15     def urlopen(  # type: ignore[override]
12:42:15         self,
12:42:15         method: str,
12:42:15         url: str,
12:42:15         body: _TYPE_BODY | None = None,
12:42:15         headers: typing.Mapping[str, str] | None = None,
12:42:15         retries: Retry | bool | int | None = None,
12:42:15         redirect: bool = True,
12:42:15         assert_same_host: bool = True,
12:42:15         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
12:42:15         pool_timeout: int | None = None,
12:42:15         release_conn: bool | None = None,
12:42:15         chunked: bool = False,
12:42:15         body_pos: _TYPE_BODY_POSITION | None = None,
12:42:15         preload_content: bool = True,
12:42:15         decode_content: bool = True,
12:42:15         **response_kw: typing.Any,
12:42:15     ) -> BaseHTTPResponse:
12:42:15         """
12:42:15         Get a connection from the pool and perform an HTTP request. This is the
12:42:15         lowest level call for making a request, so you'll need to specify all
12:42:15         the raw details.
12:42:15 
12:42:15         .. note::
12:42:15 
12:42:15            More commonly, it's appropriate to use a convenience method
12:42:15            such as :meth:`request`.
12:42:15 
12:42:15         .. note::
12:42:15 
12:42:15            `release_conn` will only behave as expected if
12:42:15            `preload_content=False` because we want to make
12:42:15            `preload_content=False` the default behaviour someday soon without
12:42:15            breaking backwards compatibility.
12:42:15 
12:42:15         :param method:
12:42:15             HTTP request method (such as GET, POST, PUT, etc.)
12:42:15 
12:42:15         :param url:
12:42:15             The URL to perform the request on.
12:42:15 
12:42:15         :param body:
12:42:15             Data to send in the request body, either :class:`str`, :class:`bytes`,
12:42:15             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
12:42:15 
12:42:15         :param headers:
12:42:15             Dictionary of custom headers to send, such as User-Agent,
12:42:15             If-None-Match, etc. If None, pool headers are used. If provided,
12:42:15             these headers completely replace any pool-specific headers.
12:42:15 
12:42:15         :param retries:
12:42:15             Configure the number of retries to allow before raising a
12:42:15             :class:`~urllib3.exceptions.MaxRetryError` exception.
12:42:15 
12:42:15             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
12:42:15             :class:`~urllib3.util.retry.Retry` object for fine-grained control
12:42:15             over different types of retries.
12:42:15             Pass an integer number to retry connection errors that many times,
12:42:15             but no other types of errors. Pass zero to never retry.
12:42:15 
12:42:15         If ``False``, then retries are disabled and any exception is raised
12:42:15             immediately. Also, instead of raising a MaxRetryError on redirects,
12:42:15             the redirect response will be returned.
12:42:15 
12:42:15         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
12:42:15 
12:42:15         :param redirect:
12:42:15             If True, automatically handle redirects (status codes 301, 302,
12:42:15             303, 307, 308). Each redirect counts as a retry. Disabling retries
12:42:15             will disable redirect, too.
12:42:15 
12:42:15         :param assert_same_host:
12:42:15             If ``True``, will make sure that the host of the pool requests is
12:42:15             consistent else will raise HostChangedError. When ``False``, you can
12:42:15             use the pool on an HTTP proxy and request foreign hosts.
12:42:15 
12:42:15         :param timeout:
12:42:15             If specified, overrides the default timeout for this one
12:42:15             request. It may be a float (in seconds) or an instance of
12:42:15             :class:`urllib3.util.Timeout`.
12:42:15 
12:42:15         :param pool_timeout:
12:42:15             If set and the pool is set to block=True, then this method will
12:42:15             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
12:42:15             connection is available within the time period.
12:42:15 
12:42:15         :param bool preload_content:
12:42:15             If True, the response's body will be preloaded into memory.
12:42:15 
12:42:15         :param bool decode_content:
12:42:15             If True, will attempt to decode the body based on the
12:42:15             'content-encoding' header.
12:42:15 
12:42:15         :param release_conn:
12:42:15             If False, then the urlopen call will not release the connection
12:42:15             back into the pool once a response is received (but will release if
12:42:15             you read the entire contents of the response such as when
12:42:15             `preload_content=True`). This is useful if you're not preloading
12:42:15             the response's content immediately. You will need to call
12:42:15             ``r.release_conn()`` on the response ``r`` to return the connection
12:42:15             back into the pool. If None, it takes the value of ``preload_content``
12:42:15             which defaults to ``True``.
12:42:15 
12:42:15         :param bool chunked:
12:42:15             If True, urllib3 will send the body using chunked transfer
12:42:15             encoding. Otherwise, urllib3 will send the body using the standard
12:42:15             content-length form. Defaults to False.
12:42:15 
12:42:15         :param int body_pos:
12:42:15             Position to seek to in file-like body in the event of a retry or
12:42:15             redirect. Typically this won't need to be set because urllib3 will
12:42:15             auto-populate the value when needed.
12:42:15         """
12:42:15         parsed_url = parse_url(url)
12:42:15         destination_scheme = parsed_url.scheme
12:42:15 
12:42:15         if headers is None:
12:42:15             headers = self.headers
12:42:15 
12:42:15         if not isinstance(retries, Retry):
12:42:15             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
12:42:15 
12:42:15         if release_conn is None:
12:42:15             release_conn = preload_content
12:42:15 
12:42:15         # Check host
12:42:15         if assert_same_host and not self.is_same_host(url):
12:42:15             raise HostChangedError(self, url, retries)
12:42:15 
12:42:15         # Ensure that the URL we're connecting to is properly encoded
12:42:15         if url.startswith("/"):
12:42:15             url = to_str(_encode_target(url))
12:42:15         else:
12:42:15             url = to_str(parsed_url.url)
12:42:15 
12:42:15         conn = None
12:42:15 
12:42:15         # Track whether `conn` needs to be released before
12:42:15         # returning/raising/recursing. Update this variable if necessary, and
12:42:15         # leave `release_conn` constant throughout the function. That way, if
12:42:15         # the function recurses, the original value of `release_conn` will be
12:42:15         # passed down into the recursive call, and its value will be respected.
12:42:15         #
12:42:15         # See issue #651 [1] for details.
12:42:15         #
12:42:15         # [1]
12:42:15         release_this_conn = release_conn
12:42:15 
12:42:15         http_tunnel_required = connection_requires_http_tunnel(
12:42:15             self.proxy, self.proxy_config, destination_scheme
12:42:15         )
12:42:15 
12:42:15         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
12:42:15         # have to copy the headers dict so we can safely change it without those
12:42:15         # changes being reflected in anyone else's copy.
12:42:15         if not http_tunnel_required:
12:42:15             headers = headers.copy()  # type: ignore[attr-defined]
12:42:15             headers.update(self.proxy_headers)  # type: ignore[union-attr]
12:42:15 
12:42:15         # Must keep the exception bound to a separate variable or else Python 3
12:42:15         # complains about UnboundLocalError.
12:42:15         err = None
12:42:15 
12:42:15         # Keep track of whether we cleanly exited the except block. This
12:42:15         # ensures we do proper cleanup in finally.
12:42:15         clean_exit = False
12:42:15 
12:42:15         # Rewind body position, if needed. Record current position
12:42:15         # for future rewinds in the event of a redirect/retry.
12:42:15         body_pos = set_file_position(body, body_pos)
12:42:15 
12:42:15         try:
12:42:15             # Request a connection from the queue.
12:42:15             timeout_obj = self._get_timeout(timeout)
12:42:15             conn = self._get_conn(timeout=pool_timeout)
12:42:15 
12:42:15             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
12:42:15 
12:42:15             # Is this a closed/new connection that requires CONNECT tunnelling?
12:42:15             if self.proxy is not None and http_tunnel_required and conn.is_closed:
12:42:15                 try:
12:42:15                     self._prepare_proxy(conn)
12:42:15                 except (BaseSSLError, OSError, SocketTimeout) as e:
12:42:15                     self._raise_timeout(
12:42:15                         err=e, url=self.proxy.url, timeout_value=conn.timeout
12:42:15                     )
12:42:15                     raise
12:42:15 
12:42:15             # If we're going to release the connection in ``finally:``, then
12:42:15             # the response doesn't need to know about the connection. Otherwise
12:42:15             # it will also try to release it and we'll have a double-release
12:42:15             # mess.
12:42:15             response_conn = conn if not release_conn else None
12:42:15 
12:42:15             # Make the request on the HTTPConnection object
12:42:15 >           response = self._make_request(
12:42:15                 conn,
12:42:15                 method,
12:42:15                 url,
12:42:15                 timeout=timeout_obj,
12:42:15                 body=body,
12:42:15                 headers=headers,
12:42:15                 chunked=chunked,
12:42:15                 retries=retries,
12:42:15                 response_conn=response_conn,
12:42:15                 preload_content=preload_content,
12:42:15                 decode_content=decode_content,
12:42:15                 **response_kw,
12:42:15             )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:787:
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:493: in _make_request
12:42:15     conn.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:500: in request
12:42:15     self.endheaders()
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1298: in endheaders
12:42:15     self._send_output(message_body, encode_chunked=encode_chunked)
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1058: in _send_output
12:42:15     self.send(msg)
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:996: in send
12:42:15     self.connect()
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:331: in connect
12:42:15     self.sock = self._new_conn()
12:42:15                 ^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15 
12:42:15         :return: New socket connection.
12:42:15 """ 12:42:15 try: 12:42:15 sock = connection.create_connection( 12:42:15 (self._dns_host, self.port), 12:42:15 self.timeout, 12:42:15 source_address=self.source_address, 12:42:15 socket_options=self.socket_options, 12:42:15 ) 12:42:15 except socket.gaierror as e: 12:42:15 raise NameResolutionError(self.host, self, e) from e 12:42:15 except SocketTimeout as e: 12:42:15 raise ConnectTimeoutError( 12:42:15 self, 12:42:15 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 12:42:15 ) from e 12:42:15 12:42:15 except OSError as e: 12:42:15 > raise NewConnectionError( 12:42:15 self, f"Failed to establish a new connection: {e}" 12:42:15 ) from e 12:42:15 E urllib3.exceptions.NewConnectionError: HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:219: NewConnectionError 12:42:15 12:42:15 The above exception was the direct cause of the following exception: 12:42:15 12:42:15 self = 12:42:15 request = , stream = False 12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None 12:42:15 proxies = OrderedDict() 12:42:15 12:42:15 def send( 12:42:15 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:42:15 ): 12:42:15 """Sends PreparedRequest object. Returns Response object. 12:42:15 12:42:15 :param request: The :class:`PreparedRequest ` being sent. 12:42:15 :param stream: (optional) Whether to stream the request content. 12:42:15 :param timeout: (optional) How long to wait for the server to send 12:42:15 data before giving up, as a float, or a :ref:`(connect timeout, 12:42:15 read timeout) ` tuple. 
12:42:15         :type timeout: float or tuple or urllib3 Timeout object
12:42:15         :param verify: (optional) Either a boolean, in which case it controls whether
12:42:15             we verify the server's TLS certificate, or a string, in which case it
12:42:15             must be a path to a CA bundle to use
12:42:15         :param cert: (optional) Any user-provided SSL certificate to be trusted.
12:42:15         :param proxies: (optional) The proxies dictionary to apply to the request.
12:42:15         :rtype: requests.Response
12:42:15         """
12:42:15 
12:42:15         try:
12:42:15             conn = self.get_connection_with_tls_context(
12:42:15                 request, verify, proxies=proxies, cert=cert
12:42:15             )
12:42:15         except LocationValueError as e:
12:42:15             raise InvalidURL(e, request=request)
12:42:15 
12:42:15         self.cert_verify(conn, request.url, verify, cert)
12:42:15         url = self.request_url(request, proxies)
12:42:15         self.add_headers(
12:42:15             request,
12:42:15             stream=stream,
12:42:15             timeout=timeout,
12:42:15             verify=verify,
12:42:15             cert=cert,
12:42:15             proxies=proxies,
12:42:15         )
12:42:15 
12:42:15         chunked = not (request.body is None or "Content-Length" in request.headers)
12:42:15 
12:42:15         if isinstance(timeout, tuple):
12:42:15             try:
12:42:15                 connect, read = timeout
12:42:15                 timeout = TimeoutSauce(connect=connect, read=read)
12:42:15             except ValueError:
12:42:15                 raise ValueError(
12:42:15                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
12:42:15                     f"or a single float to set both timeouts to the same value."
12:42:15                 )
12:42:15         elif isinstance(timeout, TimeoutSauce):
12:42:15             pass
12:42:15         else:
12:42:15             timeout = TimeoutSauce(connect=timeout, read=timeout)
12:42:15 
12:42:15         try:
12:42:15 >           resp = conn.urlopen(
12:42:15                 method=request.method,
12:42:15                 url=url,
12:42:15                 body=request.body,
12:42:15                 headers=request.headers,
12:42:15                 redirect=False,
12:42:15                 assert_same_host=False,
12:42:15                 preload_content=False,
12:42:15                 decode_content=False,
12:42:15                 retries=self.max_retries,
12:42:15                 timeout=timeout,
12:42:15                 chunked=chunked,
12:42:15             )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:644:
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen
12:42:15     retries = retries.increment(
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
12:42:15 
12:42:15 self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
12:42:15 method = 'GET'
12:42:15 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK1'
12:42:15 response = None
12:42:15 error = NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused")
12:42:15 _pool = 
12:42:15 _stacktrace = 
12:42:15 
12:42:15     def increment(
12:42:15         self,
12:42:15         method: str | None = None,
12:42:15         url: str | None = None,
12:42:15         response: BaseHTTPResponse | None = None,
12:42:15         error: Exception | None = None,
12:42:15         _pool: ConnectionPool | None = None,
12:42:15         _stacktrace: TracebackType | None = None,
12:42:15     ) -> Self:
12:42:15         """Return a new Retry object with incremented retry counters.
12:42:15 
12:42:15         :param response: A response object, or None, if the server did not
12:42:15             return a response.
12:42:15         :type response: :class:`~urllib3.response.BaseHTTPResponse`
12:42:15         :param Exception error: An error encountered during the request, or
12:42:15             None if the response was received successfully.
12:42:15 
12:42:15         :return: A new ``Retry`` object.
12:42:15         """
12:42:15         if self.total is False and error:
12:42:15             # Disabled, indicate to re-raise the error.
12:42:15             raise reraise(type(error), error, _stacktrace)
12:42:15 
12:42:15         total = self.total
12:42:15         if total is not None:
12:42:15             total -= 1
12:42:15 
12:42:15         connect = self.connect
12:42:15         read = self.read
12:42:15         redirect = self.redirect
12:42:15         status_count = self.status
12:42:15         other = self.other
12:42:15         cause = "unknown"
12:42:15         status = None
12:42:15         redirect_location = None
12:42:15 
12:42:15         if error and self._is_connection_error(error):
12:42:15             # Connect retry?
12:42:15             if connect is False:
12:42:15                 raise reraise(type(error), error, _stacktrace)
12:42:15             elif connect is not None:
12:42:15                 connect -= 1
12:42:15 
12:42:15         elif error and self._is_read_error(error):
12:42:15             # Read retry?
12:42:15             if read is False or method is None or not self._is_method_retryable(method):
12:42:15                 raise reraise(type(error), error, _stacktrace)
12:42:15             elif read is not None:
12:42:15                 read -= 1
12:42:15 
12:42:15         elif error:
12:42:15             # Other retry?
12:42:15             if other is not None:
12:42:15                 other -= 1
12:42:15 
12:42:15         elif response and response.get_redirect_location():
12:42:15             # Redirect retry?
12:42:15             if redirect is not None:
12:42:15                 redirect -= 1
12:42:15             cause = "too many redirects"
12:42:15             response_redirect_location = response.get_redirect_location()
12:42:15             if response_redirect_location:
12:42:15                 redirect_location = response_redirect_location
12:42:15             status = response.status
12:42:15 
12:42:15         else:
12:42:15             # Incrementing because of a server error like a 500 in
12:42:15             # status_forcelist and the given method is in the allowed_methods
12:42:15             cause = ResponseError.GENERIC_ERROR
12:42:15             if response and response.status:
12:42:15                 if status_count is not None:
12:42:15                     status_count -= 1
12:42:15                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
12:42:15                 status = response.status
12:42:15 
12:42:15         history = self.history + (
12:42:15             RequestHistory(method, url, error, status, redirect_location),
12:42:15         )
12:42:15 
12:42:15         new_retry = self.new(
12:42:15             total=total,
12:42:15             connect=connect,
12:42:15             read=read,
12:42:15             redirect=redirect,
12:42:15             status=status_count,
12:42:15             other=other,
12:42:15             history=history,
12:42:15         )
12:42:15 
12:42:15         if new_retry.is_exhausted():
12:42:15             reason = error or ResponseError(cause)
12:42:15 >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
12:42:15             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK1 (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:535: MaxRetryError
12:42:15 
12:42:15 During handling of the above exception, another exception occurred:
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def test_10_xpdr_portmapping_NETWORK1(self):
12:42:15 >       response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-NETWORK1")
12:42:15                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 
12:42:15 transportpce_tests/1.2.1/test01_portmapping.py:123:
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
12:42:15 transportpce_tests/common/test_utils.py:519: in get_portmapping_node_attr
12:42:15     response = get_request(target_url)
12:42:15                ^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 transportpce_tests/common/test_utils.py:117: in get_request
12:42:15     return requests.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request
12:42:15     return session.request(method=method, url=url, **kwargs)
12:42:15            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request
12:42:15     resp = self.send(prep, **send_kwargs)
12:42:15            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send
12:42:15     r = adapter.send(request, **kwargs)
12:42:15         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
12:42:15 
12:42:15 self = 
12:42:15 request = , stream = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
12:42:15 proxies = OrderedDict()
12:42:15 
12:42:15     def send(
12:42:15         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
12:42:15     ):
12:42:15         """Sends PreparedRequest object. Returns Response object.
12:42:15 
12:42:15         :param request: The :class:`PreparedRequest ` being sent.
12:42:15         :param stream: (optional) Whether to stream the request content.
12:42:15         :param timeout: (optional) How long to wait for the server to send
12:42:15             data before giving up, as a float, or a :ref:`(connect timeout,
12:42:15             read timeout) ` tuple.
12:42:15         :type timeout: float or tuple or urllib3 Timeout object
12:42:15         :param verify: (optional) Either a boolean, in which case it controls whether
12:42:15             we verify the server's TLS certificate, or a string, in which case it
12:42:15             must be a path to a CA bundle to use
12:42:15         :param cert: (optional) Any user-provided SSL certificate to be trusted.
12:42:15         :param proxies: (optional) The proxies dictionary to apply to the request.
12:42:15         :rtype: requests.Response
12:42:15         """
12:42:15 
12:42:15         try:
12:42:15             conn = self.get_connection_with_tls_context(
12:42:15                 request, verify, proxies=proxies, cert=cert
12:42:15             )
12:42:15         except LocationValueError as e:
12:42:15             raise InvalidURL(e, request=request)
12:42:15 
12:42:15         self.cert_verify(conn, request.url, verify, cert)
12:42:15         url = self.request_url(request, proxies)
12:42:15         self.add_headers(
12:42:15             request,
12:42:15             stream=stream,
12:42:15             timeout=timeout,
12:42:15             verify=verify,
12:42:15             cert=cert,
12:42:15             proxies=proxies,
12:42:15         )
12:42:15 
12:42:15         chunked = not (request.body is None or "Content-Length" in request.headers)
12:42:15 
12:42:15         if isinstance(timeout, tuple):
12:42:15             try:
12:42:15                 connect, read = timeout
12:42:15                 timeout = TimeoutSauce(connect=connect, read=read)
12:42:15             except ValueError:
12:42:15                 raise ValueError(
12:42:15                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
12:42:15                     f"or a single float to set both timeouts to the same value."
12:42:15                 )
12:42:15         elif isinstance(timeout, TimeoutSauce):
12:42:15             pass
12:42:15         else:
12:42:15             timeout = TimeoutSauce(connect=timeout, read=timeout)
12:42:15 
12:42:15         try:
12:42:15             resp = conn.urlopen(
12:42:15                 method=request.method,
12:42:15                 url=url,
12:42:15                 body=request.body,
12:42:15                 headers=request.headers,
12:42:15                 redirect=False,
12:42:15                 assert_same_host=False,
12:42:15                 preload_content=False,
12:42:15                 decode_content=False,
12:42:15                 retries=self.max_retries,
12:42:15                 timeout=timeout,
12:42:15                 chunked=chunked,
12:42:15             )
12:42:15 
12:42:15         except (ProtocolError, OSError) as err:
12:42:15             raise ConnectionError(err, request=request)
12:42:15 
12:42:15         except MaxRetryError as e:
12:42:15             if isinstance(e.reason, ConnectTimeoutError):
12:42:15                 # TODO: Remove this in 3.0.0: see #2811
12:42:15                 if not isinstance(e.reason, NewConnectionError):
12:42:15                     raise ConnectTimeout(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, ResponseError):
12:42:15                 raise RetryError(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, _ProxyError):
12:42:15                 raise ProxyError(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, _SSLError):
12:42:15                 # This branch is for urllib3 v1.22 and later.
12:42:15                 raise SSLError(e, request=request)
12:42:15 
12:42:15 >           raise ConnectionError(e, request=request)
12:42:15 E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK1 (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:677: ConnectionError
12:42:15 ----------------------------- Captured stdout call -----------------------------
12:42:15 execution of test_10_xpdr_portmapping_NETWORK1
12:42:15 ________ TestTransportPCEPortmapping.test_11_xpdr_portmapping_NETWORK2 _________
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15 
12:42:15         :return: New socket connection.
12:42:15 """ 12:42:15 try: 12:42:15 > sock = connection.create_connection( 12:42:15 (self._dns_host, self.port), 12:42:15 self.timeout, 12:42:15 source_address=self.source_address, 12:42:15 socket_options=self.socket_options, 12:42:15 ) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:204: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 12:42:15 raise err 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 address = ('localhost', 8191), timeout = 30, source_address = None 12:42:15 socket_options = [(6, 1, 1)] 12:42:15 12:42:15 def create_connection( 12:42:15 address: tuple[str, int], 12:42:15 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:42:15 source_address: tuple[str, int] | None = None, 12:42:15 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 12:42:15 ) -> socket.socket: 12:42:15 """Connect to *address* and return the socket object. 12:42:15 12:42:15 Convenience function. Connect to *address* (a 2-tuple ``(host, 12:42:15 port)``) and return the socket object. Passing the optional 12:42:15 *timeout* parameter will set the timeout on the socket instance 12:42:15 before attempting to connect. If no *timeout* is supplied, the 12:42:15 global default timeout setting returned by :func:`socket.getdefaulttimeout` 12:42:15 is used. If *source_address* is set it must be a tuple of (host, port) 12:42:15 for the socket to bind as a source address before making the connection. 12:42:15 An host of '' or port 0 tells the OS to use the default. 
12:42:15 """ 12:42:15 12:42:15 host, port = address 12:42:15 if host.startswith("["): 12:42:15 host = host.strip("[]") 12:42:15 err = None 12:42:15 12:42:15 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 12:42:15 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 12:42:15 # The original create_connection function always returns all records. 12:42:15 family = allowed_gai_family() 12:42:15 12:42:15 try: 12:42:15 host.encode("idna") 12:42:15 except UnicodeError: 12:42:15 raise LocationParseError(f"'{host}', label empty or too long") from None 12:42:15 12:42:15 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 12:42:15 af, socktype, proto, canonname, sa = res 12:42:15 sock = None 12:42:15 try: 12:42:15 sock = socket.socket(af, socktype, proto) 12:42:15 12:42:15 # If provided, set socket level options before connecting. 12:42:15 _set_socket_options(sock, socket_options) 12:42:15 12:42:15 if timeout is not _DEFAULT_TIMEOUT: 12:42:15 sock.settimeout(timeout) 12:42:15 if source_address: 12:42:15 sock.bind(source_address) 12:42:15 > sock.connect(sa) 12:42:15 E ConnectionRefusedError: [Errno 111] Connection refused 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 12:42:15 12:42:15 The above exception was the direct cause of the following exception: 12:42:15 12:42:15 self = 12:42:15 method = 'GET' 12:42:15 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK2' 12:42:15 body = None 12:42:15 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 12:42:15 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:42:15 redirect = False, assert_same_host = False 12:42:15 timeout = Timeout(connect=30, read=30, 
total=None), pool_timeout = None 12:42:15 release_conn = False, chunked = False, body_pos = None, preload_content = False 12:42:15 decode_content = False, response_kw = {} 12:42:15 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK2', query=None, fragment=None) 12:42:15 destination_scheme = None, conn = None, release_this_conn = True 12:42:15 http_tunnel_required = False, err = None, clean_exit = False 12:42:15 12:42:15 def urlopen( # type: ignore[override] 12:42:15 self, 12:42:15 method: str, 12:42:15 url: str, 12:42:15 body: _TYPE_BODY | None = None, 12:42:15 headers: typing.Mapping[str, str] | None = None, 12:42:15 retries: Retry | bool | int | None = None, 12:42:15 redirect: bool = True, 12:42:15 assert_same_host: bool = True, 12:42:15 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:42:15 pool_timeout: int | None = None, 12:42:15 release_conn: bool | None = None, 12:42:15 chunked: bool = False, 12:42:15 body_pos: _TYPE_BODY_POSITION | None = None, 12:42:15 preload_content: bool = True, 12:42:15 decode_content: bool = True, 12:42:15 **response_kw: typing.Any, 12:42:15 ) -> BaseHTTPResponse: 12:42:15 """ 12:42:15 Get a connection from the pool and perform an HTTP request. This is the 12:42:15 lowest level call for making a request, so you'll need to specify all 12:42:15 the raw details. 12:42:15 12:42:15 .. note:: 12:42:15 12:42:15 More commonly, it's appropriate to use a convenience method 12:42:15 such as :meth:`request`. 12:42:15 12:42:15 .. note:: 12:42:15 12:42:15 `release_conn` will only behave as expected if 12:42:15 `preload_content=False` because we want to make 12:42:15 `preload_content=False` the default behaviour someday soon without 12:42:15 breaking backwards compatibility. 12:42:15 12:42:15 :param method: 12:42:15 HTTP request method (such as GET, POST, PUT, etc.) 12:42:15 12:42:15 :param url: 12:42:15 The URL to perform the request on. 
12:42:15 
12:42:15         :param body:
12:42:15             Data to send in the request body, either :class:`str`, :class:`bytes`,
12:42:15             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
12:42:15 
12:42:15         :param headers:
12:42:15             Dictionary of custom headers to send, such as User-Agent,
12:42:15             If-None-Match, etc. If None, pool headers are used. If provided,
12:42:15             these headers completely replace any pool-specific headers.
12:42:15 
12:42:15         :param retries:
12:42:15             Configure the number of retries to allow before raising a
12:42:15             :class:`~urllib3.exceptions.MaxRetryError` exception.
12:42:15 
12:42:15             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
12:42:15             :class:`~urllib3.util.retry.Retry` object for fine-grained control
12:42:15             over different types of retries.
12:42:15             Pass an integer number to retry connection errors that many times,
12:42:15             but no other types of errors. Pass zero to never retry.
12:42:15 
12:42:15             If ``False``, then retries are disabled and any exception is raised
12:42:15             immediately. Also, instead of raising a MaxRetryError on redirects,
12:42:15             the redirect response will be returned.
12:42:15 
12:42:15         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
12:42:15 
12:42:15         :param redirect:
12:42:15             If True, automatically handle redirects (status codes 301, 302,
12:42:15             303, 307, 308). Each redirect counts as a retry. Disabling retries
12:42:15             will disable redirect, too.
12:42:15 
12:42:15         :param assert_same_host:
12:42:15             If ``True``, will make sure that the host of the pool requests is
12:42:15             consistent else will raise HostChangedError. When ``False``, you can
12:42:15             use the pool on an HTTP proxy and request foreign hosts.
12:42:15 
12:42:15         :param timeout:
12:42:15             If specified, overrides the default timeout for this one
12:42:15             request. It may be a float (in seconds) or an instance of
12:42:15             :class:`urllib3.util.Timeout`.
12:42:15 
12:42:15         :param pool_timeout:
12:42:15             If set and the pool is set to block=True, then this method will
12:42:15             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
12:42:15             connection is available within the time period.
12:42:15 
12:42:15         :param bool preload_content:
12:42:15             If True, the response's body will be preloaded into memory.
12:42:15 
12:42:15         :param bool decode_content:
12:42:15             If True, will attempt to decode the body based on the
12:42:15             'content-encoding' header.
12:42:15 
12:42:15         :param release_conn:
12:42:15             If False, then the urlopen call will not release the connection
12:42:15             back into the pool once a response is received (but will release if
12:42:15             you read the entire contents of the response such as when
12:42:15             `preload_content=True`). This is useful if you're not preloading
12:42:15             the response's content immediately. You will need to call
12:42:15             ``r.release_conn()`` on the response ``r`` to return the connection
12:42:15             back into the pool. If None, it takes the value of ``preload_content``
12:42:15             which defaults to ``True``.
12:42:15 
12:42:15         :param bool chunked:
12:42:15             If True, urllib3 will send the body using chunked transfer
12:42:15             encoding. Otherwise, urllib3 will send the body using the standard
12:42:15             content-length form. Defaults to False.
12:42:15 
12:42:15         :param int body_pos:
12:42:15             Position to seek to in file-like body in the event of a retry or
12:42:15             redirect. Typically this won't need to be set because urllib3 will
12:42:15             auto-populate the value when needed.
12:42:15 """ 12:42:15 parsed_url = parse_url(url) 12:42:15 destination_scheme = parsed_url.scheme 12:42:15 12:42:15 if headers is None: 12:42:15 headers = self.headers 12:42:15 12:42:15 if not isinstance(retries, Retry): 12:42:15 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 12:42:15 12:42:15 if release_conn is None: 12:42:15 release_conn = preload_content 12:42:15 12:42:15 # Check host 12:42:15 if assert_same_host and not self.is_same_host(url): 12:42:15 raise HostChangedError(self, url, retries) 12:42:15 12:42:15 # Ensure that the URL we're connecting to is properly encoded 12:42:15 if url.startswith("/"): 12:42:15 url = to_str(_encode_target(url)) 12:42:15 else: 12:42:15 url = to_str(parsed_url.url) 12:42:15 12:42:15 conn = None 12:42:15 12:42:15 # Track whether `conn` needs to be released before 12:42:15 # returning/raising/recursing. Update this variable if necessary, and 12:42:15 # leave `release_conn` constant throughout the function. That way, if 12:42:15 # the function recurses, the original value of `release_conn` will be 12:42:15 # passed down into the recursive call, and its value will be respected. 12:42:15 # 12:42:15 # See issue #651 [1] for details. 12:42:15 # 12:42:15 # [1] 12:42:15 release_this_conn = release_conn 12:42:15 12:42:15 http_tunnel_required = connection_requires_http_tunnel( 12:42:15 self.proxy, self.proxy_config, destination_scheme 12:42:15 ) 12:42:15 12:42:15 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 12:42:15 # have to copy the headers dict so we can safely change it without those 12:42:15 # changes being reflected in anyone else's copy. 12:42:15 if not http_tunnel_required: 12:42:15 headers = headers.copy() # type: ignore[attr-defined] 12:42:15 headers.update(self.proxy_headers) # type: ignore[union-attr] 12:42:15 12:42:15 # Must keep the exception bound to a separate variable or else Python 3 12:42:15 # complains about UnboundLocalError. 
12:42:15         err = None
12:42:15 
12:42:15         # Keep track of whether we cleanly exited the except block. This
12:42:15         # ensures we do proper cleanup in finally.
12:42:15         clean_exit = False
12:42:15 
12:42:15         # Rewind body position, if needed. Record current position
12:42:15         # for future rewinds in the event of a redirect/retry.
12:42:15         body_pos = set_file_position(body, body_pos)
12:42:15 
12:42:15         try:
12:42:15             # Request a connection from the queue.
12:42:15             timeout_obj = self._get_timeout(timeout)
12:42:15             conn = self._get_conn(timeout=pool_timeout)
12:42:15 
12:42:15             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
12:42:15 
12:42:15             # Is this a closed/new connection that requires CONNECT tunnelling?
12:42:15             if self.proxy is not None and http_tunnel_required and conn.is_closed:
12:42:15                 try:
12:42:15                     self._prepare_proxy(conn)
12:42:15                 except (BaseSSLError, OSError, SocketTimeout) as e:
12:42:15                     self._raise_timeout(
12:42:15                         err=e, url=self.proxy.url, timeout_value=conn.timeout
12:42:15                     )
12:42:15                     raise
12:42:15 
12:42:15             # If we're going to release the connection in ``finally:``, then
12:42:15             # the response doesn't need to know about the connection. Otherwise
12:42:15             # it will also try to release it and we'll have a double-release
12:42:15             # mess.
12:42:15             response_conn = conn if not release_conn else None
12:42:15 
12:42:15             # Make the request on the HTTPConnection object
12:42:15 >           response = self._make_request(
12:42:15                 conn,
12:42:15                 method,
12:42:15                 url,
12:42:15                 timeout=timeout_obj,
12:42:15                 body=body,
12:42:15                 headers=headers,
12:42:15                 chunked=chunked,
12:42:15                 retries=retries,
12:42:15                 response_conn=response_conn,
12:42:15                 preload_content=preload_content,
12:42:15                 decode_content=decode_content,
12:42:15                 **response_kw,
12:42:15             )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:787:
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:493: in _make_request
12:42:15     conn.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:500: in request
12:42:15     self.endheaders()
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1298: in endheaders
12:42:15     self._send_output(message_body, encode_chunked=encode_chunked)
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1058: in _send_output
12:42:15     self.send(msg)
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:996: in send
12:42:15     self.connect()
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:331: in connect
12:42:15     self.sock = self._new_conn()
12:42:15                 ^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15 
12:42:15         :return: New socket connection.
12:42:15         """
12:42:15         try:
12:42:15             sock = connection.create_connection(
12:42:15                 (self._dns_host, self.port),
12:42:15                 self.timeout,
12:42:15                 source_address=self.source_address,
12:42:15                 socket_options=self.socket_options,
12:42:15             )
12:42:15         except socket.gaierror as e:
12:42:15             raise NameResolutionError(self.host, self, e) from e
12:42:15         except SocketTimeout as e:
12:42:15             raise ConnectTimeoutError(
12:42:15                 self,
12:42:15                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
12:42:15             ) from e
12:42:15 
12:42:15         except OSError as e:
12:42:15 >           raise NewConnectionError(
12:42:15                 self, f"Failed to establish a new connection: {e}"
12:42:15             ) from e
12:42:15 E           urllib3.exceptions.NewConnectionError: HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:219: NewConnectionError
12:42:15 
12:42:15 The above exception was the direct cause of the following exception:
12:42:15 
12:42:15 self = 
12:42:15 request = , stream = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
12:42:15 proxies = OrderedDict()
12:42:15 
12:42:15     def send(
12:42:15         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
12:42:15     ):
12:42:15         """Sends PreparedRequest object. Returns Response object.
12:42:15 
12:42:15         :param request: The :class:`PreparedRequest ` being sent.
12:42:15         :param stream: (optional) Whether to stream the request content.
12:42:15         :param timeout: (optional) How long to wait for the server to send
12:42:15             data before giving up, as a float, or a :ref:`(connect timeout,
12:42:15             read timeout) ` tuple.
12:42:15         :type timeout: float or tuple or urllib3 Timeout object
12:42:15         :param verify: (optional) Either a boolean, in which case it controls whether
12:42:15             we verify the server's TLS certificate, or a string, in which case it
12:42:15             must be a path to a CA bundle to use
12:42:15         :param cert: (optional) Any user-provided SSL certificate to be trusted.
12:42:15         :param proxies: (optional) The proxies dictionary to apply to the request.
12:42:15         :rtype: requests.Response
12:42:15         """
12:42:15 
12:42:15         try:
12:42:15             conn = self.get_connection_with_tls_context(
12:42:15                 request, verify, proxies=proxies, cert=cert
12:42:15             )
12:42:15         except LocationValueError as e:
12:42:15             raise InvalidURL(e, request=request)
12:42:15 
12:42:15         self.cert_verify(conn, request.url, verify, cert)
12:42:15         url = self.request_url(request, proxies)
12:42:15         self.add_headers(
12:42:15             request,
12:42:15             stream=stream,
12:42:15             timeout=timeout,
12:42:15             verify=verify,
12:42:15             cert=cert,
12:42:15             proxies=proxies,
12:42:15         )
12:42:15 
12:42:15         chunked = not (request.body is None or "Content-Length" in request.headers)
12:42:15 
12:42:15         if isinstance(timeout, tuple):
12:42:15             try:
12:42:15                 connect, read = timeout
12:42:15                 timeout = TimeoutSauce(connect=connect, read=read)
12:42:15             except ValueError:
12:42:15                 raise ValueError(
12:42:15                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
12:42:15                     f"or a single float to set both timeouts to the same value."
12:42:15                 )
12:42:15         elif isinstance(timeout, TimeoutSauce):
12:42:15             pass
12:42:15         else:
12:42:15             timeout = TimeoutSauce(connect=timeout, read=timeout)
12:42:15 
12:42:15         try:
12:42:15 >           resp = conn.urlopen(
12:42:15                 method=request.method,
12:42:15                 url=url,
12:42:15                 body=request.body,
12:42:15                 headers=request.headers,
12:42:15                 redirect=False,
12:42:15                 assert_same_host=False,
12:42:15                 preload_content=False,
12:42:15                 decode_content=False,
12:42:15                 retries=self.max_retries,
12:42:15                 timeout=timeout,
12:42:15                 chunked=chunked,
12:42:15             )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:644: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen
12:42:15     retries = retries.increment(
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
12:42:15 method = 'GET'
12:42:15 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK2'
12:42:15 response = None
12:42:15 error = NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused")
12:42:15 _pool = 
12:42:15 _stacktrace = 
12:42:15 
12:42:15     def increment(
12:42:15         self,
12:42:15         method: str | None = None,
12:42:15         url: str | None = None,
12:42:15         response: BaseHTTPResponse | None = None,
12:42:15         error: Exception | None = None,
12:42:15         _pool: ConnectionPool | None = None,
12:42:15         _stacktrace: TracebackType | None = None,
12:42:15     ) -> Self:
12:42:15         """Return a new Retry object with incremented retry counters.
12:42:15 
12:42:15         :param response: A response object, or None, if the server did not
12:42:15             return a response.
12:42:15         :type response: :class:`~urllib3.response.BaseHTTPResponse`
12:42:15         :param Exception error: An error encountered during the request, or
12:42:15             None if the response was received successfully.
12:42:15 
12:42:15         :return: A new ``Retry`` object.
12:42:15         """
12:42:15         if self.total is False and error:
12:42:15             # Disabled, indicate to re-raise the error.
12:42:15             raise reraise(type(error), error, _stacktrace)
12:42:15 
12:42:15         total = self.total
12:42:15         if total is not None:
12:42:15             total -= 1
12:42:15 
12:42:15         connect = self.connect
12:42:15         read = self.read
12:42:15         redirect = self.redirect
12:42:15         status_count = self.status
12:42:15         other = self.other
12:42:15         cause = "unknown"
12:42:15         status = None
12:42:15         redirect_location = None
12:42:15 
12:42:15         if error and self._is_connection_error(error):
12:42:15             # Connect retry?
12:42:15             if connect is False:
12:42:15                 raise reraise(type(error), error, _stacktrace)
12:42:15             elif connect is not None:
12:42:15                 connect -= 1
12:42:15 
12:42:15         elif error and self._is_read_error(error):
12:42:15             # Read retry?
12:42:15             if read is False or method is None or not self._is_method_retryable(method):
12:42:15                 raise reraise(type(error), error, _stacktrace)
12:42:15             elif read is not None:
12:42:15                 read -= 1
12:42:15 
12:42:15         elif error:
12:42:15             # Other retry?
12:42:15             if other is not None:
12:42:15                 other -= 1
12:42:15 
12:42:15         elif response and response.get_redirect_location():
12:42:15             # Redirect retry?
12:42:15             if redirect is not None:
12:42:15                 redirect -= 1
12:42:15             cause = "too many redirects"
12:42:15             response_redirect_location = response.get_redirect_location()
12:42:15             if response_redirect_location:
12:42:15                 redirect_location = response_redirect_location
12:42:15             status = response.status
12:42:15 
12:42:15         else:
12:42:15             # Incrementing because of a server error like a 500 in
12:42:15             # status_forcelist and the given method is in the allowed_methods
12:42:15             cause = ResponseError.GENERIC_ERROR
12:42:15             if response and response.status:
12:42:15                 if status_count is not None:
12:42:15                     status_count -= 1
12:42:15                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
12:42:15                 status = response.status
12:42:15 
12:42:15         history = self.history + (
12:42:15             RequestHistory(method, url, error, status, redirect_location),
12:42:15         )
12:42:15 
12:42:15         new_retry = self.new(
12:42:15             total=total,
12:42:15             connect=connect,
12:42:15             read=read,
12:42:15             redirect=redirect,
12:42:15             status=status_count,
12:42:15             other=other,
12:42:15             history=history,
12:42:15         )
12:42:15 
12:42:15         if new_retry.is_exhausted():
12:42:15             reason = error or ResponseError(cause)
12:42:15 >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
12:42:15             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK2 (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:535: MaxRetryError
12:42:15 
12:42:15 During handling of the above exception, another exception occurred:
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def test_11_xpdr_portmapping_NETWORK2(self):
12:42:15 >       response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-NETWORK2")
12:42:15                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 
12:42:15 transportpce_tests/1.2.1/test01_portmapping.py:135: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 transportpce_tests/common/test_utils.py:519: in get_portmapping_node_attr
12:42:15     response = get_request(target_url)
12:42:15                ^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 transportpce_tests/common/test_utils.py:117: in get_request
12:42:15     return requests.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request
12:42:15     return session.request(method=method, url=url, **kwargs)
12:42:15            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request
12:42:15     resp = self.send(prep, **send_kwargs)
12:42:15            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send
12:42:15     r = adapter.send(request, **kwargs)
12:42:15         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 self = 
12:42:15 request = , stream = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
12:42:15 proxies = OrderedDict()
12:42:15 
12:42:15     def send(
12:42:15         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
12:42:15     ):
12:42:15         """Sends PreparedRequest object. Returns Response object.
12:42:15 
12:42:15         :param request: The :class:`PreparedRequest ` being sent.
12:42:15         :param stream: (optional) Whether to stream the request content.
12:42:15         :param timeout: (optional) How long to wait for the server to send
12:42:15             data before giving up, as a float, or a :ref:`(connect timeout,
12:42:15             read timeout) ` tuple.
12:42:15         :type timeout: float or tuple or urllib3 Timeout object
12:42:15         :param verify: (optional) Either a boolean, in which case it controls whether
12:42:15             we verify the server's TLS certificate, or a string, in which case it
12:42:15             must be a path to a CA bundle to use
12:42:15         :param cert: (optional) Any user-provided SSL certificate to be trusted.
12:42:15         :param proxies: (optional) The proxies dictionary to apply to the request.
12:42:15         :rtype: requests.Response
12:42:15         """
12:42:15 
12:42:15         try:
12:42:15             conn = self.get_connection_with_tls_context(
12:42:15                 request, verify, proxies=proxies, cert=cert
12:42:15             )
12:42:15         except LocationValueError as e:
12:42:15             raise InvalidURL(e, request=request)
12:42:15 
12:42:15         self.cert_verify(conn, request.url, verify, cert)
12:42:15         url = self.request_url(request, proxies)
12:42:15         self.add_headers(
12:42:15             request,
12:42:15             stream=stream,
12:42:15             timeout=timeout,
12:42:15             verify=verify,
12:42:15             cert=cert,
12:42:15             proxies=proxies,
12:42:15         )
12:42:15 
12:42:15         chunked = not (request.body is None or "Content-Length" in request.headers)
12:42:15 
12:42:15         if isinstance(timeout, tuple):
12:42:15             try:
12:42:15                 connect, read = timeout
12:42:15                 timeout = TimeoutSauce(connect=connect, read=read)
12:42:15             except ValueError:
12:42:15                 raise ValueError(
12:42:15                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
12:42:15                     f"or a single float to set both timeouts to the same value."
12:42:15                 )
12:42:15         elif isinstance(timeout, TimeoutSauce):
12:42:15             pass
12:42:15         else:
12:42:15             timeout = TimeoutSauce(connect=timeout, read=timeout)
12:42:15 
12:42:15         try:
12:42:15             resp = conn.urlopen(
12:42:15                 method=request.method,
12:42:15                 url=url,
12:42:15                 body=request.body,
12:42:15                 headers=request.headers,
12:42:15                 redirect=False,
12:42:15                 assert_same_host=False,
12:42:15                 preload_content=False,
12:42:15                 decode_content=False,
12:42:15                 retries=self.max_retries,
12:42:15                 timeout=timeout,
12:42:15                 chunked=chunked,
12:42:15             )
12:42:15 
12:42:15         except (ProtocolError, OSError) as err:
12:42:15             raise ConnectionError(err, request=request)
12:42:15 
12:42:15         except MaxRetryError as e:
12:42:15             if isinstance(e.reason, ConnectTimeoutError):
12:42:15                 # TODO: Remove this in 3.0.0: see #2811
12:42:15                 if not isinstance(e.reason, NewConnectionError):
12:42:15                     raise ConnectTimeout(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, ResponseError):
12:42:15                 raise RetryError(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, _ProxyError):
12:42:15                 raise ProxyError(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, _SSLError):
12:42:15                 # This branch is for urllib3 v1.22 and later.
12:42:15                 raise SSLError(e, request=request)
12:42:15 
12:42:15 >           raise ConnectionError(e, request=request)
12:42:15 E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-NETWORK2 (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:677: ConnectionError
12:42:15 ----------------------------- Captured stdout call -----------------------------
12:42:15 execution of test_11_xpdr_portmapping_NETWORK2
12:42:15 _________ TestTransportPCEPortmapping.test_12_xpdr_portmapping_CLIENT1 _________
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15 
12:42:15         :return: New socket connection.
12:42:15         """
12:42:15         try:
12:42:15 >           sock = connection.create_connection(
12:42:15                 (self._dns_host, self.port),
12:42:15                 self.timeout,
12:42:15                 source_address=self.source_address,
12:42:15                 socket_options=self.socket_options,
12:42:15             )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:204: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection
12:42:15     raise err
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 address = ('localhost', 8191), timeout = 30, source_address = None
12:42:15 socket_options = [(6, 1, 1)]
12:42:15 
12:42:15     def create_connection(
12:42:15         address: tuple[str, int],
12:42:15         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
12:42:15         source_address: tuple[str, int] | None = None,
12:42:15         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
12:42:15     ) -> socket.socket:
12:42:15         """Connect to *address* and return the socket object.
12:42:15 
12:42:15         Convenience function.  Connect to *address* (a 2-tuple ``(host,
12:42:15         port)``) and return the socket object.  Passing the optional
12:42:15         *timeout* parameter will set the timeout on the socket instance
12:42:15         before attempting to connect.  If no *timeout* is supplied, the
12:42:15         global default timeout setting returned by :func:`socket.getdefaulttimeout`
12:42:15         is used.  If *source_address* is set it must be a tuple of (host, port)
12:42:15         for the socket to bind as a source address before making the connection.
12:42:15         An host of '' or port 0 tells the OS to use the default.
12:42:15         """
12:42:15 
12:42:15         host, port = address
12:42:15         if host.startswith("["):
12:42:15             host = host.strip("[]")
12:42:15         err = None
12:42:15 
12:42:15         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
12:42:15         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
12:42:15         # The original create_connection function always returns all records.
12:42:15         family = allowed_gai_family()
12:42:15 
12:42:15         try:
12:42:15             host.encode("idna")
12:42:15         except UnicodeError:
12:42:15             raise LocationParseError(f"'{host}', label empty or too long") from None
12:42:15 
12:42:15         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
12:42:15             af, socktype, proto, canonname, sa = res
12:42:15             sock = None
12:42:15             try:
12:42:15                 sock = socket.socket(af, socktype, proto)
12:42:15 
12:42:15                 # If provided, set socket level options before connecting.
12:42:15                 _set_socket_options(sock, socket_options)
12:42:15 
12:42:15                 if timeout is not _DEFAULT_TIMEOUT:
12:42:15                     sock.settimeout(timeout)
12:42:15                 if source_address:
12:42:15                     sock.bind(source_address)
12:42:15 >               sock.connect(sa)
12:42:15 E               ConnectionRefusedError: [Errno 111] Connection refused
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
12:42:15 
12:42:15 The above exception was the direct cause of the following exception:
12:42:15 
12:42:15 self = 
12:42:15 method = 'GET'
12:42:15 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT1'
12:42:15 body = None
12:42:15 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='}
12:42:15 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
12:42:15 redirect = False, assert_same_host = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), pool_timeout = None
12:42:15 release_conn = False, chunked = False, body_pos = None, preload_content = False
12:42:15 decode_content = False, response_kw = {}
12:42:15 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT1', query=None, fragment=None)
12:42:15 destination_scheme = None, conn = None, release_this_conn = True
12:42:15 http_tunnel_required = False, err = None, clean_exit = False
12:42:15 
12:42:15     def urlopen(  # type: ignore[override]
12:42:15         self,
12:42:15         method: str,
12:42:15         url: str,
12:42:15         body: _TYPE_BODY | None = None,
12:42:15         headers: typing.Mapping[str, str] | None = None,
12:42:15         retries: Retry | bool | int | None = None,
12:42:15         redirect: bool = True,
12:42:15         assert_same_host: bool = True,
12:42:15         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
12:42:15         pool_timeout: int | None = None,
12:42:15         release_conn: bool | None = None,
12:42:15         chunked: bool = False,
12:42:15         body_pos: _TYPE_BODY_POSITION | None = None,
12:42:15         preload_content: bool = True,
12:42:15         decode_content: bool = True,
12:42:15         **response_kw: typing.Any,
12:42:15     ) -> BaseHTTPResponse:
12:42:15         """
12:42:15         Get a connection from the pool and perform an HTTP request. This is the
12:42:15         lowest level call for making a request, so you'll need to specify all
12:42:15         the raw details.
12:42:15 
12:42:15         .. note::
12:42:15 
12:42:15            More commonly, it's appropriate to use a convenience method
12:42:15            such as :meth:`request`.
12:42:15 
12:42:15         .. note::
12:42:15 
12:42:15            `release_conn` will only behave as expected if
12:42:15            `preload_content=False` because we want to make
12:42:15            `preload_content=False` the default behaviour someday soon without
12:42:15            breaking backwards compatibility.
12:42:15 
12:42:15         :param method:
12:42:15             HTTP request method (such as GET, POST, PUT, etc.)
12:42:15 
12:42:15         :param url:
12:42:15             The URL to perform the request on.
12:42:15 
12:42:15         :param body:
12:42:15             Data to send in the request body, either :class:`str`, :class:`bytes`,
12:42:15             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
12:42:15 
12:42:15         :param headers:
12:42:15             Dictionary of custom headers to send, such as User-Agent,
12:42:15             If-None-Match, etc. If None, pool headers are used. If provided,
12:42:15             these headers completely replace any pool-specific headers.
12:42:15 
12:42:15         :param retries:
12:42:15             Configure the number of retries to allow before raising a
12:42:15             :class:`~urllib3.exceptions.MaxRetryError` exception.
12:42:15 
12:42:15             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
12:42:15             :class:`~urllib3.util.retry.Retry` object for fine-grained control
12:42:15             over different types of retries.
12:42:15             Pass an integer number to retry connection errors that many times,
12:42:15             but no other types of errors. Pass zero to never retry.
12:42:15 
12:42:15             If ``False``, then retries are disabled and any exception is raised
12:42:15             immediately. Also, instead of raising a MaxRetryError on redirects,
12:42:15             the redirect response will be returned.
12:42:15 
12:42:15         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
12:42:15 
12:42:15         :param redirect:
12:42:15             If True, automatically handle redirects (status codes 301, 302,
12:42:15             303, 307, 308). Each redirect counts as a retry. Disabling retries
12:42:15             will disable redirect, too.
12:42:15 
12:42:15         :param assert_same_host:
12:42:15             If ``True``, will make sure that the host of the pool requests is
12:42:15             consistent else will raise HostChangedError. When ``False``, you can
12:42:15             use the pool on an HTTP proxy and request foreign hosts.
12:42:15 
12:42:15         :param timeout:
12:42:15             If specified, overrides the default timeout for this one
12:42:15             request. It may be a float (in seconds) or an instance of
12:42:15             :class:`urllib3.util.Timeout`.
12:42:15 
12:42:15         :param pool_timeout:
12:42:15             If set and the pool is set to block=True, then this method will
12:42:15             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
12:42:15             connection is available within the time period.
12:42:15 
12:42:15         :param bool preload_content:
12:42:15             If True, the response's body will be preloaded into memory.
12:42:15 
12:42:15         :param bool decode_content:
12:42:15             If True, will attempt to decode the body based on the
12:42:15             'content-encoding' header.
12:42:15 
12:42:15         :param release_conn:
12:42:15             If False, then the urlopen call will not release the connection
12:42:15             back into the pool once a response is received (but will release if
12:42:15             you read the entire contents of the response such as when
12:42:15             `preload_content=True`). This is useful if you're not preloading
12:42:15             the response's content immediately. You will need to call
12:42:15             ``r.release_conn()`` on the response ``r`` to return the connection
12:42:15             back into the pool. If None, it takes the value of ``preload_content``
12:42:15             which defaults to ``True``.
12:42:15 
12:42:15         :param bool chunked:
12:42:15             If True, urllib3 will send the body using chunked transfer
12:42:15             encoding. Otherwise, urllib3 will send the body using the standard
12:42:15             content-length form. Defaults to False.
12:42:15 
12:42:15         :param int body_pos:
12:42:15             Position to seek to in file-like body in the event of a retry or
12:42:15             redirect. Typically this won't need to be set because urllib3 will
12:42:15             auto-populate the value when needed.
12:42:15         """
12:42:15         parsed_url = parse_url(url)
12:42:15         destination_scheme = parsed_url.scheme
12:42:15 
12:42:15         if headers is None:
12:42:15             headers = self.headers
12:42:15 
12:42:15         if not isinstance(retries, Retry):
12:42:15             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
12:42:15 
12:42:15         if release_conn is None:
12:42:15             release_conn = preload_content
12:42:15 
12:42:15         # Check host
12:42:15         if assert_same_host and not self.is_same_host(url):
12:42:15             raise HostChangedError(self, url, retries)
12:42:15 
12:42:15         # Ensure that the URL we're connecting to is properly encoded
12:42:15         if url.startswith("/"):
12:42:15             url = to_str(_encode_target(url))
12:42:15         else:
12:42:15             url = to_str(parsed_url.url)
12:42:15 
12:42:15         conn = None
12:42:15 
12:42:15         # Track whether `conn` needs to be released before
12:42:15         # returning/raising/recursing. Update this variable if necessary, and
12:42:15         # leave `release_conn` constant throughout the function. That way, if
12:42:15         # the function recurses, the original value of `release_conn` will be
12:42:15         # passed down into the recursive call, and its value will be respected.
12:42:15         #
12:42:15         # See issue #651 [1] for details.
12:42:15         #
12:42:15         # [1] 
12:42:15         release_this_conn = release_conn
12:42:15 
12:42:15         http_tunnel_required = connection_requires_http_tunnel(
12:42:15             self.proxy, self.proxy_config, destination_scheme
12:42:15         )
12:42:15 
12:42:15         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
12:42:15         # have to copy the headers dict so we can safely change it without those
12:42:15         # changes being reflected in anyone else's copy.
12:42:15         if not http_tunnel_required:
12:42:15             headers = headers.copy()  # type: ignore[attr-defined]
12:42:15             headers.update(self.proxy_headers)  # type: ignore[union-attr]
12:42:15 
12:42:15         # Must keep the exception bound to a separate variable or else Python 3
12:42:15         # complains about UnboundLocalError.
12:42:15         err = None
12:42:15 
12:42:15         # Keep track of whether we cleanly exited the except block. This
12:42:15         # ensures we do proper cleanup in finally.
12:42:15         clean_exit = False
12:42:15 
12:42:15         # Rewind body position, if needed. Record current position
12:42:15         # for future rewinds in the event of a redirect/retry.
12:42:15         body_pos = set_file_position(body, body_pos)
12:42:15 
12:42:15         try:
12:42:15             # Request a connection from the queue.
12:42:15             timeout_obj = self._get_timeout(timeout)
12:42:15             conn = self._get_conn(timeout=pool_timeout)
12:42:15 
12:42:15             conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
12:42:15 
12:42:15             # Is this a closed/new connection that requires CONNECT tunnelling?
12:42:15             if self.proxy is not None and http_tunnel_required and conn.is_closed:
12:42:15                 try:
12:42:15                     self._prepare_proxy(conn)
12:42:15                 except (BaseSSLError, OSError, SocketTimeout) as e:
12:42:15                     self._raise_timeout(
12:42:15                         err=e, url=self.proxy.url, timeout_value=conn.timeout
12:42:15                     )
12:42:15                     raise
12:42:15 
12:42:15             # If we're going to release the connection in ``finally:``, then
12:42:15             # the response doesn't need to know about the connection. Otherwise
12:42:15             # it will also try to release it and we'll have a double-release
12:42:15             # mess.
12:42:15             response_conn = conn if not release_conn else None
12:42:15 
12:42:15             # Make the request on the HTTPConnection object
12:42:15 >           response = self._make_request(
12:42:15                 conn,
12:42:15                 method,
12:42:15                 url,
12:42:15                 timeout=timeout_obj,
12:42:15                 body=body,
12:42:15                 headers=headers,
12:42:15                 chunked=chunked,
12:42:15                 retries=retries,
12:42:15                 response_conn=response_conn,
12:42:15                 preload_content=preload_content,
12:42:15                 decode_content=decode_content,
12:42:15                 **response_kw,
12:42:15             )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:787: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:493: in _make_request
12:42:15     conn.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:500: in request
12:42:15     self.endheaders()
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1298: in endheaders
12:42:15     self._send_output(message_body, encode_chunked=encode_chunked)
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1058: in _send_output
12:42:15     self.send(msg)
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:996: in send
12:42:15     self.connect()
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:331: in connect
12:42:15     self.sock = self._new_conn()
12:42:15                 ^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15 
12:42:15         :return: New socket connection.
12:42:15         """
12:42:15         try:
12:42:15             sock = connection.create_connection(
12:42:15                 (self._dns_host, self.port),
12:42:15                 self.timeout,
12:42:15                 source_address=self.source_address,
12:42:15                 socket_options=self.socket_options,
12:42:15             )
12:42:15         except socket.gaierror as e:
12:42:15             raise NameResolutionError(self.host, self, e) from e
12:42:15         except SocketTimeout as e:
12:42:15             raise ConnectTimeoutError(
12:42:15                 self,
12:42:15                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
12:42:15             ) from e
12:42:15 
12:42:15         except OSError as e:
12:42:15 >           raise NewConnectionError(
12:42:15                 self, f"Failed to establish a new connection: {e}"
12:42:15             ) from e
12:42:15 E           urllib3.exceptions.NewConnectionError: HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:219: NewConnectionError
12:42:15 
12:42:15 The above exception was the direct cause of the following exception:
12:42:15 
12:42:15 self = 
12:42:15 request = , stream = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
12:42:15 proxies = OrderedDict()
12:42:15 
12:42:15     def send(
12:42:15         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
12:42:15     ):
12:42:15         """Sends PreparedRequest object. Returns Response object.
12:42:15 
12:42:15         :param request: The :class:`PreparedRequest ` being sent.
12:42:15         :param stream: (optional) Whether to stream the request content.
12:42:15         :param timeout: (optional) How long to wait for the server to send
12:42:15             data before giving up, as a float, or a :ref:`(connect timeout,
12:42:15             read timeout) ` tuple.
12:42:15         :type timeout: float or tuple or urllib3 Timeout object
12:42:15         :param verify: (optional) Either a boolean, in which case it controls whether
12:42:15             we verify the server's TLS certificate, or a string, in which case it
12:42:15             must be a path to a CA bundle to use
12:42:15         :param cert: (optional) Any user-provided SSL certificate to be trusted.
12:42:15         :param proxies: (optional) The proxies dictionary to apply to the request.
12:42:15         :rtype: requests.Response
12:42:15         """
12:42:15 
12:42:15         try:
12:42:15             conn = self.get_connection_with_tls_context(
12:42:15                 request, verify, proxies=proxies, cert=cert
12:42:15             )
12:42:15         except LocationValueError as e:
12:42:15             raise InvalidURL(e, request=request)
12:42:15 
12:42:15         self.cert_verify(conn, request.url, verify, cert)
12:42:15         url = self.request_url(request, proxies)
12:42:15         self.add_headers(
12:42:15             request,
12:42:15             stream=stream,
12:42:15             timeout=timeout,
12:42:15             verify=verify,
12:42:15             cert=cert,
12:42:15             proxies=proxies,
12:42:15         )
12:42:15 
12:42:15         chunked = not (request.body is None or "Content-Length" in request.headers)
12:42:15 
12:42:15         if isinstance(timeout, tuple):
12:42:15             try:
12:42:15                 connect, read = timeout
12:42:15                 timeout = TimeoutSauce(connect=connect, read=read)
12:42:15             except ValueError:
12:42:15                 raise ValueError(
12:42:15                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
12:42:15                     f"or a single float to set both timeouts to the same value."
12:42:15                 )
12:42:15         elif isinstance(timeout, TimeoutSauce):
12:42:15             pass
12:42:15         else:
12:42:15             timeout = TimeoutSauce(connect=timeout, read=timeout)
12:42:15 
12:42:15         try:
12:42:15 >           resp = conn.urlopen(
12:42:15                 method=request.method,
12:42:15                 url=url,
12:42:15                 body=request.body,
12:42:15                 headers=request.headers,
12:42:15                 redirect=False,
12:42:15                 assert_same_host=False,
12:42:15                 preload_content=False,
12:42:15                 decode_content=False,
12:42:15                 retries=self.max_retries,
12:42:15                 timeout=timeout,
12:42:15                 chunked=chunked,
12:42:15             )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:644: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen
12:42:15     retries = retries.increment(
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
12:42:15 method = 'GET'
12:42:15 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT1'
12:42:15 response = None
12:42:15 error = NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused")
12:42:15 _pool = 
12:42:15 _stacktrace = 
12:42:15 
12:42:15     def increment(
12:42:15         self,
12:42:15         method: str | None = None,
12:42:15         url: str | None = None,
12:42:15         response: BaseHTTPResponse | None = None,
12:42:15         error: Exception | None = None,
12:42:15         _pool: ConnectionPool | None = None,
12:42:15         _stacktrace: TracebackType | None = None,
12:42:15     ) -> Self:
12:42:15         """Return a new Retry object with incremented retry counters.
12:42:15 
12:42:15         :param response: A response object, or None, if the server did not
12:42:15             return a response.
12:42:15 :type response: :class:`~urllib3.response.BaseHTTPResponse` 12:42:15 :param Exception error: An error encountered during the request, or 12:42:15 None if the response was received successfully. 12:42:15 12:42:15 :return: A new ``Retry`` object. 12:42:15 """ 12:42:15 if self.total is False and error: 12:42:15 # Disabled, indicate to re-raise the error. 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 12:42:15 total = self.total 12:42:15 if total is not None: 12:42:15 total -= 1 12:42:15 12:42:15 connect = self.connect 12:42:15 read = self.read 12:42:15 redirect = self.redirect 12:42:15 status_count = self.status 12:42:15 other = self.other 12:42:15 cause = "unknown" 12:42:15 status = None 12:42:15 redirect_location = None 12:42:15 12:42:15 if error and self._is_connection_error(error): 12:42:15 # Connect retry? 12:42:15 if connect is False: 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 elif connect is not None: 12:42:15 connect -= 1 12:42:15 12:42:15 elif error and self._is_read_error(error): 12:42:15 # Read retry? 12:42:15 if read is False or method is None or not self._is_method_retryable(method): 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 elif read is not None: 12:42:15 read -= 1 12:42:15 12:42:15 elif error: 12:42:15 # Other retry? 12:42:15 if other is not None: 12:42:15 other -= 1 12:42:15 12:42:15 elif response and response.get_redirect_location(): 12:42:15 # Redirect retry? 
12:42:15                 if redirect is not None:
12:42:15                     redirect -= 1
12:42:15                 cause = "too many redirects"
12:42:15                 response_redirect_location = response.get_redirect_location()
12:42:15                 if response_redirect_location:
12:42:15                     redirect_location = response_redirect_location
12:42:15                 status = response.status
12:42:15
12:42:15             else:
12:42:15                 # Incrementing because of a server error like a 500 in
12:42:15                 # status_forcelist and the given method is in the allowed_methods
12:42:15                 cause = ResponseError.GENERIC_ERROR
12:42:15                 if response and response.status:
12:42:15                     if status_count is not None:
12:42:15                         status_count -= 1
12:42:15                     cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
12:42:15                     status = response.status
12:42:15
12:42:15             history = self.history + (
12:42:15                 RequestHistory(method, url, error, status, redirect_location),
12:42:15             )
12:42:15
12:42:15             new_retry = self.new(
12:42:15                 total=total,
12:42:15                 connect=connect,
12:42:15                 read=read,
12:42:15                 redirect=redirect,
12:42:15                 status=status_count,
12:42:15                 other=other,
12:42:15                 history=history,
12:42:15             )
12:42:15
12:42:15             if new_retry.is_exhausted():
12:42:15                 reason = error or ResponseError(cause)
12:42:15 >               raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
12:42:15                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 E               urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT1 (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:535: MaxRetryError
12:42:15
12:42:15 During handling of the above exception, another exception occurred:
12:42:15
12:42:15 self =
12:42:15
12:42:15     def test_12_xpdr_portmapping_CLIENT1(self):
12:42:15 >       response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-CLIENT1")
12:42:15         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15
12:42:15 transportpce_tests/1.2.1/test01_portmapping.py:147:
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
12:42:15 transportpce_tests/common/test_utils.py:519: in get_portmapping_node_attr
12:42:15     response = get_request(target_url)
12:42:15     ^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 transportpce_tests/common/test_utils.py:117: in get_request
12:42:15     return requests.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request
12:42:15     return session.request(method=method, url=url, **kwargs)
12:42:15     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request
12:42:15     resp = self.send(prep, **send_kwargs)
12:42:15     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send
12:42:15     r = adapter.send(request, **kwargs)
12:42:15     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
12:42:15
12:42:15 self =
12:42:15 request = , stream = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
12:42:15 proxies = OrderedDict()
12:42:15
12:42:15     def send(
12:42:15         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
12:42:15     ):
12:42:15         """Sends PreparedRequest object. Returns Response object.
12:42:15
12:42:15         :param request: The :class:`PreparedRequest ` being sent.
12:42:15         :param stream: (optional) Whether to stream the request content.
12:42:15         :param timeout: (optional) How long to wait for the server to send
12:42:15             data before giving up, as a float, or a :ref:`(connect timeout,
12:42:15             read timeout) ` tuple.
12:42:15 :type timeout: float or tuple or urllib3 Timeout object 12:42:15 :param verify: (optional) Either a boolean, in which case it controls whether 12:42:15 we verify the server's TLS certificate, or a string, in which case it 12:42:15 must be a path to a CA bundle to use 12:42:15 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:42:15 :param proxies: (optional) The proxies dictionary to apply to the request. 12:42:15 :rtype: requests.Response 12:42:15 """ 12:42:15 12:42:15 try: 12:42:15 conn = self.get_connection_with_tls_context( 12:42:15 request, verify, proxies=proxies, cert=cert 12:42:15 ) 12:42:15 except LocationValueError as e: 12:42:15 raise InvalidURL(e, request=request) 12:42:15 12:42:15 self.cert_verify(conn, request.url, verify, cert) 12:42:15 url = self.request_url(request, proxies) 12:42:15 self.add_headers( 12:42:15 request, 12:42:15 stream=stream, 12:42:15 timeout=timeout, 12:42:15 verify=verify, 12:42:15 cert=cert, 12:42:15 proxies=proxies, 12:42:15 ) 12:42:15 12:42:15 chunked = not (request.body is None or "Content-Length" in request.headers) 12:42:15 12:42:15 if isinstance(timeout, tuple): 12:42:15 try: 12:42:15 connect, read = timeout 12:42:15 timeout = TimeoutSauce(connect=connect, read=read) 12:42:15 except ValueError: 12:42:15 raise ValueError( 12:42:15 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:42:15 f"or a single float to set both timeouts to the same value." 
12:42:15             )
12:42:15         elif isinstance(timeout, TimeoutSauce):
12:42:15             pass
12:42:15         else:
12:42:15             timeout = TimeoutSauce(connect=timeout, read=timeout)
12:42:15
12:42:15         try:
12:42:15             resp = conn.urlopen(
12:42:15                 method=request.method,
12:42:15                 url=url,
12:42:15                 body=request.body,
12:42:15                 headers=request.headers,
12:42:15                 redirect=False,
12:42:15                 assert_same_host=False,
12:42:15                 preload_content=False,
12:42:15                 decode_content=False,
12:42:15                 retries=self.max_retries,
12:42:15                 timeout=timeout,
12:42:15                 chunked=chunked,
12:42:15             )
12:42:15
12:42:15         except (ProtocolError, OSError) as err:
12:42:15             raise ConnectionError(err, request=request)
12:42:15
12:42:15         except MaxRetryError as e:
12:42:15             if isinstance(e.reason, ConnectTimeoutError):
12:42:15                 # TODO: Remove this in 3.0.0: see #2811
12:42:15                 if not isinstance(e.reason, NewConnectionError):
12:42:15                     raise ConnectTimeout(e, request=request)
12:42:15
12:42:15             if isinstance(e.reason, ResponseError):
12:42:15                 raise RetryError(e, request=request)
12:42:15
12:42:15             if isinstance(e.reason, _ProxyError):
12:42:15                 raise ProxyError(e, request=request)
12:42:15
12:42:15             if isinstance(e.reason, _SSLError):
12:42:15                 # This branch is for urllib3 v1.22 and later.
12:42:15                 raise SSLError(e, request=request)
12:42:15
12:42:15 >           raise ConnectionError(e, request=request)
12:42:15 E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT1 (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:677: ConnectionError
12:42:15 ----------------------------- Captured stdout call -----------------------------
12:42:15 execution of test_12_xpdr_portmapping_CLIENT1
12:42:15 _________ TestTransportPCEPortmapping.test_13_xpdr_portmapping_CLIENT2 _________
12:42:15
12:42:15 self =
12:42:15
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15
12:42:15         :return: New socket connection.
12:42:15 """ 12:42:15 try: 12:42:15 > sock = connection.create_connection( 12:42:15 (self._dns_host, self.port), 12:42:15 self.timeout, 12:42:15 source_address=self.source_address, 12:42:15 socket_options=self.socket_options, 12:42:15 ) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:204: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 12:42:15 raise err 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 address = ('localhost', 8191), timeout = 30, source_address = None 12:42:15 socket_options = [(6, 1, 1)] 12:42:15 12:42:15 def create_connection( 12:42:15 address: tuple[str, int], 12:42:15 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:42:15 source_address: tuple[str, int] | None = None, 12:42:15 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 12:42:15 ) -> socket.socket: 12:42:15 """Connect to *address* and return the socket object. 12:42:15 12:42:15 Convenience function. Connect to *address* (a 2-tuple ``(host, 12:42:15 port)``) and return the socket object. Passing the optional 12:42:15 *timeout* parameter will set the timeout on the socket instance 12:42:15 before attempting to connect. If no *timeout* is supplied, the 12:42:15 global default timeout setting returned by :func:`socket.getdefaulttimeout` 12:42:15 is used. If *source_address* is set it must be a tuple of (host, port) 12:42:15 for the socket to bind as a source address before making the connection. 12:42:15 An host of '' or port 0 tells the OS to use the default. 
12:42:15 """ 12:42:15 12:42:15 host, port = address 12:42:15 if host.startswith("["): 12:42:15 host = host.strip("[]") 12:42:15 err = None 12:42:15 12:42:15 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 12:42:15 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 12:42:15 # The original create_connection function always returns all records. 12:42:15 family = allowed_gai_family() 12:42:15 12:42:15 try: 12:42:15 host.encode("idna") 12:42:15 except UnicodeError: 12:42:15 raise LocationParseError(f"'{host}', label empty or too long") from None 12:42:15 12:42:15 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 12:42:15 af, socktype, proto, canonname, sa = res 12:42:15 sock = None 12:42:15 try: 12:42:15 sock = socket.socket(af, socktype, proto) 12:42:15 12:42:15 # If provided, set socket level options before connecting. 12:42:15 _set_socket_options(sock, socket_options) 12:42:15 12:42:15 if timeout is not _DEFAULT_TIMEOUT: 12:42:15 sock.settimeout(timeout) 12:42:15 if source_address: 12:42:15 sock.bind(source_address) 12:42:15 > sock.connect(sa) 12:42:15 E ConnectionRefusedError: [Errno 111] Connection refused 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 12:42:15 12:42:15 The above exception was the direct cause of the following exception: 12:42:15 12:42:15 self = 12:42:15 method = 'GET' 12:42:15 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT2' 12:42:15 body = None 12:42:15 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 12:42:15 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:42:15 redirect = False, assert_same_host = False 12:42:15 timeout = Timeout(connect=30, read=30, 
total=None), pool_timeout = None 12:42:15 release_conn = False, chunked = False, body_pos = None, preload_content = False 12:42:15 decode_content = False, response_kw = {} 12:42:15 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT2', query=None, fragment=None) 12:42:15 destination_scheme = None, conn = None, release_this_conn = True 12:42:15 http_tunnel_required = False, err = None, clean_exit = False 12:42:15 12:42:15 def urlopen( # type: ignore[override] 12:42:15 self, 12:42:15 method: str, 12:42:15 url: str, 12:42:15 body: _TYPE_BODY | None = None, 12:42:15 headers: typing.Mapping[str, str] | None = None, 12:42:15 retries: Retry | bool | int | None = None, 12:42:15 redirect: bool = True, 12:42:15 assert_same_host: bool = True, 12:42:15 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:42:15 pool_timeout: int | None = None, 12:42:15 release_conn: bool | None = None, 12:42:15 chunked: bool = False, 12:42:15 body_pos: _TYPE_BODY_POSITION | None = None, 12:42:15 preload_content: bool = True, 12:42:15 decode_content: bool = True, 12:42:15 **response_kw: typing.Any, 12:42:15 ) -> BaseHTTPResponse: 12:42:15 """ 12:42:15 Get a connection from the pool and perform an HTTP request. This is the 12:42:15 lowest level call for making a request, so you'll need to specify all 12:42:15 the raw details. 12:42:15 12:42:15 .. note:: 12:42:15 12:42:15 More commonly, it's appropriate to use a convenience method 12:42:15 such as :meth:`request`. 12:42:15 12:42:15 .. note:: 12:42:15 12:42:15 `release_conn` will only behave as expected if 12:42:15 `preload_content=False` because we want to make 12:42:15 `preload_content=False` the default behaviour someday soon without 12:42:15 breaking backwards compatibility. 12:42:15 12:42:15 :param method: 12:42:15 HTTP request method (such as GET, POST, PUT, etc.) 12:42:15 12:42:15 :param url: 12:42:15 The URL to perform the request on. 
12:42:15 12:42:15 :param body: 12:42:15 Data to send in the request body, either :class:`str`, :class:`bytes`, 12:42:15 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 12:42:15 12:42:15 :param headers: 12:42:15 Dictionary of custom headers to send, such as User-Agent, 12:42:15 If-None-Match, etc. If None, pool headers are used. If provided, 12:42:15 these headers completely replace any pool-specific headers. 12:42:15 12:42:15 :param retries: 12:42:15 Configure the number of retries to allow before raising a 12:42:15 :class:`~urllib3.exceptions.MaxRetryError` exception. 12:42:15 12:42:15 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 12:42:15 :class:`~urllib3.util.retry.Retry` object for fine-grained control 12:42:15 over different types of retries. 12:42:15 Pass an integer number to retry connection errors that many times, 12:42:15 but no other types of errors. Pass zero to never retry. 12:42:15 12:42:15 If ``False``, then retries are disabled and any exception is raised 12:42:15 immediately. Also, instead of raising a MaxRetryError on redirects, 12:42:15 the redirect response will be returned. 12:42:15 12:42:15 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 12:42:15 12:42:15 :param redirect: 12:42:15 If True, automatically handle redirects (status codes 301, 302, 12:42:15 303, 307, 308). Each redirect counts as a retry. Disabling retries 12:42:15 will disable redirect, too. 12:42:15 12:42:15 :param assert_same_host: 12:42:15 If ``True``, will make sure that the host of the pool requests is 12:42:15 consistent else will raise HostChangedError. When ``False``, you can 12:42:15 use the pool on an HTTP proxy and request foreign hosts. 12:42:15 12:42:15 :param timeout: 12:42:15 If specified, overrides the default timeout for this one 12:42:15 request. It may be a float (in seconds) or an instance of 12:42:15 :class:`urllib3.util.Timeout`. 
12:42:15 12:42:15 :param pool_timeout: 12:42:15 If set and the pool is set to block=True, then this method will 12:42:15 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 12:42:15 connection is available within the time period. 12:42:15 12:42:15 :param bool preload_content: 12:42:15 If True, the response's body will be preloaded into memory. 12:42:15 12:42:15 :param bool decode_content: 12:42:15 If True, will attempt to decode the body based on the 12:42:15 'content-encoding' header. 12:42:15 12:42:15 :param release_conn: 12:42:15 If False, then the urlopen call will not release the connection 12:42:15 back into the pool once a response is received (but will release if 12:42:15 you read the entire contents of the response such as when 12:42:15 `preload_content=True`). This is useful if you're not preloading 12:42:15 the response's content immediately. You will need to call 12:42:15 ``r.release_conn()`` on the response ``r`` to return the connection 12:42:15 back into the pool. If None, it takes the value of ``preload_content`` 12:42:15 which defaults to ``True``. 12:42:15 12:42:15 :param bool chunked: 12:42:15 If True, urllib3 will send the body using chunked transfer 12:42:15 encoding. Otherwise, urllib3 will send the body using the standard 12:42:15 content-length form. Defaults to False. 12:42:15 12:42:15 :param int body_pos: 12:42:15 Position to seek to in file-like body in the event of a retry or 12:42:15 redirect. Typically this won't need to be set because urllib3 will 12:42:15 auto-populate the value when needed. 
12:42:15 """ 12:42:15 parsed_url = parse_url(url) 12:42:15 destination_scheme = parsed_url.scheme 12:42:15 12:42:15 if headers is None: 12:42:15 headers = self.headers 12:42:15 12:42:15 if not isinstance(retries, Retry): 12:42:15 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 12:42:15 12:42:15 if release_conn is None: 12:42:15 release_conn = preload_content 12:42:15 12:42:15 # Check host 12:42:15 if assert_same_host and not self.is_same_host(url): 12:42:15 raise HostChangedError(self, url, retries) 12:42:15 12:42:15 # Ensure that the URL we're connecting to is properly encoded 12:42:15 if url.startswith("/"): 12:42:15 url = to_str(_encode_target(url)) 12:42:15 else: 12:42:15 url = to_str(parsed_url.url) 12:42:15 12:42:15 conn = None 12:42:15 12:42:15 # Track whether `conn` needs to be released before 12:42:15 # returning/raising/recursing. Update this variable if necessary, and 12:42:15 # leave `release_conn` constant throughout the function. That way, if 12:42:15 # the function recurses, the original value of `release_conn` will be 12:42:15 # passed down into the recursive call, and its value will be respected. 12:42:15 # 12:42:15 # See issue #651 [1] for details. 12:42:15 # 12:42:15 # [1] 12:42:15 release_this_conn = release_conn 12:42:15 12:42:15 http_tunnel_required = connection_requires_http_tunnel( 12:42:15 self.proxy, self.proxy_config, destination_scheme 12:42:15 ) 12:42:15 12:42:15 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 12:42:15 # have to copy the headers dict so we can safely change it without those 12:42:15 # changes being reflected in anyone else's copy. 12:42:15 if not http_tunnel_required: 12:42:15 headers = headers.copy() # type: ignore[attr-defined] 12:42:15 headers.update(self.proxy_headers) # type: ignore[union-attr] 12:42:15 12:42:15 # Must keep the exception bound to a separate variable or else Python 3 12:42:15 # complains about UnboundLocalError. 
12:42:15 err = None 12:42:15 12:42:15 # Keep track of whether we cleanly exited the except block. This 12:42:15 # ensures we do proper cleanup in finally. 12:42:15 clean_exit = False 12:42:15 12:42:15 # Rewind body position, if needed. Record current position 12:42:15 # for future rewinds in the event of a redirect/retry. 12:42:15 body_pos = set_file_position(body, body_pos) 12:42:15 12:42:15 try: 12:42:15 # Request a connection from the queue. 12:42:15 timeout_obj = self._get_timeout(timeout) 12:42:15 conn = self._get_conn(timeout=pool_timeout) 12:42:15 12:42:15 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 12:42:15 12:42:15 # Is this a closed/new connection that requires CONNECT tunnelling? 12:42:15 if self.proxy is not None and http_tunnel_required and conn.is_closed: 12:42:15 try: 12:42:15 self._prepare_proxy(conn) 12:42:15 except (BaseSSLError, OSError, SocketTimeout) as e: 12:42:15 self._raise_timeout( 12:42:15 err=e, url=self.proxy.url, timeout_value=conn.timeout 12:42:15 ) 12:42:15 raise 12:42:15 12:42:15 # If we're going to release the connection in ``finally:``, then 12:42:15 # the response doesn't need to know about the connection. Otherwise 12:42:15 # it will also try to release it and we'll have a double-release 12:42:15 # mess. 
12:42:15 response_conn = conn if not release_conn else None 12:42:15 12:42:15 # Make the request on the HTTPConnection object 12:42:15 > response = self._make_request( 12:42:15 conn, 12:42:15 method, 12:42:15 url, 12:42:15 timeout=timeout_obj, 12:42:15 body=body, 12:42:15 headers=headers, 12:42:15 chunked=chunked, 12:42:15 retries=retries, 12:42:15 response_conn=response_conn, 12:42:15 preload_content=preload_content, 12:42:15 decode_content=decode_content, 12:42:15 **response_kw, 12:42:15 ) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:787: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:493: in _make_request 12:42:15 conn.request( 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:500: in request 12:42:15 self.endheaders() 12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1298: in endheaders 12:42:15 self._send_output(message_body, encode_chunked=encode_chunked) 12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1058: in _send_output 12:42:15 self.send(msg) 12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:996: in send 12:42:15 self.connect() 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:331: in connect 12:42:15 self.sock = self._new_conn() 12:42:15 ^^^^^^^^^^^^^^^^ 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 self = 12:42:15 12:42:15 def _new_conn(self) -> socket.socket: 12:42:15 """Establish a socket connection and set nodelay settings on it. 12:42:15 12:42:15 :return: New socket connection. 
12:42:15 """ 12:42:15 try: 12:42:15 sock = connection.create_connection( 12:42:15 (self._dns_host, self.port), 12:42:15 self.timeout, 12:42:15 source_address=self.source_address, 12:42:15 socket_options=self.socket_options, 12:42:15 ) 12:42:15 except socket.gaierror as e: 12:42:15 raise NameResolutionError(self.host, self, e) from e 12:42:15 except SocketTimeout as e: 12:42:15 raise ConnectTimeoutError( 12:42:15 self, 12:42:15 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 12:42:15 ) from e 12:42:15 12:42:15 except OSError as e: 12:42:15 > raise NewConnectionError( 12:42:15 self, f"Failed to establish a new connection: {e}" 12:42:15 ) from e 12:42:15 E urllib3.exceptions.NewConnectionError: HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:219: NewConnectionError 12:42:15 12:42:15 The above exception was the direct cause of the following exception: 12:42:15 12:42:15 self = 12:42:15 request = , stream = False 12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None 12:42:15 proxies = OrderedDict() 12:42:15 12:42:15 def send( 12:42:15 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:42:15 ): 12:42:15 """Sends PreparedRequest object. Returns Response object. 12:42:15 12:42:15 :param request: The :class:`PreparedRequest ` being sent. 12:42:15 :param stream: (optional) Whether to stream the request content. 12:42:15 :param timeout: (optional) How long to wait for the server to send 12:42:15 data before giving up, as a float, or a :ref:`(connect timeout, 12:42:15 read timeout) ` tuple. 
12:42:15 :type timeout: float or tuple or urllib3 Timeout object 12:42:15 :param verify: (optional) Either a boolean, in which case it controls whether 12:42:15 we verify the server's TLS certificate, or a string, in which case it 12:42:15 must be a path to a CA bundle to use 12:42:15 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:42:15 :param proxies: (optional) The proxies dictionary to apply to the request. 12:42:15 :rtype: requests.Response 12:42:15 """ 12:42:15 12:42:15 try: 12:42:15 conn = self.get_connection_with_tls_context( 12:42:15 request, verify, proxies=proxies, cert=cert 12:42:15 ) 12:42:15 except LocationValueError as e: 12:42:15 raise InvalidURL(e, request=request) 12:42:15 12:42:15 self.cert_verify(conn, request.url, verify, cert) 12:42:15 url = self.request_url(request, proxies) 12:42:15 self.add_headers( 12:42:15 request, 12:42:15 stream=stream, 12:42:15 timeout=timeout, 12:42:15 verify=verify, 12:42:15 cert=cert, 12:42:15 proxies=proxies, 12:42:15 ) 12:42:15 12:42:15 chunked = not (request.body is None or "Content-Length" in request.headers) 12:42:15 12:42:15 if isinstance(timeout, tuple): 12:42:15 try: 12:42:15 connect, read = timeout 12:42:15 timeout = TimeoutSauce(connect=connect, read=read) 12:42:15 except ValueError: 12:42:15 raise ValueError( 12:42:15 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:42:15 f"or a single float to set both timeouts to the same value." 
12:42:15 ) 12:42:15 elif isinstance(timeout, TimeoutSauce): 12:42:15 pass 12:42:15 else: 12:42:15 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:42:15 12:42:15 try: 12:42:15 > resp = conn.urlopen( 12:42:15 method=request.method, 12:42:15 url=url, 12:42:15 body=request.body, 12:42:15 headers=request.headers, 12:42:15 redirect=False, 12:42:15 assert_same_host=False, 12:42:15 preload_content=False, 12:42:15 decode_content=False, 12:42:15 retries=self.max_retries, 12:42:15 timeout=timeout, 12:42:15 chunked=chunked, 12:42:15 ) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:644: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen 12:42:15 retries = retries.increment( 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:42:15 method = 'GET' 12:42:15 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT2' 12:42:15 response = None 12:42:15 error = NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused") 12:42:15 _pool = 12:42:15 _stacktrace = 12:42:15 12:42:15 def increment( 12:42:15 self, 12:42:15 method: str | None = None, 12:42:15 url: str | None = None, 12:42:15 response: BaseHTTPResponse | None = None, 12:42:15 error: Exception | None = None, 12:42:15 _pool: ConnectionPool | None = None, 12:42:15 _stacktrace: TracebackType | None = None, 12:42:15 ) -> Self: 12:42:15 """Return a new Retry object with incremented retry counters. 12:42:15 12:42:15 :param response: A response object, or None, if the server did not 12:42:15 return a response. 
12:42:15 :type response: :class:`~urllib3.response.BaseHTTPResponse` 12:42:15 :param Exception error: An error encountered during the request, or 12:42:15 None if the response was received successfully. 12:42:15 12:42:15 :return: A new ``Retry`` object. 12:42:15 """ 12:42:15 if self.total is False and error: 12:42:15 # Disabled, indicate to re-raise the error. 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 12:42:15 total = self.total 12:42:15 if total is not None: 12:42:15 total -= 1 12:42:15 12:42:15 connect = self.connect 12:42:15 read = self.read 12:42:15 redirect = self.redirect 12:42:15 status_count = self.status 12:42:15 other = self.other 12:42:15 cause = "unknown" 12:42:15 status = None 12:42:15 redirect_location = None 12:42:15 12:42:15 if error and self._is_connection_error(error): 12:42:15 # Connect retry? 12:42:15 if connect is False: 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 elif connect is not None: 12:42:15 connect -= 1 12:42:15 12:42:15 elif error and self._is_read_error(error): 12:42:15 # Read retry? 12:42:15 if read is False or method is None or not self._is_method_retryable(method): 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 elif read is not None: 12:42:15 read -= 1 12:42:15 12:42:15 elif error: 12:42:15 # Other retry? 12:42:15 if other is not None: 12:42:15 other -= 1 12:42:15 12:42:15 elif response and response.get_redirect_location(): 12:42:15 # Redirect retry? 
12:42:15                 if redirect is not None:
12:42:15                     redirect -= 1
12:42:15                 cause = "too many redirects"
12:42:15                 response_redirect_location = response.get_redirect_location()
12:42:15                 if response_redirect_location:
12:42:15                     redirect_location = response_redirect_location
12:42:15                 status = response.status
12:42:15
12:42:15             else:
12:42:15                 # Incrementing because of a server error like a 500 in
12:42:15                 # status_forcelist and the given method is in the allowed_methods
12:42:15                 cause = ResponseError.GENERIC_ERROR
12:42:15                 if response and response.status:
12:42:15                     if status_count is not None:
12:42:15                         status_count -= 1
12:42:15                     cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
12:42:15                     status = response.status
12:42:15
12:42:15             history = self.history + (
12:42:15                 RequestHistory(method, url, error, status, redirect_location),
12:42:15             )
12:42:15
12:42:15             new_retry = self.new(
12:42:15                 total=total,
12:42:15                 connect=connect,
12:42:15                 read=read,
12:42:15                 redirect=redirect,
12:42:15                 status=status_count,
12:42:15                 other=other,
12:42:15                 history=history,
12:42:15             )
12:42:15
12:42:15             if new_retry.is_exhausted():
12:42:15                 reason = error or ResponseError(cause)
12:42:15 >               raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
12:42:15                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 E               urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT2 (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:535: MaxRetryError
12:42:15
12:42:15 During handling of the above exception, another exception occurred:
12:42:15
12:42:15 self =
12:42:15
12:42:15     def test_13_xpdr_portmapping_CLIENT2(self):
12:42:15 >       response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-CLIENT2")
12:42:15         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15
12:42:15 transportpce_tests/1.2.1/test01_portmapping.py:159:
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
12:42:15 transportpce_tests/common/test_utils.py:519: in get_portmapping_node_attr
12:42:15     response = get_request(target_url)
12:42:15     ^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 transportpce_tests/common/test_utils.py:117: in get_request
12:42:15     return requests.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request
12:42:15     return session.request(method=method, url=url, **kwargs)
12:42:15     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request
12:42:15     resp = self.send(prep, **send_kwargs)
12:42:15     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send
12:42:15     r = adapter.send(request, **kwargs)
12:42:15     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
12:42:15
12:42:15 self =
12:42:15 request = , stream = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
12:42:15 proxies = OrderedDict()
12:42:15
12:42:15     def send(
12:42:15         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
12:42:15     ):
12:42:15         """Sends PreparedRequest object. Returns Response object.
12:42:15
12:42:15         :param request: The :class:`PreparedRequest ` being sent.
12:42:15         :param stream: (optional) Whether to stream the request content.
12:42:15         :param timeout: (optional) How long to wait for the server to send
12:42:15             data before giving up, as a float, or a :ref:`(connect timeout,
12:42:15             read timeout) ` tuple.
12:42:15 :type timeout: float or tuple or urllib3 Timeout object 12:42:15 :param verify: (optional) Either a boolean, in which case it controls whether 12:42:15 we verify the server's TLS certificate, or a string, in which case it 12:42:15 must be a path to a CA bundle to use 12:42:15 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:42:15 :param proxies: (optional) The proxies dictionary to apply to the request. 12:42:15 :rtype: requests.Response 12:42:15 """ 12:42:15 12:42:15 try: 12:42:15 conn = self.get_connection_with_tls_context( 12:42:15 request, verify, proxies=proxies, cert=cert 12:42:15 ) 12:42:15 except LocationValueError as e: 12:42:15 raise InvalidURL(e, request=request) 12:42:15 12:42:15 self.cert_verify(conn, request.url, verify, cert) 12:42:15 url = self.request_url(request, proxies) 12:42:15 self.add_headers( 12:42:15 request, 12:42:15 stream=stream, 12:42:15 timeout=timeout, 12:42:15 verify=verify, 12:42:15 cert=cert, 12:42:15 proxies=proxies, 12:42:15 ) 12:42:15 12:42:15 chunked = not (request.body is None or "Content-Length" in request.headers) 12:42:15 12:42:15 if isinstance(timeout, tuple): 12:42:15 try: 12:42:15 connect, read = timeout 12:42:15 timeout = TimeoutSauce(connect=connect, read=read) 12:42:15 except ValueError: 12:42:15 raise ValueError( 12:42:15 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:42:15 f"or a single float to set both timeouts to the same value." 
12:42:15 ) 12:42:15 elif isinstance(timeout, TimeoutSauce): 12:42:15 pass 12:42:15 else: 12:42:15 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:42:15 12:42:15 try: 12:42:15 resp = conn.urlopen( 12:42:15 method=request.method, 12:42:15 url=url, 12:42:15 body=request.body, 12:42:15 headers=request.headers, 12:42:15 redirect=False, 12:42:15 assert_same_host=False, 12:42:15 preload_content=False, 12:42:15 decode_content=False, 12:42:15 retries=self.max_retries, 12:42:15 timeout=timeout, 12:42:15 chunked=chunked, 12:42:15 ) 12:42:15 12:42:15 except (ProtocolError, OSError) as err: 12:42:15 raise ConnectionError(err, request=request) 12:42:15 12:42:15 except MaxRetryError as e: 12:42:15 if isinstance(e.reason, ConnectTimeoutError): 12:42:15 # TODO: Remove this in 3.0.0: see #2811 12:42:15 if not isinstance(e.reason, NewConnectionError): 12:42:15 raise ConnectTimeout(e, request=request) 12:42:15 12:42:15 if isinstance(e.reason, ResponseError): 12:42:15 raise RetryError(e, request=request) 12:42:15 12:42:15 if isinstance(e.reason, _ProxyError): 12:42:15 raise ProxyError(e, request=request) 12:42:15 12:42:15 if isinstance(e.reason, _SSLError): 12:42:15 # This branch is for urllib3 v1.22 and later. 
12:42:15                 raise SSLError(e, request=request)
12:42:15 
12:42:15 >           raise ConnectionError(e, request=request)
12:42:15 E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT2 (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:677: ConnectionError
12:42:15 ----------------------------- Captured stdout call -----------------------------
12:42:15 execution of test_13_xpdr_portmapping_CLIENT2
12:42:15 _________ TestTransportPCEPortmapping.test_14_xpdr_portmapping_CLIENT3 _________
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15 
12:42:15         :return: New socket connection.
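Every failure above reduces to the same root cause: nothing is listening on localhost:8191, so each HTTP attempt dies with `[Errno 111] Connection refused` before any RESTCONF logic runs. A quick way to confirm that from Python before digging into the tests themselves is a plain TCP probe. This is a hedged sketch; `is_listening` is a hypothetical helper, not part of `test_utils`:

```python
import socket


def is_listening(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        # create_connection handles DNS resolution and IPv4/IPv6 for us.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers ConnectionRefusedError, timeouts, and resolution failures.
        return False


if __name__ == "__main__":
    # The port the failing tests target; expected to be False while the
    # controller under test is down.
    print(is_listening("localhost", 8191))
```

Running this before (or inside a setup hook of) the suite distinguishes "controller never started" from a genuine portmapping regression.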
12:42:15 """ 12:42:15 try: 12:42:15 > sock = connection.create_connection( 12:42:15 (self._dns_host, self.port), 12:42:15 self.timeout, 12:42:15 source_address=self.source_address, 12:42:15 socket_options=self.socket_options, 12:42:15 ) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:204: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 12:42:15 raise err 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 address = ('localhost', 8191), timeout = 30, source_address = None 12:42:15 socket_options = [(6, 1, 1)] 12:42:15 12:42:15 def create_connection( 12:42:15 address: tuple[str, int], 12:42:15 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:42:15 source_address: tuple[str, int] | None = None, 12:42:15 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 12:42:15 ) -> socket.socket: 12:42:15 """Connect to *address* and return the socket object. 12:42:15 12:42:15 Convenience function. Connect to *address* (a 2-tuple ``(host, 12:42:15 port)``) and return the socket object. Passing the optional 12:42:15 *timeout* parameter will set the timeout on the socket instance 12:42:15 before attempting to connect. If no *timeout* is supplied, the 12:42:15 global default timeout setting returned by :func:`socket.getdefaulttimeout` 12:42:15 is used. If *source_address* is set it must be a tuple of (host, port) 12:42:15 for the socket to bind as a source address before making the connection. 12:42:15 An host of '' or port 0 tells the OS to use the default. 
12:42:15 """ 12:42:15 12:42:15 host, port = address 12:42:15 if host.startswith("["): 12:42:15 host = host.strip("[]") 12:42:15 err = None 12:42:15 12:42:15 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 12:42:15 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 12:42:15 # The original create_connection function always returns all records. 12:42:15 family = allowed_gai_family() 12:42:15 12:42:15 try: 12:42:15 host.encode("idna") 12:42:15 except UnicodeError: 12:42:15 raise LocationParseError(f"'{host}', label empty or too long") from None 12:42:15 12:42:15 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 12:42:15 af, socktype, proto, canonname, sa = res 12:42:15 sock = None 12:42:15 try: 12:42:15 sock = socket.socket(af, socktype, proto) 12:42:15 12:42:15 # If provided, set socket level options before connecting. 12:42:15 _set_socket_options(sock, socket_options) 12:42:15 12:42:15 if timeout is not _DEFAULT_TIMEOUT: 12:42:15 sock.settimeout(timeout) 12:42:15 if source_address: 12:42:15 sock.bind(source_address) 12:42:15 > sock.connect(sa) 12:42:15 E ConnectionRefusedError: [Errno 111] Connection refused 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 12:42:15 12:42:15 The above exception was the direct cause of the following exception: 12:42:15 12:42:15 self = 12:42:15 method = 'GET' 12:42:15 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT3' 12:42:15 body = None 12:42:15 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 12:42:15 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:42:15 redirect = False, assert_same_host = False 12:42:15 timeout = Timeout(connect=30, read=30, 
total=None), pool_timeout = None 12:42:15 release_conn = False, chunked = False, body_pos = None, preload_content = False 12:42:15 decode_content = False, response_kw = {} 12:42:15 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT3', query=None, fragment=None) 12:42:15 destination_scheme = None, conn = None, release_this_conn = True 12:42:15 http_tunnel_required = False, err = None, clean_exit = False 12:42:15 12:42:15 def urlopen( # type: ignore[override] 12:42:15 self, 12:42:15 method: str, 12:42:15 url: str, 12:42:15 body: _TYPE_BODY | None = None, 12:42:15 headers: typing.Mapping[str, str] | None = None, 12:42:15 retries: Retry | bool | int | None = None, 12:42:15 redirect: bool = True, 12:42:15 assert_same_host: bool = True, 12:42:15 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:42:15 pool_timeout: int | None = None, 12:42:15 release_conn: bool | None = None, 12:42:15 chunked: bool = False, 12:42:15 body_pos: _TYPE_BODY_POSITION | None = None, 12:42:15 preload_content: bool = True, 12:42:15 decode_content: bool = True, 12:42:15 **response_kw: typing.Any, 12:42:15 ) -> BaseHTTPResponse: 12:42:15 """ 12:42:15 Get a connection from the pool and perform an HTTP request. This is the 12:42:15 lowest level call for making a request, so you'll need to specify all 12:42:15 the raw details. 12:42:15 12:42:15 .. note:: 12:42:15 12:42:15 More commonly, it's appropriate to use a convenience method 12:42:15 such as :meth:`request`. 12:42:15 12:42:15 .. note:: 12:42:15 12:42:15 `release_conn` will only behave as expected if 12:42:15 `preload_content=False` because we want to make 12:42:15 `preload_content=False` the default behaviour someday soon without 12:42:15 breaking backwards compatibility. 12:42:15 12:42:15 :param method: 12:42:15 HTTP request method (such as GET, POST, PUT, etc.) 12:42:15 12:42:15 :param url: 12:42:15 The URL to perform the request on. 
12:42:15 12:42:15 :param body: 12:42:15 Data to send in the request body, either :class:`str`, :class:`bytes`, 12:42:15 an iterable of :class:`str`/:class:`bytes`, or a file-like object. 12:42:15 12:42:15 :param headers: 12:42:15 Dictionary of custom headers to send, such as User-Agent, 12:42:15 If-None-Match, etc. If None, pool headers are used. If provided, 12:42:15 these headers completely replace any pool-specific headers. 12:42:15 12:42:15 :param retries: 12:42:15 Configure the number of retries to allow before raising a 12:42:15 :class:`~urllib3.exceptions.MaxRetryError` exception. 12:42:15 12:42:15 If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a 12:42:15 :class:`~urllib3.util.retry.Retry` object for fine-grained control 12:42:15 over different types of retries. 12:42:15 Pass an integer number to retry connection errors that many times, 12:42:15 but no other types of errors. Pass zero to never retry. 12:42:15 12:42:15 If ``False``, then retries are disabled and any exception is raised 12:42:15 immediately. Also, instead of raising a MaxRetryError on redirects, 12:42:15 the redirect response will be returned. 12:42:15 12:42:15 :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 12:42:15 12:42:15 :param redirect: 12:42:15 If True, automatically handle redirects (status codes 301, 302, 12:42:15 303, 307, 308). Each redirect counts as a retry. Disabling retries 12:42:15 will disable redirect, too. 12:42:15 12:42:15 :param assert_same_host: 12:42:15 If ``True``, will make sure that the host of the pool requests is 12:42:15 consistent else will raise HostChangedError. When ``False``, you can 12:42:15 use the pool on an HTTP proxy and request foreign hosts. 12:42:15 12:42:15 :param timeout: 12:42:15 If specified, overrides the default timeout for this one 12:42:15 request. It may be a float (in seconds) or an instance of 12:42:15 :class:`urllib3.util.Timeout`. 
12:42:15 12:42:15 :param pool_timeout: 12:42:15 If set and the pool is set to block=True, then this method will 12:42:15 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 12:42:15 connection is available within the time period. 12:42:15 12:42:15 :param bool preload_content: 12:42:15 If True, the response's body will be preloaded into memory. 12:42:15 12:42:15 :param bool decode_content: 12:42:15 If True, will attempt to decode the body based on the 12:42:15 'content-encoding' header. 12:42:15 12:42:15 :param release_conn: 12:42:15 If False, then the urlopen call will not release the connection 12:42:15 back into the pool once a response is received (but will release if 12:42:15 you read the entire contents of the response such as when 12:42:15 `preload_content=True`). This is useful if you're not preloading 12:42:15 the response's content immediately. You will need to call 12:42:15 ``r.release_conn()`` on the response ``r`` to return the connection 12:42:15 back into the pool. If None, it takes the value of ``preload_content`` 12:42:15 which defaults to ``True``. 12:42:15 12:42:15 :param bool chunked: 12:42:15 If True, urllib3 will send the body using chunked transfer 12:42:15 encoding. Otherwise, urllib3 will send the body using the standard 12:42:15 content-length form. Defaults to False. 12:42:15 12:42:15 :param int body_pos: 12:42:15 Position to seek to in file-like body in the event of a retry or 12:42:15 redirect. Typically this won't need to be set because urllib3 will 12:42:15 auto-populate the value when needed. 
12:42:15 """ 12:42:15 parsed_url = parse_url(url) 12:42:15 destination_scheme = parsed_url.scheme 12:42:15 12:42:15 if headers is None: 12:42:15 headers = self.headers 12:42:15 12:42:15 if not isinstance(retries, Retry): 12:42:15 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 12:42:15 12:42:15 if release_conn is None: 12:42:15 release_conn = preload_content 12:42:15 12:42:15 # Check host 12:42:15 if assert_same_host and not self.is_same_host(url): 12:42:15 raise HostChangedError(self, url, retries) 12:42:15 12:42:15 # Ensure that the URL we're connecting to is properly encoded 12:42:15 if url.startswith("/"): 12:42:15 url = to_str(_encode_target(url)) 12:42:15 else: 12:42:15 url = to_str(parsed_url.url) 12:42:15 12:42:15 conn = None 12:42:15 12:42:15 # Track whether `conn` needs to be released before 12:42:15 # returning/raising/recursing. Update this variable if necessary, and 12:42:15 # leave `release_conn` constant throughout the function. That way, if 12:42:15 # the function recurses, the original value of `release_conn` will be 12:42:15 # passed down into the recursive call, and its value will be respected. 12:42:15 # 12:42:15 # See issue #651 [1] for details. 12:42:15 # 12:42:15 # [1] 12:42:15 release_this_conn = release_conn 12:42:15 12:42:15 http_tunnel_required = connection_requires_http_tunnel( 12:42:15 self.proxy, self.proxy_config, destination_scheme 12:42:15 ) 12:42:15 12:42:15 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 12:42:15 # have to copy the headers dict so we can safely change it without those 12:42:15 # changes being reflected in anyone else's copy. 12:42:15 if not http_tunnel_required: 12:42:15 headers = headers.copy() # type: ignore[attr-defined] 12:42:15 headers.update(self.proxy_headers) # type: ignore[union-attr] 12:42:15 12:42:15 # Must keep the exception bound to a separate variable or else Python 3 12:42:15 # complains about UnboundLocalError. 
12:42:15 err = None 12:42:15 12:42:15 # Keep track of whether we cleanly exited the except block. This 12:42:15 # ensures we do proper cleanup in finally. 12:42:15 clean_exit = False 12:42:15 12:42:15 # Rewind body position, if needed. Record current position 12:42:15 # for future rewinds in the event of a redirect/retry. 12:42:15 body_pos = set_file_position(body, body_pos) 12:42:15 12:42:15 try: 12:42:15 # Request a connection from the queue. 12:42:15 timeout_obj = self._get_timeout(timeout) 12:42:15 conn = self._get_conn(timeout=pool_timeout) 12:42:15 12:42:15 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 12:42:15 12:42:15 # Is this a closed/new connection that requires CONNECT tunnelling? 12:42:15 if self.proxy is not None and http_tunnel_required and conn.is_closed: 12:42:15 try: 12:42:15 self._prepare_proxy(conn) 12:42:15 except (BaseSSLError, OSError, SocketTimeout) as e: 12:42:15 self._raise_timeout( 12:42:15 err=e, url=self.proxy.url, timeout_value=conn.timeout 12:42:15 ) 12:42:15 raise 12:42:15 12:42:15 # If we're going to release the connection in ``finally:``, then 12:42:15 # the response doesn't need to know about the connection. Otherwise 12:42:15 # it will also try to release it and we'll have a double-release 12:42:15 # mess. 
12:42:15 response_conn = conn if not release_conn else None 12:42:15 12:42:15 # Make the request on the HTTPConnection object 12:42:15 > response = self._make_request( 12:42:15 conn, 12:42:15 method, 12:42:15 url, 12:42:15 timeout=timeout_obj, 12:42:15 body=body, 12:42:15 headers=headers, 12:42:15 chunked=chunked, 12:42:15 retries=retries, 12:42:15 response_conn=response_conn, 12:42:15 preload_content=preload_content, 12:42:15 decode_content=decode_content, 12:42:15 **response_kw, 12:42:15 ) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:787: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:493: in _make_request 12:42:15 conn.request( 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:500: in request 12:42:15 self.endheaders() 12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1298: in endheaders 12:42:15 self._send_output(message_body, encode_chunked=encode_chunked) 12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1058: in _send_output 12:42:15 self.send(msg) 12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:996: in send 12:42:15 self.connect() 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:331: in connect 12:42:15 self.sock = self._new_conn() 12:42:15 ^^^^^^^^^^^^^^^^ 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 self = 12:42:15 12:42:15 def _new_conn(self) -> socket.socket: 12:42:15 """Establish a socket connection and set nodelay settings on it. 12:42:15 12:42:15 :return: New socket connection. 
12:42:15 """ 12:42:15 try: 12:42:15 sock = connection.create_connection( 12:42:15 (self._dns_host, self.port), 12:42:15 self.timeout, 12:42:15 source_address=self.source_address, 12:42:15 socket_options=self.socket_options, 12:42:15 ) 12:42:15 except socket.gaierror as e: 12:42:15 raise NameResolutionError(self.host, self, e) from e 12:42:15 except SocketTimeout as e: 12:42:15 raise ConnectTimeoutError( 12:42:15 self, 12:42:15 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 12:42:15 ) from e 12:42:15 12:42:15 except OSError as e: 12:42:15 > raise NewConnectionError( 12:42:15 self, f"Failed to establish a new connection: {e}" 12:42:15 ) from e 12:42:15 E urllib3.exceptions.NewConnectionError: HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:219: NewConnectionError 12:42:15 12:42:15 The above exception was the direct cause of the following exception: 12:42:15 12:42:15 self = 12:42:15 request = , stream = False 12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None 12:42:15 proxies = OrderedDict() 12:42:15 12:42:15 def send( 12:42:15 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:42:15 ): 12:42:15 """Sends PreparedRequest object. Returns Response object. 12:42:15 12:42:15 :param request: The :class:`PreparedRequest ` being sent. 12:42:15 :param stream: (optional) Whether to stream the request content. 12:42:15 :param timeout: (optional) How long to wait for the server to send 12:42:15 data before giving up, as a float, or a :ref:`(connect timeout, 12:42:15 read timeout) ` tuple. 
12:42:15 :type timeout: float or tuple or urllib3 Timeout object 12:42:15 :param verify: (optional) Either a boolean, in which case it controls whether 12:42:15 we verify the server's TLS certificate, or a string, in which case it 12:42:15 must be a path to a CA bundle to use 12:42:15 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:42:15 :param proxies: (optional) The proxies dictionary to apply to the request. 12:42:15 :rtype: requests.Response 12:42:15 """ 12:42:15 12:42:15 try: 12:42:15 conn = self.get_connection_with_tls_context( 12:42:15 request, verify, proxies=proxies, cert=cert 12:42:15 ) 12:42:15 except LocationValueError as e: 12:42:15 raise InvalidURL(e, request=request) 12:42:15 12:42:15 self.cert_verify(conn, request.url, verify, cert) 12:42:15 url = self.request_url(request, proxies) 12:42:15 self.add_headers( 12:42:15 request, 12:42:15 stream=stream, 12:42:15 timeout=timeout, 12:42:15 verify=verify, 12:42:15 cert=cert, 12:42:15 proxies=proxies, 12:42:15 ) 12:42:15 12:42:15 chunked = not (request.body is None or "Content-Length" in request.headers) 12:42:15 12:42:15 if isinstance(timeout, tuple): 12:42:15 try: 12:42:15 connect, read = timeout 12:42:15 timeout = TimeoutSauce(connect=connect, read=read) 12:42:15 except ValueError: 12:42:15 raise ValueError( 12:42:15 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:42:15 f"or a single float to set both timeouts to the same value." 
12:42:15 ) 12:42:15 elif isinstance(timeout, TimeoutSauce): 12:42:15 pass 12:42:15 else: 12:42:15 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:42:15 12:42:15 try: 12:42:15 > resp = conn.urlopen( 12:42:15 method=request.method, 12:42:15 url=url, 12:42:15 body=request.body, 12:42:15 headers=request.headers, 12:42:15 redirect=False, 12:42:15 assert_same_host=False, 12:42:15 preload_content=False, 12:42:15 decode_content=False, 12:42:15 retries=self.max_retries, 12:42:15 timeout=timeout, 12:42:15 chunked=chunked, 12:42:15 ) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:644: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen 12:42:15 retries = retries.increment( 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:42:15 method = 'GET' 12:42:15 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT3' 12:42:15 response = None 12:42:15 error = NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused") 12:42:15 _pool = 12:42:15 _stacktrace = 12:42:15 12:42:15 def increment( 12:42:15 self, 12:42:15 method: str | None = None, 12:42:15 url: str | None = None, 12:42:15 response: BaseHTTPResponse | None = None, 12:42:15 error: Exception | None = None, 12:42:15 _pool: ConnectionPool | None = None, 12:42:15 _stacktrace: TracebackType | None = None, 12:42:15 ) -> Self: 12:42:15 """Return a new Retry object with incremented retry counters. 12:42:15 12:42:15 :param response: A response object, or None, if the server did not 12:42:15 return a response. 
12:42:15 :type response: :class:`~urllib3.response.BaseHTTPResponse` 12:42:15 :param Exception error: An error encountered during the request, or 12:42:15 None if the response was received successfully. 12:42:15 12:42:15 :return: A new ``Retry`` object. 12:42:15 """ 12:42:15 if self.total is False and error: 12:42:15 # Disabled, indicate to re-raise the error. 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 12:42:15 total = self.total 12:42:15 if total is not None: 12:42:15 total -= 1 12:42:15 12:42:15 connect = self.connect 12:42:15 read = self.read 12:42:15 redirect = self.redirect 12:42:15 status_count = self.status 12:42:15 other = self.other 12:42:15 cause = "unknown" 12:42:15 status = None 12:42:15 redirect_location = None 12:42:15 12:42:15 if error and self._is_connection_error(error): 12:42:15 # Connect retry? 12:42:15 if connect is False: 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 elif connect is not None: 12:42:15 connect -= 1 12:42:15 12:42:15 elif error and self._is_read_error(error): 12:42:15 # Read retry? 12:42:15 if read is False or method is None or not self._is_method_retryable(method): 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 elif read is not None: 12:42:15 read -= 1 12:42:15 12:42:15 elif error: 12:42:15 # Other retry? 12:42:15 if other is not None: 12:42:15 other -= 1 12:42:15 12:42:15 elif response and response.get_redirect_location(): 12:42:15 # Redirect retry? 
12:42:15             if redirect is not None:
12:42:15                 redirect -= 1
12:42:15             cause = "too many redirects"
12:42:15             response_redirect_location = response.get_redirect_location()
12:42:15             if response_redirect_location:
12:42:15                 redirect_location = response_redirect_location
12:42:15             status = response.status
12:42:15 
12:42:15         else:
12:42:15             # Incrementing because of a server error like a 500 in
12:42:15             # status_forcelist and the given method is in the allowed_methods
12:42:15             cause = ResponseError.GENERIC_ERROR
12:42:15             if response and response.status:
12:42:15                 if status_count is not None:
12:42:15                     status_count -= 1
12:42:15                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
12:42:15                 status = response.status
12:42:15 
12:42:15         history = self.history + (
12:42:15             RequestHistory(method, url, error, status, redirect_location),
12:42:15         )
12:42:15 
12:42:15         new_retry = self.new(
12:42:15             total=total,
12:42:15             connect=connect,
12:42:15             read=read,
12:42:15             redirect=redirect,
12:42:15             status=status_count,
12:42:15             other=other,
12:42:15             history=history,
12:42:15         )
12:42:15 
12:42:15         if new_retry.is_exhausted():
12:42:15             reason = error or ResponseError(cause)
12:42:15 >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
12:42:15             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT3 (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:535: MaxRetryError
12:42:15 
12:42:15 During handling of the above exception, another exception occurred:
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def test_14_xpdr_portmapping_CLIENT3(self):
12:42:15 >       response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-CLIENT3")
12:42:15                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 
12:42:15 transportpce_tests/1.2.1/test01_portmapping.py:170: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 transportpce_tests/common/test_utils.py:519: in get_portmapping_node_attr
12:42:15     response = get_request(target_url)
12:42:15                ^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 transportpce_tests/common/test_utils.py:117: in get_request
12:42:15     return requests.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request
12:42:15     return session.request(method=method, url=url, **kwargs)
12:42:15            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request
12:42:15     resp = self.send(prep, **send_kwargs)
12:42:15            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send
12:42:15     r = adapter.send(request, **kwargs)
12:42:15         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 self = 
12:42:15 request = , stream = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
12:42:15 proxies = OrderedDict()
12:42:15 
12:42:15     def send(
12:42:15         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
12:42:15     ):
12:42:15         """Sends PreparedRequest object. Returns Response object.
12:42:15 
12:42:15         :param request: The :class:`PreparedRequest ` being sent.
12:42:15         :param stream: (optional) Whether to stream the request content.
12:42:15         :param timeout: (optional) How long to wait for the server to send
12:42:15             data before giving up, as a float, or a :ref:`(connect timeout,
12:42:15             read timeout) ` tuple.
12:42:15 :type timeout: float or tuple or urllib3 Timeout object 12:42:15 :param verify: (optional) Either a boolean, in which case it controls whether 12:42:15 we verify the server's TLS certificate, or a string, in which case it 12:42:15 must be a path to a CA bundle to use 12:42:15 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:42:15 :param proxies: (optional) The proxies dictionary to apply to the request. 12:42:15 :rtype: requests.Response 12:42:15 """ 12:42:15 12:42:15 try: 12:42:15 conn = self.get_connection_with_tls_context( 12:42:15 request, verify, proxies=proxies, cert=cert 12:42:15 ) 12:42:15 except LocationValueError as e: 12:42:15 raise InvalidURL(e, request=request) 12:42:15 12:42:15 self.cert_verify(conn, request.url, verify, cert) 12:42:15 url = self.request_url(request, proxies) 12:42:15 self.add_headers( 12:42:15 request, 12:42:15 stream=stream, 12:42:15 timeout=timeout, 12:42:15 verify=verify, 12:42:15 cert=cert, 12:42:15 proxies=proxies, 12:42:15 ) 12:42:15 12:42:15 chunked = not (request.body is None or "Content-Length" in request.headers) 12:42:15 12:42:15 if isinstance(timeout, tuple): 12:42:15 try: 12:42:15 connect, read = timeout 12:42:15 timeout = TimeoutSauce(connect=connect, read=read) 12:42:15 except ValueError: 12:42:15 raise ValueError( 12:42:15 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:42:15 f"or a single float to set both timeouts to the same value." 
12:42:15 ) 12:42:15 elif isinstance(timeout, TimeoutSauce): 12:42:15 pass 12:42:15 else: 12:42:15 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:42:15 12:42:15 try: 12:42:15 resp = conn.urlopen( 12:42:15 method=request.method, 12:42:15 url=url, 12:42:15 body=request.body, 12:42:15 headers=request.headers, 12:42:15 redirect=False, 12:42:15 assert_same_host=False, 12:42:15 preload_content=False, 12:42:15 decode_content=False, 12:42:15 retries=self.max_retries, 12:42:15 timeout=timeout, 12:42:15 chunked=chunked, 12:42:15 ) 12:42:15 12:42:15 except (ProtocolError, OSError) as err: 12:42:15 raise ConnectionError(err, request=request) 12:42:15 12:42:15 except MaxRetryError as e: 12:42:15 if isinstance(e.reason, ConnectTimeoutError): 12:42:15 # TODO: Remove this in 3.0.0: see #2811 12:42:15 if not isinstance(e.reason, NewConnectionError): 12:42:15 raise ConnectTimeout(e, request=request) 12:42:15 12:42:15 if isinstance(e.reason, ResponseError): 12:42:15 raise RetryError(e, request=request) 12:42:15 12:42:15 if isinstance(e.reason, _ProxyError): 12:42:15 raise ProxyError(e, request=request) 12:42:15 12:42:15 if isinstance(e.reason, _SSLError): 12:42:15 # This branch is for urllib3 v1.22 and later. 
12:42:15                 raise SSLError(e, request=request)
12:42:15 
12:42:15 >           raise ConnectionError(e, request=request)
12:42:15 E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT3 (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:677: ConnectionError
12:42:15 ----------------------------- Captured stdout call -----------------------------
12:42:15 execution of test_14_xpdr_portmapping_CLIENT3
12:42:15 _________ TestTransportPCEPortmapping.test_15_xpdr_portmapping_CLIENT4 _________
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15 
12:42:15         :return: New socket connection.
12:42:15         """
12:42:15         try:
12:42:15 >           sock = connection.create_connection(
12:42:15                 (self._dns_host, self.port),
12:42:15                 self.timeout,
12:42:15                 source_address=self.source_address,
12:42:15                 socket_options=self.socket_options,
12:42:15             )
12:42:15
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:204: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection
12:42:15     raise err
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15
12:42:15 address = ('localhost', 8191), timeout = 30, source_address = None
12:42:15 socket_options = [(6, 1, 1)]
12:42:15
12:42:15     def create_connection(
12:42:15         address: tuple[str, int],
12:42:15         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
12:42:15         source_address: tuple[str, int] | None = None,
12:42:15         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
12:42:15     ) -> socket.socket:
12:42:15         """Connect to *address* and return the socket object.
12:42:15
12:42:15         Convenience function. Connect to *address* (a 2-tuple ``(host,
12:42:15         port)``) and return the socket object. Passing the optional
12:42:15         *timeout* parameter will set the timeout on the socket instance
12:42:15         before attempting to connect. If no *timeout* is supplied, the
12:42:15         global default timeout setting returned by :func:`socket.getdefaulttimeout`
12:42:15         is used. If *source_address* is set it must be a tuple of (host, port)
12:42:15         for the socket to bind as a source address before making the connection.
12:42:15         An host of '' or port 0 tells the OS to use the default.
12:42:15         """
12:42:15
12:42:15         host, port = address
12:42:15         if host.startswith("["):
12:42:15             host = host.strip("[]")
12:42:15         err = None
12:42:15
12:42:15         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
12:42:15         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
12:42:15         # The original create_connection function always returns all records.
12:42:15         family = allowed_gai_family()
12:42:15
12:42:15         try:
12:42:15             host.encode("idna")
12:42:15         except UnicodeError:
12:42:15             raise LocationParseError(f"'{host}', label empty or too long") from None
12:42:15
12:42:15         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
12:42:15             af, socktype, proto, canonname, sa = res
12:42:15             sock = None
12:42:15             try:
12:42:15                 sock = socket.socket(af, socktype, proto)
12:42:15
12:42:15                 # If provided, set socket level options before connecting.
12:42:15                 _set_socket_options(sock, socket_options)
12:42:15
12:42:15                 if timeout is not _DEFAULT_TIMEOUT:
12:42:15                     sock.settimeout(timeout)
12:42:15                 if source_address:
12:42:15                     sock.bind(source_address)
12:42:15 >               sock.connect(sa)
12:42:15 E               ConnectionRefusedError: [Errno 111] Connection refused
12:42:15
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
12:42:15
12:42:15 The above exception was the direct cause of the following exception:
12:42:15
12:42:15 self = 
12:42:15 method = 'GET'
12:42:15 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT4'
12:42:15 body = None
12:42:15 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='}
12:42:15 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
12:42:15 redirect = False, assert_same_host = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), pool_timeout = None
12:42:15 release_conn = False, chunked = False, body_pos = None, preload_content = False
12:42:15 decode_content = False, response_kw = {}
12:42:15 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT4', query=None, fragment=None)
12:42:15 destination_scheme = None, conn = None, release_this_conn = True
12:42:15 http_tunnel_required = False, err = None, clean_exit = False
12:42:15
12:42:15     def urlopen(  # type: ignore[override]
12:42:15         self,
12:42:15         method: str,
12:42:15         url: str,
12:42:15         body: _TYPE_BODY | None = None,
12:42:15         headers: typing.Mapping[str, str] | None = None,
12:42:15         retries: Retry | bool | int | None = None,
12:42:15         redirect: bool = True,
12:42:15         assert_same_host: bool = True,
12:42:15         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
12:42:15         pool_timeout: int | None = None,
12:42:15         release_conn: bool | None = None,
12:42:15         chunked: bool = False,
12:42:15         body_pos: _TYPE_BODY_POSITION | None = None,
12:42:15         preload_content: bool = True,
12:42:15         decode_content: bool = True,
12:42:15         **response_kw: typing.Any,
12:42:15     ) -> BaseHTTPResponse:
12:42:15         """
12:42:15         Get a connection from the pool and perform an HTTP request. This is the
12:42:15         lowest level call for making a request, so you'll need to specify all
12:42:15         the raw details.
12:42:15
12:42:15         .. note::
12:42:15
12:42:15             More commonly, it's appropriate to use a convenience method
12:42:15             such as :meth:`request`.
12:42:15
12:42:15         .. note::
12:42:15
12:42:15             `release_conn` will only behave as expected if
12:42:15             `preload_content=False` because we want to make
12:42:15             `preload_content=False` the default behaviour someday soon without
12:42:15             breaking backwards compatibility.
12:42:15
12:42:15         :param method:
12:42:15             HTTP request method (such as GET, POST, PUT, etc.)
12:42:15
12:42:15         :param url:
12:42:15             The URL to perform the request on.
12:42:15
12:42:15         :param body:
12:42:15             Data to send in the request body, either :class:`str`, :class:`bytes`,
12:42:15             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
12:42:15
12:42:15         :param headers:
12:42:15             Dictionary of custom headers to send, such as User-Agent,
12:42:15             If-None-Match, etc. If None, pool headers are used. If provided,
12:42:15             these headers completely replace any pool-specific headers.
12:42:15
12:42:15         :param retries:
12:42:15             Configure the number of retries to allow before raising a
12:42:15             :class:`~urllib3.exceptions.MaxRetryError` exception.
12:42:15
12:42:15             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
12:42:15             :class:`~urllib3.util.retry.Retry` object for fine-grained control
12:42:15             over different types of retries.
12:42:15             Pass an integer number to retry connection errors that many times,
12:42:15             but no other types of errors. Pass zero to never retry.
12:42:15
12:42:15             If ``False``, then retries are disabled and any exception is raised
12:42:15             immediately. Also, instead of raising a MaxRetryError on redirects,
12:42:15             the redirect response will be returned.
12:42:15
12:42:15         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
12:42:15
12:42:15         :param redirect:
12:42:15             If True, automatically handle redirects (status codes 301, 302,
12:42:15             303, 307, 308). Each redirect counts as a retry. Disabling retries
12:42:15             will disable redirect, too.
12:42:15
12:42:15         :param assert_same_host:
12:42:15             If ``True``, will make sure that the host of the pool requests is
12:42:15             consistent else will raise HostChangedError. When ``False``, you can
12:42:15             use the pool on an HTTP proxy and request foreign hosts.
12:42:15
12:42:15         :param timeout:
12:42:15             If specified, overrides the default timeout for this one
12:42:15             request. It may be a float (in seconds) or an instance of
12:42:15             :class:`urllib3.util.Timeout`.
12:42:15
12:42:15         :param pool_timeout:
12:42:15             If set and the pool is set to block=True, then this method will
12:42:15             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
12:42:15             connection is available within the time period.
12:42:15
12:42:15         :param bool preload_content:
12:42:15             If True, the response's body will be preloaded into memory.
12:42:15
12:42:15         :param bool decode_content:
12:42:15             If True, will attempt to decode the body based on the
12:42:15             'content-encoding' header.
12:42:15
12:42:15         :param release_conn:
12:42:15             If False, then the urlopen call will not release the connection
12:42:15             back into the pool once a response is received (but will release if
12:42:15             you read the entire contents of the response such as when
12:42:15             `preload_content=True`). This is useful if you're not preloading
12:42:15             the response's content immediately. You will need to call
12:42:15             ``r.release_conn()`` on the response ``r`` to return the connection
12:42:15             back into the pool. If None, it takes the value of ``preload_content``
12:42:15             which defaults to ``True``.
12:42:15
12:42:15         :param bool chunked:
12:42:15             If True, urllib3 will send the body using chunked transfer
12:42:15             encoding. Otherwise, urllib3 will send the body using the standard
12:42:15             content-length form. Defaults to False.
12:42:15
12:42:15         :param int body_pos:
12:42:15             Position to seek to in file-like body in the event of a retry or
12:42:15             redirect. Typically this won't need to be set because urllib3 will
12:42:15             auto-populate the value when needed.
12:42:15         """
12:42:15         parsed_url = parse_url(url)
12:42:15         destination_scheme = parsed_url.scheme
12:42:15
12:42:15         if headers is None:
12:42:15             headers = self.headers
12:42:15
12:42:15         if not isinstance(retries, Retry):
12:42:15             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
12:42:15
12:42:15         if release_conn is None:
12:42:15             release_conn = preload_content
12:42:15
12:42:15         # Check host
12:42:15         if assert_same_host and not self.is_same_host(url):
12:42:15             raise HostChangedError(self, url, retries)
12:42:15
12:42:15         # Ensure that the URL we're connecting to is properly encoded
12:42:15         if url.startswith("/"):
12:42:15             url = to_str(_encode_target(url))
12:42:15         else:
12:42:15             url = to_str(parsed_url.url)
12:42:15
12:42:15         conn = None
12:42:15
12:42:15         # Track whether `conn` needs to be released before
12:42:15         # returning/raising/recursing. Update this variable if necessary, and
12:42:15         # leave `release_conn` constant throughout the function. That way, if
12:42:15         # the function recurses, the original value of `release_conn` will be
12:42:15         # passed down into the recursive call, and its value will be respected.
12:42:15         #
12:42:15         # See issue #651 [1] for details.
12:42:15         #
12:42:15         # [1] 
12:42:15         release_this_conn = release_conn
12:42:15
12:42:15         http_tunnel_required = connection_requires_http_tunnel(
12:42:15             self.proxy, self.proxy_config, destination_scheme
12:42:15         )
12:42:15
12:42:15         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
12:42:15         # have to copy the headers dict so we can safely change it without those
12:42:15         # changes being reflected in anyone else's copy.
12:42:15         if not http_tunnel_required:
12:42:15             headers = headers.copy()  # type: ignore[attr-defined]
12:42:15             headers.update(self.proxy_headers)  # type: ignore[union-attr]
12:42:15
12:42:15         # Must keep the exception bound to a separate variable or else Python 3
12:42:15         # complains about UnboundLocalError.
12:42:15         err = None
12:42:15
12:42:15         # Keep track of whether we cleanly exited the except block. This
12:42:15         # ensures we do proper cleanup in finally.
12:42:15         clean_exit = False
12:42:15
12:42:15         # Rewind body position, if needed. Record current position
12:42:15         # for future rewinds in the event of a redirect/retry.
12:42:15         body_pos = set_file_position(body, body_pos)
12:42:15
12:42:15         try:
12:42:15             # Request a connection from the queue.
12:42:15             timeout_obj = self._get_timeout(timeout)
12:42:15             conn = self._get_conn(timeout=pool_timeout)
12:42:15
12:42:15             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
12:42:15
12:42:15             # Is this a closed/new connection that requires CONNECT tunnelling?
12:42:15             if self.proxy is not None and http_tunnel_required and conn.is_closed:
12:42:15                 try:
12:42:15                     self._prepare_proxy(conn)
12:42:15                 except (BaseSSLError, OSError, SocketTimeout) as e:
12:42:15                     self._raise_timeout(
12:42:15                         err=e, url=self.proxy.url, timeout_value=conn.timeout
12:42:15                     )
12:42:15                     raise
12:42:15
12:42:15             # If we're going to release the connection in ``finally:``, then
12:42:15             # the response doesn't need to know about the connection. Otherwise
12:42:15             # it will also try to release it and we'll have a double-release
12:42:15             # mess.
12:42:15             response_conn = conn if not release_conn else None
12:42:15
12:42:15             # Make the request on the HTTPConnection object
12:42:15 >           response = self._make_request(
12:42:15                 conn,
12:42:15                 method,
12:42:15                 url,
12:42:15                 timeout=timeout_obj,
12:42:15                 body=body,
12:42:15                 headers=headers,
12:42:15                 chunked=chunked,
12:42:15                 retries=retries,
12:42:15                 response_conn=response_conn,
12:42:15                 preload_content=preload_content,
12:42:15                 decode_content=decode_content,
12:42:15                 **response_kw,
12:42:15             )
12:42:15
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:787: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:493: in _make_request
12:42:15     conn.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:500: in request
12:42:15     self.endheaders()
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1298: in endheaders
12:42:15     self._send_output(message_body, encode_chunked=encode_chunked)
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1058: in _send_output
12:42:15     self.send(msg)
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:996: in send
12:42:15     self.connect()
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:331: in connect
12:42:15     self.sock = self._new_conn()
12:42:15                 ^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15
12:42:15 self = 
12:42:15
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15
12:42:15         :return: New socket connection.
12:42:15         """
12:42:15         try:
12:42:15             sock = connection.create_connection(
12:42:15                 (self._dns_host, self.port),
12:42:15                 self.timeout,
12:42:15                 source_address=self.source_address,
12:42:15                 socket_options=self.socket_options,
12:42:15             )
12:42:15         except socket.gaierror as e:
12:42:15             raise NameResolutionError(self.host, self, e) from e
12:42:15         except SocketTimeout as e:
12:42:15             raise ConnectTimeoutError(
12:42:15                 self,
12:42:15                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
12:42:15             ) from e
12:42:15
12:42:15         except OSError as e:
12:42:15 >           raise NewConnectionError(
12:42:15                 self, f"Failed to establish a new connection: {e}"
12:42:15             ) from e
12:42:15 E           urllib3.exceptions.NewConnectionError: HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused
12:42:15
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:219: NewConnectionError
12:42:15
12:42:15 The above exception was the direct cause of the following exception:
12:42:15
12:42:15 self = 
12:42:15 request = , stream = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
12:42:15 proxies = OrderedDict()
12:42:15
12:42:15     def send(
12:42:15         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
12:42:15     ):
12:42:15         """Sends PreparedRequest object. Returns Response object.
12:42:15
12:42:15         :param request: The :class:`PreparedRequest ` being sent.
12:42:15         :param stream: (optional) Whether to stream the request content.
12:42:15         :param timeout: (optional) How long to wait for the server to send
12:42:15             data before giving up, as a float, or a :ref:`(connect timeout,
12:42:15             read timeout) ` tuple.
12:42:15         :type timeout: float or tuple or urllib3 Timeout object
12:42:15         :param verify: (optional) Either a boolean, in which case it controls whether
12:42:15             we verify the server's TLS certificate, or a string, in which case it
12:42:15             must be a path to a CA bundle to use
12:42:15         :param cert: (optional) Any user-provided SSL certificate to be trusted.
12:42:15         :param proxies: (optional) The proxies dictionary to apply to the request.
12:42:15         :rtype: requests.Response
12:42:15         """
12:42:15
12:42:15         try:
12:42:15             conn = self.get_connection_with_tls_context(
12:42:15                 request, verify, proxies=proxies, cert=cert
12:42:15             )
12:42:15         except LocationValueError as e:
12:42:15             raise InvalidURL(e, request=request)
12:42:15
12:42:15         self.cert_verify(conn, request.url, verify, cert)
12:42:15         url = self.request_url(request, proxies)
12:42:15         self.add_headers(
12:42:15             request,
12:42:15             stream=stream,
12:42:15             timeout=timeout,
12:42:15             verify=verify,
12:42:15             cert=cert,
12:42:15             proxies=proxies,
12:42:15         )
12:42:15
12:42:15         chunked = not (request.body is None or "Content-Length" in request.headers)
12:42:15
12:42:15         if isinstance(timeout, tuple):
12:42:15             try:
12:42:15                 connect, read = timeout
12:42:15                 timeout = TimeoutSauce(connect=connect, read=read)
12:42:15             except ValueError:
12:42:15                 raise ValueError(
12:42:15                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
12:42:15                     f"or a single float to set both timeouts to the same value."
12:42:15                 )
12:42:15         elif isinstance(timeout, TimeoutSauce):
12:42:15             pass
12:42:15         else:
12:42:15             timeout = TimeoutSauce(connect=timeout, read=timeout)
12:42:15
12:42:15         try:
12:42:15 >           resp = conn.urlopen(
12:42:15                 method=request.method,
12:42:15                 url=url,
12:42:15                 body=request.body,
12:42:15                 headers=request.headers,
12:42:15                 redirect=False,
12:42:15                 assert_same_host=False,
12:42:15                 preload_content=False,
12:42:15                 decode_content=False,
12:42:15                 retries=self.max_retries,
12:42:15                 timeout=timeout,
12:42:15                 chunked=chunked,
12:42:15             )
12:42:15
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:644: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen
12:42:15     retries = retries.increment(
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15
12:42:15 self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
12:42:15 method = 'GET'
12:42:15 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT4'
12:42:15 response = None
12:42:15 error = NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused")
12:42:15 _pool = 
12:42:15 _stacktrace = 
12:42:15
12:42:15     def increment(
12:42:15         self,
12:42:15         method: str | None = None,
12:42:15         url: str | None = None,
12:42:15         response: BaseHTTPResponse | None = None,
12:42:15         error: Exception | None = None,
12:42:15         _pool: ConnectionPool | None = None,
12:42:15         _stacktrace: TracebackType | None = None,
12:42:15     ) -> Self:
12:42:15         """Return a new Retry object with incremented retry counters.
12:42:15
12:42:15         :param response: A response object, or None, if the server did not
12:42:15             return a response.
12:42:15         :type response: :class:`~urllib3.response.BaseHTTPResponse`
12:42:15         :param Exception error: An error encountered during the request, or
12:42:15             None if the response was received successfully.
12:42:15
12:42:15         :return: A new ``Retry`` object.
12:42:15         """
12:42:15         if self.total is False and error:
12:42:15             # Disabled, indicate to re-raise the error.
12:42:15             raise reraise(type(error), error, _stacktrace)
12:42:15
12:42:15         total = self.total
12:42:15         if total is not None:
12:42:15             total -= 1
12:42:15
12:42:15         connect = self.connect
12:42:15         read = self.read
12:42:15         redirect = self.redirect
12:42:15         status_count = self.status
12:42:15         other = self.other
12:42:15         cause = "unknown"
12:42:15         status = None
12:42:15         redirect_location = None
12:42:15
12:42:15         if error and self._is_connection_error(error):
12:42:15             # Connect retry?
12:42:15             if connect is False:
12:42:15                 raise reraise(type(error), error, _stacktrace)
12:42:15             elif connect is not None:
12:42:15                 connect -= 1
12:42:15
12:42:15         elif error and self._is_read_error(error):
12:42:15             # Read retry?
12:42:15             if read is False or method is None or not self._is_method_retryable(method):
12:42:15                 raise reraise(type(error), error, _stacktrace)
12:42:15             elif read is not None:
12:42:15                 read -= 1
12:42:15
12:42:15         elif error:
12:42:15             # Other retry?
12:42:15             if other is not None:
12:42:15                 other -= 1
12:42:15
12:42:15         elif response and response.get_redirect_location():
12:42:15             # Redirect retry?
12:42:15             if redirect is not None:
12:42:15                 redirect -= 1
12:42:15             cause = "too many redirects"
12:42:15             response_redirect_location = response.get_redirect_location()
12:42:15             if response_redirect_location:
12:42:15                 redirect_location = response_redirect_location
12:42:15             status = response.status
12:42:15
12:42:15         else:
12:42:15             # Incrementing because of a server error like a 500 in
12:42:15             # status_forcelist and the given method is in the allowed_methods
12:42:15             cause = ResponseError.GENERIC_ERROR
12:42:15             if response and response.status:
12:42:15                 if status_count is not None:
12:42:15                     status_count -= 1
12:42:15                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
12:42:15                 status = response.status
12:42:15
12:42:15         history = self.history + (
12:42:15             RequestHistory(method, url, error, status, redirect_location),
12:42:15         )
12:42:15
12:42:15         new_retry = self.new(
12:42:15             total=total,
12:42:15             connect=connect,
12:42:15             read=read,
12:42:15             redirect=redirect,
12:42:15             status=status_count,
12:42:15             other=other,
12:42:15             history=history,
12:42:15         )
12:42:15
12:42:15         if new_retry.is_exhausted():
12:42:15             reason = error or ResponseError(cause)
12:42:15 >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
12:42:15             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT4 (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:535: MaxRetryError
12:42:15
12:42:15 During handling of the above exception, another exception occurred:
12:42:15
12:42:15 self = 
12:42:15
12:42:15     def test_15_xpdr_portmapping_CLIENT4(self):
12:42:15 >       response = test_utils.get_portmapping_node_attr("XPDRA01", "mapping", "XPDR1-CLIENT4")
12:42:15                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15
12:42:15 transportpce_tests/1.2.1/test01_portmapping.py:182: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 transportpce_tests/common/test_utils.py:519: in get_portmapping_node_attr
12:42:15     response = get_request(target_url)
12:42:15                ^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 transportpce_tests/common/test_utils.py:117: in get_request
12:42:15     return requests.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request
12:42:15     return session.request(method=method, url=url, **kwargs)
12:42:15            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request
12:42:15     resp = self.send(prep, **send_kwargs)
12:42:15            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send
12:42:15     r = adapter.send(request, **kwargs)
12:42:15         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15
12:42:15 self = 
12:42:15 request = , stream = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
12:42:15 proxies = OrderedDict()
12:42:15
12:42:15     def send(
12:42:15         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
12:42:15     ):
12:42:15         """Sends PreparedRequest object. Returns Response object.
12:42:15
12:42:15         :param request: The :class:`PreparedRequest ` being sent.
12:42:15         :param stream: (optional) Whether to stream the request content.
12:42:15         :param timeout: (optional) How long to wait for the server to send
12:42:15             data before giving up, as a float, or a :ref:`(connect timeout,
12:42:15             read timeout) ` tuple.
12:42:15         :type timeout: float or tuple or urllib3 Timeout object
12:42:15         :param verify: (optional) Either a boolean, in which case it controls whether
12:42:15             we verify the server's TLS certificate, or a string, in which case it
12:42:15             must be a path to a CA bundle to use
12:42:15         :param cert: (optional) Any user-provided SSL certificate to be trusted.
12:42:15         :param proxies: (optional) The proxies dictionary to apply to the request.
12:42:15         :rtype: requests.Response
12:42:15         """
12:42:15
12:42:15         try:
12:42:15             conn = self.get_connection_with_tls_context(
12:42:15                 request, verify, proxies=proxies, cert=cert
12:42:15             )
12:42:15         except LocationValueError as e:
12:42:15             raise InvalidURL(e, request=request)
12:42:15
12:42:15         self.cert_verify(conn, request.url, verify, cert)
12:42:15         url = self.request_url(request, proxies)
12:42:15         self.add_headers(
12:42:15             request,
12:42:15             stream=stream,
12:42:15             timeout=timeout,
12:42:15             verify=verify,
12:42:15             cert=cert,
12:42:15             proxies=proxies,
12:42:15         )
12:42:15
12:42:15         chunked = not (request.body is None or "Content-Length" in request.headers)
12:42:15
12:42:15         if isinstance(timeout, tuple):
12:42:15             try:
12:42:15                 connect, read = timeout
12:42:15                 timeout = TimeoutSauce(connect=connect, read=read)
12:42:15             except ValueError:
12:42:15                 raise ValueError(
12:42:15                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
12:42:15                     f"or a single float to set both timeouts to the same value."
12:42:15                 )
12:42:15         elif isinstance(timeout, TimeoutSauce):
12:42:15             pass
12:42:15         else:
12:42:15             timeout = TimeoutSauce(connect=timeout, read=timeout)
12:42:15
12:42:15         try:
12:42:15             resp = conn.urlopen(
12:42:15                 method=request.method,
12:42:15                 url=url,
12:42:15                 body=request.body,
12:42:15                 headers=request.headers,
12:42:15                 redirect=False,
12:42:15                 assert_same_host=False,
12:42:15                 preload_content=False,
12:42:15                 decode_content=False,
12:42:15                 retries=self.max_retries,
12:42:15                 timeout=timeout,
12:42:15                 chunked=chunked,
12:42:15             )
12:42:15
12:42:15         except (ProtocolError, OSError) as err:
12:42:15             raise ConnectionError(err, request=request)
12:42:15
12:42:15         except MaxRetryError as e:
12:42:15             if isinstance(e.reason, ConnectTimeoutError):
12:42:15                 # TODO: Remove this in 3.0.0: see #2811
12:42:15                 if not isinstance(e.reason, NewConnectionError):
12:42:15                     raise ConnectTimeout(e, request=request)
12:42:15
12:42:15             if isinstance(e.reason, ResponseError):
12:42:15                 raise RetryError(e, request=request)
12:42:15
12:42:15             if isinstance(e.reason, _ProxyError):
12:42:15                 raise ProxyError(e, request=request)
12:42:15
12:42:15             if isinstance(e.reason, _SSLError):
12:42:15                 # This branch is for urllib3 v1.22 and later.
12:42:15                 raise SSLError(e, request=request)
12:42:15
12:42:15 >           raise ConnectionError(e, request=request)
12:42:15 E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/mapping=XPDR1-CLIENT4 (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:677: ConnectionError
12:42:15 ----------------------------- Captured stdout call -----------------------------
12:42:15 execution of test_15_xpdr_portmapping_CLIENT4
12:42:15 ________ TestTransportPCEPortmapping.test_16_xpdr_device_disconnection _________
12:42:15
12:42:15 self = 
12:42:15
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15
12:42:15         :return: New socket connection.
12:42:15         """
12:42:15         try:
12:42:15 >           sock = connection.create_connection(
12:42:15                 (self._dns_host, self.port),
12:42:15                 self.timeout,
12:42:15                 source_address=self.source_address,
12:42:15                 socket_options=self.socket_options,
12:42:15             )
12:42:15
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:204: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection
12:42:15     raise err
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15
12:42:15 address = ('localhost', 8191), timeout = 30, source_address = None
12:42:15 socket_options = [(6, 1, 1)]
12:42:15
12:42:15     def create_connection(
12:42:15         address: tuple[str, int],
12:42:15         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
12:42:15         source_address: tuple[str, int] | None = None,
12:42:15         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
12:42:15     ) -> socket.socket:
12:42:15         """Connect to *address* and return the socket object.
12:42:15
12:42:15         Convenience function. Connect to *address* (a 2-tuple ``(host,
12:42:15         port)``) and return the socket object. Passing the optional
12:42:15         *timeout* parameter will set the timeout on the socket instance
12:42:15         before attempting to connect. If no *timeout* is supplied, the
12:42:15         global default timeout setting returned by :func:`socket.getdefaulttimeout`
12:42:15         is used. If *source_address* is set it must be a tuple of (host, port)
12:42:15         for the socket to bind as a source address before making the connection.
12:42:15         An host of '' or port 0 tells the OS to use the default.
12:42:15         """
12:42:15
12:42:15         host, port = address
12:42:15         if host.startswith("["):
12:42:15             host = host.strip("[]")
12:42:15         err = None
12:42:15
12:42:15         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
12:42:15         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
12:42:15         # The original create_connection function always returns all records.
12:42:15         family = allowed_gai_family()
12:42:15
12:42:15         try:
12:42:15             host.encode("idna")
12:42:15         except UnicodeError:
12:42:15             raise LocationParseError(f"'{host}', label empty or too long") from None
12:42:15
12:42:15         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
12:42:15             af, socktype, proto, canonname, sa = res
12:42:15             sock = None
12:42:15             try:
12:42:15                 sock = socket.socket(af, socktype, proto)
12:42:15
12:42:15                 # If provided, set socket level options before connecting.
12:42:15                 _set_socket_options(sock, socket_options)
12:42:15
12:42:15                 if timeout is not _DEFAULT_TIMEOUT:
12:42:15                     sock.settimeout(timeout)
12:42:15                 if source_address:
12:42:15                     sock.bind(source_address)
12:42:15 >               sock.connect(sa)
12:42:15 E               ConnectionRefusedError: [Errno 111] Connection refused
12:42:15
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
12:42:15
12:42:15 The above exception was the direct cause of the following exception:
12:42:15
12:42:15 self = 
12:42:15 method = 'DELETE'
12:42:15 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01'
12:42:15 body = None
12:42:15 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '0', 'Authorization': 'Basic YWRtaW46YWRtaW4='}
12:42:15 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
12:42:15 redirect = False, assert_same_host = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), pool_timeout = None
12:42:15 release_conn = False, chunked = False, body_pos = None, preload_content = False
12:42:15 decode_content = False, response_kw = {}
12:42:15 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01', query=None, fragment=None)
12:42:15 destination_scheme = None, conn = None, release_this_conn = True
12:42:15 http_tunnel_required = False, err = None, clean_exit = False
12:42:15
12:42:15     def urlopen(  # type: ignore[override]
12:42:15         self,
12:42:15         method: str,
12:42:15         url: str,
12:42:15         body: _TYPE_BODY | None = None,
12:42:15         headers: typing.Mapping[str, str] | None = None,
12:42:15         retries: Retry | bool | int | None = None,
12:42:15         redirect: bool = True,
12:42:15         assert_same_host: bool = True,
12:42:15         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
12:42:15         pool_timeout: int | None = None,
12:42:15         release_conn: bool | None = None,
12:42:15         chunked: bool = False,
12:42:15         body_pos: _TYPE_BODY_POSITION | None = None,
12:42:15         preload_content: bool = True,
12:42:15         decode_content: bool = True,
12:42:15         **response_kw: typing.Any,
12:42:15     ) -> BaseHTTPResponse:
12:42:15         """
12:42:15         Get a connection from the pool and perform an HTTP request. This is the
12:42:15         lowest level call for making a request, so you'll need to specify all
12:42:15         the raw details.
12:42:15
12:42:15         .. note::
12:42:15
12:42:15             More commonly, it's appropriate to use a convenience method
12:42:15             such as :meth:`request`.
12:42:15
12:42:15         .. note::
12:42:15
12:42:15             `release_conn` will only behave as expected if
12:42:15             `preload_content=False` because we want to make
12:42:15             `preload_content=False` the default behaviour someday soon without
12:42:15             breaking backwards compatibility.
12:42:15
12:42:15         :param method:
12:42:15             HTTP request method (such as GET, POST, PUT, etc.)
12:42:15
12:42:15         :param url:
12:42:15             The URL to perform the request on.
12:42:15
12:42:15         :param body:
12:42:15             Data to send in the request body, either :class:`str`, :class:`bytes`,
12:42:15             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
12:42:15
12:42:15         :param headers:
12:42:15             Dictionary of custom headers to send, such as User-Agent,
12:42:15             If-None-Match, etc. If None, pool headers are used. If provided,
12:42:15             these headers completely replace any pool-specific headers.
12:42:15
12:42:15         :param retries:
12:42:15             Configure the number of retries to allow before raising a
12:42:15             :class:`~urllib3.exceptions.MaxRetryError` exception.
12:42:15
12:42:15             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
12:42:15             :class:`~urllib3.util.retry.Retry` object for fine-grained control
12:42:15             over different types of retries.
12:42:15             Pass an integer number to retry connection errors that many times,
12:42:15             but no other types of errors. Pass zero to never retry.
12:42:15
12:42:15             If ``False``, then retries are disabled and any exception is raised
12:42:15             immediately. Also, instead of raising a MaxRetryError on redirects,
12:42:15             the redirect response will be returned.
12:42:15
12:42:15         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
12:42:15
12:42:15         :param redirect:
12:42:15             If True, automatically handle redirects (status codes 301, 302,
12:42:15             303, 307, 308). Each redirect counts as a retry. Disabling retries
12:42:15             will disable redirect, too.
12:42:15
12:42:15         :param assert_same_host:
12:42:15             If ``True``, will make sure that the host of the pool requests is
12:42:15             consistent else will raise HostChangedError. When ``False``, you can
12:42:15             use the pool on an HTTP proxy and request foreign hosts.
12:42:15
12:42:15         :param timeout:
12:42:15             If specified, overrides the default timeout for this one
12:42:15             request.
It may be a float (in seconds) or an instance of 12:42:15 :class:`urllib3.util.Timeout`. 12:42:15 12:42:15 :param pool_timeout: 12:42:15 If set and the pool is set to block=True, then this method will 12:42:15 block for ``pool_timeout`` seconds and raise EmptyPoolError if no 12:42:15 connection is available within the time period. 12:42:15 12:42:15 :param bool preload_content: 12:42:15 If True, the response's body will be preloaded into memory. 12:42:15 12:42:15 :param bool decode_content: 12:42:15 If True, will attempt to decode the body based on the 12:42:15 'content-encoding' header. 12:42:15 12:42:15 :param release_conn: 12:42:15 If False, then the urlopen call will not release the connection 12:42:15 back into the pool once a response is received (but will release if 12:42:15 you read the entire contents of the response such as when 12:42:15 `preload_content=True`). This is useful if you're not preloading 12:42:15 the response's content immediately. You will need to call 12:42:15 ``r.release_conn()`` on the response ``r`` to return the connection 12:42:15 back into the pool. If None, it takes the value of ``preload_content`` 12:42:15 which defaults to ``True``. 12:42:15 12:42:15 :param bool chunked: 12:42:15 If True, urllib3 will send the body using chunked transfer 12:42:15 encoding. Otherwise, urllib3 will send the body using the standard 12:42:15 content-length form. Defaults to False. 12:42:15 12:42:15 :param int body_pos: 12:42:15 Position to seek to in file-like body in the event of a retry or 12:42:15 redirect. Typically this won't need to be set because urllib3 will 12:42:15 auto-populate the value when needed. 
12:42:15 """ 12:42:15 parsed_url = parse_url(url) 12:42:15 destination_scheme = parsed_url.scheme 12:42:15 12:42:15 if headers is None: 12:42:15 headers = self.headers 12:42:15 12:42:15 if not isinstance(retries, Retry): 12:42:15 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 12:42:15 12:42:15 if release_conn is None: 12:42:15 release_conn = preload_content 12:42:15 12:42:15 # Check host 12:42:15 if assert_same_host and not self.is_same_host(url): 12:42:15 raise HostChangedError(self, url, retries) 12:42:15 12:42:15 # Ensure that the URL we're connecting to is properly encoded 12:42:15 if url.startswith("/"): 12:42:15 url = to_str(_encode_target(url)) 12:42:15 else: 12:42:15 url = to_str(parsed_url.url) 12:42:15 12:42:15 conn = None 12:42:15 12:42:15 # Track whether `conn` needs to be released before 12:42:15 # returning/raising/recursing. Update this variable if necessary, and 12:42:15 # leave `release_conn` constant throughout the function. That way, if 12:42:15 # the function recurses, the original value of `release_conn` will be 12:42:15 # passed down into the recursive call, and its value will be respected. 12:42:15 # 12:42:15 # See issue #651 [1] for details. 12:42:15 # 12:42:15 # [1] 12:42:15 release_this_conn = release_conn 12:42:15 12:42:15 http_tunnel_required = connection_requires_http_tunnel( 12:42:15 self.proxy, self.proxy_config, destination_scheme 12:42:15 ) 12:42:15 12:42:15 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 12:42:15 # have to copy the headers dict so we can safely change it without those 12:42:15 # changes being reflected in anyone else's copy. 12:42:15 if not http_tunnel_required: 12:42:15 headers = headers.copy() # type: ignore[attr-defined] 12:42:15 headers.update(self.proxy_headers) # type: ignore[union-attr] 12:42:15 12:42:15 # Must keep the exception bound to a separate variable or else Python 3 12:42:15 # complains about UnboundLocalError. 
12:42:15 err = None 12:42:15 12:42:15 # Keep track of whether we cleanly exited the except block. This 12:42:15 # ensures we do proper cleanup in finally. 12:42:15 clean_exit = False 12:42:15 12:42:15 # Rewind body position, if needed. Record current position 12:42:15 # for future rewinds in the event of a redirect/retry. 12:42:15 body_pos = set_file_position(body, body_pos) 12:42:15 12:42:15 try: 12:42:15 # Request a connection from the queue. 12:42:15 timeout_obj = self._get_timeout(timeout) 12:42:15 conn = self._get_conn(timeout=pool_timeout) 12:42:15 12:42:15 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 12:42:15 12:42:15 # Is this a closed/new connection that requires CONNECT tunnelling? 12:42:15 if self.proxy is not None and http_tunnel_required and conn.is_closed: 12:42:15 try: 12:42:15 self._prepare_proxy(conn) 12:42:15 except (BaseSSLError, OSError, SocketTimeout) as e: 12:42:15 self._raise_timeout( 12:42:15 err=e, url=self.proxy.url, timeout_value=conn.timeout 12:42:15 ) 12:42:15 raise 12:42:15 12:42:15 # If we're going to release the connection in ``finally:``, then 12:42:15 # the response doesn't need to know about the connection. Otherwise 12:42:15 # it will also try to release it and we'll have a double-release 12:42:15 # mess. 
12:42:15 response_conn = conn if not release_conn else None 12:42:15 12:42:15 # Make the request on the HTTPConnection object 12:42:15 > response = self._make_request( 12:42:15 conn, 12:42:15 method, 12:42:15 url, 12:42:15 timeout=timeout_obj, 12:42:15 body=body, 12:42:15 headers=headers, 12:42:15 chunked=chunked, 12:42:15 retries=retries, 12:42:15 response_conn=response_conn, 12:42:15 preload_content=preload_content, 12:42:15 decode_content=decode_content, 12:42:15 **response_kw, 12:42:15 ) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:787: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:493: in _make_request 12:42:15 conn.request( 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:500: in request 12:42:15 self.endheaders() 12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1298: in endheaders 12:42:15 self._send_output(message_body, encode_chunked=encode_chunked) 12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1058: in _send_output 12:42:15 self.send(msg) 12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:996: in send 12:42:15 self.connect() 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:331: in connect 12:42:15 self.sock = self._new_conn() 12:42:15 ^^^^^^^^^^^^^^^^ 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 self = 12:42:15 12:42:15 def _new_conn(self) -> socket.socket: 12:42:15 """Establish a socket connection and set nodelay settings on it. 12:42:15 12:42:15 :return: New socket connection. 
12:42:15 """ 12:42:15 try: 12:42:15 sock = connection.create_connection( 12:42:15 (self._dns_host, self.port), 12:42:15 self.timeout, 12:42:15 source_address=self.source_address, 12:42:15 socket_options=self.socket_options, 12:42:15 ) 12:42:15 except socket.gaierror as e: 12:42:15 raise NameResolutionError(self.host, self, e) from e 12:42:15 except SocketTimeout as e: 12:42:15 raise ConnectTimeoutError( 12:42:15 self, 12:42:15 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 12:42:15 ) from e 12:42:15 12:42:15 except OSError as e: 12:42:15 > raise NewConnectionError( 12:42:15 self, f"Failed to establish a new connection: {e}" 12:42:15 ) from e 12:42:15 E urllib3.exceptions.NewConnectionError: HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:219: NewConnectionError 12:42:15 12:42:15 The above exception was the direct cause of the following exception: 12:42:15 12:42:15 self = 12:42:15 request = , stream = False 12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None 12:42:15 proxies = OrderedDict() 12:42:15 12:42:15 def send( 12:42:15 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:42:15 ): 12:42:15 """Sends PreparedRequest object. Returns Response object. 12:42:15 12:42:15 :param request: The :class:`PreparedRequest ` being sent. 12:42:15 :param stream: (optional) Whether to stream the request content. 12:42:15 :param timeout: (optional) How long to wait for the server to send 12:42:15 data before giving up, as a float, or a :ref:`(connect timeout, 12:42:15 read timeout) ` tuple. 
12:42:15 :type timeout: float or tuple or urllib3 Timeout object 12:42:15 :param verify: (optional) Either a boolean, in which case it controls whether 12:42:15 we verify the server's TLS certificate, or a string, in which case it 12:42:15 must be a path to a CA bundle to use 12:42:15 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:42:15 :param proxies: (optional) The proxies dictionary to apply to the request. 12:42:15 :rtype: requests.Response 12:42:15 """ 12:42:15 12:42:15 try: 12:42:15 conn = self.get_connection_with_tls_context( 12:42:15 request, verify, proxies=proxies, cert=cert 12:42:15 ) 12:42:15 except LocationValueError as e: 12:42:15 raise InvalidURL(e, request=request) 12:42:15 12:42:15 self.cert_verify(conn, request.url, verify, cert) 12:42:15 url = self.request_url(request, proxies) 12:42:15 self.add_headers( 12:42:15 request, 12:42:15 stream=stream, 12:42:15 timeout=timeout, 12:42:15 verify=verify, 12:42:15 cert=cert, 12:42:15 proxies=proxies, 12:42:15 ) 12:42:15 12:42:15 chunked = not (request.body is None or "Content-Length" in request.headers) 12:42:15 12:42:15 if isinstance(timeout, tuple): 12:42:15 try: 12:42:15 connect, read = timeout 12:42:15 timeout = TimeoutSauce(connect=connect, read=read) 12:42:15 except ValueError: 12:42:15 raise ValueError( 12:42:15 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:42:15 f"or a single float to set both timeouts to the same value." 
12:42:15 ) 12:42:15 elif isinstance(timeout, TimeoutSauce): 12:42:15 pass 12:42:15 else: 12:42:15 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:42:15 12:42:15 try: 12:42:15 > resp = conn.urlopen( 12:42:15 method=request.method, 12:42:15 url=url, 12:42:15 body=request.body, 12:42:15 headers=request.headers, 12:42:15 redirect=False, 12:42:15 assert_same_host=False, 12:42:15 preload_content=False, 12:42:15 decode_content=False, 12:42:15 retries=self.max_retries, 12:42:15 timeout=timeout, 12:42:15 chunked=chunked, 12:42:15 ) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:644: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen 12:42:15 retries = retries.increment( 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:42:15 method = 'DELETE' 12:42:15 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01' 12:42:15 response = None 12:42:15 error = NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused") 12:42:15 _pool = 12:42:15 _stacktrace = 12:42:15 12:42:15 def increment( 12:42:15 self, 12:42:15 method: str | None = None, 12:42:15 url: str | None = None, 12:42:15 response: BaseHTTPResponse | None = None, 12:42:15 error: Exception | None = None, 12:42:15 _pool: ConnectionPool | None = None, 12:42:15 _stacktrace: TracebackType | None = None, 12:42:15 ) -> Self: 12:42:15 """Return a new Retry object with incremented retry counters. 12:42:15 12:42:15 :param response: A response object, or None, if the server did not 12:42:15 return a response. 
12:42:15 :type response: :class:`~urllib3.response.BaseHTTPResponse` 12:42:15 :param Exception error: An error encountered during the request, or 12:42:15 None if the response was received successfully. 12:42:15 12:42:15 :return: A new ``Retry`` object. 12:42:15 """ 12:42:15 if self.total is False and error: 12:42:15 # Disabled, indicate to re-raise the error. 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 12:42:15 total = self.total 12:42:15 if total is not None: 12:42:15 total -= 1 12:42:15 12:42:15 connect = self.connect 12:42:15 read = self.read 12:42:15 redirect = self.redirect 12:42:15 status_count = self.status 12:42:15 other = self.other 12:42:15 cause = "unknown" 12:42:15 status = None 12:42:15 redirect_location = None 12:42:15 12:42:15 if error and self._is_connection_error(error): 12:42:15 # Connect retry? 12:42:15 if connect is False: 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 elif connect is not None: 12:42:15 connect -= 1 12:42:15 12:42:15 elif error and self._is_read_error(error): 12:42:15 # Read retry? 12:42:15 if read is False or method is None or not self._is_method_retryable(method): 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 elif read is not None: 12:42:15 read -= 1 12:42:15 12:42:15 elif error: 12:42:15 # Other retry? 12:42:15 if other is not None: 12:42:15 other -= 1 12:42:15 12:42:15 elif response and response.get_redirect_location(): 12:42:15 # Redirect retry? 
12:42:15 if redirect is not None: 12:42:15 redirect -= 1 12:42:15 cause = "too many redirects" 12:42:15 response_redirect_location = response.get_redirect_location() 12:42:15 if response_redirect_location: 12:42:15 redirect_location = response_redirect_location 12:42:15 status = response.status 12:42:15 12:42:15 else: 12:42:15 # Incrementing because of a server error like a 500 in 12:42:15 # status_forcelist and the given method is in the allowed_methods 12:42:15 cause = ResponseError.GENERIC_ERROR 12:42:15 if response and response.status: 12:42:15 if status_count is not None: 12:42:15 status_count -= 1 12:42:15 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 12:42:15 status = response.status 12:42:15 12:42:15 history = self.history + ( 12:42:15 RequestHistory(method, url, error, status, redirect_location), 12:42:15 ) 12:42:15 12:42:15 new_retry = self.new( 12:42:15 total=total, 12:42:15 connect=connect, 12:42:15 read=read, 12:42:15 redirect=redirect, 12:42:15 status=status_count, 12:42:15 other=other, 12:42:15 history=history, 12:42:15 ) 12:42:15 12:42:15 if new_retry.is_exhausted(): 12:42:15 reason = error or ResponseError(cause) 12:42:15 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 12:42:15 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 12:42:15 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01 (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused")) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:535: MaxRetryError 12:42:15 12:42:15 During handling of the above exception, another exception occurred: 12:42:15 12:42:15 self = 12:42:15 12:42:15 def test_16_xpdr_device_disconnection(self): 12:42:15 > response = 
test_utils.unmount_device("XPDRA01") 12:42:15 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 12:42:15 12:42:15 transportpce_tests/1.2.1/test01_portmapping.py:193: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 transportpce_tests/common/test_utils.py:398: in unmount_device 12:42:15 response = delete_request(url[RESTCONF_VERSION].format('{}', node)) 12:42:15 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 12:42:15 transportpce_tests/common/test_utils.py:134: in delete_request 12:42:15 return requests.request( 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 12:42:15 return session.request(method=method, url=url, **kwargs) 12:42:15 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 12:42:15 resp = self.send(prep, **send_kwargs) 12:42:15 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 12:42:15 r = adapter.send(request, **kwargs) 12:42:15 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 self = 12:42:15 request = , stream = False 12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None 12:42:15 proxies = OrderedDict() 12:42:15 12:42:15 def send( 12:42:15 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:42:15 ): 12:42:15 """Sends PreparedRequest object. Returns Response object. 12:42:15 12:42:15 :param request: The :class:`PreparedRequest ` being sent. 12:42:15 :param stream: (optional) Whether to stream the request content. 12:42:15 :param timeout: (optional) How long to wait for the server to send 12:42:15 data before giving up, as a float, or a :ref:`(connect timeout, 12:42:15 read timeout) ` tuple. 
12:42:15 :type timeout: float or tuple or urllib3 Timeout object 12:42:15 :param verify: (optional) Either a boolean, in which case it controls whether 12:42:15 we verify the server's TLS certificate, or a string, in which case it 12:42:15 must be a path to a CA bundle to use 12:42:15 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:42:15 :param proxies: (optional) The proxies dictionary to apply to the request. 12:42:15 :rtype: requests.Response 12:42:15 """ 12:42:15 12:42:15 try: 12:42:15 conn = self.get_connection_with_tls_context( 12:42:15 request, verify, proxies=proxies, cert=cert 12:42:15 ) 12:42:15 except LocationValueError as e: 12:42:15 raise InvalidURL(e, request=request) 12:42:15 12:42:15 self.cert_verify(conn, request.url, verify, cert) 12:42:15 url = self.request_url(request, proxies) 12:42:15 self.add_headers( 12:42:15 request, 12:42:15 stream=stream, 12:42:15 timeout=timeout, 12:42:15 verify=verify, 12:42:15 cert=cert, 12:42:15 proxies=proxies, 12:42:15 ) 12:42:15 12:42:15 chunked = not (request.body is None or "Content-Length" in request.headers) 12:42:15 12:42:15 if isinstance(timeout, tuple): 12:42:15 try: 12:42:15 connect, read = timeout 12:42:15 timeout = TimeoutSauce(connect=connect, read=read) 12:42:15 except ValueError: 12:42:15 raise ValueError( 12:42:15 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:42:15 f"or a single float to set both timeouts to the same value." 
12:42:15 ) 12:42:15 elif isinstance(timeout, TimeoutSauce): 12:42:15 pass 12:42:15 else: 12:42:15 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:42:15 12:42:15 try: 12:42:15 resp = conn.urlopen( 12:42:15 method=request.method, 12:42:15 url=url, 12:42:15 body=request.body, 12:42:15 headers=request.headers, 12:42:15 redirect=False, 12:42:15 assert_same_host=False, 12:42:15 preload_content=False, 12:42:15 decode_content=False, 12:42:15 retries=self.max_retries, 12:42:15 timeout=timeout, 12:42:15 chunked=chunked, 12:42:15 ) 12:42:15 12:42:15 except (ProtocolError, OSError) as err: 12:42:15 raise ConnectionError(err, request=request) 12:42:15 12:42:15 except MaxRetryError as e: 12:42:15 if isinstance(e.reason, ConnectTimeoutError): 12:42:15 # TODO: Remove this in 3.0.0: see #2811 12:42:15 if not isinstance(e.reason, NewConnectionError): 12:42:15 raise ConnectTimeout(e, request=request) 12:42:15 12:42:15 if isinstance(e.reason, ResponseError): 12:42:15 raise RetryError(e, request=request) 12:42:15 12:42:15 if isinstance(e.reason, _ProxyError): 12:42:15 raise ProxyError(e, request=request) 12:42:15 12:42:15 if isinstance(e.reason, _SSLError): 12:42:15 # This branch is for urllib3 v1.22 and later. 
12:42:15 raise SSLError(e, request=request) 12:42:15 12:42:15 > raise ConnectionError(e, request=request) 12:42:15 E requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01 (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused")) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:677: ConnectionError 12:42:15 ----------------------------- Captured stdout call ----------------------------- 12:42:15 execution of test_16_xpdr_device_disconnection 12:42:15 _________ TestTransportPCEPortmapping.test_17_xpdr_device_disconnected _________ 12:42:15 12:42:15 self = 12:42:15 12:42:15 def _new_conn(self) -> socket.socket: 12:42:15 """Establish a socket connection and set nodelay settings on it. 12:42:15 12:42:15 :return: New socket connection. 
12:42:15 """ 12:42:15 try: 12:42:15 > sock = connection.create_connection( 12:42:15 (self._dns_host, self.port), 12:42:15 self.timeout, 12:42:15 source_address=self.source_address, 12:42:15 socket_options=self.socket_options, 12:42:15 ) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:204: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 12:42:15 raise err 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 address = ('localhost', 8191), timeout = 30, source_address = None 12:42:15 socket_options = [(6, 1, 1)] 12:42:15 12:42:15 def create_connection( 12:42:15 address: tuple[str, int], 12:42:15 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:42:15 source_address: tuple[str, int] | None = None, 12:42:15 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 12:42:15 ) -> socket.socket: 12:42:15 """Connect to *address* and return the socket object. 12:42:15 12:42:15 Convenience function. Connect to *address* (a 2-tuple ``(host, 12:42:15 port)``) and return the socket object. Passing the optional 12:42:15 *timeout* parameter will set the timeout on the socket instance 12:42:15 before attempting to connect. If no *timeout* is supplied, the 12:42:15 global default timeout setting returned by :func:`socket.getdefaulttimeout` 12:42:15 is used. If *source_address* is set it must be a tuple of (host, port) 12:42:15 for the socket to bind as a source address before making the connection. 12:42:15 An host of '' or port 0 tells the OS to use the default. 
12:42:15 """ 12:42:15 12:42:15 host, port = address 12:42:15 if host.startswith("["): 12:42:15 host = host.strip("[]") 12:42:15 err = None 12:42:15 12:42:15 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 12:42:15 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 12:42:15 # The original create_connection function always returns all records. 12:42:15 family = allowed_gai_family() 12:42:15 12:42:15 try: 12:42:15 host.encode("idna") 12:42:15 except UnicodeError: 12:42:15 raise LocationParseError(f"'{host}', label empty or too long") from None 12:42:15 12:42:15 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 12:42:15 af, socktype, proto, canonname, sa = res 12:42:15 sock = None 12:42:15 try: 12:42:15 sock = socket.socket(af, socktype, proto) 12:42:15 12:42:15 # If provided, set socket level options before connecting. 12:42:15 _set_socket_options(sock, socket_options) 12:42:15 12:42:15 if timeout is not _DEFAULT_TIMEOUT: 12:42:15 sock.settimeout(timeout) 12:42:15 if source_address: 12:42:15 sock.bind(source_address) 12:42:15 > sock.connect(sa) 12:42:15 E ConnectionRefusedError: [Errno 111] Connection refused 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 12:42:15 12:42:15 The above exception was the direct cause of the following exception: 12:42:15 12:42:15 self = 12:42:15 method = 'GET' 12:42:15 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig' 12:42:15 body = None 12:42:15 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 12:42:15 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:42:15 redirect = False, assert_same_host = False 12:42:15 timeout = 
12:42:15 Timeout(connect=30, read=30, total=None), pool_timeout = None
12:42:15 release_conn = False, chunked = False, body_pos = None, preload_content = False
12:42:15 decode_content = False, response_kw = {}
12:42:15 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01', query='content=nonconfig', fragment=None)
12:42:15 destination_scheme = None, conn = None, release_this_conn = True
12:42:15 http_tunnel_required = False, err = None, clean_exit = False
12:42:15 
12:42:15     def urlopen( # type: ignore[override]
12:42:15         self,
12:42:15         method: str,
12:42:15         url: str,
12:42:15         body: _TYPE_BODY | None = None,
12:42:15         headers: typing.Mapping[str, str] | None = None,
12:42:15         retries: Retry | bool | int | None = None,
12:42:15         redirect: bool = True,
12:42:15         assert_same_host: bool = True,
12:42:15         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
12:42:15         pool_timeout: int | None = None,
12:42:15         release_conn: bool | None = None,
12:42:15         chunked: bool = False,
12:42:15         body_pos: _TYPE_BODY_POSITION | None = None,
12:42:15         preload_content: bool = True,
12:42:15         decode_content: bool = True,
12:42:15         **response_kw: typing.Any,
12:42:15     ) -> BaseHTTPResponse:
12:42:15         """
12:42:15         Get a connection from the pool and perform an HTTP request. This is the
12:42:15         lowest level call for making a request, so you'll need to specify all
12:42:15         the raw details.
12:42:15 
12:42:15         .. note::
12:42:15 
12:42:15            More commonly, it's appropriate to use a convenience method
12:42:15            such as :meth:`request`.
12:42:15 
12:42:15         .. note::
12:42:15 
12:42:15            `release_conn` will only behave as expected if
12:42:15            `preload_content=False` because we want to make
12:42:15            `preload_content=False` the default behaviour someday soon without
12:42:15            breaking backwards compatibility.
12:42:15 
12:42:15         :param method:
12:42:15             HTTP request method (such as GET, POST, PUT, etc.)
12:42:15 
12:42:15         :param url:
12:42:15             The URL to perform the request on.
12:42:15 
12:42:15         :param body:
12:42:15             Data to send in the request body, either :class:`str`, :class:`bytes`,
12:42:15             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
12:42:15 
12:42:15         :param headers:
12:42:15             Dictionary of custom headers to send, such as User-Agent,
12:42:15             If-None-Match, etc. If None, pool headers are used. If provided,
12:42:15             these headers completely replace any pool-specific headers.
12:42:15 
12:42:15         :param retries:
12:42:15             Configure the number of retries to allow before raising a
12:42:15             :class:`~urllib3.exceptions.MaxRetryError` exception.
12:42:15 
12:42:15             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
12:42:15             :class:`~urllib3.util.retry.Retry` object for fine-grained control
12:42:15             over different types of retries.
12:42:15             Pass an integer number to retry connection errors that many times,
12:42:15             but no other types of errors. Pass zero to never retry.
12:42:15 
12:42:15             If ``False``, then retries are disabled and any exception is raised
12:42:15             immediately. Also, instead of raising a MaxRetryError on redirects,
12:42:15             the redirect response will be returned.
12:42:15 
12:42:15         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
12:42:15 
12:42:15         :param redirect:
12:42:15             If True, automatically handle redirects (status codes 301, 302,
12:42:15             303, 307, 308). Each redirect counts as a retry. Disabling retries
12:42:15             will disable redirect, too.
12:42:15 
12:42:15         :param assert_same_host:
12:42:15             If ``True``, will make sure that the host of the pool requests is
12:42:15             consistent else will raise HostChangedError. When ``False``, you can
12:42:15             use the pool on an HTTP proxy and request foreign hosts.
12:42:15 
12:42:15         :param timeout:
12:42:15             If specified, overrides the default timeout for this one
12:42:15             request. It may be a float (in seconds) or an instance of
12:42:15             :class:`urllib3.util.Timeout`.
12:42:15 
12:42:15         :param pool_timeout:
12:42:15             If set and the pool is set to block=True, then this method will
12:42:15             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
12:42:15             connection is available within the time period.
12:42:15 
12:42:15         :param bool preload_content:
12:42:15             If True, the response's body will be preloaded into memory.
12:42:15 
12:42:15         :param bool decode_content:
12:42:15             If True, will attempt to decode the body based on the
12:42:15             'content-encoding' header.
12:42:15 
12:42:15         :param release_conn:
12:42:15             If False, then the urlopen call will not release the connection
12:42:15             back into the pool once a response is received (but will release if
12:42:15             you read the entire contents of the response such as when
12:42:15             `preload_content=True`). This is useful if you're not preloading
12:42:15             the response's content immediately. You will need to call
12:42:15             ``r.release_conn()`` on the response ``r`` to return the connection
12:42:15             back into the pool. If None, it takes the value of ``preload_content``
12:42:15             which defaults to ``True``.
12:42:15 
12:42:15         :param bool chunked:
12:42:15             If True, urllib3 will send the body using chunked transfer
12:42:15             encoding. Otherwise, urllib3 will send the body using the standard
12:42:15             content-length form. Defaults to False.
12:42:15 
12:42:15         :param int body_pos:
12:42:15             Position to seek to in file-like body in the event of a retry or
12:42:15             redirect. Typically this won't need to be set because urllib3 will
12:42:15             auto-populate the value when needed.
12:42:15 """ 12:42:15 parsed_url = parse_url(url) 12:42:15 destination_scheme = parsed_url.scheme 12:42:15 12:42:15 if headers is None: 12:42:15 headers = self.headers 12:42:15 12:42:15 if not isinstance(retries, Retry): 12:42:15 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 12:42:15 12:42:15 if release_conn is None: 12:42:15 release_conn = preload_content 12:42:15 12:42:15 # Check host 12:42:15 if assert_same_host and not self.is_same_host(url): 12:42:15 raise HostChangedError(self, url, retries) 12:42:15 12:42:15 # Ensure that the URL we're connecting to is properly encoded 12:42:15 if url.startswith("/"): 12:42:15 url = to_str(_encode_target(url)) 12:42:15 else: 12:42:15 url = to_str(parsed_url.url) 12:42:15 12:42:15 conn = None 12:42:15 12:42:15 # Track whether `conn` needs to be released before 12:42:15 # returning/raising/recursing. Update this variable if necessary, and 12:42:15 # leave `release_conn` constant throughout the function. That way, if 12:42:15 # the function recurses, the original value of `release_conn` will be 12:42:15 # passed down into the recursive call, and its value will be respected. 12:42:15 # 12:42:15 # See issue #651 [1] for details. 12:42:15 # 12:42:15 # [1] 12:42:15 release_this_conn = release_conn 12:42:15 12:42:15 http_tunnel_required = connection_requires_http_tunnel( 12:42:15 self.proxy, self.proxy_config, destination_scheme 12:42:15 ) 12:42:15 12:42:15 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 12:42:15 # have to copy the headers dict so we can safely change it without those 12:42:15 # changes being reflected in anyone else's copy. 12:42:15 if not http_tunnel_required: 12:42:15 headers = headers.copy() # type: ignore[attr-defined] 12:42:15 headers.update(self.proxy_headers) # type: ignore[union-attr] 12:42:15 12:42:15 # Must keep the exception bound to a separate variable or else Python 3 12:42:15 # complains about UnboundLocalError. 
12:42:15 err = None 12:42:15 12:42:15 # Keep track of whether we cleanly exited the except block. This 12:42:15 # ensures we do proper cleanup in finally. 12:42:15 clean_exit = False 12:42:15 12:42:15 # Rewind body position, if needed. Record current position 12:42:15 # for future rewinds in the event of a redirect/retry. 12:42:15 body_pos = set_file_position(body, body_pos) 12:42:15 12:42:15 try: 12:42:15 # Request a connection from the queue. 12:42:15 timeout_obj = self._get_timeout(timeout) 12:42:15 conn = self._get_conn(timeout=pool_timeout) 12:42:15 12:42:15 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 12:42:15 12:42:15 # Is this a closed/new connection that requires CONNECT tunnelling? 12:42:15 if self.proxy is not None and http_tunnel_required and conn.is_closed: 12:42:15 try: 12:42:15 self._prepare_proxy(conn) 12:42:15 except (BaseSSLError, OSError, SocketTimeout) as e: 12:42:15 self._raise_timeout( 12:42:15 err=e, url=self.proxy.url, timeout_value=conn.timeout 12:42:15 ) 12:42:15 raise 12:42:15 12:42:15 # If we're going to release the connection in ``finally:``, then 12:42:15 # the response doesn't need to know about the connection. Otherwise 12:42:15 # it will also try to release it and we'll have a double-release 12:42:15 # mess. 
12:42:15             response_conn = conn if not release_conn else None
12:42:15 
12:42:15             # Make the request on the HTTPConnection object
12:42:15 >           response = self._make_request(
12:42:15                 conn,
12:42:15                 method,
12:42:15                 url,
12:42:15                 timeout=timeout_obj,
12:42:15                 body=body,
12:42:15                 headers=headers,
12:42:15                 chunked=chunked,
12:42:15                 retries=retries,
12:42:15                 response_conn=response_conn,
12:42:15                 preload_content=preload_content,
12:42:15                 decode_content=decode_content,
12:42:15                 **response_kw,
12:42:15             )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:787: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:493: in _make_request
12:42:15     conn.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:500: in request
12:42:15     self.endheaders()
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1298: in endheaders
12:42:15     self._send_output(message_body, encode_chunked=encode_chunked)
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1058: in _send_output
12:42:15     self.send(msg)
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:996: in send
12:42:15     self.connect()
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:331: in connect
12:42:15     self.sock = self._new_conn()
12:42:15                 ^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15 
12:42:15         :return: New socket connection.
12:42:15         """
12:42:15         try:
12:42:15             sock = connection.create_connection(
12:42:15                 (self._dns_host, self.port),
12:42:15                 self.timeout,
12:42:15                 source_address=self.source_address,
12:42:15                 socket_options=self.socket_options,
12:42:15             )
12:42:15         except socket.gaierror as e:
12:42:15             raise NameResolutionError(self.host, self, e) from e
12:42:15         except SocketTimeout as e:
12:42:15             raise ConnectTimeoutError(
12:42:15                 self,
12:42:15                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
12:42:15             ) from e
12:42:15 
12:42:15         except OSError as e:
12:42:15 >           raise NewConnectionError(
12:42:15                 self, f"Failed to establish a new connection: {e}"
12:42:15             ) from e
12:42:15 E           urllib3.exceptions.NewConnectionError: HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:219: NewConnectionError
12:42:15 
12:42:15 The above exception was the direct cause of the following exception:
12:42:15 
12:42:15 self = 
12:42:15 request = , stream = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
12:42:15 proxies = OrderedDict()
12:42:15 
12:42:15     def send(
12:42:15         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
12:42:15     ):
12:42:15         """Sends PreparedRequest object. Returns Response object.
12:42:15 
12:42:15         :param request: The :class:`PreparedRequest ` being sent.
12:42:15         :param stream: (optional) Whether to stream the request content.
12:42:15         :param timeout: (optional) How long to wait for the server to send
12:42:15             data before giving up, as a float, or a :ref:`(connect timeout,
12:42:15             read timeout) ` tuple.
12:42:15         :type timeout: float or tuple or urllib3 Timeout object
12:42:15         :param verify: (optional) Either a boolean, in which case it controls whether
12:42:15             we verify the server's TLS certificate, or a string, in which case it
12:42:15             must be a path to a CA bundle to use
12:42:15         :param cert: (optional) Any user-provided SSL certificate to be trusted.
12:42:15         :param proxies: (optional) The proxies dictionary to apply to the request.
12:42:15         :rtype: requests.Response
12:42:15         """
12:42:15 
12:42:15         try:
12:42:15             conn = self.get_connection_with_tls_context(
12:42:15                 request, verify, proxies=proxies, cert=cert
12:42:15             )
12:42:15         except LocationValueError as e:
12:42:15             raise InvalidURL(e, request=request)
12:42:15 
12:42:15         self.cert_verify(conn, request.url, verify, cert)
12:42:15         url = self.request_url(request, proxies)
12:42:15         self.add_headers(
12:42:15             request,
12:42:15             stream=stream,
12:42:15             timeout=timeout,
12:42:15             verify=verify,
12:42:15             cert=cert,
12:42:15             proxies=proxies,
12:42:15         )
12:42:15 
12:42:15         chunked = not (request.body is None or "Content-Length" in request.headers)
12:42:15 
12:42:15         if isinstance(timeout, tuple):
12:42:15             try:
12:42:15                 connect, read = timeout
12:42:15                 timeout = TimeoutSauce(connect=connect, read=read)
12:42:15             except ValueError:
12:42:15                 raise ValueError(
12:42:15                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
12:42:15                     f"or a single float to set both timeouts to the same value."
12:42:15                 )
12:42:15         elif isinstance(timeout, TimeoutSauce):
12:42:15             pass
12:42:15         else:
12:42:15             timeout = TimeoutSauce(connect=timeout, read=timeout)
12:42:15 
12:42:15         try:
12:42:15 >           resp = conn.urlopen(
12:42:15                 method=request.method,
12:42:15                 url=url,
12:42:15                 body=request.body,
12:42:15                 headers=request.headers,
12:42:15                 redirect=False,
12:42:15                 assert_same_host=False,
12:42:15                 preload_content=False,
12:42:15                 decode_content=False,
12:42:15                 retries=self.max_retries,
12:42:15                 timeout=timeout,
12:42:15                 chunked=chunked,
12:42:15             )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:644: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen
12:42:15     retries = retries.increment(
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
12:42:15 method = 'GET'
12:42:15 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig'
12:42:15 response = None
12:42:15 error = NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused")
12:42:15 _pool = 
12:42:15 _stacktrace = 
12:42:15 
12:42:15     def increment(
12:42:15         self,
12:42:15         method: str | None = None,
12:42:15         url: str | None = None,
12:42:15         response: BaseHTTPResponse | None = None,
12:42:15         error: Exception | None = None,
12:42:15         _pool: ConnectionPool | None = None,
12:42:15         _stacktrace: TracebackType | None = None,
12:42:15     ) -> Self:
12:42:15         """Return a new Retry object with incremented retry counters.
12:42:15 
12:42:15         :param response: A response object, or None, if the server did not
12:42:15             return a response.
12:42:15         :type response: :class:`~urllib3.response.BaseHTTPResponse`
12:42:15         :param Exception error: An error encountered during the request, or
12:42:15             None if the response was received successfully.
12:42:15 
12:42:15         :return: A new ``Retry`` object.
12:42:15         """
12:42:15         if self.total is False and error:
12:42:15             # Disabled, indicate to re-raise the error.
12:42:15             raise reraise(type(error), error, _stacktrace)
12:42:15 
12:42:15         total = self.total
12:42:15         if total is not None:
12:42:15             total -= 1
12:42:15 
12:42:15         connect = self.connect
12:42:15         read = self.read
12:42:15         redirect = self.redirect
12:42:15         status_count = self.status
12:42:15         other = self.other
12:42:15         cause = "unknown"
12:42:15         status = None
12:42:15         redirect_location = None
12:42:15 
12:42:15         if error and self._is_connection_error(error):
12:42:15             # Connect retry?
12:42:15             if connect is False:
12:42:15                 raise reraise(type(error), error, _stacktrace)
12:42:15             elif connect is not None:
12:42:15                 connect -= 1
12:42:15 
12:42:15         elif error and self._is_read_error(error):
12:42:15             # Read retry?
12:42:15             if read is False or method is None or not self._is_method_retryable(method):
12:42:15                 raise reraise(type(error), error, _stacktrace)
12:42:15             elif read is not None:
12:42:15                 read -= 1
12:42:15 
12:42:15         elif error:
12:42:15             # Other retry?
12:42:15             if other is not None:
12:42:15                 other -= 1
12:42:15 
12:42:15         elif response and response.get_redirect_location():
12:42:15             # Redirect retry?
12:42:15             if redirect is not None:
12:42:15                 redirect -= 1
12:42:15             cause = "too many redirects"
12:42:15             response_redirect_location = response.get_redirect_location()
12:42:15             if response_redirect_location:
12:42:15                 redirect_location = response_redirect_location
12:42:15             status = response.status
12:42:15 
12:42:15         else:
12:42:15             # Incrementing because of a server error like a 500 in
12:42:15             # status_forcelist and the given method is in the allowed_methods
12:42:15             cause = ResponseError.GENERIC_ERROR
12:42:15             if response and response.status:
12:42:15                 if status_count is not None:
12:42:15                     status_count -= 1
12:42:15                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
12:42:15                 status = response.status
12:42:15 
12:42:15         history = self.history + (
12:42:15             RequestHistory(method, url, error, status, redirect_location),
12:42:15         )
12:42:15 
12:42:15         new_retry = self.new(
12:42:15             total=total,
12:42:15             connect=connect,
12:42:15             read=read,
12:42:15             redirect=redirect,
12:42:15             status=status_count,
12:42:15             other=other,
12:42:15             history=history,
12:42:15         )
12:42:15 
12:42:15         if new_retry.is_exhausted():
12:42:15             reason = error or ResponseError(cause)
12:42:15 >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
12:42:15             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:535: MaxRetryError
12:42:15 
12:42:15 During handling of the above exception, another exception occurred:
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def test_17_xpdr_device_disconnected(self):
12:42:15 >       response = test_utils.check_device_connection("XPDRA01")
12:42:15                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 
12:42:15 transportpce_tests/1.2.1/test01_portmapping.py:197: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 transportpce_tests/common/test_utils.py:409: in check_device_connection
12:42:15     response = get_request(url[RESTCONF_VERSION].format('{}', node))
12:42:15                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 transportpce_tests/common/test_utils.py:117: in get_request
12:42:15     return requests.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request
12:42:15     return session.request(method=method, url=url, **kwargs)
12:42:15            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request
12:42:15     resp = self.send(prep, **send_kwargs)
12:42:15            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send
12:42:15     r = adapter.send(request, **kwargs)
12:42:15         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 self = 
12:42:15 request = , stream = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
12:42:15 proxies = OrderedDict()
12:42:15 
12:42:15     def send(
12:42:15         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
12:42:15     ):
12:42:15         """Sends PreparedRequest object. Returns Response object.
12:42:15 
12:42:15         :param request: The :class:`PreparedRequest ` being sent.
12:42:15         :param stream: (optional) Whether to stream the request content.
12:42:15         :param timeout: (optional) How long to wait for the server to send
12:42:15             data before giving up, as a float, or a :ref:`(connect timeout,
12:42:15             read timeout) ` tuple.
12:42:15         :type timeout: float or tuple or urllib3 Timeout object
12:42:15         :param verify: (optional) Either a boolean, in which case it controls whether
12:42:15             we verify the server's TLS certificate, or a string, in which case it
12:42:15             must be a path to a CA bundle to use
12:42:15         :param cert: (optional) Any user-provided SSL certificate to be trusted.
12:42:15         :param proxies: (optional) The proxies dictionary to apply to the request.
12:42:15         :rtype: requests.Response
12:42:15         """
12:42:15 
12:42:15         try:
12:42:15             conn = self.get_connection_with_tls_context(
12:42:15                 request, verify, proxies=proxies, cert=cert
12:42:15             )
12:42:15         except LocationValueError as e:
12:42:15             raise InvalidURL(e, request=request)
12:42:15 
12:42:15         self.cert_verify(conn, request.url, verify, cert)
12:42:15         url = self.request_url(request, proxies)
12:42:15         self.add_headers(
12:42:15             request,
12:42:15             stream=stream,
12:42:15             timeout=timeout,
12:42:15             verify=verify,
12:42:15             cert=cert,
12:42:15             proxies=proxies,
12:42:15         )
12:42:15 
12:42:15         chunked = not (request.body is None or "Content-Length" in request.headers)
12:42:15 
12:42:15         if isinstance(timeout, tuple):
12:42:15             try:
12:42:15                 connect, read = timeout
12:42:15                 timeout = TimeoutSauce(connect=connect, read=read)
12:42:15             except ValueError:
12:42:15                 raise ValueError(
12:42:15                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
12:42:15                     f"or a single float to set both timeouts to the same value."
12:42:15                 )
12:42:15         elif isinstance(timeout, TimeoutSauce):
12:42:15             pass
12:42:15         else:
12:42:15             timeout = TimeoutSauce(connect=timeout, read=timeout)
12:42:15 
12:42:15         try:
12:42:15             resp = conn.urlopen(
12:42:15                 method=request.method,
12:42:15                 url=url,
12:42:15                 body=request.body,
12:42:15                 headers=request.headers,
12:42:15                 redirect=False,
12:42:15                 assert_same_host=False,
12:42:15                 preload_content=False,
12:42:15                 decode_content=False,
12:42:15                 retries=self.max_retries,
12:42:15                 timeout=timeout,
12:42:15                 chunked=chunked,
12:42:15             )
12:42:15 
12:42:15         except (ProtocolError, OSError) as err:
12:42:15             raise ConnectionError(err, request=request)
12:42:15 
12:42:15         except MaxRetryError as e:
12:42:15             if isinstance(e.reason, ConnectTimeoutError):
12:42:15                 # TODO: Remove this in 3.0.0: see #2811
12:42:15                 if not isinstance(e.reason, NewConnectionError):
12:42:15                     raise ConnectTimeout(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, ResponseError):
12:42:15                 raise RetryError(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, _ProxyError):
12:42:15                 raise ProxyError(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, _SSLError):
12:42:15                 # This branch is for urllib3 v1.22 and later.
12:42:15                 raise SSLError(e, request=request)
12:42:15 
12:42:15 >           raise ConnectionError(e, request=request)
12:42:15 E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=XPDRA01?content=nonconfig (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:677: ConnectionError
12:42:15 ----------------------------- Captured stdout call -----------------------------
12:42:15 execution of test_17_xpdr_device_disconnected
12:42:15 ________ TestTransportPCEPortmapping.test_18_xpdr_device_not_connected _________
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15 
12:42:15         :return: New socket connection.
12:42:15 """ 12:42:15 try: 12:42:15 > sock = connection.create_connection( 12:42:15 (self._dns_host, self.port), 12:42:15 self.timeout, 12:42:15 source_address=self.source_address, 12:42:15 socket_options=self.socket_options, 12:42:15 ) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:204: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection 12:42:15 raise err 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 address = ('localhost', 8191), timeout = 30, source_address = None 12:42:15 socket_options = [(6, 1, 1)] 12:42:15 12:42:15 def create_connection( 12:42:15 address: tuple[str, int], 12:42:15 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:42:15 source_address: tuple[str, int] | None = None, 12:42:15 socket_options: _TYPE_SOCKET_OPTIONS | None = None, 12:42:15 ) -> socket.socket: 12:42:15 """Connect to *address* and return the socket object. 12:42:15 12:42:15 Convenience function. Connect to *address* (a 2-tuple ``(host, 12:42:15 port)``) and return the socket object. Passing the optional 12:42:15 *timeout* parameter will set the timeout on the socket instance 12:42:15 before attempting to connect. If no *timeout* is supplied, the 12:42:15 global default timeout setting returned by :func:`socket.getdefaulttimeout` 12:42:15 is used. If *source_address* is set it must be a tuple of (host, port) 12:42:15 for the socket to bind as a source address before making the connection. 12:42:15 An host of '' or port 0 tells the OS to use the default. 
12:42:15 """ 12:42:15 12:42:15 host, port = address 12:42:15 if host.startswith("["): 12:42:15 host = host.strip("[]") 12:42:15 err = None 12:42:15 12:42:15 # Using the value from allowed_gai_family() in the context of getaddrinfo lets 12:42:15 # us select whether to work with IPv4 DNS records, IPv6 records, or both. 12:42:15 # The original create_connection function always returns all records. 12:42:15 family = allowed_gai_family() 12:42:15 12:42:15 try: 12:42:15 host.encode("idna") 12:42:15 except UnicodeError: 12:42:15 raise LocationParseError(f"'{host}', label empty or too long") from None 12:42:15 12:42:15 for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM): 12:42:15 af, socktype, proto, canonname, sa = res 12:42:15 sock = None 12:42:15 try: 12:42:15 sock = socket.socket(af, socktype, proto) 12:42:15 12:42:15 # If provided, set socket level options before connecting. 12:42:15 _set_socket_options(sock, socket_options) 12:42:15 12:42:15 if timeout is not _DEFAULT_TIMEOUT: 12:42:15 sock.settimeout(timeout) 12:42:15 if source_address: 12:42:15 sock.bind(source_address) 12:42:15 > sock.connect(sa) 12:42:15 E ConnectionRefusedError: [Errno 111] Connection refused 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError 12:42:15 12:42:15 The above exception was the direct cause of the following exception: 12:42:15 12:42:15 self = 12:42:15 method = 'GET' 12:42:15 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info' 12:42:15 body = None 12:42:15 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='} 12:42:15 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:42:15 redirect = False, assert_same_host = False 12:42:15 timeout = Timeout(connect=30, read=30, total=None), 
pool_timeout = None 12:42:15 release_conn = False, chunked = False, body_pos = None, preload_content = False 12:42:15 decode_content = False, response_kw = {} 12:42:15 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info', query=None, fragment=None) 12:42:15 destination_scheme = None, conn = None, release_this_conn = True 12:42:15 http_tunnel_required = False, err = None, clean_exit = False 12:42:15 12:42:15 def urlopen( # type: ignore[override] 12:42:15 self, 12:42:15 method: str, 12:42:15 url: str, 12:42:15 body: _TYPE_BODY | None = None, 12:42:15 headers: typing.Mapping[str, str] | None = None, 12:42:15 retries: Retry | bool | int | None = None, 12:42:15 redirect: bool = True, 12:42:15 assert_same_host: bool = True, 12:42:15 timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, 12:42:15 pool_timeout: int | None = None, 12:42:15 release_conn: bool | None = None, 12:42:15 chunked: bool = False, 12:42:15 body_pos: _TYPE_BODY_POSITION | None = None, 12:42:15 preload_content: bool = True, 12:42:15 decode_content: bool = True, 12:42:15 **response_kw: typing.Any, 12:42:15 ) -> BaseHTTPResponse: 12:42:15 """ 12:42:15 Get a connection from the pool and perform an HTTP request. This is the 12:42:15 lowest level call for making a request, so you'll need to specify all 12:42:15 the raw details. 12:42:15 12:42:15 .. note:: 12:42:15 12:42:15 More commonly, it's appropriate to use a convenience method 12:42:15 such as :meth:`request`. 12:42:15 12:42:15 .. note:: 12:42:15 12:42:15 `release_conn` will only behave as expected if 12:42:15 `preload_content=False` because we want to make 12:42:15 `preload_content=False` the default behaviour someday soon without 12:42:15 breaking backwards compatibility. 12:42:15 12:42:15 :param method: 12:42:15 HTTP request method (such as GET, POST, PUT, etc.) 12:42:15 12:42:15 :param url: 12:42:15 The URL to perform the request on. 
12:42:15 
12:42:15         :param body:
12:42:15             Data to send in the request body, either :class:`str`, :class:`bytes`,
12:42:15             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
12:42:15 
12:42:15         :param headers:
12:42:15             Dictionary of custom headers to send, such as User-Agent,
12:42:15             If-None-Match, etc. If None, pool headers are used. If provided,
12:42:15             these headers completely replace any pool-specific headers.
12:42:15 
12:42:15         :param retries:
12:42:15             Configure the number of retries to allow before raising a
12:42:15             :class:`~urllib3.exceptions.MaxRetryError` exception.
12:42:15 
12:42:15             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
12:42:15             :class:`~urllib3.util.retry.Retry` object for fine-grained control
12:42:15             over different types of retries.
12:42:15             Pass an integer number to retry connection errors that many times,
12:42:15             but no other types of errors. Pass zero to never retry.
12:42:15 
12:42:15             If ``False``, then retries are disabled and any exception is raised
12:42:15             immediately. Also, instead of raising a MaxRetryError on redirects,
12:42:15             the redirect response will be returned.
12:42:15 
12:42:15         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
12:42:15 
12:42:15         :param redirect:
12:42:15             If True, automatically handle redirects (status codes 301, 302,
12:42:15             303, 307, 308). Each redirect counts as a retry. Disabling retries
12:42:15             will disable redirect, too.
12:42:15 
12:42:15         :param assert_same_host:
12:42:15             If ``True``, will make sure that the host of the pool requests is
12:42:15             consistent else will raise HostChangedError. When ``False``, you can
12:42:15             use the pool on an HTTP proxy and request foreign hosts.
12:42:15 
12:42:15         :param timeout:
12:42:15             If specified, overrides the default timeout for this one
12:42:15             request. It may be a float (in seconds) or an instance of
12:42:15             :class:`urllib3.util.Timeout`.
12:42:15 
12:42:15         :param pool_timeout:
12:42:15             If set and the pool is set to block=True, then this method will
12:42:15             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
12:42:15             connection is available within the time period.
12:42:15 
12:42:15         :param bool preload_content:
12:42:15             If True, the response's body will be preloaded into memory.
12:42:15 
12:42:15         :param bool decode_content:
12:42:15             If True, will attempt to decode the body based on the
12:42:15             'content-encoding' header.
12:42:15 
12:42:15         :param release_conn:
12:42:15             If False, then the urlopen call will not release the connection
12:42:15             back into the pool once a response is received (but will release if
12:42:15             you read the entire contents of the response such as when
12:42:15             `preload_content=True`). This is useful if you're not preloading
12:42:15             the response's content immediately. You will need to call
12:42:15             ``r.release_conn()`` on the response ``r`` to return the connection
12:42:15             back into the pool. If None, it takes the value of ``preload_content``
12:42:15             which defaults to ``True``.
12:42:15 
12:42:15         :param bool chunked:
12:42:15             If True, urllib3 will send the body using chunked transfer
12:42:15             encoding. Otherwise, urllib3 will send the body using the standard
12:42:15             content-length form. Defaults to False.
12:42:15 
12:42:15         :param int body_pos:
12:42:15             Position to seek to in file-like body in the event of a retry or
12:42:15             redirect. Typically this won't need to be set because urllib3 will
12:42:15             auto-populate the value when needed.
12:42:15         """
12:42:15         parsed_url = parse_url(url)
12:42:15         destination_scheme = parsed_url.scheme
12:42:15 
12:42:15         if headers is None:
12:42:15             headers = self.headers
12:42:15 
12:42:15         if not isinstance(retries, Retry):
12:42:15             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
12:42:15 
12:42:15         if release_conn is None:
12:42:15             release_conn = preload_content
12:42:15 
12:42:15         # Check host
12:42:15         if assert_same_host and not self.is_same_host(url):
12:42:15             raise HostChangedError(self, url, retries)
12:42:15 
12:42:15         # Ensure that the URL we're connecting to is properly encoded
12:42:15         if url.startswith("/"):
12:42:15             url = to_str(_encode_target(url))
12:42:15         else:
12:42:15             url = to_str(parsed_url.url)
12:42:15 
12:42:15         conn = None
12:42:15 
12:42:15         # Track whether `conn` needs to be released before
12:42:15         # returning/raising/recursing. Update this variable if necessary, and
12:42:15         # leave `release_conn` constant throughout the function. That way, if
12:42:15         # the function recurses, the original value of `release_conn` will be
12:42:15         # passed down into the recursive call, and its value will be respected.
12:42:15         #
12:42:15         # See issue #651 [1] for details.
12:42:15         #
12:42:15         # [1]
12:42:15         release_this_conn = release_conn
12:42:15 
12:42:15         http_tunnel_required = connection_requires_http_tunnel(
12:42:15             self.proxy, self.proxy_config, destination_scheme
12:42:15         )
12:42:15 
12:42:15         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
12:42:15         # have to copy the headers dict so we can safely change it without those
12:42:15         # changes being reflected in anyone else's copy.
12:42:15         if not http_tunnel_required:
12:42:15             headers = headers.copy() # type: ignore[attr-defined]
12:42:15             headers.update(self.proxy_headers) # type: ignore[union-attr]
12:42:15 
12:42:15         # Must keep the exception bound to a separate variable or else Python 3
12:42:15         # complains about UnboundLocalError.
12:42:15         err = None
12:42:15 
12:42:15         # Keep track of whether we cleanly exited the except block. This
12:42:15         # ensures we do proper cleanup in finally.
12:42:15         clean_exit = False
12:42:15 
12:42:15         # Rewind body position, if needed. Record current position
12:42:15         # for future rewinds in the event of a redirect/retry.
12:42:15         body_pos = set_file_position(body, body_pos)
12:42:15 
12:42:15         try:
12:42:15             # Request a connection from the queue.
12:42:15             timeout_obj = self._get_timeout(timeout)
12:42:15             conn = self._get_conn(timeout=pool_timeout)
12:42:15 
12:42:15             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
12:42:15 
12:42:15             # Is this a closed/new connection that requires CONNECT tunnelling?
12:42:15             if self.proxy is not None and http_tunnel_required and conn.is_closed:
12:42:15                 try:
12:42:15                     self._prepare_proxy(conn)
12:42:15                 except (BaseSSLError, OSError, SocketTimeout) as e:
12:42:15                     self._raise_timeout(
12:42:15                         err=e, url=self.proxy.url, timeout_value=conn.timeout
12:42:15                     )
12:42:15                     raise
12:42:15 
12:42:15             # If we're going to release the connection in ``finally:``, then
12:42:15             # the response doesn't need to know about the connection. Otherwise
12:42:15             # it will also try to release it and we'll have a double-release
12:42:15             # mess.
12:42:15             response_conn = conn if not release_conn else None
12:42:15 
12:42:15             # Make the request on the HTTPConnection object
12:42:15 >           response = self._make_request(
12:42:15                 conn,
12:42:15                 method,
12:42:15                 url,
12:42:15                 timeout=timeout_obj,
12:42:15                 body=body,
12:42:15                 headers=headers,
12:42:15                 chunked=chunked,
12:42:15                 retries=retries,
12:42:15                 response_conn=response_conn,
12:42:15                 preload_content=preload_content,
12:42:15                 decode_content=decode_content,
12:42:15                 **response_kw,
12:42:15             )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:787: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:493: in _make_request
12:42:15     conn.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:500: in request
12:42:15     self.endheaders()
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1298: in endheaders
12:42:15     self._send_output(message_body, encode_chunked=encode_chunked)
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1058: in _send_output
12:42:15     self.send(msg)
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:996: in send
12:42:15     self.connect()
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:331: in connect
12:42:15     self.sock = self._new_conn()
12:42:15                 ^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 self =
12:42:15 
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15 
12:42:15         :return: New socket connection.
12:42:15 """ 12:42:15 try: 12:42:15 sock = connection.create_connection( 12:42:15 (self._dns_host, self.port), 12:42:15 self.timeout, 12:42:15 source_address=self.source_address, 12:42:15 socket_options=self.socket_options, 12:42:15 ) 12:42:15 except socket.gaierror as e: 12:42:15 raise NameResolutionError(self.host, self, e) from e 12:42:15 except SocketTimeout as e: 12:42:15 raise ConnectTimeoutError( 12:42:15 self, 12:42:15 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 12:42:15 ) from e 12:42:15 12:42:15 except OSError as e: 12:42:15 > raise NewConnectionError( 12:42:15 self, f"Failed to establish a new connection: {e}" 12:42:15 ) from e 12:42:15 E urllib3.exceptions.NewConnectionError: HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:219: NewConnectionError 12:42:15 12:42:15 The above exception was the direct cause of the following exception: 12:42:15 12:42:15 self = 12:42:15 request = , stream = False 12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None 12:42:15 proxies = OrderedDict() 12:42:15 12:42:15 def send( 12:42:15 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:42:15 ): 12:42:15 """Sends PreparedRequest object. Returns Response object. 12:42:15 12:42:15 :param request: The :class:`PreparedRequest ` being sent. 12:42:15 :param stream: (optional) Whether to stream the request content. 12:42:15 :param timeout: (optional) How long to wait for the server to send 12:42:15 data before giving up, as a float, or a :ref:`(connect timeout, 12:42:15 read timeout) ` tuple. 
12:42:15         :type timeout: float or tuple or urllib3 Timeout object
12:42:15         :param verify: (optional) Either a boolean, in which case it controls whether
12:42:15             we verify the server's TLS certificate, or a string, in which case it
12:42:15             must be a path to a CA bundle to use
12:42:15         :param cert: (optional) Any user-provided SSL certificate to be trusted.
12:42:15         :param proxies: (optional) The proxies dictionary to apply to the request.
12:42:15         :rtype: requests.Response
12:42:15         """
12:42:15 
12:42:15         try:
12:42:15             conn = self.get_connection_with_tls_context(
12:42:15                 request, verify, proxies=proxies, cert=cert
12:42:15             )
12:42:15         except LocationValueError as e:
12:42:15             raise InvalidURL(e, request=request)
12:42:15 
12:42:15         self.cert_verify(conn, request.url, verify, cert)
12:42:15         url = self.request_url(request, proxies)
12:42:15         self.add_headers(
12:42:15             request,
12:42:15             stream=stream,
12:42:15             timeout=timeout,
12:42:15             verify=verify,
12:42:15             cert=cert,
12:42:15             proxies=proxies,
12:42:15         )
12:42:15 
12:42:15         chunked = not (request.body is None or "Content-Length" in request.headers)
12:42:15 
12:42:15         if isinstance(timeout, tuple):
12:42:15             try:
12:42:15                 connect, read = timeout
12:42:15                 timeout = TimeoutSauce(connect=connect, read=read)
12:42:15             except ValueError:
12:42:15                 raise ValueError(
12:42:15                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
12:42:15                     f"or a single float to set both timeouts to the same value."
12:42:15                 )
12:42:15         elif isinstance(timeout, TimeoutSauce):
12:42:15             pass
12:42:15         else:
12:42:15             timeout = TimeoutSauce(connect=timeout, read=timeout)
12:42:15 
12:42:15         try:
12:42:15 >           resp = conn.urlopen(
12:42:15                 method=request.method,
12:42:15                 url=url,
12:42:15                 body=request.body,
12:42:15                 headers=request.headers,
12:42:15                 redirect=False,
12:42:15                 assert_same_host=False,
12:42:15                 preload_content=False,
12:42:15                 decode_content=False,
12:42:15                 retries=self.max_retries,
12:42:15                 timeout=timeout,
12:42:15                 chunked=chunked,
12:42:15             )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:644: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen
12:42:15     retries = retries.increment(
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
12:42:15 method = 'GET'
12:42:15 url = '/rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info'
12:42:15 response = None
12:42:15 error = NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused")
12:42:15 _pool =
12:42:15 _stacktrace =
12:42:15 
12:42:15     def increment(
12:42:15         self,
12:42:15         method: str | None = None,
12:42:15         url: str | None = None,
12:42:15         response: BaseHTTPResponse | None = None,
12:42:15         error: Exception | None = None,
12:42:15         _pool: ConnectionPool | None = None,
12:42:15         _stacktrace: TracebackType | None = None,
12:42:15     ) -> Self:
12:42:15         """Return a new Retry object with incremented retry counters.
12:42:15 
12:42:15         :param response: A response object, or None, if the server did not
12:42:15             return a response.
12:42:15         :type response: :class:`~urllib3.response.BaseHTTPResponse`
12:42:15         :param Exception error: An error encountered during the request, or
12:42:15             None if the response was received successfully.
12:42:15 
12:42:15         :return: A new ``Retry`` object.
12:42:15         """
12:42:15         if self.total is False and error:
12:42:15             # Disabled, indicate to re-raise the error.
12:42:15             raise reraise(type(error), error, _stacktrace)
12:42:15 
12:42:15         total = self.total
12:42:15         if total is not None:
12:42:15             total -= 1
12:42:15 
12:42:15         connect = self.connect
12:42:15         read = self.read
12:42:15         redirect = self.redirect
12:42:15         status_count = self.status
12:42:15         other = self.other
12:42:15         cause = "unknown"
12:42:15         status = None
12:42:15         redirect_location = None
12:42:15 
12:42:15         if error and self._is_connection_error(error):
12:42:15             # Connect retry?
12:42:15             if connect is False:
12:42:15                 raise reraise(type(error), error, _stacktrace)
12:42:15             elif connect is not None:
12:42:15                 connect -= 1
12:42:15 
12:42:15         elif error and self._is_read_error(error):
12:42:15             # Read retry?
12:42:15             if read is False or method is None or not self._is_method_retryable(method):
12:42:15                 raise reraise(type(error), error, _stacktrace)
12:42:15             elif read is not None:
12:42:15                 read -= 1
12:42:15 
12:42:15         elif error:
12:42:15             # Other retry?
12:42:15             if other is not None:
12:42:15                 other -= 1
12:42:15 
12:42:15         elif response and response.get_redirect_location():
12:42:15             # Redirect retry?
12:42:15             if redirect is not None:
12:42:15                 redirect -= 1
12:42:15             cause = "too many redirects"
12:42:15             response_redirect_location = response.get_redirect_location()
12:42:15             if response_redirect_location:
12:42:15                 redirect_location = response_redirect_location
12:42:15             status = response.status
12:42:15 
12:42:15         else:
12:42:15             # Incrementing because of a server error like a 500 in
12:42:15             # status_forcelist and the given method is in the allowed_methods
12:42:15             cause = ResponseError.GENERIC_ERROR
12:42:15             if response and response.status:
12:42:15                 if status_count is not None:
12:42:15                     status_count -= 1
12:42:15                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
12:42:15                 status = response.status
12:42:15 
12:42:15         history = self.history + (
12:42:15             RequestHistory(method, url, error, status, redirect_location),
12:42:15         )
12:42:15 
12:42:15         new_retry = self.new(
12:42:15             total=total,
12:42:15             connect=connect,
12:42:15             read=read,
12:42:15             redirect=redirect,
12:42:15             status=status_count,
12:42:15             other=other,
12:42:15             history=history,
12:42:15         )
12:42:15 
12:42:15         if new_retry.is_exhausted():
12:42:15             reason = error or ResponseError(cause)
12:42:15 >           raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
12:42:15             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:535: MaxRetryError
12:42:15 
12:42:15 During handling of the above exception, another exception occurred:
12:42:15 
12:42:15 self =
12:42:15 
12:42:15     def test_18_xpdr_device_not_connected(self):
12:42:15 >       response = test_utils.get_portmapping_node_attr("XPDRA01", "node-info", None)
12:42:15                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 
12:42:15 transportpce_tests/1.2.1/test01_portmapping.py:205: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 transportpce_tests/common/test_utils.py:519: in get_portmapping_node_attr
12:42:15     response = get_request(target_url)
12:42:15                ^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 transportpce_tests/common/test_utils.py:117: in get_request
12:42:15     return requests.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request
12:42:15     return session.request(method=method, url=url, **kwargs)
12:42:15            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request
12:42:15     resp = self.send(prep, **send_kwargs)
12:42:15            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send
12:42:15     r = adapter.send(request, **kwargs)
12:42:15         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 self =
12:42:15 request = , stream = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
12:42:15 proxies = OrderedDict()
12:42:15 
12:42:15     def send(
12:42:15         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
12:42:15     ):
12:42:15         """Sends PreparedRequest object. Returns Response object.
12:42:15 
12:42:15         :param request: The :class:`PreparedRequest ` being sent.
12:42:15         :param stream: (optional) Whether to stream the request content.
12:42:15         :param timeout: (optional) How long to wait for the server to send
12:42:15             data before giving up, as a float, or a :ref:`(connect timeout,
12:42:15             read timeout) ` tuple.
12:42:15         :type timeout: float or tuple or urllib3 Timeout object
12:42:15         :param verify: (optional) Either a boolean, in which case it controls whether
12:42:15             we verify the server's TLS certificate, or a string, in which case it
12:42:15             must be a path to a CA bundle to use
12:42:15         :param cert: (optional) Any user-provided SSL certificate to be trusted.
12:42:15         :param proxies: (optional) The proxies dictionary to apply to the request.
12:42:15         :rtype: requests.Response
12:42:15         """
12:42:15 
12:42:15         try:
12:42:15             conn = self.get_connection_with_tls_context(
12:42:15                 request, verify, proxies=proxies, cert=cert
12:42:15             )
12:42:15         except LocationValueError as e:
12:42:15             raise InvalidURL(e, request=request)
12:42:15 
12:42:15         self.cert_verify(conn, request.url, verify, cert)
12:42:15         url = self.request_url(request, proxies)
12:42:15         self.add_headers(
12:42:15             request,
12:42:15             stream=stream,
12:42:15             timeout=timeout,
12:42:15             verify=verify,
12:42:15             cert=cert,
12:42:15             proxies=proxies,
12:42:15         )
12:42:15 
12:42:15         chunked = not (request.body is None or "Content-Length" in request.headers)
12:42:15 
12:42:15         if isinstance(timeout, tuple):
12:42:15             try:
12:42:15                 connect, read = timeout
12:42:15                 timeout = TimeoutSauce(connect=connect, read=read)
12:42:15             except ValueError:
12:42:15                 raise ValueError(
12:42:15                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
12:42:15                     f"or a single float to set both timeouts to the same value."
12:42:15                 )
12:42:15         elif isinstance(timeout, TimeoutSauce):
12:42:15             pass
12:42:15         else:
12:42:15             timeout = TimeoutSauce(connect=timeout, read=timeout)
12:42:15 
12:42:15         try:
12:42:15             resp = conn.urlopen(
12:42:15                 method=request.method,
12:42:15                 url=url,
12:42:15                 body=request.body,
12:42:15                 headers=request.headers,
12:42:15                 redirect=False,
12:42:15                 assert_same_host=False,
12:42:15                 preload_content=False,
12:42:15                 decode_content=False,
12:42:15                 retries=self.max_retries,
12:42:15                 timeout=timeout,
12:42:15                 chunked=chunked,
12:42:15             )
12:42:15 
12:42:15         except (ProtocolError, OSError) as err:
12:42:15             raise ConnectionError(err, request=request)
12:42:15 
12:42:15         except MaxRetryError as e:
12:42:15             if isinstance(e.reason, ConnectTimeoutError):
12:42:15                 # TODO: Remove this in 3.0.0: see #2811
12:42:15                 if not isinstance(e.reason, NewConnectionError):
12:42:15                     raise ConnectTimeout(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, ResponseError):
12:42:15                 raise RetryError(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, _ProxyError):
12:42:15                 raise ProxyError(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, _SSLError):
12:42:15                 # This branch is for urllib3 v1.22 and later.
12:42:15                 raise SSLError(e, request=request)
12:42:15 
12:42:15 >           raise ConnectionError(e, request=request)
12:42:15 E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=XPDRA01/node-info (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:677: ConnectionError
12:42:15 ----------------------------- Captured stdout call -----------------------------
12:42:15 execution of test_18_xpdr_device_not_connected
12:42:15 _________ TestTransportPCEPortmapping.test_19_rdm_device_disconnection _________
12:42:15 
12:42:15 self =
12:42:15 
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15 
12:42:15         :return: New socket connection.
12:42:15         """
12:42:15         try:
12:42:15 >           sock = connection.create_connection(
12:42:15                 (self._dns_host, self.port),
12:42:15                 self.timeout,
12:42:15                 source_address=self.source_address,
12:42:15                 socket_options=self.socket_options,
12:42:15             )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:204: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection
12:42:15     raise err
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 address = ('localhost', 8191), timeout = 30, source_address = None
12:42:15 socket_options = [(6, 1, 1)]
12:42:15 
12:42:15     def create_connection(
12:42:15         address: tuple[str, int],
12:42:15         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
12:42:15         source_address: tuple[str, int] | None = None,
12:42:15         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
12:42:15     ) -> socket.socket:
12:42:15         """Connect to *address* and return the socket object.
12:42:15 
12:42:15         Convenience function. Connect to *address* (a 2-tuple ``(host,
12:42:15         port)``) and return the socket object. Passing the optional
12:42:15         *timeout* parameter will set the timeout on the socket instance
12:42:15         before attempting to connect. If no *timeout* is supplied, the
12:42:15         global default timeout setting returned by :func:`socket.getdefaulttimeout`
12:42:15         is used. If *source_address* is set it must be a tuple of (host, port)
12:42:15         for the socket to bind as a source address before making the connection.
12:42:15         An host of '' or port 0 tells the OS to use the default.
12:42:15         """
12:42:15 
12:42:15         host, port = address
12:42:15         if host.startswith("["):
12:42:15             host = host.strip("[]")
12:42:15         err = None
12:42:15 
12:42:15         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
12:42:15         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
12:42:15         # The original create_connection function always returns all records.
12:42:15         family = allowed_gai_family()
12:42:15 
12:42:15         try:
12:42:15             host.encode("idna")
12:42:15         except UnicodeError:
12:42:15             raise LocationParseError(f"'{host}', label empty or too long") from None
12:42:15 
12:42:15         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
12:42:15             af, socktype, proto, canonname, sa = res
12:42:15             sock = None
12:42:15             try:
12:42:15                 sock = socket.socket(af, socktype, proto)
12:42:15 
12:42:15                 # If provided, set socket level options before connecting.
12:42:15                 _set_socket_options(sock, socket_options)
12:42:15 
12:42:15                 if timeout is not _DEFAULT_TIMEOUT:
12:42:15                     sock.settimeout(timeout)
12:42:15                 if source_address:
12:42:15                     sock.bind(source_address)
12:42:15 >               sock.connect(sa)
12:42:15 E               ConnectionRefusedError: [Errno 111] Connection refused
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
12:42:15 
12:42:15 The above exception was the direct cause of the following exception:
12:42:15 
12:42:15 self =
12:42:15 method = 'DELETE'
12:42:15 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01'
12:42:15 body = None
12:42:15 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Content-Length': '0', 'Authorization': 'Basic YWRtaW46YWRtaW4='}
12:42:15 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
12:42:15 redirect = False, assert_same_host = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), pool_timeout = None
12:42:15 release_conn = False, chunked = False, body_pos = None, preload_content = False
12:42:15 decode_content = False, response_kw = {}
12:42:15 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01', query=None, fragment=None)
12:42:15 destination_scheme = None, conn = None, release_this_conn = True
12:42:15 http_tunnel_required = False, err = None, clean_exit = False
12:42:15 
12:42:15     def urlopen(  # type: ignore[override]
12:42:15         self,
12:42:15         method: str,
12:42:15         url: str,
12:42:15         body: _TYPE_BODY | None = None,
12:42:15         headers: typing.Mapping[str, str] | None = None,
12:42:15         retries: Retry | bool | int | None = None,
12:42:15         redirect: bool = True,
12:42:15         assert_same_host: bool = True,
12:42:15         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
12:42:15         pool_timeout: int | None = None,
12:42:15         release_conn: bool | None = None,
12:42:15         chunked: bool = False,
12:42:15         body_pos: _TYPE_BODY_POSITION | None = None,
12:42:15         preload_content: bool = True,
12:42:15         decode_content: bool = True,
12:42:15         **response_kw: typing.Any,
12:42:15     ) -> BaseHTTPResponse:
12:42:15         """
12:42:15         Get a connection from the pool and perform an HTTP request. This is the
12:42:15         lowest level call for making a request, so you'll need to specify all
12:42:15         the raw details.
12:42:15 
12:42:15         .. note::
12:42:15 
12:42:15             More commonly, it's appropriate to use a convenience method
12:42:15             such as :meth:`request`.
12:42:15 
12:42:15         .. note::
12:42:15 
12:42:15             `release_conn` will only behave as expected if
12:42:15             `preload_content=False` because we want to make
12:42:15             `preload_content=False` the default behaviour someday soon without
12:42:15             breaking backwards compatibility.
12:42:15 
12:42:15         :param method:
12:42:15             HTTP request method (such as GET, POST, PUT, etc.)
12:42:15 
12:42:15         :param url:
12:42:15             The URL to perform the request on.
12:42:15 
12:42:15         :param body:
12:42:15             Data to send in the request body, either :class:`str`, :class:`bytes`,
12:42:15             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
12:42:15 
12:42:15         :param headers:
12:42:15             Dictionary of custom headers to send, such as User-Agent,
12:42:15             If-None-Match, etc. If None, pool headers are used. If provided,
12:42:15             these headers completely replace any pool-specific headers.
12:42:15 
12:42:15         :param retries:
12:42:15             Configure the number of retries to allow before raising a
12:42:15             :class:`~urllib3.exceptions.MaxRetryError` exception.
12:42:15 
12:42:15             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
12:42:15             :class:`~urllib3.util.retry.Retry` object for fine-grained control
12:42:15             over different types of retries.
12:42:15             Pass an integer number to retry connection errors that many times,
12:42:15             but no other types of errors. Pass zero to never retry.
12:42:15 
12:42:15             If ``False``, then retries are disabled and any exception is raised
12:42:15             immediately. Also, instead of raising a MaxRetryError on redirects,
12:42:15             the redirect response will be returned.
12:42:15 
12:42:15         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
12:42:15 
12:42:15         :param redirect:
12:42:15             If True, automatically handle redirects (status codes 301, 302,
12:42:15             303, 307, 308). Each redirect counts as a retry. Disabling retries
12:42:15             will disable redirect, too.
12:42:15 
12:42:15         :param assert_same_host:
12:42:15             If ``True``, will make sure that the host of the pool requests is
12:42:15             consistent else will raise HostChangedError. When ``False``, you can
12:42:15             use the pool on an HTTP proxy and request foreign hosts.
12:42:15 
12:42:15         :param timeout:
12:42:15             If specified, overrides the default timeout for this one
12:42:15             request. It may be a float (in seconds) or an instance of
12:42:15             :class:`urllib3.util.Timeout`.
12:42:15 
12:42:15         :param pool_timeout:
12:42:15             If set and the pool is set to block=True, then this method will
12:42:15             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
12:42:15             connection is available within the time period.
12:42:15 
12:42:15         :param bool preload_content:
12:42:15             If True, the response's body will be preloaded into memory.
12:42:15 
12:42:15         :param bool decode_content:
12:42:15             If True, will attempt to decode the body based on the
12:42:15             'content-encoding' header.
12:42:15 
12:42:15         :param release_conn:
12:42:15             If False, then the urlopen call will not release the connection
12:42:15             back into the pool once a response is received (but will release if
12:42:15             you read the entire contents of the response such as when
12:42:15             `preload_content=True`). This is useful if you're not preloading
12:42:15             the response's content immediately. You will need to call
12:42:15             ``r.release_conn()`` on the response ``r`` to return the connection
12:42:15             back into the pool. If None, it takes the value of ``preload_content``
12:42:15             which defaults to ``True``.
12:42:15 
12:42:15         :param bool chunked:
12:42:15             If True, urllib3 will send the body using chunked transfer
12:42:15             encoding. Otherwise, urllib3 will send the body using the standard
12:42:15             content-length form. Defaults to False.
12:42:15 
12:42:15         :param int body_pos:
12:42:15             Position to seek to in file-like body in the event of a retry or
12:42:15             redirect. Typically this won't need to be set because urllib3 will
12:42:15             auto-populate the value when needed.
12:42:15         """
12:42:15         parsed_url = parse_url(url)
12:42:15         destination_scheme = parsed_url.scheme
12:42:15 
12:42:15         if headers is None:
12:42:15             headers = self.headers
12:42:15 
12:42:15         if not isinstance(retries, Retry):
12:42:15             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
12:42:15 
12:42:15         if release_conn is None:
12:42:15             release_conn = preload_content
12:42:15 
12:42:15         # Check host
12:42:15         if assert_same_host and not self.is_same_host(url):
12:42:15             raise HostChangedError(self, url, retries)
12:42:15 
12:42:15         # Ensure that the URL we're connecting to is properly encoded
12:42:15         if url.startswith("/"):
12:42:15             url = to_str(_encode_target(url))
12:42:15         else:
12:42:15             url = to_str(parsed_url.url)
12:42:15 
12:42:15         conn = None
12:42:15 
12:42:15         # Track whether `conn` needs to be released before
12:42:15         # returning/raising/recursing. Update this variable if necessary, and
12:42:15         # leave `release_conn` constant throughout the function. That way, if
12:42:15         # the function recurses, the original value of `release_conn` will be
12:42:15         # passed down into the recursive call, and its value will be respected.
12:42:15         #
12:42:15         # See issue #651 [1] for details.
12:42:15         #
12:42:15         # [1]
12:42:15         release_this_conn = release_conn
12:42:15 
12:42:15         http_tunnel_required = connection_requires_http_tunnel(
12:42:15             self.proxy, self.proxy_config, destination_scheme
12:42:15         )
12:42:15 
12:42:15         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
12:42:15         # have to copy the headers dict so we can safely change it without those
12:42:15         # changes being reflected in anyone else's copy.
12:42:15         if not http_tunnel_required:
12:42:15             headers = headers.copy()  # type: ignore[attr-defined]
12:42:15             headers.update(self.proxy_headers)  # type: ignore[union-attr]
12:42:15 
12:42:15         # Must keep the exception bound to a separate variable or else Python 3
12:42:15         # complains about UnboundLocalError.
12:42:15         err = None
12:42:15 
12:42:15         # Keep track of whether we cleanly exited the except block. This
12:42:15         # ensures we do proper cleanup in finally.
12:42:15         clean_exit = False
12:42:15 
12:42:15         # Rewind body position, if needed. Record current position
12:42:15         # for future rewinds in the event of a redirect/retry.
12:42:15         body_pos = set_file_position(body, body_pos)
12:42:15 
12:42:15         try:
12:42:15             # Request a connection from the queue.
12:42:15             timeout_obj = self._get_timeout(timeout)
12:42:15             conn = self._get_conn(timeout=pool_timeout)
12:42:15 
12:42:15             conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
12:42:15 
12:42:15             # Is this a closed/new connection that requires CONNECT tunnelling?
12:42:15             if self.proxy is not None and http_tunnel_required and conn.is_closed:
12:42:15                 try:
12:42:15                     self._prepare_proxy(conn)
12:42:15                 except (BaseSSLError, OSError, SocketTimeout) as e:
12:42:15                     self._raise_timeout(
12:42:15                         err=e, url=self.proxy.url, timeout_value=conn.timeout
12:42:15                     )
12:42:15                     raise
12:42:15 
12:42:15             # If we're going to release the connection in ``finally:``, then
12:42:15             # the response doesn't need to know about the connection. Otherwise
12:42:15             # it will also try to release it and we'll have a double-release
12:42:15             # mess.
12:42:15             response_conn = conn if not release_conn else None
12:42:15 
12:42:15             # Make the request on the HTTPConnection object
12:42:15 >           response = self._make_request(
12:42:15                 conn,
12:42:15                 method,
12:42:15                 url,
12:42:15                 timeout=timeout_obj,
12:42:15                 body=body,
12:42:15                 headers=headers,
12:42:15                 chunked=chunked,
12:42:15                 retries=retries,
12:42:15                 response_conn=response_conn,
12:42:15                 preload_content=preload_content,
12:42:15                 decode_content=decode_content,
12:42:15                 **response_kw,
12:42:15             )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:787: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:493: in _make_request
12:42:15     conn.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:500: in request
12:42:15     self.endheaders()
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1298: in endheaders
12:42:15     self._send_output(message_body, encode_chunked=encode_chunked)
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1058: in _send_output
12:42:15     self.send(msg)
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:996: in send
12:42:15     self.connect()
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:331: in connect
12:42:15     self.sock = self._new_conn()
12:42:15                 ^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 self =
12:42:15 
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15 
12:42:15         :return: New socket connection.
12:42:15 """ 12:42:15 try: 12:42:15 sock = connection.create_connection( 12:42:15 (self._dns_host, self.port), 12:42:15 self.timeout, 12:42:15 source_address=self.source_address, 12:42:15 socket_options=self.socket_options, 12:42:15 ) 12:42:15 except socket.gaierror as e: 12:42:15 raise NameResolutionError(self.host, self, e) from e 12:42:15 except SocketTimeout as e: 12:42:15 raise ConnectTimeoutError( 12:42:15 self, 12:42:15 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 12:42:15 ) from e 12:42:15 12:42:15 except OSError as e: 12:42:15 > raise NewConnectionError( 12:42:15 self, f"Failed to establish a new connection: {e}" 12:42:15 ) from e 12:42:15 E urllib3.exceptions.NewConnectionError: HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:219: NewConnectionError 12:42:15 12:42:15 The above exception was the direct cause of the following exception: 12:42:15 12:42:15 self = 12:42:15 request = , stream = False 12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None 12:42:15 proxies = OrderedDict() 12:42:15 12:42:15 def send( 12:42:15 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:42:15 ): 12:42:15 """Sends PreparedRequest object. Returns Response object. 12:42:15 12:42:15 :param request: The :class:`PreparedRequest ` being sent. 12:42:15 :param stream: (optional) Whether to stream the request content. 12:42:15 :param timeout: (optional) How long to wait for the server to send 12:42:15 data before giving up, as a float, or a :ref:`(connect timeout, 12:42:15 read timeout) ` tuple. 
12:42:15 :type timeout: float or tuple or urllib3 Timeout object 12:42:15 :param verify: (optional) Either a boolean, in which case it controls whether 12:42:15 we verify the server's TLS certificate, or a string, in which case it 12:42:15 must be a path to a CA bundle to use 12:42:15 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:42:15 :param proxies: (optional) The proxies dictionary to apply to the request. 12:42:15 :rtype: requests.Response 12:42:15 """ 12:42:15 12:42:15 try: 12:42:15 conn = self.get_connection_with_tls_context( 12:42:15 request, verify, proxies=proxies, cert=cert 12:42:15 ) 12:42:15 except LocationValueError as e: 12:42:15 raise InvalidURL(e, request=request) 12:42:15 12:42:15 self.cert_verify(conn, request.url, verify, cert) 12:42:15 url = self.request_url(request, proxies) 12:42:15 self.add_headers( 12:42:15 request, 12:42:15 stream=stream, 12:42:15 timeout=timeout, 12:42:15 verify=verify, 12:42:15 cert=cert, 12:42:15 proxies=proxies, 12:42:15 ) 12:42:15 12:42:15 chunked = not (request.body is None or "Content-Length" in request.headers) 12:42:15 12:42:15 if isinstance(timeout, tuple): 12:42:15 try: 12:42:15 connect, read = timeout 12:42:15 timeout = TimeoutSauce(connect=connect, read=read) 12:42:15 except ValueError: 12:42:15 raise ValueError( 12:42:15 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:42:15 f"or a single float to set both timeouts to the same value." 
12:42:15 ) 12:42:15 elif isinstance(timeout, TimeoutSauce): 12:42:15 pass 12:42:15 else: 12:42:15 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:42:15 12:42:15 try: 12:42:15 > resp = conn.urlopen( 12:42:15 method=request.method, 12:42:15 url=url, 12:42:15 body=request.body, 12:42:15 headers=request.headers, 12:42:15 redirect=False, 12:42:15 assert_same_host=False, 12:42:15 preload_content=False, 12:42:15 decode_content=False, 12:42:15 retries=self.max_retries, 12:42:15 timeout=timeout, 12:42:15 chunked=chunked, 12:42:15 ) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:644: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen 12:42:15 retries = retries.increment( 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:42:15 method = 'DELETE' 12:42:15 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01' 12:42:15 response = None 12:42:15 error = NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused") 12:42:15 _pool = 12:42:15 _stacktrace = 12:42:15 12:42:15 def increment( 12:42:15 self, 12:42:15 method: str | None = None, 12:42:15 url: str | None = None, 12:42:15 response: BaseHTTPResponse | None = None, 12:42:15 error: Exception | None = None, 12:42:15 _pool: ConnectionPool | None = None, 12:42:15 _stacktrace: TracebackType | None = None, 12:42:15 ) -> Self: 12:42:15 """Return a new Retry object with incremented retry counters. 12:42:15 12:42:15 :param response: A response object, or None, if the server did not 12:42:15 return a response. 
12:42:15 :type response: :class:`~urllib3.response.BaseHTTPResponse` 12:42:15 :param Exception error: An error encountered during the request, or 12:42:15 None if the response was received successfully. 12:42:15 12:42:15 :return: A new ``Retry`` object. 12:42:15 """ 12:42:15 if self.total is False and error: 12:42:15 # Disabled, indicate to re-raise the error. 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 12:42:15 total = self.total 12:42:15 if total is not None: 12:42:15 total -= 1 12:42:15 12:42:15 connect = self.connect 12:42:15 read = self.read 12:42:15 redirect = self.redirect 12:42:15 status_count = self.status 12:42:15 other = self.other 12:42:15 cause = "unknown" 12:42:15 status = None 12:42:15 redirect_location = None 12:42:15 12:42:15 if error and self._is_connection_error(error): 12:42:15 # Connect retry? 12:42:15 if connect is False: 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 elif connect is not None: 12:42:15 connect -= 1 12:42:15 12:42:15 elif error and self._is_read_error(error): 12:42:15 # Read retry? 12:42:15 if read is False or method is None or not self._is_method_retryable(method): 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 elif read is not None: 12:42:15 read -= 1 12:42:15 12:42:15 elif error: 12:42:15 # Other retry? 12:42:15 if other is not None: 12:42:15 other -= 1 12:42:15 12:42:15 elif response and response.get_redirect_location(): 12:42:15 # Redirect retry? 
12:42:15 if redirect is not None: 12:42:15 redirect -= 1 12:42:15 cause = "too many redirects" 12:42:15 response_redirect_location = response.get_redirect_location() 12:42:15 if response_redirect_location: 12:42:15 redirect_location = response_redirect_location 12:42:15 status = response.status 12:42:15 12:42:15 else: 12:42:15 # Incrementing because of a server error like a 500 in 12:42:15 # status_forcelist and the given method is in the allowed_methods 12:42:15 cause = ResponseError.GENERIC_ERROR 12:42:15 if response and response.status: 12:42:15 if status_count is not None: 12:42:15 status_count -= 1 12:42:15 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) 12:42:15 status = response.status 12:42:15 12:42:15 history = self.history + ( 12:42:15 RequestHistory(method, url, error, status, redirect_location), 12:42:15 ) 12:42:15 12:42:15 new_retry = self.new( 12:42:15 total=total, 12:42:15 connect=connect, 12:42:15 read=read, 12:42:15 redirect=redirect, 12:42:15 status=status_count, 12:42:15 other=other, 12:42:15 history=history, 12:42:15 ) 12:42:15 12:42:15 if new_retry.is_exhausted(): 12:42:15 reason = error or ResponseError(cause) 12:42:15 > raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type] 12:42:15 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 12:42:15 E urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01 (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused")) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:535: MaxRetryError 12:42:15 12:42:15 During handling of the above exception, another exception occurred: 12:42:15 12:42:15 self = 12:42:15 12:42:15 def test_19_rdm_device_disconnection(self): 12:42:15 > response = 
test_utils.unmount_device("ROADMA01") 12:42:15 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 12:42:15 12:42:15 transportpce_tests/1.2.1/test01_portmapping.py:213: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 transportpce_tests/common/test_utils.py:398: in unmount_device 12:42:15 response = delete_request(url[RESTCONF_VERSION].format('{}', node)) 12:42:15 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 12:42:15 transportpce_tests/common/test_utils.py:134: in delete_request 12:42:15 return requests.request( 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request 12:42:15 return session.request(method=method, url=url, **kwargs) 12:42:15 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request 12:42:15 resp = self.send(prep, **send_kwargs) 12:42:15 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send 12:42:15 r = adapter.send(request, **kwargs) 12:42:15 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 self = 12:42:15 request = , stream = False 12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None 12:42:15 proxies = OrderedDict() 12:42:15 12:42:15 def send( 12:42:15 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:42:15 ): 12:42:15 """Sends PreparedRequest object. Returns Response object. 12:42:15 12:42:15 :param request: The :class:`PreparedRequest ` being sent. 12:42:15 :param stream: (optional) Whether to stream the request content. 12:42:15 :param timeout: (optional) How long to wait for the server to send 12:42:15 data before giving up, as a float, or a :ref:`(connect timeout, 12:42:15 read timeout) ` tuple. 
12:42:15     :type timeout: float or tuple or urllib3 Timeout object
12:42:15     :param verify: (optional) Either a boolean, in which case it controls whether
12:42:15         we verify the server's TLS certificate, or a string, in which case it
12:42:15         must be a path to a CA bundle to use
12:42:15     :param cert: (optional) Any user-provided SSL certificate to be trusted.
12:42:15     :param proxies: (optional) The proxies dictionary to apply to the request.
12:42:15     :rtype: requests.Response
12:42:15     """
12:42:15 
12:42:15     try:
12:42:15         conn = self.get_connection_with_tls_context(
12:42:15             request, verify, proxies=proxies, cert=cert
12:42:15         )
12:42:15     except LocationValueError as e:
12:42:15         raise InvalidURL(e, request=request)
12:42:15 
12:42:15     self.cert_verify(conn, request.url, verify, cert)
12:42:15     url = self.request_url(request, proxies)
12:42:15     self.add_headers(
12:42:15         request,
12:42:15         stream=stream,
12:42:15         timeout=timeout,
12:42:15         verify=verify,
12:42:15         cert=cert,
12:42:15         proxies=proxies,
12:42:15     )
12:42:15 
12:42:15     chunked = not (request.body is None or "Content-Length" in request.headers)
12:42:15 
12:42:15     if isinstance(timeout, tuple):
12:42:15         try:
12:42:15             connect, read = timeout
12:42:15             timeout = TimeoutSauce(connect=connect, read=read)
12:42:15         except ValueError:
12:42:15             raise ValueError(
12:42:15                 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
12:42:15                 f"or a single float to set both timeouts to the same value."
12:42:15             )
12:42:15     elif isinstance(timeout, TimeoutSauce):
12:42:15         pass
12:42:15     else:
12:42:15         timeout = TimeoutSauce(connect=timeout, read=timeout)
12:42:15 
12:42:15     try:
12:42:15         resp = conn.urlopen(
12:42:15             method=request.method,
12:42:15             url=url,
12:42:15             body=request.body,
12:42:15             headers=request.headers,
12:42:15             redirect=False,
12:42:15             assert_same_host=False,
12:42:15             preload_content=False,
12:42:15             decode_content=False,
12:42:15             retries=self.max_retries,
12:42:15             timeout=timeout,
12:42:15             chunked=chunked,
12:42:15         )
12:42:15 
12:42:15     except (ProtocolError, OSError) as err:
12:42:15         raise ConnectionError(err, request=request)
12:42:15 
12:42:15     except MaxRetryError as e:
12:42:15         if isinstance(e.reason, ConnectTimeoutError):
12:42:15             # TODO: Remove this in 3.0.0: see #2811
12:42:15             if not isinstance(e.reason, NewConnectionError):
12:42:15                 raise ConnectTimeout(e, request=request)
12:42:15 
12:42:15         if isinstance(e.reason, ResponseError):
12:42:15             raise RetryError(e, request=request)
12:42:15 
12:42:15         if isinstance(e.reason, _ProxyError):
12:42:15             raise ProxyError(e, request=request)
12:42:15 
12:42:15         if isinstance(e.reason, _SSLError):
12:42:15             # This branch is for urllib3 v1.22 and later.
12:42:15             raise SSLError(e, request=request)
12:42:15 
12:42:15 >       raise ConnectionError(e, request=request)
12:42:15 E       requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01 (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:677: ConnectionError
12:42:15 ----------------------------- Captured stdout call -----------------------------
12:42:15 execution of test_19_rdm_device_disconnection
12:42:15 _________ TestTransportPCEPortmapping.test_20_rdm_device_disconnected __________
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15 def _new_conn(self) -> socket.socket:
12:42:15     """Establish a socket connection and set nodelay settings on it.
12:42:15 
12:42:15     :return: New socket connection.
12:42:15     """
12:42:15     try:
12:42:15 >       sock = connection.create_connection(
12:42:15             (self._dns_host, self.port),
12:42:15             self.timeout,
12:42:15             source_address=self.source_address,
12:42:15             socket_options=self.socket_options,
12:42:15         )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:204: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection
12:42:15 raise err
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 address = ('localhost', 8191), timeout = 30, source_address = None
12:42:15 socket_options = [(6, 1, 1)]
12:42:15 
12:42:15 def create_connection(
12:42:15     address: tuple[str, int],
12:42:15     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
12:42:15     source_address: tuple[str, int] | None = None,
12:42:15     socket_options: _TYPE_SOCKET_OPTIONS | None = None,
12:42:15 ) -> socket.socket:
12:42:15     """Connect to *address* and return the socket object.
12:42:15 
12:42:15     Convenience function. Connect to *address* (a 2-tuple ``(host,
12:42:15     port)``) and return the socket object. Passing the optional
12:42:15     *timeout* parameter will set the timeout on the socket instance
12:42:15     before attempting to connect. If no *timeout* is supplied, the
12:42:15     global default timeout setting returned by :func:`socket.getdefaulttimeout`
12:42:15     is used. If *source_address* is set it must be a tuple of (host, port)
12:42:15     for the socket to bind as a source address before making the connection.
12:42:15     An host of '' or port 0 tells the OS to use the default.
12:42:15     """
12:42:15 
12:42:15     host, port = address
12:42:15     if host.startswith("["):
12:42:15         host = host.strip("[]")
12:42:15     err = None
12:42:15 
12:42:15     # Using the value from allowed_gai_family() in the context of getaddrinfo lets
12:42:15     # us select whether to work with IPv4 DNS records, IPv6 records, or both.
12:42:15     # The original create_connection function always returns all records.
12:42:15     family = allowed_gai_family()
12:42:15 
12:42:15     try:
12:42:15         host.encode("idna")
12:42:15     except UnicodeError:
12:42:15         raise LocationParseError(f"'{host}', label empty or too long") from None
12:42:15 
12:42:15     for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
12:42:15         af, socktype, proto, canonname, sa = res
12:42:15         sock = None
12:42:15         try:
12:42:15             sock = socket.socket(af, socktype, proto)
12:42:15 
12:42:15             # If provided, set socket level options before connecting.
12:42:15             _set_socket_options(sock, socket_options)
12:42:15 
12:42:15             if timeout is not _DEFAULT_TIMEOUT:
12:42:15                 sock.settimeout(timeout)
12:42:15             if source_address:
12:42:15                 sock.bind(source_address)
12:42:15 >           sock.connect(sa)
12:42:15 E           ConnectionRefusedError: [Errno 111] Connection refused
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
12:42:15 
12:42:15 The above exception was the direct cause of the following exception:
12:42:15 
12:42:15 self = 
12:42:15 method = 'GET'
12:42:15 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01?content=nonconfig'
12:42:15 body = None
12:42:15 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='}
12:42:15 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
12:42:15 redirect = False, assert_same_host = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), pool_timeout = None
12:42:15 release_conn = False, chunked = False, body_pos = None, preload_content = False
12:42:15 decode_content = False, response_kw = {}
12:42:15 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01', query='content=nonconfig', fragment=None)
12:42:15 destination_scheme = None, conn = None, release_this_conn = True
12:42:15 http_tunnel_required = False, err = None, clean_exit = False
12:42:15 
12:42:15 def urlopen( # type: ignore[override]
12:42:15     self,
12:42:15     method: str,
12:42:15     url: str,
12:42:15     body: _TYPE_BODY | None = None,
12:42:15     headers: typing.Mapping[str, str] | None = None,
12:42:15     retries: Retry | bool | int | None = None,
12:42:15     redirect: bool = True,
12:42:15     assert_same_host: bool = True,
12:42:15     timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
12:42:15     pool_timeout: int | None = None,
12:42:15     release_conn: bool | None = None,
12:42:15     chunked: bool = False,
12:42:15     body_pos: _TYPE_BODY_POSITION | None = None,
12:42:15     preload_content: bool = True,
12:42:15     decode_content: bool = True,
12:42:15     **response_kw: typing.Any,
12:42:15 ) -> BaseHTTPResponse:
12:42:15     """
12:42:15     Get a connection from the pool and perform an HTTP request. This is the
12:42:15     lowest level call for making a request, so you'll need to specify all
12:42:15     the raw details.
12:42:15 
12:42:15     .. note::
12:42:15 
12:42:15         More commonly, it's appropriate to use a convenience method
12:42:15         such as :meth:`request`.
12:42:15 
12:42:15     .. note::
12:42:15 
12:42:15         `release_conn` will only behave as expected if
12:42:15         `preload_content=False` because we want to make
12:42:15         `preload_content=False` the default behaviour someday soon without
12:42:15         breaking backwards compatibility.
12:42:15 
12:42:15     :param method:
12:42:15         HTTP request method (such as GET, POST, PUT, etc.)
12:42:15 
12:42:15     :param url:
12:42:15         The URL to perform the request on.
12:42:15 
12:42:15     :param body:
12:42:15         Data to send in the request body, either :class:`str`, :class:`bytes`,
12:42:15         an iterable of :class:`str`/:class:`bytes`, or a file-like object.
12:42:15 
12:42:15     :param headers:
12:42:15         Dictionary of custom headers to send, such as User-Agent,
12:42:15         If-None-Match, etc. If None, pool headers are used. If provided,
12:42:15         these headers completely replace any pool-specific headers.
12:42:15 
12:42:15     :param retries:
12:42:15         Configure the number of retries to allow before raising a
12:42:15         :class:`~urllib3.exceptions.MaxRetryError` exception.
12:42:15 
12:42:15         If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
12:42:15         :class:`~urllib3.util.retry.Retry` object for fine-grained control
12:42:15         over different types of retries.
12:42:15         Pass an integer number to retry connection errors that many times,
12:42:15         but no other types of errors. Pass zero to never retry.
12:42:15 
12:42:15         If ``False``, then retries are disabled and any exception is raised
12:42:15         immediately. Also, instead of raising a MaxRetryError on redirects,
12:42:15         the redirect response will be returned.
12:42:15 
12:42:15     :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
12:42:15 
12:42:15     :param redirect:
12:42:15         If True, automatically handle redirects (status codes 301, 302,
12:42:15         303, 307, 308). Each redirect counts as a retry. Disabling retries
12:42:15         will disable redirect, too.
12:42:15 
12:42:15     :param assert_same_host:
12:42:15         If ``True``, will make sure that the host of the pool requests is
12:42:15         consistent else will raise HostChangedError. When ``False``, you can
12:42:15         use the pool on an HTTP proxy and request foreign hosts.
12:42:15 
12:42:15     :param timeout:
12:42:15         If specified, overrides the default timeout for this one
12:42:15         request. It may be a float (in seconds) or an instance of
12:42:15         :class:`urllib3.util.Timeout`.
12:42:15 
12:42:15     :param pool_timeout:
12:42:15         If set and the pool is set to block=True, then this method will
12:42:15         block for ``pool_timeout`` seconds and raise EmptyPoolError if no
12:42:15         connection is available within the time period.
12:42:15 
12:42:15     :param bool preload_content:
12:42:15         If True, the response's body will be preloaded into memory.
12:42:15 
12:42:15     :param bool decode_content:
12:42:15         If True, will attempt to decode the body based on the
12:42:15         'content-encoding' header.
12:42:15 
12:42:15     :param release_conn:
12:42:15         If False, then the urlopen call will not release the connection
12:42:15         back into the pool once a response is received (but will release if
12:42:15         you read the entire contents of the response such as when
12:42:15         `preload_content=True`). This is useful if you're not preloading
12:42:15         the response's content immediately. You will need to call
12:42:15         ``r.release_conn()`` on the response ``r`` to return the connection
12:42:15         back into the pool. If None, it takes the value of ``preload_content``
12:42:15         which defaults to ``True``.
12:42:15 
12:42:15     :param bool chunked:
12:42:15         If True, urllib3 will send the body using chunked transfer
12:42:15         encoding. Otherwise, urllib3 will send the body using the standard
12:42:15         content-length form. Defaults to False.
12:42:15 
12:42:15     :param int body_pos:
12:42:15         Position to seek to in file-like body in the event of a retry or
12:42:15         redirect. Typically this won't need to be set because urllib3 will
12:42:15         auto-populate the value when needed.
12:42:15 """ 12:42:15 parsed_url = parse_url(url) 12:42:15 destination_scheme = parsed_url.scheme 12:42:15 12:42:15 if headers is None: 12:42:15 headers = self.headers 12:42:15 12:42:15 if not isinstance(retries, Retry): 12:42:15 retries = Retry.from_int(retries, redirect=redirect, default=self.retries) 12:42:15 12:42:15 if release_conn is None: 12:42:15 release_conn = preload_content 12:42:15 12:42:15 # Check host 12:42:15 if assert_same_host and not self.is_same_host(url): 12:42:15 raise HostChangedError(self, url, retries) 12:42:15 12:42:15 # Ensure that the URL we're connecting to is properly encoded 12:42:15 if url.startswith("/"): 12:42:15 url = to_str(_encode_target(url)) 12:42:15 else: 12:42:15 url = to_str(parsed_url.url) 12:42:15 12:42:15 conn = None 12:42:15 12:42:15 # Track whether `conn` needs to be released before 12:42:15 # returning/raising/recursing. Update this variable if necessary, and 12:42:15 # leave `release_conn` constant throughout the function. That way, if 12:42:15 # the function recurses, the original value of `release_conn` will be 12:42:15 # passed down into the recursive call, and its value will be respected. 12:42:15 # 12:42:15 # See issue #651 [1] for details. 12:42:15 # 12:42:15 # [1] 12:42:15 release_this_conn = release_conn 12:42:15 12:42:15 http_tunnel_required = connection_requires_http_tunnel( 12:42:15 self.proxy, self.proxy_config, destination_scheme 12:42:15 ) 12:42:15 12:42:15 # Merge the proxy headers. Only done when not using HTTP CONNECT. We 12:42:15 # have to copy the headers dict so we can safely change it without those 12:42:15 # changes being reflected in anyone else's copy. 12:42:15 if not http_tunnel_required: 12:42:15 headers = headers.copy() # type: ignore[attr-defined] 12:42:15 headers.update(self.proxy_headers) # type: ignore[union-attr] 12:42:15 12:42:15 # Must keep the exception bound to a separate variable or else Python 3 12:42:15 # complains about UnboundLocalError. 
12:42:15 err = None 12:42:15 12:42:15 # Keep track of whether we cleanly exited the except block. This 12:42:15 # ensures we do proper cleanup in finally. 12:42:15 clean_exit = False 12:42:15 12:42:15 # Rewind body position, if needed. Record current position 12:42:15 # for future rewinds in the event of a redirect/retry. 12:42:15 body_pos = set_file_position(body, body_pos) 12:42:15 12:42:15 try: 12:42:15 # Request a connection from the queue. 12:42:15 timeout_obj = self._get_timeout(timeout) 12:42:15 conn = self._get_conn(timeout=pool_timeout) 12:42:15 12:42:15 conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] 12:42:15 12:42:15 # Is this a closed/new connection that requires CONNECT tunnelling? 12:42:15 if self.proxy is not None and http_tunnel_required and conn.is_closed: 12:42:15 try: 12:42:15 self._prepare_proxy(conn) 12:42:15 except (BaseSSLError, OSError, SocketTimeout) as e: 12:42:15 self._raise_timeout( 12:42:15 err=e, url=self.proxy.url, timeout_value=conn.timeout 12:42:15 ) 12:42:15 raise 12:42:15 12:42:15 # If we're going to release the connection in ``finally:``, then 12:42:15 # the response doesn't need to know about the connection. Otherwise 12:42:15 # it will also try to release it and we'll have a double-release 12:42:15 # mess. 
12:42:15 response_conn = conn if not release_conn else None 12:42:15 12:42:15 # Make the request on the HTTPConnection object 12:42:15 > response = self._make_request( 12:42:15 conn, 12:42:15 method, 12:42:15 url, 12:42:15 timeout=timeout_obj, 12:42:15 body=body, 12:42:15 headers=headers, 12:42:15 chunked=chunked, 12:42:15 retries=retries, 12:42:15 response_conn=response_conn, 12:42:15 preload_content=preload_content, 12:42:15 decode_content=decode_content, 12:42:15 **response_kw, 12:42:15 ) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:787: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:493: in _make_request 12:42:15 conn.request( 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:500: in request 12:42:15 self.endheaders() 12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1298: in endheaders 12:42:15 self._send_output(message_body, encode_chunked=encode_chunked) 12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1058: in _send_output 12:42:15 self.send(msg) 12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:996: in send 12:42:15 self.connect() 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:331: in connect 12:42:15 self.sock = self._new_conn() 12:42:15 ^^^^^^^^^^^^^^^^ 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 self = 12:42:15 12:42:15 def _new_conn(self) -> socket.socket: 12:42:15 """Establish a socket connection and set nodelay settings on it. 12:42:15 12:42:15 :return: New socket connection. 
12:42:15 """ 12:42:15 try: 12:42:15 sock = connection.create_connection( 12:42:15 (self._dns_host, self.port), 12:42:15 self.timeout, 12:42:15 source_address=self.source_address, 12:42:15 socket_options=self.socket_options, 12:42:15 ) 12:42:15 except socket.gaierror as e: 12:42:15 raise NameResolutionError(self.host, self, e) from e 12:42:15 except SocketTimeout as e: 12:42:15 raise ConnectTimeoutError( 12:42:15 self, 12:42:15 f"Connection to {self.host} timed out. (connect timeout={self.timeout})", 12:42:15 ) from e 12:42:15 12:42:15 except OSError as e: 12:42:15 > raise NewConnectionError( 12:42:15 self, f"Failed to establish a new connection: {e}" 12:42:15 ) from e 12:42:15 E urllib3.exceptions.NewConnectionError: HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:219: NewConnectionError 12:42:15 12:42:15 The above exception was the direct cause of the following exception: 12:42:15 12:42:15 self = 12:42:15 request = , stream = False 12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None 12:42:15 proxies = OrderedDict() 12:42:15 12:42:15 def send( 12:42:15 self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None 12:42:15 ): 12:42:15 """Sends PreparedRequest object. Returns Response object. 12:42:15 12:42:15 :param request: The :class:`PreparedRequest ` being sent. 12:42:15 :param stream: (optional) Whether to stream the request content. 12:42:15 :param timeout: (optional) How long to wait for the server to send 12:42:15 data before giving up, as a float, or a :ref:`(connect timeout, 12:42:15 read timeout) ` tuple. 
12:42:15 :type timeout: float or tuple or urllib3 Timeout object 12:42:15 :param verify: (optional) Either a boolean, in which case it controls whether 12:42:15 we verify the server's TLS certificate, or a string, in which case it 12:42:15 must be a path to a CA bundle to use 12:42:15 :param cert: (optional) Any user-provided SSL certificate to be trusted. 12:42:15 :param proxies: (optional) The proxies dictionary to apply to the request. 12:42:15 :rtype: requests.Response 12:42:15 """ 12:42:15 12:42:15 try: 12:42:15 conn = self.get_connection_with_tls_context( 12:42:15 request, verify, proxies=proxies, cert=cert 12:42:15 ) 12:42:15 except LocationValueError as e: 12:42:15 raise InvalidURL(e, request=request) 12:42:15 12:42:15 self.cert_verify(conn, request.url, verify, cert) 12:42:15 url = self.request_url(request, proxies) 12:42:15 self.add_headers( 12:42:15 request, 12:42:15 stream=stream, 12:42:15 timeout=timeout, 12:42:15 verify=verify, 12:42:15 cert=cert, 12:42:15 proxies=proxies, 12:42:15 ) 12:42:15 12:42:15 chunked = not (request.body is None or "Content-Length" in request.headers) 12:42:15 12:42:15 if isinstance(timeout, tuple): 12:42:15 try: 12:42:15 connect, read = timeout 12:42:15 timeout = TimeoutSauce(connect=connect, read=read) 12:42:15 except ValueError: 12:42:15 raise ValueError( 12:42:15 f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " 12:42:15 f"or a single float to set both timeouts to the same value." 
12:42:15 ) 12:42:15 elif isinstance(timeout, TimeoutSauce): 12:42:15 pass 12:42:15 else: 12:42:15 timeout = TimeoutSauce(connect=timeout, read=timeout) 12:42:15 12:42:15 try: 12:42:15 > resp = conn.urlopen( 12:42:15 method=request.method, 12:42:15 url=url, 12:42:15 body=request.body, 12:42:15 headers=request.headers, 12:42:15 redirect=False, 12:42:15 assert_same_host=False, 12:42:15 preload_content=False, 12:42:15 decode_content=False, 12:42:15 retries=self.max_retries, 12:42:15 timeout=timeout, 12:42:15 chunked=chunked, 12:42:15 ) 12:42:15 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:644: 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen 12:42:15 retries = retries.increment( 12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 12:42:15 12:42:15 self = Retry(total=0, connect=None, read=False, redirect=None, status=None) 12:42:15 method = 'GET' 12:42:15 url = '/rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01?content=nonconfig' 12:42:15 response = None 12:42:15 error = NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused") 12:42:15 _pool = 12:42:15 _stacktrace = 12:42:15 12:42:15 def increment( 12:42:15 self, 12:42:15 method: str | None = None, 12:42:15 url: str | None = None, 12:42:15 response: BaseHTTPResponse | None = None, 12:42:15 error: Exception | None = None, 12:42:15 _pool: ConnectionPool | None = None, 12:42:15 _stacktrace: TracebackType | None = None, 12:42:15 ) -> Self: 12:42:15 """Return a new Retry object with incremented retry counters. 12:42:15 12:42:15 :param response: A response object, or None, if the server did not 12:42:15 return a response. 
12:42:15 :type response: :class:`~urllib3.response.BaseHTTPResponse` 12:42:15 :param Exception error: An error encountered during the request, or 12:42:15 None if the response was received successfully. 12:42:15 12:42:15 :return: A new ``Retry`` object. 12:42:15 """ 12:42:15 if self.total is False and error: 12:42:15 # Disabled, indicate to re-raise the error. 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 12:42:15 total = self.total 12:42:15 if total is not None: 12:42:15 total -= 1 12:42:15 12:42:15 connect = self.connect 12:42:15 read = self.read 12:42:15 redirect = self.redirect 12:42:15 status_count = self.status 12:42:15 other = self.other 12:42:15 cause = "unknown" 12:42:15 status = None 12:42:15 redirect_location = None 12:42:15 12:42:15 if error and self._is_connection_error(error): 12:42:15 # Connect retry? 12:42:15 if connect is False: 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 elif connect is not None: 12:42:15 connect -= 1 12:42:15 12:42:15 elif error and self._is_read_error(error): 12:42:15 # Read retry? 12:42:15 if read is False or method is None or not self._is_method_retryable(method): 12:42:15 raise reraise(type(error), error, _stacktrace) 12:42:15 elif read is not None: 12:42:15 read -= 1 12:42:15 12:42:15 elif error: 12:42:15 # Other retry? 12:42:15 if other is not None: 12:42:15 other -= 1 12:42:15 12:42:15 elif response and response.get_redirect_location(): 12:42:15 # Redirect retry? 
12:42:15             if redirect is not None:
12:42:15                 redirect -= 1
12:42:15             cause = "too many redirects"
12:42:15             response_redirect_location = response.get_redirect_location()
12:42:15             if response_redirect_location:
12:42:15                 redirect_location = response_redirect_location
12:42:15             status = response.status
12:42:15 
12:42:15         else:
12:42:15             # Incrementing because of a server error like a 500 in
12:42:15             # status_forcelist and the given method is in the allowed_methods
12:42:15             cause = ResponseError.GENERIC_ERROR
12:42:15             if response and response.status:
12:42:15                 if status_count is not None:
12:42:15                     status_count -= 1
12:42:15                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
12:42:15             status = response.status
12:42:15 
12:42:15         history = self.history + (
12:42:15             RequestHistory(method, url, error, status, redirect_location),
12:42:15         )
12:42:15 
12:42:15         new_retry = self.new(
12:42:15             total=total,
12:42:15             connect=connect,
12:42:15             read=read,
12:42:15             redirect=redirect,
12:42:15             status=status_count,
12:42:15             other=other,
12:42:15             history=history,
12:42:15         )
12:42:15 
12:42:15         if new_retry.is_exhausted():
12:42:15             reason = error or ResponseError(cause)
12:42:15 >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
12:42:15             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01?content=nonconfig (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:535: MaxRetryError
12:42:15 
12:42:15 During handling of the above exception, another exception occurred:
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def test_20_rdm_device_disconnected(self):
12:42:15 >       response = test_utils.check_device_connection("ROADMA01")
12:42:15         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 
12:42:15 transportpce_tests/1.2.1/test01_portmapping.py:217: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 transportpce_tests/common/test_utils.py:409: in check_device_connection
12:42:15     response = get_request(url[RESTCONF_VERSION].format('{}', node))
12:42:15     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 transportpce_tests/common/test_utils.py:117: in get_request
12:42:15     return requests.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request
12:42:15     return session.request(method=method, url=url, **kwargs)
12:42:15     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request
12:42:15     resp = self.send(prep, **send_kwargs)
12:42:15     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send
12:42:15     r = adapter.send(request, **kwargs)
12:42:15     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 self = 
12:42:15 request = , stream = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
12:42:15 proxies = OrderedDict()
12:42:15 
12:42:15     def send(
12:42:15         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
12:42:15     ):
12:42:15         """Sends PreparedRequest object. Returns Response object.
12:42:15 
12:42:15         :param request: The :class:`PreparedRequest ` being sent.
12:42:15         :param stream: (optional) Whether to stream the request content.
12:42:15         :param timeout: (optional) How long to wait for the server to send
12:42:15             data before giving up, as a float, or a :ref:`(connect timeout,
12:42:15             read timeout) ` tuple.
12:42:15         :type timeout: float or tuple or urllib3 Timeout object
12:42:15         :param verify: (optional) Either a boolean, in which case it controls whether
12:42:15             we verify the server's TLS certificate, or a string, in which case it
12:42:15             must be a path to a CA bundle to use
12:42:15         :param cert: (optional) Any user-provided SSL certificate to be trusted.
12:42:15         :param proxies: (optional) The proxies dictionary to apply to the request.
12:42:15         :rtype: requests.Response
12:42:15         """
12:42:15 
12:42:15         try:
12:42:15             conn = self.get_connection_with_tls_context(
12:42:15                 request, verify, proxies=proxies, cert=cert
12:42:15             )
12:42:15         except LocationValueError as e:
12:42:15             raise InvalidURL(e, request=request)
12:42:15 
12:42:15         self.cert_verify(conn, request.url, verify, cert)
12:42:15         url = self.request_url(request, proxies)
12:42:15         self.add_headers(
12:42:15             request,
12:42:15             stream=stream,
12:42:15             timeout=timeout,
12:42:15             verify=verify,
12:42:15             cert=cert,
12:42:15             proxies=proxies,
12:42:15         )
12:42:15 
12:42:15         chunked = not (request.body is None or "Content-Length" in request.headers)
12:42:15 
12:42:15         if isinstance(timeout, tuple):
12:42:15             try:
12:42:15                 connect, read = timeout
12:42:15                 timeout = TimeoutSauce(connect=connect, read=read)
12:42:15             except ValueError:
12:42:15                 raise ValueError(
12:42:15                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
12:42:15                     f"or a single float to set both timeouts to the same value."
12:42:15                 )
12:42:15         elif isinstance(timeout, TimeoutSauce):
12:42:15             pass
12:42:15         else:
12:42:15             timeout = TimeoutSauce(connect=timeout, read=timeout)
12:42:15 
12:42:15         try:
12:42:15             resp = conn.urlopen(
12:42:15                 method=request.method,
12:42:15                 url=url,
12:42:15                 body=request.body,
12:42:15                 headers=request.headers,
12:42:15                 redirect=False,
12:42:15                 assert_same_host=False,
12:42:15                 preload_content=False,
12:42:15                 decode_content=False,
12:42:15                 retries=self.max_retries,
12:42:15                 timeout=timeout,
12:42:15                 chunked=chunked,
12:42:15             )
12:42:15 
12:42:15         except (ProtocolError, OSError) as err:
12:42:15             raise ConnectionError(err, request=request)
12:42:15 
12:42:15         except MaxRetryError as e:
12:42:15             if isinstance(e.reason, ConnectTimeoutError):
12:42:15                 # TODO: Remove this in 3.0.0: see #2811
12:42:15                 if not isinstance(e.reason, NewConnectionError):
12:42:15                     raise ConnectTimeout(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, ResponseError):
12:42:15                 raise RetryError(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, _ProxyError):
12:42:15                 raise ProxyError(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, _SSLError):
12:42:15                 # This branch is for urllib3 v1.22 and later.
12:42:15                 raise SSLError(e, request=request)
12:42:15 
12:42:15 >           raise ConnectionError(e, request=request)
12:42:15 E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/network-topology:network-topology/topology=topology-netconf/node=ROADMA01?content=nonconfig (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:677: ConnectionError
12:42:15 ----------------------------- Captured stdout call -----------------------------
12:42:15 execution of test_20_rdm_device_disconnected
12:42:15 _________ TestTransportPCEPortmapping.test_21_rdm_device_not_connected _________
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15 
12:42:15         :return: New socket connection.
12:42:15         """
12:42:15         try:
12:42:15 >           sock = connection.create_connection(
12:42:15                 (self._dns_host, self.port),
12:42:15                 self.timeout,
12:42:15                 source_address=self.source_address,
12:42:15                 socket_options=self.socket_options,
12:42:15             )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:204: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:85: in create_connection
12:42:15     raise err
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 address = ('localhost', 8191), timeout = 30, source_address = None
12:42:15 socket_options = [(6, 1, 1)]
12:42:15 
12:42:15     def create_connection(
12:42:15         address: tuple[str, int],
12:42:15         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
12:42:15         source_address: tuple[str, int] | None = None,
12:42:15         socket_options: _TYPE_SOCKET_OPTIONS | None = None,
12:42:15     ) -> socket.socket:
12:42:15         """Connect to *address* and return the socket object.
12:42:15 
12:42:15         Convenience function. Connect to *address* (a 2-tuple ``(host,
12:42:15         port)``) and return the socket object. Passing the optional
12:42:15         *timeout* parameter will set the timeout on the socket instance
12:42:15         before attempting to connect. If no *timeout* is supplied, the
12:42:15         global default timeout setting returned by :func:`socket.getdefaulttimeout`
12:42:15         is used. If *source_address* is set it must be a tuple of (host, port)
12:42:15         for the socket to bind as a source address before making the connection.
12:42:15         An host of '' or port 0 tells the OS to use the default.
12:42:15         """
12:42:15 
12:42:15         host, port = address
12:42:15         if host.startswith("["):
12:42:15             host = host.strip("[]")
12:42:15         err = None
12:42:15 
12:42:15         # Using the value from allowed_gai_family() in the context of getaddrinfo lets
12:42:15         # us select whether to work with IPv4 DNS records, IPv6 records, or both.
12:42:15         # The original create_connection function always returns all records.
12:42:15         family = allowed_gai_family()
12:42:15 
12:42:15         try:
12:42:15             host.encode("idna")
12:42:15         except UnicodeError:
12:42:15             raise LocationParseError(f"'{host}', label empty or too long") from None
12:42:15 
12:42:15         for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
12:42:15             af, socktype, proto, canonname, sa = res
12:42:15             sock = None
12:42:15             try:
12:42:15                 sock = socket.socket(af, socktype, proto)
12:42:15 
12:42:15                 # If provided, set socket level options before connecting.
12:42:15                 _set_socket_options(sock, socket_options)
12:42:15 
12:42:15                 if timeout is not _DEFAULT_TIMEOUT:
12:42:15                     sock.settimeout(timeout)
12:42:15                 if source_address:
12:42:15                     sock.bind(source_address)
12:42:15 >               sock.connect(sa)
12:42:15 E               ConnectionRefusedError: [Errno 111] Connection refused
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/connection.py:73: ConnectionRefusedError
12:42:15 
12:42:15 The above exception was the direct cause of the following exception:
12:42:15 
12:42:15 self = 
12:42:15 method = 'GET'
12:42:15 url = '/rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info'
12:42:15 body = None
12:42:15 headers = {'User-Agent': 'python-requests/2.32.5', 'Accept-Encoding': 'gzip, deflate', 'Accept': 'application/json', 'Connection': 'keep-alive', 'Content-Type': 'application/json', 'Authorization': 'Basic YWRtaW46YWRtaW4='}
12:42:15 retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
12:42:15 redirect = False, assert_same_host = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), pool_timeout = None
12:42:15 release_conn = False, chunked = False, body_pos = None, preload_content = False
12:42:15 decode_content = False, response_kw = {}
12:42:15 parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info', query=None, fragment=None)
12:42:15 destination_scheme = None, conn = None, release_this_conn = True
12:42:15 http_tunnel_required = False, err = None, clean_exit = False
12:42:15 
12:42:15     def urlopen( # type: ignore[override]
12:42:15         self,
12:42:15         method: str,
12:42:15         url: str,
12:42:15         body: _TYPE_BODY | None = None,
12:42:15         headers: typing.Mapping[str, str] | None = None,
12:42:15         retries: Retry | bool | int | None = None,
12:42:15         redirect: bool = True,
12:42:15         assert_same_host: bool = True,
12:42:15         timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
12:42:15         pool_timeout: int | None = None,
12:42:15         release_conn: bool | None = None,
12:42:15         chunked: bool = False,
12:42:15         body_pos: _TYPE_BODY_POSITION | None = None,
12:42:15         preload_content: bool = True,
12:42:15         decode_content: bool = True,
12:42:15         **response_kw: typing.Any,
12:42:15     ) -> BaseHTTPResponse:
12:42:15         """
12:42:15         Get a connection from the pool and perform an HTTP request. This is the
12:42:15         lowest level call for making a request, so you'll need to specify all
12:42:15         the raw details.
12:42:15 
12:42:15         .. note::
12:42:15 
12:42:15             More commonly, it's appropriate to use a convenience method
12:42:15             such as :meth:`request`.
12:42:15 
12:42:15         .. note::
12:42:15 
12:42:15             `release_conn` will only behave as expected if
12:42:15             `preload_content=False` because we want to make
12:42:15             `preload_content=False` the default behaviour someday soon without
12:42:15             breaking backwards compatibility.
12:42:15 
12:42:15         :param method:
12:42:15             HTTP request method (such as GET, POST, PUT, etc.)
12:42:15 
12:42:15         :param url:
12:42:15             The URL to perform the request on.
12:42:15 
12:42:15         :param body:
12:42:15             Data to send in the request body, either :class:`str`, :class:`bytes`,
12:42:15             an iterable of :class:`str`/:class:`bytes`, or a file-like object.
12:42:15 
12:42:15         :param headers:
12:42:15             Dictionary of custom headers to send, such as User-Agent,
12:42:15             If-None-Match, etc. If None, pool headers are used. If provided,
12:42:15             these headers completely replace any pool-specific headers.
12:42:15 
12:42:15         :param retries:
12:42:15             Configure the number of retries to allow before raising a
12:42:15             :class:`~urllib3.exceptions.MaxRetryError` exception.
12:42:15 
12:42:15             If ``None`` (default) will retry 3 times, see ``Retry.DEFAULT``. Pass a
12:42:15             :class:`~urllib3.util.retry.Retry` object for fine-grained control
12:42:15             over different types of retries.
12:42:15             Pass an integer number to retry connection errors that many times,
12:42:15             but no other types of errors. Pass zero to never retry.
12:42:15 
12:42:15             If ``False``, then retries are disabled and any exception is raised
12:42:15             immediately. Also, instead of raising a MaxRetryError on redirects,
12:42:15             the redirect response will be returned.
12:42:15 
12:42:15         :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
12:42:15 
12:42:15         :param redirect:
12:42:15             If True, automatically handle redirects (status codes 301, 302,
12:42:15             303, 307, 308). Each redirect counts as a retry. Disabling retries
12:42:15             will disable redirect, too.
12:42:15 
12:42:15         :param assert_same_host:
12:42:15             If ``True``, will make sure that the host of the pool requests is
12:42:15             consistent else will raise HostChangedError. When ``False``, you can
12:42:15             use the pool on an HTTP proxy and request foreign hosts.
12:42:15 
12:42:15         :param timeout:
12:42:15             If specified, overrides the default timeout for this one
12:42:15             request. It may be a float (in seconds) or an instance of
12:42:15             :class:`urllib3.util.Timeout`.
12:42:15 
12:42:15         :param pool_timeout:
12:42:15             If set and the pool is set to block=True, then this method will
12:42:15             block for ``pool_timeout`` seconds and raise EmptyPoolError if no
12:42:15             connection is available within the time period.
12:42:15 
12:42:15         :param bool preload_content:
12:42:15             If True, the response's body will be preloaded into memory.
12:42:15 
12:42:15         :param bool decode_content:
12:42:15             If True, will attempt to decode the body based on the
12:42:15             'content-encoding' header.
12:42:15 
12:42:15         :param release_conn:
12:42:15             If False, then the urlopen call will not release the connection
12:42:15             back into the pool once a response is received (but will release if
12:42:15             you read the entire contents of the response such as when
12:42:15             `preload_content=True`). This is useful if you're not preloading
12:42:15             the response's content immediately. You will need to call
12:42:15             ``r.release_conn()`` on the response ``r`` to return the connection
12:42:15             back into the pool. If None, it takes the value of ``preload_content``
12:42:15             which defaults to ``True``.
12:42:15 
12:42:15         :param bool chunked:
12:42:15             If True, urllib3 will send the body using chunked transfer
12:42:15             encoding. Otherwise, urllib3 will send the body using the standard
12:42:15             content-length form. Defaults to False.
12:42:15 
12:42:15         :param int body_pos:
12:42:15             Position to seek to in file-like body in the event of a retry or
12:42:15             redirect. Typically this won't need to be set because urllib3 will
12:42:15             auto-populate the value when needed.
12:42:15         """
12:42:15         parsed_url = parse_url(url)
12:42:15         destination_scheme = parsed_url.scheme
12:42:15 
12:42:15         if headers is None:
12:42:15             headers = self.headers
12:42:15 
12:42:15         if not isinstance(retries, Retry):
12:42:15             retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
12:42:15 
12:42:15         if release_conn is None:
12:42:15             release_conn = preload_content
12:42:15 
12:42:15         # Check host
12:42:15         if assert_same_host and not self.is_same_host(url):
12:42:15             raise HostChangedError(self, url, retries)
12:42:15 
12:42:15         # Ensure that the URL we're connecting to is properly encoded
12:42:15         if url.startswith("/"):
12:42:15             url = to_str(_encode_target(url))
12:42:15         else:
12:42:15             url = to_str(parsed_url.url)
12:42:15 
12:42:15         conn = None
12:42:15 
12:42:15         # Track whether `conn` needs to be released before
12:42:15         # returning/raising/recursing. Update this variable if necessary, and
12:42:15         # leave `release_conn` constant throughout the function. That way, if
12:42:15         # the function recurses, the original value of `release_conn` will be
12:42:15         # passed down into the recursive call, and its value will be respected.
12:42:15         #
12:42:15         # See issue #651 [1] for details.
12:42:15         #
12:42:15         # [1]
12:42:15         release_this_conn = release_conn
12:42:15 
12:42:15         http_tunnel_required = connection_requires_http_tunnel(
12:42:15             self.proxy, self.proxy_config, destination_scheme
12:42:15         )
12:42:15 
12:42:15         # Merge the proxy headers. Only done when not using HTTP CONNECT. We
12:42:15         # have to copy the headers dict so we can safely change it without those
12:42:15         # changes being reflected in anyone else's copy.
12:42:15         if not http_tunnel_required:
12:42:15             headers = headers.copy() # type: ignore[attr-defined]
12:42:15             headers.update(self.proxy_headers) # type: ignore[union-attr]
12:42:15 
12:42:15         # Must keep the exception bound to a separate variable or else Python 3
12:42:15         # complains about UnboundLocalError.
12:42:15         err = None
12:42:15 
12:42:15         # Keep track of whether we cleanly exited the except block. This
12:42:15         # ensures we do proper cleanup in finally.
12:42:15         clean_exit = False
12:42:15 
12:42:15         # Rewind body position, if needed. Record current position
12:42:15         # for future rewinds in the event of a redirect/retry.
12:42:15         body_pos = set_file_position(body, body_pos)
12:42:15 
12:42:15         try:
12:42:15             # Request a connection from the queue.
12:42:15             timeout_obj = self._get_timeout(timeout)
12:42:15             conn = self._get_conn(timeout=pool_timeout)
12:42:15 
12:42:15             conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment]
12:42:15 
12:42:15             # Is this a closed/new connection that requires CONNECT tunnelling?
12:42:15             if self.proxy is not None and http_tunnel_required and conn.is_closed:
12:42:15                 try:
12:42:15                     self._prepare_proxy(conn)
12:42:15                 except (BaseSSLError, OSError, SocketTimeout) as e:
12:42:15                     self._raise_timeout(
12:42:15                         err=e, url=self.proxy.url, timeout_value=conn.timeout
12:42:15                     )
12:42:15                     raise
12:42:15 
12:42:15             # If we're going to release the connection in ``finally:``, then
12:42:15             # the response doesn't need to know about the connection. Otherwise
12:42:15             # it will also try to release it and we'll have a double-release
12:42:15             # mess.
12:42:15             response_conn = conn if not release_conn else None
12:42:15 
12:42:15             # Make the request on the HTTPConnection object
12:42:15 >           response = self._make_request(
12:42:15                 conn,
12:42:15                 method,
12:42:15                 url,
12:42:15                 timeout=timeout_obj,
12:42:15                 body=body,
12:42:15                 headers=headers,
12:42:15                 chunked=chunked,
12:42:15                 retries=retries,
12:42:15                 response_conn=response_conn,
12:42:15                 preload_content=preload_content,
12:42:15                 decode_content=decode_content,
12:42:15                 **response_kw,
12:42:15             )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:787: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:493: in _make_request
12:42:15     conn.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:500: in request
12:42:15     self.endheaders()
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1298: in endheaders
12:42:15     self._send_output(message_body, encode_chunked=encode_chunked)
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:1058: in _send_output
12:42:15     self.send(msg)
12:42:15 /opt/pyenv/versions/3.11.10/lib/python3.11/http/client.py:996: in send
12:42:15     self.connect()
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:331: in connect
12:42:15     self.sock = self._new_conn()
12:42:15     ^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def _new_conn(self) -> socket.socket:
12:42:15         """Establish a socket connection and set nodelay settings on it.
12:42:15 
12:42:15         :return: New socket connection.
12:42:15         """
12:42:15         try:
12:42:15             sock = connection.create_connection(
12:42:15                 (self._dns_host, self.port),
12:42:15                 self.timeout,
12:42:15                 source_address=self.source_address,
12:42:15                 socket_options=self.socket_options,
12:42:15             )
12:42:15         except socket.gaierror as e:
12:42:15             raise NameResolutionError(self.host, self, e) from e
12:42:15         except SocketTimeout as e:
12:42:15             raise ConnectTimeoutError(
12:42:15                 self,
12:42:15                 f"Connection to {self.host} timed out. (connect timeout={self.timeout})",
12:42:15             ) from e
12:42:15 
12:42:15         except OSError as e:
12:42:15 >           raise NewConnectionError(
12:42:15                 self, f"Failed to establish a new connection: {e}"
12:42:15             ) from e
12:42:15 E           urllib3.exceptions.NewConnectionError: HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connection.py:219: NewConnectionError
12:42:15 
12:42:15 The above exception was the direct cause of the following exception:
12:42:15 
12:42:15 self = 
12:42:15 request = , stream = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
12:42:15 proxies = OrderedDict()
12:42:15 
12:42:15     def send(
12:42:15         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
12:42:15     ):
12:42:15         """Sends PreparedRequest object. Returns Response object.
12:42:15 
12:42:15         :param request: The :class:`PreparedRequest ` being sent.
12:42:15         :param stream: (optional) Whether to stream the request content.
12:42:15         :param timeout: (optional) How long to wait for the server to send
12:42:15             data before giving up, as a float, or a :ref:`(connect timeout,
12:42:15             read timeout) ` tuple.
12:42:15         :type timeout: float or tuple or urllib3 Timeout object
12:42:15         :param verify: (optional) Either a boolean, in which case it controls whether
12:42:15             we verify the server's TLS certificate, or a string, in which case it
12:42:15             must be a path to a CA bundle to use
12:42:15         :param cert: (optional) Any user-provided SSL certificate to be trusted.
12:42:15         :param proxies: (optional) The proxies dictionary to apply to the request.
12:42:15         :rtype: requests.Response
12:42:15         """
12:42:15 
12:42:15         try:
12:42:15             conn = self.get_connection_with_tls_context(
12:42:15                 request, verify, proxies=proxies, cert=cert
12:42:15             )
12:42:15         except LocationValueError as e:
12:42:15             raise InvalidURL(e, request=request)
12:42:15 
12:42:15         self.cert_verify(conn, request.url, verify, cert)
12:42:15         url = self.request_url(request, proxies)
12:42:15         self.add_headers(
12:42:15             request,
12:42:15             stream=stream,
12:42:15             timeout=timeout,
12:42:15             verify=verify,
12:42:15             cert=cert,
12:42:15             proxies=proxies,
12:42:15         )
12:42:15 
12:42:15         chunked = not (request.body is None or "Content-Length" in request.headers)
12:42:15 
12:42:15         if isinstance(timeout, tuple):
12:42:15             try:
12:42:15                 connect, read = timeout
12:42:15                 timeout = TimeoutSauce(connect=connect, read=read)
12:42:15             except ValueError:
12:42:15                 raise ValueError(
12:42:15                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
12:42:15                     f"or a single float to set both timeouts to the same value."
12:42:15                 )
12:42:15         elif isinstance(timeout, TimeoutSauce):
12:42:15             pass
12:42:15         else:
12:42:15             timeout = TimeoutSauce(connect=timeout, read=timeout)
12:42:15 
12:42:15         try:
12:42:15 >           resp = conn.urlopen(
12:42:15                 method=request.method,
12:42:15                 url=url,
12:42:15                 body=request.body,
12:42:15                 headers=request.headers,
12:42:15                 redirect=False,
12:42:15                 assert_same_host=False,
12:42:15                 preload_content=False,
12:42:15                 decode_content=False,
12:42:15                 retries=self.max_retries,
12:42:15                 timeout=timeout,
12:42:15                 chunked=chunked,
12:42:15             )
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:644: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/connectionpool.py:841: in urlopen
12:42:15     retries = retries.increment(
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 self = Retry(total=0, connect=None, read=False, redirect=None, status=None)
12:42:15 method = 'GET'
12:42:15 url = '/rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info'
12:42:15 response = None
12:42:15 error = NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused")
12:42:15 _pool = 
12:42:15 _stacktrace = 
12:42:15 
12:42:15     def increment(
12:42:15         self,
12:42:15         method: str | None = None,
12:42:15         url: str | None = None,
12:42:15         response: BaseHTTPResponse | None = None,
12:42:15         error: Exception | None = None,
12:42:15         _pool: ConnectionPool | None = None,
12:42:15         _stacktrace: TracebackType | None = None,
12:42:15     ) -> Self:
12:42:15         """Return a new Retry object with incremented retry counters.
12:42:15 
12:42:15         :param response: A response object, or None, if the server did not
12:42:15             return a response.
12:42:15         :type response: :class:`~urllib3.response.BaseHTTPResponse`
12:42:15         :param Exception error: An error encountered during the request, or
12:42:15             None if the response was received successfully.
12:42:15 
12:42:15         :return: A new ``Retry`` object.
12:42:15         """
12:42:15         if self.total is False and error:
12:42:15             # Disabled, indicate to re-raise the error.
12:42:15             raise reraise(type(error), error, _stacktrace)
12:42:15 
12:42:15         total = self.total
12:42:15         if total is not None:
12:42:15             total -= 1
12:42:15 
12:42:15         connect = self.connect
12:42:15         read = self.read
12:42:15         redirect = self.redirect
12:42:15         status_count = self.status
12:42:15         other = self.other
12:42:15         cause = "unknown"
12:42:15         status = None
12:42:15         redirect_location = None
12:42:15 
12:42:15         if error and self._is_connection_error(error):
12:42:15             # Connect retry?
12:42:15             if connect is False:
12:42:15                 raise reraise(type(error), error, _stacktrace)
12:42:15             elif connect is not None:
12:42:15                 connect -= 1
12:42:15 
12:42:15         elif error and self._is_read_error(error):
12:42:15             # Read retry?
12:42:15             if read is False or method is None or not self._is_method_retryable(method):
12:42:15                 raise reraise(type(error), error, _stacktrace)
12:42:15             elif read is not None:
12:42:15                 read -= 1
12:42:15 
12:42:15         elif error:
12:42:15             # Other retry?
12:42:15             if other is not None:
12:42:15                 other -= 1
12:42:15 
12:42:15         elif response and response.get_redirect_location():
12:42:15             # Redirect retry?
12:42:15             if redirect is not None:
12:42:15                 redirect -= 1
12:42:15             cause = "too many redirects"
12:42:15             response_redirect_location = response.get_redirect_location()
12:42:15             if response_redirect_location:
12:42:15                 redirect_location = response_redirect_location
12:42:15             status = response.status
12:42:15 
12:42:15         else:
12:42:15             # Incrementing because of a server error like a 500 in
12:42:15             # status_forcelist and the given method is in the allowed_methods
12:42:15             cause = ResponseError.GENERIC_ERROR
12:42:15             if response and response.status:
12:42:15                 if status_count is not None:
12:42:15                     status_count -= 1
12:42:15                 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status)
12:42:15             status = response.status
12:42:15 
12:42:15         history = self.history + (
12:42:15             RequestHistory(method, url, error, status, redirect_location),
12:42:15         )
12:42:15 
12:42:15         new_retry = self.new(
12:42:15             total=total,
12:42:15             connect=connect,
12:42:15             read=read,
12:42:15             redirect=redirect,
12:42:15             status=status_count,
12:42:15             other=other,
12:42:15             history=history,
12:42:15         )
12:42:15 
12:42:15         if new_retry.is_exhausted():
12:42:15             reason = error or ResponseError(cause)
12:42:15 >           raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
12:42:15             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 E           urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/urllib3/util/retry.py:535: MaxRetryError
12:42:15 
12:42:15 During handling of the above exception, another exception occurred:
12:42:15 
12:42:15 self = 
12:42:15 
12:42:15     def test_21_rdm_device_not_connected(self):
12:42:15 >       response = test_utils.get_portmapping_node_attr("ROADMA01", "node-info", None)
12:42:15         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 
12:42:15 transportpce_tests/1.2.1/test01_portmapping.py:225: 
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 transportpce_tests/common/test_utils.py:519: in get_portmapping_node_attr
12:42:15     response = get_request(target_url)
12:42:15     ^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 transportpce_tests/common/test_utils.py:117: in get_request
12:42:15     return requests.request(
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/api.py:59: in request
12:42:15     return session.request(method=method, url=url, **kwargs)
12:42:15     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:589: in request
12:42:15     resp = self.send(prep, **send_kwargs)
12:42:15     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/sessions.py:703: in send
12:42:15     r = adapter.send(request, **kwargs)
12:42:15     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
12:42:15 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
12:42:15 
12:42:15 self = 
12:42:15 request = , stream = False
12:42:15 timeout = Timeout(connect=30, read=30, total=None), verify = True, cert = None
12:42:15 proxies = OrderedDict()
12:42:15 
12:42:15     def send(
12:42:15         self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None
12:42:15     ):
12:42:15         """Sends PreparedRequest object. Returns Response object.
12:42:15 
12:42:15         :param request: The :class:`PreparedRequest ` being sent.
12:42:15         :param stream: (optional) Whether to stream the request content.
12:42:15         :param timeout: (optional) How long to wait for the server to send
12:42:15             data before giving up, as a float, or a :ref:`(connect timeout,
12:42:15             read timeout) ` tuple.
12:42:15         :type timeout: float or tuple or urllib3 Timeout object
12:42:15         :param verify: (optional) Either a boolean, in which case it controls whether
12:42:15             we verify the server's TLS certificate, or a string, in which case it
12:42:15             must be a path to a CA bundle to use
12:42:15         :param cert: (optional) Any user-provided SSL certificate to be trusted.
12:42:15         :param proxies: (optional) The proxies dictionary to apply to the request.
12:42:15         :rtype: requests.Response
12:42:15         """
12:42:15 
12:42:15         try:
12:42:15             conn = self.get_connection_with_tls_context(
12:42:15                 request, verify, proxies=proxies, cert=cert
12:42:15             )
12:42:15         except LocationValueError as e:
12:42:15             raise InvalidURL(e, request=request)
12:42:15 
12:42:15         self.cert_verify(conn, request.url, verify, cert)
12:42:15         url = self.request_url(request, proxies)
12:42:15         self.add_headers(
12:42:15             request,
12:42:15             stream=stream,
12:42:15             timeout=timeout,
12:42:15             verify=verify,
12:42:15             cert=cert,
12:42:15             proxies=proxies,
12:42:15         )
12:42:15 
12:42:15         chunked = not (request.body is None or "Content-Length" in request.headers)
12:42:15 
12:42:15         if isinstance(timeout, tuple):
12:42:15             try:
12:42:15                 connect, read = timeout
12:42:15                 timeout = TimeoutSauce(connect=connect, read=read)
12:42:15             except ValueError:
12:42:15                 raise ValueError(
12:42:15                     f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, "
12:42:15                     f"or a single float to set both timeouts to the same value."
12:42:15                 )
12:42:15         elif isinstance(timeout, TimeoutSauce):
12:42:15             pass
12:42:15         else:
12:42:15             timeout = TimeoutSauce(connect=timeout, read=timeout)
12:42:15 
12:42:15         try:
12:42:15             resp = conn.urlopen(
12:42:15                 method=request.method,
12:42:15                 url=url,
12:42:15                 body=request.body,
12:42:15                 headers=request.headers,
12:42:15                 redirect=False,
12:42:15                 assert_same_host=False,
12:42:15                 preload_content=False,
12:42:15                 decode_content=False,
12:42:15                 retries=self.max_retries,
12:42:15                 timeout=timeout,
12:42:15                 chunked=chunked,
12:42:15             )
12:42:15 
12:42:15         except (ProtocolError, OSError) as err:
12:42:15             raise ConnectionError(err, request=request)
12:42:15 
12:42:15         except MaxRetryError as e:
12:42:15             if isinstance(e.reason, ConnectTimeoutError):
12:42:15                 # TODO: Remove this in 3.0.0: see #2811
12:42:15                 if not isinstance(e.reason, NewConnectionError):
12:42:15                     raise ConnectTimeout(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, ResponseError):
12:42:15                 raise RetryError(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, _ProxyError):
12:42:15                 raise ProxyError(e, request=request)
12:42:15 
12:42:15             if isinstance(e.reason, _SSLError):
12:42:15                 # This branch is for urllib3 v1.22 and later.
12:42:15                 raise SSLError(e, request=request)
12:42:15 
12:42:15 >           raise ConnectionError(e, request=request)
12:42:15 E           requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8191): Max retries exceeded with url: /rests/data/transportpce-portmapping:network/nodes=ROADMA01/node-info (Caused by NewConnectionError("HTTPConnection(host='localhost', port=8191): Failed to establish a new connection: [Errno 111] Connection refused"))
12:42:15 
12:42:15 ../.tox/tests121/lib/python3.11/site-packages/requests/adapters.py:677: ConnectionError
12:42:15 ----------------------------- Captured stdout call -----------------------------
12:42:15 execution of test_21_rdm_device_not_connected
12:42:15 --------------------------- Captured stdout teardown ---------------------------
12:42:15 all processes killed
12:42:15 ODL log file stored
12:42:15 =========================== short test summary info ============================
12:42:15 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TestTransportPCEPortmapping::test_08_xpdr_device_connected
12:42:15 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TestTransportPCEPortmapping::test_09_xpdr_portmapping_info
12:42:15 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TestTransportPCEPortmapping::test_10_xpdr_portmapping_NETWORK1
12:42:15 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TestTransportPCEPortmapping::test_11_xpdr_portmapping_NETWORK2
12:42:15 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TestTransportPCEPortmapping::test_12_xpdr_portmapping_CLIENT1
12:42:15 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TestTransportPCEPortmapping::test_13_xpdr_portmapping_CLIENT2
12:42:15 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TestTransportPCEPortmapping::test_14_xpdr_portmapping_CLIENT3
12:42:15 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TestTransportPCEPortmapping::test_15_xpdr_portmapping_CLIENT4
12:42:15 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TestTransportPCEPortmapping::test_16_xpdr_device_disconnection
12:42:15 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TestTransportPCEPortmapping::test_17_xpdr_device_disconnected
12:42:15 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TestTransportPCEPortmapping::test_18_xpdr_device_not_connected
12:42:15 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TestTransportPCEPortmapping::test_19_rdm_device_disconnection
12:42:15 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TestTransportPCEPortmapping::test_20_rdm_device_disconnected
12:42:15 FAILED transportpce_tests/1.2.1/test01_portmapping.py::TestTransportPCEPortmapping::test_21_rdm_device_not_connected
12:42:15 14 failed, 7 passed in 472.42s (0:07:52)
12:42:15 tests200: FAIL ✖ in 7 minutes 31.38 seconds
12:42:15 tests121: exit 1 (472.77 seconds) /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 1.2.1 pid=8320
12:42:21 ................. [100%]
12:51:05 36 passed in 693.97s (0:11:33)
12:51:05 pytest -q transportpce_tests/tapi/test03_tapi_device_change_notifications.py
12:51:54 ....................................................................... [100%]
12:56:24 71 passed in 318.89s (0:05:18)
12:56:24 pytest -q transportpce_tests/tapi/test04_topo_extension.py
12:57:18 ................... [100%]
12:58:49 19 passed in 143.88s (0:02:23)
12:58:49 pytest -q transportpce_tests/tapi/test05_pce_tapi.py
13:00:52 ......................
[100%]
13:06:28 22 passed in 459.54s (0:07:39)
13:06:29 tests121: FAIL ✖ in 8 minutes 0.31 seconds
13:06:29 tests_tapi: OK ✔ in 32 minutes 14.01 seconds
13:06:29 tests221: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
13:06:36 tests221: freeze> python -m pip freeze --all
13:06:36 tests221: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3
13:06:36 tests221: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh 2.2.1
13:06:36 using environment variables from ./karaf221.env
13:06:36 pytest -q transportpce_tests/2.2.1/test01_portmapping.py
13:07:12 ................................... [100%]
13:07:52 35 passed in 76.34s (0:01:16)
13:07:52 pytest -q transportpce_tests/2.2.1/test02_topo_portmapping.py
13:08:25 ...... [100%]
13:08:39 6 passed in 46.39s
13:08:39 pytest -q transportpce_tests/2.2.1/test03_topology.py
13:09:23 ............................................ [100%]
13:13:58 44 passed in 318.89s (0:05:18)
13:13:58 pytest -q transportpce_tests/2.2.1/test04_otn_topology.py
13:14:35 ............ [100%]
13:15:00 12 passed in 60.88s (0:01:00)
13:15:00 pytest -q transportpce_tests/2.2.1/test05_flex_grid.py
13:15:30 ................ [100%]
13:16:59 16 passed in 119.06s (0:01:59)
13:16:59 pytest -q transportpce_tests/2.2.1/test06_renderer_service_path_nominal.py
13:17:33 ............................... [100%]
13:17:39 31 passed in 40.21s
13:17:39 pytest -q transportpce_tests/2.2.1/test07_otn_renderer.py
13:18:18 .......................... [100%]
13:19:14 26 passed in 94.33s (0:01:34)
13:19:14 pytest -q transportpce_tests/2.2.1/test08_otn_sh_renderer.py
13:19:53 ...................... [100%]
13:20:56 22 passed in 102.04s (0:01:42)
13:20:56 pytest -q transportpce_tests/2.2.1/test09_olm.py
13:21:39 ........................................ [100%]
13:24:01 40 passed in 184.27s (0:03:04)
13:24:01 pytest -q transportpce_tests/2.2.1/test11_otn_end2end.py
13:24:46 ........................................................................ [ 74%]
13:30:23 ......................... [100%]
13:35:15 97 passed in 673.62s (0:11:13)
13:35:15 pytest -q transportpce_tests/2.2.1/test12_end2end.py
13:35:56 ...................................................... [100%]
13:42:44 54 passed in 448.54s (0:07:28)
13:42:44 pytest -q transportpce_tests/2.2.1/test14_otn_switch_end2end.py
13:43:40 ........................................................................ [ 71%]
13:48:49 ............................. [100%]
13:50:59 101 passed in 494.55s (0:08:14)
13:50:59 pytest -q transportpce_tests/2.2.1/test15_otn_end2end_with_intermediate_switch.py
13:51:55 ........................................................................ [ 67%]
13:57:42 ................................... [100%]
14:07:02 107 passed in 963.51s (0:16:03)
14:07:02 pytest -q transportpce_tests/2.2.1/test16_freq_end2end.py
14:07:46 ............................................. [100%]
14:13:24 45 passed in 381.35s (0:06:21)
14:13:24 tests221: OK ✔ in 1 hour 6 minutes 55.59 seconds
14:13:24 tests_hybrid: install_deps> python -I -m pip install 'setuptools>=7.0' -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/requirements.txt -r /w/workspace/transportpce-tox-verify-transportpce-master/tests/test-requirements.txt
14:13:31 tests_hybrid: freeze> python -m pip freeze --all
14:13:31 tests_hybrid: bcrypt==5.0.0,certifi==2026.1.4,cffi==2.0.0,charset-normalizer==3.4.4,cryptography==46.0.5,dict2xml==1.7.8,idna==3.11,iniconfig==2.3.0,invoke==2.2.1,lxml==6.0.2,netconf-client==3.5.0,packaging==26.0,paramiko==4.0.0,pip==26.0.1,pluggy==1.6.0,psutil==7.2.2,pycparser==3.0,Pygments==2.19.2,PyNaCl==1.6.2,pytest==9.0.2,requests==2.32.5,setuptools==82.0.0,urllib3==2.6.3
14:13:31 tests_hybrid: commands[0] /w/workspace/transportpce-tox-verify-transportpce-master/tests> ./launch_tests.sh hybrid
14:13:31 using environment variables from ./karaf221.env
14:13:31 pytest -q transportpce_tests/hybrid/test01_device_change_notifications.py
14:14:12 ................................................... [100%]
14:18:59 51 passed in 327.05s (0:05:27)
14:18:59 pytest -q transportpce_tests/hybrid/test02_B100G_end2end.py
14:19:42 ........................................................................ [ 66%]
14:24:04 ..................................... [100%]
14:29:10 109 passed in 610.57s (0:10:10)
14:29:10 pytest -q transportpce_tests/hybrid/test03_autonomous_reroute.py
14:29:58 .....................................................
[100%]
14:33:31 53 passed in 260.87s (0:04:20)
14:33:31 buildcontroller: FAIL code 1 (25.99=setup[8.74]+cmd[17.24] seconds)
14:33:31 sims: OK (13.16=setup[9.75]+cmd[3.42] seconds)
14:33:31 build_karaf_tests121: OK (67.54=setup[8.12]+cmd[59.42] seconds)
14:33:31 testsPCE: FAIL code 1 (2464.29=setup[62.40]+cmd[2401.89] seconds)
14:33:31 tests121: FAIL code 1 (480.31=setup[7.53]+cmd[472.77] seconds)
14:33:31 build_karaf_tests221: OK (73.51=setup[9.70]+cmd[63.81] seconds)
14:33:31 tests_tapi: OK (1934.01=setup[7.50]+cmd[1926.51] seconds)
14:33:31 tests221: OK (4015.59=setup[7.33]+cmd[4008.26] seconds)
14:33:31 build_karaf_tests71: OK (55.41=setup[13.20]+cmd[42.21] seconds)
14:33:31 tests71: OK (437.98=setup[11.12]+cmd[426.85] seconds)
14:33:31 build_karaf_tests200: OK (67.91=setup[9.41]+cmd[58.50] seconds)
14:33:31 tests200: FAIL code 1 (451.38=setup[7.54]+cmd[443.84] seconds)
14:33:31 tests_hybrid: OK (1206.66=setup[7.23]+cmd[1199.43] seconds)
14:33:31 buildlighty: FAIL code 9 (36.80=setup[18.35]+cmd[18.46] seconds)
14:33:31 docs: OK (29.94=setup[27.92]+cmd[2.02] seconds)
14:33:31 docs-linkcheck: OK (32.84=setup[28.18]+cmd[4.66] seconds)
14:33:31 checkbashisms: OK (3.16=setup[1.91]+cmd[0.00,0.04,1.20] seconds)
14:33:31 pre-commit: OK (53.03=setup[2.73]+cmd[0.00,0.01,36.25,14.04] seconds)
14:33:31 pylint: OK (40.13=setup[3.27]+cmd[36.86] seconds)
14:33:31 evaluation failed :( (9732.21 seconds)
14:33:31 + tox_status=1
14:33:31 + echo '---> Completed tox runs'
14:33:31 ---> Completed tox runs
14:33:31 + for i in .tox/*/log
14:33:31 ++ echo .tox/build_karaf_tests121/log
14:33:31 ++ awk -F/ '{print $2}'
14:33:31 + tox_env=build_karaf_tests121
14:33:31 + cp -r .tox/build_karaf_tests121/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests121
14:33:31 + for i in .tox/*/log
14:33:31 ++ echo .tox/build_karaf_tests200/log
14:33:31 ++ awk -F/ '{print $2}'
14:33:31 + tox_env=build_karaf_tests200
14:33:31 + cp -r .tox/build_karaf_tests200/log
/w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests200
14:33:31 + for i in .tox/*/log
14:33:31 ++ echo .tox/build_karaf_tests221/log
14:33:31 ++ awk -F/ '{print $2}'
14:33:31 + tox_env=build_karaf_tests221
14:33:31 + cp -r .tox/build_karaf_tests221/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests221
14:33:31 + for i in .tox/*/log
14:33:31 ++ echo .tox/build_karaf_tests71/log
14:33:31 ++ awk -F/ '{print $2}'
14:33:31 + tox_env=build_karaf_tests71
14:33:31 + cp -r .tox/build_karaf_tests71/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/build_karaf_tests71
14:33:31 + for i in .tox/*/log
14:33:31 ++ echo .tox/buildcontroller/log
14:33:31 ++ awk -F/ '{print $2}'
14:33:31 + tox_env=buildcontroller
14:33:31 + cp -r .tox/buildcontroller/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/buildcontroller
14:33:31 + for i in .tox/*/log
14:33:31 ++ echo .tox/buildlighty/log
14:33:31 ++ awk -F/ '{print $2}'
14:33:31 + tox_env=buildlighty
14:33:31 + cp -r .tox/buildlighty/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/buildlighty
14:33:31 + for i in .tox/*/log
14:33:31 ++ echo .tox/checkbashisms/log
14:33:31 ++ awk -F/ '{print $2}'
14:33:31 + tox_env=checkbashisms
14:33:31 + cp -r .tox/checkbashisms/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/checkbashisms
14:33:31 + for i in .tox/*/log
14:33:31 ++ echo .tox/docs-linkcheck/log
14:33:31 ++ awk -F/ '{print $2}'
14:33:31 + tox_env=docs-linkcheck
14:33:31 + cp -r .tox/docs-linkcheck/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/docs-linkcheck
14:33:31 + for i in .tox/*/log
14:33:31 ++ echo .tox/docs/log
14:33:31 ++ awk -F/ '{print $2}'
14:33:31 + tox_env=docs
14:33:31 + cp -r .tox/docs/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/docs
14:33:31 + for i in .tox/*/log
14:33:31 ++ echo .tox/pre-commit/log
14:33:31 ++ awk -F/ '{print $2}'
14:33:31 + tox_env=pre-commit
14:33:31 + cp -r .tox/pre-commit/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/pre-commit
14:33:31 + for i in .tox/*/log
14:33:31 ++ echo .tox/pylint/log
14:33:31 ++ awk -F/ '{print $2}'
14:33:31 + tox_env=pylint
14:33:31 + cp -r .tox/pylint/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/pylint
14:33:31 + for i in .tox/*/log
14:33:31 ++ echo .tox/sims/log
14:33:31 ++ awk -F/ '{print $2}'
14:33:31 + tox_env=sims
14:33:31 + cp -r .tox/sims/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/sims
14:33:31 + for i in .tox/*/log
14:33:31 ++ echo .tox/tests121/log
14:33:31 ++ awk -F/ '{print $2}'
14:33:31 + tox_env=tests121
14:33:31 + cp -r .tox/tests121/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests121
14:33:31 + for i in .tox/*/log
14:33:31 ++ echo .tox/tests200/log
14:33:31 ++ awk -F/ '{print $2}'
14:33:31 + tox_env=tests200
14:33:31 + cp -r .tox/tests200/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests200
14:33:31 + for i in .tox/*/log
14:33:31 ++ echo .tox/tests221/log
14:33:31 ++ awk -F/ '{print $2}'
14:33:31 + tox_env=tests221
14:33:31 + cp -r .tox/tests221/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests221
14:33:31 + for i in .tox/*/log
14:33:31 ++ echo .tox/tests71/log
14:33:31 ++ awk -F/ '{print $2}'
14:33:31 + tox_env=tests71
14:33:31 + cp -r .tox/tests71/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests71
14:33:31 + for i in .tox/*/log
14:33:31 ++ echo .tox/testsPCE/log
14:33:31 ++ awk -F/ '{print $2}'
14:33:31 + tox_env=testsPCE
14:33:31 + cp -r .tox/testsPCE/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/testsPCE
14:33:31 + for i in .tox/*/log
14:33:31 ++ echo .tox/tests_hybrid/log
14:33:31 ++ awk -F/ '{print $2}'
14:33:31 + tox_env=tests_hybrid
14:33:31 + cp -r .tox/tests_hybrid/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests_hybrid
14:33:31 + for i in .tox/*/log
14:33:31 ++ echo .tox/tests_tapi/log
14:33:31 ++ awk -F/ '{print $2}'
14:33:31 + tox_env=tests_tapi
14:33:31 + cp -r .tox/tests_tapi/log /w/workspace/transportpce-tox-verify-transportpce-master/archives/tox/tests_tapi
14:33:31 + DOC_DIR=docs/_build/html
14:33:31 + [[ -d docs/_build/html ]]
14:33:31 + echo '---> Archiving generated docs'
14:33:31 ---> Archiving generated docs
14:33:31 + mv docs/_build/html /w/workspace/transportpce-tox-verify-transportpce-master/archives/docs
14:33:31 + echo '---> tox-run.sh ends'
14:33:31 ---> tox-run.sh ends
14:33:31 + test 1 -eq 0
14:33:31 + exit 1
14:33:31 ++ '[' 1 = 1 ']'
14:33:31 ++ '[' -x /usr/bin/clear_console ']'
14:33:31 ++ /usr/bin/clear_console -q
14:33:31 Build step 'Execute shell' marked build as failure
14:33:31 $ ssh-agent -k
14:33:31 unset SSH_AUTH_SOCK;
14:33:31 unset SSH_AGENT_PID;
14:33:31 echo Agent pid 1573 killed;
14:33:31 [ssh-agent] Stopped.
14:33:31 [PostBuildScript] - [INFO] Executing post build scripts.
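Editor's note: the tests121 and tests200 failures in this log all reduce to the same root cause, `[Errno 111] Connection refused` on `localhost:8191`, i.e. RESTCONF requests were issued while no controller was listening (consistent with `buildcontroller: FAIL code 1` in the tox summary above). A minimal sketch of a readiness guard that polls the port before the first request; the function name, host, port, and timeouts are illustrative and are not TransportPCE's actual `test_utils` API:

```python
import socket
import time


def wait_for_port(host: str, port: int, timeout: float = 60.0, interval: float = 1.0) -> bool:
    """Poll until a TCP port accepts connections, or the timeout expires.

    Hypothetical helper: returns True as soon as a connect() succeeds,
    False if the deadline passes first (e.g. the controller never came up).
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # create_connection raises OSError (e.g. ConnectionRefusedError)
            # while nothing is listening on (host, port).
            with socket.create_connection((host, port), timeout=interval):
                return True
        except OSError:
            time.sleep(interval)
    return False
```

A guard like `assert wait_for_port("localhost", 8191, timeout=120)` before the first request would turn 14 identical `MaxRetryError` tracebacks into one clear "controller never started" failure.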
14:33:31 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins3518906514468344076.sh 14:33:31 ---> sysstat.sh 14:33:32 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins826206397781733528.sh 14:33:32 ---> package-listing.sh 14:33:32 ++ tr '[:upper:]' '[:lower:]' 14:33:32 ++ facter osfamily 14:33:32 + OS_FAMILY=debian 14:33:32 + workspace=/w/workspace/transportpce-tox-verify-transportpce-master 14:33:32 + START_PACKAGES=/tmp/packages_start.txt 14:33:32 + END_PACKAGES=/tmp/packages_end.txt 14:33:32 + DIFF_PACKAGES=/tmp/packages_diff.txt 14:33:32 + PACKAGES=/tmp/packages_start.txt 14:33:32 + '[' /w/workspace/transportpce-tox-verify-transportpce-master ']' 14:33:32 + PACKAGES=/tmp/packages_end.txt 14:33:32 + case "${OS_FAMILY}" in 14:33:32 + dpkg -l 14:33:32 + grep '^ii' 14:33:32 + '[' -f /tmp/packages_start.txt ']' 14:33:32 + '[' -f /tmp/packages_end.txt ']' 14:33:32 + diff /tmp/packages_start.txt /tmp/packages_end.txt 14:33:32 + '[' /w/workspace/transportpce-tox-verify-transportpce-master ']' 14:33:32 + mkdir -p /w/workspace/transportpce-tox-verify-transportpce-master/archives/ 14:33:32 + cp -f /tmp/packages_diff.txt /tmp/packages_end.txt /tmp/packages_start.txt /w/workspace/transportpce-tox-verify-transportpce-master/archives/ 14:33:32 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins15131079148701039465.sh 14:33:32 ---> capture-instance-metadata.sh 14:33:32 Setup pyenv: 14:33:32 system 14:33:32 3.8.20 14:33:32 3.9.20 14:33:32 3.10.15 14:33:32 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) 14:33:32 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-NQwY from file:/tmp/.os_lf_venv 14:33:32 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) 14:33:32 lf-activate-venv(): INFO: Attempting to install with network-safe options... 
14:33:35 lf-activate-venv(): INFO: Base packages installed successfully 14:33:35 lf-activate-venv(): INFO: Installing additional packages: lftools 14:33:46 lf-activate-venv(): INFO: Adding /tmp/venv-NQwY/bin to PATH 14:33:46 INFO: Running in OpenStack, capturing instance metadata 14:33:46 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins1552007341662731843.sh 14:33:46 provisioning config files... 14:33:47 Could not find credentials [logs] for transportpce-tox-verify-transportpce-master #4391 14:33:47 copy managed file [jenkins-log-archives-settings] to file:/w/workspace/transportpce-tox-verify-transportpce-master@tmp/config1376596168709613231tmp 14:33:47 Regular expression run condition: Expression=[^.*logs-s3.*], Label=[odl-logs-s3-cloudfront-index] 14:33:47 Run condition [Regular expression match] enabling perform for step [Provide Configuration files] 14:33:47 provisioning config files... 14:33:47 copy managed file [jenkins-s3-log-ship] to file:/home/jenkins/.aws/credentials 14:33:47 [EnvInject] - Injecting environment variables from a build step. 14:33:47 [EnvInject] - Injecting as environment variables the properties content 14:33:47 SERVER_ID=logs 14:33:47 14:33:47 [EnvInject] - Variables injected successfully. 14:33:47 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins11759739548851719572.sh 14:33:47 ---> create-netrc.sh 14:33:47 WARN: Log server credential not found. 
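Editor's note: when triaging a run like this one, the per-environment verdict lines printed at the end of the tox stage (e.g. `tests121: FAIL code 1 (480.31=setup[7.53]+cmd[472.77] seconds)`) are the quickest signal. A small illustrative sketch, not part of the CI tooling, that extracts the failing environment names from such lines:

```python
import re

# Matches tox outcome lines such as:
#   "buildcontroller: FAIL code 1 (25.99=setup[8.74]+cmd[17.24] seconds)"
#   "sims: OK (13.16=setup[9.75]+cmd[3.42] seconds)"
SUMMARY_RE = re.compile(r"^(?P<env>[\w-]+): (?P<verdict>OK|FAIL)\b")


def failing_envs(lines):
    """Return tox environment names whose summary line reports FAIL."""
    out = []
    for line in lines:
        m = SUMMARY_RE.match(line.strip())
        if m and m.group("verdict") == "FAIL":
            out.append(m.group("env"))
    return out
```

Run over the summary in this log, such a filter would surface `buildcontroller`, `testsPCE`, `tests121`, `tests200`, and `buildlighty` as the environments worth inspecting first.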
14:33:47 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins899026499255842334.sh 14:33:47 ---> python-tools-install.sh 14:33:47 Setup pyenv: 14:33:47 system 14:33:47 3.8.20 14:33:47 3.9.20 14:33:47 3.10.15 14:33:47 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) 14:33:47 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-NQwY from file:/tmp/.os_lf_venv 14:33:47 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) 14:33:47 lf-activate-venv(): INFO: Attempting to install with network-safe options... 14:33:49 lf-activate-venv(): INFO: Base packages installed successfully 14:33:49 lf-activate-venv(): INFO: Installing additional packages: lftools 14:33:59 lf-activate-venv(): INFO: Adding /tmp/venv-NQwY/bin to PATH 14:33:59 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins17313776328937363764.sh 14:33:59 ---> sudo-logs.sh 14:33:59 Archiving 'sudo' log.. 14:33:59 [transportpce-tox-verify-transportpce-master] $ /bin/bash /tmp/jenkins15059761312553728451.sh 14:33:59 ---> job-cost.sh 14:33:59 INFO: Activating Python virtual environment... 14:33:59 Setup pyenv: 14:33:59 system 14:33:59 3.8.20 14:33:59 3.9.20 14:33:59 3.10.15 14:33:59 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) 14:33:59 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-NQwY from file:/tmp/.os_lf_venv 14:33:59 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) 14:33:59 lf-activate-venv(): INFO: Attempting to install with network-safe options... 
14:34:01 lf-activate-venv(): INFO: Base packages installed successfully 14:34:01 lf-activate-venv(): INFO: Installing additional packages: zipp==1.1.0 python-openstackclient urllib3~=1.26.15 14:34:08 lf-activate-venv(): INFO: Adding /tmp/venv-NQwY/bin to PATH 14:34:08 INFO: No stack-cost file found 14:34:08 INFO: Instance uptime: 9891s 14:34:08 INFO: Fetching instance metadata (attempt 1 of 3)... 14:34:08 DEBUG: URL: http://169.254.169.254/latest/meta-data/instance-type 14:34:08 INFO: Successfully fetched instance metadata 14:34:08 INFO: Instance type: v3-standard-4 14:34:08 INFO: Retrieving pricing info for: v3-standard-4 14:34:08 INFO: Fetching Vexxhost pricing API (attempt 1 of 3)... 14:34:08 DEBUG: URL: https://pricing.vexxhost.net/v1/pricing/v3-standard-4/cost?seconds=9891 14:34:09 INFO: Successfully fetched Vexxhost pricing API 14:34:09 INFO: Retrieved cost: 0.33 14:34:09 INFO: Retrieved resource: v3-standard-4 14:34:09 INFO: Creating archive directory: /w/workspace/transportpce-tox-verify-transportpce-master/archives/cost 14:34:09 INFO: Archiving costs to: /w/workspace/transportpce-tox-verify-transportpce-master/archives/cost.csv 14:34:09 INFO: Successfully archived job cost data 14:34:09 DEBUG: Cost data: transportpce-tox-verify-transportpce-master,4391,2026-02-20 14:34:09,v3-standard-4,9891,0.33,0.00,FAILURE 14:34:09 [transportpce-tox-verify-transportpce-master] $ /bin/bash -l /tmp/jenkins4307516750161328800.sh 14:34:09 ---> logs-deploy.sh 14:34:09 Setup pyenv: 14:34:09 system 14:34:09 3.8.20 14:34:09 3.9.20 14:34:09 3.10.15 14:34:09 * 3.11.10 (set by /w/workspace/transportpce-tox-verify-transportpce-master/.python-version) 14:34:09 lf-activate-venv(): INFO: Reuse venv:/tmp/venv-NQwY from file:/tmp/.os_lf_venv 14:34:09 lf-activate-venv(): INFO: Installing base packages (pip, setuptools, virtualenv) 14:34:09 lf-activate-venv(): INFO: Attempting to install with network-safe options... 
14:34:11 lf-activate-venv(): INFO: Base packages installed successfully 14:34:11 lf-activate-venv(): INFO: Installing additional packages: lftools urllib3~=1.26.15 14:34:20 lf-activate-venv(): INFO: Adding /tmp/venv-NQwY/bin to PATH 14:34:20 WARNING: Nexus logging server not set 14:34:20 INFO: S3 path logs/releng/vex-yul-odl-jenkins-1/transportpce-tox-verify-transportpce-master/4391/ 14:34:20 INFO: archiving logs to S3 14:34:22 ---> uname -a: 14:34:22 Linux prd-ubuntu2204-docker-4c-16g-19296 5.15.0-168-generic #178-Ubuntu SMP Fri Jan 9 19:05:03 UTC 2026 x86_64 x86_64 x86_64 GNU/Linux 14:34:22 14:34:22 14:34:22 ---> lscpu: 14:34:22 Architecture: x86_64 14:34:22 CPU op-mode(s): 32-bit, 64-bit 14:34:22 Address sizes: 40 bits physical, 48 bits virtual 14:34:22 Byte Order: Little Endian 14:34:22 CPU(s): 4 14:34:22 On-line CPU(s) list: 0-3 14:34:22 Vendor ID: AuthenticAMD 14:34:22 Model name: AMD EPYC-Rome Processor 14:34:22 CPU family: 23 14:34:22 Model: 49 14:34:22 Thread(s) per core: 1 14:34:22 Core(s) per socket: 1 14:34:22 Socket(s): 4 14:34:22 Stepping: 0 14:34:22 BogoMIPS: 5599.99 14:34:22 Flags: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy svm cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase tsc_adjust bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 clzero xsaveerptr wbnoinvd arat npt nrip_save umip rdpid arch_capabilities 14:34:22 Virtualization: AMD-V 14:34:22 Hypervisor vendor: KVM 14:34:22 Virtualization type: full 14:34:22 L1d cache: 128 KiB (4 instances) 14:34:22 L1i cache: 128 KiB (4 instances) 14:34:22 L2 cache: 2 MiB (4 instances) 14:34:22 L3 cache: 64 MiB (4 instances) 14:34:22 
NUMA node(s): 1 14:34:22 NUMA node0 CPU(s): 0-3 14:34:22 Vulnerability Gather data sampling: Not affected 14:34:22 Vulnerability Indirect target selection: Not affected 14:34:22 Vulnerability Itlb multihit: Not affected 14:34:22 Vulnerability L1tf: Not affected 14:34:22 Vulnerability Mds: Not affected 14:34:22 Vulnerability Meltdown: Not affected 14:34:22 Vulnerability Mmio stale data: Not affected 14:34:22 Vulnerability Reg file data sampling: Not affected 14:34:22 Vulnerability Retbleed: Mitigation; untrained return thunk; SMT disabled 14:34:22 Vulnerability Spec rstack overflow: Mitigation; SMT disabled 14:34:22 Vulnerability Spec store bypass: Mitigation; Speculative Store Bypass disabled via prctl and seccomp 14:34:22 Vulnerability Spectre v1: Mitigation; usercopy/swapgs barriers and __user pointer sanitization 14:34:22 Vulnerability Spectre v2: Mitigation; Retpolines; IBPB conditional; STIBP disabled; RSB filling; PBRSB-eIBRS Not affected; BHI Not affected 14:34:22 Vulnerability Srbds: Not affected 14:34:22 Vulnerability Tsa: Not affected 14:34:22 Vulnerability Tsx async abort: Not affected 14:34:22 Vulnerability Vmscape: Not affected 14:34:22 14:34:22 14:34:22 ---> nproc: 14:34:22 4 14:34:22 14:34:22 14:34:22 ---> df -h: 14:34:22 Filesystem Size Used Avail Use% Mounted on 14:34:22 tmpfs 1.6G 1.1M 1.6G 1% /run 14:34:22 /dev/vda1 78G 16G 62G 21% / 14:34:22 tmpfs 7.9G 0 7.9G 0% /dev/shm 14:34:22 tmpfs 5.0M 0 5.0M 0% /run/lock 14:34:22 /dev/vda15 105M 6.1M 99M 6% /boot/efi 14:34:22 tmpfs 1.6G 4.0K 1.6G 1% /run/user/1001 14:34:22 14:34:22 14:34:22 ---> free -m: 14:34:22 total used free shared buff/cache available 14:34:22 Mem: 15989 703 10591 3 4694 14942 14:34:22 Swap: 1023 0 1023 14:34:22 14:34:22 14:34:22 ---> ip addr: 14:34:22 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 14:34:22 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 14:34:22 inet 127.0.0.1/8 scope host lo 14:34:22 valid_lft forever preferred_lft forever 14:34:22 inet6 
::1/128 scope host
14:34:22        valid_lft forever preferred_lft forever
14:34:22 2: ens3: mtu 1458 qdisc mq state UP group default qlen 1000
14:34:22     link/ether fa:16:3e:49:49:64 brd ff:ff:ff:ff:ff:ff
14:34:22     altname enp0s3
14:34:22     inet 10.30.170.128/23 metric 100 brd 10.30.171.255 scope global dynamic ens3
14:34:22        valid_lft 76503sec preferred_lft 76503sec
14:34:22     inet6 fe80::f816:3eff:fe49:4964/64 scope link
14:34:22        valid_lft forever preferred_lft forever
14:34:22 3: docker0: mtu 1458 qdisc noqueue state DOWN group default
14:34:22     link/ether 46:c2:7c:54:ff:bf brd ff:ff:ff:ff:ff:ff
14:34:22     inet 10.250.0.254/24 brd 10.250.0.255 scope global docker0
14:34:22        valid_lft forever preferred_lft forever
14:34:22
14:34:22
14:34:22 ---> sar -b -r -n DEV:
14:34:22 Linux 5.15.0-168-generic (prd-ubuntu2204-docker-4c-16g-19296)  02/20/26  _x86_64_  (4 CPU)
14:34:22
14:34:22 11:49:27     LINUX RESTART  (4 CPU)
14:34:22
14:34:22 11:50:00          tps      rtps      wtps      dtps   bread/s   bwrtn/s   bdscd/s
14:34:22 12:00:01       125.16      7.18    113.31      4.68    819.71  27901.10   6211.03
14:34:22 12:10:02         1.70      0.00      1.67      0.03      0.04     27.38      8.82
14:34:22 12:20:02        16.36      0.01      1.67     14.69      0.16     19.28 220394.08
14:34:22 12:30:01         1.63      0.00      1.61      0.02      0.00     19.13      0.15
14:34:22 12:40:02        29.61      2.56     25.75      1.31    109.43   3705.71   3492.92
14:34:22 12:50:02         2.79      0.03      2.64      0.11      1.13     59.76      9.25
14:34:22 13:00:01        10.34      0.03      9.68      0.63      1.04    331.16   2982.76
14:34:22 13:10:02        11.49      0.45     10.52      0.52     54.90    574.63   1315.64
14:34:22 13:20:02        12.24      0.01     11.72      0.50      0.56    396.31    469.32
14:34:22 13:30:01         7.12      0.00      6.88      0.23      0.36    207.42    324.40
14:34:22 13:40:02         4.72      0.00      4.56      0.17      0.16    128.17    102.64
14:34:22 13:50:02         5.69      0.01      5.51      0.18      1.05    158.35    315.73
14:34:22 14:00:01         5.13      0.00      4.98      0.15      0.31    163.31     88.89
14:34:22 14:10:02         4.65      0.00      4.51      0.14      0.13    140.50     55.53
14:34:22 14:20:02         9.57      0.06      9.17      0.34      2.00    545.61     74.19
14:34:22 14:30:01         5.30      0.02      5.14      0.15      0.52    164.19    279.03
14:34:22 Average:        15.86      0.65     13.72      1.49     62.08   2162.47  14756.10
14:34:22
14:34:22 11:50:00    kbmemfree   kbavail kbmemused  %memused kbbuffers  kbcached  kbcommit   %commit  kbactive   kbinact   kbdirty
14:34:22 12:00:01      4644552  13599628   2331864     14.24    222988   8575180   3059012     17.56   1688156   9350196       192
14:34:22 12:10:02      6362972  15320244    612236      3.74    223464   8576852   1339500      7.69   1689576   7622096       204
14:34:22 12:20:02      6360708  15322924    609568      3.72    223944   8581320   1340728      7.70   1694356   7623324       308
14:34:22 12:30:01      6358692  15321296    611248      3.73    224328   8581328   1340872      7.70   1694728   7626460        16
14:34:22 12:40:02      4768908   8348248   7583488     46.32    237788   3267564   9391028     53.90   1648960   9324952      2768
14:34:22 12:50:02      7498256  11082108   4849548     29.62    238628   3271280   5522152     31.70   1655592   6583368       372
14:34:22 13:00:01      5530844   9135356   6797980     41.52    240500   3290052   8353812     47.95   1683240   8533000       320
14:34:22 13:10:02      8011768  11745528   4188524     25.58    245800   3411200   5025372     28.85   1723032   6012420        76
14:34:22 13:20:02      9154612  12947804   2987160     18.24    247540   3468780   3707492     21.28   1728072   4864392       376
14:34:22 13:30:01      7560360  11381380   4552464     27.80    248256   3495844   5225548     29.99   1729460   6459204       220
14:34:22 13:40:02      7556568  11394752   4538980     27.72    249040   3512144   5269548     30.25   1730868   6454148       396
14:34:22 13:50:02      6092664   9951744   5981184     36.53    249732   3532300   6691224     38.41   1732912   7913460       236
14:34:22 14:00:01      5989024   9869408   6063560     37.03    250528   3552752   6790032     38.97   1736252   8017156       604
14:34:22 14:10:02      7553812  11455876   4477920     27.35    251192   3573680   5226748     30.00   1738504   6445348       376
14:34:22 14:20:02      7415160  11442352   4491120     27.43    255104   3689568   5206588     29.89   1764724   6560200       788
14:34:22 14:30:01      7018484  11071000   4862408     29.70    255860   3714024   5778700     33.17   1775512   6946088       504
14:34:22 Average:      6742336  11836853   4096203     25.02    241543   4755867   4954272     28.44   1713372   7270988       485
14:34:22
14:34:22 11:50:00        IFACE   rxpck/s   txpck/s    rxkB/s    txkB/s   rxcmp/s   txcmp/s  rxmcst/s   %ifutil
14:34:22 12:00:01           lo     13.90     13.90      6.96      6.96      0.00      0.00      0.00      0.00
14:34:22 12:00:01         ens3    165.11    123.28   2350.99     16.30      0.00      0.00      0.00      0.00
14:34:22 12:00:01      docker0      0.00      0.00      0.00      0.00      0.00      0.00      0.00      0.00
14:34:22 12:10:02           lo      1.73      1.73      0.54      0.54      0.00      0.00      0.00      0.00
14:34:22 12:10:02         ens3      0.55      0.24      0.17      0.11      0.00      0.00      0.00      0.00
14:34:22 12:10:02      docker0      0.00      0.00      0.00      0.00      0.00      0.00      0.00      0.00
14:34:22 12:20:02           lo      0.10      0.10      0.01      0.01      0.00      0.00      0.00      0.00
14:34:22 12:20:02         ens3      0.42      0.18      0.13      0.08      0.00      0.00      0.00      0.00
14:34:22 12:20:02      docker0      0.00      0.00      0.00      0.00      0.00      0.00      0.00      0.00
14:34:22 12:30:01           lo      0.10      0.10      0.01      0.01      0.00      0.00      0.00      0.00
14:34:22 12:30:01         ens3      0.72      0.43      0.24      1.28      0.00      0.00      0.00      0.00
14:34:22 12:30:01      docker0      0.00      0.00      0.00      0.00      0.00      0.00      0.00      0.00
14:34:22 12:40:02           lo      9.71      9.71      9.67      9.67      0.00      0.00      0.00      0.00
14:34:22 12:40:02         ens3      1.12      0.91      0.43      0.46      0.00      0.00      0.00      0.00
14:34:22 12:40:02      docker0      0.00      0.00      0.00      0.00      0.00      0.00      0.00      0.00
14:34:22 12:50:02           lo      7.46      7.46      4.35      4.35      0.00      0.00      0.00      0.00
14:34:22 12:50:02         ens3      1.04      0.57      0.25      0.84      0.00      0.00      0.00      0.00
14:34:22 12:50:02      docker0      0.00      0.00      0.00      0.00      0.00      0.00      0.00      0.00
14:34:22 13:00:01           lo     11.19     11.19      7.05      7.05      0.00      0.00      0.00      0.00
14:34:22 13:00:01         ens3      0.72      0.56      0.16      0.12      0.00      0.00      0.00      0.00
14:34:22 13:00:01      docker0      0.00      0.00      0.00      0.00      0.00      0.00      0.00      0.00
14:34:22 13:10:02           lo     12.96     12.96      5.87      5.87      0.00      0.00      0.00      0.00
14:34:22 13:10:02         ens3      2.08      0.74      0.52      0.29      0.00      0.00      0.00      0.00
14:34:22 13:10:02      docker0      0.00      0.00      0.00      0.00      0.00      0.00      0.00      0.00
14:34:22 13:20:02           lo      8.57      8.57      5.12      5.12      0.00      0.00      0.00      0.00
14:34:22 13:20:02         ens3      1.54      0.60      0.38      0.23      0.00      0.00      0.00      0.00
14:34:22 13:20:02      docker0      0.00      0.00      0.00      0.00      0.00      0.00      0.00      0.00
14:34:22 13:30:01           lo     20.14     20.14      9.72      9.72      0.00      0.00      0.00      0.00
14:34:22 13:30:01         ens3      0.94      0.62      0.20      0.15      0.00      0.00      0.00      0.00
14:34:22 13:30:01      docker0      0.00      0.00      0.00      0.00      0.00      0.00      0.00      0.00
14:34:22 13:40:02           lo     17.47     17.47      6.39      6.39      0.00      0.00      0.00      0.00
14:34:22 13:40:02         ens3      0.68      0.40      0.14      0.10      0.00      0.00      0.00      0.00
14:34:22 13:40:02      docker0      0.00      0.00      0.00      0.00      0.00      0.00      0.00      0.00
14:34:22 13:50:02           lo     25.37     25.37     11.18     11.18      0.00      0.00      0.00      0.00
14:34:22 13:50:02         ens3      0.74      0.45      0.13      0.09      0.00      0.00      0.00      0.00
14:34:22 13:50:02      docker0      0.00      0.00      0.00      0.00      0.00      0.00      0.00      0.00
14:34:22 14:00:01           lo     17.76     17.76     10.53     10.53      0.00      0.00      0.00      0.00
14:34:22 14:00:01         ens3      0.84      0.56      0.22      0.16      0.00      0.00      0.00      0.00
14:34:22 14:00:01      docker0      0.00      0.00      0.00      0.00      0.00      0.00      0.00      0.00
14:34:22 14:10:02           lo      6.23      6.23      3.11      3.11      0.00      0.00      0.00      0.00
14:34:22 14:10:02         ens3      0.76      0.40      0.25      0.17      0.00      0.00      0.00      0.00
14:34:22 14:10:02      docker0      0.00      0.00      0.00      0.00      0.00      0.00      0.00      0.00
14:34:22 14:20:02           lo      9.79      9.79      5.85      5.85      0.00      0.00      0.00      0.00
14:34:22 14:20:02         ens3      0.69      0.52      0.18      0.14      0.00      0.00      0.00      0.00
14:34:22 14:20:02      docker0      0.00      0.00      0.00      0.00      0.00      0.00      0.00      0.00
14:34:22 14:30:01           lo     14.65     14.65      5.97      5.97      0.00      0.00      0.00      0.00
14:34:22 14:30:01         ens3      0.66      0.60      0.13      0.11      0.00      0.00      0.00      0.00
14:34:22 14:30:01      docker0      0.00      0.00      0.00      0.00      0.00      0.00      0.00      0.00
14:34:22 Average:            lo     11.07     11.07      5.77      5.77      0.00      0.00      0.00      0.00
14:34:22 Average:          ens3     11.18      8.20    147.41      1.29      0.00      0.00      0.00      0.00
14:34:22 Average:       docker0      0.00      0.00      0.00      0.00      0.00      0.00      0.00      0.00
14:34:22
14:34:22
14:34:22 ---> sar -P ALL:
14:34:22 Linux 5.15.0-168-generic (prd-ubuntu2204-docker-4c-16g-19296)  02/20/26  _x86_64_  (4 CPU)
14:34:22
14:34:22 11:49:27     LINUX RESTART  (4 CPU)
14:34:22
14:34:22 11:50:00     CPU     %user     %nice   %system   %iowait    %steal     %idle
14:34:22 12:00:01     all     37.84      0.00      2.18      2.25      0.09     57.64
14:34:22 12:00:01       0     40.03      0.00      2.28      1.59      0.09     56.01
14:34:22 12:00:01       1     38.26      0.00      2.28      2.32      0.09     57.04
14:34:22 12:00:01       2     36.11      0.00      1.93      2.17      0.09     59.70
14:34:22 12:00:01       3     36.94      0.00      2.24      2.93      0.09     57.80
14:34:22 12:10:02     all      0.59      0.00      0.07      0.04      0.02     99.28
14:34:22 12:10:02       0      0.51      0.00      0.09      0.01      0.02     99.37
14:34:22 12:10:02       1      0.98      0.00      0.07      0.01      0.03     98.90
14:34:22 12:10:02       2      0.40      0.00      0.07      0.08      0.02     99.43
14:34:22 12:10:02       3      0.46      0.00      0.05      0.06      0.02     99.40
14:34:22 12:20:02     all      0.23      0.00      0.05      0.05      0.03     99.64
14:34:22 12:20:02       0      0.04      0.00      0.04      0.02      0.02     99.89
14:34:22 12:20:02       1      0.66      0.00      0.05      0.05      0.02     99.22
14:34:22 12:20:02       2      0.15      0.00      0.07      0.08      0.05     99.65
14:34:22 12:20:02       3      0.08      0.00      0.03      0.06      0.01     99.81
14:34:22 12:30:01     all      0.28      0.00      0.04      0.02      0.02     99.63
14:34:22 12:30:01       0      0.07      0.00      0.04      0.07      0.02     99.81
14:34:22 12:30:01       1      0.93      0.00      0.06      0.00      0.03     98.97
14:34:22 12:30:01       2      0.09      0.00      0.04      0.01      0.02     99.84
14:34:22 12:30:01       3      0.05      0.00      0.03      0.01      0.01     99.91
14:34:22 12:40:02     all     23.44      0.00      1.08      0.18      0.07     75.22
14:34:22 12:40:02       0     23.19      0.00      1.07      0.36      0.07     75.31
14:34:22 12:40:02       1     22.56      0.00      1.19      0.16      0.07     76.02
14:34:22 12:40:02       2     23.76      0.00      1.05      0.12      0.07     75.00
14:34:22 12:40:02       3     24.25      0.00      1.03      0.10      0.07     74.56
14:34:22 12:50:02     all      5.27      0.00      0.34      0.07      0.06     94.27
14:34:22 12:50:02       0      5.20      0.00      0.38      0.02      0.06     94.34
14:34:22 12:50:02       1      5.34      0.00      0.37      0.12      0.06     94.11
14:34:22 12:50:02       2      5.07      0.00      0.33      0.09      0.06     94.45
14:34:22 12:50:02       3      5.46      0.00      0.28      0.05      0.07     94.15
14:34:22 13:00:01     all     22.31      0.00      0.83      0.05      0.07     76.74
14:34:22 13:00:01       0     22.15      0.00      0.79      0.02      0.07     76.98
14:34:22 13:00:01       1     22.07      0.00      0.83      0.03      0.08     76.99
14:34:22 13:00:01       2     22.11      0.00      0.80      0.09      0.07     76.93
14:34:22 13:00:01       3     22.93      0.00      0.89      0.07      0.07     76.04
14:34:22 13:10:02     all     16.63      0.00      0.65      0.06      0.07     82.58
14:34:22 13:10:02       0     16.38      0.00      0.67      0.05      0.07     82.83
14:34:22 13:10:02       1     17.30      0.00      0.64      0.04      0.07     81.94
14:34:22 13:10:02       2     16.55      0.00      0.69      0.12      0.07     82.56
14:34:22 13:10:02       3     16.30      0.00      0.61      0.02      0.07     83.00
14:34:22 13:20:02     all     19.16      0.00      0.64      1.12      0.07     79.02
14:34:22 13:20:02       0     18.82      0.00      0.62      2.88      0.07     77.61
14:34:22 13:20:02       1     19.29      0.00      0.64      0.74      0.08     79.26
14:34:22 13:20:02       2     18.86      0.00      0.69      0.50      0.08     79.87
14:34:22 13:20:02       3     19.65      0.00      0.61      0.34      0.07     79.33
14:34:22 13:30:01     all     14.69      0.00      0.57      0.10      0.06     84.57
14:34:22 13:30:01       0     14.63      0.00      0.52      0.04      0.07     84.75
14:34:22 13:30:01       1     14.49      0.00      0.60      0.04      0.06     84.81
14:34:22 13:30:01       2     15.03      0.00      0.62      0.21      0.07     84.07
14:34:22 13:30:01       3     14.61      0.00      0.55      0.12      0.06     84.65
14:34:22 13:40:02     all      7.82      0.00      0.40      0.02      0.06     91.70
14:34:22 13:40:02       0      8.22      0.00      0.43      0.01      0.06     91.27
14:34:22 13:40:02       1      7.43      0.00      0.36      0.05      0.06     92.10
14:34:22 13:40:02       2      7.96      0.00      0.42      0.02      0.06     91.54
14:34:22 13:40:02       3      7.66      0.00      0.40      0.01      0.06     91.88
14:34:22
14:34:22 13:40:02     CPU     %user     %nice   %system   %iowait    %steal     %idle
14:34:22 13:50:02     all     10.58      0.00      0.48      0.02      0.06     88.85
14:34:22 13:50:02       0     10.41      0.00      0.43      0.01      0.06     89.10
14:34:22 13:50:02       1     10.73      0.00      0.49      0.01      0.07     88.69
14:34:22 13:50:02       2     10.94      0.00      0.51      0.04      0.07     88.44
14:34:22 13:50:02       3     10.25      0.00      0.49      0.03      0.06     89.18
14:34:22 14:00:01     all      9.79      0.00      0.37      0.02      0.06     89.76
14:34:22 14:00:01       0      9.94      0.00      0.40      0.01      0.05     89.59
14:34:22 14:00:01       1     10.09      0.00      0.32      0.01      0.06     89.52
14:34:22 14:00:01       2      9.36      0.00      0.38      0.02      0.06     90.17
14:34:22 14:00:01       3      9.75      0.00      0.36      0.05      0.06     89.78
14:34:22 14:10:02     all      6.70      0.00      0.27      0.02      0.06     92.95
14:34:22 14:10:02       0      7.11      0.00      0.25      0.02      0.05     92.57
14:34:22 14:10:02       1      6.49      0.00      0.34      0.01      0.06     93.11
14:34:22 14:10:02       2      7.16      0.00      0.25      0.02      0.07     92.50
14:34:22 14:10:02       3      6.04      0.00      0.25      0.03      0.05     93.63
14:34:22 14:20:02     all     12.87      0.00      0.55      0.04      0.06     86.48
14:34:22 14:20:02       0     13.58      0.00      0.51      0.09      0.06     85.77
14:34:22 14:20:02       1     12.50      0.00      0.61      0.01      0.07     86.82
14:34:22 14:20:02       2     12.82      0.00      0.62      0.06      0.07     86.44
14:34:22 14:20:02       3     12.58      0.00      0.45      0.01      0.06     86.90
14:34:22 14:30:01     all      7.35      0.00      0.40      0.02      0.07     92.15
14:34:22 14:30:01       0      7.47      0.00      0.39      0.01      0.07     92.07
14:34:22 14:30:01       1      7.44      0.00      0.46      0.04      0.06     92.00
14:34:22 14:30:01       2      7.30      0.00      0.35      0.02      0.06     92.26
14:34:22 14:30:01       3      7.20      0.00      0.39      0.03      0.10     92.28
14:34:22 Average:     all     12.22      0.00      0.56      0.26      0.06     86.91
14:34:22 Average:       0     12.36      0.00      0.56      0.32      0.06     86.71
14:34:22 Average:       1     12.28      0.00      0.58      0.23      0.06     86.85
14:34:22 Average:       2     12.10      0.00      0.55      0.23      0.06     87.06
14:34:22 Average:       3     12.14      0.00      0.54      0.24      0.06     87.02
14:34:22
14:34:22
14:34:22